Google Takeout Archives - Downloading ALL the zip files at once?

https://www.reddit.com/r/google/comments/3v5cyj/google_takeout_archives_downloading_all_the_zip/

created by CaptainBroccoli on 02/12/2015 at 14:59 UTC

6 upvotes, 8 top-level comments (showing 8)

Hey Reddit! I am following instructions similar to these - http://www.howtogeek.com/216189/how-to-create-and-download-an-archive-of-all-your-google-data/[1] - to download local copies of all my Google Photos data. When the archive was created, the data was split into 74 files of 2 GB each. Is there an easy way to download all 74 files at once instead of manually clicking each one?

1: http://www.howtogeek.com/216189/how-to-create-and-download-an-archive-of-all-your-google-data/

Comments

Comment by Slapbox at 02/12/2015 at 15:10 UTC

5 upvotes, 0 direct replies

Not that I'm aware of, and it took me so long to download mine that they were removed before I could finish. Really crappy system if you ask me.

Comment by bwillard at 02/12/2015 at 16:12 UTC

4 upvotes, 2 direct replies

I don't know of a good way to start all the downloads at once. However, if you select tgz or tbz as the archive format instead of zip, the archives will be split into much bigger chunks (~50 GB), so there will be far fewer files to download.

Another possible alternative (depending on how much Drive quota you have) is to select the "Add to Drive" option instead of the "send download link via email" option; then you can use the Drive sync client to sync everything down in the background.

(As a side note, the reason the zips are chunked into 2 GB parts is that old zip clients can't handle anything larger than that, so for compatibility reasons the zips are capped at 2 GB.)

Comment by redditrigal at 25/10/2024 at 09:09 UTC

2 upvotes, 1 direct reply

I know this might seem trivial, but initially I was clicking each link individually. This started the download but also caused the page to refresh, which was really annoying due to the delay. Then I realized I could just middle-click (click with the mouse wheel) and… voilà! I still had to make 300+ clicks, but at least I didn't have to wait through 300+ delays. :-D

Comment by [deleted] at 17/03/2024 at 03:40 UTC

1 upvote, 0 direct replies

Not sure if you're still in need of help. I had the same problem, and I just wrote a script to bulk download these files.

I used Google Takeout to send the zip files to Google Drive, then wrote a script to download them all: https://github.com/Fallenstedt/google-photos-takeout[1]

1: https://github.com/Fallenstedt/google-photos-takeout

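For anyone who would rather roll their own, a rough sketch of the same idea using the Node.js googleapis client might look like the following. This is not the code from the repo above; the "takeout-" file-name prefix and the pre-authorized auth object are assumptions.

// Rough sketch: download every Takeout archive that the export added to Google Drive.
// Assumes `auth` is an already-authorized OAuth2 client with a Drive read scope,
// and that the archives are named with a "takeout-" prefix (both are assumptions).
const fs = require('fs');
const { google } = require('googleapis');

async function downloadTakeoutArchives(auth) {
  const drive = google.drive({ version: 'v3', auth });

  // List the exported archives by name prefix.
  const { data } = await drive.files.list({
    q: "name contains 'takeout-' and trashed = false",
    fields: 'files(id, name)',
    pageSize: 1000,
  });

  for (const file of data.files || []) {
    // Stream each archive into a local file of the same name.
    const dest = fs.createWriteStream(file.name);
    const res = await drive.files.get(
      { fileId: file.id, alt: 'media' },
      { responseType: 'stream' }
    );
    await new Promise((resolve, reject) => {
      res.data.pipe(dest).on('finish', resolve).on('error', reject);
    });
    console.log('downloaded ' + file.name);
  }
}

Streaming each response avoids holding a whole multi-gigabyte archive in memory.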

Comment by Revolutionary_Neck88 at 13/04/2024 at 16:57 UTC

1 upvote, 2 direct replies

I wrote a script recently. Keep changing the start value for each batch. I would suggest downloading 20-30 at once, depending on the kind of computer and internet speed you have.

// Run this in the browser console on the page that lists the Takeout download links.
// It opens the downloads whose index falls in [start, start + max); bump start for the next batch.
var l = document.links;
var start = 1;
var max = 20;
var index = -1;
for(var i = 0; i < l.length; i++) {
    if(l[i].href.indexOf("takeout.google.com/takeout/download") > -1) {
        // Pull the archive index out of the "i" query parameter.
        var pathSplit = ("" + l[i].href).split("&");
        var newIndex = -1;
        for(var j = 0; j < pathSplit.length; j++){
            if(pathSplit[j].split("=")[0] == "i") {
                newIndex = parseInt(pathSplit[j].split("=")[1], 10);
                break;
            }
        }
        if(newIndex != index) {
            index = newIndex;
        } else {
            // Skip consecutive links that point at the same archive index.
            console.log(index + " same index found. will skip");
            continue;
        }
        if(index >= start && index < (start + max)) {
            console.log(index + ":" + i + ":" + l[i].href);
            window.open(l[i].href, '_blank');
        }
    }
}
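
(To run this: open the Takeout page with the download links, open the browser's developer tools console, paste the script, and press Enter. It opens up to max downloads; then increase start by the value of max, e.g. from 1 to 21, and run it again for the next batch.)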

Comment by kisscsaba92 at 23/06/2024 at 11:03 UTC*

1 upvote, 0 direct replies

Hello, in case anyone finds this thread today: I have 200+ 50 GB packages from a Takeout export on one of my brand accounts, and only 15 packages on another. I am testing right now, downloading the last 10 packages simultaneously in CCleaner Browser. In Chrome the download speed is slightly higher, but because of the high memory usage I didn't want to risk the process. I hope all of them will be downloaded in 8 hours; next time I will try 20 packages. CCleaner Browser's network usage is around 170 Mbps out of my 1 Gbps connection.

Comment by Air-Op at 10/01/2023 at 22:24 UTC

1 upvote, 2 direct replies

I tried using the DownThemAll extension for Chrome. It did not work.

Basically, I suspect that the URLs behind the buttons redirect to a page that triggers a browser-initiated download. DownThemAll just downloaded the web page that redirects me to the file...

The HTML file that Google sends me is pwd.htm. It is almost 2 MB in size, has lots of JavaScript and strings for multiple languages, and it probably pulls the download info from some sort of JSON query. So we may want to use a small account and do some monitoring of the browser's communication with the server.

I'm thinking that I'm just going to set the file size to 30 Gigs or something and try again.

I also want the ability to pause it, and to throttle it... so a nice download manager would be good.

We may be able to make a chrome extension that specifically works with this site.

We could probably figure out the pattern for these, but authentication is a thing.

The authentication on my browser times out every few hours, and it takes a while to download the 5 that I am willing to do at a time.

So we should ask Google to actually give us a list of URLs for the files, and perhaps keep a session open for a while. My data is not that sensitive... it is mostly stuff that I want to keep.

I have 137 2GB files to download, and I am not super happy about this.

Comment by No-Order2813 at 02/11/2023 at 03:44 UTC

1 upvote, 0 direct replies

I have noticed that the download URLs on the Google Manage Exports page are sequential; only an index number changes from one link to the next. The email I received from Google when the files were ready lists all the downloads as "Download 45 of N", and those URLs are sequential too.

https://preview.redd.it/w4sj3f5zvuxb1.png?width=268&format=png&auto=webp&s=674a4275b8cec766683ec37499af7b3ae8824006

I am just looking to extract the links now and feed them to a tool like Free Download Manager or a similar multi-threaded download manager to grab them. I could just change the increment number by hand and add them one at a time, but I am all about automation, so I will explore a repeatable, idempotent way to accomplish this, since I am creating a lot of Takeout downloads. Even at 50 GB chunks I want automation. A sketch of the link-extraction step is below.
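
If it helps anyone, here is a minimal console sketch of the extraction step. It assumes the download links on the Manage Exports page contain "takeout.google.com/takeout/download", like the script earlier in the thread; run it in the browser's developer tools console on that page.

// Collect the unique Takeout download URLs found on the current page.
var urls = [];
var links = document.links;
for(var i = 0; i < links.length; i++) {
    var href = links[i].href;
    if(href.indexOf("takeout.google.com/takeout/download") > -1 && urls.indexOf(href) === -1) {
        urls.push(href);
    }
}
// Print one URL per line; in Chrome DevTools, copy(urls.join("\n")) also puts the list on the clipboard.
console.log(urls.join("\n"));

The printed list can then be pasted into Free Download Manager or whatever multi-threaded download manager you prefer.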

Just some FYI in case someone else finds it useful.