Comment by Pretend_Compliant on 12/10/2024 at 05:02 UTC

8 upvotes, 2 direct replies (showing 2)

View submission: Urgent help needed: Downloading Google Takeout data before expiration

View parent comment

I have no idea. But I'm totally willing to try this if you think it might work. It seems like Google is throttling the downloads at the account level (like some aggregated total bandwidth allowance), but the guy who wrote this script was able to have them all download in parallel in some insanely short time. Would what you are talking about potentially allow that?

Replies

Comment by mustardhamsters at 13/10/2024 at 03:58 UTC

2 upvotes, 0 direct replies

GCP instances benefit from being on Google’s network, and are therefore quite fast for “internal” transfers. It’s worth a shot to try this, you might be able to get it to move quite quickly.

Comment by ApricotPenguin at 14/10/2024 at 04:25 UTC

1 upvotes, 0 direct replies

My suggestion to rent a VPS or VM was about ruling out your own download speed as the limitation. If Google is throttling the download speeds from Takeout on their end, then it's probably not a viable strategy.

Did some quick digging; here are some suggestions I found that may help.

Copy a few of the Takeout links and throw them into JDownloader or a similar download manager. You can't queue up too many at once, since the links apparently expire after some time.

https://superuser.com/a/1323476[1]

1: https://superuser.com/a/1323476
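If you'd rather script it than use JDownloader, the same idea - several downloads in flight at once - can be sketched in Python with just the standard library. This is a generic parallel fetcher, not Takeout-aware: Takeout links will additionally need your signed-in session cookies, and the local file names here are just guessed from the URL.

```python
# Minimal sketch of a parallel downloader, standard library only.
# Assumption: `urls` are direct download links; each URL's last path
# segment is used as the local file name.
import concurrent.futures
import pathlib
import urllib.request

def parallel_download(urls, dest_dir, workers=4):
    """Fetch every URL into dest_dir, up to `workers` at a time."""
    dest = pathlib.Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)

    def fetch(url):
        # Name the local file after the last path segment of the URL.
        name = url.rstrip("/").rsplit("/", 1)[-1] or "download"
        target = dest / name
        urllib.request.urlretrieve(url, target)
        return target

    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map keeps results in the same order as the input URLs.
        return list(pool.map(fetch, urls))
```

One caveat: unlike a real download manager, this sketch doesn't resume partial files, which matters for multi-GB archives on a flaky connection.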

Of note - apparently you need to pause the download in your browser, rather than cancelling it, when you hand the link off to your download manager (ref: https://diegocarrasco.com/how-to-download-google-takeout-files-150gb%2B-with-wget-ok-a-remote-server/[3] )

3: https://diegocarrasco.com/how-to-download-google-takeout-files-150gb%2B-with-wget-ok-a-remote-server/
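The core of that pause-then-hand-off trick is just replaying your signed-in browser session on the other machine. A hedged stdlib sketch of that idea - the URL and cookie string are placeholders you'd copy out of your browser's dev tools (e.g. via "Copy as cURL" on the download request):

```python
# Sketch of replaying a signed-in browser session, stdlib only.
# Assumption: cookie_header is the exact Cookie header your browser
# sent; Takeout download links reject requests without a valid session.
import urllib.request

def download_with_cookies(url, cookie_header, out_path):
    """Stream `url` to `out_path`, sending the browser's Cookie header."""
    req = urllib.request.Request(url, headers={"Cookie": cookie_header})
    with urllib.request.urlopen(req) as resp, open(out_path, "wb") as out:
        while chunk := resp.read(1 << 20):  # 1 MiB chunks
            out.write(chunk)
```

This doesn't resume interrupted transfers; for multi-GB Takeout archives, `wget --continue --header "Cookie: ..."` (essentially what the linked article does) is the more robust version of the same idea.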

Another option (which probably doesn't help you now) is to have the Takeout data stored in Google Drive, install the Google Drive client on your machine, mark everything as an offline file, and let it sync down.

https://www.reddit.com/r/DataHoarder/comments/1665pxh/comment/jyipisd/[5]

5: https://www.reddit.com/r/DataHoarder/comments/1665pxh/comment/jyipisd/

Or you can use this script (https://github.com/Fallenstedt/google-takeout-sucks[7]) to download the files sent to Google Drive (source: https://www.reddit.com/r/google/comments/3v5cyj/google_takeout_archives_downloading_all_the_zip/[8] )

7: https://github.com/Fallenstedt/google-takeout-sucks

8: https://www.reddit.com/r/google/comments/3v5cyj/google_takeout_archives_downloading_all_the_zip/

Oh! And apparently choosing tar.gz as the export format means you get larger archive files, so there are fewer download links to work through.