Files hosted on the Google Drive service can be shared publicly. However, if they are too big, Google warns that "Google Drive can't scan this file for viruses" and asks the user for confirmation before proceeding with the download.
If for any reason you're trying to do an unattended download of the file, or downloading it in a resource-limited environment, this will prevent direct use of the link.
It seems that Google exposes a confirmation key on the warning page, which must match a value returned with the next request. So the process goes through four main URLs, starting from the (supposedly) direct public link of the file:
- https://docs.google.com/open?id=…
- https://docs.google.com/uc?id=…&export=download&revid=…
- https://docs.google.com/uc?export=download&confirm=SqLb&id=…&revid=…/…
- https://doc-0s-ak-docs.googleusercontent.com/docs/securesc/…
The first URL redirects, after various steps, to a page from which the next URL of the process can be captured with:
/href="(\/uc\?export=download[^"]+)/
This yields the second URL of the process: the one that ends up setting a random "confirm" value in the final step's URL. This value seems to have a corresponding (but different) one in the response URL inside the page. That value must be captured in order to form a new URL like that of the third step: this time the cookie previously set by Google will match the URL value, and it will eventually (after some redirections) end up at a URL like that of the fourth step, which is finally the one that directly downloads the file.
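The scraping step just described can be sketched in Python (the original script uses Perl and wget; this is only a standalone illustration, and the HTML snippet is invented for the example — real pages differ in detail):

```python
import re

# Invented example of the warning page's markup; real pages differ.
page = '<a href="/uc?export=download&amp;confirm=SqLb&amp;id=FILE_ID">Download anyway</a>'

# Same idea as the regex above: capture the relative "confirm" URL from the page.
match = re.search(r'href="(/uc\?export=download[^"]+)"', page)
confirm_url = "https://docs.google.com" + match.group(1).replace("&amp;", "&")

# The "confirm" value itself can then be pulled out of that URL:
confirm = re.search(r"confirm=([0-9A-Za-z_]+)", confirm_url).group(1)
print(confirm_url)
print(confirm)
```

With the invented snippet above this prints the third-step-style URL and the token `SqLb`.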
Too much hassle for a single clean command line…
So, I've written a Perl script that does all the work. With the help of wget, of course, which can follow "302" HTTP redirections, and load and store cookies. Neat tool!
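For anyone porting this away from wget: the two wget features the script relies on (following 302 redirects and loading/storing cookies) have standard-library counterparts in most languages. A minimal Python sketch, just to show the moving parts (no request is actually made here):

```python
import http.cookiejar
import urllib.request

# A Mozilla-format cookie jar uses the same file format as wget's
# --load-cookies / --save-cookies, so the two tools could share a cookies file.
jar = http.cookiejar.MozillaCookieJar("cookies.txt")

# build_opener() already follows 302 redirects by default; adding the
# cookie processor makes cookies persist across the redirect chain.
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
```

A request through `opener.open(url)` would then carry cookies across all the Google Drive redirections, just as the wget invocations in the script do.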
Here it is at the git repository.
The script runs on *nix… and also on Windows with Perl and wget in the PATH.
Jan/2019:
There is a new v2.0 version intended to resume partially downloaded files.
For now, it lives on a different branch than the usual v1.x gdown.pl. Note that v2.0 may require at least wget v1.17. It can also introduce a small initial delay in the download, because it must detect when the real download begins after all the Google Drive redirections (this is not needed in v1.x).
I'd like to have posted the code here, but WordPress templates are a complete mess… I'd prefer plain text. Really.
NOTE:
The next lines correspond to a previous version of gdown.pl: from v1.1 on, it already uses --no-check-certificate by default.
Certificates can pose a problem on your system, because Google uses https all the time:
ERROR: Certificate verification error for drive.google.com: unable to get local issuer certificate
To connect to drive.google.com insecurely, use `--no-check-certificate'.
Unable to establish SSL connection.
If this is the case, install the "ca-certificates" package on your system, or replace "wget" in one line at the end of the script with "wget --no-check-certificate".
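The same workaround exists outside wget, for anyone scripting the download in another language. In Python, for instance, certificate verification can be disabled per connection (as with wget's flag, installing proper CA certificates is the better fix):

```python
import ssl
import urllib.request

# Equivalent of wget --no-check-certificate: an SSL context that skips
# both hostname checking and certificate-chain verification.
insecure = ssl.create_default_context()
insecure.check_hostname = False  # must be disabled before verify_mode
insecure.verify_mode = ssl.CERT_NONE

# It would then be passed to the request, e.g.:
# urllib.request.urlopen(url, context=insecure)
```

Note the order: `check_hostname` has to be turned off before `verify_mode` is set to `CERT_NONE`, or Python raises a `ValueError`.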
Awesome. Saved me a couple of hours of downloading to a local machine only to upload to a remote one.
;-)
Thank you very much; it helped me and saved me time.
I wonder what its license is.
I'm working on a bash script that downloads an entire Google Drive public folder recursively, and it works fine. Soon I'll upload it to GitHub, but it would be better if your script could be included.
Thanks :)
Hi Nour,
glad to see it’s been useful for you.
License is GPLv3: I've updated the script and the git repo with this info ;-)
So you can use it as you wish.
Greetings!
Hi Circulosmeos,
Thanks a lot. I will make the final touches and upload it as soon as possible.
Greetings :)
Hi Circulosmeos,
I’ve uploaded the first (unstable) release of the `gdrive-dl` after weeks of development; it’s here: https://github.com/noureddin/gdrive-dl .
I'm trying to make many improvements to it, including, but not limited to, handling the confirmation and download (of big files) internally, and porting the entire script (gdrive-dl) to a cross-platform language like Python.
Greetings :)
congrats, Nour!
Hello Circulosmeos,
While I was working on gdrive-dl, I modified your script a little to support resuming downloading, in addition to some other improvements.
It’s at https://github.com/noureddin/gdrive-dl/blob/master/gdown.pl
I hope you like the changes. :)
Regards,
Hi Nour,
good job!
For anyone seeking the modified gdown.pl (Circulosmeos’ script, modified to be able to continue downloading, mainly), it’s now at https://github.com/noureddin/gdrive-dl/blob/b04158a2d967ac5dfdca54b62ca78087d5c92114/gdown.pl, because the link I mentioned earlier won’t work now.
Sorry for polluting your comments section, Circulosmeos! :D
Greetings!
There is now gdown.pl *v2.0* intended to resume partially downloaded files: See it here (for now, it lives on a different branch at github): https://github.com/circulosmeos/gdown.pl/tree/with-resume
Does this already work for entire folders? If so, how? Thanks again for fixing this ridiculous flaw!
Hi Wouter,
I haven't tested it for folders…
Just try it! If it doesn't work, send me an example URL and I'll take a look at it.
Greetings
You can use the script developed by Nour: see previous comments thread!
Pingback: Google Drive Direct Downloads of large files | Heimic
Hey, how can I use it on cPanel?
I’ve never used cpanel, but at least this user seems to have tried it:
https://gist.github.com/ilaraujo/8b9635c1546de664b9898cebf72e2b9e
Hello, is this method faster than downloading from Google Drive?
Well, it shouldn't be: this is meant to be used from a script, a command console, etc., so that a big file can be downloaded from Google Drive in an unattended mode. (Also check the 'with-resume' branch at GitHub to resume previous incomplete downloads.)
Thanks. Any way to download a full folder?
I haven't tested that: have you tried it? Send me the error and I'll take a look at it.
You can also try to use this fork, though I haven’t tested this myself: https://github.com/noureddin/gdrive-dl
Let me know how it goes!
Hey, I wanted to implement this as part of my Java code; do you have a Java version of this Perl script?
No, I don't, but the code is short and should be easy to port to Java… The difficult part is substituting the wget command with a Java object that does the same thing: fetching URLs while using cookies. Or just invoke the wget command from Java. Good luck!
I need to use this code, but for PHP, in a script.
Maybe you can just call the script from PHP using exec(): https://www.php.net/manual/en/function.exec.php
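That shelling-out pattern works from any language, not only PHP. A Python sketch for comparison (the gdown.pl path, the URL and the output name below are placeholders, not real values):

```python
import subprocess

# Hypothetical invocation of the script; FILE_ID and the file names are
# placeholders. Uncomment to actually run it (requires perl and wget):
# subprocess.run(["perl", "gdown.pl",
#                 "https://drive.google.com/open?id=FILE_ID", "output.file"],
#                check=True)

# The mechanism itself, shown with a command that exists everywhere:
result = subprocess.run(["echo", "done"], capture_output=True, text=True)
print(result.stdout.strip())  # the command's output, as PHP's exec() would return it
```

`subprocess.run(..., check=True)` raises on a non-zero exit status, which is the rough equivalent of checking the `$result_code` argument of PHP's exec().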