https://drive.google.com/a/uci.edu/uc?export=download&confirm=LJ_a&id=0Bxy-54SBqeekTlE4Qy1mWWpsYTQ
I am attempting to use wget to download the file above, but it only produces a 1 KB log file. I enter:
wget https://drive.google.com/a/uci.edu/uc?export=download&confirm=a-GD&id=0Bxy-54SBqeekTlE4Qy1mWWpsYTQ
However, this gives me a log file instead of actually downloading the file.
The file is a 13 GB tar archive. The log file looks like this:
--2017-11-14 13:59:32-- https://drive.google.com/a/uci.edu/uc export=download
Resolving drive.google.com (drive.google.com)... [IP ADDRESS GIVEN]
Connecting to drive.google.com (drive.google.com)|[IP ADDRESS GIVEN]... connected.
HTTP request sent, awaiting response... 400 Bad Request
2017-11-14 13:59:33 ERROR 400: Bad Request.
Open the download link in a browser in incognito mode and proceed as you normally would to download the file.
Click the "Download" button. When the download starts in the browser, pause it and copy the download link.
At the command prompt, run
wget copied_link
It should work for any file. Or you can download your file directly from here.
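Also note that in the original command the unquoted & characters are interpreted by the shell, so the URL that wget actually receives is cut off after export=download, which is likely why the log shows a 400 Bad Request. Quoting the link (whether the one from the question or the one copied from the browser) avoids this; a minimal sketch using the URL at the top of the question:
wget -O file.tar "https://drive.google.com/a/uci.edu/uc?export=download&confirm=LJ_a&id=0Bxy-54SBqeekTlE4Qy1mWWpsYTQ"
For very large files Google Drive may still return an HTML confirmation page instead of the archive itself, which is why copying the link the browser actually downloads from, as described above, is the more reliable route.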
My institute recently installed a new proxy server on our network. I am trying to configure my Cygwin environment to be able to run wget and download data from a remote repository.
Browsing the internet I have found two different solutions to my problem, but neither of them seems to work in my case.
The first one I tried was to follow these instructions, so in Cygwin:
cd /cygdrive/c/cygwin64/etc/
nano wgetrc
At the end of the file, I added:
use_proxy = on
http_proxy=http://username:password#my.proxy.ip:my.port/
https_proxy=https://username:password#my.proxy.ip:my.port/
ftp_proxy=http://username:password#my.proxy.ip:my.port/
(of course, using my actual username and password)
The second approach was the one suggested by this SO post, so in my Cygwin environment:
export http_proxy=http://username:password#my.proxy.ip:my.port/
export https_proxy=https://username:password#my.proxy.ip:my.port/
export ftp_proxy=http://username:password#my.proxy.ip:my.port/
In both cases, when I test wget, I get the following:
$ wget http://www.google.com
--2020-01-30 12:12:22-- http://www.google.com/
Resolving my.proxy.ip (my.proxy.ip)... 10.1XX.XXX.XX
Connecting to my.proxy.ip (my.proxy.ip)|10.1XX.XXX.XX|:8XXX... connected.
Proxy request sent, awaiting response... 407 Proxy Authentication Required
2020-01-30 12:12:22 ERROR 407: Proxy Authentication Required.
It looks as if my username and password were wrong, but I checked them in my browser and the credentials work just fine.
Any idea what this could be due to?
This problem was solved thanks to a suggestion from a user of the Ask Ubuntu community.
Basically, instead of editing the global configuration file wgetrc, I should have created a new .wgetrc with my proxy configuration in my Cygwin home directory.
In summary:
Step 1 - Create a .wgetrc file:
nano ~/.wgetrc
Step 2 - Record the proxy info in this file:
use_proxy=on
http_proxy=http://my.proxy.ip:my.port
https_proxy=https://my.proxy.ip:my.port
ftp_proxy=http://my.proxy.ip:my.port
proxy_user=username
proxy_password=password
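If you just want to confirm that the credentials themselves are accepted by the proxy, wget can also take them directly on the command line via its standard --proxy-user and --proxy-password options (a quick sketch, with the same placeholder proxy address and port as above):
export http_proxy=http://my.proxy.ip:my.port/
wget --proxy-user=username --proxy-password=password http://www.google.com
This keeps the password out of the configuration files, although it will show up in the shell history.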
I was using wget to download a file, like this:
wget link/file.zip
The file.zip is about 100 MB, but I only receive 5552 bytes.
Whatever I download (larger than 5552 bytes, even from other hosts), I only receive 5552 bytes!
The HTTP response header Content-Length is 5552 too!
I am using Ubuntu 14.04. Is there some network configuration that would solve this problem?
Thank you very much!
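One way to see what is actually coming back, assuming the same URL as above, is to have wget print the server's response headers without saving the body (both options are standard in GNU wget):
wget -S --spider link/file.zip
If Content-Length is already 5552 in those headers, the response is being cut down before it ever reaches wget, which points at something between you and the host (a transparent proxy or captive portal, for example) rather than at wget itself.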
I have a problem getting IIS to allow file downloads with Bit Rate Throttling. I set the limit to 100 KB/s per file. Without the bitrate limit there is no problem, but with the limit there is.
I'm using code similar to what is described in this article:
Securing Large Downloads Using C# and IIS 7
I also tried switching off IIS Bit Rate Throttling and controlling the bitrate "by hand", calculating the rate with TimeSpan and calling Thread.Sleep(10) inside a while loop...
But all my attempts were useless; I don't get any exceptions.
To test the download I use wget, like this:
wget -t 1 http://db.realestate.ru/yrl/RealEstateExportToYandex.xml
(you can try it with wget for windows)
This is a 240 MB text file; wget always stops at a random point in the download, anywhere from 5% to 60%, and throws this error message:
Read error at byte ... (Connection reset by peer).
Maybe the problem is not with IIS, because on my localhost it works well; it only fails online on the highly loaded server.
Solved with these parameters specified in the wget command:
wget -t 1 --header="Keep-Alive: 30000" -nv http://db.realestate.ru/yrl/RealEstateExportToYandex.xml
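If the connection still gets reset from time to time, another option is to let wget retry and resume the partial download using its standard --continue and --tries options (a sketch, not specific to this server):
wget -c --tries=10 --header="Keep-Alive: 30000" -nv http://db.realestate.ru/yrl/RealEstateExportToYandex.xml
Resuming only helps if the server honors Range requests for this URL; otherwise each retry starts the transfer from the beginning.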
I used this command from the command prompt on a Windows box (my Linux machine is at work):
ftp -u ftp://cran.R-project.org/incoming/ qdap_0.1.0.tar.gz
I used the info from:
https://github.com/hadley/devtools/wiki/Release
http://cran.r-project.org/doc/manuals/R-exts.html#Submitting-a-package-to-CRAN
I expected to see it show up here: ftp://cran.r-project.org/incoming/ but I do not see it.
Am I just being impatient or did my package not upload? Here is the command line output:
C:\Users\trinker\GitHub>ftp -u ftp://cran.R-project.org/incoming/ qdap_0.1.0.tar.gz
Transfers files to and from a computer running an FTP server service
(sometimes called a daemon). Ftp can be used interactively.
FTP [-v] [-d] [-i] [-n] [-g] [-s:filename] [-a] [-A] [-x:sendbuffer] [-r:recvbuffer] [-b:asyncbuffers] [-w:windowsize] [host]
-v Suppresses display of remote server responses.
-n Suppresses auto-login upon initial connection.
-i Turns off interactive prompting during multiple file
transfers.
-d Enables debugging.
-g Disables filename globbing (see GLOB command).
-s:filename Specifies a text file containing FTP commands; the
commands will automatically run after FTP starts.
-a Use any local interface when binding data connection.
-A login as anonymous.
-x:send sockbuf Overrides the default SO_SNDBUF size of 8192.
-r:recv sockbuf Overrides the default SO_RCVBUF size of 8192.
-b:async count Overrides the default async count of 3
-w:windowsize Overrides the default transfer buffer size of 65535.
host Specifies the host name or IP address of the remote
host to connect to.
Notes:
- mget and mput commands take y/n/q for yes/no/quit.
- Use Control-C to abort commands.
(This was previously a comment and is being transferred to an answer here.)
Make sure you are not looking at a page cached earlier by your browser.
To perform the actual upload you might want to try the free cross platform FileZilla FTP software. You can upload and concurrently view the contents of the source directory on your machine (in the left pane) and the target directory on CRAN (in the right pane) and view a log of what is happening in the top pane and a progress indicator in the bottom pane. It also has a site manager to store the sites you upload to so you don't need to keep typing in their URL each time you do an upload.
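For what it's worth, the help text in the question is Windows ftp.exe reporting that it did not understand the arguments: it has no -u switch and does not accept ftp:// URLs, so nothing was uploaded. If you prefer the command line to FileZilla, one possible approach is to drive ftp.exe with the -s:filename option shown in that help text. This is only a sketch, and the anonymous login with your e-mail address as the password is an assumption based on the usual anonymous-FTP convention:
upload.txt:
user anonymous your_email@example.com
binary
cd incoming
put qdap_0.1.0.tar.gz
quit
Then run:
ftp -n -s:upload.txt cran.R-project.org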
WordPress does not use direct links for its downloads (it looks like the links are generated dynamically, for example to track installations).
Running wget http://wordpress.org/latest.tar.gz does not get the right file name.
I don't want to save it to my desktop and then upload it to the server, because I'm on a slow internet connection.
I fail to see what the problem is:
marc#panic:~$ wget http://wordpress.org/latest.tar.gz
--2011-04-01 11:19:27-- http://wordpress.org/latest.tar.gz
Resolving wordpress.org... 72.233.56.139, 72.233.56.138
Connecting to wordpress.org|72.233.56.139|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [application/x-gzip]
Saving to: `latest.tar.gz'
[ <=> ] 2,365,766 1.09M/s
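If a mirror or a redirect ever hands back a different name, two standard wget options cover it: name the output file yourself with -O, or let wget honor the server's Content-Disposition header (a sketch, using the same URL as in the question):
wget -O wordpress.tar.gz http://wordpress.org/latest.tar.gz
wget --content-disposition http://wordpress.org/latest.tar.gz
The second form only helps when the server actually sends a Content-Disposition filename; otherwise -O is the simpler choice.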