How to upload part of a file with curl? - http

I know how to upload a whole file with curl, like
curl "http://192.168.1.133/***" -F "file=@./test123"
to upload test123.
What if I want to upload only the first 10 bytes of test123? How do I do that?

curl doesn't provide this feature by itself. You can, however, achieve the same end goal by combining it with other tools.
How about cutting out the beginning of the file with head and making curl read that part from stdin?
$ head -c10 test123 | curl "http://192.168.1.133/***" -F "file=@-"
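If the slice you need doesn't start at the beginning of the file, dd can extract an arbitrary byte range the same way. A sketch (the URL is the placeholder from the question, and the sample file contents and skip/count values are made up for illustration):

```shell
# Create a sample file, then take 10 bytes starting at offset 10:
printf 'abcdefghijklmnopqrstuvwxyz' > test123
dd if=test123 bs=1 skip=10 count=10 2>/dev/null
# prints: klmnopqrst

# Pipe that range to curl exactly as with head:
#   dd if=test123 bs=1 skip=10 count=10 2>/dev/null | curl "http://192.168.1.133/***" -F "file=@-"
```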

Related

CURL Command To Create A File On Server

I have a mini program/server built on one of my computers (Machine1) and I am trying to create or overwrite a file through cURL on another computer (Machine2). So Machine2 is connected to Machine1. I've been looking through cURL's documentation for a command that will do this, as well as on Stack Overflow, but have had no luck.
https://curl.haxx.se/docs/manpage.html
I have also tried the examples on this SO post:
HTTP POST and GET using cURL in Linux
Any idea as to what the command might be from the command prompt (the equivalent of a POST command)? So far I have tried using -O, -K, -C and a multitude of others, which have not worked.
In the command line, all you need to do is use curl --form to simulate a multipart/form-data POST request:
curl --form "testfile=@thefilename.jpg" http://<Machine2>/<Path>
testfile is the field name used for the form; if you don't care, just use any English word.
@ is used here to make the file thefilename.jpg get attached to the POST as a file upload. Refer to the curl man page.
On the server side, something must be listening at the URL http://<Machine2>/<Path>. When curl sends the POST request above, the server-side program should receive it, extract the attached file (thefilename.jpg), and save it to disk.
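For reference, the multipart/form-data request that curl builds for the command above looks roughly like this (the boundary string varies per invocation, and the header values shown are placeholders); this is the body the server-side program has to parse to extract thefilename.jpg:

```http
POST /<Path> HTTP/1.1
Host: <Machine2>
Content-Length: <body length>
Content-Type: multipart/form-data; boundary=------------------------abc123

--------------------------abc123
Content-Disposition: form-data; name="testfile"; filename="thefilename.jpg"
Content-Type: image/jpeg

<raw bytes of thefilename.jpg>
--------------------------abc123--
```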

How do I make a GET request to download a file using phphttpclient

Basically I would like to see an example on how to download a file.
I can't seem to find on phphttpclient website any doc or example about it.
In curl, the request would look like this:
curl -o hello.txt -H "X-Auth-Token: xxxxxxxxxxxxxxxxxxx" http://site_where_to_download/hello.txt
Thanks.

post data in meteorjs when using curl

I have the following method to post data to the server:
curl --ipv4 http://localhost:3000/api/tests/1 -d @test.csv
I am trying to post a file with curl to a Meteor app.
In Meteor I am not able to read the data, because I can't attach a key to the curl data option; the data arrives as the key itself.
Example:
contents of test.csv => 1,1,1
at the server,
console.log('route to host', this.request.body); yields { '1,1,1' : '' }
And yes, I even tried -F data=@test.csv, with no success as well.
How can I add a key and make the contents of the file the value when posting through curl?
Basically, -d for curl means: read the file and use its contents as the data.
If you start the data with the letter @, the rest should be a file name to read the data from, or - if you want curl to read the data from stdin. Multiple files can also be specified. Posting data from a file named 'foobar' would thus be done with --data @foobar. When --data is told to read from a file like that, carriage returns and newlines will be stripped out. If you don't want the @ character to have a special interpretation use --data-raw instead.
In order to send the file itself, you'll need something like -F:
(HTTP) This lets curl emulate a filled-in form in which a user has pressed the submit button. This causes curl to POST data using the Content-Type multipart/form-data according to RFC 2388. This enables uploading of binary files etc. To force the 'content' part to be a file, prefix the file name with an @ sign. To just get the content part from a file, prefix the file name with the symbol <. The difference between @ and < is then that @ makes a file get attached in the post as a file upload, while the < makes a text field and just get the contents for that text field from a file.
Example, to send your password file to the server, where 'password' is
the name of the form-field to which /etc/passwd will be the input:
curl -F password=@/etc/passwd www.mypasswords.com
In your case, probably use -F:
curl --ipv4 http://localhost:3000/api/tests/1 -F "data=<test.csv"
If you want the file to be uploaded as a file, use -F data=@test.csv
This works!!
curl --ipv4 --data-urlencode "csv@test.csv" http://localhost:3000/api/tests/1
Hope this helps someone :)
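The newline-stripping behavior of -d quoted above is exactly why CSV rows can arrive fused together. You can reproduce what -d @test.csv does to the payload offline with tr (a sketch; the two-row file contents below are made up, and no server is needed for this):

```shell
printf '1,1,1\n2,2,2\n' > test.csv
# -d @test.csv strips carriage returns and newlines before sending:
tr -d '\r\n' < test.csv
# prints: 1,1,12,2,2
# --data-binary @test.csv would send the file verbatim, newlines intact.
```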

Downloading all files of a particular type from a website using wget stops at the starting URL

The following did not work.
wget -r -A .pdf home_page_url
It stops with the following message:
....
Removing site.com/index.html.tmp since it should be rejected.
FINISHED
I don't know why it stops at the starting URL and does not follow the links in it to search for the given file type.
Is there any other way to recursively download all PDF files from a website?
It may be based on a robots.txt. Try adding -e robots=off.
Other possible problems are cookie based authentication or agent rejection for wget.
See these examples.
EDIT: The dot in ".pdf" is wrong according to sunsite.univie.ac.at
The following command works for me; it downloads the PDFs and pictures of a site:
wget -A pdf,jpg,png -m -p -E -k -K -np http://site/path/
This is most likely because the links in the HTML don't end with /.
Wget will not follow a link written as page (no trailing slash), since it treats it as a file, and that file doesn't match your filter; but it will follow the same link written as page/.
You can use the --debug option to see if it's the actual problem.
I don't know any good solution for this. In my opinion this is a bug.
In my version of wget (GNU Wget 1.21.3), the -A/--accept and -r/--recursive flags don't play nicely with each other.
Here's my script for scraping a domain for PDFs (or any other filetype):
wget --no-verbose --mirror --spider https://example.com -o - | while read line
do
[[ $line == *'200 OK' ]] || continue
[[ $line == *'.pdf'* ]] || continue
echo "$line" | cut -c25- | rev | cut -c7- | rev | xargs wget --no-verbose -P scraped-files
done
Explanation: Recursively crawl https://example.com and pipe the log output (containing all scraped URLs) to a while read block. When a line from the log output contains a PDF URL, strip the leading timestamp (25 characters) and trailing request info (7 characters) and use wget to download the PDF.
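The fixed-column cut offsets in the script depend on wget's exact log layout, which can shift between versions. A pattern-based extraction of the PDF URLs is less brittle (a sketch; the sample log line below is made up, and real wget output may differ):

```shell
# A line like wget --no-verbose --spider might log:
line='2023-01-01 12:00:00 URL: https://example.com/docs/paper.pdf 200 OK'
# Pull out the .pdf URL regardless of column positions:
printf '%s\n' "$line" | grep -oE 'https?://[^ ]+\.pdf'
# prints: https://example.com/docs/paper.pdf
```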

Uploading a file to a URL

Can anyone help me find a Unix command that can upload/download a file to/from a URL?
The particular URL I'm trying to upload to/download from is protected with a user id and password.
I guess curl serves this purpose, but I'm not aware of how to use it. Could you please give me suggestions on this?
curl has a command-line argument named -d (for data), and you can use it like this to send a file (you need to add an @ before the file name to have curl treat it as a file and not a value):
curl -X POST -d @myfilename http://example.com/upload
You can add multiple -d arguments if you need to send a form value along with your file, like so:
curl -X POST -d @myfilename -d name=MyFile http://example.com/upload
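For the user-id-and-password protection mentioned in the question, curl's -u flag sends HTTP Basic auth (the credentials and URLs below are placeholders). Under the hood this just base64-encodes user:password into an Authorization header, which you can verify offline with base64:

```shell
# Download from a protected URL:
#   curl -u myuser:mypass -o hello.txt "http://example.com/hello.txt"
# Upload a file with PUT:
#   curl -u myuser:mypass -T myfilename "http://example.com/upload/"

# The Authorization header curl generates is equivalent to:
printf 'Authorization: Basic %s\n' "$(printf '%s' 'myuser:mypass' | base64)"
# prints: Authorization: Basic bXl1c2VyOm15cGFzcw==
```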
