How to make curl accept secure cookies over an HTTP connection

I am using curl to connect to an HTTP server which sends back a secure-flagged cookie, and I found out that curl doesn't handle such cookies (secure cookies received over an HTTP connection); in other words, even with the -c cookieFile switch, such cookies are not saved.
A workaround is to use the -D switch to save all headers, then manually (externally to curl) read the cookie from the file and set it in the curl command to send it back to the server.
I want to know if there is a possibility (maybe I am missing some curl option) to make curl support such cookies? I looked into the curl manual but found nothing useful for my use case.
Thanks in advance,

TL;DR: With recent versions of cURL it is no longer possible to save cookies carrying the Secure attribute that were received over a plain HTTP connection, even with the cookie-related switches.
According to the documentation, cURL removed the ability to save such cookies in order to satisfy the RFC draft draft-ietf-httpbis-cookie-alone-01. This draft mandates that Secure cookies are only supposed to be handled, saved or overwritten by an HTTP client if said cookie was transferred over HTTPS.
I just stumbled over exactly the same problem, so I can offer two alternatives:
use a cURL version from before the restriction was implemented:
cURL < v7.46.0
see the respective GitHub issue that led to this behaviour
dump the headers manually with curl -i or curl -D and extract the cookies
example that extracts all cookies (Secure ones included) and saves them to a file cookies.txt:
curl -i http://server.com | grep "Set-Cookie: " | sed 's/Set-Cookie: //g' > cookies.txt
Now, a cookie jar would be useless if you did not use the cookies in it. Especially with the second alternative, it may be necessary to remove the Secure attribute in order to make cURL send the saved cookies back to the web server, as sketched below.
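For example, a minimal sketch of that second alternative (server.com and the paths are placeholders; the sed pattern merely strips a trailing "; Secure" attribute):
# 1) Dump the Set-Cookie headers and drop the Secure attribute so curl is willing
#    to send the cookies back over plain HTTP.
curl -si http://server.com | grep -i "^Set-Cookie:" | sed 's/; *[Ss]ecure//g' > cookies.txt
# 2) Feed them back in; -b also accepts a file of Set-Cookie style header lines.
curl -b cookies.txt http://server.com/some/protected/page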

Related

Curl redirect without sending the first POST

I'm using "curl -L --post302 -request PUT --data-binary #file " to post a file to a redirected address. At the moment the redirection is not optional since it will allow for signed headers and a new destination. The GET version works well. The PUT version under a certain file size threshold works also. I need a way for the PUT to allow itself to be redirected without sending the file on the first request (to the redirectorURL) and then only send the file when the POST is redirected to a new URL. In other words, I don't want to transfer the same file twice. Is this possible? According to the RFC (https://www.rfc-editor.org/rfc/rfc2616#section-8.2) it appears that a server may send a 100 "with an undeclared wait for 100 (Continue) status, applies only to HTTP/1.1 requests without the client asking to send its payload" so what I'm asking for may be thwarted by the server. Is there a way around this with one curl call? If not, two curl calls?
Try curl -L -T file $URL as the more "proper" way to PUT that file. (Often repeated by me: -X and --request should be avoided if possible, they cause misery.)
curl will use "Expect: 100" by itself in this case, but you'll also probably learn that servers widely don't care about supporting that anyway so it'll most likely still end up having to PUT twice...

Request parameters are not passed to the backend service

I configured a REST web service (a Spring Boot web application) on WSO2 AM and used the default /* mapping for resources. My web service takes an assignee (text) parameter and a file parameter.
When I perform the calls, I've noticed that request parameters are not forwarded to the backend service (HTTP headers are). For example:
curl -i -X POST -H "Content-Type: multipart/form-data" -H "X-PD20-BillingSubscriptionId: e87d4400-b05f-4f40-9c39-06ae4d28cf4d" -H "Authorization: Bearer rrxRV5F6jdkSBcEPXv7I1yFl2x8a" -F "documentFile=@src/test/resources/sample-files/test-fea-1firma.pdf" -F "assignee=bla.bla@gmail.com" http://api.linksmt.it:8280/fea/1.0.0/signRequest
As you can see, it's a form that posts two fields, one of them being a file and the other a simple text field.
The call is successfully forwarded to the backend service but without the actual field values (the headers, instead, are correctly passed, though their keys are lower-cased, i.e. "X-PD20-BillingSubscriptionId" arrives as "x-pd20-billingsubscriptionid").
Any hint on why is this happening?
Thanks
Ok, the problem was the same as described in "multipart form data file upload using WSO2 API manager?" and I had to uncomment the multipart/form-data message builder and formatter declarations within the $WSO2_AM/repository/conf/axis2/axis2.xml file (and restart the server).
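If it helps to locate those entries first, a quick check along these lines should work (assuming the default directory layout):
grep -n "multipart/form-data" "$WSO2_AM/repository/conf/axis2/axis2.xml"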

How to convert curl request to browser URL

Is it possible to generate an equivalent browser URL from an arbitrary curl request?
Example:
If I am executing the following:
curl -v -X GET -H "Host:something.com" "http://foo.com/some?appAction=xux&a=1&b=2"
What would the corresponding browser URL be (one which I can hit directly in the browser)?
If by "equivalent" you mean "Using a host header claiming a different host than mentioned in the request URI", then the answer is: no, you can't. The browser will pull the host header from the entered URI.
You may be able to rewrite the headers using browser plugins.
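To illustrate the difference, a hedged comparison (the grep merely pulls the Host request header out of curl's verbose output):
# The original request: curl connects to foo.com but claims to be talking to something.com
curl -sv -o /dev/null -H "Host: something.com" "http://foo.com/some?appAction=xux&a=1&b=2" 2>&1 | grep "^> Host:"
# What a browser would send if you typed the same URL: Host is taken from the URL itself
curl -sv -o /dev/null "http://foo.com/some?appAction=xux&a=1&b=2" 2>&1 | grep "^> Host:"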

What tool should I use to fetch HTTP header of a remote web server?

I am basically looking for a similar but simpler tool than cURL that fetches the HTTP headers without the body. I am not interested in downloading the body. I noticed cURL seems to download the body, which consumes unnecessary bandwidth for my needs.
Use the -I flag to make curl issue a HEAD request, i.e. fetch just the headers.
(A HEAD response is not guaranteed to carry exactly the same headers as a GET would, but it is supposed to.)
If you are using the libcurl library, the curl_easy_setopt() function has a CURLOPT_NOBODY option, which causes libcurl to send a HEAD request that downloads just the headers instead of a GET request that downloads the entire body.
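For the command-line case, a quick example (example.com is a placeholder host):
# -I asks for the headers only (a HEAD request); -s hides the progress meter
curl -sI https://example.com/
# pick out a single header, e.g. Content-Type
curl -sI https://example.com/ | grep -i "^Content-Type:"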

How to perform an action when a remote (HTTP) file changed?

I want to create a script that checks a URL and performs an action (download + unzip) when the "Last-Modified" header of the remote file has changed. I thought about fetching the header with curl, but then I would have to store it somewhere for each file and perform a date comparison.
Does anyone have a different idea using (mostly) standard unix tools?
thanks
A possible solution would be periodically running this algorithm on the client box.
Create an HTTP request with the If-Modified-Since header set to the modification date of your local file. If the file does not exist yet, do not include this header;
The server will either send you the file, if it has changed since the date given in If-Modified-Since, or respond with a 304 Not Modified HTTP status.
If you receive a 200 OK HTTP status, simply take the payload from the HTTP body and unzip the file.
If, on the other hand, you receive a 304 Not Modified, you know that your file is up to date.
Use the Last-Modified header to touch your local file. This way you will stay in sync with the server's datetime.
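A condensed sketch of those steps with curl and standard tools (the URL, file names, and GNU date usage are assumptions, not part of the original answer):
#!/bin/bash
# Conditional download of a zip file; remote.server.example.com and data.zip are placeholders.
URL="http://remote.server.example.com/data.zip"
LOCAL="data.zip"

if [ -f "$LOCAL" ]; then
    # Build an If-Modified-Since header from the local file's mtime (GNU date: -r FILE)
    IMS="If-Modified-Since: $(LC_ALL=C date -u -r "$LOCAL" '+%a, %d %b %Y %H:%M:%S GMT')"
    STATUS=$(curl -s -R -o "$LOCAL.tmp" -w '%{http_code}' -H "$IMS" "$URL")
else
    STATUS=$(curl -s -R -o "$LOCAL.tmp" -w '%{http_code}' "$URL")
fi

case "$STATUS" in
    200)                               # changed (or first download): keep it and unzip it
        mv "$LOCAL.tmp" "$LOCAL"       # -R above already stamped it with the server's Last-Modified
        unzip -o "$LOCAL"
        ;;
    304)                               # not modified: nothing to do
        rm -f "$LOCAL.tmp"
        ;;
    *)
        rm -f "$LOCAL.tmp"
        echo "unexpected HTTP status: $STATUS" >&2
        ;;
esac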
Another way would be for the server to push notifications (a broadcast packet, for example) when the file changes. When the notification is received, the client would then execute the above algorithm. This would imply code living in the HTTP server that listens for file system changes and then broadcasts them to interested parties.
Perhaps this excerpt from the curl documentation on time conditions is of some relevance:
TIME CONDITIONS
HTTP allows a client to specify a time condition for the document it requests. It is If-Modified-Since or If-Unmodified-Since. Curl allows you to specify them with the -z/--time-cond flag.
For example, you can easily make a download that only gets performed if the remote file is newer than a local copy. It would be made like:
curl -z local.html http://remote.server.com/remote.html
Or you can download a file only if the local file is newer than the remote one. Do this by prepending the date string with a '-', as in:
curl -z -local.html http://remote.server.com/remote.html
You can specify a "free text" date as condition. Tell curl to only download the file if it was updated since yesterday:
curl -z yesterday http://remote.server.com/remote.html
Curl will then accept a wide range of date formats. You always make the date check the other way around by prepending it with a dash '-'.
To sum up, you will need:
curl command
touch command
some bash scripting
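Tying those three together, one possible sketch (placeholders throughout; here curl's -R option takes care of what a manual touch would otherwise be needed for):
#!/bin/bash
# Periodic check: download and unzip only when the remote file changed.
URL="http://remote.server.example.com/remote.zip"
LOCAL="remote.zip"

# -z "$LOCAL": only download if the remote copy is newer than the local file's mtime
#              (on the very first run, when the file does not exist, curl should warn
#              and simply fetch unconditionally)
# -R         : stamp the download with the server's Last-Modified time, so the next
#              -z comparison is made against the server's own datetime
curl -s -R -o "$LOCAL.new" -z "$LOCAL" "$URL"

if [ -s "$LOCAL.new" ]; then     # non-empty download => the remote file changed
    mv "$LOCAL.new" "$LOCAL"
    unzip -o "$LOCAL"            # the "perform an action" part
else
    rm -f "$LOCAL.new"           # nothing newer on the server; keep the existing copy
fi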
Is Java applicable in your case? I did a similar thing in one of my homework assignments using the Apache HttpCore library: you add the "If-Modified-Since" header to your HTTP request before sending it to the server, and if the status code of the response you receive is not 304, you know the file has changed since the time value you are checking against.
