cookies - the misunderstanding - http

My goal is to log into a site with some Java code and do some work while logged in. (In order to write some Java cookie handling I first need to understand how this all actually works.)
The problem is that I can't quite figure out how to manage the cookie session.
Here's what I've observed using Chrome's dev tools:
1) On the first request to the site's URL I send no cookies and get some in the response:
Set-Cookie:PHPSESSID=la2ek8vq9bbu0rjngl2o67aop6; path=/; domain=.mtel.bg; HttpOnly
Set-Cookie:PHPSESSID=d32cj0v5j4mj4nt43jbhb9hbc5; path=/; domain=.mtel.bg; HttpOnly
2) On moving to the login page I now send (on an HTTP GET):
Cookie:PHPSESSID=d32cj0v5j4mj4nt43jbhb9hbc5;
__utma=209782857.1075318979.1349352741.1349352741.1349352741.1;
__utmb=209782857.1.10.1349352741;
__utmc=209782857;
__utmz=209782857.1349352741.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none);
__atuvc=1%7C40
And indeed, when I check my stored cookies in Chrome's cookie tab (after my first GET request for the site's main URL), they exist exactly as they're passed on the next GET/POST request.
Can you explain what's really happening after my first cookies are received, and why such name/value pairs are stored?

There's JavaScript from Google Analytics on the page which sets some cookies (the __utm* values you see).
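Since the stated goal is Java cookie handling, here is a minimal sketch (the URL is just a placeholder) of letting the JDK's built-in CookieManager do what the browser does: store whatever Set-Cookie headers arrive and replay the matching name=value pairs on later requests to the same site.

import java.net.CookieHandler;
import java.net.CookieManager;
import java.net.CookiePolicy;
import java.net.HttpCookie;
import java.net.HttpURLConnection;
import java.net.URL;

public class CookieDemo {
    public static void main(String[] args) throws Exception {
        // Install an in-memory cookie store; from now on HttpURLConnection
        // remembers Set-Cookie responses and adds a Cookie header to
        // later requests that match the stored domain/path.
        CookieManager manager = new CookieManager(null, CookiePolicy.ACCEPT_ALL);
        CookieHandler.setDefault(manager);

        URL site = new URL("https://www.example.com/"); // placeholder URL

        // 1) First request: we send no cookies, the server answers with Set-Cookie.
        HttpURLConnection first = (HttpURLConnection) site.openConnection();
        first.getInputStream().close();

        // These stored pairs are exactly what the next request will carry.
        for (HttpCookie c : manager.getCookieStore().getCookies()) {
            System.out.println(c.getName() + "=" + c.getValue());
        }

        // 2) Second request: the Cookie header is added automatically.
        HttpURLConnection second = (HttpURLConnection) site.openConnection();
        second.getInputStream().close();
    }
}

Note that a plain HTTP client like this only ever sees the cookies the server itself sets (PHPSESSID here); the __utm* values come from the Analytics JavaScript running in the browser, so they won't appear unless you execute that script.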

Why can't I see cookie parameters after I log in?

I'm working on making a Symfony website secure. I have taken a look at this page: How to set secure and httponly attributes on Symfony 4 session
and applied the suggested change to framework.yaml:
session:
    handler_id: ~
    cookie_secure: true
    cookie_httponly: true
When I log in, I see three instances of Set-Cookie in the network tab. The first is a cookie removal, with the secure and HttpOnly attributes. The second is a cookie creation, where the cookie identifier has the secure and HttpOnly attributes. The third sets some parameters in an HTTP-encoded manner, and this one also has the secure and HttpOnly attributes. So far so good. However, when I go to any page, I have a Cookie header among the Request Headers which carries the same identifier as the one created earlier, but the secure and HttpOnly attributes are not specified.
So, when I log in and the cookie is created I see the attributes I expect, but later, when visiting other pages, I no longer see them. Why are the secure and HttpOnly attributes not specified on the later, after-login Request Headers? Did I miss something?
The security attributes are set by the server in the Response headers, and the browser uses them to decide whether it should send the cookie along with a Request, but it never sends the attributes themselves, just the cookie name and value. For example, if you inspect a plain-HTTP (insecure) request, a Secure cookie should not appear in the Cookie header at all.
You can see some examples in RFC 6265.
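In other words, the asymmetry you are seeing is expected. With a hypothetical session cookie, the server states the attributes in the response:
Set-Cookie: SESSID=abc123; path=/; secure; HttpOnly
and on every later request the browser echoes back only the name and value:
Cookie: SESSID=abc123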

Are my cookies really HTTP only? Flag is absent in Cookie request header

I've made some configuration changes to (finally) have my cookies set as HttpOnly.
They "seem" to work.
I've tried them with Postman and I see the following:
When I hit the login page:
In the cookies section, my cookie with the name JSESSIONID appears to be HttpOnly (it has the check).
When I enter the logged-in area, the same result...
The headers don't give me more details.
Then,
I check it with Google Chrome. I open the developer tools.
I load the login page.
Among the response headers I get
Set-Cookie: JSESSIONID=434434..... HttpOnly
So, it's fine (I guess).
Then I reload the page (or sign in).
Then the problem:
No response headers received.
The Request Headers bring my cookie (with the same ID as the previous one) without the HttpOnly flag, host info or any other cookie attribute I set before.
In the Cookies tab I get Request Cookies only and no Response Cookies.
And the request cookie is shown as non-HttpOnly.
In my Resources tab, the cookie is there, as HttpOnly and with the previous values I set.
My question now is... Is it really an HttpOnly cookie? Or is my configuration not set properly?
Should I always get the response cookie? Should the request cookie always be HttpOnly (given that I'm trying to set it as HttpOnly)? Or is this behavior normal (or at least accepted)?
When I try to print my cookie with JavaScript in both scenarios I get null (which makes me think it is correct).
Ideas?
The client doesn't send cookie attributes other than the name and value back to the server.
See RFC 6265 section 4.2.2.
4.2.2. Semantics
Each cookie-pair represents a cookie stored by the user agent. The
cookie-pair contains the cookie-name and cookie-value the user agent
received in the Set-Cookie header.
Notice that the cookie attributes are not returned. In particular,
the server cannot determine from the Cookie header alone when a
cookie will expire, for which hosts the cookie is valid, for which
paths the cookie is valid, or whether the cookie was set with the
Secure or HttpOnly attributes.
Everything's behaving as specified.
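For reference, since JSESSIONID suggests a Java servlet container (the question doesn't show the actual setup), a minimal Servlet 3.0+ sketch for forcing the session cookie to be HttpOnly and Secure could look like this; the class name is made up:

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.SessionCookieConfig;
import javax.servlet.annotation.WebListener;

@WebListener
public class SessionCookieInit implements ServletContextListener {
    @Override
    public void contextInitialized(ServletContextEvent sce) {
        SessionCookieConfig cfg = sce.getServletContext().getSessionCookieConfig();
        cfg.setHttpOnly(true); // JSESSIONID is hidden from document.cookie
        cfg.setSecure(true);   // JSESSIONID is only sent over HTTPS
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        // nothing to clean up
    }
}

Whatever the configuration, the flags are only ever visible in the Set-Cookie response header; the Cookie request header always shows just JSESSIONID=<value>, and document.cookie returning nothing is indeed the sign that HttpOnly is in effect.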

FormsAuthentication cookie not sent by browser when clicking external links

We've noticed that some users of our website have a problem: if they follow links to the website from an external source (specifically Outlook and MS Word), they arrive at the website in such a way that User.IsAuthenticated is false, even though they are still logged in in other tabs.
After hours of diagnosis, it appears to be because the FormsAuthentication cookie is sometimes not sent when the external link is clicked. If we examine the traffic in Fiddler, we see different headers for links clicked within the website versus the headers resulting from clicking a link in a Word document or email. There doesn't appear to be anything wrong with the cookie (it has "/" as its path, no domain, and a future expiration date).
Here is the cookie being set:
Set-Cookie: DRYXADMINAUTH2014=<hexdata>; expires=Wed, 01-Jul-2015 23:30:37 GMT; path=/
Here is a request sent from an internal link:
GET http://domain.com/searchresults/media/?sk=creative HTTP/1.1
Host: domain.com
Cookie: Diary_SessionID=r4krwqqhaoqvt1q0vcdzj5md; DRYXADMINAUTH2014=<hexdata>;
Here is a request sent from an external (Word) link:
GET http://domain.com/searchresults/media/?sk=creative HTTP/1.1
Host: domain.com
Cookie: Diary_SessionID=cpnriieepi4rzdbjtenfpvdb
Note that the .NET FormsAuthentication token is missing from the second request. The problem doesn't seem to be affected by which browser is set as default and happens in both Chrome and Firefox.
Is this normal/expected behaviour, or is there a way we can fix this?
Turns out this is a known issue with Microsoft Word, Outlook and other MS Office products. <sigh>
See: Why are cookies unrecognized when a link is clicked from an external source (i.e. Excel, Word, etc...)
Summary: Word tries to open the URL itself (in case it's an Office document) but gets redirected, as it doesn't have the authentication cookie. Due to a bug in Word, it then incorrectly tries to open the redirected URL in the OS's default browser instead of the original URL. If you monitor the "process" column in Fiddler, it's easy to see the exact behaviour from the linked article occurring.

Routes using cookie authentication from previous version of Paw no longer work on new version

I have a route that uses an authentication cookie set by another route. I've created it like this:
This method no longer works in the new version. Paw complains that there is no set-cookie header in the response from the Authenticate request.
It seems this is because Paw now takes the cookies and handles them differently from other headers. I like this approach because it should make this sort of authentication easier, but, unfortunately, it's not working like I would expect.
Here's how I have configured a newer request:
So, I've set the Cookie header to the Response Cookies dynamic value which, I believe, should pass along the cookies set previously. I would think I should select the Authenticate request from the dropdown (since it's the response from this request that actually sets the cookie), but the cookie value disappears if I do that. Instead, I have left the request value as Current Request, since that seems to contain the correct value.
I've also noticed the Automatically send cookies setting which I thought might be an easy solution. I removed the manual cookie header from my request leaving this checked in hopes it might automatically send over any cookies from the cookie jar along with the request, but that doesn't seem to work either. No matter what I try, my request fails to produce the desired results because of authentication.
Can you help me understand how to configure these requests so that I can continue using Paw to test session-authenticated routes?
Here are a couple of things that should help you understand how cookies work in Paw (starting from version 2.1):
1. Cookies are stored in Jars
To allow users to keep multiple simultaneous sessions, cookies are stored in jars, so you can switch between sessions (jars) easily.
Cookies stored in jars will be sent only if they match the request (hostname, path, is secure, etc.).
2. Cookies from jars are sent by default, unless Cookie header is overridden
If you set a Cookie header manually, cookies stored in jars won't be sent.
They also won't be sent, obviously, if Automatically Send Cookies is disabled.
3. The previous use of "Response Headers" was hacky. Use Response Cookies.
In fact, Set-Cookie (for responses) and Cookie (for requests) have different syntaxes. So you can't send back the original value of Set-Cookie (even though it seemed to be working in most cases).
The new Response Cookies dynamic value you mentioned has this purpose: send back the cookies set by a specific request.
Now, in your case, I would use a Response Cookies dynamic value all the time. As you have only 1 request doing auth / cookie setting, it might be the easiest to handle. Also, maybe check Ignore Domain, Path, Is Secure and Date to make sure your cookie is always sent even if you switch the host (or something else).
Deleting the cookies in the cookie jar and hitting the authentication endpoint again seems to have fixed the problem. Not sure why, but it now works whether I manually send a cookie or use the Automatically send cookies setting.

Are JSON web services vulnerable to CSRF attacks?

I am building a web service that exclusively uses JSON for its request and response content (i.e., no form encoded payloads).
Is a web service vulnerable to CSRF attack if the following are true?
Any POST request without a top-level JSON object, e.g., {"foo":"bar"}, will be rejected with a 400. For example, a POST request with the content 42 would be thus rejected.
Any POST request with a content-type other than application/json will be rejected with a 400. For example, a POST request with content-type application/x-www-form-urlencoded would be thus rejected.
All GET requests will be Safe, and thus not modify any server-side data.
Clients are authenticated via a session cookie, which the web service gives them after they provide a correct username/password pair via a POST with JSON data, e.g. {"username":"user@example.com", "password":"my password"}.
Ancillary question: Are PUT and DELETE requests ever vulnerable to CSRF? I ask because it seems that most (all?) browsers disallow these methods in HTML forms.
EDIT: Added item #4.
EDIT: Lots of good comments and answers so far, but no one has offered a specific CSRF attack to which this web service is vulnerable.
Forging arbitrary CSRF requests with arbitrary media types is effectively only possible with XHR, because a form’s method is limited to GET and POST and a form’s POST message body is also limited to the three formats application/x-www-form-urlencoded, multipart/form-data, and text/plain. However, with the form data encoding text/plain it is still possible to forge requests containing valid JSON data.
So the only threat comes from XHR-based CSRF attacks. And those will only succeed if they come from the same origin, so basically from your own site somehow (e.g. XSS). Be careful not to mistake disabling CORS (i.e. not setting Access-Control-Allow-Origin: *) for protection. CORS simply prevents clients from reading the response; the whole request is still sent and processed by the server.
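As a concrete illustration of check #2 from the question (just a sketch, assuming a Java servlet stack; the filter class is made up), a server-side Content-Type gate rejects the text/plain form trick described above, leaving only the preflight-protected XHR route:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class JsonContentTypeFilter implements Filter {
    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;

        String method = request.getMethod();
        boolean hasBody = "POST".equals(method) || "PUT".equals(method) || "DELETE".equals(method);
        String contentType = request.getContentType();

        // An HTML form can only submit urlencoded, multipart or text/plain bodies,
        // so requiring application/json rejects form-based forgeries with a 400.
        if (hasBody && (contentType == null
                || !contentType.toLowerCase().startsWith("application/json"))) {
            response.sendError(HttpServletResponse.SC_BAD_REQUEST, "Expected application/json");
            return;
        }
        chain.doFilter(req, res);
    }

    @Override
    public void destroy() { }
}

A cross-origin XHR that does declare application/json triggers a CORS preflight, which the browser will not complete unless the server explicitly allows that origin.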
Yes, it is possible. You can set up an attacker server which sends the victim's machine a 307 redirect to the target server. You need to use Flash to send the POST instead of using a form.
Reference: https://bugzilla.mozilla.org/show_bug.cgi?id=1436241
It also works on Chrome.
It is possible to do CSRF on JSON-based RESTful services using Ajax. I tested this on an application (using both Chrome and Firefox).
You have to change the contentType to text/plain and the dataType to JSON in order to avoid a preflight request. Then you can send the request, but in order to send session data, you need to set the withCredentials flag in your Ajax request.
I discuss this in more detail here (references are included):
http://wsecblog.blogspot.be/2016/03/csrf-with-json-post-via-ajax.html
I have some doubts concerning point 3. Although it can be considered safe, as it does not alter data on the server side, the data can still be read, and the risk is that it can be stolen.
http://haacked.com/archive/2008/11/20/anatomy-of-a-subtle-json-vulnerability.aspx/
Is a web service vulnerable to CSRF attack if the following are true?
Yes. It's still HTTP.
Are PUT and DELETE requests ever vulnerable to CSRF?
Yes
it seems that most (all?) browsers disallow these methods in HTML forms
Do you think that a browser is the only way to make an HTTP request?
