I am using OWASP's ZAP tool for vulnerability scanning, and it reports an alert for the "secure page browser cache" vulnerability. Below are the details of the ZAP alert:
Risk: Medium
Reliability: Warning
Description: Secure page can be cached in browser. Cache control is not set in HTTP header nor HTML header. Sensitive content can be recovered from browser storage.
Solution: The best way is to set the HTTP headers 'Pragma: no-cache' and 'Cache-Control: no-cache'.
Alternatively, this can be set in the HTML head with <meta http-equiv="Pragma" content="no-cache"> and <meta http-equiv="Cache-Control" content="no-cache">,
but some browsers may have problems with this method.
Can you please tell me how this vulnerability will affect my application if it's not fixed, and how an attacker could use it to attack the application?
The problem is that information that should be kept private can be viewed by anyone with access to the files in the browser's cache directory.
This is a problem particularly with shared computers. If caching is not set properly, then anyone using the shared computer can view the private web pages after the original user has logged off the site which is hosting the secure material.
This can also be a problem if the computer has malware which can read files. The malware can gather information from the browser cache and transfer it off the computer.
Your application will not malfunction if the cache headers are not set properly. However, you might expose your users to the consequences of their private information being misused.
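For completeness, here is a minimal sketch of sending the headers the ZAP alert recommends, assuming a plain PHP page (if you use a framework, set the equivalent headers through its response API instead):

```php
<?php
// Minimal sketch: emit the no-cache headers before any page output.
header('Cache-Control: no-cache, no-store, must-revalidate'); // HTTP/1.1 caches
header('Pragma: no-cache');                                   // legacy HTTP/1.0 caches
header('Expires: 0');                                         // an already-expired date also works

// ... render the sensitive page below ...
```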
I inherited a complex AWS system from someone and I have practically no AWS experience. I'm reading documentation and doing training, but there's one thing I can't figure out: when someone hits a page served by CloudFront, are they able to make changes that affect the origin server?
I would have thought "no, they're just static pages", but I'm seeing evidence to the contrary. We have some WordPress installs, and I think the users are hitting CloudFront when they log in to the admin panel remotely, yet they're still able to make changes and publish content. I also at one point cached admin-ajax.php without allowing OPTIONS, PUT, PATCH, POST, and DELETE requests, thinking it wouldn't matter because our front-end site doesn't use AJAX. This broke the admin panel, which requires AJAX, even when logged in directly to the origin server and bypassing CloudFront.
The cached page would render in the browser, but when the user "makes changes" the browser sends a separate HTTP GET or POST request from the one that requested the page. That "changes" request will not be served from CloudFront's cache; it will be forwarded to the origin server.
Note that you will need to configure CloudFront appropriately to respect the server's cache headers, treat query parameters as part of the cache key, and so on, in order for this to work properly. You can also configure CloudFront behaviors for a specific path, like /admin/*, to prevent caching of the pages at that path, among other things.
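The origin can also mark its private pages as uncacheable, so that a distribution configured to respect origin headers never stores them. A minimal sketch in PHP, assuming WordPress and that this runs early in the request (for example from a small mu-plugin; the exact hook point is an assumption):

```php
<?php
// Sketch (assumption: CloudFront is set to respect the origin's cache headers).
// Mark wp-admin and admin-ajax.php responses as uncacheable so the CDN and
// browsers never store them.
if (is_admin() || (defined('DOING_AJAX') && DOING_AJAX)) {
    header('Cache-Control: private, no-cache, no-store, must-revalidate');
    header('Pragma: no-cache');
}
```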
I was using Fiddler to see, in the field, how web sites use cookies in their login systems. Although I have some HTTP knowledge, I'm just learning about cookies and how they are used within sites.
Initially I assumed that when submitting the form I'd see no cookies sent, and that the response would contain some cookie info that would then be saved by the browser.
In fact, just the opposite seems to be the case. It is the request that's sending in info, and the server returns nothing.
While fiddling with the issue, I noticed that even with a browser cleared of cookies, the client always seems to send a RequestVerificationToken to the server, even when just looking around without being signed in.
Why is this so?
Thanks
Cookies are set by the server with the Set-Cookie HTTP response header, and they can also be set through JavaScript.
A cookie has a path. If the path of a cookie matches the path of the document that is being requested, then the browser will include all such cookies in the Cookie HTTP request header.
You must be careful when setting or modifying cookies in order to avoid cross-site request forgery (CSRF) attacks against your users. For that reason, it can be useful to include a hidden, unique secret within your login forms and verify that secret before setting any cookies. Alternatively, you can simply check that the HTTP Referer header matches your site. Otherwise, a malicious site can copy your form fields, create a login form on their site that posts to your site, and call form.submit(), effectively logging out your user or performing a brute-force attack on your site through unsuspecting users who happen to be visiting the malicious web site.
The RequestVerificationToken that you mention has nothing to do with HTTP cookies as such; it sounds like an implementation detail that some server-side frameworks use to protect their form-handling pages against exactly these CSRF attacks.
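As a rough illustration of the "hidden, unique secret" idea, here is a sketch in plain PHP; the token and field names are made up for illustration:

```php
<?php
// Sketch of an anti-CSRF token for a login form.
// The token is stored server-side in the session and echoed into the form;
// the POST handler rejects requests whose token does not match.
session_start();

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    if (!hash_equals($_SESSION['csrf_token'] ?? '', $_POST['csrf_token'] ?? '')) {
        http_response_code(403);
        exit('Invalid request token');
    }
    // ... check username/password, then issue the session/auth cookie ...
}

// Issue a fresh token for the form we are about to render.
$_SESSION['csrf_token'] = bin2hex(random_bytes(32));
?>
<form method="post" action="login.php">
    <input type="hidden" name="csrf_token" value="<?= htmlspecialchars($_SESSION['csrf_token']) ?>">
    <input type="text" name="username">
    <input type="password" name="password">
    <button type="submit">Log in</button>
</form>
```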
When you hit a page on a website, the response (the page you landed on) often contains instructions from the server, in the HTTP response headers, to set some cookies.
Websites may use these to track information about your behavior, or to save your preferences for the short or long term.
A website may do so on your first visit to any page, or when you visit a particular page.
The browser then sends all cookies that have been set along with subsequent requests to that domain.
Think about it: HTTP is stateless. You landed on the home page and clicked to set your background to blue. Then you went to a gallery page. The next request goes to the server, but the server has no idea about your background color preference.
Now, if the request contained a cookie telling the server about your preference, the website could serve the right preference back to you.
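As a tiny illustration (a PHP sketch; the cookie name 'background' is made up):

```php
<?php
// When the user clicks "set my background to blue", remember it for 30 days.
setcookie('background', 'blue', time() + 30 * 24 * 3600, '/');

// On the gallery page (a later request), the browser sends the cookie back
// and the server can read the preference.
$background = $_COOKIE['background'] ?? 'white';
echo "<body style=\"background: " . htmlspecialchars($background) . "\">";
```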
Now, this is one way. Another way is a session. Think of cookies as information stored on the client side. But what if the server needs to store some temporary information about you on the server side, information that is perhaps too sensitive to be exposed in cookies, which are stored locally and easily intercepted?
Now you might ask: but HTTP is stateless. Correct. However, the server can keep information about you in a map whose key is the session ID. This session ID is set on the client side as a cookie, or resent with every request as a parameter. The server then only receives the key, but can look up information about you, such as whether you are logged in successfully, what your role in the system is, and so on.
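A matching sketch of the session side, again in PHP, where only the opaque session ID travels in the cookie (the username and role values are illustrative, not from the question):

```php
<?php
// The browser only carries the opaque session ID (PHP's PHPSESSID cookie);
// the sensitive details live server-side in $_SESSION.
session_start();

// After a successful login (credentials checked elsewhere):
$_SESSION['user'] = 'alice';
$_SESSION['role'] = 'editor';

// On any later request, the session ID cookie lets the server look this up:
if (isset($_SESSION['user'])) {
    echo 'Logged in as ' . htmlspecialchars($_SESSION['user'])
       . ' (role: ' . htmlspecialchars($_SESSION['role']) . ')';
}
```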
Wow, that's a lot of text, but I hope it helped. If not, feel free to ask more.
Does a Silverlight video player share the HTTP connection with its host?
Here is the scenario: a web site is password protected. The web page contains a Silverlight control with a video player. The video player plays a video from the same web site. Will the credentials from the web browser be used by the video player? I use MediaElement.Source to specify where the video is coming from.
If not, how can I fix this?
That depends on the way it communicates with the server... for example, the WebRequest class can be set to use BrowserHTTP or ClientHTTP...
BrowserHTTP uses the browser's HTTP implementation, including Referer, cookies, etc.
ClientHTTP allows you to manage HTTP settings like cookies manually...
Using MediaElement.SetSource you can use whatever connection you please (BrowserHTTP / ClientHTTP) with your specific settings (including the Authorization header, cookies, etc.), as long as that connection provides a Stream for the content...
For further details, see:
http://msdn.microsoft.com/en-us/library/system.net.browser.webrequestcreator.browserhttp%28v=vs.95%29.aspx
http://msdn.microsoft.com/en-us/library/system.net.browser.webrequestcreator.clienthttp%28v=vs.95%29.aspx
http://msdn.microsoft.com/en-us/library/cc838250%28v=vs.95%29.aspx
http://msdn.microsoft.com/en-us/library/cc190669%28v=vs.95%29.aspx
I would like to create a web application with the admin/checkout sections secured. Assuming I have SSL set up for subdomain.mydomain.com, I would like to make sure that all the top-secret stuff ;) like checkout pages and the admin section is transferred securely. Would it be OK to structure my application as below?
subdomain.mydomain.com
    adminSectionFolder
        adminPage1.php
        adminPage2.php
    checkoutPagesFolder
        checkoutPage1.php
        checkoutPage2.php
        checkoutPage3.php
    homepage.php
    loginPage.php
    someOtherPage.php
    someNonSecureFolder
        nonSecurePage1.php
        nonSecurePage2.php
        nonSecurePage3.php
    imagesFolder
        image1.jpg
        image2.jpg
        image3.jpg
Users would access my web application via http, as there is no need for SSL for the homepage and similar pages. Checkout/admin pages would have to be accessed via https though (which I would ensure via .htaccess redirects). I would also like to have a login form on every page of the site, including non-secure pages. Now my questions are:
1. If I have a form on a non-secure page, e.g. http://subdomain.mydomain.com/homepage.php, and that form sends data to https://subdomain.mydomain.com/loginPage.php, is the data sent encrypted as if it were sent from https://subdomain.mydomain.com/homepage.php? I do realize users will not see the padlock, but the browser should still encrypt it, right?
EDIT: my apologies.. above in bold I originally typed http but meant https, my bad
2. If on a secure page, loginPage.php (or any other page accessed via https, for that matter), I create a session, a session ID would be assigned and, in the case of my web app, something like the username of the logged-in user stored. Would I be able to access these session variables from http://subdomain.mydomain.com/homepage.php to, for example, display a greeting message? If the session ID is stored in a cookie then I assume it would be trouble, but could someone clarify how it should be done? It seems important to have the username and password sent over SSL.
3. Related to the above question, I think: would it actually make any sense to have the login secured via SSL so the username/password are transferred securely, and then have the session ID transferred with no SSL? I mean, wouldn't it be much the same if someone intercepted the username and password, or intercepted the session ID? Please let me know if I make sense here, because it feels like I'm missing something important.
EDIT: I came up with an idea, but again, please let me know if it would work. Given the above, and assuming that sharing a session between http and https is no more secure than logging a user in via plain http (not https), I guess on all non-secure pages, like the homepage etc., I could check whether the user is already logged in, and if so redirect from PHP to the https version of the same page. So the user fills in the login form on homepage.php, the details are sent over SSL to the backend, probably https://.../homepage.php. When they try to access http://.../someOtherPage.php, the script would always check whether a session exists, and if so redirect the user to the https version of that page, https://.../someOtherPage.php. Would that work?
4. To avoid the browser popping up the message "this page contains non-secure items...", should my links to CSS, images and other assets, e.g. on https://subdomain.mydomain.com/checkoutPage1.php, be absolute, like "/images/image1.jpg", or relative, like "../images/image1.jpg"? I guess one of those has to work :)
Wow, that's a long post. Thanks for your patience if you got this far, and for any answers :) Oh yeah, and I use PHP/Apache on shared hosting.
If the SSL termination is on the web server itself, then you'll probably need to configure separate document roots for the secure and non-secure parts - while you could specify that these both reference the same physical directory, you're going to get tied in knots switching between the parts. Similarly, if your SSL termination is before the web server, you've got no systematic separation of the secure and non-secure parts.
It's a lot tidier to separate out the secure and non-secure parts into separate trees - note that if you have non-SSL content on a secure page, users will get warning messages.
Regarding your specific questions:
1. NO - whether data is encrypted depends on where it is GOING TO, not where it is coming from.
2. YES - but only if you DO NOT set the 'secure' cookie flag - note that if you follow my recommendation above, you also need to ensure that the cookie path is set to '/'.
3. The page which processes the username and password MUST be secure. If not, then you are exposing your clients' authentication details (most people use the same password for all the sites they visit), and anyone running a network sniffer or proxy would have access to them.
Your EDIT left me a bit confused. SSL is computationally expensive and slow - so you want to minimise its use - but you need to balance this against your users' perception of security. Don't keep switching between SSL and non-SSL, and although it's perfectly secure for users to enter their details on a page served over non-SSL which submits to an SSL page, the users may not understand this distinction.
4. See the first part of my answer above.
C.
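Regarding the EDIT under question 3, a minimal sketch of that redirect check in PHP (assuming Apache populates $_SERVER['HTTPS'] and that the logged-in username is kept in the session):

```php
<?php
// Include this at the top of pages that must not stay on plain HTTP,
// e.g. checkout/admin pages, or any page once the user is logged in.
session_start();

$isHttps = !empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off';

if (!$isHttps && !empty($_SESSION['username'])) {
    // A logged-in user arrived over plain http: bounce to the https version
    // of the same URL so the session cookie stops travelling in the clear.
    header('Location: https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'], true, 302);
    exit;
}
```

Note that on that first plain-http request the session cookie has already crossed the network unencrypted, which is the trade-off behind point 2 above about the 'secure' cookie flag.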
I have a directory with my media files, and I need to prevent them from being displayed on other sites.
The server doesn't support .htaccess, because it uses nginx.
How can I enable hotlink protection for my files?
Thank you.
The easiest way would be to check the Referer header in the HTTP request. Basically, if that header does not contain a URL from your site, then the request could be hotlinking.
This has the following problems:
The Referer header can be forged -> hotlinking still works
Not all user agents send the Referer header -> a legitimate user might not get the content.
You could also set a cookie when the user is browsing your site, and check for the existence of that cookie when the user accesses the streaming content.
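For example, a small PHP gateway script in front of the media directory could combine both checks (the script name, cookie name and host below are placeholders):

```php
<?php
// serve_media.php - sketch of a gateway nginx would pass /media/* requests to.
$allowedHost = 'example.com';                       // placeholder: your own host

$referer   = $_SERVER['HTTP_REFERER'] ?? '';
$fromSite  = $referer !== '' && parse_url($referer, PHP_URL_HOST) === $allowedHost;
$hasCookie = isset($_COOKIE['site_visitor']);       // cookie set while browsing the site

if (!$fromSite && !$hasCookie) {
    http_response_code(403);
    exit('Hotlinking not allowed');
}

// Resolve the requested file safely inside the media directory.
$file = basename($_GET['file'] ?? '');
$path = __DIR__ . '/media/' . $file;
if ($file === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
readfile($path);
```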
If you decide to go the referrer route: the details may be dated, but Igor gives an example of referrer mapping for image hotlink protection that might be useful here: http://nginx.org/pipermail/nginx/2007-June/001082.html
If you are using memcached, you could also store client IP addresses for a time and only serve up your streaming media if an unexpired client IP is found in the cache. The client IP gets cached during normal browsing, ensuring that the person viewing your streaming content has also recently been visiting your site.
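A sketch of that idea with PHP's Memcached extension (the key prefix and TTL are arbitrary choices):

```php
<?php
// Two pieces: (a) normal pages remember the visitor's IP, (b) the media
// script refuses to stream unless that IP was seen recently.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);
$ipKey = 'recent_visitor_' . $_SERVER['REMOTE_ADDR'];

// (a) Put this on regular site pages:
// $mc->set($ipKey, 1, 600);   // remember the IP for 10 minutes

// (b) Put this in the script that serves the streaming media:
if (!$mc->get($ipKey)) {
    http_response_code(403);
    exit('Please visit the site before streaming media');
}
```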
On my HostGator site, they used nginx as a proxy to Apache (nginx + Apache); maybe that will help you. Also, if you have access to the logs and see a lot of traffic coming that way from one IP, I would investigate, and if it points to a site, block that other web server. PHP's file_get_contents doesn't get stopped by .htaccess or anything else I know of besides blocking the IP.