I have the following code:
var previousPageUrl = document.referrer;
alert(previousPageUrl);
This does not work if the previous page's URL belongs to an external site, i.e., one outside my application.
For example:
If I go from Page 1 to Page 2 of my application, then Page 2 sees Page 1's URL in the referrer when it loads. But if I go to an external site, say www.google.com, and then come back to Page 1, I do not get www.google.com as the referrer URL.
Can somebody tell me how to resolve this issue?
Generally, Referer URLs are passed between unrelated sites when navigation occurs due to a link click or JavaScript-based navigation. Referer URLs are NOT sent if the user uses the browser's chrome (e.g. the address bar or the back/forward buttons) to navigate.
For security/privacy reasons, the Referer URL is stripped out when navigating from an HTTPS site to an HTTP site (e.g. from https://google.com to http://example.com). It can also be deliberately stripped out via a variety of JavaScript and HTML tricks. There is no way to disable this behavior and get the Referer URL if it has been stripped.
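For what it's worth, here is a minimal server-side sketch (ASP.NET code-behind, matching the rest of this thread) of coping with the missing value; it cannot recover a stripped Referer, it just falls back gracefully:
protected void Page_Load(object sender, EventArgs e)
{
    // Request.UrlReferrer is null whenever the browser sent no Referer header
    // (back/forward buttons, address bar, HTTPS -> HTTP navigation, etc.).
    Uri referrer = Request.UrlReferrer;
    string previousPageUrl = (referrer != null) ? referrer.AbsoluteUri : "";
    // ...fall back to your own navigation tracking (session, query string) when empty...
}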
I need help with a weird problem. I have an .aspx page where I'm placing the canonical URL of the page like this:
<link rel="canonical" href="http://example.com/page.aspx" />
When I access the page via HTTP it displays exactly as expected, but when I access it via HTTPS, the text in the canonical href attribute changes to HTTPS. I don't want that; I want it exactly as I wrote it there, with HTTP.
Is it possible that the .NET configuration does this? I can't find anything that could cause it. Is there a setting in IIS? Where should I look?
I've tried writing custom text in the href value and it displays as expected on both HTTP and HTTPS.
I've tried writing the https:// version in the href and it displays as expected (https) on both HTTP and HTTPS.
I've tried writing "//:link" (without the protocol) and it displays "//:link" on both HTTP and HTTPS
So to summarize the question, how do I get the canonical tag to display "http://" when I access it via HTTPS?
I've been banging my head against the wall for the last 2 days over this issue and can't figure it out.
Please help!
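Not an answer from this thread, but one hedged workaround sketch, assuming the substitution happens while the markup is rendered (if an IIS outbound rewrite rule is editing the response, page code won't help): emit the tag as raw text from code-behind so nothing gets a chance to resolve or rewrite the href. The control ID litCanonical is made up for the example and would correspond to <asp:Literal ID="litCanonical" runat="server" /> inside <head>.
protected void Page_Load(object sender, EventArgs e)
{
    // An asp:Literal writes its Text out verbatim, scheme and all.
    litCanonical.Text = "<link rel=\"canonical\" href=\"http://example.com/page.aspx\" />";
}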
The whole thing started when I noticed the Facebook Debugger and other crawler tools are unable to parse my page. Facebook throws a critical error saying that it cannot follow the redirect. I believe search engine bots are hitting the same wall. The website functions normally in all major web browsers.
It's probably worth mentioning that I am experimenting with ASP.NET Routing, using Web Forms under IIS 8.
Given a website (http://example.com), here's what happens.
Case 1: Trying to access the root, this is what I get with a Web Sniffer simulator
Case 1 observations:
The first thing I notice is a 302 redirect instead of a 200 OK. It gives a 302 redirect with or without the leading 'www'.
The second thing I noticed is that the Location header is simply "/", confirmed by the IIS page (which I never see in a regular browser) saying the page has moved to "/". I believe something goes wrong at this point and the crawler is unable to follow through for some reason.
Case 2: Trying to access a given category page with a Web Sniffer simulator
Case 2 observations:
As you might have figured out already, it is identical to Case 1. And once again the Facebook Debugger cannot get through it, reporting a redirect it cannot follow.
Questions:
1: How can I force an absolute URL in the Location header instead of a relative one, and will this be sufficient for the crawlers to follow through?
2: What could cause a 302 redirect to happen in the first place, on both the www and non-www versions of the website?
Your web application most likely depends on a cookie. The application sends a Set-Cookie header and redirects to the same page in order to receive a new request with the cookie data available. Search engine bots, the Facebook bot, and your Web Sniffer simulator will not send that cookie data, and hence the web application keeps sending 302 redirect responses.
The solution is to change your application so that it does not require cookies simply for viewing your web pages.
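To make that concrete, here is a hedged sketch of the kind of page code that produces the loop described above (the cookie name and logic are made up for illustration, not taken from the question):
protected void Page_Load(object sender, EventArgs e)
{
    // Clients that never echo cookies back (crawlers, the Facebook bot, a sniffer)
    // never get past this branch and just keep receiving 302 responses.
    if (Request.Cookies["visited"] == null)
    {
        Response.Cookies["visited"].Value = "1";   // Set-Cookie header
        Response.Redirect(Request.RawUrl);         // 302 with a relative Location such as "/"
    }
    // ...normal rendering only happens once the cookie comes back...
}
The fix is simply to render the page whether or not the cookie is present.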
I am creating a search website using ASP.NET. On one page I show the URLs of the results. When I click on a URL a new link opens, but the URL for the new link in the browser includes localhost:portnumber. I do not want this in my URL.
For example:
result
So on clicking "result" I go to the browser, where the URL is "https://localhost:8080//www.google.com".
Why is localhost:8080 included in the URL?
Thanks
When you redirect to the URL, you are not adding any protocol information, so it defaults to the current website/protocol.
For example:
Response.Redirect("www.google.com")
is not the same as:
Response.Redirect("http://www.google.com")
You need to use the fully qualified URL; otherwise it will be treated as relative to the current website. Therefore, add the http(s):// to the redirect.
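As a hedged illustration (the StartsWith check is just one way to do it), prepend a scheme when the stored target doesn't already have one:
string target = "www.google.com";
if (!target.StartsWith("http://", StringComparison.OrdinalIgnoreCase) &&
    !target.StartsWith("https://", StringComparison.OrdinalIgnoreCase))
{
    target = "http://" + target;   // assume plain http when no scheme was stored
}
Response.Redirect(target);         // now leaves localhost:8080 for the external site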
Regarding X-Frame-Options (https://developer.mozilla.org/en-US/docs/The_X-FRAME-OPTIONS_response_header), I'm having a bit of a hard time reconciling what the docs say with what I'm seeing. My understanding is that when the page returns SAMEORIGIN, browsers will only load the contents of the frame if the page containing the IFRAME came from the same domain.
I've got three machines. When I'm logged into SERVER-A, I navigate to a page that is hosted on SERVER-A. It contains an IFrame that loads a page from SERVER-B, which is in a different domain. This all works... but when I go to SERVER-C and browse to the same page (served from SERVER-A), it won't load. Looking at the IE Debugging Tools, the request for that IFramed page shows a status of "aborted".
Ideas?
This is working as you'd expect from server C: you've stated via X-Frame-Options that the iframe shouldn't load in a page from a different domain, and it didn't. This security policy isn't applied to pages loaded from localhost, which sounds like what's happening here when you're on server A, similar to this situation.
You haven't said which of the pages you've applied the X-Frame-Options header to: it matters that it is on the page inside the iframe (i.e. on Server B in your setup). I don't think applying the header to server A will have made a difference.
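For reference, a minimal sketch of setting the header where it matters, i.e. on the response that gets framed (Server B); whether it is added in code like this or through IIS configuration is an assumption here:
protected void Page_Load(object sender, EventArgs e)
{
    // SAMEORIGIN: browsers will only render this response inside a frame whose
    // parent page comes from the same origin as this response.
    Response.AddHeader("X-Frame-Options", "SAMEORIGIN");
}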
I have a web application that has many pages and folders. I want to keep the URL of this website fixed for all pages. For example, if the website is
www.testwebsite.com/home.aspx and I redirect to login.aspx (for example), I want the URL to stay
www.testwebsite.com/home.aspx without any changes, and so on.
Any suggestions?
You can do a Server.Transfer instead of a Response.Redirect from home.aspx to login.aspx. This will keep the URL as home.aspx.
Response.Redirect: tells the browser to go and visit another URL. A response comes back to the browser and the browser then navigates to the new page, so it is effectively a new request. You will see the new page's URL in your address bar.
Server.Transfer: no "redirect" response comes back to the browser. The server itself switches to the destination page, so the client browser does not know it is a different page and the URL does not change. The Transfer method preserves the QueryString and Form collections.
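A minimal sketch of the two side by side, using the page names from the question (code in home.aspx's code-behind):
protected void Page_Load(object sender, EventArgs e)
{
    // Response.Redirect("login.aspx");   // 302: the browser re-requests login.aspx and the address bar changes
    Server.Transfer("login.aspx");         // executed server-side within this request; the address bar stays on home.aspx
}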