How does Selenium WebDriver overcome the same origin policy?
The same origin policy problem existed in Selenium RC.
First of all, the "Same Origin Policy" was introduced for security reasons: it ensures that the content of your site can never be accessed by a script from another site. Under the policy, any code loaded within the browser can only operate within that website's domain.
What did it do?
The same origin policy prohibits JavaScript code from accessing elements on a domain different from the one where the script was launched.
For example, suppose the HTML code on www.google.com uses a JavaScript program "testScript.js". The same origin policy will only allow testScript.js to access pages within google.com, such as google.com/mail, google.com/login, or google.com/signup. It cannot access pages on other sites such as yahoo.com/search or fbk.com, because they belong to different domains.
This is the reason why, prior to Selenium RC, testers needed to install local copies of both Selenium Core (a JavaScript program) and the web server containing the web application under test, so that the two would belong to the same domain.
How is it avoided?
To get around the "Same Origin Policy", the proxy injection method is used. In proxy injection mode, the Selenium Server acts as a client-configured HTTP proxy that sits between the browser and the application under test (AUT) and masks the AUT under a fictional URL.
Selenium uses JavaScript to drive tests in the browser: it injects its own JavaScript into the response returned from the AUT. But there is a JavaScript security restriction (the same origin policy) that lets you modify the HTML of a page with JavaScript only if that JavaScript originates from the same domain as the HTML. This restriction is of utmost importance, but it breaks the way Selenium works. This is where the Selenium Server comes to play an important role.
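As a concrete illustration, here is a minimal sketch of how a test drove a browser through the legacy Selenium RC server, shown with the old C# client library; the host, port, browser string, and URL are the customary defaults, not details from this thread. The server on port 4444 is the proxy that injects Selenium Core into the AUT's responses:

using Selenium; // legacy Selenium RC client bindings

class RcProxyDemo
{
    static void Main()
    {
        // The Selenium Server on localhost:4444 sits between the browser
        // and the AUT, injecting Selenium Core into each response.
        ISelenium selenium = new DefaultSelenium(
            "localhost", 4444, "*firefox", "http://www.google.com");
        selenium.Start();
        selenium.Open("/"); // navigate relative to the AUT base URL
        selenium.Stop();
    }
}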
Before Selenium WebDriver, Selenium was a "JavaScript task runner". It would set itself up as a server (locally) and open a browser pointed at that local Selenium server, so the browser was talking to the Selenium server running locally.
This is a problem, though, because the browser receives a script from Selenium that wants to fetch resources from http://websitetotest.com, yet the browser got that script from http://127.0.0.1:9000/selenium (for example). The browser says: "Hey, this script came from localhost and now it's requesting a resource from some outside website." That violates the same origin policy.
Selenium therefore created a proxy to trick the browser into thinking it was talking to the same server where both Selenium and websitetotest are "located". Abhishek provided a concise explanation of this.
This might be a late reply, but if you are referring to Selenium WebDriver and not Selenium RC, then the answer is that you don't have to worry about the same origin policy with WebDriver, since each browser has its own driver. This is the whole advantage of WebDriver over RC: there is no Selenium Core injection into the browser and no middleware client/server between the browser and the AUT. WebDriver provides native OS-level support for controlling browser automation.
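To make that concrete, here is a minimal sketch using the C# WebDriver bindings (the URLs are illustrative). Because ChromeDriver controls the browser natively instead of injecting JavaScript, navigating across domains needs no proxy tricks:

using OpenQA.Selenium.Chrome;

class SameOriginDemo
{
    static void Main()
    {
        // ChromeDriver drives the browser over its native automation
        // protocol, so the same origin policy never enters the picture.
        using var driver = new ChromeDriver();
        driver.Navigate().GoToUrl("https://www.google.com");
        driver.Navigate().GoToUrl("https://www.yahoo.com"); // different domain, no problem
        System.Console.WriteLine(driver.Title);
    }
}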
I am working on an API in .NET Core 2.
Everything works great when testing on https://localhost:44333, but when trying http://localhost:44333 it does not work anymore. It just loads, and loads, and loads... There is nothing to see in the logs.
The thing is, I need to get it working over HTTP because I want to try it on my phone in the app, so I use iisexpress-proxy to proxy it. That works when I can access the API over HTTP, but it doesn't work with HTTPS.
Therefore I need it to work over HTTP, but I have no idea why it does not. All my previous projects worked fine over HTTP, and for some reason this one does not. I have looked in my Startup to see whether HTTPS might be forced or something like that, but I cannot find anything.
You probably need more information than this, but I don't know what you need, so if you ask in the comments I will provide more information, logs, code, you name it.
The http version will be served on a different port. You'll need to look at your project properties to see which port it's being served on.
Just as some background:
There's effectively a client-side and a server-side component to SSL. The http:// or https:// scheme is the client-side component: it tells the browser or other web client whether to try to negotiate a secure socket or not, respectively. The server-side component is the port binding, which is either a secure socket or not.
The forever-loading happens because your client is making a non-secure request while the server's socket is attempting to negotiate SSL. It's like one person speaking Chinese and the other speaking Spanish: they're both communicating, but nothing gets accomplished.
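For illustration, here is a minimal sketch of declaring both bindings explicitly when building an ASP.NET Core 2.x host; the HTTP port 5000 is an assumed value, not taken from the question, and Startup is the project's existing startup class:

using Microsoft.AspNetCore;
using Microsoft.AspNetCore.Hosting;

public class Program
{
    public static void Main(string[] args) =>
        WebHost.CreateDefaultBuilder(args)
               .UseStartup<Startup>() // the project's existing Startup class
               // HTTP and HTTPS are separate bindings on separate ports; an
               // HTTP request sent to the HTTPS port stalls in the handshake.
               .UseUrls("http://localhost:5000", "https://localhost:44333")
               .Build()
               .Run();
}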
I have seen many posts about this, and there are some conflicting statements that I can't verify myself. I am using Selenium IDE and I am trying to expire or delete the ASP.SessionId cookie within the browser.
It is stated that you cannot delete the ASP.SessionId cookie because it is set by the server as HttpOnly. This I can't verify: I have opened Charles, Fiddler, and Visual Studio web tests, and I only see the cookie flagged as "HTTP", not "HttpOnly". That is the first issue.
I see the cookie being set and passed back and forth in the requests in Fiddler and Visual Studio, but when Selenium IDE tries to grab the cookie into a variable I define, Selenium IDE says it is not found. This is the second issue.
When I run the deleteAllVisibleCookies command, it reports success (the step is marked green in the IDE), but nothing changes: the cookies still exist.
Does Selenium IDE have the ability to see browser cookies (and if so, with what caveats) and manage them?
Note: I am able to capture the AspxAutoDetectCookieSupport cookie and view/set it successfully. I am not sure what the difference is with the ASP.SessionId cookie.
The ASP.SessionId cookie cannot be set or controlled by Selenium IDE; it is controlled only by the server. The cookie is flagged HttpOnly, which hides it from page JavaScript and therefore prevents Selenium IDE from reading or changing it.
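By contrast, Selenium WebDriver's cookie API talks to the browser itself rather than going through page JavaScript, so it can usually see and delete HttpOnly cookies. A hedged sketch with the C# bindings follows; the URL is a placeholder, and ASP.NET_SessionId is assumed here because it is ASP.NET's default session cookie name:

using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class CookieDemo
{
    static void Main()
    {
        using var driver = new ChromeDriver();
        driver.Navigate().GoToUrl("https://your-aut.example.com/"); // replace with your AUT
        // Driver-level cookie access is not blocked by the HttpOnly flag,
        // unlike document.cookie, which is what Selenium IDE relies on.
        Cookie session = driver.Manage().Cookies.GetCookieNamed("ASP.NET_SessionId");
        System.Console.WriteLine(session?.Value ?? "not found");
        driver.Manage().Cookies.DeleteAllCookies(); // driver-level delete
    }
}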
I'm currently deploying a Spring Boot 1.5.1 application to Pivotal Cloud Foundry. Apps Manager displays the Spring icon, but I can't configure the log level or see any of the settings, and the browser reports a 'mixed content' exception: Apps Manager is trying to access /cloudfoundryapplication/info over http instead of https, and the browser is blocking the request. Is there a setting to force Apps Manager to use only https?
Our team encountered a similar issue. We believe it has nothing to do with Apps Manager but rather with how the app itself behaves.
In our case we had a bad configuration that caused the URLs to be built with http whenever httpRequest.getScheme() was called:
server.tomcat.internal-proxies: <IPs other than your proxy>
Correcting this property (in our case, by letting it fall back to its default, as defined here) made getScheme() return https, and thereby the call to /cloudfoundryapplication/info was built with https.
Another suggestion, made by one of our colleagues, which also resolves the issue but does not address the root cause, is fronting your application (at highest precedence) with a ForwardedHeaderFilter; this makes the X-Forwarded-* headers available in your HttpServletRequest, as described here.
I need to simulate web responses to web requests during some tests, and I was going to use FiddlerCore for that. Fiddler acts as a proxy, so I'm able to set the response for every request I like. But I need to run something like a console application or standalone application for FiddlerCore to be able to intercept the requests. And I need it to be initialized somehow inside my ASP.NET MVC test application, so that a tester could access the fake data just by using the URLs, without needing to run Fiddler or any other application.
So far I tried running my Fiddler code in a controller action method, but it doesn't intercept anything.
I also tried adding URLMonInterop.SetProxyInProcess("127.0.0.1:" + myPort, ""), but that doesn't work either.
Is there any way to self-host a FiddlerCore app and make it intercept the requests?
UPDATE:
In the end I managed to host FiddlerCore inside the ASP.NET MVC app: performing the initialization in a static method of a static class did the trick. However, for some reason, after calling Shutdown and then initializing again, I can't proxy anything. I even call GC.Collect(); nothing helps except restarting the host process, in my case IIS Express.
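A minimal sketch of that static-initialization trick; the class and method names are invented for illustration, and port 8877 is just a customary FiddlerCore example port:

using Fiddler;

public static class FiddlerBootstrap
{
    static FiddlerBootstrap()
    {
        // Runs once per app domain; decide here which requests get fake responses.
        FiddlerApplication.BeforeRequest += session =>
        {
            // e.g. session.utilCreateResponseAndBypassServer(); ...
        };
        // None = listen on the port without registering as the system proxy.
        FiddlerApplication.Startup(8877, FiddlerCoreStartupFlags.None);
    }

    // Calling this from Application_Start forces the static constructor to run.
    public static void EnsureStarted() { }
}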
As documented, SetProxyInProcess affects URLMon clients only, and .NET doesn't use URLMon for networking.
.NET clients typically pick up the current proxy settings automatically, but if you're running FiddlerCore under a different user account, that isn't going to work (and you probably don't want your mocker messing with any traffic except your test application's). So instead, you should configure your application explicitly to proxy its traffic through your FiddlerCore instance; see http://fiddlerbook.com/fiddler/help/hookup.asp#Q-DOTNET and http://fiddlerbook.com/fiddler/help/hookup.asp#Q-IIS, and if your services are local, http://fiddlerbook.com/fiddler/help/hookup.asp#Q-LocalTraffic
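A hedged sketch of what that explicit configuration can look like, assuming the classic FiddlerCore API; the port and mocked URL are illustrative assumptions, not details from this thread:

using System;
using System.IO;
using System.Net;
using Fiddler;

class FiddlerCoreMock
{
    static void Main()
    {
        // Fabricate a response for matching requests before they leave the proxy.
        FiddlerApplication.BeforeRequest += session =>
        {
            if (session.uriContains("websitetotest.com"))
            {
                session.utilCreateResponseAndBypassServer();
                session.utilSetResponseBody("<html>mocked</html>");
            }
        };
        // None = listen without registering as the system proxy, so no
        // traffic outside this test client is affected.
        FiddlerApplication.Startup(8877, FiddlerCoreStartupFlags.None);

        // Route this client through the FiddlerCore instance explicitly.
        var request = (HttpWebRequest)WebRequest.Create("http://websitetotest.com/");
        request.Proxy = new WebProxy("127.0.0.1", 8877);
        using var response = request.GetResponse();
        using var reader = new StreamReader(response.GetResponseStream());
        Console.WriteLine(reader.ReadToEnd());

        FiddlerApplication.Shutdown();
    }
}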
I'm trying to find a tool that will allow non-programmers to test files on a live server.
For example, they could modify an image on their computer, reload a webpage, then see the results of their work immediately.
I've tried finding a tool for this, because it seems obvious enough that someone must have thought of it, but a lot of the software I see doesn't quite fit. A tool called Fiddler does this (they call it autoresponding), but it's Windows-only. I could change the hosts file to redirect to a local instance of nginx or something, but that seems difficult to maintain when all I really want is a simple tool that does something like this:
http://someserver.com/css/(.*) -> /home/user/localcss/$1
Does anybody have any recommendations?
Edit: Redirect clarification
Fiddler has this feature; just click the AutoResponder tab and map URLs to local files. Thousands of people do this every day.
See also video #5 here: http://www.fiddlerbook.com/fiddler/help/video/default.asp
I found Charles Proxy very useful for this:
http://www.charlesproxy.com/documentation/tools/map-local/
Max's PAC solution was a life-saver, so I'm providing more details (I can't yet upvote).
To use a local version of, say, CSS files, create a file 'proxy.pac' containing this function:
function FindProxyForURL(url, host)
{
    // Use a regex to match requests ending with '.css'
    // and redirect them to the proxy on localhost.
    var regexpr = /.*\.css$/;
    if (regexpr.test(url))
    {
        // PAC "PROXY" directives take host:port form; adjust the
        // port to wherever your local proxy is listening.
        return "PROXY localhost:80";
    }
    // Or else connect directly:
    return "DIRECT";
}
Save 'proxy.pac' and point your browser to this file. In Firefox this is in Options > Advanced > Connection > Settings > Automatic Proxy Configuration URL
For best practice, also add a MIME type to your web server: map '.pac' to type 'application/x-ns-proxy-autoconfig'.
All requests to .css files will now be routed to localhost. Don't forget to ensure the file structure is the same on the proxy server.
In the case of CSS, it may well be easier to override the CSS by using a local chrome; for example, in Firefox, chrome/userContent.css in your profile directory. See http://kb.mozillazine.org/UserContent.css
It's been a while since I asked this question, and I have a good technique that wasn't suggested.
PAC files are supported by all major browsers and allow you to write a script that can redirect any individual request to a proxy server. So, for example, the proxy server could serve a PAC file, the PAC file could redirect whitelisted URLs to the proxy server, and the proxy could return local versions of those files. It can even support HTTPS.
Beware of one gotcha: Internet Explorer. It helpfully "caches" the results of this script incorrectly, so that if one URL on a domain is proxied, all URLs on that domain will be proxied. This feature can be disabled, however.
You can do this with the Modify Response rule in Requestly.
Using the local file option, you can specify any file to be used as the response for the intercepted request.
According to their documentation it also supports hot reloading, i.e., as long as the file path remains the same, the rule will pick up the changes you made.
As for dynamic URL matching, they support regex and wildcards in their source filters.
Note: this is currently only available in their desktop app.
If you want to implement this using their Chrome extension, which is what I personally did, you can use the Redirect rule paired with a mock server. Here is a page explaining this.
You can set up a mock server / mock file endpoint within Requestly instead of using something like nginx or a local server. This works only for text-based content, though, not images.
This also bypasses any setup on the tester's local machine: they would only need to install the extension. All you would have to do is send them the endpoint for your mock server and the redirect rule you created.
Actually, you can't do this, because browsers don't allow pages loaded over http:// to access files on the local machine (just think for a moment about what would happen if, say, a malicious web page could load private files from your computer).
Some browsers (e.g. Safari) allow files loaded over file:// to access other file:// files, others don't, but no browser allows http:// to access file://.
Firefox has a feature called "signed scripts": scripts digitally signed with a trusted certificate can ask the user to grant them access to the local hard drive. Look at this: http://www.mozilla.org/projects/security/components/signed-scripts.html
Do you mean the Fiddler Web Proxy (www.fiddler2.com)? There is a commercial Java-based alternative named Charles Web Proxy that may fit your needs.