Delete ASP.SessionId cookie from within Selenium IDE - asp.net

I have seen many posts about this, and there are some conflicting statements that I can't verify myself. I am using Selenium IDE and I am trying to "expire" or "delete" the ASP.SessionId cookie within the browser.
It is stated that you cannot delete the ASP.SessionId cookie because the server sets it as HttpOnly. This I can't verify: I have opened up Charles, Fiddler and Visual Studio web tests, and I only see the cookie flagged as "HTTP", not "HTTP only". That is the first issue.
I see the cookie being set and passed back and forth in the requests in Fiddler and Visual Studio, but when Selenium IDE tries to store the cookie in a variable I define, Selenium IDE says it is not found. That is the second issue.
When I run the deleteAllVisibleCookies command, it is reported as successful (the step is marked green in the IDE), but there is no change: the cookies still exist.
Does Selenium IDE have the ability to see browser cookies (if so, what are the caveats) and manage them?
Note: I am able to capture the AspxAutoDetectCookieSupport cookie and view/set it successfully. Not sure what the difference is with the ASP.SessionId cookie.

This cookie (ASP.SessionId) cannot be set or controlled by Selenium IDE; it is controlled only by the server. The cookie is flagged as HttpOnly, and that prohibits Selenium IDE, which works through the browser's JavaScript, from reading or changing it.
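For what it's worth, Selenium WebDriver (as opposed to the IDE) manages cookies through the browser driver rather than through injected JavaScript, so it can usually remove the session cookie. A minimal Java sketch, where the URL and the exact cookie name are assumptions you would adjust for your application:

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class DeleteSessionCookie {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        driver.get("http://localhost/yourapp/");                 // hypothetical URL of the app under test
        driver.manage().deleteCookieNamed("ASP.NET_SessionId");  // use the cookie name your server actually sets
        // Or wipe everything for the current domain:
        // driver.manage().deleteAllCookies();
        driver.quit();
    }
}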

Related

How do I fix console message: Cookie "ARRAffinity" will be soon rejected?

I have a static website on an Azure web server/portal that holds our company's documentation. Recently, I've been making changes to our code that sets our cookies to ensure that they comply with the browser SameSite requirement as explained here:
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Set-Cookie/SameSite
I've been able to fix all my scripts that create my cookies, but while testing them today, I see that this cookie message still appears in my Firefox console:
Cookie “ARRAffinity” will be soon rejected because it has the “sameSite” attribute set to “none” or an invalid value, without the “secure” attribute. To know more about the “sameSite” attribute, read https://developer.mozilla.org/docs/Web/HTTP/Headers/Set-Cookie/SameSite
This message only appears when I clear the cache from the site and load the page. Once I reload the page a second time or load any other page after that, I no longer see the message.
I believe this ARRAffinity cookie technically comes from Azure's Application Insights (AI)--or something on the Azure web server. It doesn't appear in our JavaScript files at all. We use AI for our analytics. Here is the code snippet that we got from Azure about two years ago; it gets injected into the header of each .htm page on our site:
var appInsights=window.appInsights||function(a){
function b(a){c[a]=function(){var b=arguments;c.queue.push(function(){c[a].apply(c,b)})}}var c={config:a},d=document,e=window;setTimeout(function(){var b=d.createElement("script");b.src=a.url||"https://az416426.vo.msecnd.net/scripts/a/ai.0.js",d.getElementsByTagName("script")[0].parentNode.appendChild(b)});try{c.cookie=d.cookie}catch(a){}c.queue=[];for(var f=["Event","Exception","Metric","PageView","Trace","Dependency"];f.length;)b("track"+f.pop());if(b("setAuthenticatedUserContext"),b("clearAuthenticatedUserContext"),b("startTrackEvent"),b("stopTrackEvent"),b("startTrackPage"),b("stopTrackPage"),b("flush"),!a.disableExceptionTracking){f="onerror",b("_"+f);var g=e[f];e[f]=function(a,b,d,e,h){var i=g&&g(a,b,d,e,h);return!0!==i&&c["_"+f](a,b,d,e,h),i}}return c
}({
instrumentationKey:"<The Key>"
});
window.appInsights=appInsights,appInsights.queue&&0===appInsights.queue.length&&appInsights.trackPageView();
(Note that <The Key> in the snippet above is actually a unique multi-character string that Azure gave us when we set up and configured the AI resource. I removed it here for privacy.)
I've since revisited the site where I got that code, but the snippet has changed to something newer:
https://learn.microsoft.com/en-us/azure/azure-monitor/app/javascript#snippet-based-setup
I'm not sure if I need to do anything to fix this.
Does the ARRAffinity cookie come from some server-side script that Microsoft creates?
Do I need to do anything on my side to resolve this console message? If so, what?
The ARRAffinity cookie is created automatically by Azure App Service (it comes from the ARR load balancer, not from Application Insights). You can turn it off in the App Service by going to Configuration --> General Settings and switching ARR affinity to Off.
As yours is a static website, I don't think this will be an issue. In fact, it is recommended to turn ARR affinity Off for any cloud-native application.
When ARR affinity is turned off, all the App Service instances (in a load-balanced environment) are used effectively.
If ARR affinity is turned on, all requests for a given session are sent to the same instance, irrespective of the load on it.
By default, the setting is On, to support legacy applications that need session stickiness.
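If you prefer the Azure CLI to the portal, the same setting can be toggled with a command along these lines (the resource group and app names are placeholders):
az webapp update --resource-group <my-resource-group> --name <my-app-name> --client-affinity-enabled false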

HTTP POST from app.example.com to localhost: session cookie not sent

I have two Spring Web applications that work together. I'm running the first application from the IDE on localhost, while the second one is running in Docker on app.127.0.0.1.nip.io.
The two applications interact indirectly through the user's browser by redirecting and POSTing between the two apps. This is slightly similar to how an SP and an IdP work together in SAML2.
In my case, the first application on localhost sends a 302 redirect to the second application. After doing some work, the second application sends back an HTML page with a form and JS code to auto-submit it, targeting my first application on localhost. The HTML looks similar to this:
<form method=POST action="http://localhost:8080/some/path">
...
</form>
My first application is using Spring Session with a session cookie, and this works just fine. However, when the second application makes the browser POST the form, the browser does not send the session cookie with the POST request.
When both applications are running in docker under .127.0.0.1.nip.io, the cookie is sent.
I've tried to find any hint as to whether this behaviour is expected, and what headers or other bits the applications could use to influence it.
At this point this is mostly an annoyance while debugging, but I'm concerned that once the two applications run on different FQDNs and/or different domains, the browsers will also block the cookie from being sent.
I've tested this with current versions of Chrome and Firefox.
The problem is the new(ish) SameSite cookie policy, which covers exactly this case: another application POSTing to a host over HTTP. The default is now SameSite=Lax, which does not allow the browser to send the first-party cookie on such a cross-site POST.
The solution is to allow the session cookie to be sent by specifying SameSite=None. Be aware, however, that this might create security vulnerabilities. For my application this is not an issue, so I can allow the cookie to always be sent, in particular when I run my application in the debugger.
For the production deployment I will be able to tighten this, since both applications will run under the same domain (a.example.com and b.example.com) and both will use TLS, so I can set the session cookie to SameSite=Lax.
Here's a decent explanation: https://web.dev/samesite-cookies-explained/
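For a Spring Session setup like the one described above, here is a minimal sketch of relaxing the SameSite attribute with DefaultCookieSerializer (the class is a real Spring Session API, but the exact configuration is an assumption about this application):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.session.web.http.CookieSerializer;
import org.springframework.session.web.http.DefaultCookieSerializer;

@Configuration
public class SessionCookieConfig {
    @Bean
    public CookieSerializer cookieSerializer() {
        DefaultCookieSerializer serializer = new DefaultCookieSerializer();
        // Allow the session cookie on cross-site POSTs (security trade-off noted above).
        serializer.setSameSite("None");
        // Browsers only accept SameSite=None together with Secure, so this assumes HTTPS;
        // for plain-HTTP localhost debugging this flag may need to stay off.
        serializer.setUseSecureCookie(true);
        return serializer;
    }
}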

Postman is not using cookie

I've been using Postman in my app development for some time and never had any issues. I typically use it with Google Chrome while I debug my ASP.NET API code.
About a month or so ago, I started having problems where Postman doesn't seem to send the cookie my site issued.
Through Fiddler, I inspect the call I'm making to my API and see that Postman is NOT sending the cookie issued by my API app. It's sending other cookies, but not the one it is supposed to send.
Under "Cookies" in Postman, I do see the cookie I issue, i.e. .AspNetCore.mysite_cookie.
Any idea why this might be happening?
P.S. I think this issue started after I made some changes to my code to name my cookies. My API app uses social authentication, and I decided to name both cookies, i.e. the one I receive from Facebook/Google/LinkedIn once the user is authenticated and the one I issue to authenticated users. I call the cookie I get from the social sites social_auth_cookie, and the one I issue is named mysite_cookie. I think this has something to do with the issue I'm having.
The cookie in question cannot legally be sent over an HTTP connection because its secure attribute is set.
For some reason, mysite_cookie has its secure attribute set differently from social_auth_cookie, either because you are setting it in code...
var cookie = new HttpCookie("mysite_cookie", cookieValue);
cookie.Secure = true; // Secure cookies are only ever sent over HTTPS
...or because the service is configured to automatically set it, e.g. with something like this in web.config:
<httpCookies httpOnlyCookies="true" requireSSL="true"/>
The flag could also potentially be set by a network device (e.g. an SSL offloading appliance) in a production environment, but that's not very likely in your dev environment.
I suggest you try the same code base but over an HTTPS connection. If you are working on code that affects authentication mechanisms, you really ought to set up your development environment with SSL anyway; otherwise you are going to miss a lot of bugs, and you won't be able to perform any meaningful pen testing or app scanning for potential threats.
You don't need to worry about cookies if they are already in your browser.
You can use your browser's cookies by installing the Postman Interceptor extension (to the left of the "In Sync" button).
I have been running into this issue recently with ASP.NET Core 2.0. ASP.NET Core 1.1, however, seems to work just fine, and the cookies get set in Postman.
From what you have described, it seems like Postman is not picking up the cookie you want because it doesn't recognize the cookie's name, or it is still pointing at the old cookie.
Things you can try:
Undo all the name changes and see if it works (just to get to the root of the issue).
Rename one cookie and see if it still works, then proceed with the other.
I hope debugging this way will take you to the root cause of the issue.

How Selenium WebDriver overcomes Same Origin Policy

How does Selenium WebDriver overcome the same-origin policy?
The same-origin policy problem exists in Selenium RC.
First of all, the “Same Origin Policy” was introduced for security reasons: it ensures that content of your site will never be accessible to a script from another site. As per the policy, any code loaded within the browser can only operate within that website's domain.
What does it do?
The same-origin policy prohibits JavaScript code from accessing elements on a domain different from the one it was loaded from.
For example, suppose the HTML code on www.google.com uses a JavaScript program "testScript.js". The same-origin policy only allows testScript.js to access pages within google.com, such as google.com/mail, google.com/login, or google.com/signup. It cannot access pages from different sites such as yahoo.com/search or fbk.com, because they belong to different domains.
This is the reason why, prior to Selenium RC, testers needed to install local copies of both Selenium Core (a JavaScript program) and the web server containing the web application being tested, so they would belong to the same domain.
How is it avoided?
To avoid the same-origin policy, the proxy injection method is used. In proxy injection mode the Selenium Server acts as a client-configured HTTP proxy that sits between the browser and the application under test (AUT) and masks the AUT under a fictional URL.
Selenium (RC) uses JavaScript to drive tests in a browser: it injects its own JS into the response returned from the AUT. But there is a JavaScript security restriction (the same-origin policy) which lets you modify the HTML of a page with JS only if that JS originates from the same domain as the HTML. This security restriction is of utmost importance, but it breaks the way Selenium works. This is where the Selenium Server plays an important role.
Before Selenium WebDriver, Selenium was a "JavaScript task runner". It would set itself up as a local server and open a browser pointed at that Selenium server, so the browser was talking to the Selenium Server running locally.
This is a problem, though, because the browser receives a script from Selenium which tells it to fetch resources from http://websitetotest.com. But the browser got this script from http://127.0.0.1:9000/selenium (for example). The browser says "hey, this script came from localhost and now it's requesting a resource from some outside website". That violates the same-origin policy.
Selenium RC's server worked around this with a proxy that tricks the browser into thinking it is talking to the same server where both Selenium and websitetotest are "located"; WebDriver later removed the need for that trick by driving the browser natively. Abhishek provided a concise explanation of this.
This might be a late reply, but if you are referring to Selenium WebDriver and not Selenium RC, then the answer is that you don't have to worry about the same-origin policy with WebDriver, since each browser has its own driver. This is the whole advantage of WebDriver over RC: there is no Selenium Core injection into the browser and no middleware client/server between the browser and the AUT. WebDriver provides native OS-level support for controlling browser automation.
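As a small illustration of that last point, a WebDriver script can hop between unrelated domains with no proxy or JavaScript injection involved, because the commands travel over the WebDriver protocol to the browser's own driver. A minimal Java sketch (assuming the Selenium Java client and ChromeDriver are installed):

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class CrossDomainNavigation {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        // Two completely different origins; the same-origin policy never gets in the way
        // because no injected script has to reach across domains.
        driver.get("https://www.google.com");
        driver.get("https://www.yahoo.com");
        driver.quit();
    }
}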

JMeter NTLM/Windows Authentication Load Testing

What is to be done?
We have an application deployed on a SharePoint (corporate) server which uses Windows credentials to log into the application.
App URL format: http://testmachine:1000/sites/test/
Windows Credentials Format: user_id#domain.co.in
The objective is to perform load/performance testing on the application (especially the log-in functionality) for n users.
Normally when I hit the app URL in Firefox/IE, a window pops up asking for credentials. I enter the credentials, browse the app and then log out. I intend to capture this in JMeter and simulate it for a large number of users.
Where I’m stuck?
Now I start the JMeter proxy server and then try the same steps as above. But when the pop-up window appears, JMeter simply doesn't record it, nor does it record anything else after the login.
What I’ve tried?
If I try the same steps after enabling “Automatically detect intranet network” in IE, then it simply auto-detects my Windows credentials (no credentials pop-up), logs me into the app (this is not recorded in JMeter either) and takes me to the home page. Any page I hit thereafter gets recorded in JMeter.
I’ve also tried to use the HTTP Authorization Manager using following parameters:
BaseURL : http://testmachine:1000/sites/test/
Username: DOMAIN\USER_ID
Password: i_wont_tell_you
Domain: \
Realm:
It didn't help. I am quite confused about how to use the above element, and I'm not even sure whether it's the right approach to solving my problem.
Any help/suggestions?
P.S. I know about a tool called Badboy, but I'll only go for it as a last resort. I'm also not sure whether it records the pop-up windows.
Sorry if the post is verbose.
UPDATE:
I have also tried -
Username: USER_ID and Domain: my_company_domain
But this is not the actual problem. The problem is that the pages I've recorded previously return a success response when I replay them, even if I haven't used the HTTP Authorization Manager. I'm not sure what I'm missing.
OK. Finally I got what was missing.
First, I had to change the implementation of every request to HttpClient3.1
Second, it was really frustrating to see that the JMeter documentation was misleading.
It says that the config file httpclient.parameters should be edited as follows:
http.authentication.preemptive$Boolean=false
But it didn't work. Changing it to true worked like a charm.
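So the working line in httpclient.parameters (normally found in JMeter's bin directory) ends up as:
http.authentication.preemptive$Boolean=true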
Hope this helps other people.
JMeter works at the HTTP layer, so the proxy will only capture requests made at this layer. It sounds to me like you have already found the right approach for recording by enabling “Automatically detect intranet network” in IE: you can use this method to capture most requests, and you will have to figure out authentication manually. How you do that depends on how your application communicates with your server to authenticate a user.
