I have an HttpModule that I created and am running on IIS 6 on Windows Server 2003. I can send cookies to the browser, but I cannot read them on the next request; the cookie is always null.
If I run this module on IIS 7, though, it works just fine. IIS 7 is not an option at the moment: we have not switched over yet, and this needs to get done before that happens.
Also, I've already tried using the PostAcquireRequestState hook.
public void Init(HttpApplication httpApp)
{
    httpApp.BeginRequest += OnBeginRequest;
}

public void OnBeginRequest(Object sender, EventArgs e)
{
    var httpApp = (HttpApplication)sender;
    var context = httpApp.Context;
    const string cookieName = "sId";

    if (!string.IsNullOrEmpty(context.Request.QueryString["cookie"]))
    {
        var ck = new HttpCookie(cookieName)
        {
            Value = httpApp.Context.Request.QueryString["cookie"],
            Expires = DateTime.Now.AddDays(1)
        };
        httpApp.Response.Cookies.Add(ck);
    }
    else
    {
        // Always null on IIS 6, even though the cookie was set on an earlier response.
        var cookie = httpApp.Request.Cookies[cookieName];
    }
}
I ran into a similar problem, but had a different solution, so I thought I'd share, in case it helps someone. I took zengchun's suggestion as well to use some tools to inspect request & response headers. Since I'm using IE, the F12 Dev Tools works great for this. As soon as I saw the response header for the cookie, I noticed the secure flag was set. Sure enough, I had copied code from a production SSL-hosted site to a test site that did not use SSL, so the secure flag on the cookie prevented the code from being able to read it. I updated the web.config to remove requireSSL from the httpcookies node, and my site started working. :)
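For reference, the entry involved looks something like the snippet below (exact attributes will vary); removing requireSSL, or setting it to false, on the non-SSL test site is what made the cookie readable again.

<system.web>
  <!-- requireSSL="true" marks the cookies Secure, so a non-SSL site can never read them back -->
  <httpCookies httpOnlyCookies="true" requireSSL="true" />
</system.web>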
Your code looks like it should work. The problem probably occurs on the client side, in how the next page is requested. You can use Firebug with Firefox, or the Fiddler tool, to log the client-side request and see whether the cookie value is actually sent in the request headers to the server.
For example, the request headers:
GET /1.aspx
.....
Cookie: sId=123 [if the client has the cookie, it will appear here]
and the response headers:
Set-Cookie: sId=123; expires=Fri, 30-Mar-2012 07:20:23 GMT;
path=/
If the server adds a cookie to the response, the response looks like the above.
Now, my guess is that the problem is with your cookie domain, or that your cookie path is different. The safest way to set the cookie is with code like the following:
var ck = new HttpCookie(cookieName)
{
    Value = httpApp.Context.Request.QueryString["cookie"],
    Expires = DateTime.Now.AddDays(1),
    Path = "/",
    Domain = "your domain"
};
good luck.
Thanks to zhengchun I was able to get to the root of the problem. It turns out I was unable to set the cookie on requests for static files. I created .aspx pages for the initial requests that set the cookie and then redirected to the static files. My HttpModule could then read the cookie once it had been set by the .aspx page. I'm not sure why I need an .aspx page to set the cookie instead of the HttpModule, but this fixed it.
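The bootstrap page itself is trivial; a minimal sketch of the idea (the page name, cookie name and target path here are illustrative, not the original code):

using System;
using System.Web;
using System.Web.UI;

// SetCookie.aspx.cs - sets the cookie, then hands off to the static content.
public partial class SetCookie : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        var value = Request.QueryString["cookie"];
        if (!string.IsNullOrEmpty(value))
        {
            Response.Cookies.Add(new HttpCookie("sId", value)
            {
                Expires = DateTime.Now.AddDays(1),
                Path = "/"
            });
        }
        // The cookie is now available to the HttpModule on subsequent requests.
        Response.Redirect("~/content/index.html");
    }
}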
I am working on an ASP.NET WebForms application. My security guy wants all cookies to have the Path and SameSite attributes. I can set Path and SameSite on the cookies I create myself, but how do I set them on the session cookie? For example, whenever a visitor hits my site I set some session variables, so ASP.NET generates a session cookie on that first GET request. How do I set the Path and SameSite attributes on that first GET request? Is there a Global.asax event I can use? I have tried the Application_PreSendRequestHeaders event, like below:
protected void Application_PreSendRequestHeaders()
{
    foreach (string str in Response.Cookies.AllKeys)
    {
        HttpCookie ck = Request.Cookies[str];
        ck.Path = "/DBTDASHBOARD";
        ck.Value += ";SameSite=Strict";
    }
}
But this works on subsequent requests and not the first get call made to my landing page. Any help would be of great use.
Finally, I was able to do this with the help of the URL Rewrite 2.0 module for IIS.
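The idea is an outbound rule that rewrites the Set-Cookie header emitted for the session cookie. A sketch of that kind of rule (the rule name and the appended attribute are illustrative, not the exact configuration used):

<system.webServer>
  <rewrite>
    <outboundRules>
      <rule name="SessionCookieSameSite">
        <!-- match the Set-Cookie header for the ASP.NET session cookie -->
        <match serverVariable="RESPONSE_Set_Cookie" pattern="^(ASP\.NET_SessionId.*)$" />
        <!-- re-emit it with SameSite appended -->
        <action type="Rewrite" value="{R:1}; SameSite=Strict" />
      </rule>
    </outboundRules>
  </rewrite>
</system.webServer>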
Title should say it all: FormsAuthentication.SignOut() is not removing my forms authentication cookie when the cookie was created with an explicit Domain.
Here's the code to set the cookie:
// snip - some other code to create custom ticket
var httpCookie = new HttpCookie(FormsAuthentication.FormsCookieName, encodedTicket);
httpCookie.Domain = "mysite.com";
httpContextBase.Response.Cookies.Add(httpCookie);
Here's my code to signout of my website:
FormsAuthentication.SignOut();
Environment:
ASP.NET MVC 3 Web Application
IIS Express
Visual Studio 2010
Custom domain: "http://localhost.www.mysite.com"
So when I try to log off, the cookie is still there. If I get rid of the httpCookie.Domain line (i.e. leave it at the default of null), it works fine.
Another weird thing I noticed is that when I set the domain, Chrome doesn't show my cookie in the Resources section of the developer tools, but when I don't set the domain, it does.
And secondly, when I actually create the cookie with the custom domain, on the next request when I read the cookie back from the request (to decrypt it), the cookie is there, but its Domain is null?
I also tried creating another cookie with the same name and setting the expiry to yesterday. No dice.
What's going on? Can anyone help?
I believe that if you set the domain attribute on the forms element in your web.config to the same domain used in your custom cookie, it should work. (EDIT: that approach won't work, because the SignOut method on FormsAuthentication sets other flags on the cookie that you are not setting, like HttpOnly.) The SignOut method basically just sets the cookie's expiration date back to 1999, and it needs the domain in order to expire the right cookie.
If you can't hardcode the domain, you can roll your own sign out method:
private static void SignOut()
{
var myCookie = new HttpCookie(FormsAuthentication.FormsCookieName);
myCookie.Domain = "mysite.com";
myCookie.Expires = DateTime.Now.AddDays(-1d);
HttpContext.Current.Response.Cookies.Add(myCookie);
}
An authentication cookie is just a plain cookie; so you would remove it the same way you would any other cookie: expire it and make it invalid.
I had a similar problem. In my case I was storing some userData in the AuthCookie, and on each request, upon authentication, I was reading the cookie and putting the userData into a static variable; I experienced the same effects as described above. It turned out that in my case the data was being persisted in the application. To get around it, I had to first clear my static variable and then expire the cookie. I used the following in the LogOff method of my AccountController:
AuthCookie.Clear(); //STATIC CLASS holding my userdata implemented by me.
Response.Cookies[FormsAuthentication.FormsCookieName].Expires = DateTime.Now.AddYears(-1);
Response.Cookies[FormsAuthentication.FormsCookieName].Value = null;
return RedirectToAction("Index", "Home");
Hope this helps.
UPDATE
On a hunch after submitting, I replaced the middle two lines with:
FormsAuthentication.SignOut();
... and it worked fine where it didn't before.
Note:
AuthCookie.Clear();
... does not touch the AuthCookie, it just resets the static class I wrote to default values.
Again, hope this helps.
I have an ASP.NET .asmx webservice written to handle requests from a third-party tool. The third-party tool makes an HTTP POST request to the webservice to get user information. I'm using IIS 7.
Running Fiddler with "Remove All Encodings" checked, I can see the webservice call and everything functions properly. If I uncheck "Remove All Encodings", the webservice call fails with a 400 Bad Request. The difference I see is that the header "Content-Encoding: gzip" is being removed by Fiddler and the content is being decompressed.
So, when the Content-Encoding header is removed and the content is decompressed, my webservice functions perfectly. When the header is present and the content is compressed, the webservice fails.
How can I either:
Configure my webservice to tell the client that it won't accept compressed requests (and hope that the third party tool respects that)
Decompress the content early in the asp.net handling
Modify my webservice to work with compressed data
Update: To be clear, I don't need to configure gzip encoding in the Response, I need to deal with a Request TO my webservice that is gzip encoded.
Update 2: The third-party tool is the Salesforce.com Outlook plugin. So, I don't have access to modify it and it is used by many other companies without trouble. It's got to be something I'm doing (or not doing)
Update 3: I found one post here that says that IIS does not support incoming POST requests with compressed data, it only supports compressed Responses. Can this still be true?
The simplest technique is to create an HttpModule that replaces the request filter. It is more reusable and avoids putting code in Global.asax. There is also no need to create a new decompression stream class, since GZipStream already does that. Here is the full code, which also removes the Content-Encoding: gzip header once it is no longer needed:
using System;
using System.IO.Compression;
using System.Web;

public class GZipRequestDecompressingModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.BeginRequest += (sender, e) =>
        {
            var request = (sender as HttpApplication).Request;
            string contentEncoding = request.Headers["Content-Encoding"];
            if (string.Equals(contentEncoding, "gzip", StringComparison.OrdinalIgnoreCase))
            {
                // Wrap the request filter so downstream handlers see the decompressed body.
                request.Filter = new GZipStream(request.Filter, CompressionMode.Decompress);
                // The body is no longer compressed, so drop the header.
                request.Headers.Remove("Content-Encoding");
            }
        };
    }

    public void Dispose()
    {
    }
}
To activate this module, add the following section into your web.config:
<system.webServer>
<modules runAllManagedModulesForAllRequests="true">
<add name="AnyUniqueName"
type="YourNamespace.GZipRequestDecompressingModule, YourAssembly"
preCondition="integratedMode" />
</modules>
</system.webServer>
Since the 3rd-party service is just sending you a POST, I do not think it is possible to tell them not to send it compressed.
You could try to override GetWebRequest and decompress it on the way in:
using System;
using System.Net;

public partial class MyWebService : System.Web.Services.Protocols.SoapHttpClientProtocol
{
    protected override WebRequest GetWebRequest(Uri uri)
    {
        var request = (HttpWebRequest)base.GetWebRequest(uri);
        request.AutomaticDecompression = DecompressionMethods.GZip;
        return request;
    }
}
GZIP compression is a function of the server.
If you're using IIS6, consult this link.
If you're using IIS7, you could use ISAPI_Rewrite to disable gzip. See this link.
That said, because gzip is a function of IIS, you really shouldn't need to do anything "special" to get it to work with a web service (IIS should handle decompressing and compressing requests). Hopefully this info will get you further down the road to troubleshooting and resolving the issue.
I am not sure that IIS supports decompressing incoming requests, so this might have to be done further down the pipe.
Shiraz's answer has the potential of working and it would be the first thing I would try.
If that doesn't work, you might consider switching your server-side .asmx service to WCF, which, while a bit more difficult to set up, also gives you more flexibility.
On the WCF side there are two things I can suggest. The first is quite easy to implement and is based on setting the WebRequest object used by WCF to automatically accept compression. You can find the details here. This one is the WCF equivalent to the solution proposed by Shiraz.
The second is more complicated, since it involves creating Custom Message Encoders, but if none of the above methods work, this should solve the problem. Creating a message compression encoder is described here. You might also want to check the answer in here which presents a sample config for the message encoder.
Please let me know if this helped or if you need more help.
I've found a partial answer here.
class DecompressStream : Stream
{
    ...
    public override int Read(byte[] buffer, int offset, int count)
    {
        // _sink is the inner (compressed) stream handed to the constructor, elided above.
        GZipStream test = new GZipStream(_sink, CompressionMode.Decompress);
        int c = test.Read(buffer, offset, count);
        return c;
    }
    ...
}
I can then specify the filter on the request object like this:
void Application_BeginRequest(object sender, EventArgs e)
{
    string contentEncoding = Request.Headers["Content-Encoding"];
    Stream prevCompressedStream = Request.Filter;

    if (contentEncoding == null || contentEncoding.Length == 0)
        return;

    contentEncoding = contentEncoding.ToLower();

    if (contentEncoding.Contains("gzip"))
    {
        Request.Filter = new DecompressStream(Request.Filter);
    }
}
I say partial answer because even though I can now process the incoming request, the response is getting a "Content-Encoding: gzip" header even though the response is not encoded. I can verify in Fiddler that the content is not encoded.
If I do encode the response, the client for the webservice fails. It seems that even though it is sending "Accept-Encoding: gzip", it does not in fact accept gzip compressed response. I can verify in Fiddler that the response is compressed and Fiddler will decompress it successfully.
So, now I'm stuck trying to get a stray "Content-Encoding: gzip" header removed from the response. I've removed all references I can find to compression from the application, the web.config, and IIS.
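For what it's worth, one place a header can still be stripped that late is Application_PreSendRequestHeaders in Global.asax. This is an untested sketch, and it assumes the IIS integrated pipeline (Response.Headers is not writable in classic mode):

protected void Application_PreSendRequestHeaders(object sender, EventArgs e)
{
    var response = ((HttpApplication)sender).Response;

    // Remove a stray Content-Encoding header when the body is actually uncompressed.
    if (string.Equals(response.Headers["Content-Encoding"], "gzip",
                      StringComparison.OrdinalIgnoreCase))
    {
        response.Headers.Remove("Content-Encoding");
    }
}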
Where exactly does Forms Authentication exist in the Http Pipeline?
This is handled by an HTTP module, System.Web.Security.FormsAuthenticationModule. If you look at the system-wide web.config file, c:\Windows\Microsoft.NET\Framework\v2.0.50727\CONFIG\web.config, you can see where it's mentioned in the <httpModules> section. The site-specific web.config file will inherit the configuration in that file.
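The relevant entry looks roughly like this (the many other modules registered there are omitted):

<httpModules>
  <add name="FormsAuthentication" type="System.Web.Security.FormsAuthenticationModule" />
</httpModules>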
On each request, the module will look for an authentication cookie. If it's not present, the request is redirected to the login page. On a successful login, an authentication cookie is sent back to the browser. Then on subsequent requests, the browser will send the cookie, which will be validated by the module, and then the request is handled as usual.
Guess I should've thought of this first, but it didn't dawn on me until I saw the answer from Carl Raymond that I can just crack it open in Reflector. So, to answer my own question:
public void Init(HttpApplication app)
{
    if (!_fAuthChecked)
    {
        _fAuthRequired = AuthenticationConfig.Mode == AuthenticationMode.Forms;
        _fAuthChecked = true;
    }

    if (_fAuthRequired)
    {
        FormsAuthentication.Initialize();
        app.AuthenticateRequest += new EventHandler(this.OnEnter);
        app.EndRequest += new EventHandler(this.OnLeave);
    }
}
OnEnter calls the private method OnAuthenticate, passing in the application context, and this is where it validates/writes out the forms authentication tickets.
In OnLeave it checks the response for an HTTP 401 status code, and if it finds one, that's when it redirects to the login URL.
Why does the property SessionID on the Session-object in an ASP.NET-page change between requests?
I have a page like this:
...
<div>
SessionID: <%= SessionID %>
</div>
...
And the output keeps changing every time I hit F5, independent of browser.
This is the reason
When using cookie-based session state, ASP.NET does not allocate storage for session data until the Session object is used. As a result, a new session ID is generated for each page request until the session object is accessed. If your application requires a static session ID for the entire session, you can either implement the Session_Start method in the application's Global.asax file and store data in the Session object to fix the session ID, or you can use code in another part of your application to explicitly store data in the Session object.
http://msdn.microsoft.com/en-us/library/system.web.sessionstate.httpsessionstate.sessionid.aspx
So basically, unless you access your session object on the backend, a new sessionId will be generated with each request
EDIT
This code must be added to the Global.asax file. It adds an entry to the Session object so that the session ID stays fixed until the session expires.
protected void Session_Start(Object sender, EventArgs e)
{
Session["init"] = 0;
}
There is another, more insidious reason why this may occur even when the Session object has been initialized as demonstrated by Claudio.
If the web.config has an <httpCookies> entry with requireSSL="true" but you are not actually using HTTPS for a specific request, then the session cookie is not sent (or maybe not returned, I'm not sure which), which means that you end up with a brand-new session for each request.
I found this one the hard way, spending several hours going back and forth between several commits in my source control, until I found what specific change had broken my application.
In my case I figured out that the session cookie had a domain that included the www. prefix, while I was requesting the page without www..
Adding www. to the URL immediately fixed the problem. Later I changed the cookie's domain to .mysite.com instead of www.mysite.com.
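For reference, the default cookie domain can also be set application-wide in web.config (an assumed snippet, not the configuration from this answer):

<system.web>
  <!-- a leading dot makes the cookie valid for www.mysite.com and the bare domain alike -->
  <httpCookies domain=".mysite.com" />
</system.web>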
My problem was that we had this set in web.config:
<httpCookies httpOnlyCookies="true" requireSSL="true" />
This means that when debugging over non-SSL (the default), the auth cookie would not be sent back to the server, so the server would issue a new auth cookie (with a new session) for every request from the client.
The fix is to either set requireSSL to false in web.config and true in web.release.config, or turn on SSL while debugging (for example by enabling SSL on the project in Visual Studio).
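For the first option, the web.release.config transform looks something like this (a sketch, assuming the standard config-transform setup):

<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.web>
    <!-- release builds turn requireSSL back on; the debug web.config leaves it off -->
    <httpCookies requireSSL="true" xdt:Transform="SetAttributes(requireSSL)" />
  </system.web>
</configuration>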
Using Neville's answer (deleting requireSSL="true" from web.config) and slightly modifying Joel Etherton's code, here is code that should handle a site that runs in both SSL and non-SSL mode, depending on the user and the page. (I am jumping back into the code and haven't tested it on SSL yet, but I expect it should work; I will be too busy later to get back to this, so here it is.)
if (HttpContext.Current.Response.Cookies.Count > 0)
{
    foreach (string s in HttpContext.Current.Response.Cookies.AllKeys)
    {
        if (s == FormsAuthentication.FormsCookieName || s.ToLower() == "asp.net_sessionid")
        {
            HttpContext.Current.Response.Cookies[s].Secure = HttpContext.Current.Request.IsSecureConnection;
        }
    }
}
Another possibility that causes the SessionID to change between requests, even when Session_OnStart is defined and/or a Session has been initialized, is that the URL hostname contains an invalid character (such as an underscore). I believe this is IE specific (not verified), but if your URL is, say, http://server_name/app, then IE will block all cookies and your session information will not be accessible between requests.
In fact, each request will spin up a separate session on the server, so if your page contains multiple images, script tags, etc., then each of those GET requests will result in a different session on the server.
Further information: http://support.microsoft.com/kb/316112
My issue was with a Microsoft MediaRoom IPTV application. It turns out that MPF MRML applications don't support cookies; changing to use cookieless sessions in the web.config solved my issue
<sessionState cookieless="true" />
Here's a REALLY old article about it:
Cookieless ASP.NET
In my case it was because I was modifying the session after redirecting from a gateway in an external application: since I was using the IP address instead of localhost in that page's URL, it was effectively considered a different website, with a different session.
In summary:
Pay more attention if you are debugging a hosted application on IIS instead of IIS Express and mixing your machine's http://IP and http://localhost in various pages.
In my case this was happening a lot in my development and test environments. After trying all of the above solutions without any success I found that I was able to fix this problem by deleting all session cookies. The web developer extension makes this very easy to do. I mostly use Firefox for testing and development, but this also happened while testing in Chrome. The fix also worked in Chrome.
I haven't had to do this yet in the production environment and have not received any reports of people not being able to log in. This also only seemed to happen after making the session cookies secure. It never happened in the past when they were not secure.
Update: this only started happening after we changed the session cookie to make it secure. I've determined that the exact issue was caused by there being two or more session cookies in the browser with the same path and domain. The one that was always the problem was the one that had an empty or null value. After deleting that particular cookie the issue was resolved. I've also added code to the Global.asax.cs Session_Start method to check for this empty cookie and, if found, set its expiration date to something in the past.
HttpCookieCollection cookies = Response.Cookies;
for (int i = 0; i < cookies.Count; i++)
{
    HttpCookie cookie = cookies.Get(i);
    if (cookie != null)
    {
        if ((cookie.Name == "ASP.NET_SessionId" || cookie.Name == "ASP.NET_SessionID")
            && String.IsNullOrEmpty(cookie.Value))
        {
            // Try resetting the expiration date of the session cookie to something in the past and/or deleting it.
            // Reset the expiration time of the cookie to one hour, one minute and one second in the past.
            if (Response.Cookies[cookie.Name] != null)
                Response.Cookies[cookie.Name].Expires = DateTime.Today.Subtract(new TimeSpan(1, 1, 1));
        }
    }
}
This was changing for me beginning with .NET 4.7.2 and it was due to the SameSite property on the session cookie. See here for more info: https://devblogs.microsoft.com/aspnet/upcoming-samesite-cookie-changes-in-asp-net-and-asp-net-core/
The default value changed to "Lax" and started breaking things. I changed it to "None" and things worked as expected.
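On 4.7.2 and later the session cookie's SameSite value can be set declaratively in web.config, for example:

<system.web>
  <sessionState cookieSameSite="None" />
</system.web>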
Be sure that you do not have a session timeout that is very short, and also make sure that, if you are using cookie-based sessions, the browser is actually accepting the session cookie.
The Firefox Web Developer toolbar is helpful at times like this, as you can see the cookies set for your application.
Session ID resetting may have many causes. However, none of the ones mentioned above related to my problem, so I'll describe it for future reference.
In my case, a new session being created on each request resulted in an infinite redirect loop. The redirect takes place in the OnActionExecuting event.
I had also been clearing all HTTP headers (again in the OnActionExecuting event, using the Response.ClearHeaders method) in order to prevent pages from being cached on the client side. But that method clears all headers, including the information about the user's session, and consequently all data in Temp storage (which I was using later in the program). So even setting up a new session in the Session_Start event didn't help.
To resolve my problem, I made sure the headers were not cleared when a redirection occurs.
Hope it helps someone.
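A related way to sidestep the problem entirely (a sketch of an assumed MVC filter, not the code from this answer) is to set the no-cache headers through the response cache API instead of calling Response.ClearHeaders(), so the session's Set-Cookie header is never touched:

using System;
using System.Web;
using System.Web.Mvc;

// Hypothetical filter: disables client-side caching without wiping the header collection.
public class NoCacheAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        var cache = filterContext.HttpContext.Response.Cache;
        cache.SetCacheability(HttpCacheability.NoCache);
        cache.SetNoStore();
        cache.SetExpires(DateTime.UtcNow.AddYears(-1));
        base.OnActionExecuting(filterContext);
    }
}

Decorate the affected controllers or actions with [NoCache] instead of clearing headers globally.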
I ran into this issue a different way. Controllers that had the [SessionState(SessionStateBehavior.ReadOnly)] attribute were reading from a different session, even though I had set a value in the original session at app startup. I was adding the session value via _layout.cshtml (maybe not the best idea?).
It was clearly the ReadOnly behavior causing the issue, because when I removed the attribute, the original session (and SessionID) stayed intact. Using Claudio's/Microsoft's solution fixed it.
I'm on .NET Core 2.1 and I'm well aware that the question isn't about Core. Yet the internet is lacking and Google brought me here so hoping to save someone a few hours.
Startup.cs
services.AddCors(o => o.AddPolicy("AllowAll", builder =>
{
builder
.WithOrigins("http://localhost:3000") // important
.AllowCredentials() // important
.AllowAnyMethod()
.AllowAnyHeader(); // obviously just for testing
}));
client.js
const resp = await fetch("https://localhost:5001/api/user", {
method: 'POST',
credentials: 'include', // important
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify(data)
})
Controllers/LoginController.cs
using System;
using System.Collections.Generic;
using Microsoft.AspNetCore.Http;   // Session GetString/SetString extensions
using Microsoft.AspNetCore.Mvc;

namespace WebServer.Controllers
{
    [Route("api/[controller]")]
    [ApiController]
    public class UserController : ControllerBase
    {
        [HttpPost]
        public IEnumerable<string> Post([FromBody] LoginForm lf)
        {
            string prevUsername = HttpContext.Session.GetString("username");
            Console.WriteLine("Previous username: " + prevUsername);
            HttpContext.Session.SetString("username", lf.username);

            return new string[] { lf.username, lf.password };
        }
    }
}
Notice that the session writing and reading works, yet no cookies seem to be passed to the browser. At least I couldn't find a "Set-Cookie" header anywhere.
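If the Set-Cookie header still does not show up, one more thing worth checking is the session registration itself, which is not shown above; this is an assumed sketch, not the answer's actual Startup code. The session cookie has to be allowed to travel on a cross-site request:

// In ConfigureServices, alongside AddCors:
services.AddDistributedMemoryCache();
services.AddSession(options =>
{
    options.Cookie.HttpOnly = true;
    // A cross-origin fetch with credentials only carries the cookie when SameSite
    // permits it and the cookie is marked Secure (i.e. served over HTTPS).
    options.Cookie.SameSite = SameSiteMode.None;
    options.Cookie.SecurePolicy = CookieSecurePolicy.Always;
});

// And in Configure, before UseMvc():
// app.UseSession();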