This is the strangest problem and is driving me crazy.
We had an online questionnaire and I decided to split it across several pages as opposed to having it in one long form.
The form submits to the same page; at the top of the page it checks which stage it is on, reads in the data from the previous stage and stores it in a session variable (an array). There are also a couple more session variables that sum up the answers, etc.
It reads in the data perfectly, but when the page is submitted again the session seems to lose the previous data and only holds the newer information.
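The top-of-page logic is roughly like this (a trimmed-down sketch rather than my actual code; the stage, field and variable names here are invented):

<%
' Sketch of the stage-handling logic described above
Dim stage, answers
stage = Request.Form("stage")
If stage <> "" Then
    If IsArray(Session("answers")) Then
        answers = Session("answers")        ' answers saved by earlier stages
    Else
        answers = Array("", "", "", "")     ' one slot per stage
    End If
    answers(CInt(stage) - 1) = Request.Form("q" & stage)
    Session("answers") = answers            ' write the updated array back
End If
%>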
This is the curious bit: it works perfectly in Chrome, Firefox, Safari and older versions of IE.
It also works in IE10/11 when I browse directly to the page (e.g. questionnaire.asp), but when I access the page via a URL rewrite (/questionnaire/stage-1) it fails to hold the session variables, and again only in IE10 and 11.
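For reference, the IIS rewrite rule is roughly this shape (an illustrative sketch, not my exact rule):

<system.webServer>
  <rewrite>
    <rules>
      <!-- map the friendly URL onto the real .asp page -->
      <rule name="Questionnaire stages" stopProcessing="true">
        <match url="^questionnaire/stage-([0-9]+)$" />
        <action type="Rewrite" url="questionnaire.asp?stage={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>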
This is driving me crazy. I can't see any issues in the code, so I feel it is something else, maybe relating to how IE works with session variables under IIS7 / URL rewriting!
Going mad here!
Thanks
Maybe you just need to extend the session timeout.
Try the code below:
<% Session.Timeout=120 %>
That is in minutes.
ASP.NET 4.0 Web Forms app.
User logs in and normally link URLs look like this:
https://domain.com/directory/Page.aspx
On occasion ALL links for the user will be rendered with a whole bunch of extra text, in the form:
https://domain.com/(F(194_random_characters_Here)/Services/directory/page.aspx
Odd thing is, the links still work. Once it starts happening for a given user, nothing short of an app pool recycle will fix it, and even then sometimes the user also has to clear their browser cache before the extra text goes away. We've seen it occur in several versions each of Firefox, Chrome and IE.
It's almost like the ~ portion of paths/links is getting the extra text added to it somehow.
It looks like you have cookieless sessions enabled.
Look in your web.config or IIS configuration and disable cookieless sessions.
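If that's the cause, forcing cookie-based behaviour in web.config should keep the token out of the URLs; forms authentication has an equivalent cookieless attribute if you're using it (a sketch of the relevant settings):

<system.web>
  <!-- never embed the session ID in the URL -->
  <sessionState cookieless="UseCookies" />
  <authentication mode="Forms">
    <!-- same idea for the forms authentication ticket -->
    <forms cookieless="UseCookies" />
  </authentication>
</system.web>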
The FB Like count on one of our pages was reset to zero after we temporarily took the page offline (we recently reinstated the page at its old URL).
I understand from the FB Developer docs that Facebook scrapes our pages every 24 hours; I also understand that Likes are linked to URLs.
Why has the page's Like count been reset to zero, even though it has been republished using the same URL? How long after a page is taken offline does FB consider it to be dead and reset the Like count?
Thanks for your help,
Alex
I noticed that FB's debugger (http://developers.facebook.com/tools/debug) was showing a Like count of 29 for our recently reinstated page - even though the page itself was still showing zero Likes. This gave me some hope that the missing Likes might be added back onto the page.
Within minutes of playing with the debugger, the page's Like count was showing 29.
I'm still no closer to finding out the answer to my original question, but perhaps the FB debugger can help others with similar problems.
This is happening in multiple versions of Safari, including 5.x
It will post __EVENTTARGET=&__EVENTARGUMENT= but nothing for __VIEWSTATE=.
This is only happening in Safari, and only on one page of our site.
I can't reproduce it - we've spent days trying to.
The viewstate isn't overly huge on this page.
Thanks!
We ran into a lot of viewstate problems with Safari 3. Safari limits the amount of data that can appear in any one field that gets posted back to the server.
The way we got around our problems was to set viewstate to span multiple input controls.
You can do this in the system.web / pages section of the web.config. For example:
<system.web>
  <pages maxPageStateFieldLength="500" />
</system.web>
You might have to play with the value. I can't remember what the limits are for the various versions of Safari. A few people have said 1 KB, but if I remember correctly from our testing, some versions were only passing around 500 bytes.
Another option is to store viewstate server side. You can see an example of this here. You should also read this blog about potential issues. We did try this path and eventually abandoned it as it conflicted with some other encryption things we were doing.
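If you do want to experiment with the server-side route, the standard hook is to override the page's PageStatePersister property (a sketch of the general approach, not the code from the linked example):

using System.Web.UI;

public class ServerSideViewStatePage : Page
{
    // SessionPageStatePersister keeps the viewstate in session state and
    // sends only a small identifier to the browser, so the size of the
    // __VIEWSTATE field stops mattering to Safari.
    protected override PageStatePersister PageStatePersister
    {
        get { return new SessionPageStatePersister(this); }
    }
}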
(taking a different tack from my previous answer)
To sum up what we know thus far:
only Safari
only a particular page
there is a device called StrangeLoop in the mix, which removes viewstate on the way out and puts it back in when the page is posted back. It does so through some type of token value.
A couple of questions:
First, is this limited to just a particular customer or set of people? I ask because it might be important that it's "only" Safari.
Second, does the StrangeLoop device have some type of timeout value or traffic limit after which its token cache is garbage collected?
I can envision a scenario where a particular client goes to this page and sits for a while (10 minutes, maybe longer?). In the meantime, either a timeout is hit or the amount of traffic you have forces the StrangeLoop device to throw out the viewstate for this particular client. Then, when they post back, the device has no viewstate to inject back into the HTML stream.
It seems to me that for you to end up with no viewstate at all, the device itself must not be injecting it. The only reasons I can come up with for that are the token value not being sent by Safari (unlikely, as it has to be quite small) or the device failing to find a match in its cache table.
Does the device have any sort of logging or metrics where you can see if it can't match an incoming token value?
A similar idea: if this page has some ajax going on, does the device send a different token back for each request, or does a single client browser retain the token for the entire browsing session? If it sends a different token, then it might be that Safari isn't properly updating itself client-side with the new token value. Although this path ought to be pretty easy to duplicate.
I am working locally on an ASP.NET site and am experiencing problems with postbacks in IE8.
I have a page with a repeater that builds a table and each row has a LinkButton on it that is used to delete that row.
In Firefox and Chrome, the button works as expected - the form posts back and all the values from the form are available for processing. In IE8, the form posts back but the form collection is empty, except for the button that initiated the postback.
This is a problem because:
1. it's odd and I don't understand it, and
2. I use the values from the posted-back form to rebuild some business objects (I don't store them in viewstate or session but rebuild them from scratch based upon values input by the user in the form). When I post back from FF/Chrome, the full form is there (e.g. Request.Form.AllKeys has, say, 60 items) and I can derive the values for my business objects. When I post back from IE, my form is practically empty and my rebuilding code fails (e.g. Request.Form.AllKeys has, say, only 9 items instead of the expected 60).
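A quick way to see the difference is to dump what actually arrives in Page_Load (a diagnostic sketch):

protected void Page_Load(object sender, EventArgs e)
{
    if (IsPostBack)
    {
        // Log every key/value pair the browser actually posted back
        foreach (string key in Request.Form.AllKeys)
        {
            System.Diagnostics.Debug.WriteLine(key + " = " + Request.Form[key]);
        }
    }
}

From FF/Chrome this logs the full set of keys; from IE8, only the handful mentioned above.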
I am at a loss to explain why there is this difference in the contents of the form collection upon postback between FF/Chrome and IE and would greatly appreciate any insight/help in this regard.
I've tried to break the issue down as I see it - if any further info is required, please let me know. Thanks for your help.
Your problem sounds unusual and is not something I've ever come across. Whilst I can't help directly, I'd recommend using Fiddler - Web Debugging Proxy - to examine the HTTP traffic as a means of diagnosing what is going on. To quote, "Fiddler is freeware and can debug traffic from virtually any application, including Internet Explorer, Mozilla Firefox, Opera, and thousands more..."
Solved this - turns out I had a form nested within the main form on my master page. Removed it and all is well.
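For anyone hitting the same thing, the broken structure was something like this (simplified; nested forms are invalid HTML, and IE8 in particular dropped the outer form's fields from the post):

<form id="form1" runat="server">
    ...
    <!-- the culprit: a second form nested inside the server form -->
    <form action="Search.aspx" method="post">
        <input type="text" name="q" />
    </form>
    ...
</form>

Removing the inner form (or moving it outside the server form, so the two are siblings) restores the full form collection on postback.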
We had a similar problem with IE8 on Windows 2008, and the solution was related to user rights escalation.
On the Windows 2008 machine I had to go to:
Start > Administrative Tools > Local Security Policy > Local Policies > Security Options
then select "User Account Control: Admin Approval Mode for the Built-in Administrator account",
set it to Enabled and then Apply.
Thanks
Anugrah
Sometimes Microsoft does something so stunningly dumb that it makes my head hurt. Help me find out it's really not the case ... please!
I've got an issue with the login page of an ASP.NET (3.5) site I'm developing whereby IE (7 or 8 ... can't bear to open 6) doesn't offer to save the password when a user logs in. I've checked other browsers, and Firefox, Chrome and Safari all offer to save the password just fine. I've also confirmed that IE password saving on my test boxes is working OK on other sites, e.g. Google.
The searching I've done has turned up very little, but what little it did turn up seems to suggest that IE won't offer to save a password if the form on the page contains more than two text controls. That's the case with my form, which also has controls to allow a user to register. And when I remove these additional controls, IE magically prompts to save the password, so this does seem to be true.
Now ... if ASP.NET allowed me to have multiple forms, all would be well: I could separate the two functions into standalone forms and IE would prompt to save passwords. But ASP.NET doesn't allow this, as it only permits a single server-side form. I could fudge a non-runat=server form in there and try to do this, but guess what? Because my page uses a MasterPage, any form tag I add is automatically stripped out, even if it's a non-runat=server form.
So, I don't see any way around this without fundamentally changing what I was trying to achieve. It looks like I have to explain to my users that they won't be prompted to have their passwords saved if they use IE (a Microsoft product) because I developed my site with ASP.NET (err ... a Microsoft product).
If this is so, I just can't get over how head-smackingly ridiculous it is. If anyone can offer any ideas on how to get around it, can tell me I've got it all wrong and am a big, stupid idiot myself, or just wants to confirm that it's not just me who thinks this is monumentally dumb, then please, please do so.
Just for the record, I really don't want to (and don't see why I should have to) compromise my design and split my pages in two (which will result in a worse experience for the user).
@Chris That's what I went for in the end.
So, for the benefit of anyone else: I still have my activation controls in a runat=server form and process these in the code for that page. Then I have a second, standard HTML form with plain HTML input fields that posts to a different .NET page. This deals with the user's login. I pick up the values in that page via Request.Form and handle the login from there.
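In outline, the page ended up something like this (a simplified sketch; the control and page names here are invented):

<%-- Server form: handles the registration/activation controls --%>
<form id="registerForm" runat="server">
    <asp:TextBox ID="txtActivationCode" runat="server" />
    <asp:Button ID="btnActivate" runat="server" Text="Activate"
        OnClick="btnActivate_Click" />
</form>

<%-- Plain HTML form posting to a separate page: IE now sees a simple
     two-field login form and offers to save the credentials --%>
<form method="post" action="DoLogin.aspx">
    <input type="text" name="username" />
    <input type="password" name="password" />
    <input type="submit" value="Log in" />
</form>

DoLogin.aspx then reads Request.Form["username"] and Request.Form["password"] and performs the actual login.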
Upsides:
It all works and users get their logins remembered as they would expect to.
Downsides:
I lost the ability to use a MasterPage (as I need two forms in the page), so I have effectively had to duplicate the template - I don't like this much.
If the user's login is invalid or causes some kind of error, I have to redirect to the initial page and pass it a flag to get it to show a relevant error message - I don't like this much either.
Like I say, though, it just works, and in this case that's what was most important. Thanks for your input.