For pages whose view state spans 10-15 KB, what would be an optimal value for
<pages maxPageStateFieldLength="">
in web.config, in order to reduce the risk of truncation leading to view state validation errors?
The maxPageStateFieldLength doesn't limit the size of the view state, so it won't be truncated.
What it does is simply control how much of the view state will be placed in one hidden field. If the view state is larger than the maxPageStateFieldLength value the view state will simply be split into several hidden fields.
The default value is -1, meaning that the entire view state is put in a single hidden field, which is normally optimal.
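For example, to cap each hidden field at 4,096 characters (an illustrative value, not a recommendation):

    <pages maxPageStateFieldLength="4096" />

With this setting, a 12 KB view state is split across several hidden fields (__VIEWSTATEFIELDCOUNT plus __VIEWSTATE, __VIEWSTATE1, and so on) that ASP.NET reassembles on post-back.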
I am currently running into a problem where we are getting "HttpException Due to Invalid Viewstate" errors for Firefox users only. I am also seeing "System.Web.HttpException: An error occurred while communicating with the remote host. The error code is 0x80070001" exceptions paired with the view state errors.
I suspect Firefox may have a maximum size for (possibly) hidden form fields; I need to verify this. I'm now looking to chunk up the view state using maxPageStateFieldLength in the hope of resolving the issue in the short term. The longer-term solution is to refactor the ASPX page to use a paging query for the grid, instead of pulling down all the rows in one go, which is not a good thing to do.
IMHO, you should put in a maximum that is fairly large but not unlimited. I'd say 1MB is a start.
I'm trying to debug a weird issue where a control is being rendered with multiple (different) id attributes. I believe something is manually adding the incorrect id, but I'm having issues figuring out when, and myControl.Attributes is empty.
Is it possible to set a breakpoint midway through Page_Render and read what has currently been written to the Output stream?
Note that Response.OutputStream.CanRead is false.
Or better yet, is there a way to view the string that would be rendered by a control at a given point in time?
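One technique that can help (a sketch, assuming you can call it from the debugger or from page code; the helper name is illustrative): render the control into an in-memory HtmlTextWriter and inspect the resulting string.

    using System.IO;
    using System.Web.UI;

    // Renders a single control to a string so its markup can be inspected
    // at an arbitrary point (e.g. from the Immediate window at a breakpoint).
    static string RenderToString(Control control)
    {
        using (var stringWriter = new StringWriter())
        using (var htmlWriter = new HtmlTextWriter(stringWriter))
        {
            control.RenderControl(htmlWriter); // writes the control's markup to the writer
            return stringWriter.ToString();
        }
    }

Keep in mind that output captured before the render phase completes may not match the final markup.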
I watched a talk recently (http://vimeo.com/68390507) where the speaker says several times, very seriously, never to set EnableViewStateMac=false.
While using Enterprise Web Library, I noticed that EnableViewStateMac is set to false. What is being done in EWL to make up for this? How can I trust that it's secure?
It's important to note that while EWL currently has a dependency on Web Forms (and view state), it is a weak dependency, and our roadmap calls for eliminating it entirely. EWL completely overrides the saving and loading of view state via Page.SavePageStateToPersistenceMedium and Page.LoadPageStateFromPersistenceMedium, which means it is impossible for any controls on the page to store their own [possibly insecure] state.
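For reference, a minimal sketch of what overriding those two methods looks like (the session-based storage here is illustrative, not EWL's actual implementation):

    using System.Web.UI;

    public class CustomStatePage : Page
    {
        // Keep page state server-side instead of in the __VIEWSTATE field.
        protected override void SavePageStateToPersistenceMedium(object state)
        {
            Session["__pageState"] = state;
        }

        // Load it back on post-back; controls never see raw client-supplied state.
        protected override object LoadPageStateFromPersistenceMedium()
        {
            return Session["__pageState"];
        }
    }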
Here's the complete list of what EWL stores in view state:
EWL "page state". This is data that needs to persist for as long as a user stays on a page, but shouldn't be stored in a database or other durable storage. For example, the current item display limit of an EwfTable, or a form field value that needs to be saved on an intermediate post-back so that parts of the page can be refreshed. This type of data is directly manipulated by the user and not any more secret than run-of-the-mill form field values. In fact, we're considering storing it even more openly in hidden fields, which will enable JavaScript to manipulate it without post-backs.
A "form value hash". This is a hash of all form field values at the time the page was rendered. It is used on post-back to inform the user if any of the data changed under their feet since the last time they loaded the page. If this hash is hacked, two things could happen. First, the user could receive a "concurrency error" even if no data changed. Second, the user could not receive a concurrency error even if data did change. This second case may sound bad, but keep in mind that most web applications in the wild do not even have this type of concurrency checking in the first place.
The ID of the data-modification that failed on the last post-back. This is either null, empty, or equal to one of the post-back IDs present in the HTML of the page, and is used to re-run a data modification in certain cases, in order to re-display validation errors. The worst hacking outcome is that a different, but still triggerable, set of validation errors gets displayed.
It is not secure, unfortunately. The point of EnableViewStateMac isn't to prevent a control from round-tripping insecure state. It's to prevent an attacker from injecting his own state and having a control interpret it as valid.
EnableViewStateMac=false is an insecure setting. Full stop. No conditions, no exceptions, no excuses. Applications should never under any circumstance set this switch to false.
In fact, since there's no valid reason for an application to ever do this, we (the ASP.NET team) are going to forbid setting EnableViewStateMac=false in an upcoming version of ASP.NET. This may break applications that have been deployed with this setting. Normally we wouldn't do something with such a great compatibility impact, but I hope the fact that we're making an exception here demonstrates how serious we are when we say "nobody should ever do this."
In a recent project we are getting 12031 errors. Here is the complete error:
Sys.WebForms.PageRequestManagerServerErrorException: the status code returned from the server was 12031
The problem is, this doesn't happen all the time and we are unable to reproduce the error on development environment.
We use AJAX in our application and this exception happens on every page once in a while.
I've found a post on SO with the same problem and tried changing maxRequestLength to "1" to see if I would consistently get the same error, but I don't. Instead, I'm getting
Maximum request length exceeded.
So I'm starting to think it is not related to maxRequestLength, and I'm running out of ideas. I have a ScriptManager in my MasterPage with AsyncPostBackTimeout="240", which is roughly the same amount of time: I get the 12031 error after about 3.5 minutes of nothing happening. I'm logging one of the pages; by logging, I mean logging every section of the page ("Page_Load is called", "xyz is called", etc.), and I have about 15 such spots on the page. After the user clicks a button and the ScriptManager tries to do its job, no postback occurs and no logging happens. It is as if the page wants to do a postback but can't: it tries for around 3.5 minutes and then fails with the given error.
Please, if you have any ideas, help me out.
Thank you
That error almost certainly has nothing to do with the size of the response, AsyncPostBackTimeout, or maxRequestLength.
Connection resets are usually indicative of poor network connectivity or a server loaded down to its capacity limits. A few things you could try:
Inspect the Windows Event Log during the time(s) that the kiosks were known to have received the error. Look for any relevant errors or warnings.
If feasible, ask the kiosk staff to use something like Pingtest to test the quality of their local network connection at the time that they receive the error in your app.
Use a service like Pingdom to ensure that the server itself isn't intermittently losing connectivity.
This error may be due to the HTTP runtime's maxRequestLength limit. The default value is 4096 (KB).
Try adding (or editing) the following entry in your Web.Config:
"<httpRuntime maxRequestLength="8192" />" (effectively allowing 8mb of data transmission, instead of the default 4mb).
Please not....You can set data as per you max request. 8192 is not the limit. Also you need to add Page.Form.Attributes.Add("enctype", "multipart/form-data"); in Page_Load event of the page.
You'll want to put this entry in the system.web configuration section.
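For example, a minimal web.config sketch (the 8192 value is illustrative):

    <configuration>
      <system.web>
        <httpRuntime maxRequestLength="8192" />
      </system.web>
    </configuration>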
How do I avoid getting a PageRequestManagerParserErrorException?
To start with, don't do any of the things that cause it in the first place! Here's a list of how to avoid each known cause (when possible):
Calls to Response.Write():
Place an <asp:Label> or similar control on your page and set its Text property. The added benefit is that your pages will be valid HTML; with Response.Write() you typically end up with pages that contain invalid markup.
Response filters:
The fix might just be to not use the filter. They're not used very often anyway. If possible, filter things at the control level and not at the response level.
HttpModules: Same as response filters.
Server trace is enabled:
Use some other form of tracing, such as writing to a log file, the Windows event log, or a custom mechanism.
Calls to Server.Transfer():
I'm not really sure why people use Server.Transfer() at all. Perhaps it's a legacy thing from Classic ASP. I'd suggest using Response.Redirect() with query string parameters or cross-page posting.
Another way to avoid the parse error is to do a regular postback instead of an asynchronous postback. For example, if you have a button that absolutely must do a Server.Transfer(), make it do regular postbacks. There are a number of ways of doing this:
The easiest is to simply place the button outside of any UpdatePanels. Unfortunately the layout of your page might not allow for this.
Add a PostBackTrigger to your UpdatePanel that points at the button. This works great if the button is declared statically through markup on the page.
Call ScriptManager.RegisterPostBackControl() and pass in the button in question. This is the best solution for controls that are added dynamically, such as those inside a repeating template (see the sketch below).
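For example, a minimal sketch of the last two options (the IDs and the dynamicButton variable are illustrative):

    <asp:UpdatePanel ID="MainPanel" runat="server">
      <Triggers>
        <!-- Declarative: force a full post-back for one button inside the panel. -->
        <asp:PostBackTrigger ControlID="TransferButton" />
      </Triggers>
      <ContentTemplate>
        <asp:Button ID="TransferButton" runat="server" Text="Go" OnClick="TransferButton_Click" />
      </ContentTemplate>
    </asp:UpdatePanel>

    // Programmatic: for controls created dynamically.
    protected void Page_Load(object sender, EventArgs e)
    {
        // GetCurrent returns the page's ScriptManager (null if the page has none).
        ScriptManager.GetCurrent(this).RegisterPostBackControl(dynamicButton);
    }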
Good luck!
I have received this error before when we had a Barracuda Device sitting in front of our website. It was a maximum request length issue because Barracuda protects against overloading the request size. We removed the device temporarily and it solved the problem. Not sure if this is your problem.
I have an application that holds data on 300,000 customers. When a user did a search, the result was often bigger than our maxRequestLength would allow, so we have dealt with this in two ways: we increased our maxRequestLength to 102400 (KB), and we required the user to supply the first two letters of the first name and the first two letters of the last name, to limit the sheer number of customer records returned. This keeps us from exceeding the maxRequestLength limit.
I was just wondering if anyone had any insight into whether this is a particularly good approach, whether there is a limit to how big maxRequestLength could or should be, and what other options might be useful in this situation.
Most web applications I have seen deal with this by returning a paginated list, and displaying only the first page of results.
In modern implementations using ORMs, "Skip" and "Take" operators are used to retrieve only those records which are required for a given page.
So any given response is no larger than one page of records.
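For example, a sketch with LINQ (the db context and the pageIndex/pageSize variables are illustrative):

    // Server-side paging: fetch only the requested page of customers.
    var customersPage = db.Customers
        .OrderBy(c => c.LastName)      // Skip/Take require a stable ordering
        .Skip(pageIndex * pageSize)    // skip the earlier pages
        .Take(pageSize)                // take one page worth of records
        .ToList();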
I would recommend paging the results instead of displaying everything, and adding multiple search fields so your users can filter the results further and find what they are looking for faster.
As you can guess from my comment, I think maxRequestLength only restricts the size of the request, i.e. the amount of data sent from the client/browser to the server.
If you are exceeding this limit, then this probably means you have a huge ViewState which is sent with every response. ViewState is stored in a hidden field on the page and is sent back to the server with every PostBack (and that's where the maxRequestLength setting could come into play). You can easily check this by viewing the source of your page in the web browser and looking for a hidden INPUT element with the name "__VIEWSTATE" and a large string value.
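It looks something like this (the value shown is truncated and illustrative):

    <input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE" value="/wEPDwUKLTk2Njk3..." />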
If this is the case, then you should try to reduce the size of the ViewState, e.g. by
setting EnableViewState="false" on your controls (GridView or whatever) and re-binding the control on every PostBack (this is the recommended approach; see the sketch after this list)
storing the ViewState on the server side
compressing the ViewState
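A minimal sketch of the first option (the control ID and the LoadCustomers call are illustrative):

    <asp:GridView ID="CustomerGrid" runat="server" EnableViewState="false" />

    protected void Page_Load(object sender, EventArgs e)
    {
        // With view state disabled, the grid must be re-bound on every request,
        // not just on the initial GET.
        CustomerGrid.DataSource = LoadCustomers(); // hypothetical data-access call
        CustomerGrid.DataBind();
    }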
If your requirements allow it, I would suggest implementing server-side paging. That way you only send one page worth of records over the wire rather than the entire record set.
300,000 records is a completely unusable result set from a human perspective.
As others have said, page the results to something like the top 50 or 100 records. Let them sort it and provide a way to narrow the search criteria.
For perspective, look at Google: they default to 10 records per page. Part of the reason for this is that people would rather refine their criteria than go spelunking through a large result set.
I am running into a randomly occurring issue, and it looks like either there is a bug in the third-party control we are using or the size of a form field is limited. I've seen that there is a limit in Classic ASP (http://support.microsoft.com/default.aspx?scid=kb;EN;q273482), but is there a limit in .NET?
I believe that if we hit the maximum limit on the entire post body, ASP.NET would generate an error instead of truncating the form field. Most likely this is an error in the third-party control, but I want to vet all other possible options. Essentially, what is occurring is that they are posting a URL-encoded XML message in the body, and the XML is sometimes getting truncated.
Thanks in advance.
Check to see that there isn't a limit in the database table or stored procedure. If there isn't a limit there, then maybe the parameter variable is declared with a limit in the .NET code. The default maxRequestLength set in machine.config is 4096 (KB), which should accommodate most form posts.
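For example, a parameter declared with an explicit size can silently truncate longer values (a sketch; the names are illustrative):

    using System.Data;
    using System.Data.SqlClient;

    // A parameter declared NVarChar(4000) will silently truncate a longer
    // value when it is sent to SQL Server -- no exception is raised.
    var xmlParameter = new SqlParameter("@xmlMessage", SqlDbType.NVarChar, 4000)
    {
        Value = urlDecodedXml // hypothetical variable holding the posted XML
    };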
There shouldn't be a limit on the field itself; I have projects where people post upwards of 200,000 characters to a single form field.