I have a very strange problem with a website where several objects are cached.
We have a lot of DataTables, strings, booleans and other stuff that are cached for quick fetching in later requests.
Sometimes we get a periodic error where it looks like some of the cache items have been mixed up.
An example of how this shows itself is when a piece of code fetches a DataTable from the cache and then tries to access a certain column of that DataTable.
We then see a yellow screen of death with the exception "Cannot find column [ColumnName]", where "ColumnName" of course is some column name that was supposed to be in the DataTable.
When I inspect the cache item with a little homemade tool, I see that a completely different DataTable is in the cache item. It is almost as if some of the cache items have been mixed up.
Does anybody have an idea how this happens?
We are not able to reproduce the error. It occurs at apparently random intervals.
What's the issue
When you add items to the cache, you need to lock the process that creates the data and adds it to the cache.
First, let's clarify that the cache keeps a reference to your data; it does not clone the data, nor does it know anything about what the data contains. Reference: http://msdn.microsoft.com/en-us/library/6hbbsfk6(VS.71).aspx
Second, note that by default the session locks each page, and this is what makes most requests safe: each user's requests are serialized until a page has fully loaded and been sent.
When it appears
So the locking issue may appear when you write to the cache from a thread, from a handler, or from a page that has the session turned off.
How to lock
If you use only one pool, then a simple lock(object){} can work; if you use many pools, then you need a Mutex for the lock.
You need to lock the full process of creating your data if you may change the data later while they still exist in the cache, or only the cache access if you store a clone of them.
For example, if you read some data that you got from the cache and then edit it, anyone else reading the same cache entry at that moment will get corrupted data, because the cache gives you a reference to the same object.
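For illustration, a minimal sketch of locking the creation step, assuming a hypothetical LoadCustomersFromDb() loader and a "Customers" cache key:

    using System;
    using System.Data;
    using System.Web;
    using System.Web.Caching;

    public static class CustomerCache
    {
        private static readonly object CacheLock = new object();

        public static DataTable GetCustomers()
        {
            DataTable table = (DataTable)HttpRuntime.Cache["Customers"];
            if (table == null)
            {
                lock (CacheLock) // serialize creation so two requests cannot race
                {
                    // Re-check inside the lock: another request may have filled it already.
                    table = (DataTable)HttpRuntime.Cache["Customers"];
                    if (table == null)
                    {
                        table = LoadCustomersFromDb(); // hypothetical loader
                        HttpRuntime.Cache.Insert("Customers", table, null,
                            DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);
                    }
                }
            }
            // The cache hands back a reference: treat it as read-only,
            // or work on table.Copy() if you need to modify it.
            return table;
        }

        private static DataTable LoadCustomersFromDb()
        {
            return new DataTable("Customers"); // placeholder for real data access
        }
    }

The double-check inside the lock avoids rebuilding the table when two requests arrive at the same moment.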
Hope that all this helps.
Related
On a new website, I have a huge form (really big: it needs at least 15-20 minutes to complete) that configures the whole website for one client for the next year.
It is split across several tabs (it's a wizard). Every time we go to the next tab, it makes a regular (non-AJAX) call to the server that generates the next "page". The information entered so far is stored in the session (an object with a custom binder).
Everything was working fine until we tested it today with real data. Real data requires thought and effort to find the correct elements, and that takes time.
The problem we have is that the view receives a partially empty model. The session duration is set to 1440 minutes (and in IIS too). For now, what I know is that I get a NullReferenceException the first time I try to access the model in my view.
I have been checking the controller for about an hour, but it seems impossible for it to return a null model. If I enter all the data very quickly, I don't have any problem (but that's with random data).
For now I have only managed to reproduce this problem on the IIS server, and I'm checking the ELMAH logs to debug it, so it's not easy to reproduce.
Do you have any idea how I should debug this? I'm a little lost here.
I think you should assume the session does not offer reliable persistence. I am not sure about the details, but I believe it will start freeing elements when it exceeds its memory limit.
You will be safer if you use a database to store that information, or you could introduce your own implementation for persisting state.
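As a rough sketch of the database suggestion, each wizard step could persist its part of the model immediately, so nothing is lost if the session is recycled (the WizardState table and the pre-serialized payload are assumptions):

    using System;
    using System.Data.SqlClient;

    public static class WizardStateStore
    {
        public static void Save(Guid userId, string serializedModel, string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                @"MERGE WizardState AS t
                  USING (SELECT @UserId AS UserId) AS s ON t.UserId = s.UserId
                  WHEN MATCHED THEN
                      UPDATE SET Payload = @Payload, UpdatedAt = GETUTCDATE()
                  WHEN NOT MATCHED THEN
                      INSERT (UserId, Payload, UpdatedAt)
                      VALUES (@UserId, @Payload, GETUTCDATE());", conn))
            {
                cmd.Parameters.AddWithValue("@UserId", userId);
                cmd.Parameters.AddWithValue("@Payload", serializedModel);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }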
In addition to the answer provided by @Ufuk:
You can easily send an AJAX request every minute that does nothing on the server, but by doing this the session won't expire and the site will keep working over extended periods.
The problem was that the session didn't have enough space, I think. I temporarily resolved my problem by restarting the application pool. I am still searching for a solution that does not imply changing all this code. Maybe another session state mode, but then I need to make my models serializable.
Can anyone help or provide me with some suggestions for the query below?
I have a web form (Minutes of Meeting) and 8 users that need to access this web page and update their area. A user may have more than one area to update, and essentially I would like to somehow lock down the web page, if possible, while a user is using it, so that no other user can update it until Joe Bloggs has finished with it.
I have an Active Directory security group set up to restrict the site to that group of users only, but I need a solution to the above.
Is there a way I can do this via a web control or via SQL?
There must be better ways to do it. However, is it possible for you to introduce a SQL table column similar to "UpdateInProgress" (bit)? Any update process checks that column: if it is 0, the process sets it to 1, and after saving the changes it sets it back to 0 so that the form is available for others to update. If the update process sees 1, it can't update the web form, because an update is in progress.
I also suggest introducing another column named "UpdateInProgressBy" to record who has opened it for editing.
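A minimal sketch of that check-and-set, done in a single UPDATE so two users can never both see 0 (the table name and connection handling are assumptions):

    using System.Data.SqlClient;

    public static class EditLock
    {
        // Returns true only if this user atomically acquired the edit lock.
        public static bool TryAcquire(int meetingId, string userName, string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                @"UPDATE MeetingMinutes
                     SET UpdateInProgress = 1, UpdateInProgressBy = @User
                   WHERE Id = @Id AND UpdateInProgress = 0;", conn))
            {
                cmd.Parameters.AddWithValue("@Id", meetingId);
                cmd.Parameters.AddWithValue("@User", userName);
                conn.Open();
                // One row updated = lock acquired; zero = someone else is editing.
                return cmd.ExecuteNonQuery() == 1;
            }
        }
    }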
First of all, we must note that a lot of time passes from the moment the user reads the data, gets it on a page, changes it and then tries to write it back. So we are not talking about the LOCK command in SQL, nor any other lock that happens in milliseconds and helps to synchronize threads; here we must synchronize people and what they write.
There is also a problem if the user leaves the page for any reason, because this can leave the data locked forever.
This problem can be solved with two approaches.
The easy one: when a user tries to save data, you check whether the same data has been changed in the meantime, and warn them, show a merge dialog, merge programmatically, or something similar, depending on what you want (see the sketch after these two approaches).
The difficult way is to constantly monitor the page that reads and changes the data, and keep the monitoring results in a common table in the database; while a user is on the page, the rest of the users get a warning and read-only data, until that user leaves.
This monitor must be made with JavaScript and must detect even when a user abandons the page.
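A sketch of the easy approach, assuming a rowversion column named RowVersion on a hypothetical MeetingMinutes table:

    using System.Data.SqlClient;

    public static class OptimisticSave
    {
        // Returns false if the row changed since it was read, so the caller
        // can warn the user or show a merge dialog.
        public static bool TrySave(int id, string notes, byte[] originalVersion, string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                @"UPDATE MeetingMinutes
                     SET Notes = @Notes
                   WHERE Id = @Id AND RowVersion = @OriginalVersion;", conn))
            {
                cmd.Parameters.AddWithValue("@Id", id);
                cmd.Parameters.AddWithValue("@Notes", notes);
                cmd.Parameters.AddWithValue("@OriginalVersion", originalVersion);
                conn.Open();
                return cmd.ExecuteNonQuery() == 1;
            }
        }
    }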
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
For more information, check this link:
http://msdn.microsoft.com/en-us/library/ms173763.aspx
I am using a master page in which the menu is generated dynamically in code according to the user's role. The same menu is used across the whole application for a particular user until log out, so instead of recreating it, I need the same menu everywhere in the application. The menu is held in a StringBuilder, which is very large. Is Session or the data Cache better and less memory-consuming in my situation, and why? Please suggest.
I want to improve the performance of the master page.
Thanks
I think the Cache will be better, as you will have only one instance created per role, whereas the Session will create as many instances as there are users accessing the site, and you may have to wait for the session timeout before the memory is freed.
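A minimal sketch of that per-role caching, assuming a hypothetical BuildMenuForRole() that wraps your existing StringBuilder code:

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class MenuCache
    {
        public static string GetMenu(string role)
        {
            string key = "Menu_" + role;
            string menu = (string)HttpRuntime.Cache[key];
            if (menu == null)
            {
                menu = BuildMenuForRole(role); // hypothetical: your generation code
                // One cached copy per role, shared by every user in that role.
                HttpRuntime.Cache.Insert(key, menu, null,
                    DateTime.UtcNow.AddHours(1), Cache.NoSlidingExpiration);
            }
            return menu;
        }

        private static string BuildMenuForRole(string role)
        {
            return "<ul><!-- menu for " + role + " --></ul>"; // placeholder
        }
    }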
If every user gets the same menu:
You should consider putting it in the Application "Cache" - Application["MyMenu"] or a static field on one of your objects.
The main reason for this is lifetime. If you put it in an application level object, then it will last for the lifetime of the application. Putting it in a session level object will cause it to be lost when that session ends - as a session is started per user, then you will soon find yourself recaching the data.
On the other hand... if it's unique per user:
The session provides a handy place to put this data, as it is unique to that user, and will not live long beyond that user leaving the site.
Also think about:
If you really think memory is going to be an issue or you want to define exactly how long you keep it for
Put it in the Cache. You can determine the amount of time it lives in the cache, and, additionally, the cache will start to dump objects when it gets short on memory - so it is more sensitive to load than the other options.
There is a good discussion of Session vs Cache on SO already
Additionally
Are you sure your menu is that big? If it is, you might want to consider alternatives - just how big are you talking?
Given the chart here, what should I be looking at to identify the bottleneck? As you can see, requests are averaging nearly 14 seconds under load and the bulk of that time is attributed to the CLR in New Relic's profiling data. In the performance breakdown for a particular page, it attributes the bulk of the time to the WebTransaction/.aspx page.
I see that the database is also being read (the orange), and it seems that one page delays the rest of the pages because of the lock the session takes on the pages.
You can also read:
Replacing ASP.Net's session entirely
My suggestion is to remove the session calls entirely, and if this is not possible, find another way to save that data somewhere in the database yourself.
Actually, in my pages I have used all three possible options: 1. I call the page without session. 2. I have made a totally custom session, with values connected to the user's cookie. 3. I have made threads that run outside the session and do their calculations in the background; when they finish, I show the results.
In some cases the calculations are done in an iframe that calls a page without session, and I show the results there later.
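For reference, option 1 (calling a page without session) is controlled by the page directive; a minimal sketch with made-up file names (EnableSessionState="ReadOnly" is also useful, since it reads session values without taking the write lock):

    <%@ Page Language="C#" EnableSessionState="False" CodeBehind="Report.aspx.cs" Inherits="MyApp.Report" %>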
In the Pro version, you can use Transaction Traces, which help pinpoint exactly where the issue is happening.
I have a web app that consumes a web service. The main page runs a search by passing parameters to a particular web service method, and I bind the results to a GridView.
I have implemented sorting and paging on the grid by putting the DataTable that the grid is bound to in the viewstate and then reading / sorting / filtering it as necessary and rebinding to the grid.
As the amount of data coming back from the web service has increased dramatically, when I try to page/sort etc. I receive the following errors:
The connection was reset
The connection to the server was reset while the page was loading.
I have searched around a bit, and it seems that a very large viewstate is to blame for this.
But surely the only other options are to:
Limit the results
Stick the datatable in the session rather than the viewstate
Something else I am unaware of
Previously I did have the DataTable in the session, as some of this data needed to persist from page to page (it was not being posted, however, so viewstate was not an option). As the amount of data grew and the necessity to persist it was removed, I used the viewstate instead, thinking this was a better option than the session because of the amount of data the session would have to hold and the number of users using the app.
It appears maybe not.
I thought that when the viewstate got very big, .NET split it over more than one hidden viewstate field, but it seems all I'm getting is one mammoth viewstate that I have trouble viewing in the source.
Can anyone enlighten me as to how to avoid the error I'm getting, if it is indeed to do with the amount of data in the viewstate?
It sounds like you're caching the whole dataset for all pages even though you are only presenting one page of that data. I would change your pagination to only require the data for the current page the user is on.
If the query is heavy and you don't want to be constantly calling it over and over because there is a lot of paging back and forth (you should test the typical usage pattern), then I would implement some type of caching on the web service end, caching page by page (by specific user if the data is specific to a user), and have it expire rather quickly (e.g. a few minutes).
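A rough sketch of that service-side, per-user, per-page cache; GetPageFromBackend() and the key format are assumptions:

    using System;
    using System.Data;
    using System.Web;
    using System.Web.Caching;

    public static class SearchPageCache
    {
        public static DataTable GetPage(string userId, int pageIndex, int pageSize)
        {
            string key = string.Format("Search_{0}_{1}_{2}", userId, pageIndex, pageSize);
            DataTable page = (DataTable)HttpRuntime.Cache[key];
            if (page == null)
            {
                page = GetPageFromBackend(userId, pageIndex, pageSize); // hypothetical
                // Expire quickly so stale results do not linger.
                HttpRuntime.Cache.Insert(key, page, null,
                    DateTime.UtcNow.AddMinutes(2), Cache.NoSlidingExpiration);
            }
            return page;
        }

        private static DataTable GetPageFromBackend(string userId, int pageIndex, int pageSize)
        {
            return new DataTable(); // placeholder for the real query
        }
    }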
I think you need to limit the total amount of data you're dealing with. Changing your code to not pass back extra data that might never be needed is a good place to start.
EDIT: Based on your comments:
You can't change the web service
The user can manipulate the query by filtering or sorting
There is a large amount of data returned by the web service
The data is user specific
Well, I think you have a perfect case for using the Session then. This can tax the server with large numbers of users and large amounts of data, so you might want to implement some logic to clear the data from the Session rather than waiting for it to expire (for example, on certain landing pages you know the user will visit when they are done, clear the session data, as sketched below).
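For example, on such a landing page (the "SearchResults" key is an assumed name):

    using System;

    public partial class SearchDone : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Free the heavy data now instead of waiting for the session timeout.
            Session.Remove("SearchResults");
        }
    }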
You really want to get it out of the ViewState because it is a huge bandwidth hog. Just look at your physical page size: that data is being passed back and forth with every action. Moving it to the Session would eliminate that bandwidth usage and still allow you to do everything you need.
You could also look at the data the web service is bringing back and store it in a custom object that you make as 'thin' as possible. If you're storing a DataSet or a DataTable in your Session, those objects carry some extra overhead you probably don't need, so store the data as an array of some custom thin object and just bind to that. You would need to map the result from the WS to your custom object, but this is a good option to cut down on memory usage.
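A sketch of such a thin object, with made-up column names:

    using System;
    using System.Collections.Generic;
    using System.Data;

    [Serializable]
    public class SearchRow
    {
        public int Id;
        public string Name;
    }

    public static class ThinMapper
    {
        public static List<SearchRow> ToThinRows(DataTable table)
        {
            var rows = new List<SearchRow>(table.Rows.Count);
            foreach (DataRow r in table.Rows)
            {
                rows.Add(new SearchRow { Id = (int)r["Id"], Name = (string)r["Name"] });
            }
            return rows; // store this in Session and bind the grid to it
        }
    }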
Let me know if there is something else I am missing.
I wouldn't put the data in either the view state or the session. Instead, store the bare minimum information needed to re-request the dataset from the web service (in either view state, session, or even on the URL). Then call the web service using that data and rebuild the dataset on each request. If necessary, look at using some form of caching (memCache) to improve performance.
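A minimal sketch of that idea; the SearchCriteria class and the searchService call are assumptions:

    using System;

    [Serializable]
    public class SearchCriteria
    {
        public string Term;
        public string SortColumn;
        public int PageIndex;
    }

    // On each postback, rebuild the grid from the stored criteria:
    //     var criteria = (SearchCriteria)Session["Criteria"];
    //     grid.DataSource = searchService.Search(criteria.Term, criteria.SortColumn, criteria.PageIndex);
    //     grid.DataBind();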