Session state blocking async ajax calls from being processed concurrently - asp.net

I am trying to make 6 asynchronous jQuery ajax calls to my .NET page methods all at once on document.ready, to request different sets of data from the database and render them as charts for the user.
The problem is that when one chart takes a long time to generate, it blocks generation of the next 5 charts. For instance, when each chart takes 1 minute to generate, the user ends up waiting roughly 6 minutes instead of the 1-2 minutes I expected from async ajax calls with the page methods processed in parallel.
After reading a lot of useful posts in this forum, I found that this is because I read and write session objects within the page methods, and ASP.NET locks the session for the whole request, which makes the requests run sequentially.
I have seen people suggest setting the session state to read-only in the @ Page directive, but that will not address my problem because I need to write to the session as well. I have considered moving from InProc session state to SQL Server session state, but my session object is not serializable and is used across the whole project. I also cannot switch to the Cache instead, because the session contains user-specific details.
Can anyone please point me in the right direction? I have spent days investigating this inefficiency and still haven't found a good solution.
Thanks in advance

From my personal experience, switching to SQL Server session state will NOT help this problem: all of the concurrent threads will block in SQL instead, because the first thread in holds an exclusive lock on one or more rows in the database.
I'm curious as to why your session object isn't serializable. The only solution I can think of is to use a database table to store the user-specific data that you are keeping in session, and then only hold a database lock for as long as it takes to update the user data.
You can use the ASP.NET session id or other unique cookie value as the database key.
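A rough sketch of that idea follows (the UserData table, its columns, and the "AppDb" connection string name are all hypothetical, not part of the original answer): the page method reads and writes the user-specific values through a small helper, so the database lock is held only for the duration of a single statement rather than for the whole request.

    // Sketch: per-user data in a table keyed by the ASP.NET session id,
    // instead of in Session. Table and column names are made up.
    using System.Configuration;
    using System.Data.SqlClient;
    using System.Web;

    public static class UserDataStore
    {
        private static readonly string ConnStr =
            ConfigurationManager.ConnectionStrings["AppDb"].ConnectionString;

        public static void SaveChartFilter(string filterJson)
        {
            string key = HttpContext.Current.Session.SessionID;
            using (var conn = new SqlConnection(ConnStr))
            using (var cmd = new SqlCommand(
                @"UPDATE UserData SET ChartFilter = @f WHERE SessionKey = @k
                  IF @@ROWCOUNT = 0
                      INSERT INTO UserData (SessionKey, ChartFilter) VALUES (@k, @f)", conn))
            {
                cmd.Parameters.AddWithValue("@k", key);
                cmd.Parameters.AddWithValue("@f", filterJson);
                conn.Open();
                cmd.ExecuteNonQuery();   // row lock is held only for this statement
            }
        }
    }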

The problem may not be server side at all.
Browsers have a built-in limit on how many concurrent HTTP requests they will make to a single host - this comes from the HTTP/1.1 spec, which suggests a limit of 2.
In IE7 the limit is 2; in IE8 it is 6. But when a page loads you could easily hit that limit due to the concurrent requests for CSS, JS, images etc.
A good source of info about these limits is BrowserScope (see Connections per Hostname column).
What about combining those 6 requests into 1 request? Combining them will also make the page load a little faster.
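If combining them is an option, one possible shape (a sketch only; the chart names and the Load* helpers stand in for the existing per-chart queries) is a single page method that returns all six datasets in one round-trip:

    // Hypothetical combined page method: one call returns all six datasets.
    using System.Collections.Generic;
    using System.Web.Services;
    using System.Web.UI;

    public partial class Charts : Page
    {
        [WebMethod(EnableSession = true)]
        public static Dictionary<string, object> GetAllChartData()
        {
            return new Dictionary<string, object>
            {
                { "chart1", LoadChart1() },   // placeholders for the six existing queries
                { "chart2", LoadChart2() },
                // ... charts 3 to 6 follow the same pattern
            };
        }

        private static object LoadChart1() { /* existing DB query */ return null; }
        private static object LoadChart2() { /* existing DB query */ return null; }
    }

On the client, a single $.ajax call to Charts.aspx/GetAllChartData would then replace the six separate calls, with each chart rendered from the returned dictionary.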

Related

Please wait page in Spring MVC + Apache Tiles

I'm using Spring MVC 3 + Tiles for a webapp. I have a slow operation, and I'd like a please wait page.
There are two main approaches to please wait pages, that I know of:
Long-lived requests: render and flush the "please wait" bit of the page, but don't complete the request until the action has finished, at which point you can stream out the rest of the response with some javascript to redirect away or update the page.
Return immediately, and start processing on a background thread. The client polls the server (in javascript, or via page refreshes), and redirects away when the background thread finishes.
(1) is nice as it keeps the action all single-threaded, but doesn't seem possible with Tiles, as each JSP must complete rendering in full before the page is assembled and returned to the client.
So I've started implementing (2). In my implementation, the first request starts the operation on a background thread using Spring's @Async annotation, which returns a Future<Result>. It then returns a "please wait" page to the user, which refreshes every few seconds.
When the please wait page is refreshed, the controller needs to check on the progress of the background thread. What is the best way of doing this?
If I put the Future object in the Session directly, then the poll request threads can pull it out and check on the thread's progress. However, doesn't this mean my Sessions are not serializable, so my app can't be deployed with more than one web server (without requiring sticky sessions)?
I could put some kind of status flag in the Session, and have the background thread update the Session when it is finished. I'm very concerned that passing an HttpSession object to a non-request thread will result in hard to debug errors. Is this allowed? Can anyone cite any documentation either way? It works fine when the sessions are in-memory, of course, but what if the sessions are stored in a database? What if I have more than one web server?
I could put some kind of status flag in my database, keyed on the session id, or some other aspect of the slow operation. It seems weird to have session data in my domain database, and not in the session, but at least I know the database is thread-safe.
Is there another option I have missed?
The Spring MVC part of your question is rather easy, since the problem has nothing to do with Spring MVC. See a possible solution in this answer: https://stackoverflow.com/a/4427922/734687
As you can see in the code, the author uses a tokenService to store the future. The implementation is not included, and here the problems begin, as you are already aware, once you want failover.
It is not possible to serialize the future and move it to a second server instance. The thread is executed within a particular instance and therefore has to stay there, so session storage is not an option.
As in the example link you could use a token service. This is normally just a HashMap where you can store your object and access it later again via the token (the String identifier). But again, this works only within the same web application, when the tokenService is a singleton.
The solution is not to save the future, but the state of the work (in progress, finished, failed, plus the result). Even when the querying session and the executing thread are on different machines, that state must be accessible and serializable. How would you do that? It could be stored in a database, on the file system (in the example above you could check whether the zip file is available), in a key/value store, in a cache, or in a common object store (Terracotta), ...
In fact, every batch framework (Spring Batch, for example) works this way: it stores the current state of the jobs in the database. You are concerned about mixing domain data with operational data, but most applications do. In large applications you can use two database instances, one for operational data and one for domain data.
So I recommend that you save the state and the result of the work in a database.
Hope that helps.

Running a query in Page Load a bad idea?

I'm running an ASP.NET app in which I have added an insert/update query to the [global] Page_Load. So, each time the user hits any page on the site, it updates the database with their activity (session ID, time, page they hit). I haven't implemented it yet, but this was the only suggestion given to me as to how to keep track of how many people are currently on my site.
Is this going to kill my database and/or IIS in the long run? We figure that the site averages between 30,000 and 50,000 users at one time. I can't have my site constantly locking up over a database hit with every single page hit for every single user. I'm concerned that's what will happen; however, this is the first time I have attempted a solution like this, so I may just be overly paranoid.
Do it Async.
Create a dll that handles the update, and in the page load do a fire and forget with parameters.
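A possible shape for that fire-and-forget call (the PageHits table and the "AppDb" connection string are made up; treat this as a sketch): read everything you need from the request on the page thread, then hand the insert off to a ThreadPool thread so Page_Load returns immediately.

    // Fire-and-forget page-hit logging. HttpContext must not be used
    // off the request thread, so its values are captured first.
    using System;
    using System.Configuration;
    using System.Data.SqlClient;
    using System.Threading;
    using System.Web;

    public static class ActivityLogger
    {
        private static readonly string ConnStr =
            ConfigurationManager.ConnectionStrings["AppDb"].ConnectionString;

        public static void LogHit(HttpContext ctx)
        {
            string sessionId = ctx.Session.SessionID;
            string page = ctx.Request.Path;
            DateTime hitTime = DateTime.UtcNow;

            ThreadPool.QueueUserWorkItem(_ =>
            {
                try
                {
                    using (var conn = new SqlConnection(ConnStr))
                    using (var cmd = new SqlCommand(
                        "INSERT INTO PageHits (SessionId, Page, HitTime) VALUES (@s, @p, @t)", conn))
                    {
                        cmd.Parameters.AddWithValue("@s", sessionId);
                        cmd.Parameters.AddWithValue("@p", page);
                        cmd.Parameters.AddWithValue("@t", hitTime);
                        conn.Open();
                        cmd.ExecuteNonQuery();
                    }
                }
                catch { /* swallow: logging must never break the page */ }
            });
        }
    }

Page_Load would then just call ActivityLogger.LogHit(HttpContext.Current).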
Insert-Based designs have less locking than Update-Based designs.
So if a user logged in and then logged out, in an insert-based design you would have multiple rows, each with a SessionID, one per activity; whereas in an update-based design you would have SessionId, LoginTime and LogoutTime columns and you would update the LogoutTime based on the SessionId.
I have seen many more locking and contention problems caused by update activity than by insert activity.
Activities such as counting and linking logins to logouts etc take more complex queries and a little more resources.
It goes without saying that your queries, especially the ones that run on every page, should be as fast as possible so that the site doesn't appear slow to users.
To keep track of how many users are currently on your site you could use performance counters. What you describe though sounds more like a full fledged logging of every page hit.
Let's say you really have 50k users connected at any one time.
As long as you don't have contention between the updates (trying to lock the same record), a database can sustain a very high number of inserts and updates. You need to do some capacity planning to be sure the load can be carried. 50k users visiting a page every minute will give you 50k inserts and 50k updates per minute, roughly 830 inserts and 830 updates per second, all of which have to commit (flush the log). Does your DB I/O subsystem support such write pressure, in addition to responding to all the requests (reads)?
Also, 50k users doing 1 page hit per minute adds up to 72 million hits per day, i.e. 72 million logging inserts. At such a rate you need to plan the size capacity of the database carefully and consider what kind of analysis you'll do on the collected data, since ad-hoc queries over 2 billion rows (one month of data) will get you nowhere fast (actually... quite slowly).
Doing it async can give you some relief over very short spikes, but not in the long run. If your DB system cannot handle the load, then doing async calls will just create a backlog queue in the application process (in the ASP.NET app pool), and this will grow until it runs out of memory, at which point the ever-vigilant IIS will 'recycle' the app pool, losing all pending async updates.
I think updating the database in Session_Start and Session_End will do the job. That will reduce the number of statements dramatically.
I think it makes no difference whether you track hits or session begin/end. With hits you'll also need additional logic to subtract inactive users.
EDIT: Session_End is not always fired. I would suggest calling an update statement/stored procedure from the Session_Start event (in addition to the insert statement) to clean up invalid sessions.
I don't think calling this "fix routine" on every page load is necessary, because I don't think you can count the "current no. of visitors" exactly anyway.
I would keep this in Application state instead - if possible. On Application_Start, create a data structure saved to Application state that you can update from anywhere in your application - Session_Start, Page_Load, wherever. Keep it out of the database. It sounds like you are just using it to track "currently online" info anyway.
If you have multiple instances of your app, or if there is a requirement to maintain historical info beyond the IIS logs, this won't work obviously. Go with chris' fire-and-forget solution in that case.
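A minimal sketch of the Application-state approach in Global.asax, assuming a single server with InProc sessions (note that Session_End only fires for InProc session state, and even then is not guaranteed in every case):

    // Global.asax.cs sketch: track a "currently online" count in Application state.
    using System;
    using System.Web;

    public class Global : HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            Application["OnlineCount"] = 0;
        }

        protected void Session_Start(object sender, EventArgs e)
        {
            Application.Lock();
            Application["OnlineCount"] = (int)Application["OnlineCount"] + 1;
            Application.UnLock();
        }

        protected void Session_End(object sender, EventArgs e)
        {
            Application.Lock();
            Application["OnlineCount"] = (int)Application["OnlineCount"] - 1;
            Application.UnLock();
        }
    }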
What's wrong with IIS Logs?
2009-05-01 12:30:31 207.219.27.35 GET /assocadmin/ibb-reg.asp - usernameremoved 544.566.570.575 Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.0;+SLCC1;+.NET+CLR+2.0.50727;+Media+Center+PC+5.0;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30618) 200 0 0 40058
EDIT: I'd like to close this answer, but I want the comments to stay. Consider this answer withdrawn.
How about adding a small object to the session?
Something like LoggedInUserFlag:IDisposable
In the constructor, increment your counter however you decide to implement it.
Then in the Dispose method, decrement the counter.
This way, regardless of how the session is ended, the counter will always be (eventually) decremented.
See http://weblogs.asp.net/cnagel/archive/2005/01/23/359037.aspx for info on using IDisposable.
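One possible reading of this suggestion is sketched below. A caveat not in the original answer: session teardown does not call Dispose for you, so the "eventual" decrement actually relies on the finalizer running once the abandoned session's objects are garbage-collected.

    // Sketch of a counter object to keep in each session.
    using System;
    using System.Threading;

    public sealed class LoggedInUserFlag : IDisposable
    {
        private static int _count;      // shared "currently online" counter
        private int _released;          // guards against double-decrement

        public static int Count { get { return _count; } }

        public LoggedInUserFlag()
        {
            Interlocked.Increment(ref _count);
        }

        public void Dispose()
        {
            Release();
            GC.SuppressFinalize(this);
        }

        ~LoggedInUserFlag()
        {
            Release();   // runs "eventually" when the session's objects are collected
        }

        private void Release()
        {
            if (Interlocked.Exchange(ref _released, 1) == 0)
                Interlocked.Decrement(ref _count);
        }
    }

Session_Start would then do Session["flag"] = new LoggedInUserFlag(); and the current count can be read from LoggedInUserFlag.Count.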
I am not an ASP guy at all, but rather than logging all that other info, what about just inserting their IP address?
If that IP address is already in there, update a last_seen timestamp, and on each refresh delete any row whose timestamp is more than 10 minutes old.
This is how I would take a shot at it. It is much more space efficient, but I am not sure about doing that much checking and deleting on such a high-traffic site.
As a direct answer to your question, yes, running a database query in-line with every request is a bad idea:
Synchronous requests will tie up a thread, which will reduce your scalability (fewer simultaneous activities)
DB inserts (or updates) are writes to the DB, which will put a load on your log volume
DB accesses shouldn't be required in a single server / single AppPool scenario
I answered your question about how to count users in the other thread:
Best way to keep track of current online users
If you are operating in a multi-server / load-balanced environment, then DB accesses may in fact be required. In that case:
Queue them to a background thread so the foreground request thread doesn't have to wait
Use Resource Governor in SQL 2008 to reduce contention with other DB accesses
Collect several updates / inserts together into a single batch, in a single transaction, to minimize log disk I/O pressure
Return the current count with each DB access, to minimize round-trips
In case it's of any interest, I cover sync/async threading issues and the techniques above in detail in my book, along with code examples: Ultra-Fast ASP.NET.
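To illustrate the queue-to-a-background-thread and batching points above, here is one rough shape it could take (the PageHits table, the "AppDb" connection string, the 5-second interval and the 500-row cap are arbitrary assumptions, not anything taken from the book):

    // Sketch: requests enqueue hit records; a timer flushes them in one batch,
    // so the request thread never waits on the database.
    using System;
    using System.Collections.Concurrent;
    using System.Configuration;
    using System.Data.SqlClient;
    using System.Text;
    using System.Threading;

    public static class BatchedHitLogger
    {
        private static readonly string ConnStr =
            ConfigurationManager.ConnectionStrings["AppDb"].ConnectionString;

        private static readonly ConcurrentQueue<Tuple<string, string, DateTime>> Queue =
            new ConcurrentQueue<Tuple<string, string, DateTime>>();

        private static readonly Timer FlushTimer =
            new Timer(_ => { try { Flush(); } catch { } }, null, 5000, 5000);

        public static void Enqueue(string sessionId, string page)
        {
            Queue.Enqueue(Tuple.Create(sessionId, page, DateTime.UtcNow));
        }

        private static void Flush()
        {
            var sql = new StringBuilder();
            var cmd = new SqlCommand();
            Tuple<string, string, DateTime> hit;
            int i = 0;

            while (i < 500 && Queue.TryDequeue(out hit))   // cap the batch size
            {
                sql.AppendFormat(
                    "INSERT INTO PageHits (SessionId, Page, HitTime) VALUES (@s{0}, @p{0}, @t{0});", i);
                cmd.Parameters.AddWithValue("@s" + i, hit.Item1);
                cmd.Parameters.AddWithValue("@p" + i, hit.Item2);
                cmd.Parameters.AddWithValue("@t" + i, hit.Item3);
                i++;
            }
            if (i == 0) return;

            using (var conn = new SqlConnection(ConnStr))
            {
                cmd.Connection = conn;
                cmd.CommandText = sql.ToString();
                conn.Open();
                cmd.ExecuteNonQuery();   // one round-trip, one log flush for the whole batch
            }
        }
    }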

Caching database data in session - getting the balance right

If you cache data from your database in ASP.NET session then you will speed up subsequent requests for that data (note: depending on the serialization/deserialization cost of the data concerned), at the expense of memory load in IIS.
(OK, this is probably a simplification of the reality of the situation - feel free to correct or refine my statement above)
Do you use any rules (simple rules of thumb or otherwise) to decide when it is appropriate to cache in Session?
Update
Can using Session to store read-only data, without a very well-thought-out rationale, simply be considered another case of premature optimization (and therefore bad)?
Keep in mind with caching in session if you use the InProc mode, you are limiting your scalability unless your code will go back to the DB when the session is empty. You will be forced to use a single server or pin your user to a particular server.
If you're using Session State Server this doesn't apply, and if you're using SQL Server to store session state then using session for caching is pointless.
Edit
Any advice on this topic is subject to your specific environment. As you stated, a very complex SQL query might benefit from being cached even when using SQL Server session state. I think the most important thing is to first make your application perform its functions and meet the business requirements, then go back, test, and optimize the application to handle the load you expect.
Edit 2
Based on your update, no, I don't think this is true. In one of my ASP.NET applications I am using session state to store a complex object model that the user is then manipulating and modifying. We are using AJAX, so there is a lot of short communication with the server as the user updates the object. Keeping the object in session was done as a convenience. The object performs a lot of customized calculations to generate different data points, so doing this on the server was ideal, as opposed to trying to replicate the code both on the server and in javascript. Keeping it in session also gives us an undo function very easily.
And yes, I know we sacrificed scalability, but I welcome the day when I sit down with my boss and explain that we have a problem because we have too many users (and I know he would too).
So I think the question is why you are storing data in session: is it for convenience, and to provide access to transient data while the user is logged in? That is different from storing it there for caching. One thing to remember with caching is: how are you going to flush the cache? How do you invalidate it? I don't think session state is built to handle this.
Edit 3
Back at it: for read-only, user-specific data that is expensive to load from the DB, go ahead and cache it in session. You can write your code so that if it's not in session you hit the DB. Have a nice little helper class that does this, hiding from your web app that you're even using session, and you should be good. The nice thing about hiding where you store it is that if you run into issues you only have one place to change it.
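A sketch of such a helper (the GetOrLoad name and the usage line are invented for illustration): it checks session first and only runs the expensive query on a miss, so callers never know or care that session is the backing store.

    // Session-backed cache for read-only, user-specific data, with DB fallback.
    using System;
    using System.Web;

    public static class SessionCache
    {
        public static T GetOrLoad<T>(string key, Func<T> loadFromDb) where T : class
        {
            var session = HttpContext.Current.Session;
            var cached = session[key] as T;
            if (cached != null)
                return cached;

            T value = loadFromDb();   // the expensive query runs only on a miss
            session[key] = value;
            return value;
        }

        public static void Invalidate(string key)
        {
            HttpContext.Current.Session.Remove(key);
        }
    }

    // Usage (hypothetical):
    // var prefs = SessionCache.GetOrLoad("UserPrefs", () => LoadUserPrefsFromDb(userId));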
Given that it is very hard to guarantee that data in a session gets cleared up in a reasonable time frame, I would be very wary of caching anything more than the odd integer there. At large scale you will either hit memory issues or have all your users get session timeout errors. Remember that, unlike the Cache object, sessions don't get killed off when memory is low.
Also, as Josh says, if you go to multiple servers and need to use either a database or the Session State Server, your session object will need to be serialisable. In that case the cost of serialising and deserialising is likely to be worse than the cost of a well-optimised query.

Does the concept of shared sessions exist in ASP.NET?

I am working on a web application (ASP.NET) game that would consist of a single page, and on that page, there would be a game board akin to Monopoly. I am trying to determine what the best architectural approach would be. The main requirements I have identified thus far are:
Up to six users share a single game state object.
The users need to keep (relatively) up to date on the current state of the game, i.e. whose turn it is, what did the active user just roll, how much money does each other user have, etc.
I have thought about keeping the game state in a database, but it seems like overkill to keep updating the database when a game state object (say, in a cache) could be kept up to date. For example, the flow might go like this:
Receive request for data from a user.
Look up data in database. Create object from that data.
Verify user has permissions to perform request based on the game's state (i.e. make sure it's really their turn or have enough money to buy that property).
Update the game object.
Write the game object back to the database.
Repeat for every single request.
Consider that a single server would be serving several concurrent games.
I have thought about using AJAX to make requests to an ASP.NET page.
I have thought about using AJAX requests to a web service using Silverlight.
I have thought about using WCF duplex channels in Silverlight.
I can't figure out what the best approach is. All seem to have their drawbacks. Does anyone out there have experience with this sort of thing and care to share those experiences? Feel free to ask your own questions if I am being too ambiguous! Thanks.
Update: Does anyone have any suggestions for how to implement this connection to the server based on the three options I mention above?
You could use the ASP.NET Cache or Application state to store the game object, since these are shared between users. The Cache would probably be the best place, since objects can be removed from it to save memory.
If you store the game object in the Cache using a unique key, you can then store that key in each visitor's Session and use it to retrieve the shared game object. If the cache has been cleared, you recreate the object from the database.
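A rough sketch of that arrangement (GameState, the cache-key format, the 6-hour sliding expiration and LoadGameFromDatabase are all assumptions for illustration):

    // Shared game object in the ASP.NET Cache; each player's Session holds only the key.
    using System;
    using System.Web;
    using System.Web.Caching;

    public static class GameStore
    {
        public static GameState GetGame(HttpContext ctx, int gameId)
        {
            string cacheKey = "game:" + gameId;
            ctx.Session["GameKey"] = cacheKey;              // remember the key per player

            var game = ctx.Cache[cacheKey] as GameState;
            if (game == null)
            {
                game = LoadGameFromDatabase(gameId);        // cache was cleared; rebuild
                ctx.Cache.Insert(cacheKey, game, null,
                    Cache.NoAbsoluteExpiration, TimeSpan.FromHours(6));
            }
            return game;
        }

        private static GameState LoadGameFromDatabase(int gameId)
        {
            /* existing persistence code */
            return new GameState();
        }
    }

    public class GameState { /* board, players, whose turn it is, etc. */ }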
While updating a database seems like overkill, it has advantages when it comes time to scale up, as you can have multiple webheads talking to one backend.
A larger concern is how you communicate the game state to the clients. While a full update of the game state from time to time ensures that any changes are caught and all clients remain in synchronization even if they miss a message, gamestate is often quite large.
Consider as well that usually you want gamestate messages to trigger animations or other display updates to portray the action (for example, if a piece moves, it shouldn't just appear at the destination in most cases... it should move across the board).
Because of that, one solution that combines the best of both worlds is to keep a database table that collects all of the actions performed, with sequential IDs. When a client requests an update, it can be given all the actions after the last one it knew about, and the client can "act out" the moves. This means that even if a request fails, the client can simply retry it and none of the actions will be lost.
The server can then maintain an internal view of the gamestate as well, from the same data. It can also reject illegal actions and prevent them from entering the game action table (and thus prevent other clients from being incorrectly updated).
Finally, because the server does have the "one true" gamestate, the clients can periodically check against that (which will allow you to find errors in your client or server code). Because the server database should be considered the primary, you can retransmit the entire gamestate to any client that gets incorrect state, so minor client errors won't (potentially) ruin the experience (except perhaps a pause while the state is downloaded).
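As a sketch of the polling side of that design (the GameActions table, the ActionJson column and the "AppDb" connection string are hypothetical), the client sends the last action id it has seen and gets back everything newer, in order:

    // Fetch all game actions recorded after the last one the client saw.
    using System.Collections.Generic;
    using System.Configuration;
    using System.Data.SqlClient;

    public static class GameActionLog
    {
        private static readonly string ConnStr =
            ConfigurationManager.ConnectionStrings["AppDb"].ConnectionString;

        public static List<string> GetActionsSince(int gameId, long lastSeenId)
        {
            var actions = new List<string>();
            using (var conn = new SqlConnection(ConnStr))
            using (var cmd = new SqlCommand(
                @"SELECT ActionJson FROM GameActions
                  WHERE GameId = @g AND ActionId > @last
                  ORDER BY ActionId", conn))
            {
                cmd.Parameters.AddWithValue("@g", gameId);
                cmd.Parameters.AddWithValue("@last", lastSeenId);
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        actions.Add(reader.GetString(0));
                }
            }
            return actions;   // client replays these to animate the moves
        }
    }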
Why don't you just create an application-level object to store your details? See Application State and Global Variables in ASP.NET for details. You can use the session ID as a key for each player's data.
You could also use the Cache to do the same thing with a long timeout. This has the advantage that older data can be flushed from the Cache after a period of time, e.g. 6 hours or whatever.

What to put in a session variable

I recently came across an ASP.NET 1.1 web application that put a whole heap of stuff in the session variable - including all the DB data objects and even the DB connection object. It ends up being huge. When the web session times out (four hours after the user has finished using the application), sometimes their database transactions get rolled back. I'm assuming this is because the DB connection is not being closed properly when IIS kills the session.
Anyway, my question is what should be in the session variable? Clearly some things need to be in there. The user selects which plan they want to edit on the main screen, so the plan id goes into the session variable. Is it better to try and reduce the load on the DB by storing all the details about the user (and their manager etc.) and the plan they are editing in the session variable or should I try to minimise the stuff in the session variable and query the DB for everything I need in the Page_Load event?
This is pretty hard to answer because it's so application-specific, but here are a few guidelines I use:
Put as little as possible in the session.
User-specific selections that should only last during a given visit are a good choice
Often, variables that need to be accessible from multiple pages throughout the user's visit to your site (to avoid passing them from page to page) are also good to put in the session.
From what little you've said about your application, I'd probably select your data from the db and try to find ways to minimize the impact of those queries instead of loading down the session.
Do not put database connection information in the session.
As far as caching, I'd avoid using the session for caching if possible -- you'll run into issues where someone else changes the data a user is using, plus you can't share the cached data between users. Use the ASP.NET Cache, or some other caching utility (like Memcached or Velocity).
As far as what should go in the session, anything that applies to all browser windows a user has open to your site (login, security settings, etc.) should be in the session. Things like what object is being viewed/edited should really be GET/POST variables passed around between the screens so a user can use multiple browser windows to work with your application (unless you'd like to prevent that).
DO NOT put UI objects in session.
Beyond that, I'd say it varies. Too much in session can slow you down if you aren't using in-process session state, because you will be serializing a lot, plus there is the speed of the provider. Cache and Session should be used sparingly and carefully. Don't just put something in session because you can or because it is convenient. Sit down and analyze whether it makes sense.
Ideally, the session in ASP should store the least amount of data that you can get away with. Storing a reference to any object that is holding system resources open (particularly a database connection) is a definite scalability killer. Also, storing uncommitted data in a session variable is just a bad idea in most cases. Overall it sounds like the current implementation is abusively using session objects to try and simulate a stateful application in a supposedly stateless environment.
Although it is much maligned, the ASP.NET model of managing state automatically through hidden fields should really eliminate the majority of the need to keep anything in session variables.
My rule of thumb is that the more scalable (in terms of users/hits) that the app needs to be, the less you can get away with using session state. There is, however, a trade-off. For web applications where the user is repeatedly accessing the same data and typically has a fairly long session per use of the site, some caching (if necessary in session objects) can actually help scalability by reducing the load on the DB server. The idea here is that it is much cheaper and less complex to farm the presentation layer than the back-end DB. Of course, with all things, this advice should be taken in moderation and doesn't apply in all situations, but for a fairly simple in-house CRUD app, it should serve you well.
A very similar question was asked regarding PHP sessions earlier. Basically, sessions are a great place to store user-specific data that you need to access across several page loads. Sessions are NOT a great place to store database connection references; you'd be better off using some sort of connection pooling software or opening/closing your connection on each page load. As far as caching data in the session goes, this depends on how session data is being stored, how much security you need, and whether or not the data is specific to the user. A better bet would be to use something else for caching data.
Storing navigation cues in sessions is tricky. The same user can have multiple windows open, and then changes get propagated in a confusing manner. DB connections should definitely not be stored; ASP.NET maintains the connection pool for you, so there is no need to resort to your own sorcery. If you need to cache stuff for short periods and the data set size is relatively small, look into ViewState as a possible option (at the cost of adding more bulk to the page).
A: Data that is only relevant to one user, e.g. a username or a user ID. At most an object representing a user. Sometimes URL-related data (like where to send somebody next) or an error message stack are useful to push into the session.
If you want to share stuff potentially between different users, use the Application store or the Cache. They're far superior.
Stephen,
Do you work for a company that starts with "I", that has a website that starts with "BC"? That sounds exactly like what I did when I first started developing in .net (and was young and stupid) -- I crammed everything I could think of in session and application. Needless to say, that was double-plus ungood.
In general, eschew session as much as possible. Certainly, non-serializable objects shouldn't be stored there (database connections and such), but even big, serializable objects shouldn't be either. You just don't want the overhead.
I would always keep very little information in session. Sessions use server memory, which is expensive. Saving too many values in session increases the load on the server, and eventually the performance of the site will go down. When you use load-balanced servers, usage of session can run into problems. So what I do is use minimal or no session state, use cookies if the information is not very critical, use hidden fields more, and use database-backed sessions.
