I have a requirement that my site always display the number of users currently online. For example, "35741 Users Currently Online". This is not based on a log in, simply how many users are currently on my site. I have tried using Session_Start/Session_End for this; however, Session_End is not reliable, so I get inflated numbers: Session_Start adds to the count but Session_End doesn't remove anything because it doesn't fire.
There is no additional information to be gathered from this (reporting, etc), it's simply requested that the number show up. Very simple request that's turning into a huge deal. Any help is appreciated.
EDIT:
I should specify that I have also tried using a database for this. Simple table that contains a session ID and a last activity column. With each page hit, I check to see if the session is in my database. If not, insert. If so, update with activity time. Then I run a procedure that sweeps the database looking for sessions with no activity in the last 20 minutes. This approach seemed to kill my SQL server and/or IIS. Had to restart the site.
The best way is what you're doing, but time sessions out based on activity: if a given session doesn't request a page within 5 minutes or so, you may consider it no longer active.
If you're using ASP.Net membership, take a look at GetNumberOfUsersOnline.
For every user action that you can record, you need to consider them "online" for a certain window of time; depending on the site, you might set that to 5 minutes. The actual web request should take less than a second, so you have to make an assumption about how long they might stay on that page doing nothing while still being considered online.
This approach requires that you keep track of the time of each users last activity.
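For illustration, here is a minimal sketch of that check, assuming you already record a last-activity timestamp per user; the dictionary is just a stand-in for wherever you keep those timestamps.

using System;
using System.Collections.Generic;
using System.Linq;

public static class OnlineCount
{
    // Count anyone whose last recorded activity falls inside the window.
    public static int CountUsersOnline(IDictionary<string, DateTime> lastActivityByUser, TimeSpan window)
    {
        DateTime cutoff = DateTime.UtcNow - window;
        return lastActivityByUser.Values.Count(last => last >= cutoff);
    }
}

Call it with TimeSpan.FromMinutes(5) to match the 5-minute window suggested above.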
Use Performance Counters:
State Server Sessions Active: The number of active user sessions.
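If you go this route, here is a rough sketch of reading that counter from code. It assumes session state runs in State Server mode and that the counter lives under the "ASP.NET" category on your machine; verify the exact category and counter names in Performance Monitor first.

using System.Diagnostics;

public static class SessionCounters
{
    public static int ActiveSessions()
    {
        // Category and counter names are assumptions; check them in perfmon.
        using (var counter = new PerformanceCounter("ASP.NET", "State Server Sessions Active", readOnly: true))
        {
            return (int)counter.NextValue();
        }
    }
}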
Expanding on what silky said in his answer: since HTTP is stateless, to determine whether a user is currently 'online' you can really only track how long it has been since the user last accessed your site, and decide how long between requests you still consider them active.
Since you stated that this isn't based upon users logging in, maybe it's simply a count of how many different IP addresses you received requests from in the past 5 minutes (or however long you consider the 'online' timeout to be).
Don't use sessions for this unless you also need sessions for something else; it's overkill otherwise.
Assuming a single-server installation, do something like this (a rough sketch follows the list):
For each user, issue a cookie that contains a unique ID
Maintain a static table of unique IDs and their last access time
In an HttpModule (or Global.asax), enter new users into the table and update their access times (use appropriate locking to prevent race conditions)
Periodically, either from a background thread or in-line with a user request, remove entries from the table that haven't made a request within the last N minutes. You might also want to support an explicit "log out" feature.
Report the number of people online as the size of the table
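Here is a rough sketch of that approach, assuming a single server; the module name, cookie name and 5-minute window are illustrative, and ConcurrentDictionary stands in for "appropriate locking".

using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Web;

public class OnlineTrackerModule : IHttpModule
{
    private static readonly ConcurrentDictionary<string, DateTime> LastSeen =
        new ConcurrentDictionary<string, DateTime>();
    private static readonly TimeSpan Window = TimeSpan.FromMinutes(5);

    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            HttpContext ctx = ((HttpApplication)sender).Context;
            HttpCookie cookie = ctx.Request.Cookies["visitor-id"];
            if (cookie == null)
            {
                // New visitor: issue a cookie with a unique ID.
                cookie = new HttpCookie("visitor-id", Guid.NewGuid().ToString("N"));
                ctx.Response.Cookies.Add(cookie);
            }
            LastSeen[cookie.Value] = DateTime.UtcNow; // enter new users, update access times
        };
    }

    // The number of people online is the size of the table, after dropping stale entries.
    public static int UsersOnline()
    {
        DateTime cutoff = DateTime.UtcNow - Window;
        foreach (var stale in LastSeen.Where(p => p.Value < cutoff).ToList())
        {
            DateTime removed;
            LastSeen.TryRemove(stale.Key, out removed);
        }
        return LastSeen.Count;
    }

    public void Dispose() { }
}

Register the module in web.config and call OnlineTrackerModule.UsersOnline() wherever you render the count.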
If you do use sessions, you can use the Session ID as the unique identifier. However, keep in mind that Session IDs aren't issued until you store something in the Session dictionary, unless you have a Session_Start() event configured.
In a load balanced or web garden scenario, it gets a little more complicated, but you can use the same basic idea, just persisting the info in a database instead of in memory.
When the user logs in write his user name into the HttpContext.Current.Cache with a sliding expiration (say 20 minutes).
Then, in Global.asax.cs, in Application_PreRequestHandlerExecute, "touch" the cache entry for the current user so that it resets the sliding expiration.
When a user explicitly logs out, remove his username from HttpContext.Current.Cache.
If you do this, at any given time HttpContext.Current.Cache.Count will give you the # of current users.
Note: this is assuming you aren't using the Cache for other purposes.
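A minimal sketch of that idea in Global.asax.cs, assuming the Cache isn't used for anything else and that users are authenticated (the cache key here is just the user name):

using System;
using System.Web;
using System.Web.Caching;

// Inside the Global.asax.cs HttpApplication class:
protected void Application_PreRequestHandlerExecute(object sender, EventArgs e)
{
    var user = HttpContext.Current.User;
    if (user != null && user.Identity.IsAuthenticated)
    {
        // Re-inserting resets the 20-minute sliding expiration ("touching" the entry).
        HttpContext.Current.Cache.Insert(
            user.Identity.Name,
            DateTime.UtcNow,
            null,
            Cache.NoAbsoluteExpiration,
            TimeSpan.FromMinutes(20));
    }
}

// At any point, the number of current users is then:
// int usersOnline = HttpContext.Current.Cache.Count;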
Here's my issue, we have a large patient object that is used on multiple screens throughout the admin. Each screen contains different information about the same patient. It can't all be on one screen.
The only time I want to persist the patient is when the user clicks save. I need to have an in memory patient somewhere. A user may be in the admin, change patient information on various screens, run validation and decide to not save that patient. This is typical use.
Is it ok to store this patient in the session? Or, is there a better approach to do this? At most this admin would have 20 users with access.
Opinions may vary on this. Session is tricky, especially if you use something other than in-memory session: distributed session will break a non-serializable object. If this object is a simple POCO or an object you control, try your best to make it play well with serialization; if it does, you're set. For an admin tool without much load I'd say you'd be fine.
Hey I found this - know nothing about the site, but illustrates my point:
https://www.fortify.com/vulncat/en/vulncat/dotnet/asp_dotnet_bad_practices_non_serializable_object_stored_in_session.html
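A minimal sketch of what "playing with serialization" looks like; the Patient type and its members here are illustrative, and every member it holds must also be serializable for out-of-process session state to work.

using System;

[Serializable]
public class Patient
{
    public int PatientId { get; set; }
    public string Name { get; set; }
    // ...every nested type referenced here must also be [Serializable]...
}

// In the page, while the user edits across screens:
// Session["EditingPatient"] = patient;
// var patient = (Patient)Session["EditingPatient"];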
I had a similar situation with similar amount of users. I did it and it worked great.
My situation was about scheduling events.
Someone would create an event and through multiple web pages would modify and configure this event. When they were all done it would save all the details to SQL. In the end, I was surprised just how well it worked.
Session should be fine here. You have what appears to be a light user load... but you might want to check exactly how much memory the object takes up, multiply that by the maximum number of users, and see where you are.
If you want to avoid the session altogether, you could use System.Web.Caching to store the object instead, and key the stored object using the users identifier plus some constant string.
In either case, you'll want to be aware of how many web servers are running the application. If it's just one web server, no worries. If you have multiple web servers, you'll want to make sure they are "sticky" - then the user is guaranteed to have all requests processed by the same server. How this is done is entirely dependent on your flavor of load balancing... normally the "IT folks" handle this for you.
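If you go the caching route instead, here is a rough sketch of keying the stored object by the user's identifier plus a constant string. The class name, key prefix and 30-minute lifetime are illustrative, and Patient is the hypothetical type from the earlier sketch.

using System;
using System.Web;
using System.Web.Caching;

public static class PatientEditCache
{
    private const string KeyPrefix = "EditingPatient:";

    public static void Store(string userName, Patient patient)
    {
        // Sliding expiration so abandoned edits eventually drop out of memory.
        HttpContext.Current.Cache.Insert(
            KeyPrefix + userName, patient, null,
            Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(30));
    }

    public static Patient Load(string userName)
    {
        return HttpContext.Current.Cache[KeyPrefix + userName] as Patient;
    }
}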
I am using a master page in which the menu is generated dynamically in code according to the user's role. The same menu is used across the whole application for a particular user until they log out, so instead of recreating it on every page, I want to reuse the same menu throughout the application. The menu is held in a StringBuilder which is very large. In my situation, which is better and less memory consuming, Session or the data Cache, and why? Please suggest.
I want to improve the performance of the master page.
Thanks
I think Cache will be better, as you will have only one instance created per role; Session will create as many instances as there are users accessing the site, and sometimes you will have to wait for the session timeout before the memory is freed.
If every user will get the same menu:
You should consider putting it in the Application "Cache" - Application["MyMenu"] or a static field on one of your objects.
The main reason for this is lifetime. If you put it in an application-level object, it will last for the lifetime of the application. Putting it in a session-level object will cause it to be lost when that session ends; since a session is started per user, you will soon find yourself recaching the data.
On the other hand... if it's unique per user:
The session provides a handy place to put this data, as it is unique to that user, and will not live long beyond that user leaving the site.
Also think about:
If you really think memory is going to be an issue, or you want to define exactly how long you keep it for, put it in the Cache. You can determine the amount of time it lives in the cache, and, additionally, the cache will start to dump objects when it gets short on memory, so it is more sensitive to load than the other options.
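For the shared-per-role case, here is a rough sketch of caching the rendered menu once per role. The key, the one-hour lifetime and BuildMenuFor are all illustrative; BuildMenuFor stands in for your existing StringBuilder logic.

using System;
using System.Web;
using System.Web.Caching;

public static class MenuCache
{
    public static string GetMenuHtml(string role)
    {
        string key = "Menu:" + role;
        var cached = HttpContext.Current.Cache[key] as string;
        if (cached == null)
        {
            cached = BuildMenuFor(role);
            HttpContext.Current.Cache.Insert(
                key, cached, null,
                DateTime.UtcNow.AddHours(1),     // keep for an hour; tune as needed
                Cache.NoSlidingExpiration);
        }
        return cached;
    }

    private static string BuildMenuFor(string role)
    {
        // Placeholder for the code that currently builds the menu with a StringBuilder.
        throw new NotImplementedException();
    }
}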
There is a good discussion of Session vs Cache on SO already
Additionally
Are you sure your menu is that big? If it is, you might want to consider alternatives - just how big are you talking?
I need to store IDs (Contact IDs, Claim IDs, etc.) between multiple .aspx pages.
At the moment I am storing the ID in the Session and have set the Session timeout to 300 minutes. However, I am still getting errors because users are attempting to perform operations after the Session has expired.
I think users are leaving their web browsers open, locking their computers, going home for the evening, coming in the next morning and attempting to pick up where they left off.
I don't want to use the Querystring. Cookies are more for User IDs than Contact IDs and Claim IDs. Viewstate is only maintained per page. Persisting Session to a database seems unnecessarily complicated. I don't want to extend the Session timeout too much.
I'd ideally like them to be able to pick up where they left off in the morning.
What are the best practices for dealing with storing IDs between pages? How can I do this without them receiving a message saying their session has expired? Am I overlooking something obvious?!
I would put all this information into a database. This is classic use of a database, a means to store information needed for the application.
Cookies are bad because you have to assume users will be at the same PC from session to session. You have seen the problem with sessions.
If you do not want to use a database, you can use a file on the server and read from that file.
You might actually find it much easier than complicated to create a sort of "landing pad" database where a user's item history is stored, or MRU list. I wouldn't tie it to their immediate session or cookie though. Let the session maintain the user state, let the database handle where they were when they went home or took that 7 hour lunch break.
As #David said this is a classic use of databases.
If you are in need of doing this without using databases, you can fall back on some of the "tricks" (read hacks) that are used in non-dynamic languages:
Create a hidden field which exists on every page, name the field appropriately for the data you are storing.
When you render the page, create an XML fragment containing your ids and other data, encode it (using base64 etc.; this is obfuscation rather than encryption), and place the value into the hidden field you created.
When the page is posted back, reverse #2 and recover your data, then apply the page changes.
Rinse and repeat. :)
Really this is not a very scalable solution, as the XML and base64 encoding will add 50% or more to your data size, and if there is significant data (100's of KB) this process can get out of control.
The last time I was asked to use this solution it was for moving a list of well defined data around with an application that did not always have access to the DB. It worked ok for the purposes, but was definitely not ideal.
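For illustration, a minimal sketch of the hidden-field round trip described in those steps; the field and element names are made up, and since base64 is not encryption, keep anything sensitive out of it.

using System;
using System.Text;
using System.Xml.Linq;

public static class HiddenFieldState
{
    // Step 2: build an XML fragment of the ids and base64-encode it for the hidden field.
    public static string Encode(int contactId, int claimId)
    {
        string xml = new XElement("state",
            new XElement("contactId", contactId),
            new XElement("claimId", claimId)).ToString();
        return Convert.ToBase64String(Encoding.UTF8.GetBytes(xml));
    }

    // Step 3: on postback, reverse the encoding and recover the ids.
    public static XElement Decode(string hiddenFieldValue)
    {
        string xml = Encoding.UTF8.GetString(Convert.FromBase64String(hiddenFieldValue));
        return XElement.Parse(xml);
    }
}

// In the page:
// hdnState.Value = HiddenFieldState.Encode(contactId, claimId);
// int contactId = (int)HiddenFieldState.Decode(hdnState.Value).Element("contactId");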
You could go ahead and store the IDs in ViewState on the page; then, on each page that requires that ID, expose a settable property for it. Before redirecting to the page, instantiate it and set the ID property.
You're correct, storing this type of data in session isn't really what it's meant for. Viewstate is a better mechanism for this.
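A minimal sketch of a ViewState-backed property on a page (the page and property names are illustrative); bear in mind that ViewState only survives postbacks of that same page.

using System.Web.UI;

public partial class ClaimDetailsPage : Page
{
    public int ClaimId
    {
        get
        {
            object value = ViewState["ClaimId"];
            return value == null ? 0 : (int)value;
        }
        set { ViewState["ClaimId"] = value; }
    }
}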
I have an ASP.NET application that uses Session.SessionID to prevent multiple users viewing the same data at the same time.
I have a table that contains a set of images (stored in BLOB) that require processing. Only one user is supposed to be able to view the same image at the same time. To achieve this, as each record is retrieved by a user the record is updated with the Session.SessionID. This update occurs inside a ReaderWriterLock.
I have done a test to ensure the ReaderWriterLock is working correctly and can confirm that only one session can execute the code inside that block at once.
My current theory is that two different users are getting the same SessionID at the same time. A user of this application is allowed to view records they have locked or any unlocked images.
I have modified the application to display the SessionID in the footer of every page so that if the problem happens again I can check the SessionID value.
I've seen some articles online suggesting that SessionID is not unique and some saying that the SessionID is unique. I understand that SessionID is not unique forever but can the SessionID value be considered unique for active sessions?
This forum describes a similar problem
I have also read some suggestions that a Guid should be stored in the Session object and used as a unique ID instead of the Session ID.
Thanks for the responses so far. Here is a clarification based on the answers so far:
"Locked forever" - we prevent this by a lock timeout of 5 minutes. Before a user locks an image, while inside the ReaderWriterLock, we do a "cleanup" of old locks (which unlocks images locked for more than 5 minutes), a query to get the oldest unlocked image and an update statement to "lock" that image to the current session.
A possible cause of the problem would be if one user "locks" an image but then leaves the PC for a short break. If they did nothing for 5 minutes, the image on their screen would be unlocked and potentially opened by another user. I mentioned this scenario when the problem was reported and I was assured that the users had been working continuously.
"Different Window/Tab" - I haven't actually seen the error with my own eyes but the person who reported the problem has told me that it is two different PC's and two different usernames of the logged in user.
Hopefully now that I am displaying the Session ID on the page, next time it happens I will be able to say with certainty whether it is the same Session ID on two machines or if it is some other problem. This issue has never occurred during the testing phase so it appears to be a symptom of a larger number of concurrent users.
Thanks for the responses so far and I will update this question as more information comes to hand.
It seems that the user didn't give me the full story. Session ID is unique in our case, as per the accepted answer. Two users were able to see the same image at the same time because user 1 was idle long enough for the 5-minute "abandoned image" unlock process to release the image. The "abandoned image" timeout has been raised to match the session timeout to avoid this problem.
Session ID is unique per user as far as ASP.NET assigns them, though that is not a guarantee against malicious users (a user could manually copy the ID they've been assigned and give it to someone else).
What you are likely seeing here is multiple tabs or windows from the same user, as it is perfectly valid for a user to be making more than one request at a time.
To do what you want I would have to ask: how do you know when a user has stopped looking at an image, so you can unlock it? What if I view an image and then just close my browser instead of going to another page/image - does it remain locked to my (lost forever) session ID?
If you are using some kind of checkout scheme - the user must intentionally check out and check in an image - then you should perhaps be using a unique number for that checkout (a new Guid), rather than the whole user's session.
Red flag here people.
Yes you can, but it's absolutely essential that they can't be re-used, or at the very least that you make re-use as difficult as possible.
This is dependent on how well your server-side application handles Session IDs and maintains state. You should (actually MUST) think about limiting session ID lifetime and how you look after authentication/authorisation state.
I saw a critical trading platform at a top-10 investment bank where you could capture Session IDs on the fly (which contained authorised permissions) and re-use them (through a tool such as Paros from ParosProxy.org) to perform multi-million-dollar trades on someone else's behalf. In the current climate - is this an issue? ;-) Sorry - as much as I'd love to name & shame these clowns, I won't.
How likely is this scenario? Can you capture Session IDs on a switched network? Certainly within a local VLAN, using the hacking tool CAIN and taking advantage of ARP poisoning, you can.
In a poorly written server-side application, you can also predict Session IDs. Check the tool WebScarab (which is in any pen tester's armoury); it will detect the randomness of the IDs. At the same bank, with another critical application, you could generate your own Session IDs to access and trade in applications. Their focus was on low latency (which is business critical) rather than security.
An intro can be found at Owasp.org
This link says session ids are unique - http://support.microsoft.com/kb/899918
Session ids are unique in that only 1 session will ever have that id at any given time.
If this wasn't the case I think there would be a lot of people shouting very loudly about it.
The Session ID is almost certainly unique to the user. Failure modes where several users share a SessionID are very, very rare today. However, a user can do several things to create the effect you are seeing.
For instance the user can have several tabs open in her browser. Those tabs will all share the same Session ID. So if she switches back and forth between those browser tabs it might give the effect you are seeing.
Another issue is that users frequently double-click buttons and links. This means a processing request may get issued twice with the same Session ID. I would check for this possibility first.
I would like some advice from anyone experienced with implementing something like "pessimistic locking" in an asp.net application. This is the behavior I'm looking for:
User A opens order #313
User B attempts to open order #313 but is told that User A has had the order opened exclusively for X minutes.
Since I haven't implemented this functionality before, I have a few design questions:
What data should i attach to the order record? I'm considering:
LockOwnedBy
LockAcquiredTime
LockRefreshedTime
I would consider a record unlocked if the LockRefreshedTime < (Now - 10 min).
How do I guarantee that locks aren't held for longer than necessary but don't expire unexpectedly either?
I'm pretty comfortable with jQuery so approaches which make use of client script are welcome. This would be an internal web application so I can be rather liberal with my use of bandwidth/cycles. I'm also wondering if "pessimistic locking" is an appropriate term for this concept.
It sounds like you are most of the way there. I don't think you really need LockRefreshedTime though, it doesn't really add anything. You may just as well use the LockAcquiredTime to decide when a lock has become stale.
The other thing you will want to do is make sure you make use of transactions. You need to wrap the checking and setting of the lock within a database transaction, so that you don't end up with two users who think they have a valid lock.
If you have tasks that require gaining locks on more than one resource (i.e. more than one record of a given type, or more than one type of record), then you need to apply the locks in the same order wherever you do the locking. Otherwise you can get a deadlock, where one bit of code has record A locked and wants to lock record B, while another bit of code has B locked and is waiting for record A.
As to how you ensure locks aren't released unexpectedly. Make sure that if you have any long running process that could run longer than your lock timeout, that it refreshes its lock during its run.
The term "explicit locking" is also used to describe this time of locking.
I have done this manually.
Store the primary key of the record in a lock table, and mark the record's mode attribute as "edit".
When another user tries to select this record, indicate to that user that the record is read-only.
Have a configurable maximum time for locking the records.
Refresh page data for locked records. While one user is allowed to make changes, all other users are only allowed to view.
Lock table should have design similar to this:
User_ID, //who locked
Lock_start_Time,
Locked_Row_ID(Entity_ID), //this is primary key of the table of locked row.
Table_Name(Entity_Name) //table name of the locked row.
The remaining logic is something you will have to figure out.
This is just an idea which I implemented 4 years ago at the special request of a client. Since then no client has asked me to do anything similar, so I haven't explored any other method.
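For illustration only, a rough sketch of taking a lock against a table shaped like the schema above. The Locks table name, the 30-minute timeout and the plain NOT EXISTS check are assumptions; a production version would wrap this in a transaction or use locking hints to close the race between the check and the insert.

using System.Data.SqlClient;

public static class LockTable
{
    public static bool TryLock(string connectionString, string userId, int entityId, string entityName)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var cmd = conn.CreateCommand())
            {
                // Insert a lock row only if no live lock already exists for this row.
                cmd.CommandText = @"
                    INSERT INTO Locks (User_ID, Lock_start_Time, Locked_Row_ID, Table_Name)
                    SELECT @user, GETUTCDATE(), @rowId, @table
                    WHERE NOT EXISTS (
                        SELECT 1 FROM Locks
                        WHERE Locked_Row_ID = @rowId AND Table_Name = @table
                          AND Lock_start_Time > DATEADD(minute, -30, GETUTCDATE()))";
                cmd.Parameters.AddWithValue("@user", userId);
                cmd.Parameters.AddWithValue("@rowId", entityId);
                cmd.Parameters.AddWithValue("@table", entityName);
                return cmd.ExecuteNonQuery() == 1;
            }
        }
    }
}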