I have an ASP.NET MVC application that suffers from a horrible affliction. In one of the POST methods the user is able to submit an update. This update takes maybe 10 seconds to compute, and impatient users sometimes click more than once. I believe this is causing a database update race condition, and I don't know what to do. Where should I save the "isUpdating" variable in order to block such repeat requests? It can't be in a web role instance, since those are independent and my user may end up on one or the other. Nor can it be the database, because of the race condition. I'm sure there must be a standard way. I could, for example, see a scenario where I restrict users to specific web roles. Is that possible, or is there a better way?
In this case it would probably be better to write the information from the user to a queue, then return the page to the user straight away.
Then have a worker role that picks the information out of the queue and updates the database.
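A rough sketch of that shape, assuming Azure Storage queues (the Microsoft.WindowsAzure.Storage client library) and Json.NET for serialization; the UpdateRequest type, ApplyUpdate method, and connection string are placeholders:

// requires: using Microsoft.WindowsAzure.Storage;
//           using Microsoft.WindowsAzure.Storage.Queue;
//           using Newtonsoft.Json;

// In the MVC POST action: enqueue the work and return to the user straight away.
var account = CloudStorageAccount.Parse(connectionString);
var queue = account.CreateCloudQueueClient().GetQueueReference("pending-updates");
queue.CreateIfNotExists();
queue.AddMessage(new CloudQueueMessage(JsonConvert.SerializeObject(updateRequest)));

// In the worker role's Run() loop: dequeue, do the slow update, then delete the message.
var message = queue.GetMessage();
if (message != null)
{
    var request = JsonConvert.DeserializeObject<UpdateRequest>(message.AsString);
    ApplyUpdate(request);           // the ~10-second database work happens here
    queue.DeleteMessage(message);   // delete only after the update succeeded
}

Because the message is only deleted after the update succeeds, a crashed worker simply means the message becomes visible again later and gets retried.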
Related
How do I restrict a page so that only one user can access it at a time? Using ASP.NET, can I use Global.asax, or is there another way? If one user is accessing the page, another user should not be able to access it, and we have to show a message that another user is on the page. Is this possible? Can you help me or give some reference?
Although there are probably many better ways of dealing with this sort of problem, I'm going to assume that you do actually need this.
What I would do:
Make your application so that when the page is loaded (when it isn't "locked"), it logs to a database that the page was loaded and "locks" it. In the actual page, I'd have some kind of AJAX call that constantly polls the web server every 5-15 seconds to tell your application the user is still on the page. Then make the page become unlocked 5-15 seconds after the time saved to the database by the last AJAX call.
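A minimal sketch of that locking idea; PageLockRepository and its methods are hypothetical stand-ins for the database table that holds the lock rows:

// using System;
public static class PageLock
{
    private static readonly TimeSpan Timeout = TimeSpan.FromSeconds(15);

    // Called when the page loads: take the lock if it's free, expired, or already ours.
    public static bool TryAcquire(string pageName, string userName)
    {
        var existing = PageLockRepository.Get(pageName);
        bool free = existing == null
                    || existing.LockedBy == userName
                    || DateTime.UtcNow - existing.LastSeenUtc > Timeout;
        if (!free) return false;   // someone else is on the page

        PageLockRepository.Save(pageName, userName, DateTime.UtcNow);
        return true;
    }

    // Called by the periodic AJAX "still here" request to keep the lock alive.
    public static void KeepAlive(string pageName, string userName)
    {
        PageLockRepository.Save(pageName, userName, DateTime.UtcNow);
    }
}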
Again, I really suspect that there is a better way around an issue like this, but this is a direct answer to your question.
Based on this:
Yeah sure, jupaol, it depends on accounts. In my web application, one report has to be approved by only one user, but the approval authority consists of two users. If both of them access the same page and approve at the same time, it will be a big mess. Here I am not using a database.
The problem is related to concurrency. There are several ways to face an issue like this; for example, the easiest one is to use optimistic concurrency. Even if you are not using a database for this, you can emulate it.
You should be storing the approvers' results somewhere in order to mark the report as approved. With this in mind, you should be able to do something like this:
Before rendering the page, get the latest report status
If the report has not been approved, render normally
If the report was approved seconds before, render it in read-only mode reporting who approved it (or a similar approach)
Add a validation to your ChangeStatus method; in this method do the following (see the sketch after this list):
Get the latest status of the current report
If the report is still not validated, then block the thread (you could use a Mutex or similar) and mark the report as validated
If the report was already validated, raise a domain exception and handle it in your page correctly (perhaps render the page in read-only mode explaining that the report was already validated)
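A rough sketch of that ChangeStatus validation; the repository, property names, and exception type are hypothetical, and the lock only serializes requests on a single web server, so the stored status remains the authoritative check:

private static readonly object _approvalLock = new object();

public void ChangeStatus(int reportId, string approverId)
{
    lock (_approvalLock)
    {
        var report = _reportRepository.GetById(reportId);   // latest status
        if (report.IsApproved)
            throw new ReportAlreadyApprovedException(report.ApprovedBy);

        report.IsApproved = true;
        report.ApprovedBy = approverId;
        _reportRepository.Save(report);
    }
}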
If you want a more responsive application (RIA), you might want to consider the following approaches:
Perhaps this would be the worst approach, but it's still an option: you could keep a log tracking when a user requests your page, then in subsequent requests check whether the log is still valid. If it is not, redirect to another page indicating the page is in use; otherwise, allow access to the page. I believe this is an error-prone approach because you would be relying on this simple validation to prevent an inconsistency in your system, and besides, you would have the polling problem described in the next approach
Using AJAX to poll a service that checks whether the report has been approved. Perhaps this is the easiest way to accomplish this, but it is not recommended, because you would be polling your server constantly and would eventually have scalability problems
You could use Comet to notify the browser (client) whenever a server event has occurred, in this case when your report has been approved. The problem with this approach is that you have to keep an open connection with the server in order to get notified.
The last approach, and the most recommended these days, is to use WebSockets; this is the technology used on Stack Overflow to get notifications in real time.
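On the ASP.NET stack, SignalR is one commonly used option for this (it uses WebSockets where available and falls back to other transports). A hedged sketch, with hypothetical hub and method names:

// requires: using Microsoft.AspNet.SignalR;
public class ReportsHub : Hub { }

// Wherever the report gets approved on the server:
var context = GlobalHost.ConnectionManager.GetHubContext<ReportsHub>();
context.Clients.All.reportApproved(reportId, approverName);   // pushed to connected browsers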
I'm building an ASP.NET MVC application (kind of a game) which deals a lot with online users.
I made an Ajax request that fires every 10s to some action to keep the user online while he keeps the site open.
This action updates LastActivityDate for this user (in a static list and in the database).
So the question is:
Is it better to store online users in a static list, write some code to add a user to that list when he logs in, and then manage this list every 10s to kick out the offline users?
Or is it better to load online users from the database whenever I want OnlineUsers?
Note: I'm using the technique from this SO question to do periodic tasks like re-managing the OnlineUsers static list.
First, you wouldn't use a List<User> for this, but rather a Dictionary<int,User>, using the user's id as the key, so that you could immediately find the user to update. Second, I think it's probably a mixture of both. You keep a cached copy of the current users, periodically refreshed from the DB, and persist the data (asynchronously, if necessary) to the DB. You might want to think about a custom persistence class for Users that encapsulates this behavior if you find that you're doing this sort of operation in various places in your code.
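As a sketch of that mixture, the in-memory side could look something like this; ConcurrentDictionary avoids locking around the static collection, and OnlineUserRepository is an assumed data-access helper:

// using System;
// using System.Collections.Concurrent;
public static class OnlineUsers
{
    private static readonly ConcurrentDictionary<int, DateTime> _lastActivity =
        new ConcurrentDictionary<int, DateTime>();

    // Called by the 10-second AJAX ping.
    public static void Touch(int userId)
    {
        _lastActivity[userId] = DateTime.UtcNow;
    }

    // Called by a periodic background task: drop stale users, persist the rest.
    public static void Flush(TimeSpan timeout)
    {
        foreach (var entry in _lastActivity.ToArray())
        {
            if (DateTime.UtcNow - entry.Value > timeout)
            {
                DateTime removed;
                _lastActivity.TryRemove(entry.Key, out removed);
            }
            else
            {
                OnlineUserRepository.UpdateLastActivity(entry.Key, entry.Value);
            }
        }
    }
}

Flush is what the periodic task would call, instead of rebuilding the whole list from the database every 10 seconds.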
If you intend on having a large number of users, and you would need to pull data from the DB frequently, it may be better to store a list of users in the Cache. Obviously this will be stored in the server's memory, so you wouldn't want to store a large amount of objects, but if it's just a simple list of online users it shouldn't be an issue.
The scope of static is always the scope of the process that runs in the operating system. So in a desktop application the use of static makes sense. However, I find the use of static a little bit arbitrary for server-side applications, because you don't control the processes; it's the web server that does. What if the process ends unexpectedly? Or what if there are many processes serving your application?
So, the use of the database is unavoidable. Still, you can use the static scope as a temporary cache, but you cannot rely on it.
In a new project, I'm planning to use ActiveDirectoryMembershipProvider and SqlRoleProvider to provide authentication and authorization, respectively.
One thing that isn't clear to me is how maintenance is handled -- when users that have logged in and been assigned roles are removed from Active Directory, how do you remove orphaned records in the mapping table used by SqlRoleProvider? I believe this is the aspnet_UsersInRoles table.
One could query Active Directory periodically for disabled users, then iterate through that list calling Roles.RemoveUserFromRoles(UserId, Roles.GetRolesForUser(UserId)) where UserId is also in aspnet_UsersInRoles. Hugely slow, I would imagine, for a large organization.
Or, alternatively, for each distinct UserId in UsersInRoles, query Active Directory and ensure the userAccountControl attribute's bitmask doesn't indicate the account is disabled. Also very inefficient for a large number of application users.
An even uglier but much more efficient approach would be to store the last login date and periodically purge role associations for users that haven't logged in for, say, six months. This might cause headaches.
I'd love to hear suggestions.
Yes, you have to do the cleanup manually. Do you need an instantaneous update? If you can run a batch process nightly, that would be efficient, since it isn't running during core operational hours. Or it might make sense to kick off a process in another thread to handle the deletion of the role as soon as you are aware of it. Removing roles on each user access shares the hit across users and makes them think that the application is slow.
How often are roles removed? If a lot, then consider a batch process; if once in a few years, then it probably isn't as much of an issue to work it into the application during some process.
As for how to, you can use the API, but the aspnet_UsersInRoles and aspnet_Roles tables could easily be wiped on their own via a SQL script, too.
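For the API route, a nightly batch job could look roughly like this; the method name is hypothetical, error handling is omitted, and it leans on System.DirectoryServices.AccountManagement to check whether each account still exists and is enabled:

// requires: using System.DirectoryServices.AccountManagement;
//           using System.Linq;
//           using System.Web.Security;
public static void RemoveOrphanedRoleMappings()
{
    using (var domain = new PrincipalContext(ContextType.Domain))
    {
        var userNames = Roles.GetAllRoles()
                             .SelectMany(role => Roles.GetUsersInRole(role))
                             .Distinct();

        foreach (string userName in userNames)
        {
            var user = UserPrincipal.FindByIdentity(domain, userName);
            bool disabledOrGone = user == null || user.Enabled == false;
            if (!disabledOrGone) continue;

            string[] roles = Roles.GetRolesForUser(userName);
            if (roles.Length > 0)
                Roles.RemoveUserFromRoles(userName, roles);
        }
    }
}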
HTH.
Here's the question scenario:
Suppose you have a multiple-page ASP.NET web site with the following requirements:
User-specific data for the currently logged in user is loaded and is required on each individual page of the application during a user's session.
The application itself only allows a certain number of users to be logged in at one time.
The next time a specific user logs in, the user should be returned to the last page visited.
Given this information, briefly describe how you would use ASP.NET to manage the state of the application to meet these needs?
Here's my thoughts and reasons. Please provide yours.
User-specific data for the currently logged in user is loaded and is required on each individual page of the application during a user's session.
This is suggesting to me that the interviewer is looking to see if I would suggest using Master pages as a way to provide a common approach to displaying the same thing on every page.
The application itself only allows a certain number of users to be logged in at one time.
Could the sought response be that, because scaling isn't an issue due to the limited number of users, it is OK to put this information in the Session object for performance reasons? Or is this a trap, and some other approach is better?
The next time a specific user logs in, the user should be returned to the last page visited.
A cookie seems the best approach to track the last page access, since this doesn't seem to be critical information.
Please tell me how you would handle these questions if you wanted to make the best impression.
Feel free to provide input or comment on any line item.
Thanks!
As far as (3) is concerned, consider a shared PC. User A logs into a website using their site-based user name/password, does a whole load of work, and shuts down the browser. User B then comes along and, on the same PC, logs into the same site using their details. However, they will get the cookie from User A and be redirected to the last page User A saw. This happens because cookies are tied to the browser / OS user, whereas you are potentially applying the site security separately in the application.
In this situation you would either need to put the user name into the cookie (encrypted) or use a server-side method to store the location.
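A server-side sketch of that second option; LastPageStore is a hypothetical helper that persists the value per user (a Profile property or a small table would do):

// using System;
public class TrackedPage : System.Web.UI.Page
{
    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);
        if (Request.IsAuthenticated && !IsPostBack)
        {
            // Record the last page against the logged-in account, not the machine.
            LastPageStore.Save(User.Identity.Name, Request.RawUrl);
        }
    }
}

// After a successful login, redirect to whatever was saved for that account:
// string lastPage = LastPageStore.Get(User.Identity.Name);
// if (!string.IsNullOrEmpty(lastPage)) Response.Redirect(lastPage);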
Here are my thoughts:
They might be looking for Master Pages, but my first thought here was whether you're going to cache this user data, so you're not making a database query every time they hit a new page. To really impress them, you might mention partial caching techniques so that the repetitive portions of the page don't even need to be re-rendered with each page load.
I think you're right: they're helping you to conclude that the session state is an appropriate place to cache the user data. Just be sure you ask the appropriate questions, like "How many users?", and "How much data per user?"
The cached data could be used to keep track of the last-requested page, and when the user's session expires, you could save this data into a database table to be retrieved the next time they log in (see the sketch below).
That third item is awfully tricky. What if the user was last looking at an object that has since been deleted? What would be the intended behavior if a user logged in from one computer, did some work, and then logged in simultaneously from another computer or browser? I'd be sure to ask these kinds of questions, not least to show that I understand the implications of a requirement like this. If their responses lead you to believe that they're looking for a simple solution, go with the simple solution. Otherwise, tweak your response to be only as complicated as necessary.
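For the "save it when the session expires" part, a Global.asax sketch; note that Session_End only fires with InProc session state, and the session keys and repository here are assumptions:

// In Global.asax
void Session_End(object sender, EventArgs e)
{
    var userName = Session["UserName"] as string;   // assumed to be set at login
    var lastPage = Session["LastPage"] as string;   // assumed to be set on each request
    if (userName != null && lastPage != null)
    {
        UserStateRepository.SaveLastPage(userName, lastPage);
    }
}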
Just a small thought: if the system is running in a "farmed" environment, the session data can be cleared and needs to be handled in some way.
http://www.beansoftware.com/ASP.NET-Tutorials/Store-Session-State-Server.aspx
My question is how best to handle temporary data for a session. The scenario is similar to a shopping cart or a bet slip: the user navigates the site and adds items with unique IDs. I'm only interested in the data collected this way if the user wants to commit it.
I'm developing in ASP.NET 3.5 with jQuery, JSON, and a MS SQL DB.
As I see it there are a few possible ways to do this.
Perform a full postback to the server. Store every selection and update page controls accordingly.
Send selections via an Ajax request back to the server and update the displaying control.
Build all functionality in JavaScript and store all values in a session cookie. Nothing is sent to the server until the user chooses to commit.
I really want to consider performance here, but I don't want to end up with thousands of lines of JavaScript code.
Any suggestions of the best implementation with pro's and con's?
Cheers,
Stefan
Storing things in a session cookie is not a good idea, because the cookie will be sent back to the server with every request. If you could find a way to store the state on the client without using a cookie, then you might have a viable client-centric option, but I can't think of anything portable off the top of my head. There are things in HTML5 and Flash that can do it, but you don't want to go there: not yet, in the case of the former, and not at all, in the case of the latter.
I'd use AJAX to post back to the server (with graceful degradation to a full post for browsers that can't handle it), then store the information in volatile memory there, i.e. not in the database. Write it to the database only when you need to. This is very easy to do in Java (you can associate information with the session), so I assume ASP.NET has some way to do it too.
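In ASP.NET 3.5 Web Forms terms, the server side of that could be a couple of page methods (in the page's code-behind) that keep the selections in Session and only hit the database on commit; the method names and OrderRepository are hypothetical:

// requires: using System.Collections.Generic;
//           using System.Web;
//           using System.Web.Services;
[WebMethod(EnableSession = true)]
public static void AddSelection(int itemId)
{
    // Keep the running selections in session state (volatile server memory).
    var cart = HttpContext.Current.Session["Cart"] as List<int> ?? new List<int>();
    if (!cart.Contains(itemId)) cart.Add(itemId);
    HttpContext.Current.Session["Cart"] = cart;
}

[WebMethod(EnableSession = true)]
public static void Commit()
{
    // Only now does anything touch the database.
    var cart = HttpContext.Current.Session["Cart"] as List<int>;
    if (cart != null && cart.Count > 0)
    {
        OrderRepository.SaveSelections(cart);   // assumed data-access helper
        HttpContext.Current.Session.Remove("Cart");
    }
}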
All three possibilities look good to me. The question, however, is: how much traffic do you expect?
Each of the options you presented suits a particular scenario better. Let's say you will have A LOT (thousands of thousands) of users and not a lot of hardware available; then you should probably try to minimize the number of requests to your app and store data on the client as much as possible before sending it to the server.
If it is a smaller application, then using Session or some other central database storage would be fine.
It all depends on your requirements.