Mark Attendance without notifying the user through website - asp.net

I am developing a web application for HR & Payroll in which attendance is marked through a biometrics device.
The problem is that users sometimes forget to punch the device, and their salary is deducted on that basis.
To avoid this, I have thought of an idea for marking attendance: is it possible to use each employee's IP address in my web application, so that whenever a computer is switched on for the first time in a day, that time gets saved in my database? And when that computer is switched off for the last time, that should also be recorded as attendance. I don't know how to do this - is it even possible? This would give the company's HR department great functionality, and there would be no legal hassles. Please throw some light on this.

You need to create a Windows Service for this task.
Here is a good example with a step-by-step process for creating a Windows Service:
http://www.codeproject.com/Articles/3990/Simple-Windows-Service-Sample
We had a similar situation in our organization: we read the attendance table on a schedule at a particular time and sent an email if a user had not marked attendance.
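For illustration, here is a minimal sketch of that kind of scheduled check in C#. The table and column names, connection string, and SMTP host are all assumptions for the example; the CodeProject article above covers installing and running the service itself.

```csharp
using System;
using System.Data.SqlClient;
using System.Net.Mail;
using System.ServiceProcess;
using System.Timers;

public class AttendanceCheckService : ServiceBase
{
    // Check once a day; the interval and connection string are illustrative.
    private readonly Timer _timer = new Timer(TimeSpan.FromHours(24).TotalMilliseconds);
    private const string ConnectionString =
        "Server=.;Database=HrPayroll;Integrated Security=true";

    protected override void OnStart(string[] args)
    {
        _timer.Elapsed += (s, e) => CheckAttendance();
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
    }

    private void CheckAttendance()
    {
        // Hypothetical schema: Employees(Id, Email), Attendance(EmployeeId, Date).
        const string sql = @"SELECT e.Email FROM Employees e
                             WHERE NOT EXISTS (SELECT 1 FROM Attendance a
                                               WHERE a.EmployeeId = e.Id
                                                 AND a.Date = CAST(GETDATE() AS date))";
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            using (var smtp = new SmtpClient("mail.example.com"))   // your SMTP host
            {
                while (reader.Read())
                {
                    smtp.Send("hr@example.com", reader.GetString(0),
                              "Attendance reminder",
                              "Our records show no attendance punch for you today.");
                }
            }
        }
    }
}
```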

Do the users need to log on to a system somewhere, such as the network or their computer? You could write a quick-and-dirty utility that runs when the person logs in (you could make it run hidden) and saves something to the database.
The problem comes when they leave. What I've usually done is have something that writes a ping/"I'm here!" record every 5 minutes. Then you can see when the messages first started arriving (they logged on) and when they stopped (assume they logged off). You could even use some kind of session ID so that you can have multiple records per person.
You could make this a program that either runs completely in the background or appears in the notification area.
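A quick-and-dirty heartbeat along those lines might look like the sketch below; the PresencePings table, its columns, and the connection string are hypothetical, and you would arrange for the executable to run hidden at logon (for example, via a logon script).

```csharp
using System;
using System.Data.SqlClient;
using System.Threading;

class HeartbeatLogger
{
    static void Main()
    {
        var sessionId = Guid.NewGuid();          // groups one logon's pings together
        var user = Environment.UserName;
        var machine = Environment.MachineName;

        while (true)
        {
            // Write one "still here" row; the first row of the day marks arrival,
            // the last row before shutdown approximates departure.
            using (var conn = new SqlConnection(
                "Server=hrserver;Database=HrPayroll;Integrated Security=true"))
            using (var cmd = new SqlCommand(
                @"INSERT INTO PresencePings (SessionId, UserName, Machine, PingTime)
                  VALUES (@s, @u, @m, GETDATE())", conn))
            {
                cmd.Parameters.AddWithValue("@s", sessionId);
                cmd.Parameters.AddWithValue("@u", user);
                cmd.Parameters.AddWithValue("@m", machine);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
            Thread.Sleep(TimeSpan.FromMinutes(5));
        }
    }
}
```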

Related

What type of architecture would I need to deal with partially-completed operations in the event of a server crash?

I have been working on a social network powered by ASP.NET 5 for just over six months.
As I think about my architecture, I realize it has a variety of issues. One of the biggest is that, in the event the server goes down (for whatever reason), any operation occurring at the time of the outage is lost. For example, a user deletes a picture album and the pictures associated with it: the system deletes the album record and then the server crashes. Upon reboot, the rest of the delete operation is lost, my database is cluttered with orphaned picture records, and my storage server holds orphaned image files.
What sort of architecture would allow me to solve this issue?
If you really expect this to become a problem, you may want to spend some more money on servers that stay up.
That being said, this edge case is always present. There can always be an HTTP request still in flight between the client and the server, or one that has just arrived at the first line of your code when the server goes down - before you have stored the request data, so you cannot repeat it.
There's really nothing you can do about that in principle.
You can use transactions to keep database operations atomic (i.e. all of it happens or none of it does), but when work needs to happen in both the database and the filesystem and stay synchronized, you've got a different problem.
You can, however, introduce some kind of job system. A "delete photo album job" could look like this:
1. User clicks "Delete album".
2. System stores a "DeleteAlbumJob" in the database. The album stays online (database) and present (filesystem).
3. The job system processes the job until it is marked as successfully completed.
The job system - for example one built on Hangfire - processes these jobs step by step, recording where it is. Say the photos from the album have been removed from the filesystem, but the album still exists in the database, and the server goes down. On the next reboot, the service starts again and the job continues to be processed from where it left off.
In the same transaction as creating the job in the database, you can flag the album as "DeletionPending" so it is no longer shown to the user.
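To make the idea concrete, here is a hand-rolled sketch of such a resumable job (Hangfire provides this machinery for you; the IAlbumStore abstraction and all method names here are hypothetical). The key properties are that each step is idempotent and the current step is persisted, so a crash mid-job resumes cleanly:

```csharp
// Illustrative resumable job: each step can safely run twice, and the
// step counter is persisted after each completed step.
public class DeleteAlbumJob
{
    public int AlbumId { get; set; }
    public int Step { get; set; }   // persisted progress marker

    public void Run(IAlbumStore store)   // hypothetical storage abstraction
    {
        if (Step == 0)
        {
            store.FlagAlbumDeletionPending(AlbumId);  // hides the album from users
            Step = 1; store.SaveJob(this);
        }
        if (Step == 1)
        {
            foreach (var path in store.GetPhotoFiles(AlbumId))
                store.DeleteFileIfExists(path);        // safe to repeat after a crash
            Step = 2; store.SaveJob(this);
        }
        if (Step == 2)
        {
            store.DeleteAlbumRecords(AlbumId);         // one database transaction
            store.MarkJobComplete(this);
        }
    }
}
```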

Trigger a series of SMS alerts over time using Twilio/ASP.NET

I didn't see a situation quite like mine, so here goes:
Scenario highlights: The user wants a system that includes custom SMS alerts. A component of the functionality is to have a way to identify a start based on user input, then send SMS with personalized message according to a pre-defined interval after the trigger. I've never used Twilio before and am noodling around with the implementation.
First Pass Solution: Using my Twilio account, I designated the .aspx page that receives the inbound triggering alert/SMS via GET. The receiving page declares and instantiates my SMSAlerter object within Page_Load, which responds immediately with a first SMS and kicks off a System.Timers.Timer. Elementary, and functional to a point.
Problem: The alerts continue to be sent if the interval for the timer is a short time span. I tested it at a minute interval and was successful. When I went to 10 minutes, the immediate SMS is sent and the first message 10 minutes later is sent, but nothing after that.
My Observation: Since there is no interaction with the resource after the inbound text, the Session times out if left at the default 20 minutes. Increasing the Session timeout doesn't work, and even if it did, it would not seem correct, since the intervals will be on the order of hours, not minutes.
Using Cache to store each new SMSAlerter might be the way to go. For any SMSAlerter that is created, the schedule is used for roughly 12 hours and is replaced with a new SMSAlerter object when the same user notifies the system the following day. Is there a better way? Am I over/under-simplifying? I am not anticipating heavy traffic now (tens of users), but the user is thinking big.
Thank you for comments, suggestions. I didn't include the code, because the question is about design, not syntax.
I think your timer object is going out of scope about 20 minutes after the original request, which kills the timer. I have a feeling that if you kept refreshing the aspx page it wouldn't happen - but obviously that doesn't help much.
You could launch a new thread that holds the System.Timers.Timer object so it stays alive and doesn't go out of scope when there are no follow-up requests to the server. To be honest, this isn't a great idea, although it might help with understanding the issue.
Ultimately, you'll need some sort of continuously running service, since you don't want to depend on the app pool for this. I'd suggest a Windows Service running in the background to handle it; that is the suitable long-term solution.
Hope this helps!
(Edited slightly to make the Windows Service aspect clearer)
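By way of illustration, the service's polling loop could look something like the sketch below, with pending alerts persisted in a table so nothing depends on the app pool staying alive. The ScheduledAlerts schema, credentials, and phone numbers are assumptions for the example; the Twilio calls use the twilio-csharp helper library.

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using Twilio;
using Twilio.Rest.Api.V2010.Account;
using Twilio.Types;

public class AlertSender
{
    // Call this from the service's timer tick.
    public void SendDueAlerts(string connectionString)
    {
        TwilioClient.Init("ACCOUNT_SID", "AUTH_TOKEN");   // your Twilio credentials

        var due = new List<Tuple<int, string, string>>();
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                @"SELECT Id, Phone, Message FROM ScheduledAlerts
                  WHERE SendAtUtc <= GETUTCDATE() AND Sent = 0", conn))
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    due.Add(Tuple.Create(reader.GetInt32(0),
                                         reader.GetString(1), reader.GetString(2)));

            foreach (var alert in due)
            {
                MessageResource.Create(
                    to: new PhoneNumber(alert.Item2),
                    from: new PhoneNumber("+15550001111"),    // your Twilio number
                    body: alert.Item3);

                using (var mark = new SqlCommand(
                    "UPDATE ScheduledAlerts SET Sent = 1 WHERE Id = @id", conn))
                {
                    mark.Parameters.AddWithValue("@id", alert.Item1);
                    mark.ExecuteNonQuery();
                }
            }
        }
    }
}
```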

Limit concurrent access to Asp.Net application

I want to limit concurrent access to internet or intranet web applications.
I want to be able to allow a certain number of concurrent accesses; let's say I want to allow a maximum of 20 concurrent accesses using the same username.
I can imagine creating a session at login time and saving it to the DB while incrementing a counter, but there is no way to delete it from the DB and decrement the counter when the user logs off, since the user may just close the browser.
What is the best way, in your opinion?
Thanks,
You are right in that you cannot know if a user just kills his browser or pulls the cord.
Create a timer that does an Ajax call every 30 seconds and log it. If a user hasn't appeared in this log for 60 seconds, throw them out at the server.
This won't stop someone from leaving their browser open overnight and hogging a slot, though.
There is another answer mentioning a "last active" field. That, on the other hand, breaks when someone needs to leave their browser open to look at something for longer than the timeout.
If someone is thrown out due to the user/timeout limit, I think it would be nice to have them logged in again automatically and completely transparently once a slot opens up again.
Without thinking too hard about it, I believe this solution will give birth to other problems.
Edited away:
As I commented - with only 20 concurrent users, are you really solving the right problem?
You may wish to have a "last active" field on your session. When someone attempts to log in or otherwise create a session, you can query for all sessions that have been active within a period of time (say, 15 minutes) and use the resulting count to tell you how many concurrent sessions are active.
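A sketch of that "last active" approach might look like this; the UserSessions table, the 15-minute window, and the limit of 20 are assumptions taken from the question (inserting the session row at login is omitted):

```csharp
using System;
using System.Data.SqlClient;

public static class SessionLimiter
{
    private const int MaxConcurrent = 20;   // from the question

    // Call periodically (e.g. from an Ajax heartbeat) to refresh the timestamp.
    public static void Touch(string connectionString, Guid sessionId)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "UPDATE UserSessions SET LastActive = GETUTCDATE() WHERE SessionId = @sid", conn))
        {
            cmd.Parameters.AddWithValue("@sid", sessionId);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }

    // Call at login: only sessions active in the last 15 minutes count.
    public static bool CanLogIn(string connectionString, string userName)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            @"SELECT COUNT(*) FROM UserSessions
              WHERE UserName = @u
                AND LastActive > DATEADD(MINUTE, -15, GETUTCDATE())", conn))
        {
            cmd.Parameters.AddWithValue("@u", userName);
            conn.Open();
            return (int)cmd.ExecuteScalar() < MaxConcurrent;
        }
    }
}
```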

Running a query in Page Load a bad idea?

I'm running an ASP.NET app to which I have added an insert/update query in the global Page_Load. So, each time the user hits any page on the site, it updates the database with their activity (session ID, time, page they hit). I haven't implemented it yet, but this was the only suggestion given to me for keeping track of how many people are currently on my site.
Is this going to kill my database and/or IIS in the long run? We figure the site averages between 30,000 and 50,000 users at one time. I can't have my site constantly locking up over a database hit on every single page view for every single user. I'm concerned that's what will happen, but this is the first time I have attempted a solution like this, so I may just be overly paranoid.
Do it async.
Create a DLL that handles the update, and in Page_Load do a fire-and-forget call with the parameters.
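A minimal fire-and-forget sketch might look like this, with the ActivityLog table and connection string as placeholders; the page request returns without waiting on the database:

```csharp
using System.Data.SqlClient;
using System.Threading;

public static class ActivityLogger
{
    public static void LogAsync(string connectionString, string sessionId, string page)
    {
        // Queue the insert to the thread pool so the request thread never blocks.
        ThreadPool.QueueUserWorkItem(_ =>
        {
            try
            {
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand(
                    @"INSERT INTO ActivityLog (SessionId, Page, HitTime)
                      VALUES (@s, @p, GETDATE())", conn))
                {
                    cmd.Parameters.AddWithValue("@s", sessionId);
                    cmd.Parameters.AddWithValue("@p", page);
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            }
            catch { /* swallow: activity logging must never take down a page */ }
        });
    }
}
```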
Insert-based designs have less locking than update-based designs.
So if a user logged in and then logged out, in an insert-based design you would have multiple rows, each with a SessionID, one per activity; in an update-based design you would have SessionId, LoginTime and LogoutTime columns and would update LogoutTime based on the SessionId.
I have seen many more locking and contention problems caused by update activity than by insert activity.
Activities such as counting and linking logins to logouts do take more complex queries and a little more resources.
It goes without saying that your queries, especially the ones that run on every page, should be as fast as possible so that the site doesn't appear slow to users.
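To make the contrast concrete, here is a sketch of both shapes (table names illustrative): the insert-based version only ever appends rows, so writers never fight over the same record, while the update-based version must find and lock the login row at logout.

```csharp
using System;
using System.Data.SqlClient;

public static class SessionAudit
{
    // Insert-based: one new row per event; no row is ever rewritten.
    public static void LogEvent(SqlConnection conn, Guid sessionId, string activity)
    {
        using (var cmd = new SqlCommand(
            @"INSERT INTO SessionEvents (SessionId, Activity, EventTime)
              VALUES (@s, @a, GETDATE())", conn))
        {
            cmd.Parameters.AddWithValue("@s", sessionId);
            cmd.Parameters.AddWithValue("@a", activity);   // e.g. "Login", "Logout"
            cmd.ExecuteNonQuery();
        }
    }

    // Update-based: logout rewrites the login row, taking a row lock on it.
    public static void LogLogout(SqlConnection conn, Guid sessionId)
    {
        using (var cmd = new SqlCommand(
            "UPDATE Sessions SET LogoutTime = GETDATE() WHERE SessionId = @s", conn))
        {
            cmd.Parameters.AddWithValue("@s", sessionId);
            cmd.ExecuteNonQuery();
        }
    }
}
```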
To keep track of how many users are currently on your site, you could use performance counters. What you describe, though, sounds more like full-fledged logging of every page hit.
Let's say you really have 50k users connected at any one time.
As long as you don't have contention between the updates (trying to lock the same record), a database can handle a very high number of inserts and updates. You need to do some capacity planning to ensure the load can be carried: 50k users visiting a page every minute will give you 50k inserts and 50k updates per minute, roughly 850 inserts and 850 updates per second, all of which have to commit (flush the log). Does your DB I/O subsystem support such write pressure, in addition to responding to all the requests (reads)?
Also, 50k users doing one page hit per minute adds up to 72 million hits per day - 72 million logging inserts. At such a rate you need to carefully plan the size capacity of the database and consider what kind of analysis you'll do on the collected data, since ad-hoc querying of 2 billion rows (one month of data) will get you nowhere fast (actually... quite slowly).
Doing it async can give you some relief over very short spikes, but not in the long run. If your DB system cannot handle the load, then async calls will just create a backlog queue in the application process (in the ASP.NET app pool), and this will grow until it runs out of memory, at which moment the ever-vigilant IIS will 'recycle' the app pool, losing all pending async updates.
I think updating the database in Session_Start and Session_End will do the job, and will reduce the number of statements dramatically.
I think it makes no difference whether you track hits or session begin/end; with hits you'll also need additional logic to subtract inactive users.
EDIT: Session_End is not always fired. I would suggest calling an update statement/stored procedure in the Session_Start event (in addition to the other insert statement) to fix invalid sessions.
I don't think calling this "fix routine" in every Page_Load event is necessary, because you can't count the exact number of current visitors anyway.
I would keep this in Application state instead, if possible. On Application_Start, create a data structure in Application state that you can update from anywhere in your application: Session_Start, Page_Load, wherever. Keep it out of the database; it sounds like you are just using it to track "currently online" info anyway.
If you have multiple instances of your app, or if there is a requirement to maintain historical info beyond the IIS logs, this obviously won't work. Go with chris's fire-and-forget solution in that case.
What's wrong with IIS Logs?
2009-05-01 12:30:31 207.219.27.35 GET /assocadmin/ibb-reg.asp - usernameremoved 544.566.570.575 Mozilla/4.0+(compatible;+MSIE+7.0;+Windows+NT+6.0;+SLCC1;+.NET+CLR+2.0.50727;+Media+Center+PC+5.0;+.NET+CLR+3.5.30729;+.NET+CLR+3.0.30618) 200 0 0 40058
EDIT: I'd like to close this answer, but I want the comments to stay. Consider this answer withdrawn.
How about adding a small object to the session?
Something like a LoggedInUserFlag : IDisposable.
In the constructor, increment your counter however you decide to implement it.
Then, in the Dispose method, decrement the counter.
This way, regardless of how the session ends, the counter will always (eventually) be decremented.
See http://weblogs.asp.net/cnagel/archive/2005/01/23/359037.aspx for info on using IDisposable.
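A sketch of that flag object might look like this (whether it works hinges on session teardown actually disposing the object, which is what the linked post discusses); the counter is static so it is shared across sessions, and Interlocked keeps it thread-safe:

```csharp
using System;
using System.Threading;

public class LoggedInUserFlag : IDisposable
{
    private static int _count;      // shared across all sessions
    private bool _disposed;

    public static int CurrentUsers
    {
        get { return _count; }
    }

    public LoggedInUserFlag()
    {
        Interlocked.Increment(ref _count);
    }

    public void Dispose()
    {
        if (!_disposed)             // guard against double-dispose
        {
            _disposed = true;
            Interlocked.Decrement(ref _count);
        }
    }
}

// Usage in Global.asax: Session["flag"] = new LoggedInUserFlag();
```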
I am not an ASP guy at all, but rather than logging all that other info, what about just inserting their IP address?
If an IP address is already in there, update a last_seen timestamp, and on each refresh delete any row that hasn't been seen in the last 10 minutes.
This is how I would take a shot at it. It is much more space-efficient, but I am not sure about doing that much checking and deleting on such a high-traffic site.
As a direct answer to your question: yes, running a database query inline with every request is a bad idea:
Synchronous requests will tie up a thread, which will reduce your scalability (fewer simultaneous activities)
DB inserts (or updates) are writes to the DB, which will put load on your log volume
DB accesses shouldn't be required in a single-server / single-AppPool scenario
I answered your question about how to count users in the other thread:
Best way to keep track of current online users
If you are operating in a multi-server / load-balanced environment, then DB accesses may in fact be required. In that case:
Queue them to a background thread so the foreground request thread doesn't have to wait (see the sketch below)
Use Resource Governor in SQL 2008 to reduce contention with other DB accesses
Collect several updates / inserts together into a single batch, in a single transaction, to minimize log disk I/O pressure
Return the current count with each DB access, to minimize round-trips
In case it's of any interest, I cover sync/async threading issues and the techniques above in detail in my book, along with code examples: Ultra-Fast ASP.NET.
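A sketch of that queue-and-batch approach might look like this: page requests enqueue and return immediately, and a single worker drains the queue and commits several inserts per transaction, so the log disk flushes once per batch instead of once per hit. The Hits table is illustrative, and session IDs are assumed to be server-generated (parameterize if the values can be user-influenced).

```csharp
using System.Collections.Concurrent;
using System.Data.SqlClient;
using System.Text;

public static class BatchedHitLogger
{
    private static readonly BlockingCollection<string> Queue =
        new BlockingCollection<string>();

    // Called from the page: O(1), never touches the database.
    public static void Enqueue(string sessionId)
    {
        Queue.Add(sessionId);
    }

    // Run on one dedicated background thread.
    public static void WorkerLoop(string connectionString)
    {
        while (true)
        {
            var batch = new StringBuilder();
            // Block for the first item, then drain whatever else is waiting.
            AppendInsert(batch, Queue.Take());
            string next;
            while (batch.Length < 8000 && Queue.TryTake(out next))
                AppendInsert(batch, next);

            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                using (var tx = conn.BeginTransaction())
                using (var cmd = new SqlCommand(batch.ToString(), conn, tx))
                {
                    cmd.ExecuteNonQuery();
                    tx.Commit();   // one log flush for the whole batch
                }
            }
        }
    }

    private static void AppendInsert(StringBuilder batch, string sessionId)
    {
        batch.AppendFormat(
            "INSERT INTO Hits (SessionId, HitTime) VALUES ('{0}', GETDATE());",
            sessionId);
    }
}
```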

How to find how many people are online in a website

I have to show how many people are online on the site, which was developed using ASP.NET 2.0. Currently I am using the Session_Start event (increase by 1) and the Session_End event (decrease by 1) in Global.asax, but the Session_End event is not called reliably most of the time. Is there a better way to achieve this?
You're not going to get much better than incrementing in Session_Start and decrementing in Session_End unless you use some other means, like a database.
When you authenticate requests, you could update a timestamp in the database to show that the user has been active; after a given time (say 15 minutes), the query that counts concurrent users ignores that row in the database (thus decrementing the count).
A quick Google search revealed a handy way of doing this with an HttpModule.
All in all, Yohann did a great job with this. It does implement the sort of fixed timeout suggested above, but otherwise there is no accurate way of doing this short of checking the server's perfmon.exe and reading the Web Service counters for the app pool's current connections.
When I implemented this myself, I used a SQL Server table to store a date/time and user info on authentication. I decremented the count by re-assessing and matching the IP addresses whenever I had to refresh the data cache (once every 10 minutes).
We had the same issue in a project. After trying several methods, we ended up tracking each user's idle time: when the idle time exceeds the session timeout, we consider the user no longer online. Of course, you also need to consider other cases, such as the user logging off, or logging back in after a timeout...
I've done this before and can attest that Session_End will only be called if you manually destroy the session (Session.Abandon). When you think about it, it makes sense: if the user isn't on the website, the code never gets executed.
What I did was store a Hashtable in Application state containing the username and a DateTime for the last time the user was seen on the site. On every page load I called a function that inserted or updated this value for the current user, then culled the whole list, removing all entries older than the session timeout (20 minutes or whatever). Remember to use lock or other synchronization to avoid race conditions when changing this list.
This has the added benefit of telling you not only how many people are online, but specifically which users.
If you don't have something unique like a username, you can use Session.SessionID instead; it should be unique per visitor to your site.
But be careful: an Application or app-instance state variable has its own share of problems, since it won't be shared between processes in "Web Garden" mode or in a multi-server setup. You would need a more persistent medium, such as a database or distributed cache, for larger-scale setups.
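A sketch of that approach might look like this (using a generic Dictionary rather than a Hashtable; the 20-minute timeout comes from the answer above, and the rest of the names are illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class OnlineUsers
{
    private static readonly Dictionary<string, DateTime> LastSeen =
        new Dictionary<string, DateTime>();
    private static readonly object Sync = new object();
    private static readonly TimeSpan Timeout = TimeSpan.FromMinutes(20);

    // Call on every page load, keyed by username or Session.SessionID.
    public static int TouchAndCount(string key)
    {
        lock (Sync)   // guard against concurrent page loads mutating the map
        {
            LastSeen[key] = DateTime.UtcNow;

            // Cull everyone not seen within the session timeout.
            var cutoff = DateTime.UtcNow - Timeout;
            var stale = LastSeen.Where(kv => kv.Value < cutoff)
                                .Select(kv => kv.Key)
                                .ToList();
            foreach (var k in stale)
                LastSeen.Remove(k);

            return LastSeen.Count;
        }
    }
}
```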
