How to share information between .NET programs - asp.net

I'm working on a .NET web application that also runs various jobs in the background. These background jobs update the database, create report files, delete old files, etc.
The jobs are triggered on a schedule, and the scheduler itself is triggered every hour via the Windows Task Scheduler.
So far I've been monitoring the status of the jobs by logging messages to the Windows Event Viewer, but this method is very cumbersome.
I'd like to implement a new program with a GUI that will communicate with the jobs and monitor their activity. To start with, the GUI should display the following information:
Status (run vs idle)
Progress bar ("24% done", "327 of 23423 items are finished")
The gui.exe should run independently of the jobs.exe (starting or stopping the GUI should not influence the jobs, and vice versa).
I'm looking for advice: what is the best way to share information between jobs.exe and gui.exe?

General answer - all forms of IPC involve using some shared resource accessible by both processes. Examples:
network sockets (with "network" as shared resource)
named pipes
shared memory
plain old disk files (not the best way to do IPC but still a valid one)
a shared database
Specific .NET options:
Remoting
WCF
Web API
named pipes
to name a few.
Research some of them and see if they fit your use case.
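To make the named-pipe option concrete, here is a minimal sketch using `System.IO.Pipes`. The pipe name `"JobStatusPipe"` and the semicolon-separated status format are arbitrary choices for illustration, not anything the question prescribes:

```csharp
using System;
using System.IO;
using System.IO.Pipes;

static class PipeSketch
{
    // In jobs.exe: publish the current status to whoever connects.
    static void ServeStatus()
    {
        using (var server = new NamedPipeServerStream("JobStatusPipe", PipeDirection.Out))
        {
            server.WaitForConnection();                 // blocks until gui.exe connects
            using (var writer = new StreamWriter(server))
            {
                writer.WriteLine("RUNNING;327;23423");  // status;done;total
            }
        }
    }

    // In gui.exe: connect, read one status line, and display it.
    static void ReadStatus()
    {
        using (var client = new NamedPipeClientStream(".", "JobStatusPipe", PipeDirection.In))
        {
            client.Connect(1000);                       // give up if jobs.exe isn't running
            using (var reader = new StreamReader(client))
            {
                string[] parts = reader.ReadLine().Split(';');
                Console.WriteLine($"{parts[0]}: {parts[1]} of {parts[2]} items finished");
            }
        }
    }
}
```

Because the client connects with a short timeout, the GUI can be started and stopped independently of the jobs, which matches the requirement in the question.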

It depends on whether gui.exe will be running on the same system as your job services. If so, it's better not to use a web API, network sockets, or WCF for communication, but rather a simpler IPC method. I personally like message queues.

Related

Background task polling external resource at certain intervals

Requirement: I need to create a background worker/task that will get data from an external source (a message queue) at certain intervals (e.g. every 10 seconds) and update a database. It needs to run non-stop, 24 hours a day. An ASP.NET application is placing the data on the message queue.
Possible solutions:
Windows service with timer
Pros: Takes load away from web server
Cons: Separate deployment overhead, Not load balanced
Use one of the methods described here : background task
Pros: No separate deployment required; can be load balanced (if one server goes down, another can pick it up)
Cons: Overhead on web server (however, in my case with max 100 concurrent users and seeing the web server resources are under-utilized, I do not think it will be an issue)
Question: What would be a recommended solution and why?
I am looking for a .net based solution.
You shouldn't go with the second option unless there's a really good reason for it. Decoupling your background jobs from your web application brings a number of advantages:
Scalability - It's up to you where to deploy the service. It can share the same server with the web application or you can easily move it to a different server if you see the load going up.
Robustness - If there's a critical bug in either the web application or the service this won't bring the other component down.
Maintenance - Yes, there's a slight overhead as you will have to adjust your deployment process, but it's as simple as copying all binaries from the output folder, and you only have to do it once. On the other hand, you won't have to redeploy the whole application (bringing it down for some time) just to fix a small bug in the service.
etc.
Though I recommend going with the first option, I don't like the idea of a timer. There's a much simpler and more robust solution. I would implement a WCF service with MSMQ binding, as it provides a lot of nice features out of the box:
You won't have to implement polling logic. On start-up the service will connect to the queue and sit waiting for new messages.
You can easily use transactions to process queue messages. For example, if there's something wrong with the database and you can't write to it, the message being processed at that moment won't get lost; it will go back to the queue to be processed later.
You can deploy as many services listening to the same queue as you wish, to ensure scalability and availability. WCF will make sure the same queue message is not processed by more than one service: if a message is being processed by service A, service B will skip it and get the next available message.
There are many other features, which you can learn about here.
I suggest reading this article for a WCF + MSMQ service sample, to see how simple it is to implement one and use the features mentioned above. Once you are done with the WCF service, you can easily host it in a Windows service.
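As a rough sketch of what such a queued WCF service looks like (the contract name, method, and queue address below are made-up examples, not from the linked article):

```csharp
using System.ServiceModel;

// MSMQ-bound WCF contract: the ASP.NET app drops messages on the queue,
// the Windows-service host picks them up. MSMQ operations must be one-way.
[ServiceContract]
public interface IDataProcessor
{
    [OperationContract(IsOneWay = true)]
    void SubmitData(string payload);
}

public class DataProcessor : IDataProcessor
{
    // TransactionScopeRequired ties the queue receive and the database write
    // into one transaction: if the write fails, the message is not lost but
    // returns to the queue to be retried later.
    [OperationBehavior(TransactionScopeRequired = true, TransactionAutoComplete = true)]
    public void SubmitData(string payload)
    {
        // ... update the database here ...
    }
}

// The host would expose this over netMsmqBinding at an address like
// net.msmq://localhost/private/DataQueue (configured in app.config).
```

Deploying a second host process listening on the same queue gives you the scale-out behaviour described above with no code changes.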
Hope it helps!

Single Page Application with signalR: performance testing

I need to evaluate the number of concurrent users our website can handle. The website is a Single Page Application built on the .NET Framework with Durandal.js on the frontend. We use SignalR (hubs) for real-time communication between server and client.
The only option I see is "browser testing": each test would run a browser instance (or use PhantomJS, etc.) to keep a real-time connection with the server, as in real usage. Are there any other options besides tests that use a browser instance to emulate user behaviour? What is the best way to emulate a load of, say, 1000 concurrent users?
I've found several cloud services that support such load testing, e.g. Load Impact and BlazeMeter. It would be great if someone could share their experience with such tools.
SignalR provides a tool called Crank, which can be used to test how many connections a given machine can handle.
More info: http://www.asp.net/signalr/overview/performance/signalr-connection-density-testing-with-crank
Make your own script to create virtual users; that is the most effective way to recreate real-world load/stress. Use the Akka actor model (for creating the virtual users) with the Java SignalR client. If you want, you can use the Gatling tool as a framework and attach your script, written in Java or Scala, to Gatling's virtual users.
Make the script dynamic by storing user info (authentication tokens or user credentials) in an XML document.
Please comment with questions; I can guide you end to end, as I have built and deployed such a tool.

How to fire off and poll a windows service from asp.net page

A client wants an ASP.NET page with a button to fire off a database update from an external source with hundreds of records. This process takes a long time. He also wants a status update as the process runs, like "processing 10 out of 1000 records". From reading various articles, I'm thinking of putting the database update code in a Windows service. I've never worked with Windows services before, and I can't find many tutorials on how to fire off a Windows service and poll it from an ASP.NET page. My questions are: is this the best way to handle this process? And does anyone have any examples of how they've accomplished this?
There are a few ways to approach this.
You're right in that executing a long-running task within the Web's worker process doesn't usually end well: it ties up resources, the app pool can get recycled, etc. In most of my projects of any complexity, I usually end up with 4 pieces: the database, a DLL with my model, a "Worker" that is a Windows service, and an ASP.NET Web site.
The "Worker" is a Windows service that is always running and uses Quartz.net to execute scheduled tasks using the same model that the Web site uses. These can be all sorts of periodic tasks that seem to crop up when maintaining a Web site of any complexity: VacuumExpiredPickTicketsJob, BackupAndFtpDatabaseJob, SendBackorderReminderEmailsJob, etc.
Writing a Windows service is not difficult in C# (there is a built-in template in Visual Studio, but you pretty much inherit from ServiceBase and you're off to the races), and libraries like TopShelf make it even easier to deploy them.
What is left is triggering the update from the Web site and communicating the results back to the user. This can be as simple or as complicated as you want it to be. If this is something that has to scale up to lots of users, you might use something like MSMQ to queue up update commands to the Windows service, and the Windows service would respond to that queue. I get the impression that that is probably overkill here.
For a handful of users, you could override your service's OnCustomCommand(int command) method to be the trigger. Your Web site would then use ExecuteCommand() of the ServiceController class to get the process started. Your Web site and service would agree on the parameter value that means "do that update thing," let's say 142 (since it has to be a number between 128 and 255 for reasons of history).
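A minimal sketch of that custom-command handshake (the service name `"UpdateService"` is a placeholder; the value 142 is the example agreed on above):

```csharp
using System.ServiceProcess;

// Service side: react to the agreed-upon command number.
public class UpdateService : ServiceBase
{
    private const int StartUpdateCommand = 142;   // must be between 128 and 255

    protected override void OnCustomCommand(int command)
    {
        if (command == StartUpdateCommand)
        {
            // Kick off the long-running database update on a worker thread
            // so OnCustomCommand returns promptly.
        }
    }
}

// Web side: trigger it from the ASP.NET page.
public static class UpdateTrigger
{
    public static void StartUpdate()
    {
        using (var sc = new ServiceController("UpdateService"))
        {
            sc.ExecuteCommand(142);   // "do that update thing"
        }
    }
}
```

Note that the ASP.NET worker process needs sufficient rights to control the service for `ExecuteCommand` to succeed.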
As for communicating progress back to the client, it's probably easiest to just have the Web page use a timer and an AJAX call to poll for updated progress data. You can get fancy with new stuff like WebSockets (bleeding edge stuff as I write this) and long polling, but regular polling will simply work for something that doesn't need to scale.
Hope this helps!
In addition to Nicholas' thorough answer, another option is to deploy your back-end processes as command-line scripts and schedule them to run through Windows' built-in Task Scheduler, which has improved quite a bit in Windows Server 2008+. Or you can use any of a host of other task-scheduler applications.
I find the command-line approach easier for MIS staff to understand, configure, and migrate to new servers than standard Windows services.

Common Ways of handling long running process in ASP.NET

We have a long running data transfer process that is just an asp.net page that is called and run. It can take up to a couple hours to complete. It seems to work all right but I was just wondering what are some of the more popular ways to handle a long process like this. Do you create an application and run it through windows scheduler, or a web service or custom handler?
For long-running tasks in a web application project, I made a Windows service.
Whenever the user has to do a time-consuming task, IIS hands the task to the service, which returns a token (a temporary name for the task) and does the work in the background. At any time, the user can see the status of his or her task: pending in queue, processing, or completed. The service does a fixed number of jobs in parallel and keeps a queue for incoming tasks.
A Windows service is the typical solution. You do not want to use a web service or a custom handler, as both of those will fall prey to app pool recycling, which will kill your process.
Windows Workflow Foundation
What I find most appealing about WF is that workflows can be designed, without much complexity, to be persisted in SQL Server, so that if the server reboots in the middle of a process, the workflow can resume.
I use two types of processes depending on the needs of my BAs. For transfer processes that are run on demand and can be scheduled regularly, I typically write a WinForms application (this is a personal preference) that accepts command-line parameters, so I can schedule the job with params or run it on demand through an interactive window. I've written enough of them over the last few years that I have my own basic generic shell that I use to create new applications of this nature.
For processes that must detect events (files appearing in folders, receiving CyberMation calls, or detecting SNMP traps), I prefer to use Windows services so that they are always available. It's a little trickier, simply because you have to be much more cautious about memory usage, leaks, recycling, security, etc.
For me, the Windows application tends to run faster on long-duration jobs than it does through an IIS process. I don't know if this is because it's attached to an IIS thread or if its memory/security is more limited; I've never investigated it.
I do know that .Net applications provide a lot of flexibility and management over resources, and with some standards and practice, they can be banged out fairly quickly and produce very positive results.

A Way to Run a Long Process From ASP.NET page

What are your most successful ways of running a long process, like 2 hours, in ASP.NET while returning progress information to the client?
I've heard that creating a Windows service, an HTTP handler, or Remoting can be successful.
Just a suggestion...
If you have logic that you are already trying to utilize in ASP.NET, you could make an external app (Windows service, console app, etc.) that calls a web service exposed by your ASP.NET application.
For example, I had a similar problem where the code I needed was in ASP.NET and I needed to update about 3000 clients using this code. It started timing out, so I exposed the code through a web service. Then, instead of trying to run all 3000 clients through ASP.NET at once, I used a console app, run by a nightly SQL Server job, that called the web service once for each client. This way all the time-consuming processing was handled by the console app, which doesn't have the time-out issue, but the code we had already written in ASP.NET did not have to be recreated. In the end, slightly modifying the design of my existing architecture let me get around this problem easily.
It really depends on the environment and constraints you have to deal with. Hope this helps.
There are two ways that I have handled this. First, you can simply run the process and let the client time out. This has two drawbacks: the UI isn't in sync, and you are tying up an IIS thread for non-HTML purposes (I did this for a process that used to return quickly enough but grew beyond time-out limits).
The better way to handle this is to write a "service" application that handles requests passed through a database table (put the details of the request there). Then you can create a page that gives the user a "window" into ongoing progress on the task (e.g. how many records have been processed or emails sent). This status window can either have a link letting the user refresh, or you can automate the refresh using AJAX callbacks on a timer.
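A rough sketch of that database-table handoff (the `JobQueue` table, its columns, and the variables `connectionString` and `totalRecords` are all hypothetical names chosen for this example):

```csharp
using System.Data.SqlClient;

// Hypothetical JobQueue table: Id (identity), Status, ProcessedCount, TotalCount.
// Web side: enqueue the request and keep the id for the status page.
public static class JobQueue
{
    public static int Enqueue(string connectionString, int totalRecords)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            var cmd = new SqlCommand(
                "INSERT INTO JobQueue (Status, ProcessedCount, TotalCount) " +
                "OUTPUT INSERTED.Id VALUES ('Pending', 0, @total)", conn);
            cmd.Parameters.AddWithValue("@total", totalRecords);
            return (int)cmd.ExecuteScalar();   // job id shown to the user
        }
    }
}

// Service side: claim the job and bump the counter as records complete.
//   UPDATE JobQueue SET Status = 'Processing' WHERE Id = @id
//   UPDATE JobQueue SET ProcessedCount = ProcessedCount + 1 WHERE Id = @id

// Status page, polled by the browser on a timer:
//   SELECT Status, ProcessedCount, TotalCount FROM JobQueue WHERE Id = @id
```

The table doubles as both the work queue and the progress record, so the web application and the service never need a direct connection to each other.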
This isn't directly applicable but I wrote code that will let you run processes similar to "scheduled tasks" inside of ASP.NET without needing to use windows services or any type of cron jobs.
Scheduled Tasks in ASP.NET!
I very much prefer a WCF service to scheduled tasks. You might (off the top of my head) pass an address to the WCF service as a sort of "callback" that the service can call with progress reports as it works.
I'd shy away from scheduled tasks... too coarse-grained.
