How to do asynchronous/independent handling of a webpage request?

I'm looking to generate a set of images as a result of a user submitting a form on an ASP.NET web page. There's a risk that the server will be overloaded with requests, so I'm looking for some way of handing off the image generation to a separate process which does the generation and emails the user when the images are ready for download. Ideally the separate process would be continuously running so that it can respond to requests promptly.
Any ideas what to use here? Needs to be .net.
Cheers, Ian.

I would create a Java Servlet and a class with a Quartz scheduler to handle the background jobs.
When a user submits the form, you create the background jobs; Quartz will then handle the queuing and processing of those jobs.
You could then have the JavaScript client ask from time to time (every 10 seconds or so) whether the jobs are finished and, if so, return the URLs to the pictures.
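Since the original question requires .NET, here is a minimal sketch of the same idea using Quartz.NET, the .NET port of Quartz (this assumes the 2.x API, where scheduler calls are synchronous; the job class, the "email" data key, and GenerateAndEmail are illustrative assumptions, not a fixed API):

using Quartz;
using Quartz.Impl;

public class ImageGenerationJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Runs on a Quartz worker thread, outside the request pipeline.
        string email = context.MergedJobDataMap.GetString("email");
        GenerateAndEmail(email); // hypothetical: render the images, then mail a download link
    }

    private static void GenerateAndEmail(string email) { /* ... */ }
}

public static class JobSubmitter
{
    // Called from the form's submit handler: enqueue the job and return at once.
    public static void Submit(string userEmail)
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start(); // no-op if the scheduler is already running

        IJobDetail job = JobBuilder.Create<ImageGenerationJob>()
            .UsingJobData("email", userEmail)
            .Build();

        scheduler.ScheduleJob(job, TriggerBuilder.Create().StartNow().Build());
    }
}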

Related

Background task in ASP.NET

I am writing a web application using ASP.NET (not MVC), with .NET v4 (not v4.5).
I fetch some of the data which I must display from a 3rd-party web service, one of whose methods takes a long time (several seconds) to complete. The information to be fetched/prefetched varies depending on the users' initial requests (because different users ask for details about different objects).
In a single-user desktop application, I might:
Display my UI as quickly as possible
Have a non-UI background task to fetch the information in advance
Therefore hope to have an already-fetched/cached version of the data by the time the user drills down into the UI to request it
To do something similar using ASP.NET, I guess I can:
Use a BackgroundWorker, passing the Session instance as a parameter to the worker
On completion of the worker's task, write fetched data to the Session
If the user's request for data arrives before the task is complete, then block until it has completed
Do you foresee problems, can you suggest improvements?
[There are other questions on StackOverflow about ASP.NET and background tasks, but these all seem to be about fetching and updating global application data, not session-specific data.]
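For illustration, a minimal sketch of the approach guessed at above, on .NET 4 (Task<T> is available, async/await is not). Note that it stores the Task in Session rather than handing the Session instance to the worker, so the background thread never touches Session; FetchDetails, ResultLabel, and OnDrillDown are illustrative assumptions:

using System;
using System.Threading.Tasks;
using System.Web.UI;

public partial class DetailsPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            // Start the slow fetch immediately and stash the Task in Session.
            int objectId = int.Parse(Request.QueryString["id"]);
            Session["prefetch"] = Task.Factory.StartNew(() => FetchDetails(objectId));
        }
    }

    protected void OnDrillDown(object sender, EventArgs e)
    {
        // If the fetch already finished this returns instantly;
        // otherwise .Result blocks until the task has completed.
        var task = (Task<string>)Session["prefetch"];
        ResultLabel.Text = task.Result;
    }

    // Hypothetical wrapper around the slow 3rd-party web service method.
    private static string FetchDetails(int objectId)
    {
        return "details for object " + objectId; // placeholder
    }
}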
Why not use the same discipline as in a desktop application:
Load the page without the data from the service ( = Display my UI as quickly as possible)
Fetch the service data using an ajax call (= Have a non-UI background task to fetch the information in advance)
This is actually the same, although you can show an animated GIF indicating the fetch is still in progress... (Therefore hope to have an already-fetched/cached version of the data by the time the user drills down into the UI to request it)
In order to post example code, it would be helpful to know what you are using: jQuery? Plain JavaScript? Something else? No JavaScript?
Edit
I am not sure if this was your plan, but another idea is to fetch the data on the server side as well, and cache it for future requests.
In this case the stages will be (see the sketch after the list):
1. Get a request.
2. Is the service data cached?
2.a. Yes? Serve the page with the full data.
2.b. No? Serve the page without the service data, then:
2.b.i. On the server side, fetch the service data and cache it for future requests.
2.b.ii. On the client side, fetch the service data and cache it for the current session.
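A minimal sketch of the server-side branch (steps 2.a and 2.b.i) using the built-in ASP.NET cache; GetServiceData is a hypothetical wrapper around the external service:

using System;
using System.Threading;
using System.Web;
using System.Web.Caching;

public static class ServiceDataCache
{
    // Step 2: returns the cached data if present (2.a), or null so the
    // page can be served without it (2.b).
    public static string TryGet(string key)
    {
        return HttpRuntime.Cache[key] as string;
    }

    // Step 2.b.i: fetch on a background thread and cache for future requests.
    public static void FetchAndCacheInBackground(string key)
    {
        ThreadPool.QueueUserWorkItem(_ =>
        {
            string data = GetServiceData(key); // slow external call
            HttpRuntime.Cache.Insert(key, data, null,
                DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);
        });
    }

    // Hypothetical wrapper around the slow service method.
    private static string GetServiceData(string key)
    {
        return "data for " + key; // placeholder
    }
}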
Edit 2:
Bear in mind that the downside of this discipline is that if the method by which you fetch the data changes, you will have to remember to modify it on both the server and the client side.

Data prefetching using scheduled tasks in an ASP.NET environment

We're developing an ASP.NET application that retrieves data from WCF services.
The user interface (developed with JQM and ASP.NET) shows a loading panel while the request is processed, and the data is paged server-side so our database returns only the first n records per request.
In this scenario, the user performs a search, and before he navigates through the pages, we want to prefetch the results of the next n pages and store them in a RAM cache object to avoid delay.
To do this we've created a scheduled process with Quartz.NET, triggered every 3 seconds, that checks a FIFO queue and requests data for the subsequent pages.
The scheduled job is configured to allow concurrent execution, and Quartz.NET is set to use a maximum of 10 threads.
In your opinion, is this a good approach?
What might the issues be?
Can the intensive use of a job (fired every 3 seconds), and possibly of many threads, make the application unstable?
Would a solution based on a FileSystemWatcher trigger be better?
I assume that you cannot avoid the delay when the user searches for the first time. In such a case you can make an ajax request to cache data for subsequent requests.
If you are using classic ASP.NET, then this link discusses making Ajax calls in ASP.NET: http://weblogs.asp.net/craigshoemaker/archive/2008/11/07/using-jquery-to-call-asp-net-ajax-page-methods-by-example.aspx
If you decide to use Quartz, you will need to embed it in the ASP.NET application, which is an overhead.
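For the classic ASP.NET route that the linked article describes, here is a minimal sketch of a page method the client-side Ajax call could hit to warm the cache; the Prefetch parameters and the PrefetchPage helper are assumptions for illustration:

using System.Web.Services;
using System.Web.UI;

public partial class SearchResults : Page
{
    // Static page method reachable from client script
    // (e.g. jQuery posting JSON to SearchResults.aspx/Prefetch).
    [WebMethod]
    public static void Prefetch(string query, int page)
    {
        // Pull the requested page into the RAM cache so the user's
        // next paging click is served without hitting the database.
        PrefetchPage(query, page); // hypothetical caching helper
    }

    private static void PrefetchPage(string query, int page)
    {
        // ... call the WCF service and insert the results into the cache ...
    }
}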

A Way to Run a Long Process From ASP.NET page

What are your most successful ways of running a long process, like 2 hours, in ASP.NET and returning progress information to the client?
I've heard that creating a Windows service, an HttpHandler, or using Remoting can be successful.
Just a suggestion...
If you have logic that you are trying to utilize that already lives in ASP.NET, you could make an external app (Windows service, console app, etc.) that calls a web service on your ASP.NET site.
For example, I had a similar problem where the code I needed was in ASP.NET and I needed to update about 3,000 clients using it. It started timing out, so I exposed the code through a web service. Then, instead of trying to run all 3,000 clients through ASP.NET at once, I used a console app, run by a nightly SQL Server job, that called the web service once for each client. This way all the time-consuming processing was handled by the console app, which doesn't have the time-out issue, while the code we had already written in ASP.NET did not have to be recreated. In the end, slightly modifying the design of my existing architecture let me easily get around this problem.
It really depends on the environment and constraints you have to deal with... Hope this helps.
There are two ways that I have handled this. First, you can simply run the process and let the client time out. This has two drawbacks: the UI isn't in sync, and you are tying up an IIS thread for non-HTML purposes (I did this for a process that used to return quickly enough but grew beyond the time-out limits).
The better way to handle this is to write a "Service" application that handles requests passed through a database table (put the details of the request there). Then you can create a status page that gives the user a "window" into the ongoing progress of the task (e.g. how many records have been processed or emails sent). This status page can either have a link that lets the user refresh it, or you can automate the refresh using Ajax callbacks on a timer.
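A minimal sketch of that request-table pattern, assuming a hypothetical WorkRequests table with Id, Payload, Status, and Progress columns; the web page INSERTs a 'Pending' row, and the service runs a loop like this:

using System;
using System.Data.SqlClient;
using System.Threading;

class WorkerLoop
{
    const string ConnString = "..."; // placeholder connection string

    static void Main()
    {
        while (true)
        {
            int id = -1;
            string payload = null;
            using (var conn = new SqlConnection(ConnString))
            {
                conn.Open();

                // Claim one pending request, atomically flipping its status.
                var claim = new SqlCommand(
                    @"UPDATE TOP (1) WorkRequests
                      SET Status = 'Running'
                      OUTPUT inserted.Id, inserted.Payload
                      WHERE Status = 'Pending'", conn);
                using (var reader = claim.ExecuteReader())
                {
                    if (reader.Read())
                    {
                        id = reader.GetInt32(0);
                        payload = reader.GetString(1);
                    }
                }

                if (id != -1)
                {
                    DoWork(id, payload); // hypothetical long-running task
                    var done = new SqlCommand(
                        "UPDATE WorkRequests SET Status = 'Done' WHERE Id = @id", conn);
                    done.Parameters.AddWithValue("@id", id);
                    done.ExecuteNonQuery();
                }
            }
            Thread.Sleep(5000); // idle poll interval
        }
    }

    static void DoWork(int id, string payload)
    {
        // ... do the work; periodically UPDATE the Progress column so the
        // status page has something to show ...
    }
}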
This isn't directly applicable, but I wrote code that will let you run processes similar to "scheduled tasks" inside ASP.NET without needing Windows services or any type of cron job.
Scheduled Tasks in ASP.NET!
I very much prefer a WCF service to scheduled tasks. You might (off the top of my head) pass an address to the WCF service as a sort of 'callback' that the service can call with progress reports as it works.
I'd shy away from scheduled tasks... too coarse-grained.

Keeping my web app running after Browser close

I have an ASPX web application that updates or adds files in a database. The clients access it through the browser, and one of the requirements is that they can start the update and then close the browser while the update continues. It appears to run for a little bit after I close the browser, but then it stops. How can I keep the process running in ASP.NET?
That's something you could very well solve with WF (Workflow Foundation). Create a workflow for the task that should survive closing the browser. Workflows have their own threads and lifecycles, separate from ASP.NET.
The web application will keep running in the application pool, but this will be recycled eventually. As long as the user's session runs, the application should be kept alive, so upping the session timeout may fix the problem.
A better approach though would be to move the long-running task into a service instead, but that may require a rewrite of your application.
Usually for long-running or asynchronous processing, you want to dispatch the request to a back-end service to handle. Trying to keep the web-app alive to finish processing can lead to problems, especially with HTTP and session timeouts.
A common pattern for this is to put the request on a message queue and let a back-end service process it when it can.
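A minimal sketch of that pattern using MSMQ (System.Messaging); the queue path and the string payload are assumptions for illustration:

using System.Messaging;

public static class UpdateDispatcher
{
    const string QueuePath = @".\Private$\updateRequests";

    // Called from the web app: enqueue the request and return immediately,
    // so nothing is lost when the user closes the browser.
    public static void Enqueue(string fileId)
    {
        if (!MessageQueue.Exists(QueuePath))
            MessageQueue.Create(QueuePath);
        using (var queue = new MessageQueue(QueuePath))
            queue.Send(fileId, "file update");
    }
}

// In the back-end Windows service: block until work arrives.
public class UpdateService
{
    public void Run()
    {
        using (var queue = new MessageQueue(@".\Private$\updateRequests"))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            while (true)
            {
                Message msg = queue.Receive(); // blocks until a message arrives
                string fileId = (string)msg.Body;
                // ... perform the long-running database update here ...
            }
        }
    }
}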
I would create a separate windows service that you can push jobs onto from your web application, then check the status of the job(s) when the user logs in again.
The Windows service won't be tied to the ASP.NET app domain, so it will continue to run regardless of what's happening in your web application.
I've run into this pattern, and you have to decouple the work from the HTTP request. The way we've solved it is to abstract the computation to be done as an event to be scheduled. Say a user at a browser takes an action that requires a (relatively) long-lived computation on the back end; this computation is given a name like 'doXYZForUser' and a parameter vector like (userId, params...) and sent off to the work queue. Some time in the future the user logs in again and can see what the status of their job is.
I'm running a Java stack and a Java Message Service (JMS), but the principle is the same. The request from the browser queues up an event, and the browser gets an ACK back saying the event is on the work queue. The queue is managed by an entirely separate process, which in .NET I believe is just called Message Queuing (MSMQ). The job comes up on the queue, gets processed, and the results can be placed in a separate table containing a reference to the user that kicked off the job, so the next time they log in the job status/results can be returned.

Notifying the user after a long Ajax task when they might be on a different page

I have an Ajax request to a web service that typically takes 30-60 seconds to complete. In some cases it could take as long as a few minutes. During this time the user can continue working on other tasks, which means they will probably be on a different page when the task finishes.
Is there a way to tell that the original request has been completed? The only thing that comes to mind is to:
wrap the web service with a web service of my own
use my web service to set a flag somewhere
check for that flag in subsequent page requests
Any better ways to do it? I am using jQuery and ASP.Net, if it matters.
You could add another method to your web service that allows you to check the status of a previous request. Then you can use ajax to poll the web service every 30 seconds or so. You can store the request id or whatever in Session so your ajax call knows what request ID to poll no matter what page you're on.
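A minimal sketch of that status-check idea as an ASMX web service; JobStore is a hypothetical in-memory tracker that runs the work on a background thread:

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Web.Services;

[WebService(Namespace = "http://example.com/")]
public class LongTaskService : WebService
{
    [WebMethod(EnableSession = true)]
    public string StartTask(string input)
    {
        string requestId = Guid.NewGuid().ToString();
        Session["pendingRequest"] = requestId; // visible from any page
        JobStore.Start(requestId, input);
        return requestId;
    }

    [WebMethod(EnableSession = true)]
    public string CheckStatus()
    {
        var requestId = (string)Session["pendingRequest"];
        return requestId == null ? "none" : JobStore.GetStatus(requestId);
    }
}

// Hypothetical tracker; real code would call the slow web service here.
public static class JobStore
{
    private static readonly ConcurrentDictionary<string, string> Status =
        new ConcurrentDictionary<string, string>();

    public static void Start(string id, string input)
    {
        Status[id] = "pending";
        ThreadPool.QueueUserWorkItem(_ =>
        {
            // ... the 30-60 second call goes here ...
            Status[id] = "done";
        });
    }

    public static string GetStatus(string id)
    {
        string s;
        return Status.TryGetValue(id, out s) ? s : "unknown";
    }
}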
I would say you'd have to poll once in a while to see if the request has ended and show some notification, like this site does with badges, for example.
First, make your request return immediately with something like "Started processing...". Then use a different request to poll for the result. It is not good for either the server or the client's browser to have long-open HTTP requests. Moreover, the user should be informed and educated that he is starting a request that could take some time to complete.
To display the result you could have a "notification area" in all of your web pages. Alternatively you could have a dedicated page for this and instruct the user to navigate there. As others have suggested, you could use polling to get the result.
You could use frames on your site, and perform all your long AJAX requests in an invisible frame. Frames add a certain level of pain to development, but might be the answer to your problems.
The only other way I could think of doing it is to actually load the other pages via an AJAX request, such that there are no real page reloads - this would mean that the AJAX requests aren't interrupted, but may cause issues with breaking browser functionality (back/forward, bookmarking, etc).
Since web development is stateless (you can't set a trigger/event on a server to update the client), the viable strategy is to set up a status function that you can call intermittently, using a JavaScript timer, to check whether your code has finished executing. When it finishes, you can update your view.
