Console Application or WebService - asp.net

I am working on a feature that will generate around 2000 Excel sheets. The process will accept input from a front-end web application, and I don't want my web application to wait until the generation of the Excel files is over. Should I write a console application and trigger it through the web application, or use a web service? Which is the better way of implementation?
Thanks,
Rohit

I would suggest storing the Excel-generation request as a task in a database, then having a background service constantly running that checks the task queue, processes it (creates the spreadsheets), and finally marks the job record as done. Your asp.net app, aside from creating the job records, can simply check on the status periodically or on page refresh.
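A rough sketch of what that worker's polling loop could look like is below; the Jobs table, its columns, and GenerateWorkbooks are assumptions for illustration, not an existing schema or API:

using System;
using System.Data.SqlClient;
using System.Threading;

class ExcelJobWorker
{
    static void Main()
    {
        while (true)
        {
            using (var conn = new SqlConnection("your-connection-string"))
            {
                conn.Open();

                // Grab the oldest pending job, if any (schema is illustrative).
                var select = new SqlCommand(
                    "SELECT TOP 1 Id, Parameters FROM Jobs WHERE Status = 'Pending' ORDER BY Id", conn);

                int? jobId = null;
                string parameters = null;
                using (var reader = select.ExecuteReader())
                {
                    if (reader.Read())
                    {
                        jobId = reader.GetInt32(0);
                        parameters = reader.GetString(1);
                    }
                }

                if (jobId.HasValue)
                {
                    GenerateWorkbooks(parameters); // the actual Excel generation

                    var update = new SqlCommand("UPDATE Jobs SET Status = 'Done' WHERE Id = @id", conn);
                    update.Parameters.AddWithValue("@id", jobId.Value);
                    update.ExecuteNonQuery();
                }
            }

            Thread.Sleep(TimeSpan.FromSeconds(30)); // poll interval
        }
    }

    static void GenerateWorkbooks(string parameters)
    {
        // Placeholder for the code that actually builds the ~2000 sheets.
    }
}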

Related

Windows scheduler API with console application Vs .net scheduler tools with asp.net mvc to execute long running processes inside my asp.net MVC

I am working on an asp.net mvc-5 web application, deployed under Windows 2012 & IIS-8. My asp.net mvc application has many CRUD operations, implemented as action methods.
But my asp.net mvc web application will also be doing a scheduled, long-running network scan process. The network scan will mainly do the following steps:-
Get the list of our servers and vms from our database.
Get the scanning username and password for each server and vm from a third party tool, using Rest API.
Call some powershell scripts to retrieve the servers & vms info such as network info, memory, name, etc.
Update our ERP system with the scan info using Rest API.
Now I did a pilot project using the following approach:-
I defined a model method inside my asp.net mvc application to do the above 4 steps.
Then I installed the Hangfire tool, which calls the scan method on a predefined schedule.
Also I created a view inside my asp.net mvc application which allows users to set the Hangfire schedule settings (this requires an IIS reset on the host server for Hangfire to pick up the new settings).
Now I ran a test scan for around 150 servers, which took around 40 minutes to complete, and it worked well. The only thing I noted is that if I set the schedule to run during non-business hours (when no activity hits IIS), Hangfire will not be able to call the job, and once the first request is made the missed jobs will run. I overcame this limitation by defining a Windows task which calls IIS every 15 minutes to keep the application pool alive, and it worked well...
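For reference, the Hangfire piece of this pilot is just a recurring-job registration along these lines (NetworkScanService.RunScan is a stand-in name for the model method described above, and the cron expression comes from the settings view):

using Hangfire;

public static class ScanJobConfig
{
    public static void Register(string cronExpression)
    {
        RecurringJob.AddOrUpdate<NetworkScanService>(
            "network-scan",        // stable recurring job id
            svc => svc.RunScan(),  // the 4-step scan method
            cronExpression);       // e.g. "0 2 * * *" for 02:00 daily
    }
}

Calling AddOrUpdate again with a new cron expression replaces the existing registration.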
Now the other approach I am reading about is as follows:-
Instead of defining a model method inside asp.net mvc to do the scan, I can create a separate console application to do the scan.
Then inside my asp.net mvc application I create a view which allows users to create and schedule a task inside the Windows Task Scheduler. I can do so by integrating with the Windows Task Scheduler API.
This Windows task will then call the console application.
Now I am not sure which approach is better and why? Generally speaking, long running/background jobs should not run under IIS. But at the same time, defining these long running processes as console apps and scheduling them with the Windows Task Scheduler will create extra dependencies for my web application, and will add extra effort when moving the application from one server to another (for example from test to live).
Besides this, I read that tools such as Hangfire, Quartz and others are designed to allow running long running tasks inside IIS, and they eliminate the need to create console applications and schedule them using the Task Scheduler.
So can anyone advise on this?
In my opinion, if it is possible to solve the scheduling problem on the web application side, there is no need to create a scheduled task or a new console application for triggering purposes. The problem you will probably face when using a scheduler in a web application is a common one: the scheduler works like a charm while debugging the web application, but fails to trigger after it is published to IIS. At this point the problem is generally related to IIS rather than to the schedulers themselves (Quartz.NET, Hangfire, etc.). Although there are lots of articles and solutions posted on the web, unfortunately only some of them work properly, and most of them require lots of settings in the web and machine configuration.
However, there is a kind of solution for this scheduling problem, and I believe it is worth giving Keep Alive Service For IIS 6.0/7.5 a try. Just install it on the server to which you publish your application and enjoy. Your published application will then stay alive after application pool recycling, IIS/application restarting, etc. It is also used in our MVC application in order to send notification mails weekly, and it has worked for months without any problem. Here is the sample code that I use in our MVC application. For more information please visit Scheduled Tasks In ASP.NET With Quartz.Net and Quartz.NET CronTrigger.
*Global.asax:*
protected void Application_Start()
{
    JobScheduler.Start();
}
*EmailJob.cs:*
using Quartz;

public class EmailJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        SendEmail();
    }
}
*JobScheduler.cs:*
using Quartz;
using Quartz.Impl;

public class JobScheduler
{
    public static void Start()
    {
        // Create and start the default scheduler (Quartz.NET 2.x API).
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<EmailJob>().Build();

        ITrigger trigger = TriggerBuilder.Create()
            .WithIdentity("trigger1", "group1")
            .StartNow()
            .WithSchedule(CronScheduleBuilder
                .WeeklyOnDayAndHourAndMinute(DayOfWeek.Monday, 10, 00)
                //.WithMisfireHandlingInstructionDoNothing() // Do not fire if the firing is missed
                .WithMisfireHandlingInstructionFireAndProceed() // MISFIRE_INSTRUCTION_FIRE_NOW
                .InTimeZone(TimeZoneInfo.FindSystemTimeZoneById("GTB Standard Time")) // (GMT+02:00)
            )
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}
Also I created a view inside my asp.net mvc application which allows users to set the Hangfire schedule settings (this requires an IIS reset on the host server for Hangfire to pick up the new settings).
You're resetting your web server to update a task's schedule? That doesn't sound healthy. What you might do instead is keep track of what the scheduled time should be, and on execution check whether the current time is within a certain range of the scheduled time (or whether the job has already been executed), otherwise abort the job.
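A rough sketch of that guard, assuming you persist the configured time and the last run time somewhere the job can read them (the names below are made up for illustration):

// Run the job only if we are inside the window around the configured time
// and this schedule slot has not already been executed.
public static bool ShouldRunNow(DateTime scheduledTime, DateTime? lastRunTime, TimeSpan tolerance)
{
    var now = DateTime.Now;
    bool withinWindow = now >= scheduledTime && (now - scheduledTime) <= tolerance;
    bool alreadyRan = lastRunTime.HasValue && lastRunTime.Value >= scheduledTime;
    return withinWindow && !alreadyRan;
}

The job itself then starts with something like if (!ShouldRunNow(...)) return; so a stale or duplicate trigger becomes a no-op.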
The only thing I noted is that if I set the schedule to run during non-business hours (when no activity hits IIS), Hangfire will not be able to call the job, and once the first request is made the missed jobs will run. I overcame this limitation by defining a Windows task which calls IIS every 15 minutes to keep the application pool alive, and it worked well...
Hangfire's documentation has a page about running delayed tasks that mentions what you need to change to accommodate this.
Using Windows' Task Scheduler doesn't seem like a good idea; it's not meant for the execution of ad-hoc, short-lived tasks. You probably need elevation to create tasks, and you'd probably need to define another scheduled task to clean up the mountain of tasks that would exist after a few dozen background jobs have been executed.
You're also correct that using Windows' Task Scheduler would make it more difficult to move your application around.

Calling an asp.net mvc controller's action method using Windows Task Scheduler or another scheduling tool

I am planning to create an asp.net mvc web application which will perform a single sync job, to achieve this:-
We have a 3rd party ERP system which will be generating a .csv file on a timely basis; it generates a .csv file once per hour. The .csv file contains info about our company assets, such as type, price, name, location, etc.
Now I will develop an asp.net mvc web application which will read the .csv data and update a database with this data.
So I am planning to do the following :-
I will create a new database, which will contain the data.
I will create a new asp.net mvc-5 web application which has a SYNC action method, which will read the .csv data and update the database.
Now the problem I am facing is that I need the sync job to run every hour or on a specific schedule. From my previous experience I can list these 2 approaches to call an action method on a timely basis:-
Inside the asp.net mvc's global.asax I can create a schedule which runs each hour.
I can use third party tools such as Hangfire to schedule the tasks.
Now using any of these approaches will cause this limitation:-
The global.asax or the third party tools such as Hangfire will run under the application pool, and if no action is performed on the application, then the schedule will never run, since the application pool will not be active. On my previous applications this was not a real problem, as the systems contained many views besides the scheduled jobs, so the system stayed active almost 100% of working hours.
But in my current project the web application will not be accessed by users, since it only does a single sync job, and there is no other functionality for end users. So in this case the sync job will never run, since the application pool will not be active.
So can anyone advice if these approaches sound valid to fix my problem:-
To create a Windows scheduled task which will call the action method URL every hour:-
schtasks /create /tn "my scheduled task" /tr "powershell -ExecutionPolicy unrestricted -Command \"(New-Object Net.WebClient).DownloadString('http://url.to.be.executed/cron.php')\"" /sc DAILY /st 07:00:00 /ru System
To create 2 scheduled tasks inside the global.asax. One scheduled task will call the application URL every 5 minutes to keep the application pool alive, and another scheduled task will call the sync action method each hour. In this case the application will keep calling itself every 5 minutes, which will force the application pool to stay alive, and the sync job will run each hour even if no users access the system.
So can anyone advice on this please?
Thanks
MVC is all about user interaction, and many users interacting with the same application at that. If all you need is one task run on a schedule, then you don't even need a UI for that, and there are none of the concerns about handling multiple simultaneous remote requests that a web application addresses. You could literally just create a console app and use Windows Task Scheduler. MVC is complete overkill and, more than that, unsuited to the purpose.
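As a rough illustration of that route, the whole sync can live in a small console application along these lines (the file path, connection string, CSV layout and table name are placeholders, not your real ones):

using System;
using System.Data.SqlClient;
using System.IO;

class AssetSyncJob
{
    static void Main()
    {
        // Read the hourly export produced by the ERP system.
        foreach (var line in File.ReadLines(@"C:\erp-export\assets.csv"))
        {
            var fields = line.Split(','); // type, price, name, location, ...
            UpdateAsset(fields);
        }
    }

    static void UpdateAsset(string[] fields)
    {
        using (var conn = new SqlConnection("your-connection-string"))
        {
            conn.Open();
            // Update the existing asset row (insert handling omitted for brevity).
            var cmd = new SqlCommand(
                "UPDATE Assets SET Type = @type, Price = @price, Location = @location WHERE Name = @name",
                conn);
            cmd.Parameters.AddWithValue("@type", fields[0]);
            cmd.Parameters.AddWithValue("@price", fields[1]);
            cmd.Parameters.AddWithValue("@name", fields[2]);
            cmd.Parameters.AddWithValue("@location", fields[3]);
            cmd.ExecuteNonQuery();
        }
    }
}

A scheduled task then just runs the executable every hour, for example: schtasks /create /tn "asset-sync" /tr "C:\jobs\AssetSyncJob.exe" /sc HOURLY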

Hosting WF as Windows Service

I am trying to construct a simple windows workflow to monitor a directory for inbound files and do some DB updates using Windows WF 4.0. Currently I am planning to build a 'WCF Workflow Service' and host it as a 'Windows service' running 24/7 (with a daily service shutdown and startup).
Further in the future I am planning to consume this service using an ASP.NET/WPF application to create a basic dashboard kind of stuff.
Considering the idea of directory polling for files with WF hosted as a Windows service, does it seem to be a good idea? What can be the cons of this?
Please advise if there are any drawbacks to this, or can it be achieved by better means?
I'm actually doing this, but it is a bit more complex than you think, and should be avoided if possible.
You should not be blocking from within an Activity; if it is expected to be a long running Activity that is waiting for input from the outside (a FileSystemWatcher event, for instance), the workflow should idle itself and wait to be woken from the outside.
How I did this was I created a workflow extension that hosted the FileSystemWatcher. Once the Activity was ready to watch for a file, it created a bookmark and passed it to the extension.
The extension then started the FSW, holding onto the bookmark.
When a FSW event was fired, the extension resumed the bookmark, passing in an object that contained details about the event. The Activity did what was needed with the event, then re-scheduled itself.
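In rough outline, the extension half of that pattern looked something like the sketch below; the type and member names here are illustrative, not a drop-in implementation:

using System;
using System.Activities;
using System.Activities.Hosting;
using System.Collections.Generic;
using System.IO;

// Hosts the FileSystemWatcher outside the workflow and resumes the bookmark
// that the activity handed over when a matching file shows up.
public class FileWatcherExtension : IWorkflowInstanceExtension
{
    private WorkflowInstanceProxy _instance;
    private FileSystemWatcher _watcher;

    public void SetInstance(WorkflowInstanceProxy instance) { _instance = instance; }
    public IEnumerable<object> GetAdditionalExtensions() { yield break; }

    // Called by the activity after it has created its bookmark and gone idle.
    public void Watch(Bookmark bookmark, string path, string filter)
    {
        _watcher = new FileSystemWatcher(path, filter);
        _watcher.Created += (sender, e) =>
        {
            _watcher.EnableRaisingEvents = false;
            // Wake the idle workflow, handing it the event details.
            _instance.BeginResumeBookmark(bookmark, e,
                ar => _instance.EndResumeBookmark(ar), null);
        };
        _watcher.EnableRaisingEvents = true;
    }
}

The custom activity's Execute creates the bookmark (context.CreateBookmark), fetches this extension via context.GetExtension<FileWatcherExtension>(), and calls Watch; when the bookmark is resumed, its callback receives the FileSystemEventArgs.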
Normally I wouldn't have done this, but I had some requirements that forced me to use WF4 to accomplish this goal. If I didn't have to use WF4, I would have just spun up the FSW within the service and consumed the events.
Unless you expect to have to be very flexible with your configuration detailing what you do with the FSW event, and expect this to change relatively often during deployment of the service, I'd skip WF4.

Non-locking sleep/waitfor/delay function for ASP.NET

I am writing an ASP.NET class that interfaces with an external application. The flow of the transaction between the web server and this application is as follows:
My object writes a file to a directory.
The outside application detects this file and processes it. This can take between 1-5 seconds.
The outside application writes a response file to the same directory.
My object detects the response file and parses the results.
The 1-5 seconds it can take for the external application to process my file is my problem. The most straightforward way to wait for the file seems to be something like this:
Do While Not File.Exists(f)
    Thread.Sleep(500)
Loop
Of course, Thread.Sleep() completely locks up the rest of my website until the outside application processes the file. Clearly, this is not a workable solution.
How can I effectively "wait" for my file to be processed without locking up the rest of my website?
Use the FileSystemWatcher - when the file is created it will fire an event that you can subscribe to.
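For example, a sketch (not production-hardened, and written in C# rather than VB): subscribe to the Created event and complete a task when the expected response file appears, instead of spinning in a sleep loop.

using System.IO;
using System.Threading.Tasks;

public static class ResponseFileWaiter
{
    // Completes when the named file appears in the directory.
    public static Task<string> WaitForFileAsync(string directory, string fileName)
    {
        var tcs = new TaskCompletionSource<string>();
        var watcher = new FileSystemWatcher(directory, fileName);

        watcher.Created += (sender, e) =>
        {
            tcs.TrySetResult(e.FullPath);
            watcher.Dispose();
        };
        watcher.EnableRaisingEvents = true;

        // Handle the race where the file already existed before the watcher started.
        string existing = Path.Combine(directory, fileName);
        if (File.Exists(existing))
        {
            tcs.TrySetResult(existing);
            watcher.Dispose();
        }

        return tcs.Task;
    }
}

The calling code can await (or block on) the returned task with a timeout, so the request thread isn't tied up in Thread.Sleep.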

Calling a method in an ASP.NET application from a Windows application

Other than using a web service, is there any way to call a method in a web app from a Windows application? Both run on the same machine.
I basically want to schedule a job to run a windows app which updates some file (for a bayesian spam filter), then I want to notify the web app to reload that file.
I know this can be done in other ways but I'm curious to know whether it's possible anyway.
You can make your Windows app connect to the web app and do a GET on a page that responds by reloading your file; I don't think it is strictly necessary to use a web service. This way you can also make it happen from a web browser.
A Web Service is the "right" way if you want them to communicate directly. However, I've found it easier in some situations to coordinate via database records. For example, my web app has bulk email capability. To make it work, the web app just leaves a database record behind specifying the email to be sent. The WinApp scans periodically for these records and, when it finds one with an "unprocessed" status, it takes the appropriate action. This works like a charm for me in a very high volume environment.
You cannot quite do this in the other direction only because web apps don't generally sit around in a timing loop (there are ways around this but they aren't worth the effort). Thus, you'll require some type of initiating action to let the web app know when to reload the file. To do this, you could use the following code to do a GET on a page:
WebRequest wrContent = WebRequest.Create("http://www.yourUrl.com/yourpage.aspx");
Stream objStream = wrContent.GetResponse().GetResponseStream();
// I don't think you'll need the stream Reader but I include it for completeness
StreamReader objStreamReader = new StreamReader(objStream);
You'll then reload the file in the Page_Load method whenever this page is opened.
How is the web application loading the file? If you were using a dependency on the Cache object, then simply updating the file will invalidate the Cache entry, causing your code to reload that entry when it is found to be null (or based on the "invalidated" event).
Otherwise, I don't know how you would notify the application to update the file.
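For reference, a cache entry with a file dependency is wired up roughly like this (the cache key, path, and load routine below are placeholders):

using System.Web;
using System.Web.Caching;

public static class SpamFilterCache
{
    // Cache the parsed filter data with a dependency on the file on disk.
    // When the Windows app rewrites the file, the entry is evicted automatically
    // and the next request reloads it.
    public static object GetFilterData(string filePath)
    {
        var cache = HttpContext.Current.Cache;
        var data = cache["bayesFilter"];
        if (data == null)
        {
            data = LoadFilterFromDisk(filePath);
            cache.Insert("bayesFilter", data, new CacheDependency(filePath));
        }
        return data;
    }

    private static object LoadFilterFromDisk(string filePath)
    {
        // Stand-in for the real parsing of the filter file.
        return System.IO.File.ReadAllText(filePath);
    }
}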
An ASP.NET application only exists as an instance to serve a request. This is why web services are an easy way to handle this - the application has been instantiated to serve the service request. If you could be sure the instance existed and got a handle to it, you could use remoting. But without having a concrete handle to an instance of the application, you can't invoke the method directly.
There's plenty of other ways to communicate. You could use a database or some other kind of list which both applications poll and update periodically. There are plenty of asynchronous MQ solutions out there.
So you'll create a page in your web app specifically for this purpose. Use a GET request and pass in a URL parameter. Then in the Page_Load event check for this parameter; if it exists, do your processing. By passing in the parameter you'll prevent accidental page loads from causing the file to be reloaded and processed when you don't want it to be.
From the Windows app, make the GET request using the .NET HttpWebRequest. Example here: http://www.codeproject.com/KB/webservices/HttpWebRequest_Response.aspx
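Putting both halves together, a minimal sketch could look like this (the page name, parameter, URL, and ReloadSpamFilterFile are made-up names for illustration):

using System;
using System.Net;
using System.Web.UI;

// Web app side (hypothetical Reload.aspx code-behind): only act when the
// expected query parameter is present, to avoid accidental reloads.
public partial class Reload : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (Request.QueryString["action"] == "reload")
        {
            // ReloadSpamFilterFile();  // the web app's existing reload logic
        }
    }
}

// Windows app side: after the scheduled job updates the file, hit that page.
public static class ReloadNotifier
{
    public static void NotifyWebApp()
    {
        var request = (HttpWebRequest)WebRequest.Create(
            "http://localhost/Reload.aspx?action=reload"); // URL is a placeholder
        using (request.GetResponse()) { }
    }
}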
