Can Hangfire Work as A Simple Method Timer? - asp.net-core-webapi

I have a .NET Core Web API application. The app contains models for db access and a way to send emails to users. My end goal is to call a method nightly (to email users that their registration expired and to mark it expired in the database).
So, in short, I can build an endpoint and call it manually every night. Or build a windows service to call the endpoint. Or build a windows service to do the work. But I want to keep the logic in one application.
My ideal solution would be to have a timer running inside my app and calling a method in a service every 24 hours. Of course, that's not possible, so I am looking at Hangfire. The official documentation seems to indicate that there is a lot of overhead.
Hangfire keeps background jobs and other information that relates to the processing inside a persistent storage. Persistence helps background jobs to survive on application restarts, server reboots, etc.
Do I need this if I just want to call a method?
Background jobs are processed by Hangfire Server. It is implemented as a set of dedicated (not thread pool’s) background threads that fetch jobs from a storage and process them. Server is also responsible to keep the storage clean and remove old data automatically.
Do I even need jobs?
Is there a way to JUST call a method without all this overhead with Hangfire?
tl;dr: Are there options to opt out of the dashboard, database connectivity, etc and just have Hangfire work as a timer?

My ideal solution would be to have a timer running inside my app and calling a method in a service every 24 hours. Of course, that's not possible...
It's very possible, actually, using IHostedService. You should take some time to read the full documentation, but simply, for your scenario, you'd just need something like:
internal class NightlyEmailHostedService : IHostedService, IDisposable
{
    private Timer _timer;

    public Task StartAsync(CancellationToken cancellationToken)
    {
        _timer = new Timer(DoWork, null, TimeSpan.Zero,
            TimeSpan.FromHours(24));
        return Task.CompletedTask;
    }

    private void DoWork(object state)
    {
        // send email
    }

    public Task StopAsync(CancellationToken cancellationToken)
    {
        _timer?.Change(Timeout.Infinite, 0);
        return Task.CompletedTask;
    }

    public void Dispose()
    {
        _timer?.Dispose();
    }
}
Then, in Startup.cs just add:
services.AddHostedService<NightlyEmailHostedService>();
Now, that's an extremely naive approach. It basically just kicks off a timer that will run once every 24 hours, but depending on when your app started, it may not always be at night. In reality, you'd likely want to have the timer run every minute or so, and check against a particular time you actually want the email to go out. There's an interesting implementation of handling cron-style times via an IHostedService you might want to reference.
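For example, here's a minimal sketch of that idea (the 2:00 AM target and the check-every-minute interval are assumptions, not part of the original answer):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

internal class NightlyEmailHostedService : IHostedService, IDisposable
{
    private readonly TimeSpan _targetTime = new TimeSpan(2, 0, 0); // assumed: 2:00 AM
    private DateTime _lastRun = DateTime.MinValue;
    private Timer _timer;

    public Task StartAsync(CancellationToken cancellationToken)
    {
        // Check every minute instead of sleeping for 24 hours.
        _timer = new Timer(CheckTime, null, TimeSpan.Zero, TimeSpan.FromMinutes(1));
        return Task.CompletedTask;
    }

    private void CheckTime(object state)
    {
        // Runs the work at most once per calendar day, as soon as the clock passes the target time.
        var now = DateTime.Now;
        if (now.TimeOfDay >= _targetTime && _lastRun.Date < now.Date)
        {
            _lastRun = now;
            // send email
        }
    }

    public Task StopAsync(CancellationToken cancellationToken)
    {
        _timer?.Change(Timeout.Infinite, 0);
        return Task.CompletedTask;
    }

    public void Dispose() => _timer?.Dispose();
}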
The long and short is that it's very possible to do this all in your app, without requiring anything additional like Hangfire. However, you have to do a bit more work than you would with something like Hangfire, of course.

Related

doing database write after the response

I have a web service that receives requests from users and returns some JSON. I need to save the JSON string in the database, so at the moment the write query occurs before the response is sent back.
Is there a way to send the response first and then do the write query, after the response left the web service?
Thanks.
There are a couple of different options here - they all have tradeoffs, though, and some would be pretty esoteric. You don't mention why you want to do this, so I'm guessing performance. If that's the case, I think you're barking up the wrong tree - a simple write is almost certainly not your performance problem.
So, off the top of my head:
1. Queuing, as Ragesh mentions, would be a nice approach. This gets you similar semantics to a transaction, while offloading the write. You still have to write to the queue, though, which may be about the same overhead as writing to the DB.
2. You could spawn a new thread (using either the ThreadPool or System.Threading.Thread - there's some debate about which is preferable in ASP.NET) to handle the write. This can generally work, but you may have issues with unhandled exceptions, app domain restarts, etc.
3. You could store the JSON data in a static or Application variable, then use a Timer to periodically write it to the DB. This will be multithreaded code, so you will need to synchronize reads/writes to the collection (see the sketch after this list).
4. Similar to #3, store the JSON data in Cache and use the invalidation callback to write to the DB.
5. Lots of variations on store somewhere (memory, disk, flat DB table, etc.), process later (ASP.NET, scheduled task, Windows Service, Sql Agent, etc.).
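A minimal sketch of option 3 (the class name, flush interval, and DB write are placeholders): buffer the JSON payloads in a static collection and flush them to the database on a timer, synchronizing access to the collection.

using System;
using System.Collections.Generic;
using System.Threading;

// Note: anything still buffered when the AppDomain recycles is lost (same caveat as option 2).
public static class JsonWriteBuffer
{
    private static readonly object _sync = new object();
    private static readonly List<string> _pending = new List<string>();
    private static readonly Timer _flushTimer =
        new Timer(_ => Flush(), null, TimeSpan.FromSeconds(30), TimeSpan.FromSeconds(30));

    public static void Add(string json)
    {
        lock (_sync) { _pending.Add(json); }
    }

    private static void Flush()
    {
        List<string> batch;
        lock (_sync)
        {
            batch = new List<string>(_pending);
            _pending.Clear();
        }
        foreach (var json in batch)
        {
            // write json to the database here
        }
    }
}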
#frenchie says: a response starts by reading the json string from the db and ends with writing it back. In other words, if the user sends a request, the json string that's going to be read must be the one that was written in the previous response.
That complicates things, since inherent in async work is not knowing when something is done. If you require the async portion (writing back to the DB) to be done before handling the next request, you'll have to execute a wait to make sure it actually completed. In order to do that, you'll need to keep per-client state on the server - not exactly a best practice as far as services go (though it sounds like you're already doing that with these JSON request/response pairs).
Given the complications, I would make sure that you've done your profiling and determined it is indeed a performance problem.
You can schedule the query work like this:
ThreadPool.QueueUserWorkItem(state =>
    AsynchronousExecuteReference());

// and run
static void AsynchronousExecuteReference()
{
    // run your sql update here
}
Another example uses a Thread inside a class, so you can pass parameters to it.
public class RunThreadProcess
{
    // some parameters
    public int cProductID;

    // my thread
    private Thread t = null;

    // start it
    public Thread Start()
    {
        t = new Thread(new ThreadStart(this.work));
        t.IsBackground = true;
        t.SetApartmentState(ApartmentState.MTA);
        t.Start();
        return t;
    }

    // the actual work
    private void work()
    {
        // do thread work - all parameters are available here
    }
}
And here is how I run it
var OneAction = new RunThreadProcess();
OneAction.cProductID = 100;
OneAction.Start();
Do not worry about memory; the GC knows this object is in use until the thread ends. I have checked it: the GC does not collect it and waits for the thread to finish.
You should look at using message queues like MSMQ, ActiveMQ or RabbitMQ to do this. When you receive your request, you'll put the relevant data into the queue and send your response to the client. At the other end of the queue, you'll have some process that reads from the queue and inserts data into your database.
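A rough sketch of that flow using System.Messaging (the queue path and the SaveJsonToDatabase helper are assumptions for illustration): the request handler enqueues the JSON and returns immediately, while a separate worker drains the queue and writes to the database.

using System.Messaging;

// In the request handler: enqueue the JSON and return the response immediately.
public void QueueJsonWrite(string jsonString)
{
    using (var queue = new MessageQueue(@".\Private$\jsonWrites"))
    {
        queue.Send(jsonString);
    }
}

// In a separate worker (Windows service, console app, etc.): drain the queue and write to the DB.
public void ProcessQueuedWrites()
{
    using (var queue = new MessageQueue(@".\Private$\jsonWrites"))
    {
        queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
        while (true)
        {
            var message = queue.Receive();            // blocks until a message arrives
            SaveJsonToDatabase((string)message.Body); // hypothetical DB write helper
        }
    }
}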
This misses the point of request/response - unless you want to get into async commands like a service bus, but that's pub/sub, not request/response. The point of request/response is to do the work on the server after receiving the request and before sending the response, even if that work is sending an async message to a service bus.
You could try moving your web service URL to an ASPX page, where the page lifecycle comes into play.
In the code-behind, call your routine that does the main portion of the work in Page_Load or Page_PreRender (or whenever is appropriate prior to the response being sent), and then do your DB work in the Page_Unload event, which occurs after the response has been sent (http://msdn.microsoft.com/en-us/library/ie/ms178472.aspx).
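A rough code-behind sketch of that approach (BuildResponseJson and SaveJsonToDatabase are hypothetical helpers): do the main work before the response is rendered, then do the DB write in Page_Unload, which fires after the response has been sent.

private string _responseJson;

protected void Page_Load(object sender, EventArgs e)
{
    _responseJson = BuildResponseJson(); // main work, before the response is sent
    Response.Write(_responseJson);
}

protected void Page_Unload(object sender, EventArgs e)
{
    // The response has already left the server; Response can no longer be used here,
    // but a plain database write is fine.
    SaveJsonToDatabase(_responseJson);
}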

Recurring tasks in ASP .NET

I have an ASP .NET website running on GoDaddy in a shared environment. The application is a subscription-based service with options for recurring billing to users.
Every hour, we need to synchronize user data with our payment processor to update users who have upgraded or cancelled their accounts. The payment processor does not have a mechanism for calling a URL or otherwise notifying us of changes.
The problem: We need to create a background thread that runs some code at a predefined interval. There are some good articles about background tasks in .NET, but I am sure there is a simpler way around this. Maybe an application-wide timer that can call a function, etc.
The limitation: Shared environment does not allow windows services, external applications, full-trust, etc.
Since this is a production application, I would like to use the safest approach possible rather than arm-twisting IIS.
I had a similar problem: I'm developing an ASP.NET proof of concept and use a background thread that performs a task that could take several hours. The problem is that ASP.NET can recycle the AppDomain at any time (killing my background thread).
To prevent this, you can register your background worker with ASP.NET so it will notify your thread to shut down. To do this, implement the following interface:
public interface IRegisteredObject
{
    void Stop(bool immediate);
}
And register your object with ASP.NET using the following static method:
HostingEnvironment.RegisterObject(this);
When ASP.NET tears down the AppDomain, it will first attempt to call Stop method on all registered objects. In most cases, it’ll call this method twice, once with immediate set to false. This gives your code a bit of time to finish what it is doing. ASP.NET gives all instances of IRegisteredObject a total of 30 seconds to complete their work, not 30 seconds each. After that time span, if there are any registered objects left, it will call them again with immediate set to true.
By preventing the Stop method from returning (by taking a lock that the worker holds while it is busy), we stop ASP.NET from shutting down the AppDomain until our work is finished.
// synchronization state used by the snippet below
private readonly object _lock = new object();
private bool _shuttingDown;

public void Stop(bool immediate)
{
    lock (_lock)
    {
        _shuttingDown = true;
    }
    HostingEnvironment.UnregisterObject(this);
}

public void DoWork(Action work)
{
    lock (_lock)
    {
        if (_shuttingDown)
        {
            return;
        }
        work();
    }
}
Use a Task instead of an Action to benefit from cancellation options. For your specific case you could start a timer that executes the work like this.
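A rough sketch of that idea (the SyncWithPaymentProcessor method and the hourly interval are my assumptions): a System.Threading.Timer in the same worker class fires every hour and routes the work through DoWork, so the IRegisteredObject guard above still applies.

private Timer _timer;

public void Start()
{
    HostingEnvironment.RegisterObject(this);
    _timer = new Timer(_ => DoWork(SyncWithPaymentProcessor),
                       null, TimeSpan.Zero, TimeSpan.FromHours(1));
}

private void SyncWithPaymentProcessor()
{
    // call the payment processor and update upgraded/cancelled accounts here
}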
P.S. This is a hack, and ASP.NET isn't meant to run background tasks, so use a Windows service or WCF service when possible! I use this since it simplifies development, maintenance, and installation.
For more information see my source: http://haacked.com/archive/2011/10/16/the-dangers-of-implementing-recurring-background-tasks-in-asp-net.aspx
To update for 2018 - The Hangfire NuGet package is perfect for this
Since there were no answers, I thought I'd post my solution in case it helps others.
Not the ideal approach by any means, but for those who might gain from it: I created a cron job on another Linux hosting account we had to call the required ASP.NET URL. A management horror, but it does the job.

ASP.NET MVC Multithreading

I want to implement the following logic in my ASP.NET MVC application:
user clicks a button ->
server executes some time-consuming logic in ~15 threads (I get data from really slow, independent sources) ->
when all the work is done, the server merges the results and passes them back to the user
The other day I saw an article which explained why creating new Threads in an ASP.NET application is strongly discouraged, and why the ThreadPool should be used instead.
What are the best practices for MVC in this case? Why shouldn't I create my own threads, BackgroundWorkers, tasks, and so on myself, rather than use the ThreadPool? The application will be hosted on a public server, if it matters.
This seems like a really good place for using the new AsyncController in ASP.NET MVC 2. It is really easy to use and lets you run queries against multiple independent sources without blocking request threads.
MSDN has a great example where they are querying a news service, a weather service, and a sports service.
You can see in the original code that they are querying each source sequentially, but in the final version, all the tasks run in parallel and control returns to the controller when they are all completed:
public void IndexAsync(string city)
{
    AsyncManager.OutstandingOperations.Increment(3);

    NewsService newsService = new NewsService();
    newsService.GetHeadlinesCompleted += (sender, e) =>
    {
        AsyncManager.Parameters["headlines"] = e.Value;
        AsyncManager.OutstandingOperations.Decrement();
    };
    newsService.GetHeadlinesAsync();

    SportsService sportsService = new SportsService();
    sportsService.GetScoresCompleted += (sender, e) =>
    {
        AsyncManager.Parameters["scores"] = e.Value;
        AsyncManager.OutstandingOperations.Decrement();
    };
    sportsService.GetScoresAsync();

    WeatherService weatherService = new WeatherService();
    weatherService.GetForecastCompleted += (sender, e) =>
    {
        AsyncManager.Parameters["forecast"] = e.Value;
        AsyncManager.OutstandingOperations.Decrement();
    };
    weatherService.GetForecastAsync();
}

public ActionResult IndexCompleted(string[] headlines, string[] scores, string[] forecast)
{
    return View("Common", new PortalViewModel {
        NewsHeadlines = headlines,
        SportsScores = scores,
        Weather = forecast
    });
}
Yes - jQuery and some AJAX will do this nicely.
Load the page, then send ~15 separate AJAX requests back to the server and let them finish asynchronously. This way you let the web server handle the threading (which it does well) and you can concentrate on displaying a ticker or progress bar to the user while they wait.
If you're using .NET 4, I would even recommend looking at the parallel programming namespaces (the Task Parallel Library). They make this even simpler and do a better job of utilizing all your CPU cores.
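For instance, here's a minimal sketch using the Task Parallel Library (GetSourceUrls and FetchFromSource are hypothetical helpers): start one task per slow source, wait for all of them, and merge the results.

using System.Linq;
using System.Threading.Tasks;

public string[] FetchAllSources()
{
    string[] sources = GetSourceUrls(); // ~15 independent, slow endpoints

    var tasks = sources
        .Select(url => Task.Factory.StartNew(() => FetchFromSource(url)))
        .ToArray();

    Task.WaitAll(tasks);                          // block until every source has answered
    return tasks.Select(t => t.Result).ToArray(); // merge and return to the caller
}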
I would also look at offloading this from your main web app altogether. Having a separate set of services or a message queue to handle this long running request will let you scale a lot more easily and will allow your web app to worry about servicing page requests, not performing long running logic. Google up something like "iis long running request" to get started.
Because a ThreadPool is designed to do that:
It provides a pool of threads that can be used to execute tasks, post work items, process asynchronous I/O, wait on behalf of other threads, and process timers.
I guess it's better to store the results on your server in a database when they are done.
And then you could use a timer and AJAX to periodically request if the process is done, if so, get data.
You shouldn't create your own threads because this has significant overhead - creating a new thread is expensive. So the question should be: why shouldn't you use the ThreadPool? It has a number of threads ready for use. An even better approach would be to use the Task Parallel Library; it provides a layer of abstraction above the ThreadPool that makes working with threads a lot easier.
You should realize that the only (easy) way to accomplish this is by having the user wait until all the work is done and then serving the page. A more complicated but more responsive approach would be to use AJAX (via jQuery, for example) to ask the server periodically how far the work has progressed. This way you can also provide a progress indicator.

How to detect if the current application pool is winding up in IIS7.5 and Asp.Net 3.5+

Well - exactly as the question subject states - any ideas on how you might do this?
I've been looking over the objects in System.Web.Hosting but nothing is standing out.
The reason? I'm getting one or two application errors which typically occur during a recycle (they happen about 25 hours apart, and I've left my app pool recycle time at the default), so I want to know if they're happening on a thread that's in the pool that's shutting down, or the one that's start(ed/ing) up.
I recently stumbled across this article on Brain.Save() which talks about exactly this issue from the point of view of hosting WCF (he's Steve Maine - a program manager at Redmond in the Connected Services Division).
They need to be able to do this when a WCF service is hosted inside ASP.NET, since they need to shut down any open listeners so that the WCF engine in the new app domain will be able to open them all up again.
As the article demonstrates, the answer is to implement the IRegisteredObject interface, call ApplicationManager.CreateObject to create an instance of your object and then register it with HostingEnvironment.RegisterObject (all detailed in the MSDN documentation for the interface).
When this object's IRegisteredObject.Stop(bool) implementation is called with false as the parameter, this is notification that the app domain is being shut down and that the object should be unregistered (kind of like a global dispose) with a call to HostingEnvironment.UnregisterObject.
When it's called with true it means you've not unregistered in good time, and that if you don't Unregister immediately, it'll be done for you.
I can certainly use this mechanism to find out, when an exception occurs, if the AppDomain is being killed or not. The nature of the object in question that throws the exception means that if it's not at shutdown, it must be during initial startup.
Equally, however, I may well start looking at this persistence mechanism for some of my other more complicated static information!
The History
The article also explains some of the history, and rationale, of why you would want to use IRegisteredObject rather than Application_Start and Application_End methods in global.asax:
Traditional ASP.NET applications can hook application lifecycle events (application startup/shutdown) by implementing methods like Application_Start and Application_Stop in global.asax. However, global.asax is for application code. Infrastructure pieces (of which the WCF hosting system is one) need a mechanism of hooking AppDomain lifecycle events that does not involve dumping infrastructure code in your global.asax file. That space is reserved for you, the user, and it would be rude of us to pollute that with a bunch of hosting goo we need to make the whole thing work. Instead, the ASP.NET folks did some great work during the Whidbey release to open up the hosting APIs and make it easy for people like WCF to come along and hook these lifecycle events in a way that's invisible to application code.
You can check the value of System.Web.Hosting.HostingEnvironment.ShutdownReason; when the app pool is not in the process of closing or recycling, it will have the ShutdownReason value None.
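For example, a one-line check (assuming you just want a boolean):

bool isShuttingDown =
    System.Web.Hosting.HostingEnvironment.ShutdownReason != System.Web.ApplicationShutdownReason.None;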
Adding the actual code to do this:
public class RecycleWatcher : IRegisteredObject
{
    public static bool IsRecycling { get; private set; }

    public void Register()
    {
        HostingEnvironment.RegisterObject(this);
    }

    public void Stop(bool immediate)
    {
        IsRecycling = true;
    }
}
Then enable it by running
new RecycleWatcher().Register();
After that, just check the IsRecycling property to know whether you are recycling or not.
if (RecycleWatcher.IsRecycling) DoSomething();
Not sure exactly what you want to do when the application pool recycles, but if you add the event handler below to Global.asax, the code in it will run when the application is shut down.
protected void Application_End(object sender, EventArgs e)
{
}

Custom Windows Workflow activity that executes an asynchronous operation - redone using generic service

I am writing a custom Windows Workflow Foundation activity, that starts some process asynchronously, and then should wake up when an async event arrives.
All the samples I’ve found (e.g. this one by Kirk Evans) involve a custom workflow service that does most of the work and then posts an event to the activity-created queue. The main reason for that seems to be that the only method to post an event [that works from a non-WF thread] is WorkflowInstance.EnqueueItem, and the activities don’t have access to workflow instances, so they can't post events (from the non-WF thread where I receive the result of the async operation).
I don't like this design, as this splits functionality into two pieces, and requires adding a service to a host when a new activity type is added. Ugly.
So I wrote the following generic service that I call from the activity’s async event handler, and that can be reused by various async activities (error handling omitted):
class WorkflowEnqueuerService : WorkflowRuntimeService
{
    public void EnqueueItem(Guid workflowInstanceId, IComparable queueId, object item)
    {
        this.Runtime.GetWorkflow(workflowInstanceId).EnqueueItem(queueId, item, null, null);
    }
}
Now in the activity code, I can obtain and store a reference to this service, start my async operation, and when it completes, use this service to post an event to my queue. The benefit of this: I keep all the activity-specific code inside the activity, and I don't have to add new services for each activity type.
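For illustration, a rough sketch of what that activity code might look like (StartAsyncOperation and the QueueName property are assumptions, not part of the question):

protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
{
    var enqueuer = executionContext.GetService<WorkflowEnqueuerService>();
    Guid instanceId = this.WorkflowInstanceId;

    // hypothetical async kick-off; the callback runs on a non-WF thread
    StartAsyncOperation(result =>
        enqueuer.EnqueueItem(instanceId, this.QueueName, result));

    return ActivityExecutionStatus.Executing; // stay alive until the queued item arrives
}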
But since the official and internet samples do it with specialized, non-reusable services, I would like to check whether this approach is OK, or whether I'm creating problems here.
There is a potential problem here with regard to workflow persistence.
If you create long-running workflows that are persisted to a database, they are not reloaded into memory until some external event reloads them. But here the workflow itself is responsible for triggering that event, which it cannot do until it is reloaded. And we have a catch-22 :-(
The proper way to do this is using an external service. And while this might feel like dividing the code into two places, it really isn't. The reason is that the workflow is responsible for the big picture, i.e. what should be done, and the runtime service is responsible for the actual implementation, or how it should be done. That way you can change the how without changing the why and when.
A follow-up - regardless of all the reasons why it "should be done" using a service, this is directly supported in .NET 4.0, which provides a clean way for an activity to start asynchronous work while suspending the persistence of the activity.
See http://msdn.microsoft.com/en-us/library/system.activities.codeactivitycontext.setupasyncoperationblock(VS.100).aspx for details.
