Add Quartz Job & Trigger to a running Razor Pages application - asp.net

I have a Razor Pages app that implements Quartz.NET to store jobs in a MySQL database. The implementation works fine so far: I can connect to the DB, store jobs, and they are executed at the specified times. The issue I'm having currently is that I need to schedule and execute jobs based on user input (without restarting the app), and that I can't get to work. I'm very new to Quartz and ASP.NET and I haven't been coding for very long either, so apologies if I've made any stupid mistakes.
I've read somewhere that I shouldn't initialize multiple schedulers, so I've tried storing the scheduler object I've got so I can access and use it later. However, when I try to access it from another class later, I get a NullReferenceException. To be honest, this feels like it shouldn't even work, so I'm not surprised it doesn't... can anyone please look at my code below and tell me if this can work? Or is there a better way to do this?
I've found one other solution where they basically create a job on startup that periodically checks a DB for new jobs and adds them to the scheduler. I guess that would work, but it seems a bit clunky. Plus it's from 10 years ago, so maybe there's a better way today? How to add job with trigger for running Quartz.NET scheduler instance without restarting server?
One other idea I've had was to open (and close) a new app whenever I need to create a job. I'm not sure I like that idea, but it seems less resource-intensive than the recurring job described above. Would that be a viable option?
The code for my current solution:
Scheduler:
//Creating Scheduler
Scheduler = await schedulerFactory.GetScheduler();
Scheduler.JobFactory = jobFactory;

var key = new JobKey("Notify Job", "DEFAULT");
if (key == null)
{
    //Create Job
    IJobDetail jobDetail = CreateJob(jobMetaData);
    //Create Trigger
    ITrigger trigger = CreateTrigger(jobMetaData);
    //Schedule Job
    //await Scheduler.ScheduleJob(jobDetail, trigger, cancellationToken);
    await Scheduler.AddJob(jobDetail, true);
}

//Start Scheduler
await Scheduler.Start(cancellationToken);

//Copying the scheduler object into a different class where it's easier to access.
ScheduleStore scheduleStore = new ScheduleStore();
scheduleStore.tempScheduler = Scheduler;
ScheduleStore:
public class ScheduleStore
{
    public IScheduler tempScheduler { get; set; }

    public ScheduleStore()
    {
    }
}
runtime Scheduler:
public class RunningScheduler : IHostedService
{
    public IScheduler scheduler { get; set; }
    private readonly JobMetadata jobMetaData;

    public RunningScheduler(JobMetadata job)
    {
        ScheduleStore scheduleStore = new ScheduleStore();
        this.scheduler = scheduleStore.tempScheduler;
        this.jobMetaData = job;
    }

    public async Task StartAsync(CancellationToken cancellationToken)
    {
        IJobDetail jdets = CreateJob(jobMetaData);
        if (jobMetaData.CronExpression == "--")
        {
            ITrigger jtriggz = CreateSimpleTrigger(jobMetaData);
            //the next line throws the exception.
            await scheduler.ScheduleJob(jdets, jtriggz, cancellationToken);
            //It's definitely the scheduler that's throwing the NullReferenceException.
        }
        // the else does basically the same as the if, only with a cron trigger instead of a simple one, so I've omitted it.
    }
}

I see that you are using a hosted service. Have you noticed that Quartz has that support already built-in?
Quartz cannot handle new jobs ("new code that runs") dynamically, but it can certainly handle new triggers. You just need to obtain a reference to IScheduler; then you can add new triggers pointing to an existing job, or just call scheduler.TriggerJob, which will run your job once with the given parameters (the job data map is a powerful feature for passing execution parameters).
I'd advise checking the GitHub repository and its examples; there are specific ones for the different features and for the ASP.NET Core and worker integrations.
Generally, Quartz already has database persistence support which you can use. Just call the scheduler methods to add jobs and triggers - they will be persisted and available between application restarts (and take effect immediately, without the need for a restart).
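For illustration, a minimal sketch of how a Razor Pages handler might do that, assuming the Quartz hosted-service integration has registered ISchedulerFactory in the DI container (the page model, the cron parameter and the "Notify Job" key below are just examples, not part of the original code):

using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.RazorPages;
using Quartz;

// Hypothetical page model: reacts to user input by adding a trigger to the
// scheduler that the Quartz hosted service is already running.
public class ScheduleNotificationModel : PageModel
{
    private readonly ISchedulerFactory _schedulerFactory;

    public ScheduleNotificationModel(ISchedulerFactory schedulerFactory)
    {
        _schedulerFactory = schedulerFactory;
    }

    public async Task<IActionResult> OnPostAsync(string cron, CancellationToken ct)
    {
        // Returns the already-initialized scheduler, not a new one.
        IScheduler scheduler = await _schedulerFactory.GetScheduler(ct);

        var jobKey = new JobKey("Notify Job", "DEFAULT");

        // Add a new trigger pointing at the existing, durable job...
        ITrigger trigger = TriggerBuilder.Create()
            .ForJob(jobKey)
            .WithCronSchedule(cron)
            .Build();
        await scheduler.ScheduleJob(trigger, ct);

        // ...or fire the job once right now, passing parameters via the job data map:
        // await scheduler.TriggerJob(jobKey, new JobDataMap { { "userId", 42 } }, ct);

        return RedirectToPage();
    }
}

Because the factory hands back the scheduler the hosted service already started, this should also make the ScheduleStore-style copy unnecessary.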

Related

MassTransit: access Activity from Service Bus message

I am using MassTransit to send request/response messages via Service Bus between two services (don't ask why).
I would like to set up custom Application Insights telemetry. I know that Service Bus messages add diagnostic metadata so the consumer can extract it and use it to correlate across services. However, I can't access it in MassTransit, or at least I don't know how.
Any tips?
A couple of months have passed and the solution I implemented has proven to be a good one.
I created a class that implements IReceiveObserver. In PreReceive I am able to cast the context to ServiceBusReceiveContext and start a telemetry operation with the correct parent id, so it looks like this:
public Task PreReceive(ReceiveContext context)
{
    var serviceBusContext = (ServiceBusReceiveContext)context;
    var requestActivity = new Activity("Process");
    requestActivity.SetParentId(serviceBusContext.Properties["Diagnostic-Id"].ToString());

    IOperationHolder<RequestTelemetry> operation = _telemetryClient.StartOperation<RequestTelemetry>(requestActivity);
    operation.Telemetry.Success = true;
    serviceBusContext.Properties.Add(TelemetryOperationKey, operation);
    return Task.CompletedTask;
}
In PostReceive I am able to stop the operation:
public Task PostReceive(ReceiveContext context)
{
    var serviceBusContext = (ServiceBusReceiveContext)context;
    var operation = (IOperationHolder<RequestTelemetry>)serviceBusContext.Properties[TelemetryOperationKey];
    operation.Dispose();
    return Task.CompletedTask;
}
I also do some magic when an exception happens:
public Task ReceiveFault(ReceiveContext context, Exception exception)
{
    _telemetryClient.TrackException(exception);

    var serviceBusContext = (ServiceBusReceiveContext)context;
    var operation = (IOperationHolder<RequestTelemetry>)serviceBusContext.Properties[TelemetryOperationKey];
    operation.Telemetry.ResponseCode = "Fail";
    operation.Telemetry.Success = false;
    operation.Dispose();
    return Task.CompletedTask;
}
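For completeness, the observer still has to be connected to the bus at startup; a rough sketch of that wiring (the TelemetryReceiveObserver class name and the bus configuration here are my own assumptions - the answer above only shows the observer methods):

using System.Threading.Tasks;
using MassTransit;
using Microsoft.ApplicationInsights;

public static class BusSetup
{
    // Hypothetical startup wiring: connect the IReceiveObserver implementation
    // before the bus starts receiving, otherwise PreReceive never fires.
    public static async Task<IBusControl> StartBusAsync(TelemetryClient telemetryClient)
    {
        var busControl = Bus.Factory.CreateUsingAzureServiceBus(cfg =>
        {
            cfg.Host("<service-bus-connection-string>"); // placeholder
        });

        // TelemetryReceiveObserver is assumed to be the IReceiveObserver implementation shown above.
        busControl.ConnectReceiveObserver(new TelemetryReceiveObserver(telemetryClient));

        await busControl.StartAsync();
        return busControl;
    }
}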
It was difficult to find this solution by reading the MassTransit documentation. I would say that MassTransit is a fantastic tool, and for some situations there is no alternative. However, the documentation is pretty poor.
You can use Application Insights with MassTransit; there is a package available that writes metrics directly.
The documentation is available here:
https://masstransit-project.com/advanced/monitoring/applications-insights.html
Also, you can access Activity.Current from anywhere, I think, based on my experience with DiagnosticSource. It might be different with AppInsights, though.

Axon Sagas duplicate events in the event store when replaying events to a new DB

We have an Axon application that stores new Orders. For each order state change (OrderStateChangedEvent) it schedules a couple of tasks. The tasks are triggered and processed by yet another Saga (TaskSaga - out of scope of the question).
When I delete the projection database but leave the event store, then run the application again, the events are replayed (which is correct), but the tasks are duplicated.
I suppose this is because the OrderStateChangedEvent triggers a new set of ScheduleTaskCommands each time.
Since I'm new to Axon, I can't figure out how to avoid this duplication.
Event store running on AxonServer
Spring Boot application autoconfigures the Axon stuff
Projection database contains the projection tables and the axon tables:
token_entry
saga_entry
association_value_entry
I suppose all the events are replayed because, by recreating the database, the Axon tables are gone (hence there is no record of the last applied event).
Am I missing something?
Should the token_entry/saga_entry/association_value_entry tables be part of the DB with the projection tables on each application node?
I thought that the event store could be replayed onto a new application node's DB at any time without changing the event history, so I could run as many nodes as I wish. Or that I could remove the projection DB at any time and run the application, which would cause the events to be projected into the fresh DB again. Or is this not true?
In general, my problem is that one event produces a command, which leads to new (duplicated) events being produced. Should I avoid this "chaining" of events to avoid duplication?
THANKS!
Axon configuration:
@Configuration
public class AxonConfig {

    @Bean
    public EventSourcingRepository<ApplicationAggregate> applicationEventSourcingRepository(EventStore eventStore) {
        return EventSourcingRepository.builder(ApplicationAggregate.class)
                .eventStore(eventStore)
                .build();
    }

    @Bean
    public SagaStore sagaStore(EntityManager entityManager) {
        return JpaSagaStore.builder().entityManagerProvider(new SimpleEntityManagerProvider(entityManager)).build();
    }
}
CreateOrderCommand is received by the Order aggregate (the fromCommand method just maps the command 1:1 to an event):
@CommandHandler
public OrderAggregate(CreateOrderCommand cmd) {
    apply(OrderCreatedEvent.fromCommand(cmd))
            .andThenApply(() -> OrderStateChangedEvent.builder()
                    .applicationId(cmd.getOrderId())
                    .newState(OrderState.NEW)
                    .build());
}
The Order aggregate sets the properties:
@EventSourcingHandler
protected void on(OrderCreatedEvent event) {
    id = event.getOrderId();
    // ... additional properties set
}

@EventSourcingHandler
protected void on(OrderStateChangedEvent cmd) {
    this.state = cmd.getNewState();
}
OrderStateChangedEvent is handled by a Saga that schedules a couple of tasks for an order in that particular state:
private Map<String, TaskStatus> tasks = new HashMap<>();
private OrderState orderState;

@StartSaga
@SagaEventHandler(associationProperty = "orderId")
public void on(OrderStateChangedEvent event) {
    orderState = event.getNewState();
    List<OrderStateAwareTaskDefinition> tasksByState = taskService.getTasksByState(orderState);

    if (tasksByState.isEmpty()) {
        finishSaga(event.getOrderId());
    }

    tasksByState.stream()
            .map(task -> ScheduleTaskCommand.builder()
                    .orderId(event.getOrderId())
                    .taskId(IdentifierFactory.getInstance().generateIdentifier())
                    .targetState(orderState)
                    .taskName(task.getTaskName())
                    .build())
            .peek(command -> tasks.put(command.getTaskId(), SCHEDULED))
            .forEach(command -> commandGateway.send(command));
}
I think I can help you in this situation.
So, this happens because the TrackingToken used by the TrackingEventProcessor, which supplies all the events to your Saga instances, is initialized to the beginning of the event stream. Due to this, the TrackingEventProcessor will start from the beginning of time, thus getting all your commands dispatched for a second time.
There are a couple of things you could do to resolve this.
You could, instead of wiping the entire database, only wipe the projection tables and leave the token table intact.
You could configure the initialTrackingToken of a TrackingEventProcessor to start at the head of the event stream instead of the tail.
Option 1 would work out fine, but requires some delegation from the operations perspective. Option 2 leaves it in the hands of a developer, which is potentially a little safer than the other solution.
To adjust the token to start at the head, you can instantiate a TrackingEventProcessor with a TrackingEventProcessorConfiguration:
EventProcessingConfigurer configurer;

TrackingEventProcessorConfiguration trackingProcessorConfig =
        TrackingEventProcessorConfiguration.forSingleThreadedProcessing()
                .andInitialTrackingToken(StreamableMessageSource::createHeadToken);

configurer.registerTrackingEventProcessor("{class-name-of-saga}Processor",
        Configuration::eventStore,
        c -> trackingProcessorConfig);
You'd thus create the desired configuration for your Saga, calling the andInitialTrackingToken() function to ensure a head token is created if no token is present yet.
I hope this helps you out Tomáš!
Steven's solution works like a charm, but only in Sagas. For those who want to achieve the same effect in a classic @EventHandler (to skip executions on replay), there is a way. First you have to find out how your tracking event processor is named - I found it in the Axon Dashboard (port 8024 on a running AxonServer) - usually it is the location of the component with the @EventHandler annotation (the package name, to be precise). Then add the configuration as Steven indicated in his answer:
@Autowired
public void customConfig(EventProcessingConfigurer configurer) {
    // This prevents replaying some events in @EventHandler
    var trackingProcessorConfig = TrackingEventProcessorConfiguration
            .forSingleThreadedProcessing()
            .andInitialTrackingToken(StreamableMessageSource::createHeadToken);

    configurer.registerTrackingEventProcessor("com.domain.notreplayable",
            org.axonframework.config.Configuration::eventStore,
            c -> trackingProcessorConfig);
}

NHibernate: Get all opened sessions

I have an ASP.NET application with NHibernate. For some reason a few developers forgot to close the sessions in some pages (around 20, I think). I know the best solution is to go through each page and make sure the sessions are closed properly, but I can't make that kind of change because the code is already in production. So I was trying to find a way to get all the open sessions in the session factory and then close them using the master page or an additional process, but I can't find a way to do that.
So, is there a way to get all the open sessions? Or maybe set a session idle timeout or something? What do you suggest? Thanks in advance.
As far as I know, there is no support for getting a list of open sessions from the session factory. I have my own method to keep an eye on open sessions, and I use this construction:
Create a class with a weak reference to an ISession. This way you won't interfere with the garbage collector if sessions are being garbage collected:
public class SessionInfo
{
    private readonly WeakReference _session;

    public SessionInfo(ISession session)
    {
        _session = new WeakReference(session);
    }

    public ISession Session
    {
        get { return (ISession)_session.Target; }
    }
}
Create a list for storing your open sessions:
List<SessionInfo> OpenSessions = new List<SessionInfo>();
And in the DAL (data access layer) I have this method:
public ISession GetNewSession()
{
    if (_sessionFactory == null)
        _sessionFactory = createSessionFactory();

    ISession session = _sessionFactory.OpenSession();
    OpenSessions.Add(new SessionInfo(session));
    return session;
}
This way I maintain a list of open sessions I can query when needed. Perhaps this meets your needs?
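If the goal is the one from the question - closing leaked sessions from a central place - the list can then be swept periodically. A rough sketch, assuming the method lives in the same DAL class as the OpenSessions list above:

// Sketch: close any tracked sessions that are still alive and open,
// then drop dead or closed entries from the list.
public void CloseOpenSessions()
{
    foreach (var info in OpenSessions.ToArray())
    {
        var session = info.Session;   // null if the session was already garbage collected
        if (session != null && session.IsOpen)
            session.Close();
    }

    OpenSessions.RemoveAll(info => info.Session == null || !info.Session.IsOpen);
}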

Include scheduling in a web application in asp.net

I want to run a scheduled process periodically in an ASP.NET web application.
In brief, my database table has a date and deadline hours. I want to calculate the expected DateTime from both, then periodically update another table (inserting thousands of records), and also run a mail-sending process according to that calculation.
This is the scheduled process that should be executed periodically.
The Quartz.NET job scheduler library is excellent for this sort of thing.
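As a rough sketch of what that could look like (assuming Quartz.NET 3.x; the job name, interval and setup method are placeholders - the actual deadline calculation and mail sending would go inside Execute):

using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

// Hypothetical job: put the deadline calculation, table updates and mail sending here.
public class DeadlineMailJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        // query the table, compute the expected DateTime, insert records, send mails...
        return Task.CompletedTask;
    }
}

public static class SchedulerSetup
{
    public static async Task StartAsync()
    {
        IScheduler scheduler = await new StdSchedulerFactory().GetScheduler();
        await scheduler.Start();

        IJobDetail job = JobBuilder.Create<DeadlineMailJob>()
            .WithIdentity("deadline-mail")
            .Build();

        // Run every 15 minutes; adjust to whatever period the calculation needs.
        ITrigger trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(s => s.WithIntervalInMinutes(15).RepeatForever())
            .Build();

        await scheduler.ScheduleJob(job, trigger);
    }
}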
You can use a Windows Service to do the background work or scheduling; please see the link below:
Using Timers in a Windows Service
Here's what I did:
public class Email
{
    private static System.Threading.Timer threadingTimer;

    public static void StartTimer()
    {
        if (threadingTimer == null)
            threadingTimer = new Timer(new TimerCallback(Callback), HttpContext.Current, 0, 20000);
    }

    private static void Callback(object sender)
    {
        if (/* Your condition to send emails */)
        {
            using (var context = new MyEntities())
            {
                var users = from user in context.Usere
                            select user;

                foreach (var user in users)
                {
                    // Send an email to user
                }
            }
        }
    }
}
And you have to add this to Application_Start:
void Application_Start(object sender, EventArgs e)
{
    Email.StartTimer();
}
Check out this old article from Jeff Atwood:
Easy Background Tasks in ASP.NET
Basically he suggests that you use the cache-expiration mechanism to schedule a timed task. The problem is: your web application needs to be running. What if the website isn't called at all? Well, since IIS 7.5 there is the possibility to keep your web app running at all times: auto-starting web apps. Jeff mentions in the comments that his approach served well until they outgrew it. His conclusion is that for small sites this is a good approach.
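The trick from the article boils down to something like the following sketch (the key name and the five-minute interval are arbitrary): a dummy cache entry expires periodically, and the removal callback does the work and re-registers the entry.

using System;
using System.Web;
using System.Web.Caching;

public static class BackgroundTask
{
    private const string CacheKey = "background-task"; // arbitrary key for the dummy entry

    // Call this once from Application_Start.
    public static void Start()
    {
        HttpRuntime.Cache.Insert(
            CacheKey,
            DateTime.UtcNow,
            null,
            DateTime.UtcNow.AddMinutes(5),          // fire roughly every 5 minutes
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,
            OnCacheEntryRemoved);
    }

    private static void OnCacheEntryRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        // Do the periodic work here (send mails, update tables, ...),
        // then re-register so the callback fires again.
        Start();
    }
}

Start() would be called once from Application_Start, just like the timer example above.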

Static variables and long-running thread on IIS 7.5

Help me solve the following problem.
I have an ASP.NET MVC 2 application running on IIS 7.5. On one page the user clicks a button, and the handler for this button sends a request to the server (jquery.ajax). On the server, the controller action starts a new thread (it performs a long-running import):
var thread = new Thread(RefreshCitiesInDatabase);
thread.Start();
The state of the import is available in a static variable, and the new thread changes the value of this variable at the beginning of its work.
The user can also check the state of the import with the help of this variable, which is used in the view, so the user sees the import's state.
For the first few minutes after I start this function, everything is okay. On the page I see the right state of the import, the count of imported records changes, and I see changes in the logs. But after a few minutes the trouble begins.
When I refresh the page with the import state, sometimes I see that the import is okay, but sometimes I see a page with default values for the import (as if the application had just started), and after that I can again see the page with the normal import state.
I tried to attach Visual Studio to the IIS process and debug the application. When a request comes to the controller, sometimes the static variables have the right values and sometimes they have default values (a static int has 0, a static string has "", etc.).
Tell me what I'm doing wrong. Maybe I must start the additional thread in some other way?
Thanks in advance,
Dmitry
I've added parts of the code:
Controller:
public class ImportCitiesController : Controller
{
    [Dependency]
    public SaveCities SaveCities { get; set; }

    //Start import
    public JsonResult StartCitiesImport()
    {
        //Method in the core dll, which performs the import
        SaveCities.StartCitiesSaving();
        return Json("ok");
    }

    //Get information about the import
    public ActionResult GetImportState()
    {
        var model = new ImportCityStatusModel
            { NowImportProcessing = SaveCities.CitiesSaving };
        return View(model);
    }
}
Class in Core:
public class SaveCities
{
    // Property equals true while the program is saving to the database
    public static bool CitiesSaving = false;

    public void StartCitiesSaving()
    {
        var thread = new Thread(RefreshCitiesInDatabase);
        thread.Start();
    }

    private static void RefreshCitiesInDatabase()
    {
        CitiesSaving = true;
        //Processing......
        CitiesSaving = false;
    }
}
UPDATE
I think I found the problem, but I still don't know how to solve it. My IIS application pool has "Maximum Worker Processes" = 10, so requests to the application are handled by several worker processes. My requests for the import state are handled by different processes, and each process has its own static variables. I guess this is the right track for a solution.
But I don't know how to merge all the static values in one place.
Without looking at the code, here is the obvious question: are you sure your access is thread-safe (that is, do you properly use a lock to update the value, or even to access it => C# thread safety with get/set)?
A code sample would be nice.
Thanks for the code. It seems that CitiesSaving is not locked properly before reads/writes; you should hide the field behind a property to handle all the locking. Marking the field as volatile could also help (see http://msdn.microsoft.com/en-us/library/aa645755(v=vs.71).aspx ).
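To illustrate that suggestion, here is a sketch of SaveCities with the flag hidden behind a lock-protected property (only the locking idea, not the original code):

public class SaveCities
{
    private static readonly object Sync = new object();
    private static bool _citiesSaving;

    // All reads and writes go through the lock instead of touching a public static field.
    public static bool CitiesSaving
    {
        get { lock (Sync) { return _citiesSaving; } }
        private set { lock (Sync) { _citiesSaving = value; } }
    }

    public void StartCitiesSaving()
    {
        new System.Threading.Thread(RefreshCitiesInDatabase).Start();
    }

    private static void RefreshCitiesInDatabase()
    {
        CitiesSaving = true;
        try
        {
            // processing......
        }
        finally
        {
            CitiesSaving = false;   // reset even if the import throws
        }
    }
}

Note that, as the UPDATE above points out, this only helps within a single worker process; with "Maximum Worker Processes" set to 10, each process still keeps its own copy of the static, so truly shared state would have to live outside the process (for example in the database).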
