I have an approval workflow. The client is an ASP.NET MVC 4 application that calls out to a Web API. This Web API in turn hosts the WF using the WorkflowApplication class. The scenario is: if a request is not approved within a certain time, I need the persisted workflow to be loaded, send an email, and persist again. Is this scenario possible somehow using WF 4.5?
Make a timer object outside of the workflow but trigger it from within the workflow (pass it into the workflow as an argument). Then when the timer fires, you can use the event to load the workflow back from its persist point, send the e-mail, and then persist again.
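Roughly, the timer callback rehydrates the persisted instance and resumes it so the reminder/e-mail branch can run. A minimal sketch, assuming a SQL instance store, a workflow definition called ApprovalWorkflow, and that the workflow is idle on a bookmark named "ReminderDue" (both names are placeholders for whatever you use):

    using System;
    using System.Activities;
    using System.Activities.DurableInstancing;
    using System.Timers;

    public class ApprovalReminder
    {
        private readonly Timer _timer;
        private readonly Guid _instanceId;
        private readonly string _connectionString;

        public ApprovalReminder(Guid instanceId, TimeSpan dueIn, string connectionString)
        {
            _instanceId = instanceId;
            _connectionString = connectionString;
            _timer = new Timer(dueIn.TotalMilliseconds) { AutoReset = false };
            _timer.Elapsed += (s, e) => ResumeAndRemind();
        }

        public void Start()
        {
            _timer.Start();
        }

        private void ResumeAndRemind()
        {
            // Rehydrate the persisted instance and resume it at the reminder bookmark;
            // the workflow sends the e-mail in that branch and then unloads (persists) again.
            var app = new WorkflowApplication(new ApprovalWorkflow())
            {
                InstanceStore = new SqlWorkflowInstanceStore(_connectionString),
                PersistableIdle = args => PersistableIdleAction.Unload
            };
            app.Load(_instanceId);
            app.ResumeBookmark("ReminderDue", null);
        }
    }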
Scenario
I am building a courier service system using microservices. I am not sure about a few things, and here is my scenario:
Booking API - This is where the customer places an order.
Payment API - This is where we process the payment against a booking.
Notification API - This service is responsible for sending the notification after everything is completed.
The system uses an event-driven architecture. When a customer places a booking order, I commit a local transaction in the Booking API and publish an event. The Payment API and Notification API are subscribed to their respective events. Once done, the Payment and Notification APIs need to acknowledge back to the Booking API.
My questions are:
After publishing the event, my booking service can't block the call and returns to the client (front end). How will my client app check the status of the transaction, or know that the transaction is completed? Does it poll every couple of seconds? Since this is a distributed transaction, any service can go down and fail to acknowledge back. In that case, how would my client (front end) know, since it will keep waiting? I am considering a saga for the distributed transactions.
What's the best way to achieve all of this?
Event Sourcing
I want to implement event sourcing to track the complete history of the booking order. Do I have to implement this in my Booking API with an event store? Or is the event store shared between services, since I am supposed to capture all the events from the different services? What's the best way to implement this?
Many Thanks,
The way I visualize this is as follows (influenced by Martin Kleppmann's talk here and here).
The end user places an order. The order is written to a Kafka topic. Since Kafka has log-structured storage, the order details will be saved in the least possible time. It's an atomic operation ('A' in 'ACID') - all or nothing.
Now, as soon as the user places the order, the user would like to read it back (read-your-write). To achieve this we can write the order data to a distributed cache as well. Although a dual write is not usually a good idea, as it may cause partial failure (e.g. writing to Kafka succeeds but writing to the cache fails), we can mitigate this risk by ensuring that one of the Kafka consumers writes the data to a database. So, even in the rare scenario of a cache failure, the user can eventually read the data back from the DB.
The status of the order in the cache, as written at the time of order creation, is "in progress".
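A minimal sketch of that write path, assuming the Confluent.Kafka client; the "orders" topic name and the IOrderCache abstraction (e.g. backed by Redis) are illustrative:

    using System.Threading.Tasks;
    using Confluent.Kafka;

    // Hypothetical cache abstraction (e.g. backed by Redis).
    public interface IOrderCache
    {
        Task SetStatusAsync(string orderId, string status);
    }

    public class BookingService
    {
        private readonly IProducer<string, string> _producer;
        private readonly IOrderCache _cache;

        public BookingService(IOrderCache cache)
        {
            _cache = cache;
            var config = new ProducerConfig { BootstrapServers = "localhost:9092" };
            _producer = new ProducerBuilder<string, string>(config).Build();
        }

        public async Task PlaceOrderAsync(string orderId, string orderJson)
        {
            // Append the order to the log first (the atomic write), then populate the
            // read-your-write view in the cache with an "in progress" status.
            await _producer.ProduceAsync("orders",
                new Message<string, string> { Key = orderId, Value = orderJson });
            await _cache.SetStatusAsync(orderId, "in progress");
        }
    }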
One or more Kafka consumer groups are then used to handle the events: the payment and notification are processed, and the final status is written back to the cache and database.
A separate Kafka consumer will then receive the responses from the payment and notification APIs and write the updates to the cache, the DB, and a websocket.
The websocket will then update the UI model, and the changes will be reflected in the UI through event sourcing.
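A minimal sketch of such a consumer, again assuming Confluent.Kafka; the topic names and the IOrderReadModel abstraction (which hides the cache/DB write and the websocket push) are illustrative:

    using System.Threading;
    using Confluent.Kafka;

    // Hypothetical read-model abstraction: hides the cache/DB write and the websocket push.
    public interface IOrderReadModel
    {
        void UpdateStatus(string orderId, string status);
        void PushToClient(string orderId, string status);
    }

    public class OrderStatusProjector
    {
        public void Run(IOrderReadModel readModel, CancellationToken token)
        {
            var config = new ConsumerConfig
            {
                BootstrapServers = "localhost:9092",
                GroupId = "order-status-projector",
                AutoOffsetReset = AutoOffsetReset.Earliest
            };

            using (var consumer = new ConsumerBuilder<string, string>(config).Build())
            {
                consumer.Subscribe(new[] { "payment-results", "notification-results" });

                while (!token.IsCancellationRequested)
                {
                    // Key = order id, Value = new status as produced by the payment/notification services.
                    var result = consumer.Consume(token);
                    readModel.UpdateStatus(result.Message.Key, result.Message.Value);
                    readModel.PushToClient(result.Message.Key, result.Message.Value);
                }
            }
        }
    }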
Further clarifications based on comment
The basic idea here is that we build a cache for every service, using streaming, containing the data it needs. For example, the order service needs feedback from the payment and notification services. Therefore, we have those services write their responses to some Kafka topic, which has consumers that write the responses back to the order service's cache.
Based on the ACID properties of Kafka (or any similar technology), the message will never be lost. Eventually we will get all or nothing; that's atomicity. If the order service fails to write the order, an error response is sent back to the client synchronously and the user probably retries after some time. If the order service is successful, the responses from the other services must eventually flow back to its cache. If one of the services is down for some time, the response will be delayed, but it will be sent eventually when the service resumes.
The clients need not poll. The result will be propagated to them through streaming over a websocket. The UI page listens on the websocket: as the consumer writes the feedback to the cache, it can also write to the websocket, which notifies the UI. Then, if you use something like Angular or ReactJS, the appropriate section of the UI can be refreshed with the value received on the websocket. Until that happens, the user keeps seeing the status "in progress", as written to the cache at the time of order creation. Even if the user refreshes the page, the same status is retrieved from the cache. If the cache value expires under an LRU mechanism, the same value will be fetched from the DB and written back to the cache to serve future requests. Once the feedback from the other services is available, the new result will be streamed over the websocket. On a page refresh, the new status would be available from the cache or the DB.
You can pass an identifier back to the client once the booking is completed, and the client can use this identifier to query the status of the subsequent actions if you can connect them on the back end. You can also send a notification back to the client when the other events are completed. So you can do long polling, or you can do notifications.
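A minimal sketch of the status endpoint the client could poll with that identifier (ASP.NET Web API style; IBookingStatusStore is a hypothetical read store):

    using System.Web.Http;

    // Hypothetical read store that resolves a booking id to its latest known status.
    public interface IBookingStatusStore
    {
        string GetStatus(string bookingId);   // e.g. "in progress", "paid", "completed"
    }

    public class BookingStatusController : ApiController
    {
        private readonly IBookingStatusStore _store;

        public BookingStatusController(IBookingStatusStore store)
        {
            _store = store;
        }

        // GET api/bookingstatus/{id} - the client polls this with the identifier it was given.
        public IHttpActionResult Get(string id)
        {
            var status = _store.GetStatus(id);
            if (status == null)
            {
                return NotFound();
            }
            return Ok(new { id, status });
        }
    }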
Thanks skjagini. Part of my question is how to handle the case where the other microservices don't get back in time, or never do. Let's say the Payment API finished its work and charged the client, but didn't notify my order service in time, or only after a very long time. How does my client wait? If we time out the client, the backend may still have processed the request after the timeout.
In CQRS, you separate the commands and the querying; i.e., considering your scenario, you can implement all interactions with queues. (There are multiple implementations of CQRS with event sourcing, but in its simplest form):
Client sends a request --> Payment API receives the request --> validates the request (if validation fails, it throws an error back to the user) --> on successful validation, generates a GUID and writes the request message to a queue --> passes the GUID back to the user
Payment API subscribes to the payment queue --> after processing the request, writes to the order queue or any other queues
Order API subscribes to the order queue and processes the request.
The user has a GUID which can get them the data for all the interactions (the first step is sketched below).
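A minimal sketch of that first step (the command side); IMessageQueue is a hypothetical abstraction over whatever queue/broker you use, and the route and payload are illustrative:

    using System;
    using System.Web.Http;

    // Hypothetical queue abstraction over whatever broker you use.
    public interface IMessageQueue
    {
        void Publish(string queueName, object message);
    }

    public class PaymentRequest
    {
        public Guid CorrelationId { get; set; }
        public string BookingId { get; set; }
        public decimal Amount { get; set; }
    }

    public class PaymentController : ApiController
    {
        private readonly IMessageQueue _queue;

        public PaymentController(IMessageQueue queue)
        {
            _queue = queue;
        }

        public IHttpActionResult Post(PaymentRequest request)
        {
            // Validation failure goes straight back to the caller.
            if (request == null || request.Amount <= 0)
            {
                return BadRequest("Invalid payment request");
            }

            // Accept the command: tag it with a GUID, enqueue it, and hand the GUID
            // back so the user can follow the status of all subsequent interactions.
            request.CorrelationId = Guid.NewGuid();
            _queue.Publish("payment-requests", request);
            return Ok(new { correlationId = request.CorrelationId });
        }
    }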
If you use pub/sub, as in Kafka, instead of individual queues, all the other downstream systems can read from the same topic; you don't need to write to each queue separately.
If any of the services fails to process a message, once it is restarted it should be able to pick up where it left off. If a service goes down in the middle of a transaction, as long as it rolls back its respective changes, your system should remain in a stable condition.
I'm not 100% sure what you are asking, but it sounds like you should be using a messaging service. As @Saptarshi Basu mentioned, Kafka is good. I would really recommend NATS, although I'm biased because that's the one I work with.
With NATS you can use request-reply messages to interface between the client and the booking service. That's 1-1 communication.
If you have multiple instances of each of your services running, you can use queue groups to load balance automatically; NATS will just randomly select an instance for you.
And then you can use pub-sub feeds for communication between all of your services.
This will give you a very resilient and scalable architecture, and NATS makes it all incredibly easy
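A minimal sketch of both patterns using the NATS .NET client (the NATS.Client package); the subject names "booking.create"/"booking.events" and the "booking-workers" queue group are just examples:

    using System;
    using System.Text;
    using NATS.Client;

    public class NatsExample
    {
        public static void Main()
        {
            using (IConnection conn = new ConnectionFactory().CreateConnection("nats://localhost:4222"))
            {
                // Booking service side: members of the "booking-workers" queue group share the
                // load; NATS delivers each request to one randomly selected member.
                conn.SubscribeAsync("booking.create", "booking-workers", (sender, args) =>
                {
                    // ... persist the booking here ...

                    // Reply to the requesting client (the 1-1 request-reply leg).
                    conn.Publish(args.Message.Reply, Encoding.UTF8.GetBytes("accepted"));

                    // Fan the event out to payment/notification over plain pub-sub.
                    conn.Publish("booking.events", args.Message.Data);
                });

                // Client side: request-reply between the client and the booking service.
                Msg reply = conn.Request("booking.create",
                    Encoding.UTF8.GetBytes("{ \"item\": \"parcel-123\" }"), 2000);
                Console.WriteLine(Encoding.UTF8.GetString(reply.Data));
            }
        }
    }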
I want a page to run in the background in my ASP.NET web application.
That page should not be visible to the user.
The exact use of this page: the user will schedule an email that is to be sent later. After he has scheduled it, the page should be hidden.
Can we do it?
Platform version (.NET 4)
What you really want is a service.
However, there are several kludgy ways to do background tasks with ASP.NET:
http://www.codeproject.com/Articles/12117/Simulate-a-Windows-Service-using-ASP-NET-to-run-sc
http://www.west-wind.com/weblog/posts/2007/May/10/Forcing-an-ASPNET-Application-to-stay-alive
http://www.mikesdotnetting.com/Article/129/Simple-task-Scheduling-using-Global.asax
If the user closes the browser before your scheduled event has occurred, it will never take place.
You really want a backend service that processes queued events. When the user schedules an email, it is added to the queue and then gets picked up and processed by the backend service.
http://quartznet.sourceforge.net/ is one option for it, or you could build a Windows service and the queue manually.
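If you go the manual route, a minimal sketch of such a service might look like this (QueuedEmail, IEmailQueue and IEmailSender are hypothetical abstractions over your queue table and mail gateway, and the one-minute polling interval is arbitrary):

    using System;
    using System.Collections.Generic;
    using System.ServiceProcess;
    using System.Timers;

    public class QueuedEmail
    {
        public int Id { get; set; }
        public string To { get; set; }
        public string Body { get; set; }
        public DateTime SendAt { get; set; }
    }

    public interface IEmailQueue
    {
        IEnumerable<QueuedEmail> GetDue(DateTime now);
        void MarkSent(int id);
    }

    public interface IEmailSender
    {
        void Send(QueuedEmail email);
    }

    public class EmailSchedulerService : ServiceBase
    {
        private readonly Timer _timer = new Timer(60000);   // poll once a minute
        private readonly IEmailQueue _queue;
        private readonly IEmailSender _sender;

        public EmailSchedulerService(IEmailQueue queue, IEmailSender sender)
        {
            _queue = queue;
            _sender = sender;
            _timer.Elapsed += (s, e) => ProcessQueue();
        }

        protected override void OnStart(string[] args) { _timer.Start(); }
        protected override void OnStop() { _timer.Stop(); }

        private void ProcessQueue()
        {
            // The web app only inserts rows into the queue; this service owns delivery,
            // so nothing depends on a browser or worker process staying alive.
            foreach (var email in _queue.GetDue(DateTime.UtcNow))
            {
                _sender.Send(email);
                _queue.MarkSent(email.Id);
            }
        }
    }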
Alternatively you could look into a service bus approach such as http://www.nservicebus.com/ which is backed by MSMQ.
I have two different web applications that need to communicate with each other (which I currently accomplish using Silverlight Duplex, but that doesn't scale very well). After reading about SignalR, I'd like to give it a try, but I failed to find much documentation on how to do this. Any advice on how to get started would be greatly appreciated.
Thanks!
More specific Info:
Example:
Application A (Bidding Interface) - A web page to allow multiple end-users to place bids on certain items.
Application B (Managing Interface) - A web page to allow a user (or potentially multiple users) to monitor/control the actions from the Bidding Interface.
So when a user from Application A places a bid on an item, I'll need a way to alert Application B that a bid has been placed. Then from Application B, should the user choose to accept the bid, I need to send an alert back to Application A (to update the current price, increase the bid, etc.).
In all honesty, it might just be simpler to have each application push notifications to the other via standard service calls (WCF, ASMX, HTTP handler endpoints, MVC controllers, whatever). SignalR is useful for browser-to-server communication because there isn't a consistent way to push from the server to a connected browser. But from web app to web app, pushing is simple: Application A just calls a service endpoint on Application B to notify it that something has happened.
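For example, something along these lines (the URL, route, and payload are purely illustrative):

    using System.Collections.Generic;
    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web.Mvc;

    // Application A: push a notification to Application B when a bid is placed.
    public class BidNotifier
    {
        private static readonly HttpClient Client = new HttpClient();

        public Task NotifyBidPlacedAsync(string itemId, decimal amount)
        {
            var payload = new FormUrlEncodedContent(new[]
            {
                new KeyValuePair<string, string>("itemId", itemId),
                new KeyValuePair<string, string>("amount", amount.ToString())
            });
            return Client.PostAsync("https://app-b.example.com/notifications/bidplaced", payload);
        }
    }

    // Application B: a plain MVC endpoint that receives the notification.
    public class NotificationsController : Controller
    {
        [HttpPost]
        public ActionResult BidPlaced(string itemId, decimal amount)
        {
            // ... update the managing interface's state, alert its users, etc. ...
            return new HttpStatusCodeResult(200);
        }
    }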
Assuming that what you want is something like ...
User (browser) --- Application A --- Application B --- User (Browser)
Real-time communication can be done as follows ...
This isn't the job of SignalR by itself; however, something like NServiceBus would fit this very well.
You reference a bus DLL, and your hubs can both raise and respond to bus events.
In your case you would have SignalR and your service bus technology work together to achieve the cross-application sync.
So the process is something like ...
A user in Application A fires up a browser and requests a page.
Application A creates a hub instance which internally subscribes to service bus events.
A user in Application B fires up a browser and requests a page.
Application B creates a hub instance which internally subscribes to service bus events.
A user on either application does some action, resulting in SignalR picking up a message.
SignalR raises a bus event on the service bus to say "this user did something".
The hub on the other application, through its subscription to that event, gets notified and takes whatever action is needed to inform its connected users (sketched below).
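A minimal sketch of that hand-off, assuming ASP.NET SignalR 2 and an NServiceBus-style handler (older void Handle signature); BidPlaced and the client-side bidPlaced callback are illustrative names:

    using Microsoft.AspNet.SignalR;
    using NServiceBus;

    // The event published on the service bus when a user bids in Application A.
    public class BidPlaced : IEvent
    {
        public string ItemId { get; set; }
        public decimal Amount { get; set; }
    }

    // Hub on Application B; browsers connect to this.
    public class BidHub : Hub
    {
    }

    // Bus subscription on Application B: when the event arrives,
    // push it to every connected browser through the hub.
    public class BidPlacedHandler : IHandleMessages<BidPlaced>
    {
        public void Handle(BidPlaced message)
        {
            var hub = GlobalHost.ConnectionManager.GetHubContext<BidHub>();
            hub.Clients.All.bidPlaced(message.ItemId, message.Amount);
        }
    }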
Lesson to be learnt here ...
Don't try and make a technology do something beyond its purpose ... use the right tool for the job.
This entire solution can be done with little more than about 20 lines of code after getting the core framework set up.
NServiceBus can be found here:
http://particular.net/nservicebus
Disclaimer: There may be other solutions; this is only one suggestion. Don't assume this is the only way the problem can be solved, or that these are the only technologies that can be used in this manner.
I am not affiliated in any way with Particular or the NServiceBus product.
I'm building a project to send messages to users. The client wants a way to schedule these messages to be sent out at a certain time; for example, he creates the message at 2am but wants it to be sent out at 10am without his intervention. Where do I begin with this sort of thing? I'm using ASP.NET MVC3; any help is appreciated.
Update
Darin suggested Quartz.NET, and I've finally gotten around to attempting to set it up, but I'm not really understanding how to integrate it with my web app.
I'm assuming I should be able to make an HTTP request from my service to an action on my web app, triggered by Quartz. But I'm not sure how to communicate between the web app and this service, such as sending instructions to the Quartz server.
So far, I've created a Windows service, set up the installers, and added the Quartz.NET server 2010 solution to my service project. Am I on the right track?
Using a managed Windows service with Quartz.NET, or a console application which you would schedule with the Windows Task Scheduler, seems like a good approach to achieve that.
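A minimal sketch of how the Windows service could host the scheduler, using the Quartz.NET 2.x synchronous API (SendMessageJob and the "messageId" data key are illustrative; the web app would hand the scheduling request to the service however you choose, e.g. via a shared table or a WCF call):

    using System;
    using Quartz;
    using Quartz.Impl;

    public class SendMessageJob : IJob
    {
        public void Execute(IJobExecutionContext context)
        {
            var messageId = context.JobDetail.JobDataMap.GetString("messageId");
            // ... load the message by id and send it (SMTP, web service call, etc.) ...
        }
    }

    public static class MessageScheduler
    {
        private static readonly IScheduler Scheduler = StdSchedulerFactory.GetDefaultScheduler();

        public static void Start()
        {
            Scheduler.Start();
        }

        // Called when the user schedules a message, e.g. ScheduleSend("42", tenAmTomorrow).
        public static void ScheduleSend(string messageId, DateTimeOffset sendAt)
        {
            var job = JobBuilder.Create<SendMessageJob>()
                .WithIdentity("send-" + messageId)
                .UsingJobData("messageId", messageId)
                .Build();

            var trigger = TriggerBuilder.Create()
                .StartAt(sendAt)   // fire once at the requested time
                .Build();

            Scheduler.ScheduleJob(job, trigger);
        }
    }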
Welp, there are scheduled tasks... either make a localhost request at a specific time, or write an executable/service to be called.
A possible alternative if you can't use scheduled tasks (but may be dependent upon the website being available to the Internet) is to make a remote request-maker program or use a website monitoring service. Have either of those make a request to a particular page/querystring on your site periodically. Then, make the web application perform a particular task (send an email) whenever that resource is requested.
A few free monitoring services are limited to one request every hour or half-hour, or you can pay to have it checked more often. The resource's code could be made to record the message-sending event (thus making messages only get sent once, regardless of how often the request is made).
I'm trying to model a request submission/approval/completion scenario. I'm using a flowchart workflow hosted as a service in a console app using WorkflowServiceHost. The workflow has a service reference to a WCF service hosted in IIS; this second service interacts with the application database. I have an ASP.NET front end with a service reference to the hosted workflow service, and I call its methods from a proxy client.
The workflow is using a persistence database that I have created using the scripts provided.
The scenario is that a request for a service is made by a user. The request must be approved once by a specific person (I'm using a Pick with a Delay in one branch to remind the person if no decision arrives; the other branch receives the decision). For some services the request must have a second approval, which can be done by any one of a pool of approvers. Once approval is finished, the request goes to a different pool of people for completion.
I have it working, but I have 3 questions:
On the ASP.NET home page I have a list of requests with links to pages to approve/complete as appropriate; these call methods on the proxy and then redirect back. But because it's all asynchronous, I have to manually refresh the home page to see the changed list. Am I stuck with forcing the page to refresh itself every x seconds to get around this, or is there a way to make it synchronous, check the state of the workflow, or wait for a message back? It's not terribly interactive just hitting a button and not knowing whether the action succeeded or not.
Is there a way to stop someone approving a request just after someone else in the pool has approved it? At the moment nothing happens for the second person when they hit the button (which is good). In the workflow persistence database I can see that the blocking bookmark is the next activity along (presumably set by the person who got there first), so it looks as though the second Receive just doesn't happen. I have concurrency-checking code in the WCF data service, but this never fires because there is no attempt to update the database. I would like to be able to warn the second person that another user got there first.
My homepage list in the web app is built by querying the application database, but is it possible to query the workflow to find the status of each item, passing the item's id (I'm using the id as the correlation handle)? Is it normal to do this or do people usually just query the application database?
I guess you could create an Ajax call that would check if any state change occurs and only refresh the page when that is the case.
If you send a WCF request for an operation that is no longer valid, you should receive an error, unless you are using one-way messaging, because then there is no message on which to send the error back. Mind you, due to a bug in WF4 the error can show up as a timeout after 60 seconds. There is no real way to avoid the problem, because you are checking the workflow state as persisted and letting the user act based on that; even when you query the state, the workflow could have been resumed but not saved yet.
Either can work, but I normally query the workflow instance store, as that is the closest to the actual workflow state.