I have a vb.net 4.0 UI that basically allows users to search for data on a SQL Server 2008 database and update/manipulate it. All of the communication with the database is done through stored procs. One of the update procs may take up to 6 minutes to process - currently the users just see the "processing..." message until the update has completed, and then they are shown the results.
I think this is a good candidate for a background task. I would like the users to be able to invoke the request and then continue to do other work in the UI; when the task finishes, it would notify them of the results. Can I accomplish this with threading? I'm new to threading, but given some literature and an example or two I could be on my way. I've done some Googling, but it's not apparent from the examples whether the user can continue working in the UI while the task executes. Are there other options to accomplish what I have described?
thanks.
There are a number of options for running a background task, but in .NET 4.0 the neatest is probably to make use of the TPL (Task Parallel Library). You can execute a background task as follows:
Task.Factory.StartNew(() => SomeMethod());
Detailed info can be found here:
http://msdn.microsoft.com/en-us/library/dd460717.aspx
Remember, though, that if you need to perform any UI updates when returning from this call, you will need to dispatch that work back onto the UI thread.
The TPL also has a mechanism for running a continuation on the Dispatcher (UI) thread.
Whilst the background task is running, the UI thread will not be blocked.
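For illustration, here is a minimal C# sketch of that pattern in a WinForms form (the same idea translates directly to VB.NET). SearchForm, statusLabel and SomeMethod are placeholder names, not anything from the original question:

    using System;
    using System.Threading.Tasks;
    using System.Windows.Forms;

    public class SearchForm : Form
    {
        private readonly Label statusLabel = new Label { Dock = DockStyle.Top };
        private readonly Button runUpdateButton = new Button { Text = "Run update", Dock = DockStyle.Bottom };

        public SearchForm()
        {
            Controls.Add(statusLabel);
            Controls.Add(runUpdateButton);
            runUpdateButton.Click += RunUpdateButton_Click;
        }

        private void RunUpdateButton_Click(object sender, EventArgs e)
        {
            // Capture the UI scheduler while we are still on the UI thread.
            var uiScheduler = TaskScheduler.FromCurrentSynchronizationContext();

            statusLabel.Text = "Processing...";

            Task.Factory.StartNew(() => SomeMethod())   // runs on a thread-pool thread
                .ContinueWith(t =>
                {
                    // This continuation runs back on the UI thread, so touching controls is safe.
                    statusLabel.Text = t.IsFaulted
                        ? "Update failed: " + t.Exception.InnerException.Message
                        : "Update complete.";
                }, uiScheduler);
        }

        private void SomeMethod()
        {
            // The long-running stored procedure call would go here.
            System.Threading.Thread.Sleep(5000); // stand-in for the 6-minute proc
        }
    }

The form stays responsive while the task runs; only the continuation touches the controls, and only on the UI thread.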
Related
I have an application that is working well in production, but I wonder if I could have implemented the concurrency better....
ASP.NET .NET 4, C#
Basically, it generates n SQL statements on the fly (approx. 50 at the moment), then runs them concurrently and writes the data to .csv files.
EDIT: First I create a thread to do all the work on so the page request can return. Then on that thread...
For each of the SQL statements I create a new Task using the TPL and execute it using a DataReader, writing the data to disk. When the last file is created I write some summary data to a summary file, zip it all up, and give it to the user.
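To make that concrete without posting the real code, a rough sketch of the shape of it (the connection string, statement list and file names are placeholders; CSV escaping and the summary/zip step are omitted):

    using System;
    using System.Data.SqlClient;
    using System.IO;
    using System.Linq;
    using System.Threading.Tasks;

    class ExportJob
    {
        // Placeholder connection string and output folder.
        const string ConnectionString = "Server=.;Database=SourceDb;Integrated Security=true";
        const string OutputFolder = @"C:\exports";

        public static void Run(string[] sqlStatements)
        {
            // One task per generated statement, each writing its own CSV file.
            Task[] tasks = sqlStatements
                .Select((sql, i) => Task.Factory.StartNew(
                    () => WriteCsv(sql, Path.Combine(OutputFolder, "export_" + i + ".csv"))))
                .ToArray();

            Task.WaitAll(tasks);

            // Summary file and zipping would follow here, once every export has finished.
        }

        static void WriteCsv(string sql, string path)
        {
            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand(sql, connection))
            using (var writer = new StreamWriter(path))
            {
                connection.Open();
                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        var fields = new string[reader.FieldCount];
                        for (int i = 0; i < reader.FieldCount; i++)
                            fields[i] = Convert.ToString(reader[i]);
                        writer.WriteLine(string.Join(",", fields));
                    }
                }
            }
        }
    }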
Should I have used Threads or Asynchronous Delegates instead?
I haven't posted the real code (the sketch above is just the shape of it), as I am really just wondering whether my overall approach (i.e. TPL) is the best option in this situation.
Please don't lecture me about creating dynamic SQL; it is totally necessary due to the technicalities of the database I am reading from and not relevant to the question. (It's the back end of a proprietary system with 7 thousand+ tables.)
Should I have used Threads or Asynchronous Delegates instead?
Apparently, your background thread operation extends beyond the boundary of a single HTTP request. In this case, it doesn't really matter which API you use to run the operation: Task.Run, Delegate.BeginInvoke, ThreadPool.QueueUserWorkItem, new Thread, or anything else.
You shouldn't be running a lengthy background thread operation whose lifetime spans multiple HTTP requests inside the ASP.NET address space. While it's relatively easy to implement, this approach can cause problems with IIS maintainability, scalability and security. Create a WCF service for that and call it from your ASP.NET page:
How to: Host a WCF Service in a Managed Windows Service.
If we start a new thread in ASP.NET from the thread that is serving the HTTP request, and the new thread has an unhandled exception, the worker process will crash immediately. Even if we use a WCF service and call it from ASP.NET, the ASP.NET thread is going to wait for the result. So it is better to use a queuing mechanism, so that requests sit in a queue and the queue can be processed at a different time, based on the available processing capacity. Of course, when we say queuing we need to think about queue failure, requeuing, etc., but it's worth it if the application is big and needs to scale.
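To illustrate the exception point: if you do keep the work in-process, a minimal sketch is to wrap it so nothing unhandled escapes the background thread (the delegates here are placeholders, and this does not address the lifetime/recycling concerns above):

    using System;
    using System.Threading;

    public static class BackgroundWork
    {
        // processRequest and logError are whatever your own code supplies.
        public static void Queue(Action processRequest, Action<Exception> logError)
        {
            ThreadPool.QueueUserWorkItem(_ =>
            {
                try
                {
                    processRequest();
                }
                catch (Exception ex)
                {
                    // Swallow and log: an exception escaping this thread would
                    // otherwise crash the ASP.NET worker process.
                    logError(ex);
                }
            });
        }
    }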
I have an ASP.NET 3.5 web application which generates a lot of audit-related data. Since that data isn't immediately relevant to the user, I'd like to be able to save it to the MSSQL database asynchronously and let the user go on to the next page without waiting. I'm using NHibernate as my ORM.
I've looked into PageAsyncTasks and as far as I can tell they simply allow you to perform page operations in parallel, but all operations still have to complete before the page finishes loading. Is there an alternative, fairly lightweight method to have asynchronous processing that will continue on without affecting page load? Is simply spinning up a new thread manually an acceptable process?
You could create a web service within your solution and, when your server-side code is finished and ready to move the user on to the next page, have it call that web service to do the auditing as a fire-and-forget operation.
I'm not sure whether the NHibernate session is thread-safe, so if you create a new thread, be careful with the context.
Ideally you could use queues and a service bus to deal with this sort of thing safely and asynchronously, but that involves architectural changes.
Not sure if this is possible, but if the auditing is actually noticeably slowing the UI down, maybe you'd be better off improving that process and keeping it synchronous. Either way, good luck.
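If you do go the manual thread/thread-pool route, a minimal sketch of the idea follows. AuditRecord is a hypothetical mapped entity; the shared ISessionFactory is thread-safe, but each background call opens its own ISession because an ISession instance is not:

    using System;
    using System.Threading;
    using NHibernate;

    public class AuditWriter
    {
        private readonly ISessionFactory sessionFactory;

        public AuditWriter(ISessionFactory sessionFactory)
        {
            this.sessionFactory = sessionFactory;
        }

        // Fire-and-forget: queue the save and return to the page immediately.
        public void SaveInBackground(AuditRecord record)
        {
            ThreadPool.QueueUserWorkItem(_ =>
            {
                try
                {
                    using (ISession session = sessionFactory.OpenSession())
                    using (ITransaction tx = session.BeginTransaction())
                    {
                        session.Save(record);
                        tx.Commit();
                    }
                }
                catch (Exception)
                {
                    // Log and swallow: an audit failure shouldn't reach the user,
                    // and an unhandled exception here would kill the worker process.
                }
            });
        }
    }

    // Hypothetical audit entity (virtual members so NHibernate can proxy it).
    public class AuditRecord
    {
        public virtual Guid Id { get; set; }
        public virtual string Message { get; set; }
        public virtual DateTime CreatedUtc { get; set; }
    }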
I have a process which I will be invoking manually for the first time in a prod environment. The thing is, the process stops when the server goes down or is stopped. In this scenario I will not be able to invoke the process manually every time, since it will be in a production environment and that is not feasible. So I need to know: how can I invoke a process automatically once the server is up?
I've heard that one way is to write a custom component to start the process using a LiveCycle implementation class.
Please let me know how to go about it.
Any help regarding this is much appreciated!
Thanks
There are at least two ways you can do this.
First is the custom component route. You invoke the process on component life-cycle start to ensure that the invocation happens every time your component is deployed.
Second is the servlet route. You invoke the process on the initialisation of the servlet, which ensures the server has started.
The servlet implementation is a better fit for purpose; the only downside is that you need to package and deploy it separately, as it won't be part of the LCAs.
You can find code samples on how to invoke LiveCycle processes using the APIs in the Adobe docs. You can use the Java API, the WS API or REST, whichever you are more comfortable with.
http://help.adobe.com/en_US/livecycle/9.0/programLC/help/index.htm
Team:
I need to invoke a WF activity (XAML) from a WF service (XAMLX) asynchronously. I am already referencing the Microsoft.Activities.Extensions framework and I'm running on the Platform Update 1 for the state machine -- so if the solution is already in one of those libraries I'm ready!
Now, I need to invoke that activity (XAML) asynchronously -- but it has an output parameter that needs to set a variable in the service (XAMLX). Can somebody please provide me a solution to this?
Thanks!
* UPDATE *
Now I can post pictures, * I think *, because I have enough reputation! Let me put a couple out here and try to better explain my problem. The first picture is the WF Service that has the two entry points for the workflow -- the second is the workflow itself.
This workflow is an orchestration mechanism that constantly restarts itself, and has some failover mechanisms (e.g. exit on error threshold and soft exit) so that we can manage our queue of durable transactions using WF!
Now, we had this workflow working great when it was all one WF Service, because we could call the service, get a response back, and send the value of that response back into another entry point in a trigger to issue a soft exit. However, a new requirement has arisen asking us to make the workflow itself a WF activity in another project and have the Receive/Send-Reply sequences in the WF Service Application project.
However, we need to be able to start up this workflow and forget about it -- then let it know somehow that a soft exit is necessary later on down the road -- but since WF executes on a single thread, this has become a bit challenging at best.
Strictly speaking, in XAML activities, Parallel and ParallelForEach are how you achieve asynchrony.
The workflow scheduler only uses a single thread (much like the UI), so any activity that is running will typically be running on the same thread, unless it implements AsyncCodeActivity, in which case you are simply handing the scheduler thread back to the runtime while waiting for a callback from whichever async code your AsyncCodeActivity implementation is calling.
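As a rough illustration (the activity and method names here are made up), an AsyncCodeActivity hands the scheduler thread back while its work is in flight and surfaces its result as an OutArgument that you can bind to a variable in the calling workflow or service:

    using System;
    using System.Activities;

    public sealed class LongRunningQuery : AsyncCodeActivity<string>
    {
        public InArgument<string> Input { get; set; }

        protected override IAsyncResult BeginExecute(
            AsyncCodeActivityContext context, AsyncCallback callback, object state)
        {
            string input = Input.Get(context);
            Func<string, string> work = DoLongRunningWork;

            // Stash the delegate so EndExecute can call EndInvoke on it.
            context.UserState = work;
            return work.BeginInvoke(input, callback, state);
        }

        protected override string EndExecute(AsyncCodeActivityContext context, IAsyncResult result)
        {
            var work = (Func<string, string>)context.UserState;
            return work.EndInvoke(result);   // becomes the activity's Result (an OutArgument<string>)
        }

        private static string DoLongRunningWork(string input)
        {
            // The long-running call would go here.
            return input;
        }
    }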
Therefore, are you sure this is what you want to achieve? Do you mean you want to run it after you have sent your initial response? In that case, place your activity after the Send Reply.
Please provide more info if these suggestions don't answer your question.
Update:
The original requirement posed (separating the implementation from the service's Receive/Send activities) may actually be solved by hosting the target activity as a service. See the following link:
http://blog.petegoo.com/index.php/2011/09/02/building-an-enterprise-workflow-system-with-wf4/
I need to get the next activities (transitions) that my workflow is blocked on as soon as the workflow enters a new state, without relying on the workflow persistence service. I found out that workflow persistence only starts to hit the database when the workflow instance goes idle, which introduces latency when there is more than one instance of the workflow running; that poses a serious problem for me. I need the blocking bookmarks to be in sync with my workflow status, which I set in a code activity when the workflow enters its new state. From CodeActivityContext and NativeActivityContext there is no API to get this information (the next transitions), and both the StateMachine class and the State class are sealed, so there is no way to tap into them. I am using the blocking bookmarks to indicate to the UI how the workflow can proceed, so that I can drive the workflow from the UI. I am hosting the state machine using WorkflowServiceHost in IIS. I am wondering why I am the only one to run into this issue; I have been struggling with it for some time.
Thanks in advance.
Your best option is to use a TrackingParticipant, where you can see exactly what is going on in a workflow as it executes. From the TrackingParticipant you can then save the bookmarks and have the UI reuse them.
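A minimal sketch of such a participant is below. How you map the tracked records to your "next transitions" depends on your state machine, so the mapping shown here is only an assumption:

    using System;
    using System.Activities.Tracking;
    using System.Collections.Concurrent;

    public class BookmarkTrackingParticipant : TrackingParticipant
    {
        // Keyed by workflow instance id; the UI (or a service) can query this later.
        public readonly ConcurrentDictionary<Guid, string> LastObservedState =
            new ConcurrentDictionary<Guid, string>();

        protected override void Track(TrackingRecord record, TimeSpan timeout)
        {
            var activityRecord = record as ActivityStateRecord;
            if (activityRecord != null)
            {
                // E.g. remember which activities (such as the Receives that represent
                // your transitions) have just started executing.
                LastObservedState[record.InstanceId] =
                    activityRecord.Activity.Name + ": " + activityRecord.State;
            }

            var instanceRecord = record as WorkflowInstanceRecord;
            if (instanceRecord != null && instanceRecord.State == WorkflowInstanceStates.Idle)
            {
                // The instance has gone idle, i.e. it is blocked waiting on its bookmarks.
                LastObservedState[record.InstanceId] = "Idle";
            }
        }
    }

    // Registration, e.g. from a custom WorkflowServiceHostFactory when hosting in IIS:
    //     serviceHost.WorkflowExtensions.Add(new BookmarkTrackingParticipant());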