Windows Workflow persistence database? - workflow-foundation-4

A few questions:
1. Is a SQL Server installation needed to run Windows Workflow?
2. If yes, where does Workflow store (persist) data for a long-running process?
3. I see that some files are created in .\Windows\Microsoft.NET\Framework\v4.0\SQL\en\ (some SQL scripts to create persistence points).
4. Do we need to run these scripts manually to create the database?
5. Can we persist data on the file system instead, so that we don't need to install SQL Server?
Thanks

I see one supposed answer already, but "read the docs" answers really aren't good answers, especially in an area as poorly documented as WF, so in case anyone else stumbles across this thread:
(1) SQL Server doesn't have to be installed just to use workflows, but if you want persistence for long running workflows, (2) SQL Server is your easiest way to get it.
(3) and (4) You can let AppFabric do most of the heavy lifting in setting up the persistence database for you.
(5) you could persist on a file system instead of SQL Server but IMHO, from what I've seen in my short time with WF and persistence so far, you'd be crazy to try to implement your own persistence provider like that, especially when just starting out. You can use SQL Server Express to get started. Why reinvent the wheel?
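For points (1) through (4): once the persistence database exists (AppFabric can create it for you, or you can run the SQL scripts from the folder mentioned in the question, typically SqlWorkflowInstanceStoreSchema.sql and SqlWorkflowInstanceStoreLogic.sql, against a database you create yourself), hooking it up from code is straightforward. Here is a minimal sketch, assuming a .NET 4 console host and a placeholder connection string:

    using System;
    using System.Activities;
    using System.Activities.DurableInstancing;
    using System.Activities.Statements;
    using System.Threading;

    class Program
    {
        static void Main()
        {
            // Placeholder connection string; point it at the persistence
            // database created by AppFabric or by the SQL scripts.
            var store = new SqlWorkflowInstanceStore(
                @"Server=.\SQLEXPRESS;Initial Catalog=WFPersistence;Integrated Security=True");

            var unloaded = new AutoResetEvent(false);

            // A trivial long-running workflow: the Delay makes it go idle,
            // which is when persistence kicks in.
            var workflow = new Sequence
            {
                Activities = { new Delay { Duration = TimeSpan.FromMinutes(5) } }
            };

            var app = new WorkflowApplication(workflow) { InstanceStore = store };

            // Persist and unload the instance whenever it goes idle.
            app.PersistableIdle = e => PersistableIdleAction.Unload;
            app.Unloaded = e => unloaded.Set();

            app.Run();
            unloaded.WaitOne();
            // The instance can later be reloaded into a new
            // WorkflowApplication via Load(instanceId).
        }
    }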

Related

What is the best way to update a database at 00:00 every day in ASP.NET MVC 4

In an ASP.NET MVC 4 project, I want to update some data in SQL Server (2012) at 00:00 every day.
I have perhaps three choices:
1. writing a Windows service that runs on the server and executes the database update.
2. writing a SQL Server stored procedure that executes at 00:00 every day.
3. using third-party tools, like Quartz.Net or Fluent.
Which one is the best choice? Why?
If you're comfortable with T-SQL and the task can be completed entirely within the database, then it makes sense to implement it as a stored proc and schedule it using SQL Server Agent. This reduces the number of external dependencies, as everything happens inside SQL Server, and there are fewer points of failure (e.g. it will still run if IIS or your web solution is down).
If your update needs to interact with other resources, such as importing a text file from a known location, etc., then you might also consider implementing it in SSIS (SQL Server Integration Services). This has the advantage of still being SQL hosted, while giving you access to additional functionality not easily achieved with a stored proc.
I would only implement as a batch process in .NET if I felt that the functionality required would be difficult or awkward to implement in a proc or SSIS package. This is especially relevant since SQL Server 2012 allows you to build an SSIS package using .NET type code, but it "lives" in the SSIS package that is registered on the SQL Server.
I wouldn't implement it within your ASP.NET solution at all unless it actually needs a web based user interface for some reason. The fact that you want to execute this process at precisely the same time every night tells me that this does not require human interaction.
Where a process can be fully automated, avoid putting a user interface or anything which can potentially hang or become a single point of failure while waiting for some kind of UI input. Remember, a presentation layer is for human interaction - consider if you need one. Better to implement it as a batch of some sort, which also makes it easier to execute via automation. There are exceptions to this rule, e.g. if you wanted to implement your update as a web API with a REST interface or the like, but as a general rule it holds true.
As a side note, if your aim is to run your process overnight, consider scheduling it in the early hours of the morning (between 3 and 4 am) rather than at midnight, as this is generally when most people are asleep, so your update is least likely to impact the availability of your app and its database. If your update is long-running, it is also less likely to run into an edge case like a daylight-saving change or a conflict with other overnight processes.
Check out Hangfire. Scheduled tasks in ASP.NET. Super easy and reliable.
http://hangfire.io/
We just started using it and I like it.
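To show how little code option 3 can involve, here is a minimal sketch using Hangfire's recurring-job API; the storage connection string and the NightlyJobs.UpdateData method are placeholders:

    using System;
    using Hangfire;

    public static class NightlyJobs
    {
        // Placeholder: put the nightly database update logic here.
        public static void UpdateData() { }
    }

    class Program
    {
        static void Main()
        {
            // Hangfire persists its own job data in SQL Server
            // (placeholder connection string).
            GlobalConfiguration.Configuration
                .UseSqlServerStorage(@"Server=.;Database=Hangfire;Integrated Security=True");

            // Cron.Daily() fires at 00:00 every day.
            RecurringJob.AddOrUpdate(
                "nightly-update",
                () => NightlyJobs.UpdateData(),
                Cron.Daily());

            using (new BackgroundJobServer())
            {
                Console.WriteLine("Hangfire server running. Press any key to exit.");
                Console.ReadKey();
            }
        }
    }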
I have used all three approaches you describe; each has its pros and cons. But I suggest you use Quartz.Net. It is very easy to implement and very easy to use (a minimal sketch follows below).
You can see a wonderful article by Mike on scheduled tasks in ASP.NET here.
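For concreteness, a sketch against the Quartz.NET 3.x async API; the job class name and identity are illustrative:

    using System.Threading.Tasks;
    using Quartz;
    using Quartz.Impl;

    // Placeholder job: put the database update logic in Execute.
    public class NightlyUpdateJob : IJob
    {
        public Task Execute(IJobExecutionContext context)
        {
            // run the update here
            return Task.CompletedTask;
        }
    }

    public static class SchedulerSetup
    {
        public static async Task StartAsync()
        {
            IScheduler scheduler = await StdSchedulerFactory.GetDefaultScheduler();
            await scheduler.Start();

            IJobDetail job = JobBuilder.Create<NightlyUpdateJob>()
                .WithIdentity("nightly-update")
                .Build();

            // "0 0 0 * * ?" = every day at midnight (Quartz cron format).
            ITrigger trigger = TriggerBuilder.Create()
                .WithCronSchedule("0 0 0 * * ?")
                .Build();

            await scheduler.ScheduleJob(job, trigger);
        }
    }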

How to sync two ASP.NET Membership databases

I have a local and an Azure ASP.NET Membership database. I need to keep them both in sync. Wondering if anyone has found an easy way to do this? The table structure seems simple enough, but I would rather pull from Azure than push. Is there a routine or tool I don't know about that does this by now?
Thanks
-Ken
This would be a suitable job for the Microsoft Sync Framework.
You create a service or scheduled task that makes the necessary calls. Have this running on your server and you can pull from the Azure database and sync with the local one. It can be set up to sync one-way or two-way; a rough sketch is below.
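For illustration only, a download-only (pull from Azure) sync with the Sync Framework 2.1 database providers might look roughly like this. It assumes both databases have already been provisioned with a sync scope (here named "MembershipScope", a placeholder), which is a separate one-time step, and the connection strings are placeholders too:

    using System.Data.SqlClient;
    using Microsoft.Synchronization;
    using Microsoft.Synchronization.Data.SqlServer;

    class MembershipSync
    {
        static void Main()
        {
            // Placeholder connection strings.
            using (var localConn = new SqlConnection(@"Server=.;Database=Membership;Integrated Security=True"))
            using (var azureConn = new SqlConnection(@"Server=tcp:myserver.database.windows.net;Database=Membership;User ID=me;Password=secret"))
            {
                var orchestrator = new SyncOrchestrator
                {
                    LocalProvider = new SqlSyncProvider("MembershipScope", localConn),
                    RemoteProvider = new SqlSyncProvider("MembershipScope", azureConn),
                    // Download = pull changes from Azure into the local database.
                    Direction = SyncDirectionOrder.Download
                };

                SyncOperationStatistics stats = orchestrator.Synchronize();
                System.Console.WriteLine("Downloaded {0} changes", stats.DownloadChangesTotal);
            }
        }
    }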

Can I install WAMP on Microsoft Azure (Bizspark account)?

I have got a BizSpark account from Microsoft and they are providing a basic Azure account. I have been told that it can run PHP; however, I would like to use a more tested solution like WAMP. On top of that, I want to place a quite heavy WordPress / BuddyPress installation on it (that I hope will bring a lot of traffic :)
Has anyone done something similar to this? If so, what is your experience / pitfalls etc.?
Thanks
Stelios
Yes, you can do this. At the end of the day you are just using Windows Server, so anything that installs there will install in the cloud as well. I have done this myself for hosting WordPress in Windows Azure.
However, there are some pitfalls here. Mostly the pitfalls are around the M (MySQL). Setting up MySQL in Windows Azure is not really that hard, but you have several considerations on how to make sure it is always available. You can:
1. Set up a single instance of MySQL in a role and store the db on local disk (this is a bad idea).
2. Set up a single instance of MySQL in a role and store the db on a drive (blob-backed storage).
3. Set up 2 instances of MySQL that each point to a shared drive (hot failover). Only one drive will be able to mount at a time, so you get reliability and failover, but only a single instance working for you at any moment.
4. Set up 1 MySQL writer on a drive, and multiple readers on snapshots of that drive. Put in some logic via connection strings to make sure writes go to the single writer and reads go to the others. Snapshot every X minutes to update the readers.
5. Set up multiple instances of MySQL and use the native replication features (each storing to local disk), and rely on that if you lose an instance.
There are probably more permutations, but the gist of the problem is how you scale out MySQL to be available and reliable. In Windows Azure, you don't get to rely on the fact that the local disk will always be around or that you will always have the same instance. In fact, you can guarantee that your instances will be down for some period of time each month and eventually, given enough time, you will lose the local disk.
Overall, with multiple instances however, you can guarantee they won't be down simultaneously (to the service SLA level at least). So, you need to make sure MySQL works with multiple instances (or live with single instance downtime) and that your data is backed by blob storage to guarantee it is persisted.
Or you can scrap all that crap and just use SQL Azure, which solves all those problems. So, it becomes WASP. SQL Azure can also be more economical for smaller DBs.
Ditto.
Installing MySQL on an Azure role is not a good idea for plenty of reasons, most notably (lack of) scalability and reliability. (That's just for deploying on Azure; MySQL itself is great.)
To set it up remotely and reliably you're going to need a dedicated instance, which will run you at least $40 a month; going with SQL Azure is $10/GB, or free if you get an introductory offer or BizSpark.
If you're just looking to play around with a single-instance app, I'd suggest you rather use SQLite or some other in-memory db; it'll be a lot less painful.

Communicating between ASP.NET applications on the same machine

I have a situation where information about a user is stored in the web application cache, and when that information is updated in one application, I want to notify the other applications (running on the same machine) that the data should be removed from its cache so it can be refreshed. Basically, I need to keep cached data in sync across multiple ASP.NET applications.
I have started down the path of using a central web service to help coordinate the notifications, but it is turning out to be more complex than I think it needs to be.
Is there a way that one asp.net application can easily reach across to another on the same box to clear an item from the cache?
Is there a better way to achieve shared cached information than using the application cache?
I really want to create a way for apps to communicate in a loosely coupled way. I looked at NServiceBus, but the dependency on MSMQ scared me away; my client has had bad experiences with MSMQ and does not want to support an app that requires it.
Suggestions?
Michael
I agree with Hogan. Best is to use a shared database. I want to add that, when using SQL Server, you can use SQL Cache Dependency. This SQL Server mechanism allows notifications to applications in such a way that caches can be invalidated directly after a change is made to the data.
A shared database is probably going to cause you the least pain.
Edit
Note: ASP.NET allows you to make "cache clearing" triggers on SQL Server changes. A quick search in the cache examples on MSDN should turn up some examples. Thus, when the user info stored in the cache changes in the DB, the local cache copy will be cleared and reloaded from the DB; a sketch of this follows below.
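As a hedged sketch of that approach, using the Service Broker-based SqlCacheDependency: the table, columns, and connection string are placeholders, Service Broker must be enabled on the database, and SqlDependency.Start must be called once at application startup:

    using System.Data.SqlClient;
    using System.Web;
    using System.Web.Caching;

    public static class UserInfoCache
    {
        // Placeholder connection string.
        const string ConnStr = @"Server=.;Database=AppDb;Integrated Security=True";

        // Call SqlDependency.Start(ConnStr) once, e.g. in Application_Start.

        public static object GetUserInfo()
        {
            object cached = HttpRuntime.Cache["userInfo"];
            if (cached != null) return cached;

            using (var conn = new SqlConnection(ConnStr))
            // Query shape matters for notifications: two-part table names,
            // explicit column list, no SELECT *.
            using (var cmd = new SqlCommand("SELECT UserId, Name FROM dbo.Users", conn))
            {
                // Create the dependency before executing the command.
                var dependency = new SqlCacheDependency(cmd);
                conn.Open();
                object data = LoadUserInfo(cmd); // placeholder: materialize the results

                // The entry is evicted automatically when the underlying rows change.
                HttpRuntime.Cache.Insert("userInfo", data, dependency);
                return data;
            }
        }

        static object LoadUserInfo(SqlCommand cmd)
        {
            // Placeholder: execute the command and build the cached object.
            using (var reader = cmd.ExecuteReader()) { /* read rows */ }
            return new object();
        }
    }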
There are commercial distributed caches available for .NET other than Microsoft Velocity: NCache, Coherence, etc.
How about Velocity? It's a distributed cache that works between servers as well as between applications. It has PowerShell management and all sorts of documentation to get you going faster and be far more maintainable in the long-term.
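For a flavor of what that looks like from client code, here is a minimal sketch against the Velocity/AppFabric caching API (AppFabric naming); the cache name and key are placeholders, and the cache cluster itself is configured separately:

    using Microsoft.ApplicationServer.Caching;

    class SharedCacheExample
    {
        static void Main()
        {
            // Reads host/port configuration from the application config file.
            var factory = new DataCacheFactory();
            DataCache cache = factory.GetCache("default"); // placeholder cache name

            // Any application on any server in the cluster sees the same entry.
            cache.Put("userInfo:42", "some cached value");
            object value = cache.Get("userInfo:42");
        }
    }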
What about .NET Remoting (the System.Runtime.Remoting namespace), or COM/DCOM?

SQL Server Express slow performance

I am stress testing a .NET web application. I did this for two reasons: I wanted to see what performance was like under real-world conditions, and also to make sure we hadn't missed any problems during testing. We had 30 concurrent users in the application, using it as they would during the normal course of their jobs. Most users had multiple windows of the application open.
10 Users: Not bad
20 Users: Slowing down
30 Users: Very, very slow but no timeouts
It was loaded on the production server, a virtual server with a 2.66 GHz Xeon processor and 2 GB of RAM. We are using Win2K3 SP2. We have .NET 1.1 and 2.0 loaded and are using SQL Server Express SP1.
We rechecked the indexes on all of the tables afterward, and they were all as they should be.
How can we improve our application's performance?
This is just something that I thought of, but check to see how much memory SQL Server is using when you have 20+ users. One of the limitations of the Express version is that it is limited to 1 GB of RAM, so it might just be a simple matter of there not being enough memory available to the server due to the limitations of Express.
You may be running into concurrency issues, depending on how your application runs. Try performing your reads with the "nolock" hint (see the sketch after these tips).
Try adding in table aliases for your columns (and avoid the use of SELECT *), this helps out MSSQL, as it doesn't have to "guess" which table the columns come from.
If you aren't already, move to SPROCs, this allows MSSQL to index your data better for a given query's normal result set.
Try following the execution plan of your SPROCS to ensure they are using the indexes you think they are.
Run a trace against your database to see what the incoming requests look like. You may notice a particular SPROC is being run over and over: generally a good sign to cache the responses on the client if possible. (lookup lists, etc.)
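As a hedged illustration of the NOLOCK suggestion above (the table and connection string are placeholders): NOLOCK reads take no shared locks, so they don't block or get blocked by writers, at the cost of possibly reading uncommitted ("dirty") data, so use it only where that is acceptable.

    using System.Data.SqlClient;

    class ReportQueries
    {
        // Placeholder connection string.
        const string ConnStr = @"Server=.\SQLEXPRESS;Database=AppDb;Integrated Security=True";

        static int CountOrders()
        {
            using (var conn = new SqlConnection(ConnStr))
            // WITH (NOLOCK) reads without taking shared locks, so this query
            // won't block (or be blocked by) writers.
            using (var cmd = new SqlCommand(
                "SELECT COUNT(*) FROM dbo.Orders WITH (NOLOCK)", conn))
            {
                conn.Open();
                return (int)cmd.ExecuteScalar();
            }
        }
    }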
Update: it looks like SQL Server Express is not the problem, as they were using the same product in a previous version of the application. I think your next step is identifying the bottlenecks. If you are sure it is in the database layer, I would recommend taking a profiler trace and bringing down the execution time of the most expensive queries.
This is another link I use for collecting statistics from SQL Server Dynamic Management Views (DMVs) and related Dynamic Management Functions (DMFs). Not sure if it can be used in the Express edition.
Uncover Hidden Data to Optimize Application Performance.
Are you using SQL Server Express for a web app? As far as I know, it has some limitations for production deployment.
SQL Server Express is free and can be redistributed by ISV's (subject to agreement). SQL Server Express is ideal for learning and building desktop and small server applications. This edition is the best choice for independent software vendors, non-professional developers, and hobbyists building client applications. If you need more advanced database features, SQL Server Express can be seamlessly upgraded to more sophisticated versions of SQL Server.
I would check disk performance on the virtual server. If that's one of the issues, I would recommend putting the database on a separate spindle.
Update: move to a separate spindle, or upgrade the SQL Server version as Gulzar aptly suggests.
Make sure you close connections after retrieving data.
Run SQL Profiler to see the queries sent to the database. Look for queries that:
return too much data
are constructed poorly
are executed too many times
