What is the easiest set of tools to get started with Source Control, TDD, and CI for Microsoft.Net 2008/2010 [closed] - asp.net

I work on a team with three other developers and one business analyst writing internal business applications. We're primarily building apps in ASP.Net, and do so in a very 2003-ish way. It's like going back in a time machine. Although two of the other developers are amenable to learning new things, one of the developers is not. He's the type who thinks he's the strongest developer in town, and that if he doesn't understand a new tool within 5 minutes then he just needs to build his own. He also doesn't recognize agile development, TDD, or basically any non-Microsoft-blessed tool or method. He even considers source control from anything other than SourceSafe to be dangerous. To his credit, he's a brilliant programmer, just not someone interested in software development.
So the only way I can get consensus is if a tool is really easy to use. Once we hit a single snag, he'll lose faith in an "I told you so" sort of way.
So what set of tools should I use to get us into a modern source control system, TDD, and CI? The obvious choice in my situation seems like it would be Microsoft's TFS, but I doubt I could get our thrifty and apathetic management team to spend the extra money (they already think MSDN Pro is too much).
Basically, what is the easiest set of tools to get going with Source Control, TDD, and CI for a .Net 2008/2010 environment?

I wouldn't recommend dumping all these tools and methodologies on your team at once; take baby steps. Introduce one at a time. Some will come naturally.

There are many good choices, but I can personally recommend these:
Source control: Subversion with TortoiseSVN and Ankh or VisualSVN
Continuous Integration: CruiseControl.NET
TDD tools: NUnit + your mocking framework of choice (I use NMock, though it's a bit old-school). I agree with commenter Eric that TestDriven.NET is a must-have, particularly if you want to make this easy!
These are easy to get started with because they're all good products, reasonably to very well-documented, and widely-used (so it's easy to get help).
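To give a feel for how little ceremony is involved, here is a minimal NUnit test; treat it as a sketch (the OrderCalculator class is invented for illustration), but the attributes and assertion are standard NUnit, and TestDriven.NET will run it straight from the editor.

```csharp
using NUnit.Framework;

[TestFixture]
public class OrderCalculatorTests
{
    [Test]
    public void Total_includes_sales_tax()
    {
        // OrderCalculator is a hypothetical class from your own code base.
        var calculator = new OrderCalculator(0.08m); // 8% tax rate

        decimal total = calculator.Total(100m);

        Assert.AreEqual(108m, total);
    }
}
```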

It's always going to be difficult to introduce new tools if you can't build a consensus. Focus on building the consensus, rather than on the tools.
SVN is very good (with Ankh and TSVN), but it can be a bit surprising to people used to SourceSafe.
TDD is a technique, rather than a toolset, so you need books, blogs, etc. For tools to support it, NUnit or MSTest. Continuous Integration is a must-have. CruiseControl.Net is pretty good (though a bit difficult to configure initially). Consider also TeamCity.
Do you have a bug-tracking system?
Oh, and if your management team is that apathetic, consider quitting.
Update: you've said that they're not so much "apathetic" as "hands-off". Question: are they really hands-off, and will they let you move things along? Or are they "status quo" -- "it ain't broke, so don't fix it, and don't rock the boat"?

I think you can make a really, really good case that within the last two years Agile has become completely and totally embraced by Microsoft. I know for a fact that the CodePlex, MEF, and ASP.NET MVC teams are quite steeped in it. I also think that Visual Studio and parts of the Windows 7 team are Agile. Also consider that Visual Studio 2010 includes out-of-the-box refactorings that don't really make much sense outside the context of TDD, and that Agile is the default project management template for TFS, and a picture of a corporate culture quite different from the one of years past starts to emerge.
As for specific tools: TFS is OK for source control, but I find it very heavyweight and finicky. Others have mentioned Subversion, but if you're worried about MS blessings you might have better luck jumping straight to Mercurial. It's a more advanced SCM, but it is now supported natively by CodePlex and has excellent Windows integration. I've never used it, but I am in deep tool-love with its cousin Git.
Test-driven development: Start with MSTest; it's not as slick as anyone would like, but it's not the worst thing in the world. I would also recommend MbUnit, which has all of NUnit's features along with some good support for the integration tests that you will probably be writing by accident as you are starting out with testing. Oh, and if you have a customization freak, I would urge him to look at XUnit.Net.
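For reference, a bare-bones MSTest test looks like the sketch below (DiscountPolicy is an invented example class); the attributes are all the built-in Visual Studio 2008/2010 test runner needs, with nothing extra to install.

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class DiscountPolicyTests
{
    [TestMethod]
    public void Preferred_customers_get_ten_percent_off()
    {
        // DiscountPolicy is a hypothetical class from your own code base.
        var policy = new DiscountPolicy();

        decimal discounted = policy.Apply(100m, true); // true = preferred customer

        Assert.AreEqual(90m, discounted);
    }
}
```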
Mocking: The choice is basically Rhino Mocks or MoQ. Here's a quick intro I wrote for Rhino Mocks that goes over all the basics. That being said, the trade-off seems to be more documentation for RM versus a very mildly less error-prone syntax for MoQ.
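To give a rough feel for the syntax, here is a minimal Moq sketch; the IInvoiceService and ReminderPolicy types are invented for illustration, and Rhino Mocks covers the same ground with its own record/replay or AAA style.

```csharp
using Moq;
using NUnit.Framework;

// Both types below are invented for illustration.
public interface IInvoiceService
{
    decimal GetOutstandingBalance(int customerId);
}

public class ReminderPolicy
{
    private readonly IInvoiceService invoices;
    public ReminderPolicy(IInvoiceService invoices) { this.invoices = invoices; }

    public bool ShouldSendReminder(int customerId)
    {
        return invoices.GetOutstandingBalance(customerId) > 100m;
    }
}

[TestFixture]
public class ReminderPolicyTests
{
    [Test]
    public void Reminder_is_sent_when_balance_exceeds_threshold()
    {
        // Stub the dependency instead of hitting a real database or web service.
        var service = new Mock<IInvoiceService>();
        service.Setup(s => s.GetOutstandingBalance(42)).Returns(150m);

        var policy = new ReminderPolicy(service.Object);

        Assert.IsTrue(policy.ShouldSendReminder(42));
        service.Verify(s => s.GetOutstandingBalance(42), Times.Once());
    }
}
```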
Test Runners: If you start out with MSTest you'll notice that you can get a significant speed boost in your test runs by using TestDriven.Net, ReSharper, or CodeRush rather than the built-in test runner. That being said, don't underestimate the standalone test runners. They can be quite good every once in a while. I heavily recommend the Gallio Icarus runner, which comes with MbUnit.

I want to echo what George Mauer has said and suggest starting with MSTest for your unit testing. It's right there in the box with Visual Studio, which will help your cause as it's "MS blessed".
I would start with unit testing and take it from there; after a few months of "look how much easier our life is now that we have these tests automated", I'd take it up a notch. Consider adding something like Selenium or WatiN to the mix. Once you're rolling with that, get your CI server up. "Wouldn't it be great if we didn't have to start off all these tests manually?..."
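To make the WatiN suggestion concrete, here is roughly what a browser test can look like when written as an NUnit test; it's a sketch only, since the URL and control names are invented and the details vary a little between WatiN versions.

```csharp
using NUnit.Framework;
using WatiN.Core;

[TestFixture]
public class LoginPageTests
{
    [Test]
    public void Valid_user_sees_welcome_message()
    {
        // Hypothetical internal app URL and element names.
        // Note: WatiN needs an STA thread, which NUnit can be configured to provide.
        using (var browser = new IE("http://intranet/app/login.aspx"))
        {
            browser.TextField(Find.ByName("txtUserName")).TypeText("jsmith");
            browser.TextField(Find.ByName("txtPassword")).TypeText("secret");
            browser.Button(Find.ByName("btnLogin")).Click();

            Assert.IsTrue(browser.ContainsText("Welcome"));
        }
    }
}
```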
I guess a decent SCM might be a sticking point. SourceSafe is better than nothing. Perhaps start using Mercurial or Git yourself? Show those open to the change the benefits; eventually your stubborn dev will come around when others around him want to switch. Hopefully, he'll find it harder to shout if he's in the minority.
Check out http://www.viget.com/extend/effectively-using-git-with-subversion/ for ideas with mixing up different SCMs.
I also want to +1 mxmissile for saying to take things slowly. I think you'll find it very difficult to introduce all these changes in one go. It's a lot to take in at first if you're not used to it. Try to pick the part you're weakest on, or will add the most value and build up from there.
Good luck!

One tool that got me hooked on TDD is TestDriven.Net, which puts the test results in the Output window. I mapped this to the F8 key and the productivity gain is superb: write a test, press F8, and see the results in the Output window.
One suggestion I also have is to differentiate between having unit tests and doing TDD. I have found that TDD can be hard to push onto a team, while unit, integration, or functional tests are an easier sell. Having a bunch of tests that saves an hour of going through a manual test day after day is a big win.
After a while people will start to appreciate some new ideas if it is helping them in their daily life. Then you'll be able to introduce a build server, and move away from SourceSafe.

In .NET environments, Microsoft Visual SourceSafe is most frequently used (but it costs money). Next to that you can opt for SVN or Git. Git is more recent (and gaining popularity), and it's easier to work with than SVN once you get it.
http://git.wiki.kernel.org/index.php/GitSvnComparison might help with your decision.

Related

What is the best way to get up to speed on BizTalk? [closed]

How should an experienced .NET & SQL developer go about becoming a BizTalk expert for a project starting in 1 month? How should I spend my limited time to gain some practical skill & knowledge in BizTalk so I can "walk the talk"?
I am self employed, and would not be willing to spend more than USD300. I have the book "Professional BizTalk Server 2006" by Wrox, but have not found it to be a particularly good learning resource (very dry, needs more real world examples).
The BizTalk Virtual Labs in MSDN are a pretty good place to start. Pluralsight also has several good BizTalk courses, and their online subscription isn't too expensive, so it would likely be a good option.
I agree with everything written thus far. All solid info.
I have a few add-ons, coming from a fellow freelancer who has been working with BizTalk since 2002:
Unit testing.
It's not easy to do, but check out BizUnit, a CodePlex-based toolset written and maintained by Kevin Smith, one of the early BizTalk heroes :-) http://bizunit.codeplex.com/
Deployment / getting things into production
But also keep in mind that none of the day-to-day development stuff will prepare you for the part of the project where you have to deploy the app and make sure that it is "manageable" by operations. This can be quite complex, and is a topic in its own right.
Check out Apress' Pro BizTalk 2009; it's got a decent (IMO) chapter on this.
The entire development process around BizTalk.
The first two chapters of the same book will give you a good impression on what a BizTalk project is about. Where to use it, and where to not use it, how to organize projects, and name your stuff. Really a good collection of info that you would only get by reading 5-6 years of blogs back in time :-)
And one last thing. Depending on the roles on the project, you might be asked to optimize and tune BizTalk. And if they don't ask you, make sure that you ask whether others have done it, because it has to be done. BizTalk should always be tuned towards what it is supposed to do: low latency vs. high throughput, tuned according to hardware, correct setup and config of the network around the SQL boxes, and so on. This can be hairy stuff, and you should be careful not to jump into it before reading up on it all. But it's a subject we as freelancers are often expected to be able to deal with ... so I thought I might bring it up.
Example ... BizTalk x64 processes on an x64 box run really badly out of the box, actually worse than the x86 processes. The 64-bit processes need to be tuned to really use all the memory that is available to them.
Anyways ... a bag of mixed tips and I hope you can use some of them! And good luck! It can be a tough start, but if used right, BizTalk can be a great product/toolset.
And remember ... if it is ugly, or hard, or both, you are doing it wrong. And don't be afraid to dive into .NET code and bolt it onto the BizTalk box. We all do it ... some just won't admit to it :-D
Start with the advice of tomasr.
Then, try to build something as real as possible. BizTalk is the kind of product where everything seems fine when you read the book and follow the examples; then you sit down to do something and you are thinking "what do I do now?"
As per Thomas and Shiraz - set up an environment and get your hands dirty. If you haven't done so already, download and install BizTalk Server 2010 Developer Edition
But just to temper your expectation, IMHO expertise in BizTalk (or any other EAI / BPM / ESB product) can take years to accumulate.
It isn't clear whether you are developing for a client with an established BizTalk installation, or if this is the client's first BizTalk deployment. If it's the latter, one thing not to be underestimated is that the operational considerations of running a production BizTalk environment (performance, redundancy, reliability, auditing, tracking, monitoring with SCOM, etc.) are as complex as the development and testing - but understanding this will be important to 'walk the talk'.
W.r.t. development, start with a simple EAI-type mapping project, then work your way through the SDK samples, progress to some common messaging patterns (e.g. batching with an aggregator), and then move into the BPM-type orchestrations. You can probably leave BAM and the BRE for later.
Good luck!
+1 to tomasr for mentioning the virtual labs. Getting hands-on is definitely the way to go, as Shiraz Bhaiji also mentions. Hopefully you're not starting with BizTalk 2006, and can go with the latest: 2010. If that's the case, you can get the Developer Ed. of BizTalk 2010 for free now (see link from nonnb).
I'd also recommend Richard Seroter's book: 'SOA Patterns with BizTalk Server 2009' (available on Amazon.com). There are many ways to do the "wrong" things with BizTalk, and this book does an excellent job of walking through both the how and the why of building BizTalk solutions (with the code samples available from the publisher's site). And yes, it pretty much takes a whole book to go through it all. It's a good (more readable) companion to the Pro BizTalk 20xx series (which is generally better for very specific questions/tasks).

Experiences with Test Automation FX [closed]

Looking to add UI testing to my WinForms 3.5 project. Currently using MSTest for unit testing and MSBuild to build it.
One option I am looking at is Test Automation FX.
The product seems to be a bit new and not fully polished, but it seems to work. So, I'm curious if anyone else is using and has good or bad things to say about it.
It is quite a bit cheaper in price ($450) than Test Complete ($2000), so I also am trying to figure out what is lacking or missing, if anything, from Test Automation FX.
I have gone recently through the process of choosing a GUI testing solution, and finally decided to go with TestAutomationFX. Here are the main reasons I made this choice:
It creates real code (in my case C#), which is invaluable for me: for maintainability, archivability, flexibility and so on. It is much easier to write in C# (I can ask my developers for support) than in a proprietary script language I would have to learn from scratch (or worse: endless grids of non-maintainable dropdown boxes). It also lets me build a good testing framework
It has seamless integration with NUnit (which my team uses for unit and integration tests). My data-driven tests come from the same CSVs (see the sketch after this list), and GUI test reports are just appended to unit test reports, granting easy archiving and maintenance
It has much better recognition of the complex UI objects my developers use (Telerik, Infragistics, home-made): 25% of my clicks are in x/y mode, versus 67% with TestComplete or Ranorex
Their sales engineers gave me excellent support (at least during the evaluation period)
It has no major bugs nor complex license setup (yes, I'm looking at you, TestComplete guys, see my other post), no runtime license issue, no virtual machine licensing problems either
(though this was not that important to me), it's four times cheaper than other commercial solutions
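To illustrate the CSV point above, this is the kind of data-driven NUnit pattern I mean; the file name and column layout are invented, and the real recorded GUI steps would go where the placeholder assertion sits.

```csharp
using System.Collections.Generic;
using System.IO;
using NUnit.Framework;

[TestFixture]
public class DataDrivenLoginTests
{
    // Hypothetical CSV: each line is "userName,password,expectedMessage".
    private static IEnumerable<TestCaseData> LoginCases()
    {
        foreach (string line in File.ReadAllLines(@"TestData\logins.csv"))
        {
            string[] parts = line.Split(',');
            yield return new TestCaseData(parts[0], parts[1], parts[2]);
        }
    }

    [Test, TestCaseSource("LoginCases")]
    public void Login_shows_expected_message(string user, string password, string expected)
    {
        // The generated GUI test code would drive the application here;
        // this placeholder only shows how CSV rows feed the test cases.
        Assert.IsNotNull(expected);
    }
}
```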
On the other hand, there is a medium flaw in the application:
The mapping system (i.e. mapping AUT-object properties to test-application objects) is really touchy: code refactoring needs special attention. I overcome this by committing to my VCS before every code refactoring. Anyway, does TestComplete even provide the option of code refactoring?
OK, as you can see, I'm pretty enthusiastic about this solution. I've been using it for only a few days, and may run into bigger problems later. But right now it gives me exactly what I wanted, so let me be happy :)
The company I work for uses SilkTest, which works very well. In general, when using automated testing, you will be doing lots of regression testing. What is more important is that when you've modified an existing project, the test software must still be able to run those tests without any errors. (Or, with the errors you'd expect.)
But the market does have lots and lots of other test solutions. In the past, I even saw a test setup which required two computers and additional hardware. The hardware would connect to the monitor, mouse and keyboard of the test system. The other end would connect to a special extension card in the test server. The hardware was there so the server could send keyboard commands to the test system and record anything that happened on the screen. With some additional OCR software, it was very well capable of analysing any errors. Then again, it had a price of six digits and to be honest, I'd rather buy a Porsche for that price and probably would have some cash left to bring two beautiful dates with me while driving through the boulevards in Nice, France...
There's a Wiki page with an overview of all kinds of test software. It doesn't compare them, but you can find Test Automation FX there, although it doesn't provide much information. It seems limited to testing Windows GUIs only.
TestComplete provides more information. Then again, comparing the Wiki's it also supports a lot more. Really a lot more. Enough to explain why it's that expensive...
I have just started to evaluate different GUI automation testing tools. I have looked at Test Automation FX, Ranorex and TestComplete. And the prices for the software are in that order.
This is some of my conclusions:
Test Automation FX - Coded in C#, fully VS integrated. But it is very slow in finding components, takes a lot of memory, and doesn't fully support DevExpress components
Ranorex - Coded in C#. Has a studio for maintaining tests but can be fully integrated into VS. Has better object support, and you can find objects in your software by regex expressions on several things. Has some problems with DevExpress components but is rather fast to work with.
TestComplete - Uses its own script language; VBScript is the easiest one (C#Script is just awkward notation). This has really good support for DevExpress components and runs the tests really fast. But it is very expensive
Right now I don't know which I should use. Ranorex is a little better than Test Automation FX, but both lack full support for DevExpress components. TestComplete is nice, but it introduces a new language to the development and is very expensive. But its test scripts are small and the program has more logic for finding where to click.
I have evaluated Test Automation FX. Although it recognizes all the controls of my application (we use 3rd-party controls from Infragistics, i.e. NetAdvantage controls for WPF), it is very slow in recognizing the controls, and even playback time is quite slow compared to QTP or Ranorex. I would recommend Ranorex over Test Automation FX.

Fitnesse vs any other subsystem testing tool [closed]

We are currently using FitNesse for subsystem testing.
We are having a lot of issues using the tool, a few to mention:
Development time for writing fixtures is more than for writing the actual code
Issues around check-in of the DLLs so that QA can test them
Issues in running FitNesse for a project which uses NHibernate
Limited help online
We are planning to use some other tool to do the testing.
A few options which we know of are:
SoapUI
StoryTeller
I am not sure whether we will have similar problems with these tools.
It would be great to know if someone has experience using these tools and could guide us.
In our project we have adopted TDD, so we have NUnit for unit testing.
It would be great if anyone is aware of tools/ideas which could extend NUnit for subsystem testing as well.
Component testing tools are all about calling functions. Your tests cause functions to be called in "fixtures" that then call into the SUT. Any tool based on this premise will encounter the problems you reference above.
However, most of those problems are manageable. For example, you should not be writing lots of fixtures. If you are, something is wrong. Secondly, your fixtures ought to be little more than wiring code to call the APIs in your application. If your fixtures are doing significant work, then something is wrong.
In most FitNesse environments the number of fixtures is rather small. For example, there are over two hundred acceptance tests for FitNesse itself, but the number of fixtures is on the order of a dozen, and they are all relatively simple.
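To illustrate how thin a fixture can stay, here is a sketch of a ColumnFixture-style fixture for the .NET fit server; PayrollCalculator and the column names are invented, and the exact base class depends on which .NET FitNesse runner you use.

```csharp
// Maps a FitNesse table's columns onto public fields plus a calculated cell.
// PayrollCalculator is a hypothetical class in the system under test.
public class OvertimePayFixture : fit.ColumnFixture
{
    public double hoursWorked;
    public double hourlyRate;

    public double Pay()
    {
        // The fixture only wires the table to the application; no real logic lives here.
        return new PayrollCalculator().CalculatePay(hoursWorked, hourlyRate);
    }
}
```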
Get help on the fitnesse#yahoogroups.com site. The folks there are usually very responsive to questions.
If you can communicate with your software using text, then I have had success on past projects rolling my own framework using expect.
The framework I cooked up stored tests as XML files, using a simple xUnit style markup. The xml files were then transformed into executable tests using a stylesheet. I ended up transforming the tests into Tcl/Expect, but you could transform them into anything. In fact, if you wanted, you could transform them into multiple languages, depending on your needs.
Several people have kindly reminded me (in the same way you remind your poor doddering grandfather about the drool on his chin) that we are in the 21st century when they inquire why I would choose Tcl over some more modern language. As it turns out, for the purposes of this kind of testing, I haven't yet found a better choice. The Tcl language still kicks butt in this area. Trust me, I didn't wake up one day and say to myself "self, what I need is a test framework implemented in a scripting language everyone will hate!"
Believe it or not, I really was looking for a tool, any tool, that had the following characteristics:
Cross platform. This was non-negotiable. We do a lot of cross platform development and we already use WAY too many tools that don't support cross platform development.
Simple syntax. Say what you want about Tcl, but the syntax is very regular. I knew that some native code would probably creep even into the XML files (and originally it was Tcl only, no XML) and I wanted the syntax to be comprehensible to a non-programmer. This simplicity is a core strength of Tcl. As it turns out, it also made transforming the XML easier too.
Free. My favorite price ;-)
Writing tests as simple xml files allowed non-programmers to write customer acceptance level tests - no programming required.
Easily extended.
I did not set out to home-grow this to the extent I have. Initially, I looked at established test frameworks like DejaGnu and android. Mostly they had way too many features. They were so feature-laden that I didn't think they would be easy for a project to start using without a lot of up-front training. Looking at DejaGnu got me interested in Tcl in general, and after a brief look at tcltest, I almost gave up. Both DejaGnu and tcltest assume you are an advanced Tcl scripter, which I didn't think anyone at my company ever would be. In addition, I wanted the test framework (if possible) to support an xUnit type of test framework, and neither of these tools did.
Eventually I found TclTkUnit, a Tcl based testing framework that is designed along xUnit lines. It was only a short leap of logic to realize I could run TclTkUnit in Expect instead of tclsh and get everything I needed.
As it ended up getting used more, I added another stylesheet to render the XML files nicely in a web browser. The test framework generated its own documentation.
On another project we needed a very basic sim/stim environment to emulate a person throwing switches and pushing buttons on a piece of hardware we didn't have. It only took a few hours to hack the test framework to function as a simulator. Creating the framework took some work, but we felt that it did pay benefits in the long run. I really believe that these types of unforeseen consequences of creating your own tools are why people in the agile community, and XP in particular, have always been such strong advocates.
We have adopted a Fitnesse-based but practically-code-free approach using GenericFixture (google for Anubhava to find his wordpress site) for Fitnesse.
What this allows us to do is to create "executable test narratives" using a language that is friendly to the business-side (as opposed to the technical-side). This language, which is very easily defined, practically without coding, in Generic Fixture, is called a DSL (domain specific language). So we can write our test narratives using e.g. medical terms or even in a language other than English. Basically what we get is transforming our Use Cases into executable narratives.
We are starting to use it in a large project (15 ppl for 2 years) and it seems (so far) to have a good future.
It easily allows Test Driven Development or test-creation after development (traditional approach).
It is wiki-based (FitNesse) and its versioning and refactoring functionality has so far proven sufficient.
I can give more info if anyone is interested.
best regards,
Aristotelis.
We use unit-testing frameworks like NUnit to drive our subsystem tests as well - the tests don't care how they are run. It doesn't have fitnesse's document-based approach, though.

Does participating in a System Verification Test team help you become a better programmer?

I have been developing applications for 9 years now - mainly Java. Now I am asked to participate in the SVT team for the next release. Overall this means installing complex system setups and running specific user scenarios on these setups, as well as doing long runs and load runs.
Overall I am positive about it, as I will learn something new. But I am also afraid of losing some grip on and knowledge of programming, because I will not be doing it a lot.
I know that programming in side projects, such as helping with open source projects, is one alternative, but finding the time on top of family life and a full-time job is not that easy.
What do you think - does doing concrete testing work help you become a better software engineer?
Thanks in advance,
Michael
Testing isn't separate from programming.
You can still program automated systems so you can have regression testing. From unit tests to really complex automated systems, the best I know is Selenium, which generates code you can use to build testing scripts in most languages.
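For example, the C# that Selenium's recorder exports looks roughly like the sketch below (Selenium RC style, with an invented URL and locators), and you maintain it like any other NUnit code.

```csharp
using NUnit.Framework;
using Selenium;

[TestFixture]
public class SearchPageTests
{
    private ISelenium selenium;

    [SetUp]
    public void Start()
    {
        // Assumes a Selenium RC server running locally; URL and locators are invented.
        selenium = new DefaultSelenium("localhost", 4444, "*firefox", "http://intranet/app/");
        selenium.Start();
    }

    [Test]
    public void Searching_returns_results()
    {
        selenium.Open("/search.aspx");
        selenium.Type("txtQuery", "invoices");
        selenium.Click("btnSearch");
        selenium.WaitForPageToLoad("30000");
        Assert.IsTrue(selenium.IsTextPresent("Results"));
    }

    [TearDown]
    public void Stop()
    {
        selenium.Stop();
    }
}
```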
There are other tools for non-webapps. But I personally believe that testing is quite far away from "stopping coding", unless you're just doing user point-of-view testing.
You can also do error injection, which will make you write small singletons to inject into the memory of your application.
So you can code while testing ;) and learn new stuff also.
Having been in a testing team i think it really helps, because you'll learn to exploit code easily, which will reflect when you build your own API or App at a later date.
I would say it depends on your skill and temparament. Programming knowledge will serve you well while testing. At the same time, I know that it needs a different approach and mindset and is on a completely different career track. You can always keep up your programming skills by writing code for a project you like (even if you have to make one up).

What are your experiences with Windows Workflow Foundation?

I am evaluating WF for use in line of business applications on the web, and I would love to hear some recent first-hand accounts of this technology.
My main interest here is in improving the maintainability of projects and maybe in increasing developer productivity when working on complex processes that change frequently.
I really like the idea of WF, however it seems to be relatively unknown and many older comments I've come across mention that it's overwhelmingly complex once you get into it.
If it's overdesigned to the point that it's unusable (or a bad tradeoff) for a small to medium-sized project, that's something that I need to know.
Of course, it has been out since late 2006, so perhaps it has matured. If that's the case, that's another piece of information that would be very helpful!
Thanks in advance!
Windows Workflow Foundation is a very capable product but still very much in its 1st version :-(
The main reasons for use include:
Visually modeling business requirements.
Separating your business logic from the business rules and externalizing rules as XML files.
Separating your business flow from your application by externalizing your workflows as XML files.
Creating long running processes with the automatic ability to react if nothing has happened for some extended period of time. For example an invoice not being paid.
Automatic persistence of long running workflows to keep resource usage down and allow a process and/or machine to restart.
Automatic tracking of workflows helping with business requirements.
WF comes as a library/framework so most of the time you need to write the host that instantiates the WF runtime. That said, using WCF hosted in IIS is a viable solution and saves a lot of work. However the WCF/WF coupling is less than perfect and needs some serious work. See here http://msmvps.com/blogs/theproblemsolver/archive/2008/08/06/using-a-transactionscopeactivity-with-a-wcf-receiveactivity.aspx for more details. Expect quite a few changes/enhancements in the next version.
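As an illustration of what writing your own host amounts to in WF 3.x, this is the kind of minimal console host the Visual Studio workflow templates generate; MyWorkflow stands in for whatever workflow type you define.

```csharp
using System;
using System.Threading;
using System.Workflow.Runtime;

class Program
{
    static void Main()
    {
        using (WorkflowRuntime runtime = new WorkflowRuntime())
        {
            AutoResetEvent finished = new AutoResetEvent(false);

            // Signal the host thread when the workflow instance completes or terminates.
            runtime.WorkflowCompleted += delegate { finished.Set(); };
            runtime.WorkflowTerminated += delegate(object sender, WorkflowTerminatedEventArgs e)
            {
                Console.WriteLine(e.Exception.Message);
                finished.Set();
            };

            // MyWorkflow is a hypothetical sequential workflow defined in this project.
            WorkflowInstance instance = runtime.CreateWorkflow(typeof(MyWorkflow));
            instance.Start();

            finished.WaitOne();
        }
    }
}
```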
WF (and WCF) are pretty central to a lot of the new stuff coming out of Microsoft. You can expect some interesting announcements during the PDC.
BTW keeping multiple versions of a workflow running takes a bit of work but that is mostly standard .NET. I just did a series of blog posts on the subject starting here: http://msmvps.com/blogs/theproblemsolver/archive/2008/09/10/versioning-long-running-workfows.aspx
About visually modeling business requirements.
In theory, this works quite well with a separation of intent and implementation. However, in practice, you will drop quite a few extra activities on a workflow purely for technical reasons, and that sort of defeats the purpose, as you have to tell a business analyst to ignore half the shapes and lines.
Related question: When to use Windows Workflow Foundation? My answer there:
You may need WF only if any of the following is true:
You have a long-running process.
You have a process that changes frequently.
You want a visual model of the process.
For more details, see Paul Andrew's post: What to use Windows Workflow Foundation for?
Please do not confuse or relate WF with visual programming of any kind. It is wrong and can lead to very bad architecture/design decisions.
So, if you have such requirements, then WF is a good candidate. Of course it is relatively complex, but note that the problems it is trying to solve are also complex (and sometimes very complex). IMHO, it is very complex, for example, to dehydrate/rehydrate objects that have event handlers attached (with events that can be triggered when the object is not in memory).
I cannot judge what you mean by "small to medium-sized project", but in general I would say that if your project has at least two requirements from the above list, then you can consider WF as a solution.
We've used WF in a large-ish SharePoint application and I can say it's OK. It has lots of power and flexibility, and, as Kevin mentions, once you grok the underlying concepts of workflows, you can do pretty much anything you want with it.
On the other hand, it has some really serious issues, like lack of versioning, which can really hurt your application in the future. We've been forced to deploy up to 3 parallel versions of the same workflow named xxx-v1, xxx-v2 and xxx-v3 to keep older instances running and have new instances use the updated versions. A real pain in the ass. Oh, and there are also some really non-intuitive concepts in there (correlation tokens, wtf??)
We had a project at work that I was involved in using Workflows.
The idea (from management) was that we programmers would write the Workflow activities along with the "engine" and framework. Then non-programmers would take care of all the rest by compiling their own workflows into DLLs, which the engine would automatically load.
Management was sold on this idea of non-programmers using Workflow to help develop software, and it was pretty much a complete waste of time. The problem we were trying to solve with this project was relatively complex and we knew from the very beginning that the software would have to be modified almost constantly (its calculations were dependent on other companies and governments).
The end result was that we were unable to make the Workflow modules generic enough for anyone else to use. So the programmers were the ones who were forced to work with the Workflows, and all the Workflows did was get in our way.
I've been using Workflow 4.0 for the last few months and although mostly impressed, I've found it extremely hard to learn.
For the most recent version (the one that comes with .NET 4.0 RC), there is next to no documentation on the web, in any books, or in any training courses. I've only found articles relating to the now-defunct 3.0 version. Even the MSDN documentation is thin on the ground.
The workflow designer is not as intuitive as it should be by any means so learning is very hard. I've had to rely on answers from a single person on StackOverflow (thanks by the way Maurice!) - and I would be stuffed without his help.
So in summary, I think it has potential, but you would be quite mad to learn it yet - wait for more training, documentation and books, otherwise you will be going into it blind!
Last year we completed a working application with WF, now used as the backbone of an unbelievably huge system which is used by a very big bank for its mortgage process. The process has many steps, starting from customer application and running through to approval of credit.
Although it was a success, there were so many problems and crises all along the way. And it wouldn't be worth the trouble for any smaller-sized project.
I consider MS WF a low-level workflow library rather than a fully fledged enterprise workflow product such as K2. It will enable you to build a workflow-enabled application, but is not in itself a workflow application. My experience of it in this capacity has been positive, although we have had to build a lot of our own infrastructure around it (a pub/sub framework, a workflow lifetime manager, etc.). A lot of the documentation out there is fairly simplistic and does not cover building up an enterprise workflow application based on MS WF.
Hard to learn. Quite flexible. Not to be confused with a visual tool for end users; only for programmers. Not sure if I like the dependency property approach.
It really depends on what you want to do with it. I've only used it a little, but compared to more mature products like MetaStorm (I know technically it's a BPM, but there is still a workflow component), Process Choreographer, and IBM MQ Workflow, there's no comparison. It's just not mature enough. On the other hand, it's free where the others are not, and it can probably get the job done. I don't know if I would place a multi-million dollar operation on it, but with smaller ones, I'd give it another shot. The real hurdle you are going to face is the change in thought process it requires. If you don't have developers that have worked with state systems before, that can be a real hurdle.
Brian, I can't reply to your comment, but anyway, by versioning I mean making changes to the underlying code of the workflow without breaking already-running instances, and gracefully applying updates to existing workflows. I'm not sure about 'stock' WF, but at least in a SharePoint environment there's no concept of workflow versions, so new versions have to be deployed as completely different workflows, which becomes a maintenance nightmare.
This has nothing to do with 'rehydration', rehydration is the process by which you bring a 'dormant' workflow back to activity after some event or change in state. That is handled transparently by the workflow runtime.
WF is integrated into SharePoint (WSS 3.0), and I have created quite a few workflows for various SharePoint websites, so I can speak to my experience of WF in SharePoint. Compared with other workflow frameworks, WF scores well. It's stable (I haven't experienced any mysterious errors), workflows are fairly easy to design (thanks to the workflow designer in Visual Studio), and you can use not only sequential but also state-machine workflows.
It's not perfect, of course, and a developer will definitely need some time to understand the concepts (e.g. the Activity Model); but it's definitely usable - even for "small tasks".
Never tried WFF, but I remember reading this article about WFF by Leon Bambrick where he basically says the whole genre of software development tools is nonsense. Might help you decide one way or the other.
