I am exploring whether Workflow Foundation 4.0 is stable enough to start developing on, but the documentation I've seen so far is mysteriously silent about why there are no built-in Transaction and SQL Tracking services. They were available in WF 3.5 and seemed reasonably stable. Any clues? Did MS run out of time to ship WF 4.0 on schedule, or was the whole concept so broken in 3.5 that they decided to scrap them?
I know there are lots of links and hints pointing to writing a custom (SQL) tracking participant, but then what is the point of a "framework"? Moreover, there's no way to query the tracked data. And nothing about the Transaction service! So how do we keep the WF persistence data and application data consistent? Am I missing something here?
Some unsatisfactory answers on "missing" SQL tracking in WF4:
- http://social.msdn.microsoft.com/Forums/en-US/wfprerelease/thread/8cfe598a-a400-4804-92ad-d68aa444d8f3
[I have a few more links, but couldn't post them here because new users can post only one hyperlink per question :( ]
Any help will be greatly appreciated :)
SQL tracking is missing; however, AppFabric does include tracking if you go the workflow services route.
Transactions are supported. There is the TransactionScope activity for short-running transactions and a CompensatableTransaction for long-running transactions. There is also the option of creating activity extensions based on PersistenceIOParticipant, where you can save extra data during the transaction used to persist the workflow.
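For the short-running case, here is a minimal sketch of the WF4 TransactionScope activity (the WriteLine body is just a stand-in for your real transactional work):

    // Minimal WF4 sketch: work inside TransactionScope.Body runs in one transaction.
    using System.Activities;
    using System.Activities.Statements;

    class Program
    {
        static void Main()
        {
            Activity workflow = new Sequence
            {
                Activities =
                {
                    new TransactionScope
                    {
                        // Replace WriteLine with the activities that do your real work;
                        // they all enlist in the same ambient transaction.
                        Body = new WriteLine { Text = "Running inside the transaction" }
                    }
                }
            };
            WorkflowInvoker.Invoke(workflow);
        }
    }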
According to MSDN, the SQLTrackingService is still supported (see the bottom of the below article):
http://msdn.microsoft.com/en-us/library/system.workflow.runtime.tracking.sqltrackingservice.aspx
You will have to add references to System.Workflow.Runtime.dll (and probably System.Workflow.ComponentModel.dll) to your project. Make sure you are targeting the full .NET 4 framework in your project properties (i.e., not the .NET 4 Client Profile). Both DLLs can be found in the v4 framework directory.
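A minimal hosting sketch, assuming you have already created the tracking database with the schema and logic scripts that ship with the framework (the connection string is a placeholder):

    // Hedged sketch: hosting the WF 3.x runtime from a .NET 4 project and
    // registering the SqlTrackingService.
    using System.Workflow.Runtime;
    using System.Workflow.Runtime.Tracking;

    class TrackingHost
    {
        static void Main()
        {
            var runtime = new WorkflowRuntime();
            runtime.AddService(new SqlTrackingService(
                "Data Source=.;Initial Catalog=WFTracking;Integrated Security=SSPI"));
            runtime.StartRuntime();
            // ... create and run your WF 3.x workflows here ...
            runtime.StopRuntime();
        }
    }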
I am relatively new to SignalR, and every tutorial seems to be for a chat application or some variant. That is good for getting to grips with it, but I am looking to have a grid that updates automatically on a database change. I have even gone through Pluralsight's tutorial, which shows this as a sample, but all of its practical exercises are for a chat application.
I do understand that this is what a search engine is for, but I have exhausted that channel; as above, I found plenty of tutorials, but all seem to be for a chat application.
Has anyone else noticed this, and if so, has anyone found any good tutorials that deal with real-time grids? Any help would be greatly appreciated.
You can use SqlDependency, but it tends to be heavily tied to SQL Server (thus the name). I think it's bad practice to make a solution dependent on the database; sooner or later you will also want updates that originate from the domain layer only, etc.
I have made an EventAggregator proxy for SignalR that can pick up events decoupled from the domain and database. Check it out here:
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/wiki
Install it with NuGet:
Install-Package SignalR.EventAggregatorProxy
It has both a .NET and a JavaScript client; check the wiki for how to set it up, and here is a demo:
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/tree/master/SignalR.EventAggregatorProxy.Demo.MVC4
It's true that most of the tutorials demonstrate a chat application, since that's the easiest way to show off the technology. Once you move to a more complex sample, the interesting parts become design patterns that don't really have anything to do with the technology itself.
Here's a tutorial that demonstrates how to update the client based on changes from a server:
http://www.asp.net/signalr/overview/signalr-20/getting-started-with-signalr-20/tutorial-server-broadcast-with-signalr-20
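At its core, the server-broadcast pattern is a hub whose context you can grab from any server-side code; GridHub and addRow below are illustrative names, not taken from the tutorial:

    // Minimal SignalR 2.0 sketch: push rows to every connected client.
    using Microsoft.AspNet.SignalR;

    public class GridHub : Hub
    {
    }

    public static class GridBroadcaster
    {
        // Call this from any server-side code when a row changes.
        public static void BroadcastRow(object row)
        {
            var context = GlobalHost.ConnectionManager.GetHubContext<GridHub>();
            context.Clients.All.addRow(row); // clients register an addRow handler
        }
    }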
Here's a demo that uses SqlDependency to update the client:
http://techbrij.com/database-change-notifications-asp-net-signalr-sqldependency
(That demo uses SignalR 1.0, so see the following guide for how to upgrade it to 2.0.)
http://www.asp.net/signalr/overview/signalr-20/getting-started-with-signalr-20/upgrading-signalr-1x-projects-to-20
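The gist of the SqlDependency demo, as a hedged sketch (table and column names are illustrative; it reuses the GridBroadcaster above, requires Service Broker to be enabled on the database, and SqlDependency.Start must be called once at application start):

    // Hedged sketch of the SqlDependency pattern: a dependency fires only once,
    // so re-run the query to re-register it, then broadcast over SignalR.
    using System.Data.SqlClient;

    public class OrdersWatcher
    {
        private readonly string _connectionString;

        public OrdersWatcher(string connectionString)
        {
            _connectionString = connectionString;
        }

        public void Watch()
        {
            using (var connection = new SqlConnection(_connectionString))
            using (var command = new SqlCommand(
                "SELECT [Id], [Name], [Qty] FROM dbo.Orders", connection))
            {
                connection.Open();
                var dependency = new SqlDependency(command);
                dependency.OnChange += (s, e) =>
                {
                    GridBroadcaster.BroadcastRow(null); // tell clients to refresh
                    Watch();                            // re-register the dependency
                };
                using (command.ExecuteReader()) { }     // executing registers the dependency
            }
        }
    }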
We are upgrading to Tridion 2011 SP1, and as part of the Tridion search implementation we are using FS4SP (FAST Search for SharePoint 2010).
In the proposed implementation, the search environment consists of the following servers:
FS4SP
FISE
Can someone guide us on how to push content to FAST from Tridion and how to retrieve it?
(For various reasons, we are not considering having FAST crawl the website.)
Which APIs can be used for this implementation?
If you don't want to use the crawling approach, you will need to create a custom deployer; please take a look at this other article:
How can we integrate Microsoft FAST with SDL Tridion 2011 SP1?
Alternatively, if you don't have a development team who is familiar with Java, you might consider creating a .NET application which updates your FAST index based on either a file system or database trigger when your pages or components are published, updated or deleted from your broker repository.
You will probably want to create XML for FAST and have the Custom Deployer (or Event System) send the content to FAST.
First create the FAST XML that works and write a sample app so you can insert it into the FAST index from either a .NET or Java application. This does not yet involve Tridion.
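For example, a hedged sketch of building such a document with System.Xml.Linq (the element names are purely illustrative; check the FAST content feeding documentation for the real schema):

    // Build an indexable XML document for one published page.
    using System.Xml.Linq;

    class FastFeedBuilder
    {
        static void Main()
        {
            var doc = new XElement("document",
                new XElement("id", "tcm:5-123-64"),          // Tridion item URI
                new XElement("title", "Example page title"),
                new XElement("url", "/en/products/example.html"),
                new XElement("body", "Indexable page text goes here"));

            doc.Save("fast-feed.xml"); // feed this file to your FAST sample app
        }
    }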
Then write your Custom Deployer or Event System and pass the XML to FAST.
If you are using a Custom Deployer approach, I would suggest contacting Tridion Professional Services if you have not done this before or are not a Java programmer. The new Tridion 2011 Storage API provides new opportunities for the Custom Deployer. In the meantime, I would suggest appending the FAST XML to the end of the normal page content, surrounded by some markers, and having your custom deployer pull it out of the page output, send it to FAST, then remove it from the output before continuing.
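A hedged sketch of that extract-and-strip step, shown in C# for brevity (a real custom deployer would do the same in Java; the marker comments are illustrative):

    // Pull the embedded FAST XML out of the rendered page output and strip it
    // before the content continues through deployment.
    using System.Text.RegularExpressions;

    static class FastXmlExtractor
    {
        private const string Pattern = @"<!--FAST-START-->(?<xml>.*?)<!--FAST-END-->";

        public static string StripAndCapture(string pageContent, out string fastXml)
        {
            Match match = Regex.Match(pageContent, Pattern, RegexOptions.Singleline);
            fastXml = match.Success ? match.Groups["xml"].Value : null;
            return Regex.Replace(pageContent, Pattern, string.Empty, RegexOptions.Singleline);
        }
    }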
This is a fairly difficult challenge for those who do not have serious Content Delivery / Deployer / Java skills. However, if you want to go for it yourself, I would suggest taking at least two weeks to research existing solutions and experiment with the API.
Using the Event System might be a little easier, but your success or failure message will not appear in the Publish Queue, and if the search index fails to update you can only log the failure, not pass the info back to users.
I hope you can help me out with some advice; it'd be greatly appreciated.
I've been using EF 4.0 for a while now, using the following object context management technique: http://dotnetslackers.com/articles/ado_net/Managing-Entity-Framework-ObjectContext-lifespan-and-scope-in-n-layered-ASP-NET-applications.aspx . I have a fairly simple setup with a web project connecting to a BLL connecting to a DAL. The web and BLL layers reference the DAL entity objects. It's been functioning fine, but it seems very slow. It's an ASP.NET WebForms application that uses an existing database model (i.e., not code-first) and points to a SQL Server 2005 DB.
Anyway, I'm now revisiting the architecture as we're getting complaints about screen-to-screen performance. I've done most of the UI enhancements possible, but I think the save, redirect and load using EF is the sluggish point now.
The site is a series of quote pages for car insurance. I'm now hoping to create the relevant objects in session: page 1 creates the quote object and populates fields, page 2 adds X additional drivers, page 3 adds claims/convictions, and then we save the objects to the database. Users will be able to save and exit at any point in the quote process, so the save won't always be at the end. We also need to be able to load page 1's info and update it in memory when they click Next, and finally update the DB when they're ready to finish the quote.
At the moment we're doing the retrieves and saves on the same page. How would you advise we move to storing in session with a final commit?
I've trawled through various MSDN pages and I'm having trouble putting it all together into the latest 'best practice' for 4.1 for this fairly simple application. I've looked at Julie Lerman's videos, but her n-tier example only did a simple retrieve and 'Add', suspiciously leaving out the update section, which I suspect is not straightforward. Do you think I should use Self-Tracking Entities (I've read that people are having multiple issues with these, but maybe their architecture is more complex?) or some other way of storing the EF objects in session and making the changes?
Any help/ideas greatly appreciated.
Storing data in Session can be an effective method to minimise IO, but it's not a simple decision. You shouldn't put large amounts of data in Session State and we can't tell from your description what kind of volume of data you're talking about, or the number of concurrent active sessions.
There's also the question of your Session State provider. If you're using a single server, you'll probably use in-process Session State, which is quick, but if you have lots of users and lots of data, you can soon run into memory pressure issues.
If you're using a web farm, you'll have to use shared Session State, possibly using the SQL Server Session State provider, so you'll end up reading from and writing to a database on every interaction, which could be worse.
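If your data volume is modest and you do go the Session route, the shape of the pattern might look like this minimal sketch (Quote, Driver, QuotesContext and the page controls are hypothetical, in EF 4.1 DbContext style):

    // Build the quote up in Session across pages; hit the database once at the end.
    public partial class QuotePage : System.Web.UI.Page
    {
        private Quote CurrentQuote
        {
            get { return (Quote)(Session["Quote"] ?? (Session["Quote"] = new Quote())); }
        }

        protected void Next_Click(object sender, System.EventArgs e)
        {
            CurrentQuote.Drivers.Add(new Driver { Name = DriverName.Text });
            Response.Redirect("Page2.aspx"); // no database IO between pages
        }

        protected void Finish_Click(object sender, System.EventArgs e)
        {
            using (var db = new QuotesContext())
            {
                db.Quotes.Add(CurrentQuote); // or attach and mark modified for updates
                db.SaveChanges();            // single commit at the end of the flow
            }
        }
    }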
However, step 1 is to make sure you understand the problem. Don't make assumptions about where your performance problems are and try to redesign those. Use profiling or instrumentation techniques to identify the real bottlenecks and concentrate your efforts on those.
You might be surprised as to where your problems lie. It may well simply be a database optimisation issue.
I have an ASP.NET application that is consistently using 75% - 100% of the CPU on a production server. How can I profile the application to figure out what part of the code is using the most CPU? I have looked at a couple of different tools (Xte Profiler, EQATEC, dotTrace), but they all seem to want you to load up the application in their tool and run tests locally, not in production. I want to profile the application while it is running in production, with people hitting it, to see what is actually going on. Is this possible?
I am a newbie to application profiling so forgive me if I have missed something obvious or am not thinking about this correctly.
Thanks,
Corey
Sam Saffron (one of the Stack Overflow creators) wrote a great command-line tool a while ago, but has unfortunately abandoned it.
A friend of mine forked the code to make it work in 2015:
https://github.com/jitbit/cpu-analyzer
(the page has a link to Sam's post explaining how to use it)
The great thing about this tool (besides the no-install-required portability, command-line interface, etc.) is that APM packages like New Relic only monitor HTTP requests; if your app has background threads, they won't help much.
You should consider taking a memory dump on the production server while it's experiencing high CPU. Check out ADPlus for taking a hang dump of the ASP.NET worker process; the dump can then be analyzed with WinDbg or other tools.
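For example, to capture a hang dump of the IIS worker process (assuming the default w3wp.exe process name and an existing output folder):

    adplus -hang -pn w3wp.exe -o C:\dumps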
I just went through a similar experience where our production servers were experiencing excessive CPU load - a scenario we could not recreate locally or in test/staging environments. It had nothing to do with the database (database CPU was normal). Analyzing the dump file is what clued us in on what was causing the problem (excessive compilation of regex objects by some library we were using).
This answer would be incomplete without Tess' blog, so here's the link.
My guess is that it has to do with long-running database queries rather than the ASP.NET application itself. In my experience, 9 times out of 10 this is what I see, and it brings the application server down to a crawl as resources are consumed and the app has to wait for each query to finish before moving on. Take a look at SQL Profiler on the DB server and see if there are any queries that take a long time to execute.
It could be as simple as adding an index to a column or some other minor optimization. Once you know the query, you can also go back to your code and tweak that section as well.
For those who still stumble upon this question: it really depends on what you are trying to accomplish. If a server is running that high on CPU, odds are a standard profiler will bring it to a grinding halt due to its additional overhead.
There are actually three different types of profilers: standard profilers, lightweight transaction profilers, and APM tools. You can read more about all three in my blog post:
.NET Profilers: 3 types and why you need all of them
It's certainly possible to profile ASP.NET with the EQATEC Profiler. See:
Profiling ASP.NET websites with EQATEC Profiler
EQATEC Profiler instruments your app in a separate step that enables the app itself to collect its own profiling info; the profiler then merely displays that timing data afterwards.
That means that you can run your instrumented ASP.NET app completely independent of the profiler itself.
You could, for example, instrument your app, mail it to your test site in India, have them run it on their server for some days, where it will generate timing reports all on its own, and have them mail those reports back to you to view in the profiler. Pretty neat.
Note: to have the profiled app generate timing snapshots "on its own", it must know when to generate them. By default this happens when the method Application_End is called in an ASP.NET app. You can also dump snapshots programmatically whenever it suits you by using the EQATEC Profiler API. See the user guide or check out this thread.
You can read about this on the Microsoft Developer Network. Select the documentation that matches your version of Visual Studio, and verify that profiling functionality is provided for your edition of Visual Studio.
How to: Profile a Web Site or Web Application Using the Performance Wizard
Your best bet is to profile your code on your own machine to identify where it is spending time.
Grab a ten-day free trial of this:
http://www.jetbrains.com/profiler/
Here are some links to get you going:
http://msdn.microsoft.com/en-us/library/ms178643(v=VS.100).aspx
http://www.codeproject.com/KB/aspnet/10ASPNetPerformance.aspx
Background:
I am an intermediate web app developer working on the .Net Platform. Most of my work has been defined pretty well for me by my peers or superiors and I have no problem following instructions and getting the job done.
The task at hand:
I was recently asked by an old friend to redo his web app from scratch. His app is extremely antiquated and he is getting overwhelmed by it breaking all the time. The app in question is an inventory / CRM application and currently each customer requires a new install of the app (usually accomplished by deploying it on a different domain on the same server and pointing to a new database).
Currently, if any client wants modifications to the forms, such as additional fields or new features, my friend goes in and manually adds those fields to the forms, scripts, database, etc. As a result, every install of this application is unique: there is no single source repository and no single version of the app. Generally, new features are rolled into the other sites over time, but this is still done on a site-by-site basis.
I will be approaching this on a very modular basis. Initially I will code a module that queries an external web service for some data, displays and stores it, and periodically updates it automatically. The next module will likely be for storing and displaying inventory data. In this way I want to duplicate the current feature set of his app 100% over time, but do it incrementally.
The Million Dollar Questions
I want to make the app have user-configurable form fields. The user should be able to go to an admin page, create a new forms page of a certain category, and then specify what fields he wants on it. He could say "create a new text field called Item # and make it required", and that will get stored somewhere. All forms will be dynamically rendered to screen based on what the user has configured. Is this a good way to go about the problem of having no idea what a customer could want in a form, and thus being able to store and display form data of any sort? What sort of design pattern should I follow here?
I am familiar with ASP.NET and the .NET Framework in general and have decent knowledge of JavaScript, HTML, Silverlight, jQuery, C#, etc. I can work my way around web apps in a good way, but I am not sure what sort of framework or tech I should use to accomplish this task. Would ASP.NET 3.5 WebForms be the way to go, or should I look into ASP.NET MVC? Do I use jQuery and AJAX for complete decoupling of front end and back end, or will a normal ASP.NET page with some AJAX thrown in, working with a code-behind, be the order of the day?
Just looking for general advice before I start.
I am currently thinking of using ASP.NET 3.5 WebForms; jQuery for client-side animation, UI work, manipulation and data validation; and SQL Server plus a .NET or WCF web service for the back end.
Your advice is much appreciated as always.
I've recently implemented a white-label e-commerce system for an insurance company that allowed each partner to choose their own set of input fields and screens, and to order the application flow to suit their individual needs.
Although it wasn't rocket science, it added complexity and increased development time.
Consider the user-configuration aspect very carefully. In hindsight, both my client and their clients in turn would have been happy with a more rigid system.
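If you do go configurable, the WebForms side of dynamic rendering might look like this minimal sketch (FieldDefinition and the store behind LoadFieldDefinitions are hypothetical):

    // Recreate the user-defined controls on every request, including postbacks,
    // so that ViewState and validators wire up correctly.
    using System.Collections.Generic;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public class FieldDefinition
    {
        public string Name { get; set; }
        public bool Required { get; set; }
    }

    public partial class DynamicFormPage : Page
    {
        protected PlaceHolder FieldsPlaceHolder; // comes from the designer file in a real page

        protected override void OnInit(System.EventArgs e)
        {
            base.OnInit(e);
            foreach (FieldDefinition field in LoadFieldDefinitions())
            {
                var textBox = new TextBox { ID = "fld_" + field.Name };
                FieldsPlaceHolder.Controls.Add(
                    new Label { Text = field.Name, AssociatedControlID = textBox.ID });
                FieldsPlaceHolder.Controls.Add(textBox);
                if (field.Required)
                {
                    FieldsPlaceHolder.Controls.Add(new RequiredFieldValidator
                    {
                        ControlToValidate = textBox.ID,
                        ErrorMessage = field.Name + " is required"
                    });
                }
            }
        }

        private IEnumerable<FieldDefinition> LoadFieldDefinitions()
        {
            // Placeholder: read these from your field-definition store.
            yield return new FieldDefinition { Name = "ItemNumber", Required = true };
        }
    }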
As for the tech side of your question: I developed my project in VS2005, using ASP.NET WebForms and web services with a SQL Server back end, so the stack you're looking at is definitely capable of delivering a working product. ASP.NET MVC will almost certainly help as far as testability goes.
The biggest thing I would change if I were starting again would be to replace the intermediate web services with message-based services using NServiceBus, MassTransit or the like. While the web services worked fine, message-based communication should be quicker and more reliable.
Finally, before you start to code, make sure that you understand the current system's functionality inside and out. If the new system doesn't do something that the old system did, it will be pretty obvious to the end users straight away.