How can I consume IIS7's FREB programmatically - iis-7

IIS 7 has a very useful feature called Failed Request Tracing (FREB for short). It has a very nice visualization feature, involving an extremely complex XSL stylesheet that transforms the results into a useful treeview.
I, however, want to consume FREB programmatically, and be able to present the results on my smart client (without waiting for the XML to be written server-side).
The only possible solution I have found so far involves compiling an IIS7 C++ plugin that converts FREB events into OutputDebugString calls, an approach that even the author suggests should not be used on a production server (here's the article).
My question is: is there another approach?
As always, my thanks for reading, and even more for replying.
Guy

Related

ASP.NET Profiling

I have a slow asp.net program running. I would like to profile the production server to see what is going on, but I don't want to slow down the production server noticeably.
In general, is it standard practice to profile a production box or just local dev boxes? Also, what progams do you recommend to accomplish this?
I can recommend "dynaTrace Ajax Edition 3" for client-side profiling (it's a free and easy tool) and "JetBrains dotTrace" for server-side profiling. As far as I know, these tools do not noticeably slow down the server.
You can use tracing. It is recommended to check these things on your local machine, but if you want to check something on the server, you can briefly enable tracing in your web.config (a minimal snippet follows the links below).
ASP.NET tracing enables you to view diagnostic information about a single request for an ASP.NET page. ASP.NET tracing enables you to follow a page's execution path, display diagnostic information at run time, and debug your application. ASP.NET tracing can be integrated with system-level tracing to provide multiple levels of tracing output in distributed and multi-tier applications.
ASP.NET Tracing Overview
Tracing in ASP.NET
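A minimal sketch of such a web.config change (the attribute values here are illustrative defaults, not taken from the answer above):

    <configuration>
      <system.web>
        <!-- pageOutput="false" keeps trace output off the pages themselves;
             view the collected traces at /trace.axd instead -->
        <trace enabled="true"
               requestLimit="40"
               pageOutput="false"
               localOnly="true"
               mostRecent="true" />
      </system.web>
    </configuration>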
I guess the answer is really 'it depends'! I would start by considering whether the program runs slowly just on the production server, or whether it runs slowly on a development environment as well. I would also consider how closely I could get my development/test environment to match the production environment.
Once you've done that, consider whether there are any areas that could represent obvious bottlenecks that you might be able to eliminate. So, for example, is the ASP.NET application backed by some form of database? If it is, you can monitor the performance of the database separately and establish whether that is where the problem lies.
Next, try and be very specific about what you mean by 'slow performance'. Is it consistently slow (compared to what?), or just when you perform specific actions? This may give you another clue as to where your problem lies, or at least what questions you should be asking.
Having answered lots of these questions, I'd then bust out ANTS Performance Profiler to try and profile what's going on. It has a fairly minimal overhead when profiling an application, and you should only really be running it for a fairly short time anyway, as you'll hopefully by this point have more specific questions you want to answer, or specific actions that you want to dig into.
Your best option is Prefix (http://www.prefix.io). It will let you see all of your SQL queries, logs, HTTP calls, and a lot more.
Another option is Glimpse or the Mini profiler.

100% CPU usage in XML/XSLT-driven ASP.NET web app

The web app uses XML from a web service, which is then transformed to HTML using XSLT. The app uses an HttpModule to get the XML using AddOnPreRequestHandlerExecuteAsync.
Classes used:
XmlDocument - stores the XML.
XslCompiledTransform - stores the transform; cached in Application state (see the sketch after this list).
Asynchronous HttpWebRequest using BeginGetResponse/EndGetResponse.
HttpModule with hooked AddOnPreRequestHandlerExecuteAsync events.
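For reference, a minimal sketch of the caching pattern described above (the cache key and stylesheet path are placeholders, not from the original app):

    // XslCompiledTransform is thread-safe for Transform() calls once
    // Load() has completed, so a single cached instance can be shared.
    private XslCompiledTransform GetTransform(HttpContext context)
    {
        const string key = "CachedXslTransform"; // hypothetical cache key
        var xslt = (XslCompiledTransform)context.Application[key];
        if (xslt == null)
        {
            xslt = new XslCompiledTransform();
            xslt.Load(context.Server.MapPath("~/App_Data/transform.xslt"));
            context.Application[key] = xslt;
        }
        return xslt;
    }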
I do not want to use the XPathDocument unless there are no other possible optimizations. It would take some complicated code to get all the XML together without the ability to write to the XmlDocument. There is additional XML that does not come from the web service that must also be added to the document.
Any suggestions would be nice. The server doesn't seem to be having memory issues, if that is telltale of anything; it's just really high CPU usage.
Thanks.
UPDATE
After much searching I found that the issue causing the CPU to race was actually an infinite (or nearly infinite) loop, which was not in my code at all, and was hidden from my profiling due to the nature of where it was coming up. Lesson here: if it doesn't make sense, look for alternative explanations before tearing your code apart.
What version of .NET? It's been a while since I've done anything with XML/XSL, but .NET 2.0 had some memory issues in XslCompiledTransform. While that could be the issue, it's more likely something in the code. Can you provide some sample XML and the XSL doc?
What happens if you save both out as static files and try to run the transform (create a small standalone script or unit test that just does this to see if it's an issue)? Make sure you're disposing of your XslCompiledTransform object as soon as you're done with it (and the XML doc as well).
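Such a standalone check might look something like this (the file paths are placeholders):

    using System.IO;
    using System.Xml;
    using System.Xml.Xsl;

    class TransformCheck
    {
        static void Main()
        {
            // Load the saved XML and XSL exactly as the app would,
            // but outside IIS, to isolate the transform itself.
            var doc = new XmlDocument();
            doc.Load("sample.xml");          // placeholder path

            var xslt = new XslCompiledTransform();
            xslt.Load("transform.xslt");     // placeholder path

            using (var output = File.Create("output.html"))
            {
                xslt.Transform(doc, null, output);
            }
        }
    }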
When I run into issues with XSL transforms, I usually save a sample XML document and apply my XSL in Cooktop. It's a little hard to figure out at first, but it's a good sanity check to make sure you don't have a glaring error in your XSL.
Consider using Linq to XML to do the transformation - 350 kB is a large text/XML document from a transformation standpoint - it might be faster than an XSLT transformation. See here for a sample.
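As a rough illustration of the idea (the element names and output shape are invented for the example):

    using System;
    using System.Linq;
    using System.Xml.Linq;

    class LinqTransform
    {
        static void Main()
        {
            // Hypothetical input: <items><item name="..."/>...</items>
            XDocument source = XDocument.Load("sample.xml"); // placeholder path

            // Equivalent of a simple XSLT for-each that emits an HTML list.
            XElement list = new XElement("ul",
                from item in source.Descendants("item")
                select new XElement("li", (string)item.Attribute("name")));

            Console.WriteLine(list);
        }
    }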
Is the web service on the same server? If so, does testing the service by itself show high CPU usage?
How are you putting the transformed document into cache?
Try using a profiler. dotTrace and ANTS have trial versions. This should let you pinpoint your problems. (The nice thing about dotTrace is that it integrates with unit tests.)

Speeding up a Web Service

I have a web service running and I consume it from my desktop application, which is written on the Compact Framework.
It takes 13 seconds to retrieve 8 results, which is kinda slow. I also expect to be retrieving more results in the future. The database query itself runs fast.
Two questions: how do I detect where the slowdown occurs? Do I put timers in the web service code?
I would like to detect whether it is the network or the application code.
This is my first exposure to web services in a real environment so please bear with me.
I used ASP.NET 2.0 and C# to write a simple web service.
Another good profiler is the EQATEC Profiler. I did a write-up on it here: http://elegantcode.com/2009/07/02/eqatec-profiler-and-net-cf-profiling-and-regular-net/
And it works fine for .NET CF projects. It will allow you to see if there are performance issues in unexpected places.
You're already on the right track with adding event logging; include timers in it. Note that doing so will add to the overall time it takes, so you'll want to remove them after you track down the culprit. Also look into running the same web service call multiple times without re-initiating the connection; that may be the cause as well.
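A minimal sketch of that kind of timing inside an ASMX-style method (the method and stage names are placeholders):

    using System.Diagnostics;
    using System.Web.Services;

    public class ResultsService : WebService
    {
        [WebMethod]
        public string[] GetResults()
        {
            Stopwatch sw = Stopwatch.StartNew();

            string[] data = QueryDatabase();   // hypothetical DB call
            long dbMs = sw.ElapsedMilliseconds;

            string[] results = Shape(data);    // hypothetical post-processing
            long totalMs = sw.ElapsedMilliseconds;

            // Log the split so DB time and processing time can be told apart.
            Trace.WriteLine(string.Format("db={0}ms total={1}ms", dbMs, totalMs));
            return results;
        }

        private string[] QueryDatabase() { return new string[0]; } // stub
        private string[] Shape(string[] data) { return data; }     // stub
    }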
-Jay
A starting point is to profile your web service to see where the delay is coming from.
Do you know the CLR Profiler? There are some tools you can use to see what is happening:
http://msdn.microsoft.com/en-us/library/ms998579.aspx
The database connectivity from your service to the DB could be a possible cause of the slowdown. Adding timers should do the trick. If the code isn't too huge, you can look at the coding constructs to make an informed guess about where exactly things might be slow, then add the timers there. You should get a fair idea of where things are slowing down.
The two biggest pain points are going to be instantiating the web service reference and transferring all the data over the network. Unless something turns up pointing to an obvious blunder, I would look at ways of reducing the size of your XML and ways of better handling your web service reference.
All I know about the Compact Framework is that it is a pain to work in. I've worked on a number of web projects, though, and profiling your server and putting in logging to record the time taken will be helpful. If all the time is being taken after the server responds, however, it won't do much more than prove your server is working quickly.
SoapUI is a fantastic Java application for consuming web services. It has a lot of functionality, including time metrics. I would start with that and see how long it takes to consume the same thing your client does. Failing issues there, start with what I recommended above.

ASP.NET handlers and extensions - am I doing it incorrectly?

I have a requirement on my new project to serve up some "hidden" assets (actually just stashed in the App_Data directory) after a certain date. Before then, they should act like they aren't there.
I've done this kind of thing a hundred times with the Page object, but as I started work on this, I thought I'd look into handlers. Having never worked with them (and being a little intimidated by them), I was happy to find that they'd serve up my XML and JPG files without the overhead of the whole Page class. Already, I'm happy that I considered it. I wrote it to handle functionality like "MyHandler.ashx?secretfile=blah.xml", and it worked great.
Then I started looking at special extension handling, so that a request for "blah.xml.secret" would get picked up by the handler and return blah.xml after checking the date. A couple of lights went off in my head, and I reworked the code so that it handled that case. It worked (in the IDE)! I was pretty excited.
Getting it onto the dev server (IIS) was a little different: I had to register .secret as a .NET type (no big deal), and it still didn't work until I unchecked the "verify file exists" checkbox. (blah.xml.secret obviously doesn't exist: blah.xml does, but not in the spot it's being asked for, only in the secured App_Data directory.) That's not a huge deal, but now my clever solution relies on two implementation details from the IIS side.
So my question is: is this the intended use of handlers in asp.net? Am I warping this beyond recognition? I feel like I've seen sites do tricks like this in the past, but for the one thing I'm trying to do, the IIS changes seem overly complicated. In my research on this, I didn't find a slam dunk 1-2-3 guide to using handlers that included an example like this, so it's got me thinking I'm maybe abusing it or going about it the wrong way.
Yes, that's pretty much how it works. (In Windows Server 2008 there is a remote chance that you can do the settings from web.config so that you don't have to change anything in IIS.)
If you use an extension that isn't already registered to be handled by the ASP.NET engine, you have to register it. If you use an extension that already is handled by ASP.NET, like .aspx, then you don't have to register anything in IIS. (When you run it in the integrated web server in Visual Studio, everything is already handled by ASP.NET, that's why it works there.)
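For illustration, a sketch of such a handler (the type name, release date, and paths are hypothetical, not from the question):

    using System;
    using System.IO;
    using System.Web;

    // Sketch: strip the ".secret" suffix, check the release date, and
    // stream the real file out of App_Data.
    public class SecretFileHandler : IHttpHandler
    {
        private static readonly DateTime ReleaseDate = new DateTime(2009, 6, 1);

        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            if (DateTime.Now < ReleaseDate)
            {
                context.Response.StatusCode = 404; // act like it isn't there
                return;
            }

            // "blah.xml.secret" -> "blah.xml"; this also strips any
            // directory part, so the request can't escape App_Data.
            string fileName = Path.GetFileNameWithoutExtension(context.Request.Path);
            string physical = context.Server.MapPath("~/App_Data/" + fileName);

            context.Response.ContentType = "text/xml"; // vary per file type
            context.Response.TransmitFile(physical);
        }
    }

And the web.config mapping that can replace the IIS Manager changes, assuming IIS7 integrated mode (resourceType="Unspecified" is the config counterpart of unchecking "verify file exists"):

    <system.webServer>
      <handlers>
        <add name="SecretFiles"
             path="*.secret"
             verb="GET"
             type="SecretFileHandler"
             resourceType="Unspecified" />
      </handlers>
    </system.webServer>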

Tools and methods for live-monitoring ASP.NET web applications?

I think many developers know that uncomfortable feeling when users tell them that "The application is slow (again)."
In a complex web application there can be many possible reasons for a degradation in (perceived) performance: slow database response, bandwidth issues, bad caching etc. There certainly are issues which will never occur in a development or staging environment.
Now my question:
Is there a set of tools and/or methods which would provide a comprehensive "live" view of an IIS/ASP.NET/SQL Server production system in a visual way (not just performance counters):
Current HTTP requests (say the last n minutes)
Exceptions / timeouts
Bandwidth data
Number of open database connections / database calls
...
The primary goal is to see at a glance (or after looking closer) what is causing the performance problems.
I think the category of software you're looking for is ".net profiler" or ".net tracer". One such tool that you might consider is JetBrains' dotTrace. It gives you runtime stack traces and an array of counters that indicate possible bottlenecks.
The previously mentioned tools will certainly work. At our shop we needed finer-grained information and built our own solution (long story: it was easier to code it than to argue about tools and retrievable data).
I used LogParser to flip through the IIS logs and create output reports from them (e.g. result-code breakdowns).
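For instance, a status-code breakdown of a W3C-format IIS log looks something like this (the log file pattern is a placeholder):

    logparser "SELECT sc-status, COUNT(*) AS Hits FROM ex*.log GROUP BY sc-status ORDER BY Hits DESC" -i:IISW3C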
I used a combination of performance counters and WMI values to get the rest - you can read these using some pretty straightforward C# - this gives you full control, and you can then dump the data to .csv etc. for viewing/processing in Excel, or update a page to act as a control center.
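A minimal sketch of reading a couple of counters from C# (the category/counter/instance names here are common ones; substitute whatever you track):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class CounterSample
    {
        static void Main()
        {
            // Rate counters need two samples, so prime them and wait a second.
            using (var cpu = new PerformanceCounter(
                       "Processor", "% Processor Time", "_Total"))
            using (var reqs = new PerformanceCounter(
                       "ASP.NET Applications", "Requests/Sec", "__Total__"))
            {
                cpu.NextValue();
                reqs.NextValue();
                Thread.Sleep(1000);

                Console.WriteLine("CPU: {0:F1}%  Requests/sec: {1:F1}",
                                  cpu.NextValue(), reqs.NextValue());
            }
        }
    }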
I would probably also look at IIS.net as a great resource for IIS tools including debugging, security etc.
I followed urig's advice and found this software called SmartInspect.
Does anybody know this logging/monitoring tool? It seems to be a combination of a real-time console and a developer library.
CLR 4.5 will have some new capabilities that will help you monitor ASP.NET performance live - without restarting your app. Basically you can re-JIT your code to include monitoring hooks in it, and then inspect the time spent in classes/methods etc.
I'm sure dotTrace and other profiling tools will leverage this automatically, but it's worth checking out: C9 - Inside Re-JIT with David Broman
