The web app consumes XML from a web service, which is then transformed to HTML using XSLT. The app uses an HttpModule to fetch the XML, hooked in via AddOnPreRequestHandlerExecuteAsync.
Classes Used:
XmlDocument - stores the XML.
XslCompiledTransform - stores the transform; it is cached in Application.
Asynchronous HttpWebRequest using BeginGetResponse/EndGetResponse
HttpModule with the AddOnPreRequestHandlerExecuteAsync event hooked (sketched below).
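For reference, the module wiring looks roughly like this (a trimmed sketch, not my actual code; the class name, service URL, and context key are placeholders):

    using System;
    using System.Net;
    using System.Web;
    using System.Xml;

    public class XmlFetchModule : IHttpModule
    {
        // In classic ASP.NET one module instance serves one HttpApplication,
        // which handles one request at a time, so instance fields are safe.
        private HttpApplication _app;
        private HttpWebRequest _request;

        public void Init(HttpApplication app)
        {
            _app = app;
            app.AddOnPreRequestHandlerExecuteAsync(BeginFetch, EndFetch);
        }

        public void Dispose() { }

        private IAsyncResult BeginFetch(object sender, EventArgs e, AsyncCallback cb, object extraData)
        {
            _request = (HttpWebRequest)WebRequest.Create("http://example.org/service"); // placeholder URL
            return _request.BeginGetResponse(cb, extraData);
        }

        private void EndFetch(IAsyncResult ar)
        {
            using (WebResponse response = _request.EndGetResponse(ar))
            using (System.IO.Stream stream = response.GetResponseStream())
            {
                XmlDocument doc = new XmlDocument();
                doc.Load(stream);
                _app.Context.Items["serviceXml"] = doc; // the handler picks this up later
            }
        }
    }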
I do not want to use XPathDocument unless there are no other possible optimizations: it is read-only, and without the ability to write to the document it would take some complicated code to pull all the XML together. There is additional XML, not from the web service, that must also be added to the document.
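To illustrate, merging that extra XML is just an import and append on the writable document (LoadServiceXml is a hypothetical stand-in for the fetch above):

    // ImportNode clones the extra node into doc's context so it can be appended.
    XmlDocument doc = LoadServiceXml();            // hypothetical helper
    XmlDocument extra = new XmlDocument();
    extra.LoadXml("<extra>local data</extra>");    // placeholder content
    XmlNode imported = doc.ImportNode(extra.DocumentElement, true);
    doc.DocumentElement.AppendChild(imported);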
Any suggestions would be nice. The server doesn't seem to be having memory issues, if that tells you anything; it's just really high CPU usage.
Thanks.
UPDATE
After much searching, I found that the issue causing the CPU to race was actually an infinite (or nearly infinite) loop, which was not in my code at all and was hidden from my profiling due to where it was occurring. Lesson here: if it doesn't make sense, look for alternative explanations before tearing your code apart.
What version of .NET? It's been a while since I've done anything with XML/XSL, but .NET 2.0 had some memory issues in XslCompiledTransform. While that could be the issue, it's more likely something in the code. Can you provide some sample XML and the XSL doc?
What happens if you save both out as static files and try to run the transform (create a small standalone script or unit test that just does this, to see if it's an issue)? Make sure you're releasing your XslCompiledTransform object as soon as you're done with it (and the XML doc as well).
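A minimal standalone test might look like this (file names are placeholders):

    using System.Xml;
    using System.Xml.Xsl;

    // Run the transform by itself against static copies of the inputs,
    // so you can see whether the transform alone pegs the CPU.
    XslCompiledTransform xslt = new XslCompiledTransform();
    xslt.Load("sample.xslt");

    XmlDocument doc = new XmlDocument();
    doc.Load("sample.xml");

    using (XmlWriter writer = XmlWriter.Create("output.html", xslt.OutputSettings))
    {
        xslt.Transform(doc, writer);
    }

If that runs quickly on a representative document, the transform itself probably isn't your hotspot.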
When I run into issues with XSL transforms, I usually save a sample XML document and apply my XSL in Cooktop. It's a little hard to figure out at first, but it's a good sanity check to make sure you don't have a glaring error in your XSL.
Consider using LINQ to XML to do the transformation; 350 kB is a large text/XML document from a transformation standpoint, and it might be faster than an XSLT transformation. See here for a sample.
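The rough shape of a LINQ to XML version (element names here are invented for illustration, and LINQ to XML needs .NET 3.5+):

    using System.Linq;
    using System.Xml.Linq;

    // Build the HTML tree directly from the source document instead of
    // running a stylesheet over it.
    XDocument source = XDocument.Load("sample.xml");            // placeholder file
    XElement html = new XElement("html",
        new XElement("body",
            new XElement("ul",
                from item in source.Descendants("item")         // hypothetical element
                select new XElement("li", (string)item.Attribute("name")))));
    html.Save("output.html");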
Is the web service on the same server? If so, does testing the service by itself show high CPU usage?
How are you putting the transformed document into cache?
Try using a profiler. dotTrace and ANTS have trial versions. This should let you pinpoint your problems. (The nice thing about dotTrace is that it integrates with unit tests.)
Related
I want to apply caching techniques to improve my ASP.NET web application's performance. I am going to use the .NET default cache. I want to store the data in an XML file as well, so that if the system fails to find the data in the cache, I can use the XML file as a secondary option. Does this workflow seem sound or standard? Will the file I/O operations degrade performance instead of improving it, or break system integrity? The data volume will be medium and the number of files will be around 1k~2k.
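Roughly, the workflow I have in mind is this (a sketch with placeholder names, using MemoryCache as the default cache):

    using System;
    using System.Runtime.Caching;
    using System.Xml.Linq;

    // Try the in-memory cache first; fall back to the XML file on disk
    // and repopulate the cache on a miss.
    public static XDocument GetData(string key, string xmlPath)
    {
        MemoryCache cache = MemoryCache.Default;
        XDocument doc = cache.Get(key) as XDocument;
        if (doc == null)
        {
            doc = XDocument.Load(xmlPath); // secondary copy on disk
            cache.Set(key, doc, new CacheItemPolicy
            {
                AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(10) // arbitrary
            });
        }
        return doc;
    }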
Using XML files as a data source seems like a rather unorthodox approach. A more common way would be using a database as the data source and something like the distributed Redis cache for caching.
See the docs for further information.
Let me start off by stating that I am a novice developer, so please excuse the elementary nature of my question(s).
I am currently working on a Flex application, and am getting more and more confused about when to use server-side scripting and when to develop web services. For most of the functionality I am working on, I am taking various files from the user (client), uploading them to the server for processing/conversion, then sending them back to the client in the new format.
I am accomplishing most of this using ASP.NET generic handler (.ashx) files, but I'm not very confident this is best practice. At the same time, would making web services make any more sense? What would be considered best practice here? Any suggestions would be greatly appreciated.
The way I look at it is as follows:
Web Services mean Established Best Practice.
For most of our development, we don't need to create "Web Services", or what I think of when I hear REST, SOAP, and the Twitter API. You only need to start doing that once you've got something you're going to be using every day for years.
Clean and DRY code will lead you to creating a Web Service
If you spend the time to clearly define the parts of your upload-process-render architecture, and you find that it can be applied to almost everything you are doing, then all you need to do to make it a Web Service is define a clear, 1-2-3 set of rules for using the system (GET/POST data, etc.). As long as you are consciously building an architecture the whole way, you'll end up creating a Web Service if it's worthy. Otherwise there's no need.
It sounds like you have a clear workflow going; I don't know anything about ASP.NET, though.
As far as the confusion and best practices go, I suggest the following:
Create a Flex Library Project for your "generic ashx file handling" Flex classes. Give it a cool, simple name.
Create a .NET Library Project that encapsulates all the logic for your server-side file processing. Host it online and make it open source; I recommend GitHub. Test it as you go, and document it, its purpose, and the theory behind it.
If you don't have to do any more work at this point, and it's just plug and chug, then you've probably arrived at something that might become a Web Service, though that's probably a few years down the road.
I don't think you should try to create a Web Service right off the bat. Just write some clean and reusable code, make a few examples, get it online and open source, have others contribute and give feedback, and if it solves a specific problem, then make it a web service. You can probably just use REST for now and build your system around that. RestfulX is a great library for that.
Best,
Lance
Making web services without any purpose makes no sense ;)
Now, in the world of Flex/AS3 with Flash version 10, you can easily read local files, modify them with whatever modification algorithm you like, and save them locally without pinging the server.
You only have to use web services if you want to get some data from the server or send some data to it. That's all.
RSTanvir
Flash / Flex uses a simple HTTP POST approach for file uploads, so trying to do that using SOAP web services will be problematic. Your approach of using ASHX here sounds reasonable to me.
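A minimal sketch of such a handler (the conversion step is a placeholder; Flex's FileReference.upload() posts the file under the "Filedata" field by default):

    using System.IO;
    using System.Web;

    public class UploadHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            // Flex sends a plain multipart/form-data POST.
            HttpPostedFile file = context.Request.Files["Filedata"];
            if (file == null) { context.Response.StatusCode = 400; return; }

            byte[] converted = ConvertFile(file.InputStream); // placeholder conversion
            context.Response.ContentType = "application/octet-stream";
            context.Response.BinaryWrite(converted);
        }

        private byte[] ConvertFile(Stream input)
        {
            // Real processing/conversion would go here; this just echoes the bytes.
            using (MemoryStream ms = new MemoryStream())
            {
                input.CopyTo(ms); // .NET 4+; copy in a loop on older frameworks
                return ms.ToArray();
            }
        }
    }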
To send / receive data that isn't file based (e.g. a list of files the user has uploaded previously), I would recommend looking at the open source Fluorine FX library. Fluorine uses AMF which is a highly performant way of doing data transfer with Flash. It's also purely configuration-based, which means you don't need to code against any of its APIs, just configure Fluorine to expose your .NET service classes. You could easily add attributes to those same classes to expose them as SOAP web services via WCF if you need that in the future. I would not recommend using SOAP with Flex however, due to the performance losses and also because the Flex implementation of SOAP has a history of bugs and interoperability problems.
I have a web service running and I consume it from my desktop application, which is written on the Compact Framework.
It takes 13 seconds to retrieve 8 results, which is kinda slow. I also expect to be retrieving more results in the future. The database query runs fast.
Two questions: how do I detect where the slowdown occurs? Do I put timers in the web service code?
I would like to detect whether it is the network or the application code.
This is my first exposure to web services in a real environment so please bear with me.
I used ASP.NET 2.0 and C# to write a simple web service.
Another good profiler is the EQATEC Profiler. I did a write-up on it here: http://elegantcode.com/2009/07/02/eqatec-profiler-and-net-cf-profiling-and-regular-net/
And it works fine for .NET CF projects. It will allow you to see if there are performance issues in unexpected places.
You're already on the right track with adding event logging; include timers in it. Note that doing so will add to the overall time it takes, so you'll want to remove them after you track down the culprit. Also look into running the same web service call multiple times without re-initiating the connection; that may be a cause as well.
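For example, a rough sketch with placeholder stage names:

    using System.Diagnostics;

    // Bracket each stage of the web method with a Stopwatch and log the
    // splits, so you can see whether the time goes to the DB call, the
    // response building, or somewhere else. Log() is a placeholder.
    Stopwatch sw = Stopwatch.StartNew();
    object rows = QueryDatabase();                      // hypothetical stage
    Log("db: " + sw.ElapsedMilliseconds + " ms");

    sw.Restart();                                       // .NET 4+; use Reset()+Start() on 2.0
    object result = BuildResponse(rows);                // hypothetical stage
    Log("build: " + sw.ElapsedMilliseconds + " ms");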
-Jay
A starting point is to profile your web service to see where the delay is coming from.
Do you know the CLR Profiler? There are some tools you can use to see what is happening:
http://msdn.microsoft.com/en-us/library/ms998579.aspx
The database connectivity from your service to the DB could be a possible cause of the slowdown. Adding timers should do the trick. If the code isn't too huge, you can look at the coding constructs to make an informed guess about where exactly things might be slow, then add the timers there. You would get a fair idea of where things are slowing down.
The two biggest pain points are going to be instantiating the web service reference and transferring all the data over the network. Unless something turns up showing an obvious blunder, I would look at ways of reducing the size of your XML and at better ways of handling your web service reference.
All I know about the Compact Framework is that it is a pain to work in. I've worked on a number of web projects, though, and profiling your server and putting in logging to record the time taken will be helpful. If all the time is being taken after the server responds, however, it won't do much more than prove your server is working quickly.
SoapUI is a fantastic Java application for consuming web services. It has a lot of functionality, including time metrics. I would start with that and see how long it takes to consume the same thing your client would be consuming. If no issues turn up there, start with what I recommended above.
IIS 7 has a very useful feature called Failed Request Tracing (FREB for short). It has a very nice visualization feature, involving an extremely complex XSL stylesheet that parses the results into a useful treeview.
I, however, want to consume FREB programmatically and present the results in my smart client (without waiting for the XML to be written server-side).
The only possible solution I have found so far involves compiling an IIS7 C++ plugin that converts FREB output into OutputDebugString calls, an approach even the writer suggests should not be used on a production server (here's the article).
My question is: is there another approach?
As always, my thanks for reading, and even more for replying.
Guy
I think many developers know that uncomfortable feeling when users tell them that "The application is slow (again)."
In a complex web application there can be many possible reasons for a degradation in (perceived) performance: slow database response, bandwidth issues, bad caching, etc. There are certainly issues which will never occur in a development or staging environment.
Now my question:
Is there a set of tools and/or methods that would provide a comprehensive "live" view of an IIS/ASP.NET/SQL Server production system in a visual way (not just performance counters):
Current HTTP requests (say the last n minutes)
Exceptions / timeouts
Bandwidth data
Number of open database connections / database calls
...
The primary goal is to see at a glance (or after looking closer) what problem is causing the performance problems.
I think the category of software you're looking for is ".net profiler" or ".net tracer". One such tool that you might consider is JetBrains' dotTrace. It gives you runtime stack traces and an array of counters that indicate possible bottlenecks.
Previously mentioned tools will certainly work. At our shop we needed finer information and built our own solution (long story: it was easier to code than to argue about tools and retrievable data).
I used LogParser to flip through the IIS logs and create output reports from them (e.g. result-code breakdowns, etc.).
I used a combination of performance counters and WMI values to get the rest. You can read these using some pretty straightforward C#, which gives you full control; you can then dump the values to .csv for viewing/processing in Excel, or feed them to a page you update as a control center.
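For example, polling a couple of the stock Windows counters is only a few lines (adjust categories/instances for your boxes):

    using System;
    using System.Diagnostics;
    using System.Threading;

    // Sample two standard counters once a second and emit CSV rows.
    // Note: the first NextValue() of a rate counter always returns 0.
    PerformanceCounter requests = new PerformanceCounter(
        "ASP.NET Applications", "Requests/Sec", "__Total__");
    PerformanceCounter cpu = new PerformanceCounter(
        "Processor", "% Processor Time", "_Total");

    while (true)
    {
        Console.WriteLine("{0:u},{1},{2}",
            DateTime.UtcNow, requests.NextValue(), cpu.NextValue());
        Thread.Sleep(1000);
    }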
I would probably also look at IIS.net, a great resource for IIS tools, including debugging, security, etc.
I followed urig's advice and found this software called SmartInspect.
Does anybody know this logging/monitoring tool? It seems to be a combination of a real-time console and a developer library.
CLR 4.5 will have some new capabilities that will help you monitor ASP.NET performance live, without restarting your app. Basically, you can re-JIT your code to include monitoring hooks and then inspect time spent in classes/methods, etc.
I'm sure dotTrace and other profiling tools will leverage this automatically, but it's worth checking out: C9 - Inside Re-JIT with David Broman