We've got an app written in Flex that displays data from our application. The .swf file is only 427 KB, but it takes a full five seconds to load in Firefox. This is a headache for our users because they need to access the page that contains the app frequently. (The app displays documents, and marching through a list of them is really slow.)
I've confirmed that it's not a slow web server problem. The .swf appears to be cached in the browser. Firebug reports that every time the web page accesses the .swf, the app server returns a "304 Not Modified" response, meaning that the load time from the server is almost zero.
Is there anything we can do to debug this issue? Or is the Flash player just slow?
If you're having issues with the time to download the SWF or to initialize the application, you could try breaking it up into modules and using the SWFLoader to only load the pieces as you need them. Flex applications are 2-frame movies, so the more you have in your application the more there is to initialize before it can start "playing."
If it's slow rendering everything, take a look at the creationPolicy and see if you're needlessly creating a hierarchy of items that aren't being displayed. Repeaters are also notorious for rendering slowly.
If your performance problems are more in-application, then you could consider profiling your application to see where the hotspots are.
Have you tried running the app using the Flex Profiler? That may help you isolate any performance issues.
Consider checking out the Flex RSLs. These runtime shared libraries let the Flash Player cache the Flex framework, so startup is much faster after the first load.
Look at the creationPolicy documentation; it may help.
The default should be "auto": it creates all controls only in the initial view of the navigator container. This setting gives a faster startup time for the application, but slower response time for user navigation.
This setting is the default for multiple-view containers.
See if someone has changed your setting.
I need to measure the total time a page takes to fully load, from clicking a menu button to fully rendered. I did that by using the BeginRequest and EndRequest events. Because some numbers were too big, I started measuring the time for every method and event the page executed while loading. Using a stopwatch I timed PreInit, Init, Load, etc., all the events in the page life cycle, plus all the other methods I created. My surprise was that, adding these numbers up, I didn't even get close to the numbers I got using begin/end request. The latter was double or even triple, e.g. 7 seconds compared with the 2 seconds I got from summing the events/methods.
So I'm wondering, where does this extra time come from? Between my timings, which are shown in the debug window, I could see all sorts of "loading jhfggdfaasdf.dll" messages, probably some temporary file. Could this DLL loading take this much time?
I noticed that from time to time my timings match, so maybe there is some caching mechanism, but I need confirmation from someone more experienced.
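For context, a minimal sketch of the kind of begin/end request timing described above, in Global.asax (the HttpContext.Items key name is arbitrary):

protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Start a timer for this request and stash it in the request's Items bag
    HttpContext.Current.Items["RequestTimer"] = System.Diagnostics.Stopwatch.StartNew();
}

protected void Application_EndRequest(object sender, EventArgs e)
{
    System.Diagnostics.Stopwatch timer =
        HttpContext.Current.Items["RequestTimer"] as System.Diagnostics.Stopwatch;
    if (timer != null)
    {
        timer.Stop();
        // Total server-side time for the request
        System.Diagnostics.Debug.WriteLine("Request took " + timer.ElapsedMilliseconds + " ms");
    }
}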
EDIT: From reading your question again, you might be using the ASP.NET "web site" solution type. This is not precompiled when you are debugging, and the first time you request a page it is compiled into a class under the Temporary ASP.NET Files directory. The DLL for that class is then loaded into the app domain with a funny name like the one you mentioned. This happens the first time you request one of these pages; when you want to deploy, you can precompile your site to increase performance.
If you are seeing "loading xyz.dll" it means the app domain is loading in stuff to be used by the application. This happens when you need to run code from required DLLs (in your case probably third-party libraries) that have not yet been loaded into the app domain. This is good because it means a page that uses a library but has never been called never loads that assembly into memory. You could move this hit from the first page request to application load by preloading all the assemblies in your bin into the app domain on application start. It is a trade-off between memory use and request speed. This question is a good place to start with that:
How to pre-load all deployed assemblies for an AppDomain
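For illustration, a rough sketch of that approach in Global.asax (my own sketch, not taken from the linked question):

void Application_Start(object sender, EventArgs e)
{
    // Force every assembly in the bin folder into the app domain up front,
    // trading a slower application start for faster first requests.
    foreach (string file in System.IO.Directory.GetFiles(System.Web.HttpRuntime.BinDirectory, "*.dll"))
    {
        System.Reflection.Assembly.Load(System.Reflection.AssemblyName.GetAssemblyName(file));
    }
}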
You can get a good overview of your page lifecycle times by using the trace functionality in ASP.NET. This can be set in the web.config file, as described in this article:
http://msdn.microsoft.com/en-us/library/94c55d08.aspx
Viewing the trace.axd page will tell you the times of the various server events and make it really easy to see where you are slow.
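For reference, the web.config setting looks something like this (the attribute values are just reasonable defaults, not taken from the article):

<configuration>
  <system.web>
    <!-- pageOutput="false" keeps the trace off the page itself; view it via trace.axd -->
    <trace enabled="true" pageOutput="false" requestLimit="40" localOnly="true" />
  </system.web>
</configuration>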
If the page still takes a long time to render, there are client-side considerations, such as:
Is your page very large?
Are you interpreting JavaScript before all the HTML and CSS has been written?
Is your network slow?
Are you sending many CSS and JS files? Most browsers limit the number of concurrent resource downloads, so consider rolling a few CSS files into one, on your production environment at least.
Are you employing client-side caching? You can tell the browser to cache something for a period of time, and there are ways to invalidate this cache if your content needs to be updated (see the sketch after this list).
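On that last point, a minimal sketch of setting cache headers from an ASP.NET page (the one-hour lifetime is just an example value):

// Inside a Page method: allow the browser and proxies to cache this response for an hour
Response.Cache.SetCacheability(HttpCacheability.Public);
Response.Cache.SetExpires(DateTime.UtcNow.AddHours(1));
Response.Cache.SetMaxAge(TimeSpan.FromHours(1));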
All of the above can be debugged using client-side tools such as Firebug or the developer tools in Chrome.
The Stopwatch is actually a fairly cumbersome object by itself, and you may be adding to the load time by using it to record the time. A slightly more efficient method would be to use a simple DateTime comparison.
protected void Page_PreInit(object sender, EventArgs e)
{
    DateTime started = DateTime.Now;
    // .... some code
    lblDisplayComment.Text = DateTime.Now.Subtract(started).TotalMilliseconds.ToString();
}
This would give you the time in ms it took to execute that method only.
I'm a C++ programmer, but I'm a newbie in Flex. I'm developing a Flex 3 application for a social network using FlashDevelop. For debugging I'm using the stand-alone Flash Player (10.3 debug) downloaded from Adobe. The application is a simple audio player which shows artist/album images.
The application worked properly on both local computer and remote server until I made some layout changes in Main.mxml. I added some HBoxes and changed Image placement. After that the application still works on my local computer, but it doesn't work properly after I upload it to a server.
Application buttons are not highlighted on over/out/click events, images loaded from the Internet are not displayed, and text changed dynamically is not displayed, but when I click buttons a sound file is loaded from the Internet and starts playing. It looks like some of the events responsible for displaying components are not dispatched, because the parts of the functionality not related to display still work.
To make sure that this is not a server problem, I rolled back to the previous revision. All works fine.
I suppose that this is a known issue, but I have no idea what is the reason.
Could anyone please help me to resolve the issue?
Thanks.
UPDATE: I observed the issue in IE and FF; I didn't test Opera or Chrome.
Are you trying to access the pixel data of the images in the new version? If so, that might be the problem, as pixel data for loaded images is not (always) accessible, so it might throw a security error, which in turn breaks the rest of the interface.
Also, did you try running the remote version in the debugger? If so, is there any exception being thrown?
And no, it's not a known issue; it's the kind of annoying and hard-to-debug error that you sometimes get when using the Flex SDK.
I have a Web application (http://www.holidaystreets.com) with around 120,000+ pages. Whenever we restart the server it takes more than 15 minutes for the site to warm up.
I built it as 'Release' and do not have any heavy stuff initializing (i.e. no Control Adapters or work in APPInit).
Any tips?
Mystery Solved
Well, I spotted the problem today. This application was converted from a WebSite-type project to a WebApplication-type project. I had codedom defined in web.config so that each page could be compiled separately when first requested (this was done because we had such a huge number of pages). However, in a WebApplication it was compiling each and every page on first load. Since removing that section, the application loads in less than 2 seconds!
Probably you have lots of static data that is initialized on the first hit. Look for large amounts of cached data used in static classes (probably fetched from the database?).
I would suggest using the System.Diagnostics.Trace class methods to log timings for the various methods and events as your site is loading up for the first time, to see where the time is being spent.
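Something along these lines (a rough sketch; the method names are hypothetical placeholders for your own startup steps):

System.Diagnostics.Stopwatch sw = System.Diagnostics.Stopwatch.StartNew();
LoadLookupTables();   // hypothetical startup step
System.Diagnostics.Trace.WriteLine("LoadLookupTables: " + sw.ElapsedMilliseconds + " ms");

sw = System.Diagnostics.Stopwatch.StartNew();
WarmUpCaches();       // another hypothetical startup step
System.Diagnostics.Trace.WriteLine("WarmUpCaches: " + sw.ElapsedMilliseconds + " ms");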
Also profiling your database should reveal any bottlenecks there.
I got it back in 2.65s:
http://www.webpagetest.org/test
I have three Silverlight 3 applications in the same solution. In my ASP.NET hosting project I have a separate page for each of the three projects. When I navigate between the pages, the only Silverlight breakpoints that get hit are the ones in the initial page I load.
This problem has only started recently. I used to be able to debug all the Silverlight projects at the same time. Any ideas? I have deleted the ClientBin folder, and I have deleted all files and re-retrieved them from source control. Nothing seems to be working.
"The problem has only started recently". What changed? Here are some guesses:-
You upgraded to Windows 7
You installed some more memory
Some other memory guzzling app is no longer running when you are testing.
By default IE8 will run multiple processes, at least 2: one for the browser frame and one for the content of the initial tab. As you open more windows and tabs, IE may add new processes to the set it is currently using.
When you debug, VS will launch a new IE8 session and attach to the process handling the content of the single tab that is open (it doesn't bother attaching to the parent frame process). However, as you navigate about your application, IE8 will start new processes that VS won't be attached to. This forces you to open the Attach to Process dialog and do it manually.
You can control this IE8 feature (called, by the way, LCIE: Loosely Coupled IE) from the registry.
In the key HKEY_CURRENT_USER\Software\Microsoft\Internet Explorer\Main, add a new DWORD value TabProcGrowth and set its value to 1. Now IE8 will only ever create 2 processes per session: one for the frame and one for all the tab and window contents, which is the one VS will attach to.
This is perhaps a bit draconian if you also use IE8 as your general browser. One option is to keep IE8 for test purposes and use another browser for general browsing. Another option is a variation of the above: instead of creating TabProcGrowth as a DWORD, create it as a string type and set its value to "small". In this mode IE8 is much less aggressive in the number of processes it will open. Of course you could create a couple of scripts to create and delete the registry entry; a rough example follows below.
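A rough sketch of such a script, done here in C# rather than a .reg file (my own illustration, not part of the original suggestion):

using Microsoft.Win32;

class ToggleLcie
{
    static void Main(string[] args)
    {
        using (RegistryKey key = Registry.CurrentUser.CreateSubKey(@"Software\Microsoft\Internet Explorer\Main"))
        {
            if (args.Length > 0 && args[0] == "off")
            {
                // Remove the override so IE8 goes back to its own heuristics
                key.DeleteValue("TabProcGrowth", false);
            }
            else
            {
                // Force one frame process plus one content process per session
                key.SetValue("TabProcGrowth", 1, RegistryValueKind.DWord);
            }
        }
    }
}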
Note that without the registry entry, IE8 uses its own heuristics, which depend on available memory etc., to determine whether a new process is warranted. This might explain why your debugging worked in the past and then, for apparently no reason, stopped working.
Here was the issue:
One of my child windows had a Silverlight app that was calling a .NET RIA Service. The service call ended in an error.
The next several times I debugged, the debugger did not attach to the child windows; I had to attach to them manually.
I fixed the RIA Service call so that it did not end in an error, and had to manually attach to the child windows in that debugging session. However, in subsequent debugging sessions the debugger automatically attached.
I tried breaking the RIA Service call again and had to attach manually again. What is a little weird is that closing Visual Studio and even rebooting the machine does not make Visual Studio automatically attach again. You have to have a debugging session where the child window makes a successful call to a RIA Service to fix it.
NOTE:
The RIA error that was breaking my debugger was caused by a misspelled include in the domain query, i.e.:
return Context.SOME_ENTITY.Include("Misspelled_Association_Property");
Not all RIA exceptions cause this problem.
My scenario has a number of specific cases that I will go over. I don't have all the things handy to test a more general scenario, but I will when I finish my project unless someone does this first.
Here is what I have:
I am using a LinqToEntitiesDomainService from the July 2009 Preview release of .NET RIA Services.
To complicate things a little more, since my application is using an Oracle backend, I am using DevArt's dotConnect Entities provider as the Entity Framework model for my domain service.
When I get time, I will try this on the Nov 2009 RIA release with a standard SQL backend and EF to see if I still have the same issue. If so, I will report it to Microsoft as a Visual Studio bug.
I have deployed an application written in ASP.NET 2.0 into production and it's experiencing some latency issues. Pages are taking about 4-5 seconds to load, and GridView refreshes are taking around the same time.
The app runs fine on the development box. I did the following investigation on the server:
Checked the available memory ... 80% used.
Checked the processor ... 1%
Checked disk IO from perfmon, less than 15%
The server config is
Windows Server 2003 Sp2
Dual 2.0 GHz
2GB RAM
Running SQL Server 2005 and IIS only
Is there anything else I can troubleshoot? I also checked the event log for errors, it's clean.
EDITED ~ The only difference I just picked up is on the DEV box I am using IE7 and the clients are using IE6 - Could this be an issue?
UPDATE ~ I updated all clients to IE8 and noticed a 30% increase in performance. I finally found out I had left debug=true in the web.config file. Setting that to false got the app back to stable performance... I still can't believe I did that.
First thing I would do is enable tracing. (see: https://web.archive.org/web/20210324184141/http://www.4guysfromrolla.com/webtech/081501-1.shtml)
then add tracing points to your page generation code to give you an idea of how long each part of the page build takes:
System.Diagnostics.Trace.Write("Starting Page init", "TraceCheck");
//Init page
System.Diagnostics.Trace.Write("End Page init", "TraceCheck");

System.Diagnostics.Trace.Write("Starting Data Fetch", "TraceCheck");
//Get Data
System.Diagnostics.Trace.Write("End Data Fetch", "TraceCheck");

etc.
this way you can see exactly how long each stage is taking and then target that area.
Double check that your application is not running in debug mode. In your web.config file, check that the debug attribute under system.web\compilation is set to false.
Besides making the application run slower and use more system memory, you will also experience slow page loading, since nothing is cached when in debug mode.
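That is, the relevant section of web.config should look something like this:

<system.web>
  <compilation debug="false" />
</system.web>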
Also check your page size. A developer friend of mine once loaded an entire table into viewstate. A 12 megabyte page will slip by when developing on your local machine, but becomes immediately noticeable in production.
Are you running against the same SQL Server as in your tests or a different one?
In order to find out where the time's coming from you could add some trace statements to your page load, and then load the page with tracing turned on. That might help point to the problem.
Also, what are the specs of your development box? The same?
Depending on the version of Visual Studio you have, Team Developer has a Performance Wizard you might want to investigate.
Also, if you use IE 8, it has a Profiler which will let you see how long the site takes to load in the browser itself. One of the first things to determine is whether the time is client side or server side.
If client side, start looking at what javascript you have and optimize / get rid of it.
If server side, you need to look at all of the performance counters (perfmon). For example, we had an app that crawled on the production servers due to a tremendous amount of JIT going on.
You also need to look at the communication between the web and database server. How long are queries taking? Are the boxes thrashing the disk drives? etc.