How does ASP.NET page rendering happen, from the server to the client browser? Specifically, consider a page whose header and footer are user controls, each containing many server controls.
Does ASP.NET start sending HTML to the client browser as soon as some of the controls have been rendered and converted to their respective HTML? Or does it wait for the whole page to be rendered and converted to HTML on the server, and only then send the page HTML back to the browser?
I am seeing that the page title of our website appears early, and then the page takes a long time to finish loading. I want to be clear on whether it is the server that is taking the time, or whether client-side scripts, images, etc. are the culprit; we will start our optimizations accordingly.
Specifically, I am interested in how the output stream (in the Response object) is sent to the client browser. Is the output stream flushed once the whole page has been rendered into it, or is it sent to the client in batches (i.e. a few controls are rendered and sent to the browser via the output stream, then some more controls are rendered, and so on)?
Sorry if I am not clear enough on the problem.
In terms of debugging, you can turn .NET tracing on to see what's taking the time on the server side, and use Google Chrome or Firebug for Firefox to see what's taking the time on the client side.
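For reference, a minimal web.config sketch for enabling tracing (the requestLimit value here is an arbitrary choice):

```xml
<!-- Sketch: enable ASP.NET tracing -->
<configuration>
  <system.web>
    <!-- pageOutput="true" appends trace details to each page;
         with "false" you view traces at /trace.axd instead -->
    <trace enabled="true" pageOutput="true" requestLimit="40" />
  </system.web>
</configuration>
```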
I believe this is controlled by Response.BufferOutput or something similar (no reference at hand), which determines whether ASP.NET starts sending out HTML as soon as it's ready, or stores it in a buffer and waits until everything is done before sending it.
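If that's right, a minimal sketch in a Web Forms code-behind would look something like this:

```csharp
// Sketch: turn off output buffering so HTML is sent to the
// client as soon as each piece is rendered, rather than in
// one send at the end of the request (the default).
protected void Page_Load(object sender, EventArgs e)
{
    Response.BufferOutput = false;
}
```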
The answer I was looking for concerned the rendering mechanism: how the stream is sent to the client. There are two possibilities: either it is sent directly, in multiple chunks, as soon as it is generated, or it is cached and stored until the whole page is rendered and then sent to the client.
I got the answer at: http://www.asp.net/aspnet/overview/aspnet-and-visual-studio-2012/whats-new
"Normally ASP.NET buffers the response bytes as they are created by an application. ASP.NET then performs a single send operation of the accrued buffers at the very end of request processing.
If the buffered response is large (for example, streaming a large file to a client), you must periodically call HttpResponse.Flush to send buffered output to the client and keep memory usage under control. However, because Flush is a synchronous call, iteratively calling Flush still consumes a thread for the duration of potentially long-running requests."
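To make that concrete, here is a minimal sketch of the periodic-flush pattern the documentation describes, assuming page or handler context; largeFilePath and the chunk size are illustrative:

```csharp
// Sketch: stream a large file in chunks, flushing periodically
// so buffered output goes to the client and memory stays bounded.
using (var stream = System.IO.File.OpenRead(largeFilePath))
{
    var buffer = new byte[64 * 1024];
    int read;
    while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        Response.OutputStream.Write(buffer, 0, read);
        Response.Flush(); // synchronous: holds the thread while sending
    }
}
```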
Thank you all for your help!
User controls are rendered before the controls on the .aspx page itself. Take a look at the Page Life Cycle.
Fiddler should help determine where the bottleneck is. If you're seeing the page title show up but the page doesn't render for a while afterwards, I'd suspect there are other files (images, JavaScript, CSS, etc.) holding up the page from rendering in the browser, rather than the HTML in the page.
Page rendering: at this stage, view state for the page and all controls is saved. The page calls the Render method of each control, and the output of rendering is written to the OutputStream of the page's Response property.
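For illustration, a minimal sketch of hooking that stage in a page or control (the emitted comment text is arbitrary):

```csharp
// Sketch: the Render stage hands each control an HtmlTextWriter
// whose output ultimately goes to Response.OutputStream.
protected override void Render(HtmlTextWriter writer)
{
    writer.Write("<!-- emitted before the control tree renders -->");
    base.Render(writer); // renders this control and its children
}
```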
I have a form that dynamically generates a PDF based on database data, but I don't want to navigate away from the form while the PDF is generated and downloaded. So I am using Response.Redirect to call the .aspx page that generates the PDF and serves it via a stream (I have done it this way for many years, so there may be a better option out there now). However, I have found that people are logging out before the PDF has been sent to the browser, which is causing issues.
Is there a way to detect when the Response.Redirect has finished and the file has been downloaded?
I have tried using postMessage and a listener, but this doesn't work.
I've also tried setting up an EndRequestHandler, as below, in my main form:
Sys.WebForms.PageRequestManager.getInstance().add_endRequest(EndRequestHandler);
But this hasn't worked either. The browser is aware, as the tab with the main form shows a progress icon, so there must be a way to intercept the complete event.
It might not be as accurate as you want, but I do something sort of similar. When the button is clicked, a "Please wait..." message appears until the file is ready to be downloaded. The way I do it is to have the client wait for a cookie: when the server is ready to send the file (after processing), it sets the cookie. Even if the cookie was set in a different request, the client still gets the updated value.
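On the server side, that could look something like the following sketch; the cookie name, the GeneratePdf helper, and the file name are all illustrative assumptions:

```csharp
// Sketch: in the handler that produces the PDF, set a marker
// cookie just before streaming the file; the client polls
// document.cookie for it and hides the "Please wait" message.
byte[] pdfBytes = GeneratePdf(); // hypothetical generator

Response.Cookies.Add(new HttpCookie("fileDownloadToken", "done"));
Response.ContentType = "application/pdf";
Response.AddHeader("Content-Disposition", "attachment; filename=report.pdf");
Response.BinaryWrite(pdfBytes);
```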
I am using multi-threading to update and display the content of a page. The page uses multiple (and nested) update panels. Right now, I am using the following logic to update the page.
I have seven threads; each thread gets data by querying the database and displays it in a specific section of the page. We start the threads and wait for 2 minutes; if some threads are still working after those 2 minutes, we abort them and display whatever data has been populated on the page. These thread calls are made in the page's Load event.
The problem is that we must wait for that fixed time before the page loads; only after the time limit does the page get displayed with the populated data. Users have to wait a long time to see the page, which makes a really bad impression.
If we remove the 2-minute limit, the page renders fast but does not display all the data.
What I want is this: when we start the threads, we should not have to wait for all of them. As soon as one thread completes, it should show its data on the page, and as the other threads complete, they should display their data accordingly.
I found a solution to this problem after testing many techniques.
To implement this, we don't need threading at all. When a page is requested, the server creates an object for that page, performs all the required processing, renders the page, destroys the page object on the server, and sends the rendered page to the client browser. So after rendering, we can't receive a thread's response; to receive it, we would have to explicitly stop the page from being rendered, which causes exactly the delay we want to avoid.
So our solution is to use JSON AJAX calls to Web Methods (with serialization and deserialization if you are dealing with complex objects).
We load the page with all its controls, and in the page's load event on the client, call a JavaScript method that invokes a Web Method via JSON. After receiving each response, we update the respective controls manually using JavaScript/jQuery, so each section appears as soon as its data arrives. A server-side sketch is below.
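As a sketch of the server side (the method and helper names are made up for illustration):

```csharp
// Sketch: a static page method callable from client script,
// e.g. via a JSON POST to Page.aspx/GetSectionData. Each page
// section issues its own call, so sections fill in independently.
[System.Web.Services.WebMethod]
public static object GetSectionData(string sectionName)
{
    // QuerySection is a hypothetical data-access helper;
    // ASP.NET serializes the return value to JSON automatically.
    return QuerySection(sectionName);
}
```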
I have a ListView with 250 rows and 4 columns in my ASP.NET 4.0/C# application. The rendered page size (from Trace) is 650,000 bytes. The entire ListView is in an update panel.
The ListView facilitates view/add/edit/delete operations on its records.
Every postback action (i.e. an edit click or a delete click) causes a postback request of 112,000 bytes and an AJAX response of ~650,000 bytes.
The ListView gets its data from a declarative data source (SqlDataSource) on the page, and it is re-bound on every round trip.
I want to reduce the data going back and forth on every call, because on a slow connection these AJAX calls take 2-3 minutes to complete.
What I have tried: I removed the update panel over the entire ListView and added an update panel over the contents of each of the following:
ItemTemplate
AlternatingItemTemplate
EditItemTemplate
InsertItemTemplate
I was hoping that with an update panel in each row template, the size of the AJAX response would shrink, since only the HTML for the affected update panel would come back. Unfortunately, it does not seem to work that way.
Any input on how the problem in my case can be solved?
Thanks in advance for looking into this.
The problem with an UpdatePanel is that you are not using real AJAX. Instead, ASP.NET uses some really clever hacks to create the illusion of a partial page update; in the background, your whole page life cycle is executed. This also means that your complete ViewState is sent back and forth.
If you want a faster experience, you should not use UpdatePanels. Instead, use plain HTML controls (preferably not even server controls) and use JavaScript with a server-side web service (such as Web API or a WCF service) to respond to the client-side requests.
Those requests and responses will contain only some JSON data and no markup, so the payload can be kept to a minimum. If, for example, a user removes a row, you only have to send the ID of the row to your service, and it returns success or failure. The client then uses JavaScript, and maybe something like KnockoutJS, to render the result. This gives you minimal overhead and better performance.
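A minimal Web API sketch of that row-delete service (the route comment and the DeleteRowFromDatabase helper are illustrative assumptions):

```csharp
// Sketch: a Web API controller that handles a row delete with a
// small JSON round trip instead of a full UpdatePanel postback.
public class RowsController : System.Web.Http.ApiController
{
    // DELETE api/rows/5
    public System.Web.Http.IHttpActionResult Delete(int id)
    {
        bool removed = DeleteRowFromDatabase(id); // hypothetical helper
        if (!removed)
            return NotFound();
        return Ok();
    }
}
```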
The best possible way to do this is not to use the ASP.NET server controls, and instead do it cleanly using JavaScript, JSON, HTML, and a server-side web service or HTTP handler.
That way you don't have to send large HTML responses from the server to the client, and you can also control when you need to refresh and rebind your data.
I bet the whole size issue has to do with ViewState, because on every postback, even an AJAX postback, the ViewState travels with the request. The only thing you can do without making any changes is to enable compression on the IIS side; the response will at least be sent compressed, and the browser will take care of decompressing it.
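For example, a web.config sketch for IIS 7+ (assuming the dynamic compression feature is installed):

```xml
<!-- Sketch: enable static and dynamic response compression -->
<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>
```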
The best approach is not to use UpdatePanel and ScriptManager at all, and instead make AJAX requests using jQuery (or whatever framework you prefer) that invoke a WCF web service. This does not trigger the full page life cycle and does not send the ViewState on every request.
In my ASP.NET application I have a requirement that when a user clicks on a UI element, we generate a PDF for them to download. This is currently implemented as a form post to an ashx page, which inspects the form and then executes the correct server-side page, resulting in either HTML or a PDF document of that page's HTML.
On the client I know ahead of time whether we are going to get a PDF or HTML. When it's HTML, I open a new window and direct the form post to that window, and all works well. When it's a PDF, I don't change the target for the form, and it remains on the current page.
This works: the user is presented with a save dialog, and the current page is not changed or lost.
The problem I have is that generating the PDF takes anywhere from 1 to 15 seconds. What I want to do is show a "please wait" popup. Displaying the popup is going to be easy; what I am not sure of is how I know when to close it. The popup will be a div in the current page.
The popup can have a client-side timer that polls the server for task completion. The long-running server task should record its progress in a database table or a server cache object that the polling service can read.
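A sketch of what the polled endpoint could look like (the handler name, task key, and cache key are illustrative assumptions):

```csharp
// Sketch: a polling endpoint for the popup's timer. The
// long-running task writes a flag into HttpRuntime.Cache under
// a per-task key; this handler just reports that flag as JSON.
public class ProgressHandler : System.Web.IHttpHandler
{
    public void ProcessRequest(System.Web.HttpContext context)
    {
        string taskId = context.Request["taskId"]; // hypothetical key
        object done = System.Web.HttpRuntime.Cache["pdfDone_" + taskId];

        context.Response.ContentType = "application/json";
        context.Response.Write(done != null ? "{\"done\":true}" : "{\"done\":false}");
    }

    public bool IsReusable { get { return true; } }
}
```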
A couple of old articles from MSDN Magazine; you should be able to use the same concepts with newer libraries like ASP.NET AJAX:
Reporting Task Progress With ASP.NET 2.0
Simplify Task Progress with ASP.NET "Atlas"
Just have some JavaScript on the client side, let it show an animated GIF for 1-15 seconds (your choice), and close itself after the designated time.
Gulzar's suggestion was spot on. I have a simple AJAX-enabled WCF service that checks a session variable. My ashx page sets the variable to false when it starts processing and then to true when it's done.
I think there might be a race condition if the client checks before we set the session item to false; however, there are ways around that if we modify the service to set the session item to false after a client gets an "I'm done" response.
The trick is still going to be figuring out what the polling interval on the client should be. If we set it too low, the user could save the file and then still see the "processing" message. I'm debating between half a second and a second; anything less than half a second seems unnecessary.
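For reference, a sketch of the handler side of that session-flag approach (names are illustrative; note that an ashx handler must implement IRequiresSessionState for Session access to work):

```csharp
// Sketch: flip a session flag around the long-running work;
// the AJAX-enabled service polls the same flag.
public class PdfHandler : System.Web.IHttpHandler,
                          System.Web.SessionState.IRequiresSessionState
{
    public void ProcessRequest(System.Web.HttpContext context)
    {
        context.Session["pdfDone"] = false; // processing started

        byte[] pdf = GeneratePdf(); // hypothetical generator

        context.Session["pdfDone"] = true;  // read by the polling service

        context.Response.ContentType = "application/pdf";
        context.Response.BinaryWrite(pdf);
    }

    public bool IsReusable { get { return false; } }
}
```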
You said:
"When it's a PDF I don't change the target for the form and it remains on the current page."
If that is the case, then the original page will be gone when the PDF is opened. In that situation I would have an animated loading GIF and open it using JavaScript into a div overlaying the rest of the page. You would not need to close it, so no timer or polling is needed; it would just be gone when the page is gone.
We have a painfully slow report. I added a Response.Flush and it seems a great deal better. What are some of the caveats of using this method?
If Response.Buffer is not set to true, you'll get a run-time error. Also, if the Flush method is called on an ASP page, the server does not honor Keep-Alive requests for that page.
You'll also want to watch out if you're using a table-based design, as some browsers won't render a table until the entire table has been sent, meaning that if you have 10,000 rows, the user would still need to wait for all 10,000 rows to transfer before actually seeing any of them.
Expanding on Wayne's answer: if anything you do needs to set Response.Headers, you can't do it after any part of the response has been flushed.
There are no problems with flushing the response like this. For performance it is generally recommended to buffer the entire page and then flush it to the client, but for long-running scripts it is often better to send some data early so the user sees that something is happening.
Do remember that flushing manually only has a proper effect when you buffer the page from the start; otherwise IIS flushes automatically (streaming the page to the client).
You should also avoid flushing too often, since IIS then has to spend resources on flushing the page instead of processing the script. For example, flush every 50 rows instead of every row, as in the sketch below.
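A minimal sketch of that pattern (GetReportRows and RenderRow are hypothetical helpers):

```csharp
// Sketch: buffer from the start, then flush every 50 rows
// instead of every row.
Response.Buffer = true;

int count = 0;
foreach (var row in GetReportRows())
{
    Response.Write(RenderRow(row));
    if (++count % 50 == 0)
        Response.Flush(); // push the accumulated rows to the client
}
```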
Response.Flush can be useful to send the report's header to the browser, then display a "loading" message, then run the report processing and flush the report, and finally execute a little piece of JavaScript to hide the "loading" message.
This way you let your users know that something is happening, so they won't press Stop or Back, or just close the window, as they may otherwise be tempted to do.
Also, I've played a lot with which browsers render which tables, and IE seems to be the only one that doesn't render a table until its closing tag is received. This means that rows can appear gradually in other browsers, but not in IE.