We have a painfully slow report. I added a Response.Flush and it seems a great deal better. What are some of the caveats of using this method?
If Response.Buffer is not set to true, you'll get a run-time error. Also, if the Flush method is called on an ASP page, the server does not honor Keep-Alive requests for that page.
You'll also want to watch out if you're using a table-based design, as some browsers won't render a table until the entire table has been sent. That means if you have 10,000 rows, the user would still need to wait for all 10,000 rows to transfer before they'd actually see them.
Expanding Wayne's answer: if anything you do needs to set response headers, you can't do it after any part of the response has been flushed.
There are no inherent problems with flushing the response like this. For performance it is generally recommended to buffer the entire page and then flush it to the client, but for long-running scripts it is often better to send some data early so the user sees that something is happening.
Do remember that flushing manually only has an effect when the page is buffered from the start; otherwise IIS flushes automatically (streams the page to the client).
You should also avoid flushing too often, as IIS will then spend resources on flushing the page instead of processing the script. For example, flush every 50 rows instead of every row.
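A minimal sketch of the every-50-rows idea (shown as ASP.NET/C# rather than classic ASP; GetReportRows is a hypothetical stand-in for your report query):

    protected void Page_Load(object sender, EventArgs e)
    {
        Response.BufferOutput = true; // manual Flush only matters when buffering
        int rowCount = 0;
        foreach (string row in GetReportRows()) // hypothetical data source
        {
            Response.Write("<tr><td>" + Server.HtmlEncode(row) + "</td></tr>");
            if (++rowCount % 50 == 0)
                Response.Flush(); // push the accumulated HTML to the client
        }
    }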
Response.Flush can be useful for sending the browser the report's header, then displaying a "loading" message; then your report process runs and you flush the report, and finally a little piece of JavaScript hides the "loading" message.
This way you will let your users know that something is happening, so they won't press STOP or BACK or just close the window, as they might otherwise be tempted to.
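A rough sketch of that pattern (again ASP.NET/C#; BuildReport is a hypothetical long-running step):

    Response.Write("<div id=\"loading\">Loading report, please wait...</div>");
    Response.Flush(); // the user sees the message immediately

    Response.Write(BuildReport()); // hypothetical long-running report builder

    // hide the loading message once the report markup has been sent
    Response.Write("<script>document.getElementById('loading').style.display='none';</script>");
    Response.Flush();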
Also, I've played a lot with which browsers render which tables, and IE seems to be the only one that doesn't render a table until the closing </table> tag is received. This means that rows can appear gradually in other browsers, but not in IE.
I have a form that dynamically generates a PDF based on database data, but I don't want to navigate away from the form while the PDF is generated and downloaded. So I am using Response.Redirect to call the .aspx page that generates the PDF and serves it via a stream (I have done it this way for many years, so there may be a better option out there now). However, I have found that people are logging out before the PDF has been sent to the browser, which is causing issues.
Is there a way to detect when the Response.Redirect has finished and the file has been downloaded?
I have tried using postMessage and a listener, but this doesn't work.
I've also tried setting up an EndRequestHandler, as below, in my main form:
Sys.WebForms.PageRequestManager.getInstance().add_endRequest(EndRequestHandler);
But this hasn't worked either. The browser is aware, as the tab with the main form shows a progress icon, so there must be a way to intercept the completion event.
It might not be as accurate as you want, but I do something similar. When the button is clicked, a "Please Wait..." message appears until the file is ready to be downloaded. The way I do it is to have the client wait for a cookie: when the server is ready to send the file (after processing), it sets a cookie. Even though the cookie is set in a different request, the client still gets the updated value.
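A hedged sketch of the server side of that cookie trick (the cookie name, token, and pdfBytes are placeholders):

    // Just before streaming the generated PDF, mark completion with a cookie.
    // "fileDownloadToken" is an arbitrary name the waiting page knows about.
    Response.Cookies.Add(new HttpCookie("fileDownloadToken", token) { Path = "/" });
    Response.ContentType = "application/pdf";
    Response.AddHeader("Content-Disposition", "attachment; filename=report.pdf");
    Response.BinaryWrite(pdfBytes); // the generated document
    Response.End();

On the client, a small timer can poll document.cookie for that token and hide the "Please Wait..." message once it shows up.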
Is it possible to have a control that would proxy requests to another domain/Web site, including postbacks?
In this control, you would specify the URL you wanted to execute, and whenever the control executed, it would make a GET request to that other URL and render the returned HTML. (This part is not hard.)
However, when the page is posting back, it would make a POST request, with all of its postback variables intact, to this other page.
I'm really looking for a blind proxy: some control that will take the incoming request, throw it at another URL, and render the results. The other page would really have no idea it wasn't interacting with a human.
I want to think I could develop this, but I can't be the first person who wants to do it, so there has to be some reason why Google isn't revealing the solution to me. I suspect I'm going to run into the same Big Problem that anyone else with this idea has run into.
I'm not exactly sure what the value of this is, which is probably why you haven't found a solution yet.
However, it seems to me that there are two possible solutions.
When the page is rendered, have the control modify the form action to point elsewhere; or,
on postback, have the control execute a web request to the alternate URL with the post variables and decide what to do with the results at that time.
In the end, this never had much of a chance of working. I experimented with it for a while, but postback requires intimate knowledge of the control tree, and there's no way you're going to be able to apply a postback from the calling page to the other page and have it overlay correctly, because the control trees of the two pages are totally different.
Now, if you wanted to write the backend app as a more traditional Web app (even something not in ASP.NET), it might work. During postback, you could iterate the Request.Form values and send them on, and just have your backend app prepared to accept those incoming values and deal with them, but this wouldn't be a traditional postback.
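A rough sketch of that forwarding step (hedged: the target URL is a placeholder, and error handling is omitted):

    // Replay the incoming form fields against another URL and relay the result.
    using (var client = new System.Net.WebClient())
    {
        var form = new System.Collections.Specialized.NameValueCollection();
        foreach (string key in Request.Form)
            form[key] = Request.Form[key]; // copy every posted field verbatim
        byte[] result = client.UploadValues("https://other.example.com/page.aspx", form);
        Response.BinaryWrite(result); // echo the other site's response
        Response.End();
    }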
How does ASP.NET page rendering happen from the server to the client browser? Consider a page that has a header and a footer which are user controls, each containing many server controls.
Does ASP.NET start sending the HTML to the client browser as soon as some of the controls have been rendered and converted to their respective HTML? Or does it wait for the whole page to be rendered and converted to HTML on the server, and only then send the page HTML back to the browser?
I am seeing that the page title of our website shows up much earlier, and then the page takes a long time to load completely. I want to be clear on whether it's the server that is taking the time, or whether client-side scripts, images, etc. are the culprit. Accordingly, we will start the optimizations.
Specifically, I am interested in knowing how the output stream (in the Response object) is sent to the client browser. Is the output stream flushed once the whole page has been rendered into it, or is it sent to the client in batches (i.e. a few controls are rendered and sent to the browser via the output stream, then some more controls are rendered, and so on)?
Sorry if I am not clear enough about the problem.
In terms of debugging, you can turn .NET tracing on to see what's taking the time on the server side,
and use Google Chrome's developer tools or Firebug for Firefox to see what's taking the time on the client side.
I believe this is controlled by Response.BufferOutput or something similar (no reference at hand), which determines whether ASP.NET should start sending out HTML as soon as it's ready, or store it in a buffer and wait until everything is done before sending it.
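A small illustration of that property (hedged; buffering is on by default):

    // With BufferOutput = true (the default), output accumulates server-side
    // and is sent at the end of the request; turning it off streams each
    // write to the client as soon as it happens.
    Response.BufferOutput = false;
    Response.Write("<title>Report</title>"); // reaches the browser immediately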
The answer I was looking for concerned the rendering mechanism: how is the stream sent to the client? There are two possibilities: either it is sent directly, in multiple chunks, as soon as it is generated, or it is cached and stored until the whole page is rendered and only then sent to the client.
I got the answer at: http://www.asp.net/aspnet/overview/aspnet-and-visual-studio-2012/whats-new
"Normally ASP.NET buffers the response bytes as they are created by an application. ASP.NET then performs a single send operation of the accrued buffers at the very end of request processing.
If the buffered response is large (for example, streaming a large file to a client), you must periodically call HttpResponse.Flush to send buffered output to the client and keep memory usage under control. However, because Flush is a synchronous call, iteratively calling Flush still consumes a thread for the duration of potentially long-running requests."
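For long-running responses, that same article also describes asynchronous flushing in .NET 4.5+ via HttpResponse.BeginFlush/EndFlush; a minimal sketch, assuming you are inside an async page method:

    // Wrap BeginFlush/EndFlush in a Task so the flush doesn't block a thread.
    await Task.Factory.FromAsync(Response.BeginFlush, Response.EndFlush, null);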
Thank you all for your help!!!
User controls are rendered before controls on the .aspx page itself.
Take a look at the Page Life Cycle
Fiddler should help determine where the bottleneck is. If you're seeing the page title show up but the page doesn't render for a while afterward, I'd suspect there are other files (images, JavaScript, CSS, etc.) holding up the page from rendering in the browser, rather than the HTML in the page.
Page rendering: at this stage, the view state for the page and all controls is saved. The page calls the Render method of each control, and the output of rendering is written to the OutputStream of the page's Response property.
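To illustrate (a hedged sketch of a custom control): every control's Render writes into the same HtmlTextWriter, whose output ultimately feeds Response.OutputStream.

    protected override void Render(HtmlTextWriter writer)
    {
        writer.Write("<!-- emitted by this control -->"); // goes straight into the page output
        base.Render(writer); // render child controls as usual
    }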
Consider the scenario:
I visited a page of a website built using ASP.NET. The page is a simple aspx page containing ASP.NET server controls.
I clicked on a link which takes me to some other page on the same website.
I clicked the BACK button of the browser.
QUESTION: What happens in terms of the page life cycle? Do all the events occur, or does the browser just display the cached version of the page without making any requests?
I think the best answer is: It depends on the browser, especially after a post/postback.
Older browsers used to pop up a confirmation dialog to the effect of "the page contains POST data which will be resubmitted", and you could either proceed (resubmit) or cancel out. Since everything that happens in ASP.NET WebForms is part of the FORM element (ViewState, events, etc.), this would cause the entire lifecycle to be repeated.
Of course, this caused no end of trouble with duplicate submissions, so many sites had to come up with workarounds for the dupe problem, and today most browsers just fetch the page from cache instead.
...That is unless you override the cache-control headers and force the browser not to store the page in cache. Obviously, in that case, it can't be retrieved from cache, so it will usually end up being resubmitted. But, again, it depends on the browser - for example, some browsers won't allow the resubmission over SSL, so if that's the protocol in use then the user will just see a message saying that the page has expired / can't be shown.
Come to think of it, probably an even better answer is: As a site designer, you really can't depend on any specific behaviour from the user's browser when the Back button is clicked. If a duplicate submission could have negative side-effects (such as charging a credit card twice), then you need to take adequate measures to prevent that from happening. It's good practice anyway as it's entirely possible for a user to simply double-click the "submit" button by accident.
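One standard such measure (my wording, not the answer's) is the Post/Redirect/Get pattern: handle the POST, then redirect, so that Back or refresh re-issues a harmless GET. A minimal sketch:

    protected void Submit_Click(object sender, EventArgs e)
    {
        ProcessOrder(); // hypothetical side-effecting work, e.g. charging the card
        Response.Redirect("Confirmation.aspx"); // history entry becomes a GET
    }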
We have even tried the following to resolve this kind of problem:

Response.ExpiresAbsolute = DateTime.Parse("1/1/1980");
Response.AddHeader("cache-control", "no-store, must-revalidate, private");
Response.AddHeader("Pragma", "no-cache");
The page would be displayed from cache.
Usually all the events should occur, but a sufficiently clever browser may display a cached page instead.
You can put a breakpoint in your Page_Load and see whether it gets hit.
I need to read data from an online database that's displayed using an aspx page from the UN. I've done HTML parsing before, but it was always by manipulating query-string values. In this case, the site uses ASP.NET postbacks: you click on a value in box one, then box two shows; you click on a value in box two and click a button to get your results.
Does anybody know how I could automate that process?
Thanks,
Mike
You may still only need to send one request, but that one request can be rather complicated. ASP.NET is notoriously difficult (though not impossible) to screen scrape. Between event validation and the ViewState, it's tricky to get your requests just right. The simplest way to do it is often to use a sniffer tool like Fiddler to see exactly what the HTTP request looks like, and then just mimic that request.
If you do still need to send two requests, it's because the first request also places some state in a session somewhere, and that means whatever you use to send those requests needs to be able to send them with the same session. This often means supporting cookies.
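A hedged sketch of replaying a postback by hand (the URL and control names are placeholders; the viewState and eventValidation values must be scraped from the previous response):

    // Same session via CookieContainer; hidden fields copied from the page.
    var cookies = new System.Net.CookieContainer();
    var request = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(
        "http://example.org/page.aspx");
    request.Method = "POST";
    request.ContentType = "application/x-www-form-urlencoded";
    request.CookieContainer = cookies; // reuse this across requests
    string body = "__EVENTTARGET=ddlBox1"
        + "&__VIEWSTATE=" + Uri.EscapeDataString(viewState)
        + "&__EVENTVALIDATION=" + Uri.EscapeDataString(eventValidation)
        + "&ddlBox1=someValue";
    byte[] bytes = System.Text.Encoding.UTF8.GetBytes(body);
    request.ContentLength = bytes.Length;
    using (var stream = request.GetRequestStream())
        stream.Write(bytes, 0, bytes.Length);
    string html;
    using (var response = request.GetResponse())
    using (var reader = new System.IO.StreamReader(response.GetResponseStream()))
        html = reader.ReadToEnd(); // the next page, ready for parsing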
WatiN would be my first choice. You would code the selecting and clicking, then parse the HTML afterwards.
I'd look at HtmlAgilityPack with the FormProcessor addon.