I can't believe how many articles I have read and how much example code I've tried to understand, without success in accomplishing what I need to do. Hopefully someone can help me out by providing some example code or pointing me to another resource.
I am working in ASP.NET 4.0, C# using VS 2012 Express. The project is a web site. I have a UI page that contains checkbox controls and dropdowns for a user to set preferences. At the bottom of the page is a checkbox for the user to agree to some terms and conditions and an image button they click to get the results based on their preferences. This all works and so does the results page.
The problem is that the results page takes as much as 90 seconds to complete. I can't just leave the user staring at "Loading..." in the status bar of their browser. So what I want is:
User clicks to get results
Results page loads immediately
Once the results page has loaded, a call is made to the server that begins the 90-second process.
User is provided a status display of the progress.
Once process completes, a link is presented for them to view their results.
I see many examples with varied approaches. Most of them require the user to click a button to begin the process. I don't want the user to have to click another button - I just want the process to begin when the results page is finished loading.
Additionally, I'm looking for an idea of how to code the client page to make calls to the server to obtain the status of the process, preferably a percentage value and a text message for each step of the process, e.g.:
25%
Compressing files...
I've seen some Web Method examples, but I don't think I've seen a single one that demonstrates starting the process without having to click a button to invoke it. I considered putting the JavaScript call in the <body onload> attribute, but that tag is contained in a Master.Master that is used by many other pages.
Does anyone know of any code examples that might help me accomplish this sort of thing?
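To make the shape of what I'm after concrete, here is a rough sketch of what I'm imagining; every name in it is hypothetical, and the work itself is faked with a Sleep:

// Results.aspx.cs (sketch) - page methods that start the work and report progress.
// A background thread like this can be killed by an app-pool recycle; this is
// only meant to illustrate the start/poll shape.
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Web.Services;

public partial class Results : System.Web.UI.Page
{
    // progress store shared between the worker thread and the polling calls
    private static readonly ConcurrentDictionary<string, Tuple<int, string>> Jobs =
        new ConcurrentDictionary<string, Tuple<int, string>>();

    [WebMethod]
    public static string StartWork()
    {
        string jobId = Guid.NewGuid().ToString("N");
        Jobs[jobId] = Tuple.Create(0, "Starting...");
        ThreadPool.QueueUserWorkItem(_ =>
        {
            Jobs[jobId] = Tuple.Create(25, "Compressing files...");
            Thread.Sleep(5000);                         // stand-in for real work
            Jobs[jobId] = Tuple.Create(100, "Done");
        });
        return jobId;
    }

    [WebMethod]
    public static object GetStatus(string jobId)
    {
        Tuple<int, string> t;
        Jobs.TryGetValue(jobId, out t);
        return new { pct = t == null ? 0 : t.Item1, msg = t == null ? "" : t.Item2 };
    }
}

// Results.aspx (sketch) - kicks off as soon as the page has loaded, no button needed
$(function () {
    $.ajax({ type: "POST", url: "Results.aspx/StartWork",
             contentType: "application/json; charset=utf-8" })
     .done(function (r) { poll(r.d); });             // ASP.NET wraps the result in .d

    function poll(jobId) {
        $.ajax({ type: "POST", url: "Results.aspx/GetStatus",
                 data: JSON.stringify({ jobId: jobId }),
                 contentType: "application/json; charset=utf-8" })
         .done(function (r) {
             $('#pct').text(r.d.pct + '%');
             $('#msg').text(r.d.msg);
             if (r.d.pct < 100) setTimeout(function () { poll(jobId); }, 1000);
             else $('#resultsLink').show();          // present the link when done
         });
    }
});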
I found a really nice solution from EssentialObjects.com. Their progress bar control is free, although I needed their CallBack custom control to do what I wanted - the CallBack control is not free.
Hope this helps someone else out!
Hello, I'm trying to scrape data from https://eservicios2.aguascalientes.gob.mx/sop/geobras/UI/frmObrasTodas.aspx
I can get the data from the main page, but I don't know how to get the data from the detail form:
a) when I choose a row and ask for "Detalle" (detail), it goes to a form;
b) I don't know how to follow that link.
I need to get the data for each row. Can anybody help me?
The main issue is that this is an ASP.NET web site. When you select a row, that likely fires a server-side event. You MIGHT be able to write some JavaScript to select a row, but the next issue is even more of a challenge: once you select a row, you then have to click a button, and that button runs server-side code which reads the selected row value - again, server-side code. This is unlike a simple web site driven by hyperlinks.
ASP.NET sites are driven from VB.NET or C# code; they don't use plain hyperlinks, or even parameters in the URL, to drive the site.
So, after you select a row (perhaps possible in JS), you would then have to click the details button. This again can be done with JavaScript.
Say, in jQuery, like this:
$('#NameOfButton').click();
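In a WebForms page, row selection is usually wired through the __doPostBack function that ASP.NET itself emits, so a call along these lines may be closer to what the page expects (the event target and argument here are guesses; read the rendered page source to find the real values):

// Event target and argument are guesses modeled on typical GridView output
__doPostBack('ctl00$MainContent$gvObras', 'Select$0');   // select the first row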
So ASP.NET sites don't use the simple code you see from someone who took a three-day web developer program promising that you are now an experienced web developer. There are no "links" for each row; there is only code on the server side that runs to pull the data from the database, render that information, and THEN send it down as HTML markup.
The bottom line?
The site is not simply HTML with hyperlinks you click on. When you click that button, the code behind (written in a nice language like C# or VB.NET) runs. No markup or JavaScript tricks are involved; you are talking about clean server-side code, written in a fantastic IDE (Visual Studio).
This means that aspx web sites are code-behind driven and, as a result, rather difficult to web scrape in an automated fashion. You can grab the page you are on, but since there are no hyperlinks to the additional data (such as the details), you don't have a simple URL to follow or trace.
Worse yet, the setup code (what occurs when you select a single row) also has to run in most cases. Only if all values are set up 100% correctly BEFORE hitting the "details" button will this work. And notice that the details page has no parameters in its URL: not only does the correct code behind have to run BEFORE the second details page launches, but that second page very likely also checks that the previous request came from the same site, so you can NOT just type in a URL for the second page - it will not work.
In fact, if you look even closer: when you hit the details button, the page re-loads and renders what is CLEARLY a whole new page and layout.
But note how the URL does NOT change! They are not even using an iframe for this.
That is because they are using what is called a server-side redirect (in ASP.NET terms, a server-side transfer). The tell-tale sign is that the URL remains the same while the whole page layout is 100% different: the code behind navigated to a whole new page and sent it down to the client, and because the browser did not initiate that navigation, the server can send out anything it wants, including a whole new page, without the URL ever changing.
Again, this is typical of ASP.NET systems in which server-side code drives the web site, with not much client-side code.
You "might" be able to automate scraping. But you would need some custom code to select a given row, and then some code to click the details button. And that's going to be a REAL challenge, since any changes to the web page code (by you) also tend to be check for, and not allow server side.
The only practical web scrape approach would be to use some desktop tools to create a WHOLE instance of the web browser, let you the user navigate to the given web page that displays the data, and then hit some "capture" button in your application that now reads and parses out the data like you doing now for the main page.
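If you do go the browser-automation route, a tool such as Selenium WebDriver can host that browser instance for you. A rough C# sketch, where every element ID is a guess that would have to be replaced after inspecting the real page:

// NuGet: Selenium.WebDriver, Selenium.Support, Selenium.WebDriver.ChromeDriver
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;
using OpenQA.Selenium.Support.UI;

class ObrasScraper
{
    static void Main()
    {
        using (IWebDriver driver = new ChromeDriver())
        {
            driver.Navigate().GoToUrl(
                "https://eservicios2.aguascalientes.gob.mx/sop/geobras/UI/frmObrasTodas.aspx");

            // A real browser runs the page's own JavaScript, so clicking a row
            // fires the server-side selection event for us.
            driver.FindElement(By.CssSelector("#gridObras tr:nth-child(2)")).Click();
            driver.FindElement(By.Id("btnDetalle")).Click();      // the "Detalle" button

            // Wait for the server-side transfer to render the detail view.
            var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(10));
            wait.Until(d => d.FindElements(By.Id("pnlDetalle")).Count > 0);

            Console.WriteLine(driver.PageSource);   // parse the detail HTML here
        }
    }
}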
I'm trying to describe it in as few steps as possible:
I have Page1.aspx with a lot of controls, among them Preview and Save buttons. I also have Page2.aspx, which is the redirection target of a Preview button click.
Since I need all the control selections from Page1 to draw a preview on Page2, the redirection is done by setting the Preview button's PostBackUrl.
I also need the preview shown in a new tab or window, so I used onClientClick="aspnetForm.target='_blank'" in the Preview button definition.
The Save button's click handler, after storing the data to a database, redirects to Page0.aspx (the initial list of reports).
The Preview button works fine - the preview renders in a new tab - but when I go back to the old tab and click Save, I see in the debugger that first Page2.aspx(?) and then Page1.aspx are loaded. All the data is stored in the DB, but although the redirect to Page0 executes, Page1.aspx stays loaded in the browser.
I have no idea what processes are behind this. Could someone who knows give me some insight? Or, if you consider my approach impossible to implement, suggest another way to do the same thing?
If it's of importance, everything on Page1 is inside an UpdatePanel.
Thank you very much for replying
In ASP.NET there are basically zero circumstances in which you will ever need to send form data from one page to another. Although what exactly you are trying to accomplish is vague, you can consider some of the following:
Isolate unique operations/systems to a single page. If you have something like a User Profile, don't create three different aspx pages; use a single page for the user or admin to manage that data and its functions. Postback events are your friend.
Understand the difference between ViewState and traditional form data. I'm guessing that if you're trying to post form data from one page to another, you probably don't understand the point of ViewState. Using a single page to maintain temporary data that the user is currently working with is a great use of ViewState; a minimal sketch follows this list. If you want the data to appear on another page, then you should consider the data from the previous page final, which means saving it to a database or some other medium.
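For example, a minimal sketch of that second point (the control and helper names here are made up):

// Keep the user's in-progress input in ViewState across postbacks on the
// SAME page; persist it to the database only once it is final.
protected void btnNext_Click(object sender, EventArgs e)
{
    ViewState["Draft"] = txtNotes.Text;      // survives the next postback
}

protected void btnFinish_Click(object sender, EventArgs e)
{
    string draft = (string)(ViewState["Draft"] ?? "");
    SaveToDatabase(draft);                   // hypothetical helper
}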
These are just some general guidelines because there is no exact answer to your problem without saying something generic like "You're doing it wrong." I would recommend starting by never again trying to post form data from one aspx page to another.
I have a performance issue with a two-page setup that is part of a workflow in a bigger system. This section is dedicated to rendering reports, allowing users to choose their own parameters.
Page1.aspx collects parameter information for a report. It takes the information submitted on a form and validates it. If it validates OK, it stores the selections in the DB as XML, then redirects to Page2.aspx with the run id in the query string. Simple enough, performance is great.
Page2.aspx pulls the ID out of the DB and hydrates a Crystal ReportDocument object (taking milliseconds), then we call ExportToHttpStream, which renders the report as a PDF, DOC or XLS download (the output format is determined in Page1.aspx). The performance of the ExportToHttpStream method is very poor due to the way our reports are written and the DB indexes on the target system. This is outwith my control at the moment, but I am promised that they are being worked on.
So the problem is that when the submit button on Page1.aspx is pressed, the user experiences a very long delay before the download starts. It is then compounded by the user pressing the submit button again, thinking there is a problem.
I think what I need to do is have Page1.aspx redirect to Page2.aspx. Page2.aspx should render the master page furniture and a loading div, and the report should render asynchronously in the background before the save dialogue automatically pops up; after this I'd like to change the loading div to a 'Report generated, click here to go back' message.
If this is the best way to achieve this, how can I load a full page, then request the report asynchronously? I'm open to any suggestions here.
You could use AJAX to load the report on Page2.aspx and show a loading message while it's processing; a sketch follows the outline below.
Look at the jQuery.load() method. This might be the easiest way to accomplish what you are trying to do.
Page1.aspx - collect parameters
Page2.aspx - report view, calls Page2Details.aspx via ajax.
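A sketch of the Page2.aspx side (the element IDs and run id value are assumptions):

// Page2.aspx - show a message, then pull the slow report body in via ajax
$(function () {
    $('#status').text('Generating report, please wait...');
    $('#report').load('Page2Details.aspx?runid=123', function () {
        $('#status').text('Report generated.');
    });
});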
Try loading Page2.aspx inside an iframe and use jQuery to display a waiting indicator, hiding it once Page2.aspx has finished downloading.
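Roughly like this (a sketch; the element IDs are made up):

// Show a spinner until the iframe carrying Page2.aspx finishes loading
$('#wait').show();
$('<iframe src="Page2.aspx?runid=123" width="100%" height="600"></iframe>')
    .on('load', function () { $('#wait').hide(); })
    .appendTo('#reportArea');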
Whilst both answers gave me some ground to go and research in the right direction, my solution involved using the fileDownload plugin from John Culviner:
jQuery fileDownload by John Culviner
This allowed me the following page structure:
Page1.aspx gathers and validates the parameters for the report and puts them into Oracle.
Page2.aspx, which is passed the run id (a pointer to the parameters in the DB) via the query string, sets up three hidden divs: Loading, Error and Success.
The script mentioned above is employed at this point: jQuery first makes the Loading div visible, then calls the plugin. The plugin dynamically creates an iframe and downloads the binary (XLS/DOC/PDF) from Page3.aspx, then fires a success or failure callback. The success callback is fired by means of a cookie set at the end of the response in Page3.aspx.
I believe the plugin uses the iframe approach to work around the limitation that there is no way to handle an octet-stream download via an AJAX call in jQuery.
It works; it's not the cleanest solution by any means, and it doesn't degrade gracefully at all, but it provides the users on our controlled intranet with an extremely responsive and pleasing UI.
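For reference, the plugin is invoked along these lines (a sketch; the option names match older releases of jquery.fileDownload and may differ in newer ones):

// Kick off the download and flip the hidden divs on the cookie-based callbacks
$('#divLoading').show();
$.fileDownload('Page3.aspx?runid=123', {
    successCallback: function (url) {
        $('#divLoading').hide();
        $('#divSuccess').show();
    },
    failCallback: function (responseHtml, url) {
        $('#divLoading').hide();
        $('#divError').show();
    }
});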
I have a wizard style interface where I need to collect data from users. I've been asked by my managers that the information is to be collected in a step by step type process.
I've decided to have a page.aspx with each step of the process as a separate user control: step1.ascx, step2.ascx, etc.
The way it works now is that when the initial GET request comes in, I render the entire page (which sits inside a master page) along with step1.ascx. When the next POST request comes in for step 2 (using query string step=2), I render only step2.ascx to the browser, by overriding the Render(HtmlTextWriter) method, and use the jQuery html() method to replace the contents of a div.
The problem with this whole approach, besides being hacky (in my opinion), is that it's impossible to update ViewState, as that is usually handled server side.
My workaround is to store the contents of step1.ascx in temporary session storage, so that if the user clicks the Back button to go back one step, I can spit out the values that were stored for it previously.
I feel I'm putting my techie hat on here in wanting to try the latest JavaScript craze, as jQuery with .NET has taken a lot of hack-like approaches and reverse engineering to get right. Would it be easier to simply use an UpdatePanel and be done with it, or is there a site with a comprehensive resource on using jQuery to do everything in ASP.NET?
Thanks for taking the time to read this.
Another approach, which might be easier to work with, is to load the entire form with the initial GET request and then hide all sections except the first one. You then use jQuery to hide and show different parts of the form, and when the final section is shown, the entire form is posted in one POST to the server. That way you can handle the input on the server just as if the data entry was done in one step, and still get the step-by-step experience on the client side.
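A sketch of that idea (the class and element IDs are made up):

// All steps are rendered by the initial GET; only one div is visible at a time.
$(function () {
    var step = 0,
        $steps = $('.wizard-step');     // one div per step, all inside one form

    $steps.hide().eq(0).show();

    $('#btnNext').click(function () {
        $steps.eq(step).hide();
        step = Math.min(step + 1, $steps.length - 1);
        $steps.eq(step).show();
        return false;                   // suppress the postback until the last step
    });

    $('#btnPrev').click(function () {
        $steps.eq(step).hide();
        step = Math.max(step - 1, 0);
        $steps.eq(step).show();
        return false;
    });
});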
You could just place all your user controls one after another and turn on the visibility of the current step's control, turning the others off as appropriate. No need to mess with overriding Render(). This way the user controls' ViewState will be managed by the server, and you can focus on the validation logic for each step.
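A minimal code-behind sketch of that approach (control and helper names are made up):

// Step1/Step2 are the user controls placed on page.aspx. Controls with
// Visible = false are not rendered at all, but the server still manages
// their ViewState across postbacks.
protected void btnNext_Click(object sender, EventArgs e)
{
    if (Step1.Visible && Step1IsValid())   // hypothetical per-step validation
    {
        Step1.Visible = false;
        Step2.Visible = true;
    }
}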
Using an UpdatePanel to contain the steps would give the AJAX experience and still provide validation on each step. If you are OK with validating multiple steps at once, Tomas Lycken's suggestion (hide/show with jQuery) would give a fast step-by-step experience.
Did you look into using the ASP.NET Wizard control? It's a bit of a challenge to customize the UI, but otherwise it's worked well for me in similar scenarios.
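For reference, the built-in control is declared roughly like this:

<asp:Wizard ID="StepWizard" runat="server"
    OnFinishButtonClick="StepWizard_FinishButtonClick">
    <WizardSteps>
        <asp:WizardStep Title="Step 1">
            <!-- step 1 controls -->
        </asp:WizardStep>
        <asp:WizardStep Title="Step 2" StepType="Finish">
            <!-- step 2 controls -->
        </asp:WizardStep>
    </WizardSteps>
</asp:Wizard>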
In my ASP.NET application I have a requirement that when a user clicks on a UI element we generate a PDF for them to download. This is currently implemented by doing a form post to an ashx page. That page inspects the form and then executes the correct server-side page, which results in either HTML or a PDF document of that page's HTML.
On the client I know ahead of time whether we are going to get a PDF or HTML. When it's HTML, I open a new window and direct the form post to that window, and all works well. When it's a PDF, I don't change the target for the form, and it remains on the current page.
This works, the user is presented with a save dialog, and the current page is not changed or lost.
The problem I have is that generating the PDF takes anywhere from 1 to 15 seconds. What I want to do is pop up a "please wait" dialog. Displaying the popup is going to be easy; what I am not sure of is how I will know when to close it. The popup will be a div in the current page.
The popup can have a client-side timer which polls the server for task completion. The long-running server task should update its progress in a database table or a server cache object which the polling service can read.
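In outline, the client side of that could look like this (a sketch; the handler URL and JSON shape are assumptions):

// Poll a hypothetical status endpoint once a second until the task reports done
var timer = setInterval(function () {
    $.getJSON('TaskStatus.ashx', function (status) {
        if (status.done) {
            clearInterval(timer);
            $('#pleaseWait').hide();    // close the popup div
        }
    });
}, 1000);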
A couple of old articles from MSDN Magazine. You should be able to use the same concepts with newer libraries like ASP.NET AJAX.
Reporting Task Progress With ASP.NET 2.0
Simplify Task Progress with ASP.NET "Atlas"
Just have some JavaScript on the client side show an animated GIF for 1-15 seconds (your choice) and close itself after the designated time.
Gulzar's suggestion was spot on. I have a simple AJAX-enabled WCF service which checks a session variable. My ashx page sets the variable to false when it starts processing and then to true when it's done.
I think there might be a race condition if the client checks before we set the session item to false; however, there are ways around that if we modify the service so that the session item is reset to false after the client gets an "I'm done" response.
The trick is still going to be figuring out what the polling interval on the client should be. If we set it too low, the user could save the file and then still see the processing message. I'm debating between half a second and a second; anything less than half a second seems unnecessary.
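For anyone following along, the moving parts look roughly like this; it is a simplified sketch of our setup, and WritePdfToResponse is a stand-in name:

// GeneratePdf.ashx (sketch): flag the session around the slow work
// using System.Web; using System.Web.SessionState;
public class GeneratePdf : IHttpHandler, IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        context.Session["PdfDone"] = false;
        WritePdfToResponse(context);         // hypothetical: the 1-15 second PDF work
        context.Session["PdfDone"] = true;
    }

    public bool IsReusable { get { return false; } }
}

// AJAX-enabled WCF service (sketch) that the popup's timer polls. Requires
// aspNetCompatibilityEnabled="true" in web.config. Caveat: in-process session
// locking can make the poll block while the handler above holds the session;
// a cache entry keyed per user avoids that.
// using System.ServiceModel; using System.ServiceModel.Activation; using System.Web;
[ServiceContract(Namespace = "")]
[AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
public class PdfStatusService
{
    [OperationContract]
    public bool IsPdfDone()
    {
        object done = HttpContext.Current.Session["PdfDone"];
        return done != null && (bool)done;
    }
}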
You said:
"When its a PDF I don't change the target for the form and it remains on the current page."
If that is the case, then the original page will be gone when the PDF is opened. In that situation I would have an animated loading GIF and open it using JavaScript in a div overlaying the rest of the page. You would not need to close it, so no timer or polling is needed; it would just be gone when the page is gone.