Multiple postbacks of ASPX page - asp.net

I have an aspx page with a simple form to send emails to pre-defined lists of users. On the longer lists the page usually times out before the emails finish sending, but that has never been an issue.
Today something weird happened and each user got four emails. In the log I could see three new threads crank up, one at a time, and start sending again from the beginning of the list.
Any ideas? I know for certain I didn't intentionally refresh the page myself, and certainly not three times. But could the browser (IE8) have done it? Would it post again to try to re-establish a connection when the request timed out? Or when I switched back to the browser window from another app? I have never seen behavior like this before.

The first question is whether there's any reason to run a long-running task synchronously, i.e. tie up a thread that should be serving web requests on something that could be done in the background, while the browser sits and waits for a response it's probably never going to get. I'd look into running this asynchronously unless there's a very deliberate reason not to.
Secondly, have you looked into creating some kind of locking mechanism so that the process can't be started more than once? I have processes where I add a token to the application cache (and remove it when I'm done); if the token exists, the process won't run again (the call to the async task isn't made), and that does the job. That way it doesn't matter how many clients call your code: you prevent things from happening more often than they should.
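As a rough sketch of that token approach (the cache key and the SendOneEmail helper are mine, not from the original code), assuming an ASP.NET WebForms code-behind:

using System.Collections.Generic;
using System.Threading;
using System.Web;

public partial class SendMail : System.Web.UI.Page
{
    private static readonly object TokenSync = new object();

    protected void StartSend(List<string> recipients)
    {
        lock (TokenSync)
        {
            if (HttpRuntime.Cache["MailingInProgress"] != null)
                return; // a send is already running; don't start another
            HttpRuntime.Cache.Insert("MailingInProgress", true);
        }

        // Do the actual sending in the background so the request returns immediately.
        ThreadPool.QueueUserWorkItem(delegate
        {
            try
            {
                foreach (string address in recipients)
                    SendOneEmail(address);
            }
            finally
            {
                HttpRuntime.Cache.Remove("MailingInProgress"); // always release the token
            }
        });
    }

    private void SendOneEmail(string address)
    {
        // SMTP details elided; see System.Net.Mail.SmtpClient
    }
}

With a guard like this in place, even if the browser re-posts the page several times, only the first request starts the loop; the rest return without doing anything.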

Related

Meteor.logout() and Meteor.call() too slow

I have a web app, and I'm having a problem with how long Meteor.logout() and Meteor.call() take. When I call Meteor.logout(), it takes about 30-40 seconds, and the same goes for Meteor.call(). About 200-250 clients use this system at the same time.
If a client has about 100-200 items on their app screen, the delay is very long; with 10-20 items it's a little better. Each item receives new data every 5-10 seconds, at different times, so it's effectively a live screen.
I don't get this problem when I run the same code against the same database on a different port where I'm the only user.
I can't figure it out. What could the reason be? I need your ideas and help.
The logout function waits for a callback from the server, so there may be something wrong with the way you have configured your server.
Run the same code on another machine; it should not happen there.
You can use this.unblock() in every method and publication.
By default, Meteor processes requests from a single client one at a time: it queues incoming requests while one is being processed.
This may be because some methods perform heavy work and take a long time, and every other request to the server has to wait until they finish.
Simply place this.unblock() at the start of every method and publication, and it will stop them from blocking your other requests.
Thanks
I solved my problem.
While collection updates were being performed on one side, Meteor's publish process was running on the other side, and as the number of clients increased, the server became unresponsive. I solved it with MongoDB's oplog feature.
Thank you for your interest.
There could be multiple reasons.
There could be unsubscription of collections, which means the client and server exchange the list of ids being unsubscribed.
You may have a reactive UI which suddenly gets overwhelmed with the amount of data being transferred and needs to update itself (for example, the Angular digest cycle always runs after a Meteor sub/unsub).
The Chrome inspector's Network tab (WebSocket frames) is your best tool to understand how soon the Meteor logout fires and whether any messages are passed back and forth before the server returns the result of the logout request.
You may also use this.unblock() in subscriptions. That way your subscriptions run in parallel and don't block each other.

Web App in Flaky Internet connection

I have a PHP/MySQL/JS-jQuery based web site that records finish times for racers and sends each time back to the server. The server inserts the finish time in the db, calculates the finish place based on a handicapping formula, stores that, and sends the finish place back to the web page, where it is updated on the screen.
It uses jQuery Ajax calls, so the page doesn't get reloaded at all.
Everything works fine if the data connection is good.
If the data connection is bad, my first version of this page would put up a message saying the connection was bad.
Now I am trying to make it a bit smarter, so I have started with the HTML5 feature that tells the browser whether it is online or offline (I realize this may not be the best way, but it works for concept testing).
When a new finish time is recorded (or updated) and we are offline, the JS just adds a notSent class to the finish time's tag. The finish place, and all the finish places that would normally come from the server, are greyed out to indicate the data is no longer valid (until the page can communicate with the server again).
When the browser finds itself back online, a simple jQuery each loop over the notSent elements re-sends the AJAX requests, and if they all complete, it processes the returned finish place information and displays it as up to date.
It also disables all external links on the page while the browser is offline. This keeps the user from losing the data entry page by accident by clicking a link that would give them a page-not-found error.
So my last issue is the browser's reload and close buttons: if the user clicks these while offline, they lose the data entry screen and are out of luck until the connection comes back.
Can I disable these functions as well? A quick Stack Overflow search indicates it can be done, but most answers give the old "you really shouldn't, and if you think you need to, you should rethink your design" warning.
So, rethinking my design, I started learning about:
HTML5 local storage (I decided I don't need it, since my data is already stored in an input box)
The app-cache manifest, for controlling the page cache so that an offline reload would get the cached version. After much reading I concluded that this could work for a static page, but not for mine, where the data is updated all the time. Then I found that most browsers are deprecating it anyway.
Service Workers, which seem to be the likely future for controlling offline caching, but not all browsers support them, they are pretty cumbersome to learn, and they are still very new.
Now I am stuck, leaning towards preventing browser reloads and deferring Service Workers until there is more support and there are better examples for dynamic content pages like mine.
Bottom line: am I missing something here? Is there an easy solution?
I think the best option is to use PouchDB to sync between the client and server, and to use Background Sync to wake a Service Worker when you regain connectivity. If Service Worker support is not present in the browser, it can sync the next time the user opens the browser.
There is a similar example of deferred requests explained in the Service Worker Cookbook.

Can a single asp.net user make more than one request at a time if the Session is in use?

I am not able to make more than one request at a time in asp.net while the session is active. Why does this limitation exist? Is there a way to work around it?
This issue can be demonstrated with a WebForms app with just 3 simple aspx pages (although the limitation still applies in asp.net mvc).
Create an asp.net 3.5 web application.
There should be just three pages:
NoWait.aspx, Wait.aspx, and SessionStart.aspx
NoWait.aspx has this single nugget added between the default div tags: <%=DateTime.Now.Ticks %>. The code-behind for this page is the default (empty).
Wait.aspx looks just like NoWait.aspx, but it has one line added to Page_Load in the code-behind: Thread.Sleep(3000); //wait 3 seconds
SessionStart.aspx also looks just like NoWait.aspx, but it has this single line in its code-behind: Session["Whatever"] = "Anything";
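Assembled from the description above (file and class names assumed), the two non-empty code-behinds look like this:

using System;
using System.Threading;

// Wait.aspx.cs
public partial class Wait : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Thread.Sleep(3000); // wait 3 seconds before writing the response
    }
}

// SessionStart.aspx.cs
public partial class SessionStart : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Session["Whatever"] = "Anything"; // writing to Session makes the session "active"
    }
}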
Open a browser and go to NoWait.aspx. It properly shows a number in the response, such as "633937963004391610". Keep refreshing and it keeps changing the number. Great so far! Create a new tab in the same browser and go to Wait.aspx. It sits for 3 seconds, then writes the number to the response. Great so far! Now, try this: go to Wait.aspx, and while it's spinning, quickly tab over to NoWait.aspx and refresh. Even while Wait.aspx is sleeping, NoWait.aspx WILL provide a response. Great so far. You can continue to refresh NoWait.aspx while Wait.aspx is spinning, and the server happily sends a response each time. This is the behavior I expect.
Now is where it gets weird.
In a 3rd tab, in the same browser, visit SessionStart.aspx. Next, tab over to Wait.aspx and refresh. While it's spinning, tab over to NoWait.aspx and refresh. NoWait.aspx will NOT send a response until Wait.aspx is done running!
This proves that while a session is active, the same user can't make concurrent requests. Requests are all queued up and served synchronously. I do not expect or understand this behavior. I have tested this on Visual Studio 2008's built-in web server, and also on IIS 7 and IIS 7.5.
So I have a few questions:
1) Am I correct that there is indeed a limitation here, or is my test above invalid because I am doing something wrong?
2) Is there a way to work around this limitation? In my web app, certain things take a long time to execute, and I would like users to be able to do things in other tabs while they wait for a big request to complete. Can I somehow configure the session to allow "dirty reads" so that it isn't locked for the duration of the request?
3) Why does this limitation exist? I would like to gain a good understanding of why this limitation is necessary. I think I'd be a better developer if I knew!
Here is a link discussing session state and locking. Session state does take an exclusive lock.
The easiest way around this is to make the long-running tasks asynchronous. You can run them on a separate thread, or use an asynchronous delegate, and return a response to the browser immediately. The client-side page can then poll the server (most likely through Ajax) to check whether the work is done, and notify the user when the server says it's finished. That way, although the server handles this user's requests one at a time, it doesn't look that way to the user.
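As a minimal sketch of the polling side, assuming the background task records its progress in the application cache under a task ID (the handler name and cache key are hypothetical): a lightweight HTTP handler works well here because, by not implementing IRequiresSessionState, it never touches session state and so never has to wait on the session lock this question is about.

using System.Web;

// TaskStatus.ashx : polled by the client via Ajax, e.g. TaskStatus.ashx?id=42
public class TaskStatus : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string taskId = context.Request.QueryString["id"];

        // The background task writes "running" / "done" into the cache as it progresses.
        string status = HttpRuntime.Cache["task:" + taskId] as string ?? "unknown";

        context.Response.ContentType = "text/plain";
        context.Response.Write(status);
    }

    public bool IsReusable
    {
        get { return true; }
    }
}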
This approach does have its own set of problems: you'll have to account for the HTTP context closing, as that disposes certain functionality tied to the ASP.NET session. One example you'll probably have to handle is releasing the lock on the session, if one is actually being held.
It isn't too surprising that this is a limitation. Each browser has its own session, and before the advent of Ajax, postback requests were synchronous. Making the same session handle concurrent requests could get really ugly, and I can see how that wouldn't be a priority for the IIS and ASP.NET teams.
For reasons Kevin described, users can't access two pages that might write to their session state at the same time - the framework itself can't exert fine-grained control over the locking of the session store, so it has to lock it for entire requests.
To work around this, pages that only read session data can declare that they do so. ASP.NET won't obtain a session state write lock for them:
<%@ Page EnableSessionState="ReadOnly" %>
<%-- or EnableSessionState="false" if the page doesn't need session state at all --%>

Notifying the user after a long Ajax task when they might be on a different page

I have an Ajax request to a web service that typically takes 30-60 seconds to complete. In some cases it could take as long as a few minutes. During this time the user can continue working on other tasks, which means they will probably be on a different page when the task finishes.
Is there a way to tell that the original request has been completed? The only thing that comes to mind is to:
wrap the web service with a web service of my own
use my web service to set a flag somewhere
check for that flag in subsequent page requests
Any better ways to do it? I am using jQuery and ASP.Net, if it matters.
You could add another method to your web service that lets you check the status of a previous request. Then you can use Ajax to poll the web service every 30 seconds or so. You can store the request ID in Session so your Ajax call knows which request to poll no matter what page you're on.
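A minimal sketch of such a status method, assuming the long-running call stores its request ID in Session when it starts and writes progress into the application cache (the key names are mine, not a prescribed API):

using System.Web;
using System.Web.Services;

public class NotifyService : WebService
{
    [WebMethod(EnableSession = true)]
    public string CheckStatus()
    {
        // Set by the method that kicked off the long-running request.
        string requestId = Session["PendingRequestId"] as string;
        if (requestId == null)
            return "none";

        return (HttpRuntime.Cache["status:" + requestId] as string) ?? "pending";
    }
}

Note that EnableSession = true means this call takes the ASP.NET session lock, so the long-running work itself should not hold the session open while it runs.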
I would say you'll have to poll once in a while to see whether the request has finished and then show a notification, like this site does with badges, for example.
First, make your request return immediately with something like "Started processing...". Then use a different request to poll for the result. Keeping HTTP connections open for a long time is good for neither the server nor the client's browser. Moreover, the user should be informed and educated that they are starting a request that could take some time to complete.
To display the result you could have a "notification area" in all of your web pages. Alternatively, you could have a dedicated page for this and instruct the user to navigate there. As others have suggested, you could use polling to get the result.
You could use frames on your site, and perform all your long AJAX requests in an invisible frame. Frames add a certain level of pain to development, but might be the answer to your problems.
The only other way I could think of doing it is to actually load the other pages via an AJAX request, such that there are no real page reloads - this would mean that the AJAX requests aren't interrupted, but may cause issues with breaking browser functionality (back/forward, bookmarking, etc).
Since the web is stateless (you can't set a trigger/event on the server to update the client), the viable strategy is to set up a status function that you can call intermittently with a JavaScript timer to check whether your code has finished executing. When it finishes, you can update your view.

browser timeouts while asp.net application keeps running

I'm encountering a situation where it takes a long time for ASP.NET to generate a reply with the web page (more than 2 hours), due to the code-behind running for a long time (a very long, slow loop).
The browser (both IE and Firefox) stops waiting for the reply (after about an hour) and gives a generic "cannot display webpage" error (similar to what you would see if you tried to navigate to a non-existent server).
At the same time the ASP.NET app keeps going (I can see it in the debugger) and eventually completes.
Why does this happen? Are there any settings in web.config to influence this? I'm hoping there's a timeout setting I'm missing that's causing this.
Maybe a setting in IE or Firefox? But I thought they kept waiting as long as the server kept the connection alive.
I'm experiencing this even when I launch the app in debug mode (with compilation debug="true") on my local machine from VS (so it's not running on IIS, but on the ASP.NET Development Server).
I know it's bad that it takes so long to generate the page, but that doesn't matter at this stage. Speeding it up would take a lot of extra work, and the delay doesn't really matter. This is used internally.
I realize I can redesign around this issue by moving the logic to a background process and getting notified through AJAX when it's done, or by pulling it out into a desktop app or service. Something along those lines will be done eventually, but that's not what I'm asking about right now.
Sounds like you're using IE and it is timing out while waiting for a response from the server.
There is a Microsoft support article about adjusting this limit:
http://support.microsoft.com/kb/181050
CAUSE
By design, Internet Explorer imposes a time-out limit for the server to return data. The time-out limit is five minutes for versions 4.0 and 4.01 and is 60 minutes for versions 5.x, 6, and 7. As a result, Internet Explorer does not wait endlessly for the server to come back with data when the server has a problem.
RESOLUTION
In general, if a page does not return within a few minutes, many users perceive that a problem has occurred and stop the process. Therefore, design your server processes to return data within 5 minutes so that users do not have to wait for an extensive period of time.
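The article explains that the limit is controlled by IE's ReceiveTimeout registry value (in milliseconds). As a sketch of raising it programmatically (the value name and path are as I recall them from the KB article, so verify there before relying on this):

using Microsoft.Win32;

class RaiseIeTimeout
{
    static void Main()
    {
        // Run as the user whose IE profile should change; this sets the limit to 4 hours.
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey(
            @"Software\Microsoft\Windows\CurrentVersion\Internet Settings", true))
        {
            key.SetValue("ReceiveTimeout", 4 * 60 * 60 * 1000, RegistryValueKind.DWord);
        }
    }
}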
The entire paradigm of the Web is of request/response. Not request, wait two hours, response!
If the work takes so long to do, then have the page request trigger the work, and then not wait for it. Put the long-running code into a Windows service, and have the service listen to an MSMQ queue (or use WCF with an MSMQ endpoint). Have the page send requests for work to this queue. The service will read a request, maybe start up a new thread to process it, then write a response to another queue, file, or whatever.
The same page, or a different "progress" page, can poll the response queue or file for responses and update the user, assuming the user still cares after two hours.
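A rough sketch of the enqueue side using System.Messaging (the queue path and message label are placeholders, not from the original answer):

using System.Messaging;

public class WorkSender
{
    // Hypothetical local private queue the Windows service listens on.
    private const string QueuePath = @".\Private$\WorkRequests";

    public void EnqueueWork(string payload)
    {
        if (!MessageQueue.Exists(QueuePath))
            MessageQueue.Create(QueuePath);

        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Send(payload, "work-request"); // the label helps the service filter messages
        }
    }
}

The service side would read from this queue (or a WCF MSMQ endpoint), do the work, and write its results to a response queue or file, as described above.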
For something that takes this long, I would figure out a way to kick it off via AJAX and then periodically check on its status. The background process should update a status variable on a regular basis and store its data in the cache or session when complete. When it completes and the browser detects this (via AJAX), have the browser do a real postback (or a GET by changing location.href), pick up the saved data, and generate the page.
I have a process that can take a few minutes, so I spin off a separate thread and send the result via FTP. If an error occurs in the process, I send myself an error message, including the stack trace. You may want to consider sending the results via email or storing them somewhere other than the browser, and using a thread as well.
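A rough sketch of that pattern (the SMTP host and addresses are placeholders):

using System;
using System.Net.Mail;
using System.Threading;

public class BackgroundRunner
{
    public static void Run(Action work)
    {
        new Thread(delegate()
        {
            try
            {
                work(); // e.g. build the report, then push it via FTP
            }
            catch (Exception ex)
            {
                // ex.ToString() includes both the message and the stack trace.
                var smtp = new SmtpClient("smtp.example.com");
                smtp.Send("app@example.com", "me@example.com",
                          "Background task failed", ex.ToString());
            }
        }) { IsBackground = true }.Start();
    }
}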
