Web app on a flaky internet connection - reload

I have a PHP/MySQL/JS-jQuery based web site that records finish times for racers and sends each time back to the server. The server inserts the finish time in the db, calculates the finish place based on a handicapping formula, stores that, and sends the finish place back to the web page, where it is updated on the screen.
It uses jQuery AJAX calls, so the page never gets reloaded.
Everything works fine if the data connection is good.
If the data connection is bad, my first version of this page would just put up a message saying the connection was bad.
Now I am trying to make it a bit smarter, so I have started with the HTML5 online/offline events that tell the page whether the browser has a connection (I realize this may not be the best way in the end, but it works for concept testing).
When a new finish time is recorded (or updated) while we are offline, the JS just adds a notSent class to the finish time's tag. The finish place, and all of the finish places that would normally come from the server, are greyed out to indicate the data is no longer valid (until the page can communicate with the server again).
When the browser finds itself back online, a simple jQuery .each() loop over the notSent elements re-sends the AJAX requests; if they all complete, it processes the returned finish place information and displays it as up to date.
It also disables all external links on the page while the browser is offline. This keeps the user from losing the data entry page by accidentally clicking a link that would just give them a "page not found" error.
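For illustration, a minimal sketch of that flow might look like the following (the saveTime.php endpoint, the selectors and the data format are placeholders for whatever the real page uses, not the actual implementation):

    // Rough sketch of the offline/online handling described above.
    // 'saveTime.php', the selectors and the data format are placeholders.
    function sendFinishTime($row) {
      $.ajax({
        url: 'saveTime.php',
        method: 'POST',
        data: { racerId: $row.data('racer-id'), finishTime: $row.find('.finishTime').val() }
      }).done(function (result) {
        $row.removeClass('notSent');
        $row.find('.finishPlace').text(result.finishPlace).removeClass('stale');
      }).fail(function () {
        $row.addClass('notSent');               // keep it queued for the next retry
        $('.finishPlace').addClass('stale');    // grey out places until we re-sync
      });
    }

    window.addEventListener('offline', function () {
      $('a.external').addClass('disabled').on('click.offline', function (e) {
        e.preventDefault();                     // don't let the user navigate away while offline
      });
    });

    window.addEventListener('online', function () {
      $('a.external').removeClass('disabled').off('click.offline');
      $('.notSent').each(function () {          // re-send everything queued while offline
        sendFinishTime($(this));
      });
    });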
So my last issue is the browser's reload and close buttons: if the user clicks these while offline, they lose the data entry screen and are out of luck until the connection comes back.
Can I disable these functions as well? A quick Stack Overflow search suggests it can be done, but most answers come with the old warning: "you really shouldn't, and if you think you need to, you should rethink your design."
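For what it's worth, the technique those answers usually describe is not truly disabling the buttons (browsers won't allow that) but a beforeunload prompt; a hedged sketch, reusing the notSent class from above:

    // You can't actually disable reload/close, but you can ask the browser to
    // confirm before leaving while there is unsent data. Modern browsers show
    // their own generic message and ignore any custom text.
    window.addEventListener('beforeunload', function (e) {
      if (!navigator.onLine || $('.notSent').length > 0) {
        e.preventDefault();
        e.returnValue = '';   // some browsers need this to trigger the prompt
        return '';
      }
    });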
So, rethinking my design, I started learning about:
HTML5 local storage (decided I don't need it, since my data is already stored in an input box)
The AppCache manifest for controlling the page cache, so that reloading the page while offline would get the cached version. After much reading I came to the conclusion that this could work for a static page but not mine, where the data is updated all the time. Then I found that most browsers are deprecating it anyway.
Service Workers, which seem to be the likely future for controlling offline caching, but not all browsers support them, they are fairly cumbersome to learn, and they are still very new.
Now I am stuck, leaning towards preventing browser reloads and deferring Service Workers until there is broader support and better examples for dynamic-content pages like mine.
Bottom line: am I missing something here? Is there an easy solution?

I think the best option is to use PouchDB to sync between the client and server, and Background Sync to wake a Service Worker when you regain connectivity. If Service Workers are not available in your browser, PouchDB can still sync the next time your user opens the page.
There is a similar example of deferred requests explained in the Service Worker Cookbook.
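A rough sketch of what that could look like (the database name, remote URL and sync tag below are made up for the example; sendQueuedFinishTimes would be your own replay logic):

    // In the page: write finish times to a local PouchDB and let it replicate
    // whenever a connection is available.
    var localDb = new PouchDB('finish_times');
    localDb.sync('https://example.com/db/finish_times', { live: true, retry: true })
      .on('change', function (info) {
        // update the finish places on screen from the replicated docs
      });

    // Ask a Service Worker to retry when connectivity returns (where supported).
    if ('serviceWorker' in navigator && 'SyncManager' in window) {
      navigator.serviceWorker.ready.then(function (reg) {
        return reg.sync.register('sync-finish-times');
      });
    }

    // In the Service Worker: the 'sync' event fires once the browser is back online.
    self.addEventListener('sync', function (event) {
      if (event.tag === 'sync-finish-times') {
        event.waitUntil(sendQueuedFinishTimes());
      }
    });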

Related

From the server's point of view, how is a navigation menu best implemented?

I have to make a navigation menu for a web application (based on Java, Tomcat, JSP, Oracle DB) that will be present on every page once the user is logged in. The point is that almost every person connected has different privileges, so each person would see a different menu. This is one of my first web developments, so I'm not very strong on the concepts of how things are communicated between client and server and, therefore, on what is best to do; still, my options so far each have their pros and cons.
Making a filter that loads the menu on every request. This queries the database each time, which is wasteful because the menu rarely changes; however, when there is a change, it shows up immediately.
Using an iframe to load the menu once and control a second iframe from it (I think this is the most discouraged option, but it is still an option and has the advantage of not making the same request to the server every time you click something). When there is a change, reloading would pick up the new data.
Keeping the menu as an object in the session. This queries the db once, when the user logs in. Changes would be picked up on each new login (or via a button that reloads the menu). But this approach keeps things in server memory, which I suspect is one of the worse ideas (I'm not sure about this; am I wrong?).
As I said, I'm too new to this to know what else I could do. As far as I know, I can't write a file to the client (XML, for example) to act as the menu's data source and delete it daily or so. I could write a file on the server with the data so I don't need to query the db (though that is still a request to the server), but that brings other problems: data changes would not be refreshed (unless I build something extra, which means more time and more to maintain), and I don't know whether it would actually be faster than just hitting the db.
How are these things best implemented? Please consider development cost, maintenance, performance (avoiding repeated requests for the same information on every click), user-perceived lag, and anything else my inexperience keeps me from seeing. Any recommendations for books about web design?
Edit:
I'm planning to use jsTree for the visualization and to load the menu sublevels via AJAX requests.
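For reference, a hedged sketch of jsTree configured to lazy-load sublevels over AJAX (the /menu URL and the parameter name are placeholders for whatever the servlet/JSP endpoint actually expects):

    // jsTree calls core.data.url whenever a node is opened, so each sublevel
    // is fetched on demand. '/menu' and 'parentId' are placeholder names.
    $('#navMenu').jstree({
      core: {
        data: {
          url: '/menu',                      // server returns JSON for the requested level
          data: function (node) {
            return { parentId: node.id };    // jsTree sends '#' for the root level
          }
        }
      }
    });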
I would forget about the second option and go with the simplest, stateless option: the first one. Databases are fast!
If it appears that this takes too much time or puts too much load on the database (but I doubt it: loading the privileges of a user should be very fast), you could always go with the third option. Sure, it would store the menu (or just the privileges) in the session, but the session can also be written to disk or the database if necessary.
Don't pre-optimize.

Asp.net slow first load per user

I have a website set up in IIS 7 with HTTPS, and every time a user accesses it for the first time the load time is about 15 seconds.
THIS IS NOT the compile/warm-up "problem" described, for instance, here: Slow first page load on asp.net site
I know about that "problem" and I have that too, but it is of course expected and not the issue here.
It's not only the first time the application loads after a recycle/start: if I open a second browser and access the site after already doing so in the first browser, it takes the same amount of time. So the delay seems to happen every time a session is started. All following requests from the same user/browser are as quick as expected.
This is an admin interface site and I use ASP.NET Membership, although the delay happens even before the user has logged in, so I'm not sure that is the culprit.
I am a bit unsure where to look to kill the delay. I am running session state in-process, with cookies.
Any ideas?
You need to get a little more information. Enable tracing and track how long each step takes. You could also use Wireshark and have a look at the traffic between client and server. If there is a big gap in the traffic, you can see that something is hanging at the server's end. If you see constant traffic, perhaps you have too much going on with your landing page. Other simple things to do would be to enable dynamic caching/compression on the server to speed things up.
Warm it up...
http://learn.iis.net/page.aspx/688/using-the-iis-application-warm-up-module/

Multiple postbacks of ASPX page

I have an aspx page with a simple form to send emails to pre-defined lists of users. On the longer lists the page usually times out before the emails finish sending but this has never been an issue.
Today something weird happened and each user got four emails. In the log I could see three new threads crank up one at a time and start over sending from the beginning of the list.
Any ideas? I absolutely know I didn't intentionally refresh the Web page myself, and certainly not three times. But could the browser (IE8) have done it? Would it post again trying to re-establish a connection when it timed out? Or when I switched back to the browser window from another app? I have never seen behavior like this before.
The first question would be whether there is any reason to do a long-running task synchronously, i.e. lock up a thread that should be serving web requests for something that could be done in the background, while the browser sits and waits for a response that it's probably not going to get. I'd look into running this asynchronously unless there's a very deliberate reason not to.
Secondly, have you looked into creating some kind of locking mechanism so that the process can't be started more than once? I have processes where I add a token to the application cache (and remove it when I'm done) so that if the token exists the process won't run again (the call to the async task isn't made), and that does the job. That way it doesn't matter how many clients call your code; you prevent things from happening more often than they should.

Can a single asp.net user make more than one request at a time if the Session is in use?

I am not able to make more than one request at a time in asp.net while the session is active. Why does this limitation exist? Is there a way to work around it?
This issue can be demonstrated with a WebForms app with just 3 simple aspx pages (although the limitation still applies in asp.net mvc).
Create an asp.net 3.5 web application.
There should be just three pages:
NoWait.aspx, Wait.aspx, and SessionStart.aspx
NoWait.aspx has this single nugget added between the default div tags: <%=DateTime.Now.Ticks %>. The code-behind for this page is the default (empty).
Wait.aspx looks just like NoWait.aspx, but it has one line added to Page_Load in the code-behind: Thread.Sleep(3000); //wait 3 seconds
SessionStart.aspx also looks just like NoWait.aspx, but it has this single line in its code-behind: Session["Whatever"] = "Anything";
Open a browser and go to NoWait.aspx. It properly shows a number in the response, such as "633937963004391610". Keep refreshing and it keeps changing the number. Great so far! Create a new tab in the same browser and go to Wait.aspx. It sits for 3 seconds, then writes the number to the response. Great so far! Now try this: go to Wait.aspx and while it's spinning, quickly tab over to NoWait.aspx and refresh. Even while Wait.aspx is sleeping, NoWait.aspx WILL provide a response. Great so far. You can continue to refresh NoWait.aspx while Wait.aspx is spinning, and the server happily sends a response each time. This is the behavior I expect.
Now is where it gets weird.
In a 3rd tab, in the same browser, visit SessionStart.aspx. Next, tab over to Wait.aspx and refresh. While it's spinning, tab over to NoWait.aspx and refresh. NoWait.aspx will NOT send a response until Wait.aspx is done running!
This proves that while a session is active, you can't make concurrent requests with the same user. Requests are all queued up and served synchronously. I do not expect or understand this behavior. I have tested this on Visual Studio 2008's built-in web server, and also on IIS 7 and IIS 7.5.
So I have a few questions:
1) Am I correct that there is indeed a limitation here, or is my test above invalid because I am doing something wrong?
2) Is there a way to work around this limitation? In my web app, certain things take a long time to execute, and I would like users to be able to do things in other tabs while they wait for a big request to complete. Can I somehow configure the session to allow "dirty reads", so it isn't locked for the duration of the request?
3) Why does this limitation exist? I would like to gain a good understanding of why this limitation is necessary. I think I'd be a better developer if I knew!
Here is a link talking about session state and locking. It does perform an exclusive lock.
The easiest way around this is to make the long-running tasks asynchronous. You can run the long-running task on a separate thread, or use an asynchronous delegate, and return a response to the browser immediately. The client-side page can send requests to the server to check whether it is done (through AJAX, most likely), and when the server tells the client it's finished, notify the user. That way, although the server handles the requests one at a time, it doesn't look like that to the user.
This does have its own set of problems: you'll have to account for the HTTP context closing, as that will dispose of certain functionality tied to the ASP.NET session. One example you'll probably have to handle is releasing the lock on the session, if one is actually being held.
It isn't too surprising that this is a limitation. Each browser has its own session, and before the advent of AJAX, postback requests were synchronous. Making the same session handle concurrent requests could get really ugly, and I can see how that wouldn't be a priority for the IIS and ASP.NET teams to add.
For reasons Kevin described, users can't access two pages that might write to their session state at the same time - the framework itself can't exert fine-grained control over the locking of the session store, so it has to lock it for entire requests.
To work around this, pages that only read session data can declare that they do so. ASP.NET won't obtain a session state write lock for them:
    <%@ Page EnableSessionState="ReadOnly" %>
    <%-- or EnableSessionState="false" if the page doesn't need session state at all --%>

browser timeouts while asp.net application keeps running

I'm encountering a situation where it takes ASP.NET a long time (more than 2 hours) to generate the reply with the web page. It's due to the code-behind running for a long time (a very long, slow loop).
The browser (both IE & Firefox) stops waiting for the reply (after about an hour) and gives a generic "cannot display webpage" error (similar to what you would see if you tried to navigate to a non-existent server).
At the same time the ASP.NET app keeps going (I can see it in the debugger) and eventually completes.
Why does this happen? Are there any settings in web.config to influence this? I'm hoping there's a timeout setting that I'm missing that's causing this.
Maybe a setting in IE or Firefox? But I thought they keep waiting while the server keeps the connection alive.
I'm experiencing this even when I launch the app in debug mode (with compilation debug="true") on my local machine from VS (so it's not running on IIS, but on the ASP.NET Development Server).
I know it's bad that it takes so long to generate the page, but it doesn't matter at this stage. Speeding it up would take a lot of extra work and the delay doesn't really matter. This is used internally.
I realize I can redesign around this issue by moving the logic to a background process and getting notified when it's done through AJAX, or by pulling it out into a desktop app or service or whatever. Something along those lines will be done eventually, but that's not what I'm asking about right now.
Sounds like you're using IE and it is timing out while waiting for a response from the server.
You can find a Microsoft KB article about adjusting this limit:
http://support.microsoft.com/kb/181050
CAUSE
By design, Internet Explorer imposes a time-out limit for the server to return data. The time-out limit is five minutes for versions 4.0 and 4.01 and is 60 minutes for versions 5.x, 6, and 7. As a result, Internet Explorer does not wait endlessly for the server to come back with data when the server has a problem.
RESOLUTION
In general, if a page does not return within a few minutes, many users perceive that a problem has occurred and stop the process. Therefore, design your server processes to return data within 5 minutes so that users do not have to wait for an extensive period of time.
The entire paradigm of the Web is of request/response. Not request, wait two hours, response!
If the work takes so long to do, then have the page request trigger the work, and then not wait for it. Put the long-running code into a Windows service, and have the service listen to an MSMQ queue (or use WCF with an MSMQ endpoint). Have the page send requests for work to this queue. The service will read a request, maybe start up a new thread to process it, then write a response to another queue, file, or whatever.
The same page, or a different "progress" page, can poll the response queue or file for responses and update the user, assuming the user still cares after two hours.
For something that takes this long, I would figure out a way to kick it off via AJAX and then periodically check on its status. The background process should update some status variable on a regular basis and store its data in the cache or session when complete. When it completes and the browser detects this (via AJAX), have the browser do a real postback (or a GET, by changing location.href), pick up the saved data, and generate the page.
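Client-side, the polling part of that approach could be as simple as the sketch below (StartJob.aspx, JobStatus.aspx, Results.aspx and the JSON fields are assumptions for illustration, not existing pages):

    // Kick the long-running job off, then poll a status endpoint until it's done.
    // The page names and response fields here are placeholders.
    $.post('StartJob.aspx').done(function (job) {
      var timer = setInterval(function () {
        $.getJSON('JobStatus.aspx', { id: job.id }, function (status) {
          $('#progress').text(status.percentComplete + '%');
          if (status.done) {
            clearInterval(timer);
            location.href = 'Results.aspx?id=' + job.id;  // real GET to pick up the saved data
          }
        });
      }, 5000);   // check every 5 seconds
    });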
I have a process that can take a few minutes, so I spin off a separate thread and send the result via FTP. If an error occurs in the process, I send myself an error message including the stack trace. You may want to consider sending the results via email or somewhere other than the browser, and using a thread as well.
