Concurrent Downloads - http

I've been monitoring the Net panel of Firebug and noticed that the HTML has to be downloaded first before any other resources are downloaded. I guess this makes sense, since the other resources are defined in the HTML. Is there a way around this, so that other components can be downloaded during the HTML download?

Debugging 101: what you see while debugging is different from what happens when you are not looking.
Most browsers start interpreting the HTML while it is still downloading, and begin fetching the additional resources concurrently. Firebug is not a great place to see that happening; try HTTPFox instead.
Now, to answer your question: you don't need to do anything to make the browser download the other components while your HTML is downloading; it takes care of that for you.

No - the browser needs a parseable HTML document before it can start downloading scripts, images, etc.
You can speed up downloading of the non-HTML resources by moving them to different subdomains, though: browsers have a connections-per-host limit, which using subdomains circumvents. Additionally, you could compress/minify your CSS/JavaScript files to reduce their size.
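As a sketch of what that subdomain trick looks like in practice, here is a hypothetical helper that assigns each asset path to a fixed shard hostname (the hostnames are invented for the example; a deterministic hash keeps each asset on the same shard so browser caching still works):

```javascript
// Hypothetical shard hostnames -- replace with subdomains you actually serve.
var SHARDS = ['static1.example.com', 'static2.example.com'];

// Deterministically map an asset path to one shard, so the same asset
// always comes from the same hostname and stays cacheable.
function shardUrl(path) {
  var hash = 0;
  for (var i = 0; i < path.length; i++) {
    hash = (hash * 31 + path.charCodeAt(i)) % 997;
  }
  return '//' + SHARDS[hash % SHARDS.length] + path;
}
```

Because the mapping is deterministic, a given path resolves to the same shard on every page view, so the browser cache is never split across hostnames.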

One could create a small HTML file that then makes several AJAX-style requests to fill in the rest of the page, but if someone has JavaScript disabled the page may look really bad. In effect this takes some of the original HTML content and downloads it separately, which may or may not be a good idea. It also uses more network resources, since many requests are needed to fully load the page, so it is a question of what trade-off is acceptable.
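To illustrate the shell-plus-fragments idea, here is a hypothetical sketch: the server sends a small page of empty placeholders, and client-side script fills each one in after load (the section IDs and fragment URLs are invented for the example):

```javascript
// Build a minimal HTML shell of empty placeholders; the real content
// is fetched separately after the page loads.
function buildShell(sections) {
  return sections.map(function (s) {
    return '<div id="' + s.id + '" data-src="' + s.url + '"></div>';
  }).join('\n');
}

var shell = buildShell([
  { id: 'news',    url: '/fragments/news.html' },
  { id: 'sidebar', url: '/fragments/sidebar.html' }
]);

// In the browser, each placeholder would then be filled in, e.g.:
//   document.querySelectorAll('[data-src]').forEach(function (el) {
//     fetch(el.getAttribute('data-src'))
//       .then(function (r) { return r.text(); })
//       .then(function (html) { el.innerHTML = html; });
//   });
```

Note the downside from the answer above: with JavaScript disabled, the visitor sees only the empty shell.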


CSS speed optimisation - Why are multiple files better than only one?

The fewer HTTP requests, the better, right?
According to Google's best-practice explanation, fewer unused CSS rules are also better.
The browser's CSS engine has to evaluate every rule contained in the file to see if the rule applies to the current page.
Even if a stylesheet is in an external file that is cached, rendering is blocked until the browser loads the stylesheet from disk.
In your opinion, which gives better performance:
One CSS file per page.
One general CSS file that will be cached (even if 70%+ of it is unused CSS, but avoiding any additional HTTP requests).
Google speed best-practice
One of the important sentences to note from the Google best-practice document is "Often, many web sites reuse the same external CSS file for all of their pages, even if many of the rules defined in it don't apply to the current page".
This needs to be taken into account: if the CSS file contains code that will never be used because the user does not visit the pages it applies to, then we are wasting bandwidth, which may not be a worthwhile trade-off for saving an additional HTTP request.
It also costs extra time to load the file, plus the time wasted evaluating the redundant rules.
Certainly, using multiple files for just a single page (like separate header/footer CSS files) would be bad practice.
And as you know, there is no perfect solution to any problem; you have to choose what best suits your needs.
So I would say the decision to use multiple files or a single file is based solely on the overall structure of the website and the other trade-offs.
Loading CSS is usually extremely quick, and CSS blocking is something you will probably never notice. JavaScript, on the other hand, can slow things down visibly (blank areas while the page renders).
In reality, one CSS file is good enough, because it means a single HTTP request.
Optimization effort should go towards JavaScript, because that is where you can see the page slowing down. We are talking about a difference of a second or two, or less, here.
Here is a site where you can enter a URL and it will check load times. In the graph below, you can compare CSS load times.

How can I optimize http requests

Modern browsers support gzip/deflate compression and HTTP pipelining, which help speed up the loading of my pages in clients' browsers.
I came across a great technique for optimizing images, so I was wondering whether there is a way to bundle CSS/JS/HTML (the plain-text files) together into a single stream so that my web pages can be delivered faster.
Your help is kindly appreciated.
Regards,
Richard
There are several minification projects out there.
Google minify is one example.
Minification tends not to go as far as you are suggesting (bundling CSS/JS/HTML into one stream); instead it combines all CSS into a single request and all JS into a single request, removes insignificant whitespace, and sometimes renames code to use shorter variable names.
I presume you've had a look at the excellent YSlow add-on for Firefox which gives some excellent tips on speeding up download times.
You can certainly combine JS / CSS files on the server before they are downloaded, but you wouldn't want to combine the HTML+JS+CSS into a single stream. The simple reason for this is browser caching. The JS and CSS files are only downloaded once, while your HTML is downloaded for every new page. If the JS and CSS were combined into the HTML, then essentially every page would be different and nothing could be cached.

What are some client-side tricks to get around IE7's absurd 32-stylesheet limit?

I just worked out, by trial and error, that IE 7 has an upper limit of 32 stylesheet includes (i.e. <link> tags).
I'm working on the front-end of a very large website, in which we want to break our CSS into as many separate files as we like, since this makes developing and debugging much easier.
Performance isn't a concern, as we do compress all these files into a single package prior to deployment.
The problem is on the development side. How can we work with more than 32 stylesheets if IE 7 has an upper limit of 32?
Is there any means of hacking around this?
I'm trying to come up with solutions, but it seems that even if I loaded the stylesheets via Ajax, I'd still be writing out <link> tags, which would still count towards the 32-stylesheet limit.
Is this the case? Am I stuck with the 32-file limit or is there a way around it?
NOTE: I'm asking for a client-side solution to this. Obviously a server-side solution isn't necessary, as we already have a compression system in place. I just don't want to have to re-compress every time I make one little CSS change that I want to test.
Don't support IE7.
To avoid confusion: I'm not seriously suggesting this as a real solution.
Create the CSS files on the server side and merge all files that are needed for a certain page.
If you are using Apache or Lighttpd, consider using mod_concat.
Write your stylesheet into an existing style block with JavaScript using the cssText property, like this:
document.styleSheets[0].cssText += ourCss;
More info here:
https://bushrobot.blogspot.com/2012/06/getting-around-31-stylesheet-limit-in.html
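A sketch of how that trick might be packaged: merge the text of many stylesheets into one string first, then append it to a single existing style block so only one of IE's 32 slots is consumed. The merge step is plain string handling and works anywhere; the sheet names and rules below are invented for the example:

```javascript
// Merge many stylesheet bodies into one string, with a comment marking
// where each original sheet began (helpful when debugging).
function combineCss(sheets) {
  return sheets.map(function (s) {
    return '/* ' + s.name + ' */\n' + s.css;
  }).join('\n');
}

var ourCss = combineCss([
  { name: 'reset.css',  css: 'body { margin: 0; }' },
  { name: 'layout.css', css: '#main { width: 960px; }' }
]);

// In the browser (IE), append the merged text to the first style block:
if (typeof document !== 'undefined' && document.styleSheets.length > 0) {
  document.styleSheets[0].cssText += ourCss;
}
```

During development the sheet list can be as long as you like; only the one style block counts against the limit.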
At my last company we solved this by mashing all the CSS into one big document and inserting a URL in the web page that referenced that one-shot document. This was all done on-the-fly, just before returning the page to the client (we had a bunch of stuff going on behind the scenes that generated dynamic CSS).
You might be able to get your web server to do something similar, depending on your setup, otherwise it sounds like you're stuck with only 32 files.
Or you could just not support IE7 ;)

Asp.net Website Performance Improvement Checklist [closed]

I have an ASP.NET website, http://www.go4sharepoint.com
I have tried almost every way to improve the performance of this site. I have even checked it with Firebug and the Page Speed addon for Firefox, but somehow I am not pleased with the result.
I have also tried removing whitespace, removing ViewState, optimizing the code that renders the page, and applying GZip, and there are no heavy session variables in use. But still, when I compare it with other popular websites, it is not up to the mark.
I checked the CodeProject website and was surprised that even though they have a lot of content displayed, their website loads fast and has a good loading rate.
To all experts: please suggest where I am going wrong in my development.
Thank you.
First of all, I see that your pages are not gzipped.
You ask about gzip, but it seems that in the end the pages are not actually gzipped.
Second, your pages arrive fast, they are small, and the lag time is low, which means your SQL calls are fine.
The only problem I see is the "banner.php" page, which seems to be what causes the delay. A JavaScript call requests banner.php and waits until it returns, renders it, and continues.
Check these two issues to fix your slow load.
About banner.php
Here is one of the calls that your page makes:
http://sharepointads.com/members/scripts/banner.php?a_aid=go4sharepoint&a_bid=ac43d413
and you make at least 9 of them on the first page!
Each of these calls has about 400 ms of lag, times 10, plus the time to load and render: that is the delay you are looking for, and it is not coming directly from you. You need to find some other way to load them...
I can suggest another way, but I must go now... maybe tomorrow.
gzip
An external test proves that your pages are not gzipped. Just see the report.
When optimizing the HTML visible to the client, the server side is sometimes neglected. What about:
Server-side caching - from entire pages down to data caching
Reducing the number of database queries executed - and once data is retrieved from the database, caching it
Is your server hardware up to it? Memory, CPU?
EDIT:
And for completeness, here's the list from the performance section of the popular question What should a developer know before building a public web site?
Implement caching if necessary, understand and use HTTP caching properly
Optimize images - don't use a 20 KB image for a repeating background
Learn how to gzip/deflate content (deflate is better)
Combine/concatenate multiple stylesheets or multiple script files to reduce number of browser connections and improve gzip ability to compress duplications between files
Take a look at the Yahoo Exceptional Performance site, lots of great guidelines including improving front-end performance and their YSlow tool. Google page speed is another tool for performance profiling. Both require Firebug installed.
Use CSS Image Sprites for small related images like toolbars (see the "minimize http requests" point)
Busy web sites should consider splitting components across domains. Specifically...
Static content (i.e. images, CSS, JavaScript, and generally content that doesn't need access to cookies) should go on a separate domain that does not use cookies, because all cookies for a domain and its subdomains are sent with every request to the domain and its subdomains.
Minimize the total number of HTTP requests required for a browser to render the page.
Utilize Google Closure Compiler for JavaScript and other minification tools
Are you using JavaScript, and are these JavaScript files loaded at the very beginning? Sometimes that slows the page down... Minifying JS files helps reduce size, and if you can, load scripts dynamically after the page loads.
Using an approach like http://www.pageflakes.com can also help too, where the content is loaded after the fact.
Lastly, is the slowness related to your machine or your hosting? Running a tracert in the command window can help identify network issues.
HTH.
Have you identified any slow-running queries? You might consider running the profiler against your DB and seeing if anything is running long...
Before you do anything to change the code, you need to figure out where the problem actually is.
Which component is it that is "slow"?
The browser?
The server?
The network?
A stackoverflow user actually has a good book on this subject:
http://www.amazon.com/gp/product/1430223839?ie=UTF8&tag=mfgn21-20&linkCode=as2&camp=1789&creative=390957&creativeASIN=1430223839
A couple of recommendations after looking at your site:
Put some of your static files (images, js, etc.) on different domains so that they can be downloaded at the same time. (also turn off cookies for those domains)
Use image sprites instead of separate images.
Move around when things are loaded. It looks like the script files for the ads are holding up content. You should make content of the site load first by putting it in the HTML before the ads. Also, make sure that the height and width of things are specified such that the layout doesn't change as things are downloaded, this makes the site feel slow. Use Google Chrome's developer tools to look at the download order and timeline of all your object downloads.
Most of the slowness looks like it's coming from downloading items from sharepointads.com. Perhaps use fewer ads, or have them use space already reserved for them by specifying height and width.
Add a far future expires time to the header for all static content.
Serve scaled images. Currently the browser is resizing the images. You could save tons of bandwidth by serving the images already the proper size.
Also, download YSlow (from yahoo) and Page Speed (from google)
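For the far-future expires point above, here is a sketch of the headers involved (the one-year max-age is a common choice, not a requirement, and the helper name is invented):

```javascript
// Build far-future caching headers for static content.
var ONE_YEAR_SECONDS = 365 * 24 * 60 * 60;

function farFutureHeaders(now) {
  var expires = new Date(now.getTime() + ONE_YEAR_SECONDS * 1000);
  return {
    'Cache-Control': 'public, max-age=' + ONE_YEAR_SECONDS,
    'Expires': expires.toUTCString()
  };
}
```

Because the browser then never re-requests the file before the expiry, a changed file must be shipped under a new name (e.g. a version suffix in the filename).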
Another good post on performance.
Just check
http://howto-improveknowledge.blogspot.com/2011/11/performance-improvement-tips.html
which explains how to find performance bottlenecks.

Is it a good idea to put all JavaScript files' content into one file to reduce server requests, and to keep it at the bottom to increase performance?

I use simple JavaScript, the jQuery library, and many plugins. Should I make one file for them all? If yes, do I just copy and paste the code from all the files into one, in the needed order, or is there anything else to consider?
As stated here http://developer.yahoo.com/performance/rules.html#num_http
Combined files are a way to reduce the number of HTTP requests by combining all scripts into a single script, and similarly combining all CSS into a single stylesheet. Combining files is more challenging when the scripts and stylesheets vary from page to page, but making this part of your release process improves response times.
and this http://developer.yahoo.com/performance/rules.html#js_bottom
The problem caused by scripts is that they block parallel downloads. The HTTP/1.1 specification suggests that browsers download no more than two components in parallel per hostname. If you serve your images from multiple hostnames, you can get more than two downloads to occur in parallel. While a script is downloading, however, the browser won't start any other downloads, even on different hostnames.
If these are good practices, then:
How do I combine multiple JavaScript files into one without causing any conflicts?
Is it just the same as copying all the CSS code from all files into one, or is it trickier?
For each file you have, there are two steps :
send the HTTP request to the server
download the content of the file
If you reduce the number of files by combining them, you reduce the number of HTTP requests, which means your page will load a bit faster, which is good for your users, and which is why it's recommended.
But this makes debugging harder, which is why it's recommended to do it only in your production environment, and not on the development platform -- hence the "making this part of your release process" part.
Of course, the process of combining your files' content should not be done manually -- otherwise, you'd have to redo it each time a modification is made. It should be fully automated, and done at the time you build the archive that is going to be deployed to your production server.
Also :
You might gain a bit on the "download" part by using minification
You will gain a lot more on the "download" part by using compression (see mod_deflate, for Apache)
Ideally, you can use all three solutions, btw ;-)
Placing the <script> tags at the end of your page will:
allow the content of the page (which generally is what matters most) to be displayed faster
but will only work if your page/JS is coded "correctly" (i.e. unobtrusive JS, not JS "hardcoded" in the HTML page)
This can help too -- but might be a bit harder to achieve than combination + minification + compression.
There are several methods for improving javascript load performance.
Combine scripts into one file: I suggest only combining scripts you write/maintain yourself. Otherwise, if a third-party library is updated, it can be tough to update your combined file.
Use JSMin to reduce the size of javascript files, see http://www.crockford.com/javascript/jsmin.html.
Use Google's CDN for referencing JQuery and JQuery UI, see http://code.google.com/apis/ajaxlibs/documentation/, eg:
<script type='text/javascript' src='http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js'></script>
This avoids the user loading the file at all if their browser already has it cached.
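As a rough illustration of the kind of stripping JSMin performs, here is a deliberately naive version. It only removes comments and blank space, and it would mangle code containing `//` or `/*` inside string literals, so treat it as a sketch and use a real minifier in practice:

```javascript
// Naive minification sketch: strip block comments, whole-line comments,
// and leading/trailing whitespace. NOT safe for arbitrary JavaScript.
function naiveMinify(js) {
  return js
    .replace(/\/\*[\s\S]*?\*\//g, '')   // block comments
    .replace(/^\s*\/\/.*$/gm, '')       // whole-line // comments
    .split('\n')
    .map(function (line) { return line.trim(); })
    .filter(function (line) { return line.length > 0; })
    .join('\n');
}
```

Real minifiers like JSMin or Closure Compiler tokenize the source so that strings and regex literals survive; that is why they are worth using over a regex hack like this.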
For jQuery, you should be loading it from Google. Since a lot of sites use Google's jQuery, it is likely to be cached already, and even potentially compiled on the user's machine, which is about as good as one can possibly get. Cache beats all when it comes to JS optimization.
If you're using one set of JS files across all the pages on the site, you can get a similar effect by combining them into one file and using it everywhere; the browser will load it on the first page the user visits and then the JS will be cached.
However, if each of your pages uses a different set of files, the cache benefits will be vastly reduced and in fact it may be counterproductive, since the browser will detect a+b.js as a different file and will load it even if a.js and b.js are already cached. Additionally, combining the files in the right configurations for each page is a non-trivial dependency-tracking problem. In short, it's more trouble than it is worth unless you're serving millions of unique hits per day, and even then it might not be a good idea.
In any case, minification and compression should always be applied in production, since they have basically no downsides.
