I'm three months into web development and have been dabbling with some server-side scripting in ColdFusion, along with some JavaScript, jQuery and CSS.
I have read about CSS optimization and would like to know what other factors contribute to better website performance. Which factors can a developer profile and optimize?
How much of a part does picking (or rather, recommending) a particular browser play in this performance hunt?
cheers
Install YSlow and Pagespeed plugins for Firefox. Then start looking at all of the ways your site is unoptimized. This will be much like trying to take a sip of water from a fire hydrant.
Using minified (and possibly aggregated) JavaScript and CSS, along with a healthy far-future Expires header, is a really good way to start (see the sketch after this list).
Don't forget to gzip.
And use ETags.
And put your CSS at the top of the document.
And put the JavaScript at the end.
And use separate domains for static resources.
And avoid URL redirects.
And remove duplicate JavaScript and CSS.
And... see what I mean about the fire hydrant!
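To make the first few of those items concrete, here is a minimal sketch of gzip plus a far-future Expires policy. The Node/Express stack and the compression middleware are assumptions purely for illustration; the answer above doesn't prescribe a server, and the same headers can be set in Apache, IIS, etc.

```javascript
// Illustrative only: serve minified, aggregated assets compressed and with
// long-lived caching headers.
const express = require('express');
const compression = require('compression');

const app = express();

// gzip/deflate every response that benefits from it
app.use(compression());

// Static JS/CSS with a far-future expiry and ETags for conditional requests
app.use('/static', express.static('public', {
  maxAge: '365d',   // Cache-Control max-age of one year
  etag: true,
  immutable: true   // the file at this URL will never change
}));

app.listen(3000);
```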
To sum up the points above: the speed of a website depends on a few things:
Server
Connection
Client
And you can make improvements in each of these areas.
Server: if you rely on a database, check whether your queries are cached, and more importantly whether your data is cached. For example, if every page fetches a menu from the database, you can cache that result (a sketch follows below). In addition, check your code and see if there is room for optimization.
Also, the hardware itself plays a role. If you are on a shared hosting plan, the server may be full of other, unoptimized apps that take a toll on it.
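As a rough sketch of the menu example (the function names, query and TTL are made up; the answer doesn't prescribe any of this), a simple time-based cache in front of the query might look like:

```javascript
// Hypothetical in-memory cache for the "menu from the database" example.
// getMenu() only hits the database when the cached copy is older than the TTL.
let cachedMenu = null;
let cachedAt = 0;
const MENU_TTL_MS = 5 * 60 * 1000; // refresh at most every 5 minutes (arbitrary)

async function getMenu(db) {
  const now = Date.now();
  if (!cachedMenu || now - cachedAt > MENU_TTL_MS) {
    cachedMenu = await db.query('SELECT id, label, url FROM menu ORDER BY position');
    cachedAt = now;
  }
  return cachedMenu;
}
```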
Connection: here YSlow and Page Speed come in handy, as does Fiddler. You can cache static content (CSS and JS): set its expiry date far in the future, use gzip to make the files smaller, and combine the static files, which helps to a certain extent.
In addition, the server may simply have low bandwidth.
Client: wacky JavaScript or slow CSS selectors can hurt performance on the client, although this also depends on the speed of the client's computer.
I'd recommend reading Best Practices for Speeding Up Your Web Site and all of the content on Yahoo!'s Exceptional Performance page.
If you like books, you may be interested in High Performance Websites (note that a lot of the content in this is in the Best Practices for Speeding Up Your Web Site article) and Even Faster Websites.
Here are a few of my favourite rules from Best Practices for Speeding Up Your Web Site:
Minimize HTTP Requests
Add an Expires or a Cache-Control Header
Gzip Components
Make JavaScript and CSS External
Minify JavaScript and CSS
Also, smush.it is good for compressing images (which have a big impact on how fast a webpage loads).
As far as browsers go, Safari 4 claims it is "the world's fastest browser", and I can say that the Mac version is certainly nice and fast (not to mention elegant!). However, the above suggestions will make much more of a difference than which browser you're using.
Steve
With ColdFusion you will want to make sure your queries are being cached. Use Query Analyzer (if you're using MS SQL Server) to make sure a slow-loading page isn't the result of a bad query. On the database end you'll also want to ensure proper indexing.
A big factor in performance is how many HTTP requests are sent for images, files, etc. YSlow will show you this info; it's only available for Firefox.
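If you just want the raw request count without installing anything, the browser's Resource Timing API gives a quick approximation. This is an aside, not part of the original answer, and it needs a reasonably modern browser:

```javascript
// Paste into the browser console: counts the resources the page requested
// and lists the ten slowest ones.
const resources = performance.getEntriesByType('resource');
console.log('Requests:', resources.length);
resources
  .slice()
  .sort((a, b) => b.duration - a.duration)
  .slice(0, 10)
  .forEach(r => console.log(Math.round(r.duration) + ' ms  ' + r.name));
```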
I'd recommend this book.
Google is currently collecting all sorts of performance tips on their new 'Let's make the web faster' page here: http://code.google.com/intl/de-CH/speed/articles/
FYI: not all the information on these pages is valid; in particular, the PHP tips are way off.
There is a really great plugin for Firefox called Dust-Me Selectors. It scans your CSS files and lets you find selectors that aren't used or have become redundant in your markup.
https://addons.mozilla.org/en-US/firefox/addon/5392
You should also be delivering your static content off a CDN. Parallel downloads of static files will speed up your page renders. A better explanation here: http://www.sitecheck.be/tips/speed-up-your-site-using-a-cdn/
You shouldn't recommend any particular browser; design your web page to current standards, with fixes for older browsers where necessary. From my perspective everything can have a speed impact, but CSS is the least important one, and in real-world examples the user will not notice it. In most cases a clean separation of HTML and style declarations will do the job. What really has an impact? First of all, you can simply throw money at the problem by getting a better hosting contract (maybe a dedicated server). Other ways to improve load time are reducing the quality (and size) of your images and using CSS sprites. Very often on dynamic web pages the database is a bottleneck, so caching and a good database abstraction layer can improve things (in PHP: PDO instead of the plain mysql functions). Gzip your output to the user. There are many more things, but a lot of them are very language-dependent.
I recommend the use of FireBug and loadimpact.com for testing.
Fewer files are better - CSS sprites may be something to consider. In my experience, you have to balance your CSS files between speed and maintainability - one rule more or less won't make the difference between night and day...
For IE, see http://www.fiddler2.com/fiddler/Perf/
The new neXpert plugin for Fiddler (http://www.fiddler2.com/fiddler2/addons/nexpert.asp) offers features similar to those found in YSlow and PageSpeed.
The biggest problem I have is creating fast-running, beautifully designed pages with rich content. This is one thing that is extremely hard to do with today's technology.
If you have lots of JavaScript you might want to use JavaScript compression. Dojo provides one such tool, ShrinkSafe, to compress your JavaScript. Find the link below:
http://www.dojotoolkit.org/docs/shrinksafe
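ShrinkSafe is the tool linked above; purely as an illustration of the same idea, a build step with a present-day minifier (Terser, which is not mentioned in this answer, with placeholder file paths) looks roughly like this:

```javascript
// Hypothetical minification step: read a source file, minify it, write the
// compressed version for the browser.
const fs = require('fs');
const { minify } = require('terser');

async function build() {
  const source = fs.readFileSync('src/app.js', 'utf8');   // placeholder path
  const result = await minify(source);                    // strips whitespace, shortens names
  fs.writeFileSync('public/app.min.js', result.code);
}

build();
```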
There is another tool, open-sourced by Google, called Page Speed, which can help you optimize website performance. It was used internally before being open-sourced to everyone recently.
http://google-code-updates.blogspot.com/2009/06/introducing-page-speed.html
http://code.google.com/speed/page-speed/
Hope it helps.
A couple of very basic rules of performance testing:
Performance means nothing if the program/web page/whatever is wrong.
Do not try to improve performance without having a reliable form of measurement.
You must profile your site/program/whatever to find out what is making things slow.
Corollary: do not just change things at random to see if things get better.
Cache everything (web server and browser caching).
Statically publish as much as possible (i.e. to reduce the amount of database calls)
Also, add multiple waiting icons to your website.
Show the icons in such a way that the user gets a different waiting icon each time; this helps keep the user engaged while the page loads.
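A small, hypothetical sketch of that idea (the element id and file names are invented): pick a different waiting icon each time a loading state is shown.

```javascript
// Rotate through several spinner images so the user sees variety while
// content loads. All names here are placeholders.
const spinners = ['spinner-dots.gif', 'spinner-ring.gif', 'spinner-bars.gif'];

function showRandomSpinner() {
  const img = document.getElementById('loading-indicator'); // hypothetical element
  if (!img) return;
  img.src = '/images/' + spinners[Math.floor(Math.random() * spinners.length)];
  img.style.display = 'inline';
}
```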
Would CSS parse quicker if it was in the order that it's needed on the page?
If so, what sort of difference would it make, and are there any online tools that would sort it into that order?
As others have said, as far as CSS parsing goes, I would assume the difference to be negligible. However, order definitely matters in the HTML when loading JS or CSS. It becomes especially noticeable when downloading from multiple CDNs.
If you are concerned and can host the site on either github.io or your own webpage, use Google Pagespeed Insights to benchmark performance on more basic things.
I have an ASP.NET website, http://www.go4sharepoint.com
I have tried almost every way I know to improve the performance of this site. I have even checked it with Firebug and the Page Speed addon in Firefox, but somehow I am not pleased with the result.
I have also tried removing whitespace, removing ViewState, optimizing the code that renders the page, and applying gzip, and I use no heavy session variables, but when I compare it with other popular websites it is still not up to the mark.
I checked the CodeProject website and was surprised that, even though they display a lot of content, their website loads fast and has a good loading rate.
To all experts: please suggest where I am going wrong in my development.
Thank you.
First of all, I looked at your pages just now and they are not gzipped.
You ask about gzip, but it seems that in the end the pages are not gzipped.
Second, your pages come back fast, they are small, and the lag time is low, which means your SQL calls are fine.
The only problem I see is the "banner.php" page, which for some reason seems to cause the delay. A piece of JavaScript makes the call to banner.php, waits until it returns, renders it, and only then continues.
Check these two issues to fix your slow load.
About banner.php
Here is one of the calls your page makes:
http://sharepointads.com/members/scripts/banner.php?a_aid=go4sharepoint&a_bid=ac43d413
and you make at least 9 of them on the first page alone!
Each of these calls has about 400 ms of lag, times roughly 10, plus the time to load and render - that is the delay you are looking for, and it is not coming directly from you. You need to find some other way to load them...
I can suggest another way, but I must go now... maybe tomorrow.
gzip
An external test to prove that your pages are not gzipped - just see the report.
When optimizing the HTML visible to the client, the server side is sometimes neglected. What about:
Server-side caching - from entire-page caching to data caching
Reducing the number of database queries executed - and once data is retrieved from the database, caching it
Is your server hardware up to it? Memory, CPU?
EDIT:
And for completeness, here's the list from the performance section of the popular question What should a developer know before building a public web site?
Implement caching if necessary, understand and use HTTP caching properly
Optimize images - don't use a 20 KB image for a repeating background
Learn how to gzip/deflate content (deflate is better)
Combine/concatenate multiple stylesheets or multiple script files to reduce number of browser connections and improve gzip ability to compress duplications between files
Take a look at the Yahoo Exceptional Performance site, lots of great guidelines including improving front-end performance and their YSlow tool. Google page speed is another tool for performance profiling. Both require Firebug installed.
Use CSS Image Sprites for small related images like toolbars (see the "minimize http requests" point)
Busy web sites should consider splitting components across domains. Specifically...
Static content (i.e., images, CSS, JavaScript, and generally content that doesn't need access to cookies) should go on a separate domain that does not use cookies, because all cookies for a domain and its subdomains are sent with every request to that domain and its subdomains.
Minimize the total number of HTTP requests required for a browser to render the page.
Utilize Google Closure Compiler for JavaScript and other minification tools
Are you using JavaScript, and are these JavaScript files loaded at the very beginning? Sometimes that slows the page down... Minifying JS files helps reduce size, and if you can, load scripts dynamically after the page loads.
Using an approach like http://www.pageflakes.com can also help too, where the content is loaded after the fact.
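A minimal sketch of the "load scripts dynamically after the page loads" suggestion above (the script URL is a placeholder):

```javascript
// Inject a non-critical script only after the page has finished loading,
// so it doesn't block the initial render.
window.addEventListener('load', function () {
  const script = document.createElement('script');
  script.src = '/js/non-critical.js'; // placeholder path
  script.async = true;
  document.body.appendChild(script);
});
```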
Lastly, is it speed related to your machine or hosting? Doing a tracert in the command window can help identify the network traffic.
HTH.
Have you identified any slow-running queries? You might consider running Profiler against your DB and seeing if anything is running long...
Before you do anything to change the code, you need to figure out where the problem actually is.
Which component is it that is "slow"?
The browser?
The server?
The network?
A stackoverflow user actually has a good book on this subject:
http://www.amazon.com/gp/product/1430223839?ie=UTF8&tag=mfgn21-20&linkCode=as2&camp=1789&creative=390957&creativeASIN=1430223839
A couple of recommendations after looking at your site:
Put some of your static files (images, js, etc.) on different domains so that they can be downloaded at the same time. (also turn off cookies for those domains)
Use image sprites instead of separate images.
Move around when things are loaded. It looks like the script files for the ads are holding up content. You should make content of the site load first by putting it in the HTML before the ads. Also, make sure that the height and width of things are specified such that the layout doesn't change as things are downloaded, this makes the site feel slow. Use Google Chrome's developer tools to look at the download order and timeline of all your object downloads.
Most of the slowness looks like it comes from downloading items from sharepointads.com. Perhaps use fewer ads, or have them use space already reserved for them by specifying height and width.
Add a far future expires time to the header for all static content.
Serve scaled images. Currently the browser is resizing the images. You could save tons of bandwidth by serving the images already the proper size.
Also, download YSlow (from yahoo) and Page Speed (from google)
Another good post on performance - just check
http://howto-improveknowledge.blogspot.com/2011/11/performance-improvement-tips.html
which explains how to find performance bottlenecks.
I've just learned in this question that an ASP.NET WebForms application will have issues running properly in a browser with JavaScript disabled (unless I avoid certain controls and features).
So this is a kind of follow-up question. For my current web application, which will only have a small number of users, I can simply require JavaScript to be enabled.
But how should this be dealt with in general? As far as I can see (with my rather limited knowledge of web development so far), JavaScript is everywhere, especially the more "dynamic" or RIA-like a web site is.
Is it worth at all to take care about the few users during web app development who disable Javascript in their browsers?
Are they really "few"? (I actually have no clue, I just don't know anyone who has Javascript disabled.)
I'm a bit inclined to tell them: "If you want a nice interactive page, don't disable JavaScript," or "You cannot enter this website without JavaScript. Go away!" Because I don't want to double the coding and development time for a few mavericks (if they are mavericks). Does it really add a lot more time to get a website working both with and without JavaScript?
For what reason does someone disable JavaScript in their browser at all? Well, I've heard: "Security!" How insecure is JavaScript actually? And does that mean that millions of pages are insecure and almost every website I use is a risk for me? Are there other reasons besides security?
Are there any branches or environments where it is usual or even mandatory to disable Javascript for security (or other) reasons? (For instance I have security services, offices in defense ministries or a bank in mind.)
Are there any development trends toward adding more and more JavaScript to web sites, making it increasingly necessary to leave JavaScript enabled in the browser? Or is there rather a counter-movement?
Thank you for some feedback in advance!
Whether or not to care about users who turn off JavaScript should be decided on a case-by-case basis. If you believe it is OK to turn away users who do not have it enabled, then that is a decision you can make, and many apps make it.
Keep in mind that having JavaScript disabled or limited is not necessarily a conscious decision. Screen readers, for example, have a very stripped-down version of JavaScript, and a site that relies on it throughout will often be inaccessible. Depending on the website, this may actually be illegal.
If a website is properly constructed with progressive enhancement from the beginning, then making it work without JavaScript should not be much additional work. Therein lies one of the major issues with WebForms: it is difficult to gain control over the markup, and the JavaScript tends to be very tightly coupled.
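As a minimal sketch of progressive enhancement (the selectors and endpoint are placeholders, and fetch stands in for whatever AJAX mechanism you use): the form submits normally without JavaScript and is only upgraded when script is available.

```javascript
// Without JavaScript the form does a normal full-page POST; with JavaScript
// it is enhanced to submit in place. Nothing breaks if this file never runs.
document.addEventListener('DOMContentLoaded', function () {
  const form = document.querySelector('form[data-enhance]'); // placeholder hook
  if (!form || !window.fetch) return; // keep the plain HTML behaviour

  form.addEventListener('submit', async function (event) {
    event.preventDefault();
    const response = await fetch(form.action, { method: 'POST', body: new FormData(form) });
    const out = form.querySelector('.result'); // placeholder element
    if (out) out.textContent = await response.text();
  });
});
```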
My personal view is that the number of people who completely disable JavaScript, to the extent that .NET web sites stop functioning, is very small - for some government sites I have been responsible for, I don't recall getting any complaints from "non-JavaScript" users.
My concern would be more about making sure your site is XHTML-compliant with valid markup (which earlier versions of Visual Studio did not generate), valid CSS, and intelligent use of JavaScript.
Having a disclaimer somewhere on your site - accessible to those few with JavaScript disabled - telling people that JavaScript is required for the site to function correctly would be a good thing.
Depends on your audience. For one thing, if the site is completely nonfunctional without JavaScript, accessibility (e.g. to those who must use a screen reader) may be an issue, so if you expect any blind users, you might need to consider that.
In most situations, I'd say, you're probably fine just using <noscript> tags to drop in a quick disclaimer along the lines of "JavaScript is required to use this web site."
While I have no solid numbers -- and I presume I'm in the distinct minority -- I, and many of the more savvy users I know, disable JavaScript by default (a la NoScript). I enable it on websites on a case by case basis. Most novice users (I'm ignoring the 25% in the upper/middle of experience) don't even know what "JavaScript" means.
As a developer, I see the cost/benefit of supporting JS-less users as boiling down to one question:
Do your users need the site more, or do you need your users more?
One of my current projects makes heavy use of JavaScript and Flash and does not function at all without it. But as it's installed at the employer's site and the visitors are the employees using it for their job, that requirement is completely reasonable.
However, if I were working on a revenue-generating site where losing users meant losing money, I'd seriously think about crafting the site to work w/o JS -- albeit less quickly with more page reloads. (Perhaps by asking advice here.)
As a random thought -- you could probably devise a special method to determine how many of your current users do and don't have JS enabled.
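One hypothetical version of that "special method": fire a tiny beacon from JavaScript and compare its count in the server logs against total page views; visitors who never hit the beacon URL are the ones without JavaScript. The URL here is made up.

```javascript
// Report "this visitor has JavaScript" to a logging endpoint (placeholder URL).
// Page views minus beacon hits roughly equals visitors without JavaScript.
if (navigator.sendBeacon) {
  navigator.sendBeacon('/stats/js-enabled');
} else {
  new Image().src = '/stats/js-enabled.gif'; // older-browser fallback
}
```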
I've recently begun working on a very large, high-traffic website. We would very much like to reduce the size and number of our style sheets. Minification is one route we will pursue, but is anyone aware of any tools for checking ID and class use? Literally scanning the website to see what's active and what isn't?
Alternatively, any software for redacting the CSS to reduce repetition and size?
Thanks in advance
Literally scanning the website to see what's active and what isn't?
Dust-Me Selectors is a Firefox plugin that you can use to show what css rules aren't being used.
http://www.sitepoint.com/dustmeselectors/
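For a rough, single-page approximation of what Dust-Me Selectors does, you can also scan the current page's stylesheets from the browser console. This is an aside, not part of the original answer, and it only covers the page you run it on, not the whole site:

```javascript
// List selectors in same-origin stylesheets that match nothing on this page.
const unused = [];
for (const sheet of Array.from(document.styleSheets)) {
  let rules;
  try { rules = sheet.cssRules; } catch (e) { continue; } // cross-origin sheets are unreadable
  if (!rules) continue;
  for (const rule of Array.from(rules)) {
    if (!rule.selectorText) continue; // skip @media, @font-face, etc.
    try {
      if (!document.querySelector(rule.selectorText)) unused.push(rule.selectorText);
    } catch (e) { /* selector not understood by querySelector */ }
  }
}
console.log(unused);
```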
I can certainly recommend Page Speed (http://code.google.com/speed/page-speed/) by Google to check the performance (and possible improvements) of your webpages.
Page Speed also checks CSS and usage of classes on your webpages.
It is used in combination with Firebug.
Gzip compression in the webserver.
Expiry dates that lie far in the future to avoid redownloading the CSS files.
Alternatively, any software for redacting the CSS to reduce repetition and size?
Yet another level of indirection... You (and your team) should write long CSS files with as many comments as needed, and then write a tool that publishes merged files as needed (different templates need different files), with comments stripped and the output minified, as http://www.cleancss.com (CSSTidy) can do. Readability comes first if you want to be able to modify a file in a month or keep track of modifications (or worse, if somebody else must do that!).
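A hypothetical Node sketch of that publish step (the file names are placeholders; a real build would use CSSTidy, clean-css or similar rather than regexes):

```javascript
// Merge the commented, readable source files into one stripped file for publishing.
const fs = require('fs');

const sources = ['css/reset.css', 'css/layout.css', 'css/theme.css']; // placeholders

const merged = sources
  .map(file => fs.readFileSync(file, 'utf8'))
  .join('\n')
  .replace(/\/\*[\s\S]*?\*\//g, '') // strip comments
  .replace(/\s+/g, ' ')             // crude whitespace collapse
  .trim();

fs.writeFileSync('public/site.min.css', merged);
```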
Another option is to reduce the number of templates used throughout the site. There is no need for two templates with 2px differences (grid layouts are a good way to stick to this) or for inconsistent ways of displaying error messages. Define a common look and feel for your site and give instructions to the web designers, if that isn't already done.
There are probably thousands of applications out there like 'Google Web Accelerator' and all kinds of popup blockers. Then there are header-blocking personal firewalls, full-site blockers, and paranoid cookie monsters.
Fortunately Web Accelerator is now defunct (I suggest you read the above article - it's actually quite funny what issues it caused), but there are so many other plugins and third-party apps out there that it's impossible to test them all with your app until it's out in the wild.
What I'm looking for is advice on the most important things to remember when writing a web app (in whatever technology) to ensure the user's environment isn't going to break it. Kind of like a checklist.
What's the craziest thing you've experienced?
PS. I may have linked to Net Nanny above, but I'm not trying to make a porn site.
The best advice I can give is to program defensively. For example, don't assume that all of your scripts have actually loaded. I've seen cases where Adblock Plus will block one of the ten scripts included in a page just because it has the word "ad" in the name or path. While you can work around this by renaming the file, it's still good to check that a particular object exists before using it.
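A small sketch of that defensive check (the library name is invented): test that the object the blocked script would have defined actually exists before calling it.

```javascript
// "adLib" stands in for any third-party script an ad blocker might strip out.
if (window.adLib && typeof window.adLib.showBanner === 'function') {
  window.adLib.showBanner('sidebar');
} else {
  // The script never loaded; degrade gracefully instead of throwing.
  var slot = document.getElementById('sidebar-ad'); // placeholder element
  if (slot) slot.style.display = 'none';
}
```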
The weirdest thing I've seen wasn't so much a browser plugin but a firewall/proxy configuration at a user's workplace. They were using a squid proxy that was trying to remove ads by replacing any image HTTP request that it thought was an ad with a single pixel GIF image. Unfortunately it did this for non-GIF images too so when our iPhone application was expecting a PNG image and got a GIF, it would crash.
Internet Explorer 6. :)
No, but seriously: Firefox plugins like NoScript and Greasemonkey, for one, though those are likely to be used by a very small minority.
Sometimes the user's environment means a screen reader (or even a braille interface like this). If your layout is in any way critical to the content being delivered as intended, you've got a problem right there.
Web pages break, fact of life; the closer you have been coding and designing up against standards, the less your fault it is.
Something I have checked in the past is loading some of the more popular toolbars that people tend to install (Google, Yahoo, MSN, etc.) and seeing how that affects the user's experience.
To a certain extent it is difficult to preempt which of the products you mentioned will be used by your users since there are so many. I would say your best bet is to test for the most frequent products that your user base may employ and roll with the punches for the rest. If you have the time to test other possible scenarios, by all means do.
Also, making it easy for your users to report possible issues also helps lessen the time it takes to get a fix in place should it be something you can work around.