ASP.NET Website Performance Improvement Checklist [closed]

I have an ASP.NET website, http://www.go4sharepoint.com.
I have tried almost every way to improve the performance of this site. I have even checked it with Firebug and the Page Speed add-on for Firefox, but somehow I am not pleased with the results.
I have also tried removing whitespace, removing ViewState, optimizing the code that renders the page, and applying GZip, and I use no heavy session variables, but when I compare the site with other popular websites it is still not up to the mark.
I checked the CodeProject website and was surprised that even though they display a lot of content, their website loads fast and has a good loading rate.
To all experts: please suggest where I am going wrong in my development.
Thank you.

First of all, I looked at your pages just now and they are not gzipped.
You asked about gzip, but it seems that in the end the pages are not gzipped.
Second, your pages arrive very fast and they are small, and the lag time is low, which means your SQL calls are fine.
The only problem I see is the "banner.php" page, which seems to be what causes the delay. A JavaScript call requests banner.php and waits until it gets the response, renders it, and only then continues.
Fix these two issues to cure your slow load.
About banner.php
Here is one of the calls that your page makes:
http://sharepointads.com/members/scripts/banner.php?a_aid=go4sharepoint&a_bid=ac43d413
and you make at least 9 of them on the first page alone.
Each of these calls has about 400 ms of lag; multiply that by 10, add the time to load and render the responses, and that is the delay you are looking for, and it is not coming directly from you. You need to find some other way to load them...
I can suggest another way, but not now, I must go... maybe tomorrow.
gzip
An external test proves that your pages are not gzipped. Just see the report.

When optimizing the HTML visible to the client, the server side is sometimes neglected. What about:
Server-side caching - from entire-page caching to data caching
Reducing the number of database queries executed. And once data is retrieved from the database, cache it.
Is your server hardware up to it? Memory, CPU?
EDIT:
And for completeness, here's the list from the performance section of the popular question What should a developer know before building a public web site?
Implement caching if necessary, understand and use HTTP caching properly
Optimize images - don't use a 20 KB image for a repeating background
Learn how to gzip/deflate content (deflate is better)
Combine/concatenate multiple stylesheets or multiple script files to reduce the number of browser connections and improve gzip's ability to compress duplication between files
Take a look at the Yahoo! Exceptional Performance site: lots of great guidelines, including on improving front-end performance, plus their YSlow tool. Google Page Speed is another tool for performance profiling. Both require Firebug to be installed.
Use CSS Image Sprites for small related images like toolbars (see the "minimize http requests" point)
Busy web sites should consider splitting components across domains. Specifically...
Static content (i.e., images, CSS, JavaScript, and generally content that doesn't need access to cookies) should go on a separate domain that does not use cookies, because all cookies for a domain and its subdomains are sent with every request to that domain and its subdomains.
Minimize the total number of HTTP requests required for a browser to render the page.
Use the Google Closure Compiler or other minification tools for JavaScript

Are you using JavaScript, and are those JavaScript files loaded at the very beginning? Sometimes that slows the page down... Minifying JS files helps reduce their size, and if you can, load scripts dynamically after the page loads, as in the sketch below.
Using an approach like http://www.pageflakes.com can also help, where the content is loaded after the fact.
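A minimal sketch of that deferred-loading idea; the script path and the initAds() global are hypothetical, not anything from the original posts:

    // Load a non-critical script only after the page has finished loading.
    // The path and the initAds() global are made-up examples.
    window.onload = function () {
        var script = document.createElement('script');
        script.type = 'text/javascript';
        script.src = '/scripts/ads.js';
        script.onload = function () {
            if (window.initAds) {
                window.initAds(); // initialize the deferred feature
            }
        };
        document.getElementsByTagName('head')[0].appendChild(script);
    };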
Lastly, is the slowness related to your machine or your hosting? Running tracert in a command window can help identify where the network traffic slows down.
HTH.

Have you identified any slow-running queries? You might consider running Profiler against your DB and seeing if anything is running long...

Before you do anything to change the code, you need to figure out where the problem actually is.
Which component is it that is "slow"?
The browser?
The server?
The network?

A Stack Overflow user actually has a good book on this subject:
http://www.amazon.com/gp/product/1430223839?ie=UTF8&tag=mfgn21-20&linkCode=as2&camp=1789&creative=390957&creativeASIN=1430223839
A couple of recommendations after looking at your site:
Put some of your static files (images, JS, etc.) on different domains so that they can be downloaded at the same time. (Also turn off cookies for those domains.)
Use image sprites instead of separate images.
Rearrange when things are loaded. It looks like the script files for the ads are holding up content. You should make the site's content load first by putting it in the HTML before the ads. Also, make sure that the height and width of things are specified so that the layout doesn't change as things are downloaded; layout shifts make the site feel slow. Use Google Chrome's developer tools to look at the download order and timeline of all your object downloads.
Most of the slowness looks like it's coming from downloading items from sharepointads.com. Perhaps use fewer ads, or have them use space already reserved for them by specifying height and width.
Add a far-future Expires time to the header for all static content.
Serve scaled images. Currently the browser is resizing the images; you could save tons of bandwidth by serving images already at the proper size.
Also, download YSlow (from Yahoo) and Page Speed (from Google).

Another good post on performance.
Just check
http://howto-improveknowledge.blogspot.com/2011/11/performance-improvement-tips.html
which explains how to find performance bottlenecks.


Concurrent Downloads

I've been monitoring the Net panel of Firebug and noticed that the HTML has to be downloaded first, before any other resources are downloaded. I guess this makes sense, since the other resources are defined in the HTML. Is there a way around this, so that other components can be downloaded during the HTML download?
Debugging 101: what you see while debugging is different from what happens when you are not looking.
Most browsers start the HTML interpretation while downloading it, and start downloading the additional resources concurrently. Firebug is not a great place to see that happening, try HTTPFox instead.
Now, to answer your question: you don't need to do anything to make the browser download the other components while downloading your HTML, it'll take care of that for you.
No - the browser needs a parseable HTML document first before it can start downloading scripts, images, etc.
You can speed up downloading of the non-HTML elements by moving them to different subdomains though: Browsers have a connections-per-host limit which is circumvented by using subdomains. Additionally you could compress/minify your CSS/JavaScript files to reduce their size.
One could create a small HTML file that then makes several AJAX-style requests to fill in the rest of the page, but if someone has JavaScript disabled then the page may look really bad. In a sense this takes some of the original HTML content and has it downloaded separately, which may or may not be a good idea. It also uses more network resources, since many requests are needed to fully load the page, so it is a question of what trade-off is acceptable.
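A minimal sketch of that fill-in pattern; the fragment URLs and element IDs are made up for illustration:

    // After the small initial page arrives, fetch the remaining fragments
    // in parallel and inject them. URLs and IDs are hypothetical.
    function loadFragment(url, targetId) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', url, true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                document.getElementById(targetId).innerHTML = xhr.responseText;
            }
        };
        xhr.send(null);
    }

    loadFragment('/fragments/sidebar.html', 'sidebar');
    loadFragment('/fragments/comments.html', 'comments');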

Why do people still use iframes? [closed]

For me, iframes are pure evil (well, maybe not so pure). They seem to cause a lot of trouble. Yes, your whole site will load once and then you can just load single pages, but people invented AJAX for that purpose.
One of the biggest issues I found with iframes was that I couldn't paste a link to one of the subpages, because the URL never changed (yes, I know there is a workaround for this). Second, web search engines may have problems indexing such pages.
Sometimes the accessibility of these sites is worse, and some browsers can even display them improperly.
There are better ways to design layouts without iframes. Every day I see someone on SO asking questions like "How to access an iframe with jQuery?".
So what are the benefits of iframes? What reasons can there be to still use them? I just would like to know why.
I can think of 2 reasons (at the moment) why people would still use iframes instead of AJAX:
1) Iframes circumvent the cross-domain origin policy (images, scripts, and styles do not). This can be useful for pulling in sites or content from other domains relatively safely. Basically, it gives you the advantage of visually showing data from other domains without letting them stomp all over your page with unlimited access (which something like JSONP would allow).
2) You can load any type of resource from within an iframe, not just certain MIME types (with scripts, XHR, images, and sources you're relatively limited to application/javascript, application/x-javascript, text/css, text/xml, image/png, image/jpeg, and image/gif). For instance, if I want to show you a PDF, I can open an iframe and let the Adobe Reader plugin render that file. Additionally, on the same domain, if I want to pipeline a script, style, and image all together (inline on the page; the image would have to be a data URI), I can accomplish this with an iframe (and if it's on the same domain, port, and protocol, I can access it with JavaScript as well).
Did you know that Gmail is a set of iframes? The visible part is just clever positioning. Additionally, many OAuth implementations (Twitter, Facebook, Google, Yahoo!) use iframes to associate a user on their domain with a successful authentication URL (for after the user logs in).
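As a small illustration of point 2, a sketch of dropping a PDF into the page via a dynamically created iframe (the file path is a placeholder):

    // Let the browser (or the Adobe Reader plugin) render a PDF inside
    // an iframe. The path is a hypothetical same-domain file.
    var frame = document.createElement('iframe');
    frame.src = '/docs/manual.pdf';
    frame.width = '600';
    frame.height = '400';
    frame.frameBorder = '0';
    document.body.appendChild(frame);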
IFRAMEs are used to embed and isolate third-party content in a website.
Most web advertising solutions are based on iframes, because they give security (the cross-domain policy) and an isolated rectangle on screen which can be fully managed by third-party content and scripting (a common use case is advertisements).
Another modern use of IFRAMEs is managing history (a common back-button workaround) in AJAX applications.
FRAMEs are a poor version of IFRAMEs. Their use is declining.
If a user has JavaScript disabled, iframes will work when AJAX doesn't. This is not out of the question, considering that people use things like NoScript.
I use them on AJAX websites when I need to upload files without reloading the page.
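A minimal sketch of that hidden-iframe upload trick; the form ID and the server-side handler URL are hypothetical:

    // Point the form's target at an invisible iframe so the POST (with its
    // file input) happens without a full page reload. Names are placeholders.
    var uploadDone = false;
    var iframe = document.createElement('iframe');
    iframe.name = 'uploadTarget';
    iframe.style.display = 'none';
    iframe.onload = function () {
        // Ignore the initial about:blank load; only react to the real response.
        if (uploadDone) {
            alert('Upload finished');
        }
    };
    document.body.appendChild(iframe);

    // Hypothetical form with an <input type="file"> and multipart encoding.
    var form = document.getElementById('uploadForm');
    form.target = 'uploadTarget';
    form.action = '/upload'; // hypothetical server-side handler
    uploadDone = true;
    form.submit();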
I still see iframes being used in large corporations that provide single sign-on: a portal injects header information about the authenticated user, which is then passed, via an iframe, to the actual application(s). Since the "portal" surrounding the iframe handles all the specific authentication details, the applications behind it don't each need their own implementation, making things easier for the development team and giving a single place to monitor and adjust users' authentication details.
There are plenty of technical reasons to use them (especially the security issue mentioned by Dan Beam).
What you shouldn't do is use iframes “like frames”, doing navigation to new pages by updating the iframe only. As you say, this prevents the navigation from being bookmarkable/linkable, responding to the normal navigation buttons, and providing useful link affordances like open-in-new-tab.
But that's not peculiar to iframes. You can see more and more pages where the navigation is done by fetching new content with XMLHttpRequest and writing it to the main content div's innerHTML. Often this is done with jQuery load() and clever-clever slidey animations. This breaks navigation just as badly as iframe-used-as-frame, or indeed old-school framesets. It's a shame so many web authors are using this tactic believing it to be a super-modern web design methodology, when really it's just a new skin on yesterday's despised framesets.
You can work around it in both cases, but it means you have to store a viewstate in the # fragment identifier part and support proper hash-navigation, which isn't trivial. Even then you've still got problems with non-JS agents like search engines; you end up having to have a parallel ?-based and #-based navigation to support both. It's a pain and most don't bother.
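One minimal sketch of the hash-navigation workaround described above; loadView and the content element are hypothetical:

    // Keep the current view in location.hash so back/forward and
    // bookmarks keep working. loadView() is a made-up content loader.
    function loadView(view) {
        document.getElementById('content').innerHTML = 'Loading ' + view + '...';
        // ...fetch and insert the real content for the view here...
    }

    function onHashChange() {
        var view = location.hash.replace('#', '') || 'home';
        loadView(view);
    }

    // hashchange support was still patchy at the time, hence the polling fallback.
    if ('onhashchange' in window) {
        window.onhashchange = onHashChange;
    } else {
        var lastHash = location.hash;
        setInterval(function () {
            if (location.hash !== lastHash) {
                lastHash = location.hash;
                onHashChange();
            }
        }, 100);
    }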
Framesets are outdated as of HTML5, and sometimes you need a frame to embed another site within your site.
Also AJAX can only do so much. Try uploading a file to a site on another domain through https without an iframe. AJAX won't help you there.
In addition to the other reasons, I have one specific use of iframes in my application. Unfortunately, the target browser in my case is Internet Explorer 6. I need a footer and a header that are fixed in my web pages, while the main part of the page is scrollable.
However, there is a bug in IE6 where I cannot display a div element on top of select elements using the z-index CSS property. Thus, I need to create an iframe that is used as a hack to avoid this problem, as sketched below.
Of course, this is a really specific use of iframes, and it only concerns IE6...
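A minimal sketch of that IE6 "iframe shim" hack; the element ID and the z-index values are placeholders:

    // Place an empty iframe directly behind the overlay div so windowed
    // controls like <select> stop bleeding through in IE6.
    var overlay = document.getElementById('popup'); // hypothetical overlay div
    var shim = document.createElement('iframe');
    shim.src = 'javascript:false;'; // avoids security warnings in IE over https
    shim.style.position = 'absolute';
    shim.style.top = overlay.offsetTop + 'px';
    shim.style.left = overlay.offsetLeft + 'px';
    shim.style.width = overlay.offsetWidth + 'px';
    shim.style.height = overlay.offsetHeight + 'px';
    shim.style.zIndex = '99'; // just below the overlay's z-index
    shim.style.filter = 'alpha(opacity=0)'; // make the shim invisible in IE
    overlay.parentNode.insertBefore(shim, overlay);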
JavaScript WYSIWYG editors use iframes, because that is the easiest and best way to build them. For example, TinyMCE uses one:
http://tinymce.moxiecode.com/
I was building a social network and found iframes useful for widgets placed on other people's websites, for example to show a mini profile or integrate with content on a remote server. It seems like the simplest way to build this. I know some widgets use JavaScript instead. Also, with the iframe method the session is the same as when visiting the site normally, which is great for things like Like buttons.
Many formatted-text editors (e.g. TinyMCE, HTMLArea) are implemented as iframes.
iFrames are okay in some cases, such as cross-domain requests or posting data to a source via parameters. But when I want to access data across domains, I prefer using CSS files - they can accept params, set cookies, add content to the page (:before & :after) and give visual feedback.

How to make a website run faster? [closed]

I'm in my first 3 months of web development and I have been dabbling with some server-side scripting in the form of ColdFusion, along with some JavaScript, jQuery and CSS.
I have read about CSS optimization and would like to know what the other pertinent factors contributing to better website performance are. What factors can a developer profile and optimize?
How big a part does picking (or rather, I should say, recommending) a particular browser play in this performance hunt?
cheers
Install the YSlow and Page Speed plugins for Firefox. Then start looking at all of the ways your site is unoptimized. This will be much like trying to take a sip of water from a fire hydrant.
Using minified (and possibly aggregated) JavaScript and CSS along with a good, healthy far-future Expires header is a really good way to start.
Don't forget to gzip.
And use Etags.
And put your CSS at the top of the document.
And put the javascript at the end.
And use separate domains for static resources.
And avoid URL redirects.
And remove duplicate javascript and CSS.
And... see what I mean about the fire hydrant!
Just to sum the stuff up from above:
The speed of a website depends on a few things:
Server
Connection
Client
And in each of these parts you can make improvements.
Server: if you rely on a database, check whether your queries are cached, and more importantly check whether your data is cached. For example, if every page gets the menu from the database, then you can cache that result. In addition, you can check your code and see if there is room for optimization.
The hardware itself also plays a role. If you are on a shared hosting plan, maybe the server is full of other unoptimized apps that take a toll on it.
Connection: here YSlow and Page Speed come in handy, as well as Fiddler. You can cache static content (CSS and JS) by setting its expiry date far in the future. Using GZIP to make the content smaller and combining the static files also help to a certain extent.
In addition, maybe the server has low bandwidth.
Client: if you do wacky JavaScript or have slow CSS selectors, this might hurt performance on the client. But this also depends on the speed of the client's computer.
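As a small illustration of the "wacky JavaScript" point (the element ID is made up): repeatedly touching the DOM inside a loop forces the browser to redo work on every iteration, while batching the work touches it once.

    // Slow: the browser re-serializes and re-parses the list on every pass.
    for (var i = 0; i < 1000; i++) {
        document.getElementById('list').innerHTML += '<li>' + i + '</li>';
    }

    // Faster: build the markup in a string, touch the DOM a single time.
    var html = '';
    for (var j = 0; j < 1000; j++) {
        html += '<li>' + j + '</li>';
    }
    document.getElementById('list').innerHTML = html;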
I'd recommend reading Best Practices for Speeding Up Your Web Site and all of the content on Yahoo!'s Exceptional Performance page.
If you like books, you may be interested in High Performance Websites (note that a lot of the content in this is in the Best Practices for Speeding Up Your Web Site article) and Even Faster Websites.
Here are a few of my favourite rules from Best Practices for Speeding Up Your Web Site:
Minimize HTTP Requests
Add an Expires or a Cache-Control Header
Gzip Components
Make JavaScript and CSS External
Minify JavaScript and CSS
Also, smush.it is good for compressing images (which have a big impact on how fast a webpage loads).
As far as browsers go, Safari 4 claims it is "the world's fastest browser", and I can say that the Mac version is certainly nice and fast (not to mention elegant!). However, the above suggestions will make much more of a difference than which browser you're using.
Steve
With ColdFusion you will want to make sure your queries are being cached. Use Query Analyzer (if you are using MS SQL Server) to make sure a slow-loading page isn't the result of a bad query. On the database end you'll also want to ensure proper indexing.
A big factor in performance is how many HTTP requests are sent for images, files, etc. YSlow will show you this info. It's only available for Firefox.
I'd recommend this book.
Google is currently collecting all sorts of performance tips on their new 'Let's make the web faster' page here: http://code.google.com/intl/de-CH/speed/articles/
FYI: not all the information on these pages is valid; in particular, the PHP tips are way off.
There is a really great plugin for Firefox called Dust-Me Selectors. It scans your CSS files and lets you find selectors that aren't used or have become redundant in your markup.
https://addons.mozilla.org/en-US/firefox/addon/5392
You should also be delivering your static content off a CDN. Parallel downloads of static files will speed up your page renders. A better explanation here: http://www.sitecheck.be/tips/speed-up-your-site-using-a-cdn/
You shouldn't recommend any particular browser, but rather design your web page to current standards, with some fixes for older models if necessary. From my perspective everything can have a speed impact, but CSS is the least important factor, and in real-world examples the user will not notice it. In most cases a clean separation of HTML and style declarations will do the job. What really has an impact? First of all, you can simply throw money at the problem by getting a better hosting contract (maybe a dedicated server). Other ways to improve the time a website takes to load are to reduce the quality of your images and to use CSS sprites. Very often on dynamic web pages the database is the bottleneck, so caching and a good database abstraction layer can improve things (in PHP: PDO instead of simply using mysql()). GZip your output to the user. There are so many more things, but many of them are very language-dependent...
I recommend the use of FireBug and loadimpact.com for testing.
Fewer files are better - CSS sprites may be something to consider. In my experience, you have to balance your CSS file between speed and maintainability - one rule more or less won't make the difference between night and day...
For IE, see http://www.fiddler2.com/fiddler/Perf/
The new neXpert plugin for Fiddler (http://www.fiddler2.com/fiddler2/addons/nexpert.asp) offers features similar to those found in YSlow and PageSpeed.
The biggest problem I have is creating fast-running, beautifully designed pages with rich content. This is one thing that is extremely hard to do with today's technology.
If you have lots of JavaScript you might want to use JavaScript compression. Dojo provides one such tool, SHRINKSAFE, to compress your JavaScript. Find the link below:
http://www.dojotoolkit.org/docs/shrinksafe
There is another tool, open-sourced by Google, called Page Speed, which can help you optimize website performance. It was used internally before being open-sourced to everyone recently.
http://google-code-updates.blogspot.com/2009/06/introducing-page-speed.html
http://code.google.com/speed/page-speed/
Hope it helps.
A couple of very basic rules of performance testing:
Performance means nothing if the program/web page/whatever is wrong.
Do not try to improve performance without having a reliable form of measurement.
You must profile your site/program/whatever to find out what is making things slow.
Corollary: do not just change things at random to see if things get better.
Cache everything (web server and browser caching).
Statically publish as much as possible (i.e., to reduce the number of database calls).
Also add multiple waiting icons to your website.
Show the icons in such a way that the user gets a different waiting icon each time; this is an effective way to keep the user engaged while your website loads.
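A tiny sketch of that rotation idea; the image paths and element ID are placeholders:

    // Pick one of several waiting icons at random so repeat visitors see
    // some variety while the page content loads. Paths are hypothetical.
    var icons = ['/img/wait1.gif', '/img/wait2.gif', '/img/wait3.gif'];
    var spinner = document.getElementById('loading-icon');
    spinner.src = icons[Math.floor(Math.random() * icons.length)];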

Does having too many background images in CSS affect performance?

Some users are reporting that my site is too slow, and I think the background images in my CSS might be a possible culprit.
I have a site that uses a custom build system to concatenate all CSS, compress it (YUI Compressor) and make CSS sprites automatically (SmartSprites), and I end up with a 9 KB CSS file for the whole page. This includes all the CSS for the background images; at last count there were around 60 of them (several go to the same sprite, so I am not sure how many files that is).
I was wondering whether the default behavior of browsers is to download the images as needed (when they match a selector) or to download them all. Right now Firebug in Firefox seems to suggest that they are all being downloaded.
What techniques would you suggest I use to avoid the problem, or to mitigate it?
Edit: I misread Firebug; the images that are being downloaded belong to a Lightview that is hidden, but its background images are matched to the DOM.
No, in fact it's better to put them in the CSS than in the markup.
The image calls will not block the page, and as the images are loaded they are rendered on the page, so overall it is a good idea to load them via CSS. Not to mention that this design is also more flexible.
(It goes without saying that each of those images will take up bandwidth and extra HTTP requests)
The default browser behavior is to download two items at a time from the same host (i.e., 2 concurrent HTTP requests). If you have lots of images, create a subdomain for them, like images.yoursite.com, and you will start seeing the browser make parallel requests and some improvement in performance.
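A minimal sketch of sharding image URLs across hostnames; the hostnames, helper function, and example image are made up for illustration:

    // Map each image path to one of several hostnames so the browser's
    // per-host connection limit doesn't serialize the downloads.
    var hosts = ['img1.yoursite.com', 'img2.yoursite.com'];

    function imageUrl(path) {
        // Hash the path so a given image always maps to the same host
        // and therefore stays cacheable.
        var sum = 0;
        for (var i = 0; i < path.length; i++) {
            sum += path.charCodeAt(i);
        }
        return 'http://' + hosts[sum % hosts.length] + path;
    }

    document.getElementById('logo').src = imageUrl('/images/logo.png');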
(Side topic. Not really answering your question. But might be interesting or even relevant.)
I think another possible culprit is that "some users" will always feel that your site is "too slow". (Maybe it's more of a Mental Breakdown than a Stack Overflow thing? What do these users consider a fast site? Can they give examples? How fast are their connection and computer? Where are they, and where is your server located? What exactly is slow? The signup process? Watching videos in HD? Scrolling the window? Loading Firefox? After all, they're humans... n'est-ce pas?)
Maybe take a closer look at the image(s) you're sending, particularly if there are a lot of them being compiled into a single "sprite" image.
What could be happening is that the image you're pointing to is very large. Sure, it should only be loaded once (the benefit of the sprite method), but if it's several hundred KB it could certainly cause some performance problems.
There's a nice Firefox plugin called YSlow that gives you useful information about performance optimization. It shows you the performance issues it detected and gives you links to articles suggesting improvements.
http://developer.yahoo.com/yslow/
Some info on best practices
http://developer.yahoo.com/performance/rules.html

What might my user have installed that's going to break my web app?

There are probably thousands of applications out there like 'Google Web Accelerator' and all kinds of popup blockers. Then there are header-blocking personal firewalls, full-site blockers, and paranoid cookie monsters.
Fortunately Web Accelerator is now defunct (I suggest you read the above article - it's actually quite funny what issues it caused), but there are so many other plugins and third-party apps out there that it's impossible to test them all with your app until it's out in the wild.
What I'm looking for is advice on the most important things to remember when writing a web app (whatever the technology) with respect to ensuring the user's environment isn't going to break it. Kind of like a checklist.
What's the craziest thing you've experienced?
PS: I may have linked to net-nanny above, but I'm not trying to make a porn site.
The best advice I can give is to program defensively. For example, don't assume that all of your scripts will be loaded. I've seen cases where AdBlocker Plus blocks 1 in 10 scripts included in a page just because it has the word "ad" in its name or path. While you can work around this by renaming the file, it's still good to check that a particular object exists before using it, as in the sketch below.
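A minimal sketch of that defensive check; the AdManager global and the ad-slot element are hypothetical names standing in for whatever a blocked file would have defined:

    // Don't assume a script survived the user's ad blocker. "AdManager"
    // is a made-up global that a blocked file would have provided.
    if (typeof AdManager !== 'undefined') {
        AdManager.init();
    } else {
        // Degrade gracefully instead of throwing on a missing object.
        var slot = document.getElementById('ad-slot');
        if (slot) {
            slot.style.display = 'none';
        }
    }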
The weirdest thing I've seen wasn't so much a browser plugin but a firewall/proxy configuration at a user's workplace. They were using a squid proxy that was trying to remove ads by replacing any image HTTP request that it thought was an ad with a single pixel GIF image. Unfortunately it did this for non-GIF images too so when our iPhone application was expecting a PNG image and got a GIF, it would crash.
Internet Explorer 6. :)
No, but seriously: Firefox plugins like NoScript and Greasemonkey, for one, though those are likely to be a very small minority.
Sometimes the user's environment means a screen reader (or even a braille interface like this). If your layout is in any way critical to the content being delivered as intended, you've got a problem right there.
Web pages break, fact of life; the closer you have been coding and designing up against standards, the less your fault it is.
Something I have checked in the past is loading some of the more popular toolbars that people tend to install (Google, Yahoo, MSN, etc.) and seeing how they affect the user's experience.
To a certain extent it is difficult to preempt which of the products you mentioned will be used by your users since there are so many. I would say your best bet is to test for the most frequent products that your user base may employ and roll with the punches for the rest. If you have the time to test other possible scenarios, by all means do.
Also, making it easy for your users to report possible issues also helps lessen the time it takes to get a fix in place should it be something you can work around.
