Will the use of multiple subdomains speed up my website? - asp.net

I am considering moving my images to a subdomain on my website, and I read somewhere that moving the scripts to a different one would make it even faster! Is that really true? Or should I just leave things as they are if I am not considering a real CDN?

Yes and no. The site itself won't be faster, but it may load faster in most browsers and therefore seem faster.
The reason is that most browsers limit themselves to a set maximum of concurrent connections per domain. Say you have your site on www.mysite.com. When your browser tries to download your HTML, CSS, scripts and images, it may need to fetch 20-30 files from the server. Since the browser limits itself to, say, 4 concurrent connections to your domain, it can only download 4 files at a time while the rest wait in a queue.
Now if you serve your CSS files from a separate subdomain css.mysite.com, your images from images.mysite.com and your scripts from scripts.mysite.com, your browser can open 4 concurrent connections to each of those domains. Hence it can download up to 16 files at the same time. If your bandwidth allows it, this may cause the page to load faster.
So your site may appear faster to the visitor, but the reason will be shorter loading times, not any speedup in code or database access.
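To make the idea concrete, here is a minimal sketch of deterministic "domain sharding" (the subdomain names are hypothetical, and this is not from the original answer): each asset path always maps to the same subdomain, so the URL does not change between page views and the browser cache stays effective.

```typescript
// Minimal sketch of deterministic "domain sharding"; subdomain names are placeholders.
const SHARDS = [
  "img1.mysite.com",
  "img2.mysite.com",
  "img3.mysite.com",
  "img4.mysite.com",
];

function shardUrl(path: string): string {
  // Cheap, stable hash of the path picks the shard that serves it,
  // so the same asset is always requested from the same subdomain.
  let hash = 0;
  for (const ch of path) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return `https://${SHARDS[hash % SHARDS.length]}${path}`;
}

console.log(shardUrl("/images/logo.png")); // always the same shard for this path
```

Whether the extra connections actually help depends on the browser and the HTTP version in use, so it's worth measuring rather than assuming.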

Related

First Byte Time scores an F

I recently purchased a new theme and installed WordPress on my GoDaddy hosting account for my portfolio. I am still working on it, but as of right now I sometimes get page load times of 10-20 seconds, and other times 2 seconds (usually after the page has been cached). I have done all that I believe I can (without breaking the site) to optimize my performance (reducing image sizes, using a free CDN, using W3 Total Cache, etc.).
It seems that my main issue is this 'TTFB' wait time I get whenever I go to a new page that hasn't been cached yet. How can I fix this? Is it the theme's fault? Do I NEED to switch hosting providers? I really don't want to go through the hassle of doing that and paying so much more just to have less than optimal results. I am new to this.
My testing site:
http://test.ninamariephotography.com/
See my Web Page Results here:
http://www.webpagetest.org/result/161111_9W_WF0/
Thank you in advance to anyone for your help :)
Time to first byte depends partly on geography, so I don't think that's your problem. I re-ran your test and got a B.
I think the issue is that your hosting is a tiny shared instance and you're serving static files from it. Here are some ideas to speed things up.
Serve images using an image-serving service. Check out imgix, which is $3/month. Serving images off an external domain could also help in unexpected ways, depending on the HTTP protocol version, the browser version, and how connections are shared.
Try lossy compression. You lose some image detail, but you also lose some file size. Check out compressor.io for an easy tool.
Concatenate and minify scripts. You have a number of little JavaScript files that load individually. Consider joining them together and minifying them (a rough sketch follows below). I don't know the tool chain for WordPress; perhaps there's a setting?
If none of that helps, you should experiment with a different hosting choice.
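For the concatenate-and-minify suggestion above, here is a rough Node.js sketch. The file names are placeholders and the terser package is assumed to be installed; this is just one way to do it, not the WordPress way.

```typescript
// Rough sketch: concatenate several small script files and minify the result.
// File names are placeholders; "terser" is assumed to be installed (npm i terser).
import { readFile, writeFile } from "node:fs/promises";
import { minify } from "terser";

const inputs = ["js/slider.js", "js/gallery.js", "js/contact-form.js"];

async function buildBundle(): Promise<void> {
  const sources = await Promise.all(inputs.map((file) => readFile(file, "utf8")));
  const combined = sources.join("\n;\n"); // ';' guards against files missing a trailing semicolon
  const result = await minify(combined);  // terser returns { code?: string }
  await writeFile("js/bundle.min.js", result.code ?? combined);
}

buildBundle().catch(console.error);
```

A WordPress optimization plugin with a minify/combine option (W3 Total Cache, already installed here, has one) achieves much the same thing without a separate build step.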

Optimizing Page Load: multiple CDN files or single CDN file?

I'm trying to optimize my page loads. Currently, I have multiple resources being pulled from various CDNs (e.g. jQuery). In all, I have about 10 different JS files and 10 different CSS files. Around 50-75% are available on CDNs.
When I run PageSpeed/YSlow on it (via GTMetrix), it complains that I have too many resources and that I should combine the files. I combined the JS files into a single file and did the same with the CSS files (later, I will serve these from a CDN). When I re-ran the tests, my page load time went from 2.19s to 1.87s. It would appear that combining the files and serving them locally is faster than serving separate files from CDNs.
I could not find any definitive tests showing whether combining files and serving them locally is superior to serving separate files from a CDN. I can only guess at this point that once I put the combined file on a CDN, things will get even faster.
Is combining files a superior approach?
You have two dimensions for improvement:
Combine Resources
In general, fewer requests mean fewer round trips and better page load speeds. (Ref: Yahoo: Minimize HTTP Requests - Best Practices for Speeding Up Your Web Site)
Serve Resources via CDN
A content delivery network will typically have multiple data centers and can serve content closer to your site visitor. (Ref: Use a Content Delivery Network - Best Practices for Speeding Up Your Web Site)
You can apply both principles and test to see how your site improves. I would definitely start with combining resources; you'll probably see a good gain there.
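As a rough illustration of combining the two ideas (the URLs are placeholders, not from the question): load the single combined bundle from the CDN, and fall back to the locally hosted copy if the CDN request fails.

```typescript
// Sketch: load the combined bundle from a CDN, fall back to the local copy.
// Both URLs are placeholders.
function loadScript(src: string): Promise<void> {
  return new Promise((resolve, reject) => {
    const el = document.createElement("script");
    el.src = src;
    el.onload = () => resolve();
    el.onerror = () => reject(new Error(`failed to load ${src}`));
    document.head.appendChild(el);
  });
}

loadScript("https://cdn.example.com/js/bundle.min.js")
  .catch(() => loadScript("/js/bundle.min.js")) // CDN unreachable: use the local bundle
  .catch((err) => console.error(err));
```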

Does reducing the number of http requests matter if they are cached?

I have about 10 referenced CSS and JavaScript (JS) files on my website, and I can combine them so that there are only 2 CSS and JS files for the web browser to download. Combining the files would decrease the number of HTTP requests and decrease page load times.
My question is: if those 10 CSS and JS files are cached after the first page load, would the 2nd page load be just as fast as on the page that has only 2 combined CSS and JS files? I would think so, because cached files would not require an HTTP request. I understand that there are scripts that can automate combining CSS and JS files for deployments, but doing that would complicate things for future junior programmers/web designers who may work on the website. The website is for a charity organization, and there is very little control and few available human resources to keep things in check and in order.
Update: My assumption that using a cached file does not require an HTTP request was wrong.
Yes, you should reduce the number of requests even if the resources are cached, for two reasons:
1: There is a small performance hit (per file) even for cached files
If you profile your website, you will see that for every resource there is still a round trip being made to the server. If the resource has not been updated, the server will return HTTP status code 304 Not Modified; otherwise it will return HTTP status code 200 (and the resource itself).
Profiling the CSS files on the Stack Overflow homepage shows exactly this: even though a CSS file is cached, a request is still made to make sure the cached version is up to date (a minimal server-side sketch of this exchange follows below).
2: First-time visitors will also benefit from this optimisation
It will improve page load speed for first-time visitors and anyone with a stale cache.
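To make point 1 concrete, here is a minimal sketch of that revalidation exchange, using Node's built-in http module with a made-up ETag and stylesheet body (not something from the original answer):

```typescript
// Minimal sketch of cache revalidation: the browser sends If-None-Match for a
// cached file and the server answers 304 without resending the body.
import { createServer } from "node:http";

const CSS_BODY = "body { margin: 0; }"; // stand-in for a real stylesheet
const ETAG = '"v1-styles"';             // would be recomputed whenever the file changes

createServer((req, res) => {
  if (req.headers["if-none-match"] === ETAG) {
    res.writeHead(304, { ETag: ETAG }); // cached copy is still valid
    res.end();                          // no body, but still a full round trip
    return;
  }
  res.writeHead(200, { "Content-Type": "text/css", ETag: ETAG });
  res.end(CSS_BODY);
}).listen(8080);
```

The 304 response is tiny, but the request and the network round trip still happen, which is the per-file hit referred to above.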
Yes, you should. Even for a cached file, the browser typically sends a conditional request to check whether the file has changed, so no matter whether the browser has already cached it, HTTP requests are still made.
It's true that once the files are loaded and cached, it doesn't matter how many files the CSS and JS are split into; the speeds will be the same.
Of course, every time a user clears their cache, the 10 files will be served again, causing more HTTP requests.
In general, it's better to have as few files to serve as possible, especially when your site is visited by a large number of users. Whenever you can save on HTTP requests, you should.

Subdomains for the images, videos and other assets in the web application

What is the benefit of making a subdomain for images on my website?
Is it about faster downloading or server caching? I noticed that all the big sites like Facebook and others do something like that for images, videos, etc.
It's a common recommendation for high load, high performance websites: Serving images off different servers can improve your overall performance significantly.
Browsers generally only keep 2-5 connections to the same server, so if your webpage has 10 images, the browser will download 2 to 5 of them simultaneously while the rest wait. Using different subdomains "tricks" the browser into opening additional connections, which reduces the overall page load time.
Using a CDN for images, stylesheets and other static content can also take some load off your webserver, which is another benefit.
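As a rough illustration of what a dedicated static-asset host typically does (the directory and port below are made up), files are served with a long Cache-Control lifetime so repeat visitors can reuse their cached copies instead of hitting your main webserver again:

```typescript
// Rough sketch of a dedicated static-asset host; paths and port are placeholders.
import { createServer } from "node:http";
import { readFile } from "node:fs/promises";
import { join, normalize } from "node:path";

const ROOT = "./static"; // assumed directory holding images, CSS, JS, etc.

createServer(async (req, res) => {
  try {
    const filePath = join(ROOT, normalize(req.url ?? "/"));
    const body = await readFile(filePath);
    // Long-lived caching: repeat visitors skip the download entirely.
    res.writeHead(200, { "Cache-Control": "public, max-age=31536000, immutable" });
    res.end(body);
  } catch {
    res.writeHead(404);
    res.end("not found");
  }
}).listen(8081);
```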

Very large Drupal page execution time

I'm on VPS hosting with DreamHost and am experiencing very high page load times. Here is the output from the Devel module for MySQL queries.
Executed 190 queries in 227.67 milliseconds. Page execution time was 21969.43 ms.
Using the module profiling at http://2bits.com/articles/measuring-memory-consumption-by-drupal-bootstrap-and-modules.html it seems ok:
total: 304.15
So if the modules are taking 304 ms and MySQL is taking 227 ms, where could the other 21 seconds be going?!
Here is the url http://5oup.net
As always any help very much appreciated!
James
You are not compressing your JavaScript or CSS files. That shouldn't be the cause of such a slow page load on its own, but it suggests you have your site set up for development mode, which is quite inefficient for a production site.
I tried browsing around and didn't find any page that was as slow as you describe, but the point above is still a major area for performance improvement.
Some ad hoc testing on the home page gives me about 8-12 seconds per request (forced reload to exclude local caching). According to Firebug, the biggest waits are due to loading all the images for the rotation, closely followed by the separate and uncompressed CSS and JS files.
With local caching, this goes down to 1-4 seconds, with most of the time spent waiting for the server to respond with a '304 Not Modified' for all the files involved.
So your first goal should be reduction of the number of separate files:
For the JS and CSS files, combining them into single files and turning on compression should already help quite a lot - check your site settings under admin/settings/performance.
For the rotation images, this would require more work, as you would either have to combine them into a sprite or add logic to load only one with the page itself and pull in the additional ones later via JS (see the sketch after this list).
You should try the css/js combination first and see if you really need more tweaking after that.
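A rough sketch of that second idea for the rotation images (the file paths are made up): ship only the first image in the HTML and fetch the rest in the background once the page has loaded, so they don't compete with the initial render.

```typescript
// Sketch: preload the remaining rotation images after the page has finished
// loading, so only the first one competes with the initial render.
// Paths are placeholders for the real rotation images.
const remainingImages = [
  "/sites/default/files/rotation/slide2.jpg",
  "/sites/default/files/rotation/slide3.jpg",
  "/sites/default/files/rotation/slide4.jpg",
];

window.addEventListener("load", () => {
  for (const src of remainingImages) {
    const img = new Image(); // fetches and caches the image without adding it to the DOM
    img.src = src;
  }
});
```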
I found that the very high page load time on the home page was down to simplexml_load_file(), which for some reason was not enabled on my host.
