I'm trying to optimize my site and I'm using GTmetrix to see which requests take a lot of time. I can see that the heaviest file is the cache file generated at wp-content/cache/min/1/78d81b6108cf10884c20ab592c9fbb46.js. Is that as it should be, or is it too large?
After finally working out how to type your domain name (ignorant British man that I am!) I get similar times for that script and for https://xn--bstaelscootern-5hb.se/wp-content/cache/min/1/4fac67378f86d8e54c6af5529b17e3d5.css.
It is possible that the cache hadn't been fully created when you ran your test, so it took longer because the file was being generated 'on the fly' and some of the download time was actually spent waiting for it to be generated.
If you install a cache plugin, you still have to load each page at least once for it to generate the cached files.
If this was the first run after enabling the plugin, it would be slow; the same goes for running a test right after changing a file, as the cached version needs to be regenerated.
On the runs I have done, it loads in the same amount of time as similarly sized files.
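If you want to avoid that first slow hit landing on a real visitor, one option is to "warm" the cache yourself after enabling the plugin or changing files. A minimal sketch, assuming cURL is available; the URLs below are placeholders:
<?php
// Hedged sketch: request each page once so the cached/minified files are
// generated before real visitors arrive. URLs are placeholders.
$pages = array(
    'https://example.com/',
    'https://example.com/about/',
);

foreach ($pages as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // we only want the side effect, not the output
    curl_setopt($ch, CURLOPT_TIMEOUT, 60);          // allow time for on-the-fly generation
    curl_exec($ch);
    curl_close($ch);
}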
I am in the process of helping a client build a site at ballershoesdb.com.
About 50% of the time that I try to load the site (either loading the front end or accessing the WordPress backend), it takes 20 seconds or more to load a page. I have had various people in different places across the world test the site and most report no issues, though some have occasionally reported significant slowdowns. A test on Pingdom shows an issue with browser caching, but I doubt that is the cause of what I am seeing.
What steps can I take to determine the source of this problem?
Generally, your approach of running tests with Pingdom (or other tools, such as Gomez or Catchpoint) can be incredibly helpful for determining problem areas. In my initial assessment I saw that there was generally a slow response for the scripts and stylesheets coming from your server, and it wasn't always the same script causing issues. For instance, sometimes it was bootstrap.js and sometimes not.
The fixes I would recommend to try are:
Load your content from a CDN
Use different domain names that rotate (each browser will only open a fixed number of connections to each domain and it varies by browser)
Put all of your scripts in a single file if possible, and minify them.
Ensure all of the files you're downloading are needed (you seem to have multiple themes loading).
Determine whether you need to load all of your scripts in the HEAD, as this blocks page rendering (see the sketch below the list for one way to move them).
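For that last point, since the backend you mention is WordPress, a rough sketch of one way to do it is to enqueue scripts with the footer flag set, so they load after the content instead of blocking rendering in the HEAD. The handle, file path, and version here are placeholders:
<?php
// Hedged sketch: pass true as the last argument of wp_enqueue_script()
// so the script loads in the footer rather than the HEAD.
function myprefix_enqueue_scripts() {
    wp_enqueue_script(
        'myprefix-bootstrap',                              // handle (placeholder)
        get_template_directory_uri() . '/js/bootstrap.js', // source (placeholder path)
        array('jquery'),                                   // dependencies
        '1.0',                                             // version (placeholder)
        true                                               // load in footer, not in the HEAD
    );
}
add_action('wp_enqueue_scripts', 'myprefix_enqueue_scripts');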
You can try https://testmysite.thinkwithgoogle.com/. It tells you where your site has performance issues and gives you hints on how you can fix them. You can use this as a starting point.
A similar free online tool can also be found at: https://gtmetrix.com/
I'm running simple_html_dom to index links from a few websites. Sometimes I come across a link that is actually a file download. For example, the link "example.com/survey.html" is actually a 500 MB zip file download. It causes simple_html_dom to get stuck on that URL for minutes at a time and it maxes out the memory. I tried setting the MAX_FILE_SIZE limit by using:
define('MAX_FILE_SIZE', 2000000);
This doesn't work... How can I get around these download files?
Thanks
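One way around it (a sketch, assuming cURL is available; the helper name and size limit are made up): check the response headers with a cheap HEAD request first, and only hand the URL to simple_html_dom if it looks like a reasonably small HTML page.
<?php
// Hedged sketch: inspect headers via a HEAD request before parsing, so large
// downloads never reach simple_html_dom. Helper name and limit are placeholders.
function looks_like_small_html($url, $maxBytes = 2000000) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: headers only, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_exec($ch);

    $type   = (string) curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
    $length = (float) curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
    curl_close($ch);

    // Skip anything that is not HTML or that advertises a large body
    // (a length of -1 means the server did not report a size).
    return stripos($type, 'text/html') !== false
        && ($length < 0 || $length <= $maxBytes);
}

// Usage: only parse URLs that pass the header check.
// if (looks_like_small_html($url)) { $html = file_get_html($url); }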
I've got an issue with a website made in ASP.NET. The site is published and online; I made some modifications, republished the site on my computer, and just uploaded the changed .aspx file to the server via FTP.
The first time, it seems to have worked after a while. But I made a small error and want to edit it again; I did the same thing, but it won't change. Could it be that I need to wait some time before changes are visible? Or does the server need a restart or something?
If you've edited something in the .aspx.cs (code-behind) file, you will need to upload the bin directory to the remote site, or better still, republish the whole site.
If it is a change to the .aspx, CSS or JavaScript file, the original will most likely be cached in your browser. Try a different browser or refresh the page; Ctrl+F5 does a complete refresh.
If this error was by any chance a CSS mistake, it can be easily fixed by adding a "?" to the end of the stylesheet's address: CSS files are normally stored in the browser's cache, and changing the URL this way forces the browser to fetch them again. The same is true of JavaScript kept in individual files.
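That trick is easy to forget to apply by hand. A hedged sketch of automating it (shown in PHP purely as an illustration, with a made-up file path) is to append the stylesheet's last-modified time as the query string, so the URL changes whenever the file does:
<?php
// Hedged sketch: version the stylesheet URL with its modification time so the
// browser re-downloads it only after it actually changes. Path is hypothetical.
$cssPath = __DIR__ . '/css/site.css';
$version = file_exists($cssPath) ? filemtime($cssPath) : time();
echo '<link rel="stylesheet" href="/css/site.css?v=' . $version . '">';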
I'd recommend using Visual Studio's Publish Website command (under the Build menu) instead of manually uploading the site over FTP. That built-in publisher gives you many advantages, and one of them addresses exactly the issue you have faced: when you make a small change, fixing the error on the host is much faster if you republish the site that way rather than manually uploading it over FTP.
I have about 10 referenced CSS and JavaScript (JS) files on my website, and I could combine them so there are only 2 CSS and JS files for the web browser client to download. Combining the files would decrease the number of HTTP requests and decrease page load times.
My question is: if those 10 CSS and JS files are cached after the first page load, would the 2nd page load be just as fast as on a page that has only 2 combined CSS and JS files? I would think so, because cached files would not require an HTTP request. I understand that there are command scripts that can automate combining CSS and JS files for deployments, but doing that would complicate things for future junior programmers/web designers who may work on the website. The website is for a charity organization and there is very little control and few available human resources to keep things in check and in order.
Update: My assumption that using a cached file does not require an HTTP request was wrong.
Yes, you should reduce the number of requests even if the resources are cached. Two reasons:
1: There is a small performance hit (per file) even for cached files
If you profile your website, you will see that for every resource there is still a roundtrip being made to the server. If the resource has not been updated the server will return HTTP status code 304 - Not Modified, otherwise it will return HTTP status code 200 (and the resource itself).
Here's a sample trace produced by profiling Stack Overflow homepage CSS files:
As you can see, even though the CSS file is cached, a request was made to make sure the cached version is up to date.
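For reference, here is a bare-bones sketch of that exchange on the server side, assuming the CSS were served through a script rather than directly by the web server (the file path is a placeholder):
<?php
// Hedged sketch of the conditional-request flow: answer 304 when the cached
// copy is still current, otherwise send the file with a 200.
$file = __DIR__ . '/css/site.css';         // placeholder path
$lastModified = filemtime($file);

if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
    && strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $lastModified) {
    header('HTTP/1.1 304 Not Modified');   // nothing to send, the cache is still valid
    exit;
}

header('Content-Type: text/css');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');
readfile($file);                           // full 200 response with the resource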
2: First-time visitors will also benefit from this optimisation
It will improve page load speed for first-time visitors and anyone with a stale cache.
Yes, you should. Browsers still send conditional requests (with headers such as If-Modified-Since or ETag) to check whether a file has changed, so even when the browser has already cached it, it can still end up sending HTTP requests.
It's true that once the files are loaded and cached, it doesn't matter how many files the CSS and JS are split into; the speeds will be roughly the same.
Of course, every time a user clears their cache the 10 files will be served again, thus causing more http requests.
In general, it's better to have as few files to serve as possible, especially when your site is visited by a large number of users. Whenever you can save on HTTP requests, you should.
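If you do decide to combine them, it doesn't have to involve a complicated build pipeline. A minimal sketch of a deploy-time concatenation step (the file names are placeholders):
<?php
// Hedged sketch: concatenate several stylesheets into one file at deploy time
// so the browser makes a single request instead of ten. File names are made up.
$files = array('reset.css', 'layout.css', 'theme.css');
$combined = '';

foreach ($files as $f) {
    $combined .= file_get_contents(__DIR__ . '/css/' . $f) . "\n";
}

file_put_contents(__DIR__ . '/css/combined.css', $combined);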
I'm on VPS hosting with DreamHost and am experiencing very high page load times. Here is the output from the Devel module for MySQL queries.
Executed 190 queries in 227.67 milliseconds. Page execution time was 21969.43 ms.
Using the module profiling at http://2bits.com/articles/measuring-memory-consumption-by-drupal-bootstrap-and-modules.html it seems ok:
total: 304.15
So if the modules are taking 304 ms and MySQL is taking 227 ms, where could the other 21 seconds be going?!
Here is the url http://5oup.net
As always any help very much appreciated!
James
You are not compressing your JavaScript or CSS files. That by itself shouldn't be the cause of such a slow page load, but it suggests that you have your site set up in development mode, which is quite inefficient for a production site.
I tried browsing around, and I didn't find any page that was as slow as you describe. But the point above is still a significant opportunity for performance improvement.
Some ad hoc testing on the home page gives me about 8-12 seconds per request (forced reload to exclude local caching). According to firebug, the biggest waits are due to loading all the images for the rotation, closely followed by the separate and uncompressed css and js files.
With local caching, this goes down to 1-4 seconds, with most of the time being spent on waiting for the server to actually respond with a '304 - not modified' for all the files involved.
So your first goal should be reduction of the number of separate files:
For the js and css files, combining them into single files and turning on compression should already help quite a lot - check your site settings under admin/settings/performance.
For the rotation images, this would require more work, as you would either have to combine them into a sprite or add the logic to only load one with the page itself and pull the additional ones later on via js.
You should try the css/js combination first and see if you really need more tweaking after that.
I found that the very long page load on the home page was down to simplexml_load_file(), which for some reason was not enabled on my host.
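If the extension might go missing again, a small guard keeps the page from depending on it. A sketch under that assumption (the feed URL is a placeholder):
<?php
// Hedged sketch: only call simplexml_load_file() when the SimpleXML extension
// is actually available, and fall back gracefully otherwise. URL is made up.
if (function_exists('simplexml_load_file')) {
    $feed = simplexml_load_file('http://example.com/feed.xml');
} else {
    $feed = false; // SimpleXML not enabled on this host; skip the feed
}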