Programming grunt-rev/grunt-usemin to use commit-id - gruntjs

The default behavior of grunt-rev is to evaluate the given resource and put a hash of the content in the path, so /images/sprites/rewards.png becomes /images/sprites/f936712a.rewards.png
I don't want that. I've got several revisions being served simultaneously and I want to be able to delete old revs at will, so I'd like to rename it to /2013-06-10-e75607/images/sprites/rewards.png (where e75607 is the commit-id for the whole revision, nothing to do with the individual file).
Is this a possibility with grunt-rev and grunt-usemin? Is there an equivalent tool that would do it?
Edit
Several people have asked me why not use the hash of each file. Let me explain:
In old-fashioned websites, in response to pretty much any user input, the browser is reloaded, the back-end generates a brand new page from the input, and the HTML is sent to the browser, which displays the page. It's a slow, CPU- and bandwidth-intensive process, but it does have one advantage: all the assets that are ever loaded are loaded together, within a few seconds.
In a more modern website, the page that is loaded describes the whole application. When the user makes some input, the JavaScript on the page renders new DOM elements and loads new assets as needed. The page as a whole is rarely or never reloaded. The site is enormously more responsive in consequence (and much easier to develop, more secure, cheaper to run, and on and on) but has the corresponding disadvantage: an asset might be loaded hours or days after the page is loaded.
Say you visit the site at 10 am, when the site is running version cd1d0906. At 10:30, the site is upgraded to version 4b571377. At 11 am, you press a button that causes a popup that is rendered with an image called sprite.png. Obviously, you need the cd1d0906 version of sprite.png, not the 4b571377 version.
Therefore, a well-maintained site will continue to offer old versions of all assets for several days after the versions have changed over. The easiest way to do that is to keep all the assets in a directory that is named after the version.
The complaint that this "unnecessarily" discards cache entries for unchanged files is rather unconvincing. Most deployed assets are CSS files, JS files, and sprites, all of which are compilations of many smaller files. It's a rare deployment that doesn't change at least one CSS file, one JS file, and one sprited image, so the cache is rarely valuable after a version change.
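For reference, here is a minimal sketch of the kind of setup I am after: computing a release prefix from the date and the commit-id in the Gruntfile and copying assets under it. The copy task is just an illustration, not a grunt-rev feature, and the HTML references would still need rewriting (which is where usemin would come in).

// Gruntfile.js, sketch only: computes a prefix like 2013-06-10-e75607 and
// copies assets under it; it does not use grunt-rev's per-file hashing.
module.exports = function (grunt) {
  // Assumes the build runs inside a git checkout and Node's execSync is available.
  var commit = require('child_process')
    .execSync('git rev-parse --short HEAD')
    .toString().trim();
  var buildDir = new Date().toISOString().slice(0, 10) + '-' + commit; // e.g. 2013-06-10-e75607

  grunt.initConfig({
    copy: {
      release: {
        expand: true,
        cwd: 'app/',
        src: ['images/**', 'scripts/**', 'styles/**'],
        dest: 'dist/' + buildDir + '/'  // yields /2013-06-10-e75607/images/sprites/rewards.png
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-copy');
  grunt.registerTask('release', ['copy:release']);
};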

Related

Slow Product Category Page WooCommerce - Need Speeding Up

I have installed and customized WooCommerce Product Pages on my WordPress site, but one of the product category pages takes about 7 seconds on average to load. Other category pages load in around 3 seconds. I am struggling to find the reason for this. There are fewer products on this page than on other pages and fewer sub-categories. I have installed plug-ins such as 'W3TC' and 'Better WordPress Minify' but it hasn't made much difference.
Has anyone else experienced an issue like this and if so, would you mind sharing how you resolved it?
Any help would be greatly appreciated.
Thanks
Using caching plugins is fine and dandy, but the reason these pages load slowly is simply the data model that WordPress uses: post types and the metadata look-ups. The only way to truly get speeds up is good hosting and turning on Object Cache on the server.
We enabled this on a WP Engine site and it was night and day: 12 seconds turned into 2.5 seconds.
Object caching
Object Caching is designed to capture queries to the database and store them in memory. This allows you to run an "expensive" query - a query that takes a long time - one time, and then reuse the results again. When used properly, Object Caching can give your site a speed boost by reducing the time that is spent accessing the database. Please note that this change can take a while to take effect.
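As a language-agnostic illustration of the idea (this is not WordPress's actual object cache API, just a sketch in JavaScript):

// Sketch: run the expensive query once, keep the result in memory,
// and serve later requests for the same key from the cache.
const cache = new Map();

async function cachedQuery(key, runQuery) {
  if (cache.has(key)) {
    return cache.get(key);          // no database hit
  }
  const result = await runQuery();  // the "expensive" query
  cache.set(key, result);
  return result;
}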
There can be many reasons for a WordPress page to load slowly, but your problem seems to be unique.
Here are some useful tips by which you can speed up your page loading:
Optimize Your Images
The page on which you are having the issue might contain high-resolution images.
Avoid displaying Flash on your page
Avoid too many advertisements
Remove unnecessary ads from the page.
Do not use inline cascading style sheets
Instead of using inline styles, put them in a CSS file and reference that file on every page of your site; the browser can cache the one shared stylesheet, which also helps download speed.
Put stylesheets at the top - Put scripts at the bottom
Load JavaScript at the bottom of the page; this helps the page display quickly. When the browser downloads a script it stops other parallel downloads until the script request is finished, so requesting scripts last keeps the rest of your content from waiting on them.
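A related trick, sketched in plain JavaScript (the script name is made up): request a non-critical script only after the rest of the page has finished loading.

// Sketch: inject a non-critical script after the page has loaded,
// so it never blocks the initial rendering.
window.addEventListener('load', function () {
  var s = document.createElement('script');
  s.src = '/js/analytics.js';   // hypothetical non-critical script
  document.body.appendChild(s);
});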
Use CSS Sprites
A CSS sprite is a single image made up of the smaller images used by your design, together with something of a map containing the coordinates of each one. Some clever CSS is used to show the proper section of the sprite when your design is loaded.
This way you do not have to load the many individual images used on your site; loading a single sprite image does all the work.
Limit Your External Scripts
An external script being loaded on that page might be the issue. You need to check which external scripts are requested and limit them.
Add LazyLoad to your images
You can use this technique to load images only as they come into view, so the page loads part by part.
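A plugin can do this for you, but as a sketch of the technique in plain JavaScript (the class name and data attribute are illustrative):

// Sketch: swap in the real image source only when the image scrolls into view.
// Assumed markup: <img class="lazy" src="placeholder.gif" data-src="photo.jpg">
var observer = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      var img = entry.target;
      img.src = img.dataset.src;   // start the real download now
      observer.unobserve(img);
    }
  });
});
document.querySelectorAll('img.lazy').forEach(function (img) {
  observer.observe(img);
});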
Control the amount of post revisions stored
I saved this post to draft about 8 times.
WordPress, left to its own devices, would store every single one of these drafts, indefinitely.
Turn off pingbacks and trackbacks
Let me know if these tips resolve the problem for your site.
The list of suggestions that WisdmLabs mentions above is great!
However, I'm not sure if you've seen the plugin for WordPress called W3 Total Cache. It has a load of built-in functionality to automatically improve the performance of your WordPress web pages.
It's free and worthwhile using if you are looking to improve the performance across your whole site.
https://wordpress.org/plugins/w3-total-cache/

What is causing such a long wait time on my blog page?

I have recently developed 4 websites in WordPress using the rtpanel theme framework.
When I put the websites live, I noticed that a couple of them upon clicking through to the blog page, are taking up to 25 seconds to load. (see link)
http://tools.pingdom.com/fpt/#!/brQN7J/www.exactabacussoftware.com/blog
Can anyone tell me what is causing this long wait? If I change my theme back to twentytwelve it loads fine, and the same applies on the other sites, e.g. http://www.exactabacusfulfilment.com/blog
The two examples are both running on the same server using the same theme but I cannot find out what is slowing the software site down so much.
Any help would be greatly appreciated!
It seems that PHP execution is taking a lot of time. The analysis of your site shows that it takes around 22 seconds to generate the HTML.
There could be a few reasons why PHP execution is taking so long:
You have activated some plugin which is causing your site to slow down.
There may be some theme component which is causing your site to load slowly.
Your database queries are taking too long to execute. (If this is the case, check why, and you can enable Memcache to cache MySQL queries.)
Install and activate P3 (Plugin Performance Profiler) on the website and find out which component of your site is dragging performance down. To debug in detail, you can also try the Query Monitor plugin.
Once the issue is tracked down and resolved, you may activate PHP APC on your server, provided you do not make changes to the code.
There are a couple of really easy ways to investigate the reasons behind slow loads:
Google PageSpeed. Basically you feed it the URL and you get a list of suggestions on how to improve the page load speed. Here is the URL for testing your site:
http://developers.google.com/speed/pagespeed/insights/?url=www.exactabacussoftware.com%2Fblog
The first thing that I notice is a very high server response time (in my tests between 0.5 s and 1.6 s). This means every image, script, etc. takes at least 0.5 s plus its download time to load. If you have 100 resources this can add up to 50 seconds or so, which is a lot. So you might want to look for hosting alternatives.
Google PageSpeed will give you more details on what could be fixed and improved; look through it and try to solve those issues. It should help you improve your speed quite a bit.
Another option is Google Chrome Developer Tools, Firefox Firebug or similar developer tools. Just open the Network tab and reload the page; you will be able to see how long it takes to load each resource on your page.
Building on that.
It looks like there are 2 seconds of latency before your server even answers the first GET request, followed by another 2 seconds that contain 84 more GET requests.
Now, a 4 second load time isn't awful, but if you want it to go faster, the best things you can do are:
1) Combine all of your JavaScript files into one file, making sure jQuery and other dependencies come first (see the sketch after this list).
2) Combine all of your PNGs into one file, a sprite, or alternatively Base64-encode them all.
3) A lot of those PNGs could be compressed: 5 KB for an icon is a bit big, and 66 KB for an image is certainly too big.
4) Do the same with your CSS: combine the files, and there will be fewer requests.
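As an illustration of points 1 and 4, a minimal Grunt setup that concatenates the scripts and stylesheets into one file each (all paths are placeholders):

// Gruntfile.js sketch: one combined JS file and one combined CSS file.
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      js: {
        // jQuery and other dependencies listed first so load order is preserved
        src: ['js/vendor/jquery.js', 'js/vendor/*.js', 'js/app/*.js'],
        dest: 'dist/js/site.js'
      },
      css: {
        src: ['css/*.css'],
        dest: 'dist/css/site.css'
      }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.registerTask('build', ['concat']);
};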

Best way to let browsers refresh from cache on a live website?

It's about making changes in design (CSS files and images) on a website which is already online and in use. I wonder what the best practice is to make sure that visitors see the changes without clearing their browser's cache manually. Things that came to mind:
change the meta tags - dismissed because I do not want the site to ALWAYS be loaded from the server
include the CSS file with a parameter (like a timestamp) after making a change
change the names of included images so that they are reloaded - which also means changing the names in the files where the images are referenced
?
What else could force loading from the server? Did I forget any advantages/disadvantages?
Possible duplicate of this post: How to control web page caching, across all browsers?
My favoured solution is to append a random number when you reference the file, e.g.
css/styles.css?628454548
images/sprite.gif?8356484894
You could use JavaScript/PHP or whatever to set those numbers every time the page is served to the browser.
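As a small sketch of the JavaScript variant (a timestamp works just as well as a random number):

// Sketch: inject the stylesheet with a cache-busting query string.
var link = document.createElement('link');
link.rel = 'stylesheet';
link.href = 'css/styles.css?' + Date.now();  // e.g. css/styles.css?1370851200000
document.head.appendChild(link);

If the value only changes when the file actually changes (for example the file's last-modified time), the browser can still cache it between edits.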

Solr and faceting performance in MVC3 application

continued from here..
In the application we are currently writing we have the ability, like iStockPhoto, to search for images.
In the case of iStockPhoto, after a search only the view where the images are loaded is refreshed, while the other sections of the site remain.
For us, when we search or click pagination, we reload the whole site, and this comes with quite a performance hit, since we are loading the scripts, images and facets (and since many of them are nested inside one another they have to be loaded recursively) every time someone searches or wants to go to a different page number.
I was thinking of creating an AJAX control for the section where we load the images so we don't have to refresh the whole page. Hence, no need to reload JS files, and no need to recursively load facets (I think this is where we take the biggest performance hit).
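What I have in mind is roughly this (the endpoint URL and container id are hypothetical):

// Sketch: fetch only the image results for a search or page change and swap
// them into the results container, leaving the rest of the page untouched.
function loadResults(query, page) {
  $.get('/Images/Search', { q: query, page: page }, function (html) {
    $('#image-results').html(html);  // partial view returned by the controller
  });
}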
So the question is, should I try to create the AJAX control for retrieving stored images? And if so, are there a lot of changes I have to make in my Views and Controllers?
Or would there be some way I can optimize our loading time?
ps. When Solr is hit the first time, the page load takes about 15-20 seconds (if my memory serves me correctly) and afterwards it takes about 3-6 seconds to load a page with 25 images. Is that normal? It takes about 1-2.5 seconds to load a section on iStockPhoto.
Thank you...

ASP.NET: How to enforce a reload of a web static file

When doing webpages, the client/browser decides whether it updates a file like an image, .css or .js, or whether it takes it from the cache.
In the case of an .aspx page it is the server that decides.
Sure, at the IIS level, or also using some HttpModule techniques, I can change the headers to tell the client whether and for how long a file should be cached.
Now, I do have a website where the .aspx goes hand-in-hand with a corresponding .js. So, perhaps I have some jQuery code in the .js which accesses an element in the .aspx. If I remove that element from the .aspx I would also adapt the .js. If the user goes to my page he will get the new .aspx but he might still get the old .js, leading to funny effects.
My site uses lots of scripts and lots of images. For performance reasons I configured in the IIS that those files "never" expire.
Now, from time to time a file DOES change and I want to make sure that users get the updated files.
In the beginning I helped myself by renaming the files. So, I had SkriptV1.js and SkriptV2.js and so on. That's about the worst option since the repository history is broken and I need to adapt both the references and the file name.
Now, I have improved on that and change only the references, using Skript.js?v=1 or Skript.js?v=2.
That forces the client to refresh the files. It works fine, but still I have to adapt the references.
Now, there is a further improvement here like this:
<script type='text/javascript' src='../scripts/<%# GetScriptLastModified("MyScript.js") %>'></script>
So, the "GetScriptLastModified" will append the ?v= parameter like this:
protected string GetScriptLastModified(string FileName)
{
    // Physical path of the script below the application's base directory
    string File4Info = System.Threading.Thread.GetDomain().BaseDirectory + @"scripts\" + FileName;
    System.IO.FileInfo fileInfo = new System.IO.FileInfo(File4Info);
    // Version parameter derived from the file's last write time,
    // so the URL changes whenever the file changes
    return FileName + "?v=" + fileInfo.LastWriteTime.GetHashCode().ToString();
}
So, the rendered .js-Link would look like this to the client:
<script type='text/javascript' src='/scripts/GamesCharts.js?v=1377815076'></script>
The link will change every time, when I upload a new version and I can be sure that the user immediately gets a new script or image when I change it.
Now, two questions:
a) Is there a more elegant way to achieve this?
b) If not: does someone have a guess how big the performance overhead on the server would be? There can easily be 50 versioned elements on one page, so for one .aspx, GetScriptLastModified would be invoked 50 times.
Looking forward to a discussion :)
There are a few different answers to this question.
First of all, if you have files which only change every once in a while, set the Expires and Cache-Control headers to expire in one year. Only if the files truly never expire should you say that they never expire. You're seeing the issues with saying "never expire" right now.
Also, if you are having performance issues on your site from serving up lots of images and JavaScript, the commonly accepted solution is to use a CDN (Content Delivery Network). There are many different providers and I'm sure that you can find one that meets your budget. You'll also save money in the long run as the CDN will offload a great deal of I/O and CPU time from IIS. It's astounding how big of a difference it can make.
Lastly, one way to make sure that users are getting the latest versions of files which almost never change is to implement some sort of versioning scheme in your asset URLs to bust the cache. There are many different ways to do this, but one (very naive) way is to have a version number that increases every time you deploy to your site.
E.g. all your asset URLs will look like /static/123/img/dog_and_pony.jpg
Then, next time you deploy to your site, you increase the version number so that it's "124". You would need some way to keep track of the version, dynamically injecting it into asset URLs, as well as making sure that the version number changes every time you deploy. The idea being that anything referencing this asset should automatically know the new version number.
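As a sketch of that idea as a naive build-time step (the file names and version source are made up, and this is not an ASP.NET API):

// build.js sketch: stamp the current deploy version into asset URLs.
var fs = require('fs');
var version = fs.readFileSync('VERSION.txt', 'utf8').trim();  // e.g. "124"

var html = fs.readFileSync('src/index.html', 'utf8')
  // /static/img/dog_and_pony.jpg becomes /static/124/img/dog_and_pony.jpg
  .replace(/\/static\//g, '/static/' + version + '/');

fs.writeFileSync('dist/index.html', html);

On the server, /static/<version>/ can simply be rewritten back to the same physical folder, so nothing needs to be duplicated on disk.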
In terms of performance, it's an admirable goal to never need the user to refresh or have to download the same thing twice. But sometimes it's just a lot less hassle, and if users are only refreshing everything periodically, that's probably okay for most websites.
Hope this helps.
