How to check if leverage browser caching is enabled - CSS

I was wondering if there is a method to check whether a website has leverage browser caching enabled and, if so, for how long.
For example, take a CSS file link like http://foo.com/foo.css or an image link like http://foo.com/foo.img.
How can I tell whether browser caching is enabled for them, and what its configuration is?

You can use PageSpeed from Google to help you check for leverage browser caching.
This website: http://gtmetrix.com/leverage-browser-caching.html can give you an insight into all the performance you can gain by optimizing your website.

Check whether browser caching is enabled (and its expiration): https://www.giftofspeed.com/cache-checker/
Also, GTmetrix may warn "Leverage browser caching for the following cacheable resources:" for .svg files; you can ignore that warning.
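You can also check this yourself by looking at the response headers. Below is a minimal sketch, assuming Python 3 and only the standard library; the header values are illustrative, not fetched from foo.com:

```python
import re

def cache_lifetime(headers):
    """Return the declared cache lifetime in seconds from Cache-Control,
    or None if no fixed max-age is set."""
    cache_control = headers.get("Cache-Control", "")
    match = re.search(r"max-age=(\d+)", cache_control)
    return int(match.group(1)) if match else None

# Example header sets (illustrative values)
print(cache_lifetime({"Cache-Control": "public, max-age=31536000"}))  # 31536000
print(cache_lifetime({"Content-Type": "text/css"}))                   # None
```

To test a live URL, you could fetch it with `urllib.request.urlopen` and pass `response.headers` to the function; older servers may use an `Expires` header instead of `Cache-Control`, so it is worth checking both.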

Related

Gzip compression through CloudFront blocks loading through an iframe

I recently enabled compression through CloudFront, which compresses all compressible files (HTML, JS, CSS).
It seemed to work fine in every browser I tried, but recorded browser tests failed over and over (both in Datadog and in Ghost Inspector).
This is the error I get: "Your website does not support being loaded through an iframe".
Has anyone come across something similar to this?
I would appreciate any help! :)
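That error usually means the testing tool received a response header that forbids framing (X-Frame-Options, or a CSP frame-ancestors directive), so it is worth comparing the response headers before and after enabling CloudFront compression. A minimal sketch of such a header check, assuming Python 3; the function name is mine:

```python
def framing_blocked(headers):
    """Return True if response headers forbid embedding the page in an iframe."""
    xfo = headers.get("X-Frame-Options", "").strip().upper()
    if xfo in ("DENY", "SAMEORIGIN"):
        return True
    # Any frame-ancestors directive in CSP restricts who may embed the page
    csp = headers.get("Content-Security-Policy", "").lower()
    return "frame-ancestors" in csp

print(framing_blocked({"X-Frame-Options": "SAMEORIGIN"}))  # True
print(framing_blocked({"Content-Encoding": "gzip"}))       # False
```

If such a header appears only on the CloudFront responses, the distribution (or the origin behavior it forwards) is the place to look.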

How to fix "Enable compression" for t.dtscout.com/…i in Google PageSpeed Insights

I'm trying to optimize my site and get a good score in the Google PageSpeed Insights test.
But now it's showing this:
Enable compression for the following resources to reduce their transfer size by 1.1KiB (55% reduction).
Compressing http://t.dtscout.com/…i/?l=http%3A%2F%2Fkatmoviehd.co.in%2F&j= could save 1.1KiB (55% reduction).
Test Site - KatmovieHD.co.in/
Gzip compression is already enabled on my server, and I don't know how to fix this issue.
Simply remove the Histats counter from your website and the problem will be solved. Histats uses a very aggressive tracking system that can sometimes cause your page to keep loading forever even though your website has already finished loading, not to mention other downsides, such as some antivirus products blocking access to your website. I highly recommend Google Analytics instead, as it provides the same features and even more.
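The underlying issue is that t.dtscout.com is a third-party host (it belongs to the Histats tracker), so gzip settings on your own server cannot affect it; only that third party can enable compression there. A small sketch separating first-party from third-party resources, assuming Python 3; the resource path is a made-up example, not the real tracker URL:

```python
from urllib.parse import urlparse

def is_third_party(resource_url, site_host):
    """True if the resource is served from a host the site owner doesn't control."""
    host = urlparse(resource_url).hostname or ""
    return host != site_host and not host.endswith("." + site_host)

# Hypothetical tracker resource versus the site's own CSS
print(is_third_party("http://t.dtscout.com/example.js", "katmoviehd.co.in"))
print(is_third_party("http://katmoviehd.co.in/style.css", "katmoviehd.co.in"))
```

Anything the function flags as third-party can only be fixed by removing the resource or asking the provider to compress it.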

Google PageSpeed and image_rewrite for direct requests

I have successfully installed Google PageSpeed (it took me a while to get it up and running with nginx).
I'm using Meteor as my framework and would like to use image_rewrite. The thing with Meteor is that the HTML gets rendered on the client, so the typical scenario will never pass through PageSpeed, meaning PageSpeed cannot optimize the page.
What I would like instead is for PageSpeed to optimize the image on the HTTP call for that image, but for some reason this does not work. Example:
If I open http://mydomain.com/myimage.jpg in the browser, that image should be "pagespeeded" by all the filters that are active. Unfortunately this does not work. Any ideas?
It sounds like you want to turn on InPlaceResourceOptimization:
pagespeed InPlaceResourceOptimization on;
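For context, here is a minimal sketch of where that directive sits among the other ngx_pagespeed directives in the nginx config; the cache path is an example:

```nginx
# Minimal ngx_pagespeed setup with in-place resource optimization (IPRO);
# the cache path is an example, adjust to your server layout.
pagespeed on;
pagespeed FileCachePath /var/cache/ngx_pagespeed;
pagespeed InPlaceResourceOptimization on;
# Filters such as rewrite_images then apply to directly requested resources
pagespeed EnableFilters rewrite_images;
```

Note that IPRO typically serves the unoptimized resource on the first request and the optimized version once it has been cached.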

Lack of CDN availability

I use both the Telerik and Microsoft CDNs for their respective AJAX toolkits. Both work great 99% of the time. However, I was working out of two different cafes recently and went to visit my site: the first cafe did not permit the Telerik CDN, while the second did not allow the Microsoft CDN as a URL request. I can actually see that the status bar in IE shows "ajax.microsoft.com" as the file being retrieved while I am waiting for the website to load.
Lack of CDN access seems to be a very unusual problem. In fact, I cannot fathom why such URL requests would be blocked when the cafes seem to permit pretty much everything else. Any reason? Could this be an availability issue at the respective CDNs themselves (i.e., how reliable are these CDNs)? And of course, is there a recommended fix, apart from discarding CDN use?
Update: I can now connect to my app. So my lack of access to ajax.microsoft.com was most likely a temporary lack of MS CDN availability, not any domain blocking.
All you need to do is implement a fallback to your local server, as explained here: http://happyworm.com/blog/2010/01/28/a-simple-and-robust-cdn-failover-for-jquery-14-in-one-line/
The Telerik online demos use the CDN by default, but fall back to embedded resources if the Amazon cloud service is unavailable. If you have the RadControls for ASP.NET AJAX installed locally, you can see the source of the demo site. The files you need to review are ~/Common/Footer.ascx and its code file ~/App_Code/QuickStart/Footer.cs, as well as ~/App_Code/QuickStart/QsfCdnConfigurator.cs and ~/App_Code/QuickStart/HeadTag.cs. The Footer files set a cookie using JavaScript depending on whether the CDN is available, and the last two files provide support for reading the cookie on the server side and setting the appropriate configuration for the script manager.
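The usual pattern for such a fallback, sketched here for jQuery (the CDN URL, version, and local path are all examples): load the CDN copy first, then test for the expected global and write a local script tag if it is missing.

```html
<!-- Try the CDN first (example URL and version) -->
<script src="https://ajax.aspnetcdn.com/ajax/jQuery/jquery-3.6.0.min.js"></script>
<!-- If the CDN was blocked or down, window.jQuery is undefined: fall back to a local copy -->
<script>
  window.jQuery || document.write('<script src="/Scripts/jquery-3.6.0.min.js"><\/script>');
</script>
```

The same test-for-a-global trick works for any library, as long as you know which global the CDN script is supposed to define.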

Caching Typekit CSS

I'm using Typekit to provide fonts for a site I'm developing. The page loads slowly (more than a second), and it turns out this is because it downloads the fonts on every request. It's beyond me that a service such as this doesn't have ETags configured to get clients to cache the fonts... but I digress. Until Typekit fixes this, I'm temporarily hosting the CSS locally.
Has anyone had this issue with Typekit? How did you work around it? Or perhaps I'm wrong?
According to a posting on their getsatisfaction.com account, they have at least some caching in place:
One thing to note is that although the fonts are served with an Expires header, they're also served with an ETag. The browser is required to make a request after 5 minutes, but will normally use the ETag to generate a 304 (Not Modified) response - meaning the fonts aren't actually downloaded again.
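The revalidation flow described in that quote can be sketched as server-side logic, with hypothetical names, assuming Python 3:

```python
def respond(request_headers, current_etag):
    """Decide between 304 Not Modified and a full 200 response, based on
    the If-None-Match validator the browser sends back on revalidation."""
    if request_headers.get("If-None-Match") == current_etag:
        return 304  # body not re-sent; the browser reuses its cached font
    return 200     # full response, font downloaded again

# First request: no validator yet -> full download
print(respond({}, '"abc123"'))                             # 200
# Revalidation after the short Expires window -> cheap 304
print(respond({"If-None-Match": '"abc123"'}, '"abc123"'))  # 304
```

So the repeated requests you see may only be cheap 304 round trips rather than full font downloads; the network panel (or Firebug, as suggested below) will show the status codes.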
Can you check what happens using Firebug?
