I know the Google AMP Cache caches valid AMP pages and their resource files and makes them available via *.cdn.ampproject.org. I ran some tests and it works fine for my website.
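For reference, here is roughly how I built the cache URLs in my tests. The subdomain transform below covers the simple case documented for the AMP Cache (internationalized domains involve extra punycode steps, so treat this as a sketch):

```python
def amp_cache_subdomain(domain):
    # Simple-case transform: existing hyphens are doubled,
    # then dots become hyphens. (IDN domains need more work.)
    return domain.replace("-", "--").replace(".", "-")

def amp_cache_url(domain, path, content_type="c"):
    # content_type: "c" for AMP documents, "i" for images, "r" for resources.
    # The "/s/" segment indicates the origin is served over HTTPS.
    return "https://{}.cdn.ampproject.org/{}/s/{}{}".format(
        amp_cache_subdomain(domain), content_type, domain, path)

print(amp_cache_url("example.com", "/article/amp/"))
# https://example-com.cdn.ampproject.org/c/s/example.com/article/amp/
```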
I work for a popular website with terabytes of throughput per day. What happens if I make all my image files available via the Google AMP Cache? Will the AMP Cache just serve my files for free?
Is this a free CDN service? If it is free, why are companies paying for CDN services?
My boss just doesn't buy the idea that the Google AMP Cache is a free CDN. He asks, "What if the service is discontinued abruptly?" or "What if they start charging for the throughput?" Is there any gotcha?
Does anyone know about a big company currently using the AMP Cache as a CDN?
Using the Google AMP Cache for serving images for display outside of documents served from the Google AMP Cache is not a supported use and may not work, or may stop working in the future.
[Tech lead of AMP here]
You can use the AMP Cache for content on your AMP pages for free, but there is a size limit. From the FAQ:
Are there size limits on resources?
Yes, the Google AMP Cache does not fetch any resources (i.e., HTML, images, fonts) that are larger than 12 MB. In these cases, the Google AMP Cache returns a 404 error.
Does anyone know about a big company currently using the AMP Cache as a CDN?
Just to calm your boss: AMP pages are being used by prestigious online publishers around the world, including The Atlantic, The Washington Post, Vox Media, BuzzFeed, The Guardian, The New York Times, and Twitter.
But I'm concerned about whether there are limits on bytes transferred or the number of client requests.
- The batchGet method has a default limit of 50; you can request up to 50 AMP URLs each time you call this method. This is documented in AMP Usage Limits.
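Given that limit, a list of URLs just needs to be chunked into groups of at most 50 before calling batchGet. A minimal sketch (the API call itself is omitted; the URLs are examples):

```python
def chunk(urls, size=50):
    # batchGet accepts at most 50 AMP URLs per call, so split the list.
    return [urls[i:i + size] for i in range(0, len(urls), size)]

urls = ["https://example.com/page-{}/amp/".format(n) for n in range(120)]
batches = chunk(urls)
print(len(batches))      # 3 batches: 50 + 50 + 20
print(len(batches[-1]))  # 20
```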
Related
I have been using the WordPress LiteSpeed plugin on my site Trailer y Estrenos CO. My hosting is ChemiCloud, which runs on the LiteSpeed server and, according to them, works very well when used with the plugin of the same name.
On top of that, I started using Cloudflare and integrated it into the LiteSpeed plugin.
My problem is the following:
When I measure how optimized the page is after installing everything mentioned above, GTmetrix initially gives me scores in the B and C range, but soon after, another scan drops them to C and D.
In the LiteSpeed plugin I have DNS Prefetch configured, and in the first scan (the B/C one) it works as expected, but in the later scans that drop to C and D it seems that DNS Prefetch stops working, since I see those external DNS errors.
I don't know if I'm just imagining it, but it seems the LiteSpeed plugin isn't doing its job well, although it could be something else.
Should I use another cache plugin? I know WP Rocket is very good, but the catch is that it's paid.
Any suggestion or help?
Thank you
In the current report, your scores mainly depend on your third-party resources. Everything loading from your own domain (via LiteSpeed) is quite fast (check the Waterfall section). If you expand the items responsible for these scores, you'll see which resources are causing them.
Pagespeed Scores
Minimize Redirects is caused by resources from addthis.com, openx.com, and so on.
Leverage Browser Caching is likewise caused by AddThis, Google, and Facebook resources.
Serve Scaled Images is something to be handled at upload time, or a plugin like ShortPixel Adaptive Images can help you.
YSlow Scores
Add Expires Headers is the same issue as Leverage Browser Caching above.
Use a CDN is caused by the same third-party resources.
Use Cookie-Free Subdomains is caused by Cloudflare, since it appends its cookies to all requests; but if your server is far away from your visitors, the benefits of using Cloudflare outweigh the drawbacks.
Reduce DNS Lookups is again due to the many third-party resources from multiple domains. DNS Prefetch can help users resolve the DNS faster, but it won't affect the scores.
Avoid URL Redirects is the same issue as Minimize Redirects above.
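On the DNS Prefetch point above: all that feature does is emit resource hints like these into the page head (the domains here are just examples matching the report):

```html
<link rel="dns-prefetch" href="//s7.addthis.com">
<link rel="dns-prefetch" href="//connect.facebook.net">
```

So even when it works, it only speeds up DNS resolution for visitors; GTmetrix doesn't score it.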
No cache plugin can change how third-party requests are handled by the website. While plugins can defer these requests or load them after the page itself has loaded to improve user experience, that still won't affect the scores.
What should you do to fix these?
- Figure out which third-party resources you want to keep, and remove the rest.
- Talk to your developer about storing these external files locally so that they can be optimized and controlled using LiteSpeed Cache or any other cache plugin.
- For Minimize Redirects, get in touch with the providers serving these resources and ask them to improve this behavior.
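For the advice about storing external files locally, a small script re-run periodically (e.g. via cron) keeps the local copies fresh. A rough sketch; the URL, filename, and output directory are just examples to adapt:

```python
import os
import urllib.request

# Example third-party scripts to mirror locally -- adjust to your site.
EXTERNALS = {
    "gtag.js": "https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX",
}

def local_path(name, directory="wp-content/uploads/externals"):
    # Where the mirrored copy will live under the WordPress root.
    return os.path.join(directory, name)

def mirror(externals=EXTERNALS, directory="wp-content/uploads/externals"):
    # Re-download each file on every run; schedule this with cron
    # (e.g. daily) so the local copies stay up to date.
    os.makedirs(directory, exist_ok=True)
    for name, url in externals.items():
        urllib.request.urlretrieve(url, local_path(name, directory))

# Run mirror() from the WordPress root, then enqueue the local copies
# instead of the third-party URLs so the cache plugin can handle them.
```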
I hope this helps!
We are currently improving our site's Google Pagespeed score and one of the very few remaining items left is the below issue:
Leverage browser caching
https://ssl.google-analytics.com/ga_exp.js?utmxkey=xxxxxxxxx&utmx=&utmxx=&utmxtime=xxxxxxxxx
https://www.googletagmanager.com/gtag/js?id=xxxxxxxxx
https://connect.facebook.net/signals/config/xxxxxxxxx?v=2.8.27&r=stable
https://www.googleadservices.com/pagead/conversion.js
https://www.googleadservices.com/pagead/conversion_async.js
As you can see, PageSpeed is suggesting caching the files mentioned, but they are external resources.
This is a WordPress site, by the way, and I did try what seems to be an experimental plugin for hosting the Google Analytics file locally, but it doesn't help.
I also tried locally hosting conversion.js - creating a copy of the file, saving it to my server, and referencing that file instead - but it breaks Google Analytics' A/B testing.
Are there any ways to address these PageSpeed issues?
My wife and I run a WordPress blog that uses the AMP plugin, so AMP versions of posts are created automatically. Though I may have read somewhere that AMP caches can be shared with anyone, it did not occur to me until now that other sites can display our AMP content as a cached page on their site, and we never get credit for the visit.
Just today, a blogger friend mentioned that she saw traffic in her Google Analytics from https://cdn.ampproject.org/v/www.mommypotamus.com/how-to-buy-a-non-toxic-mattress/amp/?amp_js_v=5
This is a cached version of our web page located at http://www.mommypotamus.com/how-to-buy-a-non-toxic-mattress/amp/
She is seeing referral traffic from a third party site containing our content. We have no Google Analytics record for this traffic or that third party page. So my question is:
Am I losing the credit for traffic passing through third party sites? And how are people even getting to them?
This matters because we just launched AdThrive ads on our site last week, and with mobile being roughly 70% of our traffic, we are losing money because of AMP. I realize that some ad networks are already compatible with AMP, but AdThrive is not, because it uses real-time bidding on ad spots, which delivers maximum earnings to the publisher.
Maybe more sites are sharing cached versions of our AMP pages, but we know at least that Google Cache is doing so and AmpProject is doing so. How is that content tracked in GA? If it isn't, how is that supposed to be okay?
AMP is starting to feel like I've given someone permission to scrape all my content and outrank my actual site with my own content.
cdn.ampproject.org is a cache that belongs to Google. This cache is used for all views that happen in Search, Google News, and many other apps, including Bing. You are definitely not losing credit, since they will all show you as the author and creator.
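Cached AMP pageviews can still be recorded in your own Google Analytics property via the amp-analytics component, which the WordPress AMP plugin can inject. A minimal sketch based on the documented googleanalytics configuration; replace the account ID with your own:

```html
<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": { "account": "UA-XXXXX-Y" },
    "triggers": {
      "trackPageview": { "on": "visible", "request": "pageview" }
    }
  }
  </script>
</amp-analytics>
```

With this in place, views served from cdn.ampproject.org fire pageviews into your GA account, though they typically show up under the cache hostname rather than your own.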
I got a message from our shared hosting company (InMotion) informing me our resource usage is too high. We have a WordPress-powered website. To give you an idea of our website, based on Google Analytics, we get 8,343 unique sessions per month. According to our web host, we used 8660.71 MB of bandwidth in January.
One day, InMotion told me there was a spike in CPU usage, and included an excerpt of my access logs which they say indicated "some heavy WordPress Admin activity". They said "We are not exactly sure what this admin user was attempting to accomplish, however this activity does seem to have inflated your account's CPU usage." They included the ID of the item that was uploaded and caused the spike. It was the only file I uploaded that day. It was a 7 kB PNG file; I uploaded it once, deleted it and uploaded it a second time.
I do not understand the complexities of resource usage, so to me it seems strange that uploading a 7 kB file twice can cause spikes and be considered heavy activity.
When I asked more about the Resource Usage graph, they replied: "The numbers are percentages. 100% means you're right at the top of what we consider normal CPU usage on a shared platform. Anything above that is VPS territory. If you zoom in on the graph, you'll see that for the most part you're right at 100%, but you occasionally have spikes over. Going through the logs, your CPU usage is mostly from the Wordpress Dashboard, so disabling the heartbeat feature should reduce your usage the most."
They also told me there was unusual activity to the wp-admin/admin-ajax.php.
At our hosting company's request, I did the following:
- Disabled WordPress' heartbeat / autosave features
- Installed a caching plugin (WP Fastest Cache)
- Installed P3 (Plugin Performance Profiler) to see which plugins were using the most resources
- Deactivated the 2 plugins that were highest in resource usage: Scroll Back To Top and Simple Page Tester
But even with these changes, there are still "spikes" in our resource usage and we are receiving warnings. Our host is recommending we either upgrade to VPS hosting or use a CDN service like CloudFlare or MAXCDN.
So my questions are as follows:
- How can I tell what is really causing the excessive resource usage?
- Are there other ways to reduce resource usage caused by WordPress?
- Are CloudFlare or MAXCDN a good fit for this type of situation?
Thank you for taking the time to read. Any help or tips will be appreciated!
Your usage is not at all surprising. If you get 8,343 unique sessions per month and used 8660.71 MB, then each visitor is consuming an average of about 1 MB.
Just loading your home page is about 0.525MB, so if people browse your site (plus with admin traffic) it's easy to see how you might use that much.
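The arithmetic, for reference:

```python
sessions = 8343          # unique sessions per month (from Google Analytics)
bandwidth_mb = 8660.71   # bandwidth used in January, in MB

per_session = bandwidth_mb / sessions
print(round(per_session, 2))  # 1.04 -- about 1 MB per session
```

So roughly two page loads per visit at ~0.5 MB each already accounts for the whole month's transfer.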
Yes, a CDN like CloudFlare or MAXCDN would help a lot in your situation. Your page loads a lot of CSS, JavaScript, and images that could all be served from a CDN. This would significantly reduce your host's bandwidth usage and probably lower your page load time as an additional benefit.
I have a landing page where I'll be embedding a video on my page and allowing users to view it when visiting the site.
I'm using an Azure Website for hosting my site, and ASP.NET/MVC 5 for the server. I have a huge video (345 MB) that I'd like to store on my server, and simply play in the browser.
I'm currently using a video tag to do this and it works fine. My question is, is storing a huge video as part of my website a valid approach to this? Will the data being sent to the user from my server cost us money? Are there better approaches for storing huge videos and still being able to embed them on the site?
My question is, is storing a huge video as part of my website a valid approach to this?
I don't think this is the right approach.
Will the data being sent to the user from my server cost us money?
Yes. Any data that is sent out of Azure is chargeable.
Are there better approaches for storing huge videos and still being able to embed them on the site?
Use Blob Storage instead - Blob Storage is meant for exactly this purpose. Furthermore, you can put Azure CDN in front of it so that your video gets replicated/cached across many CDN nodes and is served from the location closest to your website visitor.
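A sketch of the Blob Storage approach with the v12 Python SDK (`pip install azure-storage-blob`); the account, container, and file names below are placeholders:

```python
def blob_url(account, container, blob_name):
    # Public URL pattern for a blob -- this is what you'd point the
    # <video> tag's src at, or front with Azure CDN for edge caching.
    return "https://{}.blob.core.windows.net/{}/{}".format(
        account, container, blob_name)

def upload_video(conn_str, container, path):
    # Requires: pip install azure-storage-blob
    # (imported lazily so the URL helper above works without the SDK)
    from azure.storage.blob import BlobServiceClient, ContentSettings
    service = BlobServiceClient.from_connection_string(conn_str)
    blob = service.get_blob_client(container=container,
                                   blob=path.split("/")[-1])
    with open(path, "rb") as f:
        blob.upload_blob(f, overwrite=True,
                         content_settings=ContentSettings(
                             content_type="video/mp4"))

print(blob_url("mystorageacct", "videos", "landing.mp4"))
# https://mystorageacct.blob.core.windows.net/videos/landing.mp4
```

After uploading, the landing page's `<video>` element can reference the blob (or CDN) URL directly instead of a file on the App Service itself.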