S3 CSS assets not loading, but previously did. Why would it stop?

I've been using S3 to host static websites, and in the past I've made changes to the HTML and CSS files and seen those changes reflected. For some reason, when I go to do the exact same thing I've done before, change the style of one of my sites, no change takes place. In fact, even after deleting all the previous files, the old build was still rendering. I had no version control on that particular bucket.
The Content-Type is set to 'text/css'. My file structure is normal, with index.html in the root. My usual steps for creating or updating sites have not changed, but for some reason S3's behavior has.
When I click on the index.html file and go to its public URL, it reflects all my changes.
My only fix is to add the full URL to the style link:
<link rel="stylesheet" href="https://s3.amazonaws.com/{bucket-name}/css/style.css">
Does anyone know why this is happening, or how to fix it other than hard-coding the full URL? If not, I hope my workaround helps others with this weird S3 issue. Normally you can just upload your files to a bucket, set the policy, and finally enable static hosting after specifying the root HTML file.

It might be browser caching: to speed up load times, the browser loads locally stored assets (such as your CSS stylesheet) from a previous visit to the URL rather than fetching the new resources. There are settings you can change in your browser to control how long it holds onto cached resources before fetching fresh ones.
Setting the stylesheet link directly to the S3 bucket URL gives the file a new address, so the browser fetches the stylesheet fresh instead of using its cached copy, which leads me to believe that caching is the issue here.
Try clearing your cache and see if it solves the problem.
Here is a deeper explanation of the concept with respect to browsers, and a list of commands to perform a cache refresh depending on what browser/OS you have!
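If clearing the cache fixes it, one way to avoid the problem in the future is to version the stylesheet URL so the browser treats each update as a brand-new resource. A minimal sketch (the ?v= parameter and its value are illustrative; bump the number whenever style.css changes):
<link rel="stylesheet" href="css/style.css?v=2">
Because the URL changes, any cached copy keyed to the old URL is simply never used again.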

I think the CSS folder doesn't allow you to access the files inside. If you make the folder public, it will work.

Select all your files and folders, go to the Actions tab, and then select Make public so the objects are publicly readable.

Related

How to force flush old cache in a visitor's browser

Recently I changed my cache plugin from WP Fastest Cache to WP Rocket, and I had to move some inline JS code into a file. The issue is that some repeat visitors may still have the old JS file in their browser cache. Is there some way to force it to be deleted when they visit the site?
My idea would be to still keep a reference to the old files, like:
<script type="text/javascript" src="http://www.example.com/myOldFile.js?2"></script>
I haven't tested whether it works with deleted files, but it works with changed files, so my assumption is that it could also work with a deleted one.
The index 2 after the question mark in the link forces the browser to re-download the file and replace the old one in its cache. You can change the index whenever the file changes.
I've tested it on many browsers and WebView Objects (WKWebView and Android WebView).
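For example, after the next change to the file you would simply bump the index (the URL is illustrative, matching the example above):
<script type="text/javascript" src="http://www.example.com/myOldFile.js?3"></script>
Each new value makes the browser treat the file as a resource it has never seen before.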

Section doesn't change after modifying script

This is a long shot, but I've hit a wall and I don't have any idea what to do about it.
There is a site with a section containing a Google Maps map with custom pins. The locations of the pins and the configuration of the map are defined in the wp-content/themes/mytheme/js/map.js file, and I have to add some pins. According to the person who created the site a couple of years ago, new pins can be added by modifying map.js.
The problem is that nothing changes when I modify this file. Even better: I can remove the file (and all the other scripts in the "js" folder) and nothing changes. This is the only instance of the script that I found.
There isn't any caching plugin enabled.
There isn't caching on server.
This is not browser cache.
Basically, it seems like the site is loading this script from a different location instead, but I don't have a clue where that could be. Is there anything I can do to find the source location of a script?
EDIT: I deleted the css and js folders entirely over FTP and the site still renders in browsers. I opened Chrome's developer tools and used the Network tab to see the initiator for the script, and it shows as
xxxxxxxxxxx.xx/wp-content/themes/xxxxxxxxxx/js/map.js?ver=5.2.10
But this file DOES NOT EXIST; I deleted it via FTP. The same thing happens with the CSS files. It isn't browser caching, because it happens on different browsers and different computers...

Is it possible to prevent Cloudflare from caching my images?

I'm trying to find out whether it is possible to prevent Cloudflare from caching my images.
The link to the files would be https://my-domain/wp-content/uploads/2019/*
The * at the end indicates that everything inside /2019/ should be ignored.
I read this post: https://www.itsupportguides.com/knowledge-base/tech-tips-tricks/how-to-exclude-wp-admin-from-cloudflare/ which indicates that it is possible for http://my-domain/wp-admin/, but I'm not sure whether it will also work for my uploads folder.
I want this folder ignored because I'm using another CDN for my images.
You should be able to do that with a Page Rule.
If a Page Rule is set to "Bypass Cache", the resources that match that rule will not be cached. Note that we will still act as a proxy, and our other performance features will still be active; the content just won't be served from our cache, and will instead be fetched directly from the origin.
Ryan
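A sketch of what that Page Rule could look like for the case in the question (the URL pattern is the one given above; "Cache Level: Bypass" is the relevant Page Rule setting):
URL pattern: https://my-domain/wp-content/uploads/2019/*
Setting: Cache Level: Bypass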

Magento doesn't load my CSS

I have changed a bit of code in the CSS of my Magento site for my header logo, but Magento doesn't load my new CSS and still shows the old one.
I have already refreshed the cache in Magento:
Flush Magento Cache
Flush Cache storage
Flush Javascript/CSS Cache
under System > Cache Management.
I have a var/cache folder containing folders like mage--0 and mage--1. I have tried to back them up so I can restore them if something goes wrong after deleting them, but I can't back them up.
Hello. First of all, you can always safely delete the contents of var/cache; you do not need to back it up. I know it might sound silly, but did you clear your browser cache? Also make sure you changed the correct CSS file, and use Firebug to see whether your changes are being overridden by other rules. A link to the project and more information would be helpful.
It may be that the browser is caching your files, not the server. To check, try either merging or unmerging your files and refreshing the page. If you then see the changes, it is indeed your browser that is caching the files.
In that case, we've developed a handy little extension that automatically refreshes the merged JS + CSS static files. http://extensions.activo.com/css-and-javascript-versioning.html
You may be using a different theme. Check in System > Configuration > Design which package and theme you are using, then look for that folder under skin and make your change there. Delete the var cache and the changes will show. You do not need to back up the var cache.
It's also important to check System > Design, where design overrides are located. We recently had a problem with this: someone (we are not sure who; a hacker?) added an override without dates, and the whole shop broke (we have a pretty sophisticated package with lots of modifications). It took us about 30 minutes to figure out what was going on.

Project hosting on Google Code. Files are cached?

I do not really understand how Google Code handles file versioning.
I am building a jQuery plugin that anyone can access. Like so:
<script type="text/javascript" src="http://jquery-old-browser-warning.googlecode.com/files/jquery.browser-warning.js"></script>
This script accesses other files on the same project (via ajax).
The problem is that when I upload a new file, it seems like there aren't any changes to it. Google recommends that new files should have new names.
But then I would have to change the filenames that the script loads.
And then I would have to change the script file as well, and that would break everybody's implementation (with the script tag above).
Is there a way to force a file to change when uploading with the same filename?
PS: If I go directly to the project page's file list, then I do get the file with the updated content. But as I said, not when fetching it through AJAX.
The cheapest trick in the book to prevent caching is adding some random content as a GET parameter:
www.example.com/resources/resource.js?random=1234567
You can, for example, use the current timestamp for this.
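A minimal sketch of that idea, assuming you inject the script tag dynamically (the URL is the example one from above):
<script>
  // Append the current timestamp so every request uses a fresh URL
  // and bypasses any cached copy (illustrative URL).
  var s = document.createElement('script');
  s.src = 'http://www.example.com/resources/resource.js?random=' + Date.now();
  document.head.appendChild(s);
</script>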
This, however, causes each and every access to re-fetch the content, and it defeats any client-side caching as well. I would use this only as a last resort. If Google is that stringent about caching, I'd rather develop a workflow that allows for easy renaming of files.
I don't know your workflow, but maybe you can work with versioned directories?
Like so:
www.example.com/50/resources/resource.js
www.example.com/51/resources/resource.js
That would keep whatever caching the client employs intact, but whenever there's a change from your end, the browser would reload the content.
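The consuming page would then just point at the current directory version (paths illustrative, as above), so only this one reference needs updating per release:
<script type="text/javascript" src="http://www.example.com/51/resources/resource.js"></script>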
I think it's just a cache in the browser, so when you request the file via AJAX, just add a random parameter or a version number.
For example, Stack Overflow adds a version parameter to static content, like
http://sstatic.net/so/all.css?v=6638
Are you talking about uploading files to the "Downloads" area? Those should have distinct filenames; for example, they should be versioned. If you're uploading the script code, that should be submitted through the version control system you're using, and should most definitely keep the same name across revisions.
Edit: your code snippet didn't show up on my page, so I misunderstood what you're trying to do. I don't imagine Google would be happy with you referencing the SVN repository every time some client page is loaded :)
