My website requires multiple refreshes to show new content - asp.net

I am in charge of updating a large e-commerce website, tedbrownmusic.com. We hired an outside web developer to code the website, so I'm not very familiar with the back-end side of things.
My problem: The front page requires multiple manual refreshes in order for newly posted content to show up.
The front page is an ASP.NET storefront, separate from the rest of the site (which is run on Wordpress). The Wordpress site loads properly, but on the front page, I have to hit F5 a bunch of times for new stuff to show up (I am using the newest version of Chrome). In IE, it refuses to refresh at all. The cache has to expire before a user in IE can see new content.
The web developer said there was nothing he could do about it. But that doesn't seem like normal website behavior to me.
I have tried adding "no-cache" META tags to force the page to refresh every time it is visited, but it didn't work.
Any thoughts? I appreciate it.

If you are using CSS/JS files, you can try to force browsers to load the latest CSS file by changing a version number. I think it's called CSS versioning.
<link rel="stylesheet" href="yourwebsitescsslocation/style.css?v=1324491378" type="text/css" />
If you change 1324491378 to some other value, it will force the browser to load the CSS again rather than serving it from cache. You can use a changeset number or the current timestamp as the version number; just make sure it changes every time you change the CSS.
See the article below for some additional help:
http://www.codeproject.com/Articles/203288/Automatic-JS-CSS-versioning-to-update-browser-cach
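For reference, here is a minimal Node.js sketch of the same idea (it is not the code from the article, and the paths are placeholders): derive the version token from the CSS file's last-modified time, so the query string only changes when the file actually changes.
// Minimal sketch, assuming a Node.js build/deploy step and placeholder paths.
const fs = require('fs');

function versionedLink(cssPath, publicHref) {
  // Use the file's last-modified time (in seconds) as the version token.
  const version = Math.floor(fs.statSync(cssPath).mtimeMs / 1000);
  return `<link rel="stylesheet" href="${publicHref}?v=${version}" type="text/css" />`;
}

// Run at build/deploy time and write the resulting tag into the page template.
console.log(versionedLink('./css/style.css', 'yourwebsitescsslocation/style.css'));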

Related

Updating the thumbnail image within the Facebook send button

I want to update the thumbnail image that the Facebook send button uses. I realize that this is cached by Facebook, and I am trying to update the cache using the debugger tool.
I tried using http://domain.com/path_to_image.jpg?fbrefresh=CAN_BE_ANYTHING in the debugger tool.
However, the old image is still displayed. It appears that the image is being stored at https://fbexternal-a.akamaihd.net/safe_image.php?id=path_to_image/image.jpg
Is there any way to clear the cache, since fbrefresh=CAN_BE_ANYTHING is not working, or am I typing in the wrong URL?
I have also tried adding
<meta property="og:image" content="http://domain.com/new_image.jpg"/> in my header.php file (I am using Wordpress)
Any idea as to why the image is not updated / cache not cleared?
The image gets refreshed after a day using the above procedure:
Go to the debugger tool again
Type in http://domain.com/path_to_image.jpg?fbrefresh=CAN_BE_ANYTHING
There is a great explanation here, with the clarification that you run the debugger on the Facebook post URL (not just the image) and, once it resolves, use "rescrape" to force the thumbnail to update:
http://info.tmrdirect.com/bid/105994/How-To-Change-Facebook-Link-Thumbnail-and-Description
One tip: in the Debugger tool, I found it only worked if I entered the URL with percent-encoded (hex) characters (but that might be because that's how my original post was formatted), e.g. https%3A%2F%2Fwww.domain.com%2Fblog2%2F
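For what it's worth, that percent-encoded form can be produced with plain JavaScript; the URL below is just a placeholder:
var postUrl = 'https://www.domain.com/blog2/'; // placeholder post URL
console.log(encodeURIComponent(postUrl));      // "https%3A%2F%2Fwww.domain.com%2Fblog2%2F"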

Best way to let browsers refresh from cache on a live website?

It's about making changes in design (CSS files and images) on a website which is already online and in use. I wonder what the best practice is to make sure that visitors see the changes without clearing their browser's cache manually. Things that came to mind:
change meta-tag - dismissed because I do not want the site to be ALWAYS loaded from the server
include the CSS file with a parameter (like a timestamp) after making a change
change the names of included images so that they are reloaded, which also means changing the names in the files where the images are referenced
?
What else would force loading from the server? Did I forget any advantages/disadvantages?
Possible duplicate of this post: How to control web page caching, across all browsers?
My favoured solution is to set a random number after you call the file, e.g.
css/styles.css?628454548
images/sprite.gif?8356484894
You could use JavaScript/PHP or whatever to set those random numbers every time the page is served to the browser.
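As a rough client-side illustration of that suggestion (in practice you would usually emit the token server-side when rendering the page), here is a JavaScript sketch; ASSET_VERSION is a made-up value that you would change whenever the assets change:
var ASSET_VERSION = '628454548'; // hypothetical token; update it when the files change

function bust(url) {
  // Append the token so the browser sees a URL it has not cached yet.
  return url + (url.indexOf('?') === -1 ? '?' : '&') + ASSET_VERSION;
}

var assets = document.querySelectorAll('link[rel="stylesheet"], img');
for (var i = 0; i < assets.length; i++) {
  var attr = assets[i].tagName === 'IMG' ? 'src' : 'href';
  assets[i].setAttribute(attr, bust(assets[i].getAttribute(attr)));
}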

How to serve different cached versions of a page depending on a cookie in Drupal?

The task is relatively straightforward:
A Drupal website displays a list of articles with thumbnails. Some visitors would like to view it without images by clicking on a button/link and have that preference saved.
e.g. http://patterntap.com/collections/index/
The problem is that all visitors are anonymous and, given the traffic, the page cache is enabled.
My idea was to use some simple JavaScript to set a cookie, refresh the page and depending on the cookie values (or its presence/absence) display or hide the images.
Except Drupal serves cached pages quite early, and the only quick way to modify the cached version that I could find is by hacking includes/bootstrap.inc, adding a custom class to the body classes, and then hiding the images with CSS.
A very wrong approach, I know. But I wonder if there is a way to save different versions of a page and serve the correct version?
Edit:
I need to keep the same URI
the JS to show/hide the images without a reload and set the cookie is already in place
hook_boot() is not really called for cached pages, so I can't do it via a custom module
.htaccess mods?
Edit/solution:
In the end I went with Rimian's suggestion. But it is possible to accomplish the task using our own cache.inc implementation, as seen in the Mobile Tools module. Specifically, by extending cache.inc and updating settings.php to include
$conf['page_cache_fastpath'] = FALSE;
$conf['cache_inc'] = 'path/to/my/module/my_module_cache.inc';
So let me get this right. You wanna hide some images on a cached page if the user chooses to?
Why don't you write some jQuery or javascript and load that into your cached page with all the rest of the document?
Then, the client/browser would decide to run your script and hide images depending on some parameters you passed along with the request to that page or in the cookie? The script gets cached and only runs when you call it.
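A rough jQuery sketch of that approach (the cookie name, CSS class, and link id are all made up): read a cookie set by the toggle link and hide or show the thumbnails client-side, so the cached HTML itself never has to change.
// Sketch only: 'hide_thumbs', '.article-thumbnail' and '#toggle-thumbs' are hypothetical names.
function readCookie(name) {
  var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return match ? decodeURIComponent(match[1]) : null;
}

jQuery(document).ready(function ($) {
  // Apply the saved preference on every (cached) page load.
  if (readCookie('hide_thumbs') === '1') {
    $('.article-thumbnail').hide();
  }

  // Toggle link: flip the images and remember the choice, no page reload needed.
  $('#toggle-thumbs').click(function (e) {
    e.preventDefault();
    var hidden = $('.article-thumbnail').toggle().first().is(':hidden');
    document.cookie = 'hide_thumbs=' + (hidden ? '1' : '0') + '; path=/';
  });
});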
If you were hacking the bootstrap for something like that you'd really need to be rethinking what you were doing. Crazy! :)
Also take a look at cache_get and cache_set:
http://api.drupal.org/api/drupal/includes--cache.inc/6
I'm not sure I 100% understand what you are trying to do, but here are my thoughts. One of your root problems is that you are trying to access what is essentially different content at the same URI.
If this is truly what you want to do, then Rimian's suggestion of checking out cache_get and cache_set may be worthwhile.
Personally, it seems cleaner to me to have your "with thumbnails" and "without thumbnails" versions be accessed via different URIs. Depending on exactly what you are wanting to accomplish, a GET variable may be an even better way to go. With either of these two options you would hide or show your thumbnails at the theme layer. Pages with different paths or GET variables would get cached separately.
If you want the visitor to be able to switch views without a page reload, then jQuery and a cookie would probably suit your needs. This wouldn't require a page reload, and switching back and forth would be quite simple.

Odd error with HTML content and attributes disappearing

I have an odd error with an ASP.NET web page (ASP.NET 2.0, C#). For several users at one customer location, on one part of one page, HTML content and attributes are being stripped out. So, something that should look like this:
<p class="adminmainlink">
<a href="ad_resourcewizard.aspx">Add or edit resources</a>
<script type="text/javascript">
var hb526 = new HelpBalloon(
{
title: '',
content: 'Add or edit downloadable file, web links, and text resources associated with a course.'
}
);
</script>
</p>
In the users' browser, the source code looks like this:
<p><a></a><script></script></p>
Not only is the content of the HTML tags disappearing, but also the attributes of the tags (the "class" value for the "p" tag, the "href" from the "a" tag).
Other areas of the same page are being rendered fine, with no changes to the HTML. The HTML isn't being generated by a code-behind page -- it's just plain text in the .aspx page. The area that is rendering correctly is in the .master page; the problem area is inside an asp:Content tag.
This error is only happening on one page of the application. Other, very similar pages that use the same .master page are unaffected. I am not able to reproduce this error outside of the client's facility, even when logging in to the client's account. The client is using IE 6 -- we have tested on that, and all is OK. No other customers are reporting a similar problem.
Maybe it's a content blocker or firewall issue at the client's location? Maybe the script is causing the content filtering (other pages use the same script and they display fine, however)?
If it's a code problem, it would seem to affect only the area inside the asp:Content control that is dropping into the .master page. Has anyone seen something like this before? What part of the ASP.NET page life cycle would eliminate attributes and tag content from hard-coded HTML? I could see weirdness happening with a control, but with regular HTML?
Many thanks for your thoughts and opinions!
Are the users using Firefox with AdBlock or some other ad blocking software? I've had strange behaviors in our internal application where certain content was being mysteriously removed, and it turned out it was because a liberal filter was applied, blocking out the word "ad." I noticed "ad_resourcewizard.aspx" was contained in your link. You should have the customers at that location try a different browser or disable their ad blocking software in case it's hooked into their networking software (a plugin for their security suite, for example).
Check whether they have any internet security software installed, and if so, try disabling it.
I know we had an issue with one of the versions of Norton Internet Security, which was stripping scripts out of our CMS pages for one particular client.

Detect IE setting: check for newer versions of stored pages "never"

I understand there isn't a way to interrogate a user's IE settings directly due to security reasons, but is there a way to derive this answer with some other mechanism? I would like to stop a user from using my site if the setting "Check for newer versions of stored pages" is set to "Never". Any suggestions?
Is there a way I could test for this using JavaScript? An example of what I am trying to accomplish: while it is not possible to check IE settings to see if you are running a popup blocker, there is a way to "test" for a popup blocker via JavaScript. I am looking for something similar, but for the cache setting rather than the popup blocker.
Append something to the querystring that is unlikely to have been used before (a datestring, a random number).
That will make IE 6 request the page again; as far as IE is concerned, it's another page.
So if you have http://somepage.com/dontcachethis.html you'd replace it with something like http://somepage.com/dontcachethis.html?please=23143425 where you assign please a random number.
Add the following to your web pages:
meta http-equiv="Pragma" content="no-cache"
meta http-equiv="Expires" content="0"
This forces the user's browser to reload the pages every time, regardless of the browser's cache settings.
I've not implemented this yet, but this is my plan (a rough sketch follows below):
Write a date onto the page as an attribute of some HTML element (this should be written server-side... writing it with JavaScript would defeat the purpose)
Using JavaScript, read that date. If it is older than a certain amount of time, I can assume they have not loaded a fresh copy of the page. If that is the case, continue on...
If I've detected that they have an old version of a page, I could do a couple things:
I could warn them of this problem. I could show them the setting that they need to change.
I could redirect them to the same page but add something onto the querystring so it looks like a different page to the browser. If I do this, I would add something in my javascript that knows about this potential redirect loop.
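Here is a rough JavaScript sketch of that plan (the data-rendered-at attribute, the freshness window, and the cachebust parameter are all made-up names): the server stamps the page with its render time, and the script redirects once with a throwaway parameter before warning the user, to avoid a redirect loop.
// Sketch only: assumes the server renders something like <body data-rendered-at="1324491378000">.
var MAX_AGE_MS = 5 * 60 * 1000; // assumed freshness window of five minutes

function checkForStalePage() {
  var stamp = parseInt(document.body.getAttribute('data-rendered-at'), 10);
  var age = new Date().getTime() - stamp;
  if (age <= MAX_AGE_MS) {
    return; // the copy is fresh enough
  }
  if (window.location.search.indexOf('cachebust=') !== -1) {
    // We already redirected once and the page is still stale: warn instead of looping.
    alert('You appear to be viewing an old copy of this page. Please set ' +
      '"Check for newer versions of stored pages" to something other than "Never".');
    return;
  }
  // Reload with a throwaway parameter so the browser treats it as a different page.
  var sep = window.location.search ? '&' : '?';
  window.location.href = window.location.href + sep + 'cachebust=' + new Date().getTime();
}

checkForStalePage();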
