Logged-in users need to refresh the page to see content - WordPress

Hi, I'm having an issue with a site where visitors need to be members to access certain pages. Once logged in, they go to these pages, still see the 'not logged in' page, and need to refresh to view the actual content.
This obviously leads to a lot of bounces, and I'd like to fix it so that they see the content right away.
The root issue comes from some cache settings or something on the host's side. Unfortunately we can't change hosts for the time being (it's not a regular hosting company with a website but a design-company reseller). This issue does not occur in our offline environment of the same site.
I've already had to add a ?randomnumber query string to the stylesheet so it loads new versions properly. I was wondering if something like this would work for the pages too, but done dynamically, as pages are being added all the time by different admins.
Or any other solutions also appreciated!
Thanks

Like you said, tweaking the caching settings would be ideal. But since that's not an option, I'd suggest adding a random, meaningless query string to the URLs of the member pages so that each request is seen as a 'new page' and (most likely) won't be served from the cache.
So instead of /member-page
direct them to /member-page?cache-buster=randomlyGeneratedStringHere
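If you want this to happen dynamically for every member page (rather than hand-editing links), something along these lines could work from a small plugin or the theme's functions.php. This is only a sketch: the $member_page_ids list and the 'cache-buster' parameter name are placeholders for whatever identifies your member pages.

<?php
// Sketch: append a random, meaningless query string to member-page links so the
// host's cache treats each visit as a new URL.
add_filter( 'page_link', function ( $link, $post_id ) {
    $member_page_ids = array( 12, 34, 56 ); // placeholder: IDs of the member pages

    if ( in_array( $post_id, $member_page_ids, true ) ) {
        $link = add_query_arg( 'cache-buster', wp_rand(), $link );
    }

    return $link;
}, 10, 2 );

Keep in mind this only helps if the cache keys on the full URL including the query string, which is common but not guaranteed.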

Related

WordPress site switched to displaying Posts for no apparent reason

I have a site with a static home page, which is just one of the pages. I've been working on the site for several weeks. Today, when I went to clear the cache to see if some links were updated, the home page had switched to displaying Posts (the other setting under Settings -> Reading). I went to Settings and, sure enough, 'display Posts' is checked. No one else that I know of has the password to this site. Does anyone know why this happened, or how I can prevent it from happening again?
There are many variables to consider, but that setting lives in the database, so something had to change it there. Either:
Someone did in fact change it, but no one knows who; or
A plugin or theme changed it. Unlikely, but certainly possible. Search your plugin/theme changelogs and/or support threads for similar reported issues.
One thing you could do is install the Stream plugin. It logs all (well, nearly every) database change and tells you when, where, and by whom. That way, if it happens again, you can immediately pinpoint it.
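For reference, that Reading setting is just a couple of rows in wp_options, so you can also check or restore it with a few lines of code (for example via a one-off mu-plugin or wp eval-file). The page ID below is a placeholder for whatever page you use as the static front page.

<?php
// Sketch: check and restore the static-front-page setting.
// 'posts' = show latest posts, 'page' = show a static page.
if ( get_option( 'show_on_front' ) !== 'page' ) {
    update_option( 'show_on_front', 'page' );
    update_option( 'page_on_front', 2 ); // placeholder: ID of your static front page
}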

Caching and/or Cookies breaking WooCommerce site

The App:
I am running a WordPress WooCommerce website and have made some modifications.
Users arrive at a page called /configurator/ where they are asked different questions. After answering all the questions, I send the users to a page called /summary/.
On /summary/ an individual result is presented to the user based on their answers in the /configurator/. I also create a cookie on /configurator/ with all the answers.
I use the cookie on /cart/ and /checkout/ as well, to add individual information to the product we sell to the user.
The Problem:
When we went live with the website, we turned on "production mode" in our host's admin panel. It basically turns on the CDN and enables caching.
Unfortunately, users then experienced problems on /summary/. It seemed that the page couldn't be loaded.
My analysis:
I think the host caches /summary/, which breaks my site. Following this article, it makes sense that the page doesn't work any more: https://docs.woocommerce.com/document/configuring-caching-plugins/
"These pages need to stay dynamic since they display information specific to the current customer."
What the host says:
The host says they cannot exclude any subpages from being cached: "The problem was caused by coding errors in combination with the cookies that we create on /summary/."
Current Status:
I need to leave the site in development mode (without the CDN and caching), which is very slow. Based on what the host says, I can't turn on production mode because it will probably break the site again and we'll lose a lot of money. Currently I can't reproduce the error on a cloned version of the site :(
You should rewrite your code to use WC sessions instead of cookies. Every customer already has a session that works and persists throughout the whole site; just set your data in it and read it back on whichever pages you need.
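As a rough sketch, assuming the answers are collected into an $answers array on /configurator/ and that WooCommerce's session object is available on those front-end pages:

<?php
// On /configurator/: store the answers in the WooCommerce session instead of a cookie.
WC()->session->set( 'configurator_answers', $answers );

// On /summary/, /cart/ or /checkout/: read them back (the second argument is a default).
$answers = WC()->session->get( 'configurator_answers', array() );

Note that, per the article quoted above, /summary/ itself would still need to be excluded from full-page caching so each customer sees their own result.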

Site only refreshes when adding www. to URL

First time posting so please bear with me.
I'm the unofficial web guy at the company I work for and I helped create our basic static HTML site.
Any work that I do to the site offline and then upload via FTP shows up instantly on my machine. I rarely, if ever, need to clear the cache for changes to show up. However, within the company I work for, nearly half of the users never see the updates. Some do, some don't.
On the machines that don't, I've cleared the cache in the browser and through the Internet control panel settings. Nothing; they still show the stale content. The only thing that works, and I've seen this in both Chrome and IE, is adding www in front of the URL, which then shows the refreshed site. No big deal, right? Well, users who type in mysite.com without the www in front will not see the updates, and people who have favorited it like that will not see the updates either.
Now, on to what I've tried in order to fix it. After much research, many people have steered me away from the meta refresh tag, so I haven't tried that. However, with the help of our IT guy we have, from what we can tell, set the site's HTTP headers to always refresh. This did not do anything for us.
I've tried changing image names in the HTML page when updating a photo and that didn't work either.
I haven't been able to find a .htaccess file, so can I create one? If we (the IT guy and I) changed the HTTP header setting to always refresh but there is no .htaccess file, will there be no change?
Any help or suggestions would be greatly appreciated.
I have searched on here for the answer, and the two most suggested changes are HTTP headers and a meta refresh. The HTTP header change didn't help, and it seems the meta tag route is bad form.
This is a DNS issue. You need to ask the provider of your web services to add an A or a CNAME record for the domain's root.
If you don't understand the above, just call the provider of your web presence (the company that hosts your web server) and tell them you want yourdomain.com and www.yourdomain.com to go to the same place.
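In zone-file terms that typically looks something like the records below; the IP address is just a documentation placeholder, and your host will fill in their own values.

; Both the bare domain and www resolve to the same place.
yourdomain.com.       IN  A      203.0.113.10       ; your web server's address
www.yourdomain.com.   IN  CNAME  yourdomain.com.    ; www is an alias for the bare domain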

Drupal - Keep getting logged out on front page only

Drupal 7.19
Acquia hosting
using Yottaa for optimization and monitoring
Problem: when trying to log in from home, we're experiencing something odd: we appear to be "logged out" ONLY on the front page. When we click anywhere else inside the site, we get the admin toolbar and contextual links back. But we can't update the front page at all, because it acts as if we're logged out, only on the front page.
This only happens when we're at home working on the site. This does not happen when we're in the work office.
Has anyone experienced this? I don't know at what level this is occurring: Drupal? Yottaa?
Found the issue. It was a specific caching rule for the homepage on Yottaa's CDN. Once I turned it off, I was able to edit the home page again.
Note: they said that turning it off is not recommended; instead, they recommend flushing the cache. Either way, I found out what was causing the issue.

capture details from external web page

I'm wondering if it's possible to capture details from the web page a user previously visited, if that page does not link to mine?
What I am trying to achieve is to allow users of my site to find a page they like while browsing the web and then navigate to a page on my site via a bookmark, which will add the URL (and possibly some other details, like the page title) to a form they can then submit to my site to add that page to a list of favourites there.
I am not really sure where to start looking for this. I wondered if I could use the HTTP referrer, but I think this may only work if there is a link to my page?
Alternatively, I am open to other suggestions as to how I could capture this data: a Firefox plugin? A page in which users browse other sites in an iframe, with a skinny frame on top?
Thanks in advance for your suggestions.
Features like this are typically not allowed by browsers for security and privacy reasons. The iframe approach would work, but this is a common hacking technique, so it may well break or be flagged in the future.
The Firefox add-on is the best solution, but it requires users to install it manually.
Also, a bookmarklet could be used. While they are actively on the target page, the bookmarklet could send you the URL.
This example bookmarklet would create a TinyURL for the destination page; you could add it to your database or whatnot. (The URL is passed through encodeURIComponent so query strings in the destination page don't break the parameter.)
javascript:void(window.open('http://tinyurl.com/create.php?url='+encodeURIComponent(document.location.href)));
If some other site links to yours and the user clicked that link to reach your site, you can access the "referrer" from the HTTP headers. How you get hold of the HTTP headers is language/framework specific; in .NET you would use Request.UrlReferrer, and other frameworks handle it differently.
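In PHP, for instance, the same information (when the browser sends it at all) shows up as a server variable:

<?php
// The referrer header is optional and only present when the user followed a link,
// so check for it before using it.
$referrer = isset( $_SERVER['HTTP_REFERER'] ) ? $_SERVER['HTTP_REFERER'] : '';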
EDIT: After reading your question again, my guess is that what you're looking for is some sort of browser plugin. If I understand correctly, you want to give your clients the ability to bookmark a site while they are on that site, which would somehow notify your site about the page they're viewing. The cleanest way to achieve this would be a browser plugin. You can also do frame tricks, like the Digg bar.
