Using iframes to add dynamic content to statically cached pages (WordPress)

I have a WordPress-based site that uses a full-page caching system. The problem is that I need to add dynamic content for logged-in users. Right now I keep caching enabled for non-logged-in users and disabled for logged-in ones via a cookie check.
Would inserting the dynamic content in an iframe be an acceptable way to extend full-page caching to all users? Or is there a better approach?

Another approach would be to pull in the dynamic content via AJAX. Of course, those dynamic parts of the page will not work without JavaScript, but that may or may not be an issue.
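A minimal sketch of that approach, assuming a hypothetical my_dynamic_fragment AJAX action and a placeholder element with the ID dynamic-area in the cached markup (both names are illustrative, not from the question). The cached page stays identical for everyone, and the logged-in bits are filled in after load:

<?php
/*
 * Sketch only: the action name, element ID, and greeting markup are
 * assumptions for illustration.
 */
add_action( 'wp_ajax_my_dynamic_fragment', 'my_dynamic_fragment' );
add_action( 'wp_ajax_nopriv_my_dynamic_fragment', 'my_dynamic_fragment' );
function my_dynamic_fragment() {
    $html = '';
    if ( is_user_logged_in() ) {
        $user = wp_get_current_user();
        $html = 'Hello, ' . esc_html( $user->display_name );
    }
    wp_send_json_success( array( 'html' => $html ) );
}

// Print a small loader in the footer of every (cached) page.
add_action( 'wp_footer', function () {
    ?>
    <script>
    fetch( '<?php echo esc_url( admin_url( 'admin-ajax.php' ) ); ?>?action=my_dynamic_fragment',
           { credentials: 'same-origin' } )
        .then( function ( response ) { return response.json(); } )
        .then( function ( result ) {
            var el = document.getElementById( 'dynamic-area' ); // placeholder in the cached HTML
            if ( el && result.success ) {
                el.innerHTML = result.data.html;
            }
        } );
    </script>
    <?php
} );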

Related

Logged-in Users need to Refresh page to see content

Hi, I'm having an issue with a site where visitors need to be members to access certain pages. Once logged in, they go to these pages, still see the 'not logged in' version, and need to refresh to view the actual content.
This obviously leads to a lot of bounces, and I'd like to fix it so that they see the content right away.
The root issue comes from the host's cache settings. Unfortunately we can't change hosts for the time being (it's not a regular hosting company with a website, but a design-company reseller). The issue does not occur in our offline environment of the same site.
I've already had to add a ?randomnumber to the stylesheet so it loads new versions properly. I was wondering if something like this would work for the pages too, but dynamically, as pages are being added all the time by different admins.
Or any other solutions also appreciated!
Thanks
Like you said, tweaking the caching settings would be ideal. But since that's not an option, I'd suggest appending a random, meaningless query string to the URL of the member pages so the cache treats each request as a 'new page' and (most likely) won't serve a stale copy.
So instead of /member-page
Direct them to /member-page?cache-buster=randomlyGeneratedStringHere
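If the busting has to happen automatically as new pages are added, a rough sketch like the following could go in the theme or a small plugin. The _members_only meta flag is an assumption; swap in however your site actually marks restricted pages.

<?php
add_filter( 'page_link', function ( $link, $post_id ) {
    // Assumed flag: replace with your own test for member-only pages.
    if ( get_post_meta( $post_id, '_members_only', true ) ) {
        // A random value makes every visit look like a new URL to the host's cache.
        $link = add_query_arg( 'cache-buster', wp_generate_password( 8, false ), $link );
    }
    return $link;
}, 10, 2 );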

Does WordPress list all pages for crawlers?

I created a page on a WordPress site that is for internal use only and triggers some backend code. Within a few days I started seeing hits on that page from "bingbot".
I'm not using any kind of sitemap plugin. How are crawlers finding this page?
I know the robots.txt file can block them, but I want to make sure the page doesn't show up for crawlers that don't respect it. I still want the page to be publicly accessible if someone types in the URL.
What needs to be done in Wordpress to make sure a page can't be discovered except by typing in the URL?
Any given URL is potentially "discovered" once the post is published, particularly if anything links to it from elsewhere on your site; depending on your WordPress version, the core sitemaps introduced in 5.5 (/wp-sitemap.xml) may also be listing it even without a sitemap plugin. There's no guaranteed way to prevent search engines from indexing a URL.
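It's not bulletproof, but if the goal is just to keep the page out of well-behaved crawlers' indexes while leaving it publicly reachable, a hedged sketch along these lines uses the wp_robots filter (available since WordPress 5.7); the page ID 123 is a placeholder for your internal page:

<?php
// Sketch: ask compliant crawlers not to index one specific page.
// This is only a request, not a guarantee, and rogue bots may ignore it.
add_filter( 'wp_robots', function ( $robots ) {
    if ( is_page( 123 ) ) {
        $robots['noindex']  = true;
        $robots['nofollow'] = true;
    }
    return $robots;
} );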

Dynamic Content not working with WP Super Cache

I wonder if someone can help. I am trying to add a cart widget to the header of a WooCommerce-enabled site. However, when WP Super Cache is enabled, the widget understandably doesn't update when something is added to the cart.
I am trying to add the following so the widget isn't cached:
<!--dynamic-cached-content-->
<?php echo time(); ?>
<!-- my_dynamic_content(); -->
<!--/dynamic-cached-content-->
I have just displayed the time in this case to see if I can get it working.
I have set the caching to PHP caching, with late init and dynamic caching enabled, but the time still doesn't update when I'm logged out of the administrator account.
I have trawled through the documentation to see if there is another way to get this working, but so far I haven't found one.
Can anyone point me in the right direction? Maybe I've got this completely wrong!
I just want one widget to be dynamic in the header.
Thank you in advance.
There are multiple types of cache for web applications, and WordPress has ways to take advantage of all of them.
Plugins like WP Super Cache, W3 Total Cache, and Batcache, as well as server components like Varnish and Nginx, implement page caching. These tools store a copy of the complete page and serve that cached copy every time the same URL is requested. This is the fastest cache available, but the downside is that they return the same HTML to everyone.
If you want to use a page cache but still have dynamic elements like your header widget, you'll have to render them in JavaScript.
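For the WooCommerce cart widget specifically, one way to do that without carving a hole in the page cache is WooCommerce's own cart-fragments mechanism: the cached page ships static markup, and WooCommerce's script swaps it out over AJAX after load. A rough sketch, assuming your theme prints an element matching the .header-cart-count selector (that selector is an assumption, not something WooCommerce provides):

<?php
// Sketch: register a fragment so the cart count in the header is refreshed
// over AJAX after the cached page loads. The selector must match markup
// your theme actually outputs.
add_filter( 'woocommerce_add_to_cart_fragments', function ( $fragments ) {
    ob_start();
    ?>
    <span class="header-cart-count"><?php echo esc_html( WC()->cart->get_cart_contents_count() ); ?></span>
    <?php
    $fragments['.header-cart-count'] = ob_get_clean();
    return $fragments;
} );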
If you've written your own theme, you can implement fragment caching by storing the rendered HTML of different sections of the page except the part you want to be dynamic. There's no plugin you can download that'll do it for you. You'll need to make your own judgement calls about what needs to be cached and for how long.
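A rough illustration of fragment caching in a theme, assuming a hypothetical header template part and cart widget function (both names are illustrative): everything except the widget is rendered once, stored, and reused.

<?php
// Sketch: cache the expensive, shareable part of the header and always
// render the cart widget live.
function my_render_header() {
    $fragment = get_transient( 'my_header_fragment' );
    if ( false === $fragment ) {
        ob_start();
        get_template_part( 'template-parts/header', 'static' ); // the cacheable part
        $fragment = ob_get_clean();
        set_transient( 'my_header_fragment', $fragment, HOUR_IN_SECONDS );
    }
    echo $fragment;
    my_dynamic_cart_widget(); // hypothetical: rendered fresh on every request
}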
Lastly, you can just cache the data used to render pages. Look for information on WordPress persistent object caching or write code to use WordPress's Transients API. A persistent object cache plugin can automatically store the results of WordPress's queries to something like Memcached or Redis if you have that available.
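For example, here is a sketch of caching just the data rather than the markup (the cache key and query are illustrative), assuming a persistent object-cache backend such as Redis or Memcached is configured:

<?php
// Sketch: with a persistent object cache plugin active, wp_cache_set()
// stores this in Redis/Memcached; without one, it only lasts for the
// current request.
function my_get_featured_products() {
    $products = wp_cache_get( 'featured_products', 'my_shop' );
    if ( false === $products ) {
        $products = get_posts( array(
            'post_type'   => 'product',
            'meta_key'    => '_featured',
            'meta_value'  => 'yes',
            'numberposts' => 10,
        ) );
        wp_cache_set( 'featured_products', $products, 'my_shop', 15 * MINUTE_IN_SECONDS );
    }
    return $products;
}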

Sharing page across websites in ASP.NET

Here at our company we are trying to figure out how to create one single page and share it across domains in ASP.NET.
We would like to create a simple "cart" page that is the same for all of our clients' websites, so that we can include the page from a central location (such as http://ourwebsite.com/thecart.aspx) without duplicating code, and still be able to apply each client's CSS styles and branding to the page.
How can we share a single page across websites in ASP.NET?
Each of our clients' websites is on a different domain, and in some cases they may be on different servers as well.
I think what you want is to manage one page and have it automatically update across your clients' sites, essentially the same thing as "sharing a resource", no? For that you don't necessarily need to "share" a page; you need an easy process for deploying that single page to multiple sites. In any case, without loading the page via an iframe, or creating a central spot like "cart.somedomain.net" and pushing the info back and forth (I assume you'll have shopping cart items), you'd need a way to automate publishing the page to the different sites.
Even if you were to make the "cart" page its own solution and then just include it in the individual sites, you'd still have the deployment issue. I think you have a few options, some of them previously mentioned:
Create an iframe that loads the page from an external source.
Create a central location for all the domains to push information to for their checkout process (store.somedomain.net or somdomain.net/cart.aspx) and handle it accordingly.
Create an application or script that automates the deployment of the updated resource to multiple sites (I don't know of a tool that does this or I would offer up the name to you, I apologize).
Anyway, I hope that helps, best of luck.
Create the page as just another server control in a custom library and inherit from it in each site. You'll of course have it in source control.
In fact, it doesn't need to be a "page" at all; it can be a custom shopping-cart server control.

capture details from external web page

I'm wondering if it's possible to capture details from the web page that a user previously visited, even if my page was not linked from it.
What I am trying to achieve is to allow users to my site to find a page they like while browsing the web, and then navigate to a page on my site via a bookmark, which will add the URL (and possibly some other details like the page title) to a form which they can then submit to my site to add the page to a list of favourites there.
I am not really sure where to start looking for this. I wondered if I could use the HTTP referrer, but I think that only works if there is a link to my page on the previous one.
Alternatively, I am open to other suggestions as to how I could capture this data - a Firefox plugin? A page which users browse other sites in an iframe, with a skinny frame on top?
Thanks in advance for your suggestions.
Features like this are typically not allowed by browsers for security and privacy reasons. The iframe approach would work, but it's a common hacking technique, so it may well break or be flagged in the future.
The Firefox add-on is the best solution, but it requires users to install it manually.
A bookmarklet could also be used: while the user is on the target page, the bookmarklet can send you the URL.
This example bookmarklet creates a TinyURL for the current page; you could add it to your database or whatever you need:
javascript:void(window.open('http://tinyurl.com/create.php?url='+document.location.href));
If some other site links to yours and the user clicked on that link which took them to your site you can access the "referrer" from the http headers. How you get a hold of the HTTP headers is language / framework specific. In .NET you would use the Request.UrlReferrer; other frameworks would probably handle it differently.
EDIT: After reading your question again, my guess would be what you're looking for is some sort of browser plugin. If I understand correctly, you want to give your clients the ability to bookmark a site, while they are on that site, which would somehow notify your site about the page they're viewing. The cleanest way to achieve this would be a browser plugin. You can also do FRAME tricks, like the Digg bar.
