As you know, we can update an SSG page via the revalidate key in getStaticProps(). A returning user first sees the previously generated (stale) content and only sees the updated content after refreshing the page twice.
But is it right that the user should have to refresh twice? In my opinion, users don't know that they need to refresh twice. New visitors, on the other hand, do see the new content on their first request.
I would appreciate it if you shared your thoughts about revalidate.
You'll usually have more than one user. The regeneration can be triggered by any of your users, so not everyone on the site needs to refresh twice; only the single visitor whose request triggers the regeneration still sees the stale version.
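For reference, a minimal sketch of such a page with revalidate (the data fetcher and prop names here are placeholders, not from the question):

    import { GetStaticProps } from "next";

    type Props = { content: { title: string } };

    // Hypothetical data source, just for illustration.
    async function fetchLatestContent(): Promise<Props["content"]> {
      return { title: "Hello" };
    }

    export const getStaticProps: GetStaticProps<Props> = async () => {
      const content = await fetchLatestContent();
      return {
        props: { content },
        // Regenerate at most once every 60 seconds; the regeneration is
        // triggered by whichever visitor requests the page first after that window.
        revalidate: 60,
      };
    };

    export default function Page({ content }: Props) {
      return <h1>{content.title}</h1>;
    }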
I'm trying to create a blog for my website. Because I want to be able to add blog posts dynamically without rebuilding, I had to choose between incremental static generation and server-side rendering. Since a blog page won't change often, I decided to go for ISG. Of course, not every blog ID is a valid page, so I want to set a 404 status code, but I can't find out how to do that without using server-side rendering.
I found the answer. It turns out I can return notFound: true from getStaticProps to get a 404.
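For anyone who finds this later, a rough sketch of what that can look like for a dynamic blog route; getPost and the route name are made up for illustration:

    // pages/blog/[id].tsx
    import { GetStaticPaths, GetStaticProps } from "next";

    type Post = { id: string; title: string; body: string };

    // Placeholder for a CMS or database lookup.
    async function getPost(id: string): Promise<Post | null> {
      return id === "hello-world" ? { id, title: "Hello", body: "..." } : null;
    }

    export const getStaticPaths: GetStaticPaths = async () => ({
      paths: [],            // build nothing up front
      fallback: "blocking", // generate pages on demand
    });

    export const getStaticProps: GetStaticProps<{ post: Post }> = async ({ params }) => {
      const post = await getPost(String(params?.id));
      if (!post) {
        // Unknown ID: Next.js responds with a 404 status and renders the 404 page.
        return { notFound: true };
      }
      return { props: { post }, revalidate: 60 };
    };

    export default function BlogPost({ post }: { post: Post }) {
      return <article><h1>{post.title}</h1><p>{post.body}</p></article>;
    }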
From: next/link
You can see that the <Link> component from next/link enables client-side transitions and link prefetching, which are great features, but maybe not for all cases.
Please see the caveat I've run into. Let's say I have the following pages:
Home - Some landing page with a nav bar
Latest - Here I can see my latest posts
Admin - Here I can add more posts
The Latest page from the example above uses getStaticProps with revalidate. Something like:
    export const getStaticProps: GetStaticProps<HomeRoute> = async () => {
      const preloadedState = await getPreloadedState();

      return {
        revalidate: 1,
        props: {
          preloadedState
        }
      };
    };
In theory, after 1 second the next request should receive the last (stale) response and trigger a new static regeneration that is served to subsequent requests. After another second the process repeats, so you get fresh data at most about a second after it changes, which is pretty much immediately.
Now, see the caveat I've run into with next/link:
User lands on the Home page. There is a Link on the nav bar pointing to Latest. That link will be prefetched by next/link.
In some other browser, an admin goes to the Admin page and adds one more post (which should appear on the Latest page at some point).
Now the user clicks on the Latest link. The new post is not there.
They click on Home again, then on Latest again. The new post is still not there and never will be.
The transitions in this case are blazing fast, which is nice. But from my experience so far, I think that the user is locked inside a version of my website where the new post will never be available, because that first prefetch happened at a time when the new post didn't exist.
The only way that user will ever see the new post is by pressing F5 to do a full page reload. And it might be necessary to refresh twice, because the first reload might return the previous stale version while triggering the regeneration served on the next one.
I mean, what is the workaround to this issue? Should I not use next/link for pages that contain any dynamic data? Should I just use normal <a> tags?
UPDATE
From: https://nextjs.org/docs/basic-features/data-fetching#statically-generates-both-html-and-json
From that section of the docs, we can see that client-side transitions will indeed not trigger a page regeneration, because they don't call getStaticProps; they only fetch the pre-built JSON object for the page to use as props.
AFAIK, that means you'll be locked into the version of the page that existed when you first visited the website. You can go back and forth and nothing in the pages will change, because the JSON data is probably cached on the client anyway.
PS: I've tested this (like I've mentioned in the question above) and this is exactly what happens.
So how can I work around that? I would like users who keep a tab of my website open to get updates for the pages when they navigate from one page to the other.
A POSSIBLE SOLUTION
Set some kind of idle-time counter, and if the user accumulates something like 10 minutes of idle time (meaning they left the tab open), then whenever they come back and perform an action, I could do a full reload of the website to make sure they get the new version of the pages.
Has anyone faced that problem before?
I've posted this very same question in several forums and this is the response I've got:
It seems what you described is true. next/link caches results on the client side, and your visitor will not get a revalidated result out of the box unless there is a full-page reload.
Depending on how likely the content is to change, you might want to use <a> instead, or look at some client-side content-reload strategy that kicks in after mount and queries the data source for updated content.
Given that fact, I'll stick to using next/link and client-side transitions. But I'll also use something like a setInterval() to do a full website reload from time to time, so I'm sure my users will keep getting revalidated pages eventually.
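For what it's worth, a minimal sketch of that periodic full-reload idea (the interval length is arbitrary):

    import { useEffect } from "react";

    const RELOAD_EVERY_MS = 10 * 60 * 1000; // arbitrary: every 10 minutes

    // Force a full reload periodically so navigation starts again from
    // freshly revalidated pages instead of the prefetched JSON snapshot.
    export function useFullReloadInterval(): void {
      useEffect(() => {
        const id = setInterval(() => window.location.reload(), RELOAD_EVERY_MS);
        return () => clearInterval(id);
      }, []);
    }

Calling a hook like this from a custom _app would apply it site-wide.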
I placed a Facebook Like button on my website about 7 days ago. It seemed to work fine, except that several days later I noticed it was giving out the wrong meta data in the Facebook message that is sent: the title, the description and the canonical URL were all wrong. Mea culpa, it was my fault: I had cloned an old page to save time, changed the content, but forgotten to change the meta data at the top. Easy to fix, right? Edit the HTML page. Wrong, it made no difference. I deleted the Like button code and re-created a new Like button on the Facebook Developers website and pasted in the new code. No difference; the button still shows the meta data from the first button. I tried different variations of the button code. No difference whatsoever. It seems the original data has been cached in Facebook's database and cannot be changed.
Can anybody help?
Load the URL in the Facebook debugger. That might flush the cache.
You should probably try changing the URL by adding a query string at the end.
Also, the URL Linter gives you insight into how Facebook is going to read your Like meta tags.
It is always good practice to check your Like page URL there and see which meta tags Facebook is picking up:
http://developers.facebook.com/tools/debug
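As a rough illustration of the query-string trick (the parameter name here is arbitrary):

    // Append an arbitrary version parameter so Facebook sees a new URL
    // and scrapes the page's meta tags again.
    function withCacheBuster(pageUrl: string, version: string): string {
      const url = new URL(pageUrl);
      url.searchParams.set("v", version);
      return url.toString();
    }

    // e.g. withCacheBuster("https://example.com/my-page", "2")
    // gives "https://example.com/my-page?v=2" to use in the Like button.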
Problem in IE9:
I am saving some data on the /home/ page. The page saves the data and reloads to show the newly updated result. In other browsers (Firefox/Opera) the refresh picks up the new results, but IE9 shows the old data.
Response.Cache.SetCacheability(HttpCacheability.NoCache);
Just reload the page with some random parameter.
Like: /home/?rd=2823772
Make sure to always change the rd parameter to another random number on each reload. This way, IE9 will not have the page cached and it'll refresh like FF/Opera.
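A minimal client-side sketch of that approach (parameter name taken from the example above):

    // Reload /home/ with a fresh random "rd" value so IE9 cannot serve
    // the page from its cache.
    function reloadWithoutCache(): void {
      const rd = Math.floor(Math.random() * 1e9);
      window.location.href = "/home/?rd=" + rd;
    }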
Our current application works fine, but we found a problem when you deliberately misbehave: log in with the same user in multiple tabs with different organizations (there is an organization dropdown in the master page that sets a cookie whenever it is changed).
In tab 1 the organization is org 1 and in tab 2 it is org 2. The cookie holds the later value, org 2, so when we go back to tab 1 (which still shows org 1) and save a record, org 2 is saved with that record.
So can someone share some sort of checklist with us that addresses these types of problems?
Unfortunately there is not much you can do about this. Browsers share cookies between tabs, and forms authentication uses cookies to track users. That's the same behavior you will get with other sites as well, such as Gmail for example.
You can add a hidden field with data that identifies each view.
You store all the data server-side (session, cache, database) and serve a "unique" view per tab.
I hope you'll find an elegant solution to this problem, but AFAIK one browser instance simply shares one identity.
To deal with this situation we use a HIDDEN field on the master page that is part of the main form. Its value is randomly generated when the first page loads, and afterwards the value is kept between requests. Session values are stored in a Hashtable keyed by the hidden value.
Two more hacks are needed to make it work:
Response.Redirect is done with a simple form that uses the POST method to pass the HIDDEN value to the new page.
All hrefs clicked with the left button also POST the HIDDEN value (if the user chooses "Open in new tab/window", the direct redirect without a POST simply creates a new HIDDEN value, i.e. a new subsession).
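A loose client-side sketch of the per-tab token idea. The original answer keeps the value purely in the server-rendered form and POSTs; this variant uses sessionStorage (which is per-tab) just to show the shape, and the field name is made up:

    // Give each tab its own random token and submit it as a hidden field with
    // every form, so the server can key per-tab state by it instead of by the
    // shared cookie.
    function getTabToken(): string {
      let token = window.sessionStorage.getItem("tabToken");
      if (!token) {
        token = Math.random().toString(36).slice(2);
        window.sessionStorage.setItem("tabToken", token);
      }
      return token;
    }

    function attachTabToken(form: HTMLFormElement): void {
      const input = document.createElement("input");
      input.type = "hidden";
      input.name = "tabToken";
      input.value = getTabToken();
      form.appendChild(input);
    }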