Next.js export clears the "out" folder

I'm working with Next.js and this example: https://github.com/zeit/next.js/tree/master/examples/with-static-export
In next.config.js I have this code:
module.exports = {
  async exportPathMap(defaultPathMap, { dev, dir, outDir, distDir, buildId, incremental }) {
    // we fetch our list of posts, this allows us to dynamically generate the exported pages
    const response = await fetch(
      'https://jsonplaceholder.typicode.com/posts?_limit=3'
    )
    const postList = await response.json()

    // transform the list of posts into a map of pages with the pathname `/post/:id`
    const pages = postList.reduce(
      (pages, post) =>
        Object.assign({}, pages, {
          [`/post/${post.id}`]: { page: '/post/[id]' },
        }),
      {}
    )

    // combine the map of post pages with the home
    return Object.assign({}, pages, {
      '/': { page: '/' },
    })
  },
}
It fetches 3 posts and generates the files - [id].html - which is great!
But now I need to fetch a new post and build a page only for that new post, yet the next export command removes all files from out and creates only the one new post.
What do I need to do to keep the old posts and add the new one on the next export?
Example:
First next export with a request for 3 posts from the API
generates 3 post pages in the "out" folder
change the API URL and run next export for 1 new post
in summary, I have 3 old post pages and 1 new one in my "out" directory
How can I do that?

Next can't do this out of the box, but you can set it up to do so. First, you'll need a system (a database) that records which pages have already been built. Second, you'll need some way of communicating with that database (an API) to ask which pages should be built (e.g., send over a list of pages and the API responds with the ones that have not yet been built). Then tell your exportPathMap to build only those pages. And finally, move your built pages out of out and into a separate final/public directory.
By default, Next will build/export everything in the pages directory plus anything you set in exportPathMap, and put all of it in the out directory. You can override what it builds by passing a custom exportPathMap, and how you handle what ends up in the out directory is up to you, so you can move those files to a different public directory and merge them with the old files.
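As a rough illustration of that last step, a small post-export script could merge each run's out output into a persistent directory instead of replacing it (the merge-export.js file name and the public-site directory are placeholders for this sketch, not something Next provides):
// merge-export.js - run after `next export`; copies the fresh export from
// "out" into a persistent "public-site" directory, keeping files from
// previous exports and only overwriting pages that were rebuilt.
const fs = require('fs')
const path = require('path')

function copyRecursive(src, dest) {
  fs.mkdirSync(dest, { recursive: true })
  for (const entry of fs.readdirSync(src, { withFileTypes: true })) {
    const from = path.join(src, entry.name)
    const to = path.join(dest, entry.name)
    if (entry.isDirectory()) {
      copyRecursive(from, to)
    } else {
      fs.copyFileSync(from, to) // overwrites only files that were re-exported
    }
  }
}

copyRecursive('out', 'public-site')
console.log('Merged new export into public-site/')
You would then deploy public-site rather than out, so the older post pages survive later exports.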

Strapi V4 slugify plugin not seeing the created models

I just started playing around with Strapi, using it for my next project with Next.js, and I got stuck a little bit on the slug part.
I installed the Slugify plugin in the Strapi admin panel, restarted the server, and in the Roles (Permissions) section I enabled it for both the authenticated and public roles. After this I created a collection type named Blog and added some fields to it: title, content, cover, slug (short text).
After this I created some blog posts and listed them out on the page. The problem began when I tried to access a blog post using the slug:
`${process.env.NEXT_PUBLIC_STRAPI_URL}/slugify/slugs/blog/${slug}?populate=*`,
The URL is fine, as the slug part is populated with the value that I gave the slug field when I created the blog post. The error that I get is the following:
blog model name not found, all models must be defined in the settings and are case sensitive.
The problem is that the Slugify plugin is trying to match the model name to the existing ones, isn't finding it, and so throws this error.
I started to dig a little deeper and began to console.log inside the Slugify plugin in Strapi's node_modules:
module.exports = ({ strapi }) => ({
  async findSlug(ctx) {
    const { models } = getPluginService(strapi, 'settingsService').get();
    const { modelName, slug } = ctx.request.params;
    const { auth } = ctx.state;
    console.log(getPluginService(strapi, 'settingsService').get());
    isValidFindSlugParams({
      modelName,
      slug,
      models,
    });
    // ...
As you can see, it should contain a models param as well, which should hold all of the models currently created in Strapi. However, the models parameter comes back as an empty object; it's like the plugin does not see the created collections.
The collections were created after the installation of the Slugify plugin.
I am developing on localhost using SQLite with Strapi v4.
Any ideas why this is happening? Has anyone else encountered this error?
Thanks,
Trix
Firstly, you have to install the Slugify plugin.
After that you have to do some configuration steps.
To do all of that:
As you mentioned, you already found the Slugify folder in node_modules, so you can skip the first step:
npm install strapi-plugin-slugify
In the ./config/ folder, create a file named plugins.js:
./config/plugins.js
Paste this code there; it will let you see the endpoint path on the right-hand side of the screen:
module.exports = ({ env }) => ({
  // ...
  slugify: {
    enabled: true,
    config: {
      contentTypes: {
        blog: { // use your collection type's name; in this case it is "blog"
          field: "slug",
          references: "title",
        },
      },
    },
  },
  // ...
});
An example call to the endpoint:
fetcher(`${API}/slugify/slugs/blog/${slug}`)
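On the Next.js side, a rough sketch of consuming that endpoint might look like the following (the env variable matches the question; the exact response shape depends on your Strapi setup, so treat this as an outline rather than the plugin's documented contract):
// pages/blog/[slug].js - hypothetical page that loads a post by its slug
export async function getServerSideProps({ params }) {
  const res = await fetch(
    `${process.env.NEXT_PUBLIC_STRAPI_URL}/slugify/slugs/blog/${params.slug}?populate=*`
  )
  if (!res.ok) {
    return { notFound: true }
  }
  const post = await res.json()
  return { props: { post } }
}

export default function BlogPost({ post }) {
  // Render whatever fields your Blog collection type actually exposes
  return <pre>{JSON.stringify(post, null, 2)}</pre>
}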

Where should I store a JSON file and fetch its data in Next.js whenever I need it?

Project:
I am working on an E-commerce application and it has more than 1,600 products and 156 categories.
Problem:
Initially, on the first product page, 30 products will be fetched (due to the page limit), but for the left sidebar I need filters that are decided based on the tags of all 1,600 products. That's why I need all the products in the first fetch; I will then extract the common tags by looping over all the products and immediately show them in the sidebar.
What do I want?
I am not sure, but I think the best solution would be to generate a JSON file containing all the products and store it somewhere I can fetch it from by just hitting a URL with a REST call in Next.js (either in getServerSideProps or getStaticProps).
Caveat:
I tried storing the JSON file in the ./public directory of the Next.js application; it worked on localhost but not on Vercel.
Here is the code I wrote for storing the JSON file in the ./public directory:
fs.writeFileSync("./public/products.json", JSON.stringify(products, null, 2)); //all 1,600 products
One solution is to fetch it directly from the front end (if the file is not too big); otherwise, to read the file in getServerSideProps you will need a custom webpack configuration.
// next.config.js
const path = require("path")
const CopyPlugin = require("copy-webpack-plugin")

module.exports = {
  target: "serverless",
  future: {
    webpack5: true,
  },
  webpack: function (config, { dev, isServer }) {
    // Fixes npm packages that depend on the `fs` module
    if (!isServer) {
      config.resolve.fallback.fs = false
    }
    // copy the files you're interested in
    if (!dev) {
      config.plugins.push(
        new CopyPlugin({
          patterns: [{ from: "content", to: "content" }],
        })
      )
    }
    return config
  },
}
Then you can create a utility function to get the file:
import path from "path"
import { promises as fs } from "fs"

export async function getStaticFile(file) {
  let basePath = process.cwd()
  if (process.env.NODE_ENV === "production") {
    basePath = path.join(process.cwd(), ".next/server/chunks")
  }
  const filePath = path.join(basePath, file)
  const fileContent = await fs.readFile(filePath, "utf8")
  return fileContent
}
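For example, assuming the products JSON lives in a content/ folder that the CopyPlugin pattern above copies next to the server build, a page could read it in getStaticProps roughly like this (file locations and the import path are assumptions for this sketch):
// pages/products.js - hypothetical usage of the getStaticFile helper
import { getStaticFile } from "../utils/getStaticFile"

export async function getStaticProps() {
  const json = await getStaticFile("content/products.json")
  const products = JSON.parse(json)
  return { props: { products } }
}

export default function Products({ products }) {
  return <div>{products.length} products loaded</div>
}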
There is an open issue regarding this:
Next.js API routes (and pages) should support reading files

Firebase Functions: hosting rewrite to dynamically generate sitemap.xml with more than 50000 links

I'd like to dynamically generate a sitemap.xml containing all static and dynamic user links (through UIDs from Firestore) with Cloud Functions when a user or a crawler requests https://www.example.com/sitemap.xml. I already managed to implement a working version using sitemap.js (https://github.com/ekalinin/sitemap.js#generate-a-one-time-sitemap-from-a-list-of-urls) and Firebase Hosting rewrites. However, my current solution (see below) generates one large sitemap.xml and only works for up to 50,000 links, which is not scalable.
Current solution:
Hosting rewrite in firebase.json:
"hosting": [
...
"rewrites": [
{
"source": "/sitemap.xml",
"function": "generate_sitemap"
},
]
}
],
Function in index.ts
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
import { SitemapStream, streamToPromise } from 'sitemap';
import { Readable } from 'stream';

export const generateSitemap = functions.region('us-central1').https.onRequest((req, res) => {
  const afStore = admin.firestore();
  const promiseArray: Promise<any>[] = [];
  const stream = new SitemapStream({ hostname: 'https://www.example.com' });

  const fixedLinks: any[] = [
    { url: `/start/`, changefreq: 'hourly', priority: 1 },
    { url: `/help/`, changefreq: 'weekly', priority: 1 }
  ];
  const userLinks: any[] = [];

  promiseArray.push(afStore.collection('users').where('active', '==', true).get().then(querySnapshot => {
    querySnapshot.forEach(doc => {
      if (doc.exists) {
        userLinks.push({ url: `/user/${doc.id}`, changefreq: 'daily', priority: 1 });
      }
    });
  }));

  return Promise.all(promiseArray).then(() => {
    const array = fixedLinks.concat(userLinks);
    return streamToPromise(Readable.from(array).pipe(stream)).then((data: any) => {
      res.set('Content-Type', 'text/xml');
      res.status(200).send(data.toString());
      return;
    });
  });
});
Since this scales only to about 50,000 links, I'd like to do something like https://github.com/ekalinin/sitemap.js#create-sitemap-and-index-files-from-one-large-list. But it seems like I'd need to actually create and temporarily store .xml files somehow.
Does anyone have experience with this issue?
As you noted, this isn't scalable and your costs are going to skyrocket since you pay per read/write on Firestore, so I would recommend rethinking your architecture.
I solved a similar problem several years ago for an App Engine website that needed to generate sitemaps for millions of dynamically created pages and it was so efficient that it never exceeded the free tier's limits.
Step 1: Google Storage instead of Firestore
When a page is created, append that URL to a text file in a Google Storage bucket on its own line. If your URLs have a unique ID you can use that to search and replace existing URLs.
https://www.example.com/foo/some-long-title
https://www.example.com/bar/some-longer-title
It may be helpful to break the URLs into smaller files. If some URLs start with /foo and others start with /bar, I'd create at least two files called sitemap_foo.txt and sitemap_bar.txt and store the URLs in their respective files.
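Note that Cloud Storage objects can't be appended to in place, so "append" here in practice means read the file, add the line, and write it back. A minimal sketch using the @google-cloud/storage client (bucket and file names are placeholders):
import { Storage } from '@google-cloud/storage'

const storage = new Storage()

// Adds a page URL to the matching sitemap text file, e.g. sitemap_foo.txt.
async function appendUrlToSitemap(bucketName, fileName, url) {
  const file = storage.bucket(bucketName).file(fileName)
  const [exists] = await file.exists()
  const current = exists ? (await file.download())[0].toString() : ''
  if (current.includes(url)) return // already listed, nothing to do
  await file.save(current + url + '\n', { contentType: 'text/plain' })
}

// appendUrlToSitemap('my-sitemaps', 'sitemap_foo.txt', 'https://www.example.com/foo/some-long-title')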
Step 2: Dynamically Generate Sitemap Index
Instead of a normal enormous XML sitemap, create a sitemap index that points to your multiple sitemap files.
When /sitemap.xml is visited, generate the following index by looping through the sitemap files in your bucket and listing them like this:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap>
<loc>https://storage.google...../sitemap_foo.txt</loc>
</sitemap>
<sitemap>
<loc>https://storage.google...../sitemap_bar.txt</loc>
</sitemap>
</sitemapindex>
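A sketch of the function behind the /sitemap.xml Hosting rewrite, assuming the sitemap text files above live in a publicly readable bucket (the bucket and function names are placeholders):
import * as functions from 'firebase-functions'
import { Storage } from '@google-cloud/storage'

const storage = new Storage()

export const generateSitemapIndex = functions.https.onRequest(async (req, res) => {
  const bucketName = 'my-sitemaps' // placeholder bucket name
  // List every sitemap_*.txt file currently in the bucket
  const [files] = await storage.bucket(bucketName).getFiles({ prefix: 'sitemap_' })
  const entries = files
    .map((f) => `  <sitemap>\n    <loc>https://storage.googleapis.com/${bucketName}/${f.name}</loc>\n  </sitemap>`)
    .join('\n')
  const xml = `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${entries}\n</sitemapindex>`
  res.set('Content-Type', 'text/xml')
  res.status(200).send(xml)
})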
Step 3: Remove Broken URLs
Update your 404 controller to search your sitemap for the requested URL and remove it if found.
Summary
With the above system you'll have a scalable, reliable and efficient sitemap generation system that will probably cost you little to nothing to operate.
Answers to your questions
Q: How many URLs can you have in a sitemap?
A: According to Google, 50,000 or 50MB uncompressed.
Q: Do I need to update my sitemap every time I add a new user/post/page?
A: Yes.
Q: How do you write to a single text file without collisions?
A: Collisions are possible, but how many new pages/posts/users are being created per second? If it's more than one per second, I would create a Pub/Sub topic with a function that drains it and updates the sitemaps in batches. Otherwise I'd just have it update directly.
Q: Let's say I created a sitemap_users.txt for all of my users...
A: Depending on how many users you have, it may be wise to break it up even further and group them into users per month/week/day. So you'd have sitemap_users_20200214.txt containing all users created that day. That would most likely keep each file under the 50,000 URL limit.

Firebase. How to execute function AND load webpage (hosting) with a URL

I need to execute a Firebase function AND load a page using a single URL.
e.g.
URL = "localhost:5000/cars?make=hyundai&model=elantra"
This URL will load the views/cars/index.html webpage.
This URL will also call the loadCar() Firebase function.
...How can I achieve these two goals? (Or is there a better way to achieve this: load a page AND invoke a function?)
Current behaviour: it loads the page (achieves 1), but doesn't invoke the loadCar() function (doesn't achieve 2).
FYI my firebase.json:
{
  "hosting": {
    "public": "functions/views",
    "rewrites": [
      {
        "source": "/cars/**",
        "function": "loadCar"
      }
    ]
  }
}
My file directory (arrows pointing to relevant files):
loadCar():
exports.loadCar = functions.https.onRequest((req, res) => {
  // const requestedCar = Get requested car info from URL queries.
  // Save requestedCar to Firestore DB, so it can be loaded in the html view.
  // ... etc.
  res.end();
});
If you're asking if it's possible for a single Firebase Hosting URL to both load static content and trigger a function, it's not possible. It has to be either one, not both.
You could instead have some JavaScript code in the static content invoke the function through another URL. Or you could have the URL invoke the function, which returns HTML to display, in addition to performing other work. But the destination of the requested URL can only go to static content or a function, not both.
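A minimal sketch of the first option: let /cars/** serve the static page, give the function its own rewrite (say /api/loadCar), and have the page call it on load. The /api/loadCar path, the JSON response, and the element ID are assumptions for illustration:
// Script included by views/cars/index.html. Assumes firebase.json has a second
// rewrite, e.g. "/api/loadCar" -> "loadCar", and that loadCar responds with JSON
// instead of just res.end().
const params = new URLSearchParams(window.location.search)
fetch(`/api/loadCar?${params.toString()}`)
  .then((res) => res.json())
  .then((car) => {
    // "car-details" is a hypothetical element in the static page
    document.getElementById('car-details').textContent = JSON.stringify(car)
  })
  .catch(console.error)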

Change the URL of the default page in Next.js

I tried to find out how to change the default page path (the page served when you access the server on the "/" path) but couldn't find anything. I have a structure in the pages directory of the form index/index.jsx, and I want this page to be returned by default when accessing the server. I did not find a similar question on this forum, so maybe this will help someone besides me.
The way the pages structure works in Next.js is very opinionated. If you need to serve a page from a different pages structure when accessing /, you would need to configure your own little Express server.
See this link for official doc.
You would then have something like:
// Some code
if (pathname === '/') {
  app.render(req, res, '/index', query) // Or /index/index.jsx
}
// Some code
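Fleshed out, such a custom server might look roughly like this (a sketch using Express; the server.js file name and the port are assumptions, while next({ dev }) and app.getRequestHandler() come from Next's documented custom-server setup):
// server.js - minimal custom server that serves pages/index/index.jsx at "/"
const express = require('express')
const next = require('next')

const dev = process.env.NODE_ENV !== 'production'
const app = next({ dev })
const handle = app.getRequestHandler()

app.prepare().then(() => {
  const server = express()

  server.get('/', (req, res) => {
    // Render the page living at pages/index/index.jsx
    return app.render(req, res, '/index', req.query)
  })

  // Let Next handle everything else
  server.all('*', (req, res) => handle(req, res))

  server.listen(3000, () => console.log('Ready on http://localhost:3000'))
})
You would then start the app with node server.js instead of next start.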
Also, if you do not want to create your own Express server, you could have an index.js that redirects to your index page with something like this:
import Router from 'next/router'

const Index = () => null

Index.getInitialProps = async ({ res }) => {
  if (res) {
    res.writeHead(302, {
      Location: `/index`
    })
    res.end()
  } else {
    Router.push(`/index`)
  }
  return {}
}

export default Index
