NextJS generate page getServerSideProps to getStaticProps [duplicate] - next.js

This question already has answers here:
Internal API fetch with getServerSideProps? (Next.js)
Using getServerSideProps to fetch internal API data, the TTFB is really high and my page runs slowly.
So I'm looking at other fetching strategies. My MongoDB data is not large (database size: 33.84 KB) and the data does not change often, so I think the best fit is static generation — only about 25 pages would be generated in total. The problem is that getStaticProps can't fetch the internal API (it works in development but not in production).
What I tried:
useEffect: slower than getServerSideProps
exporting the MongoDB data to a data.js file and using it in the project as a fake API: this works with getStaticProps, but I still want to store the data in the database
hosting the API on another domain as an external API: getStaticProps works, but the approach feels wrong
hard-coding every one of the 25 pages (no)
Questions:
How can I improve the code and the TTFB?
Why can't getStaticProps fetch an internal API, and why was it designed that way?

I found an article on MongoDB, here is the link: just don't use an internal API route, and instead fetch the data directly from MongoDB inside getStaticProps. Here is my code.
BEFORE
export async function getServerSideProps() {
  const response = await fetch(`${server}/api/gallery`);
  const data = await response.json();

  if (!data) {
    return {
      notFound: true,
    };
  }

  return {
    props: { data },
  };
}
AFTER
export async function getStaticProps() {
  // connect to MongoDB
  await dbConnect()

  // I use a Mongoose model to fetch the data
  const gallery = await art.find()

  return {
    props: {
      data: JSON.parse(JSON.stringify(gallery))
    }
  }
}
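The JSON.parse(JSON.stringify(...)) round-trip is there because Mongoose documents carry values like Dates and ObjectIds, which Next.js cannot serialize as props. A minimal standalone sketch of what the round-trip does (the document shape is made up for illustration):

```javascript
// Mongoose documents contain Dates (and ObjectIds) that Next.js props can't
// serialize; JSON round-tripping converts them into plain strings.
const doc = {
  _id: "64b0f0c2a1b2c3d4e5f60718", // an ObjectId would stringify similarly
  title: "Sunset",
  createdAt: new Date("2023-01-01T00:00:00Z"),
};

const serializable = JSON.parse(JSON.stringify(doc));
console.log(typeof serializable.createdAt); // "string" instead of a Date object
```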

Related

Next.js Auto Kills Process When Fetching too much data

So I had been using Next.js to fetch posts from the server using getServerSideProps().
It worked fine for a little while, but then it started throwing this warning:
Warning: data for page "/" is 1.57 MB which exceeds the threshold of 128 kB
The fact is that there are only 27 posts on the server right now, but the count will continuously grow from hundreds into thousands. Next.js complains because it has to fetch the entire post route and load every post before the page renders, which is not workable for data that may change constantly (e.g. if someone likes a post).
So it favors speed over content and kills the Next.js process.
Is there any way to limit the data coming from getServerSideProps(), or a workaround for this issue?
Here is my code if needed:
pages/index.tsx
import { fetchPosts } from '../utils';
// ...
export async function getServerSideProps() {
  const { data } = await fetchPosts();
  return {
    props: { posts: data },
  };
}
api.ts
const API = axios.create({ baseURL: "http://localhost:5000" });
export const fetchPosts: FetchPostsAPI = () => API.get("/posts");
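A common workaround is to paginate: pass a page and a limit to the API and return only one slice per request. The helper below is a hypothetical sketch of the slicing the server would do — with MongoDB you'd apply it in the query (e.g. skip/limit) rather than on an in-memory array:

```javascript
// Hypothetical page slicing; a real API would do this in the query
// (e.g. Model.find().skip(start).limit(perPage)) so only one page is sent.
function paginate(items, page, perPage) {
  const start = (page - 1) * perPage;
  return items.slice(start, start + perPage);
}

const posts = Array.from({ length: 1000 }, (_, i) => ({ id: i }));
const pageTwo = paginate(posts, 2, 10);
console.log(pageTwo[0].id, pageTwo.length); // 10 10
```

The page component would then request subsequent pages on the client (e.g. on scroll or button click) instead of shipping everything in the initial props.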

Build fails while building SSR/ISR pages with new API routes

I am getting build failures on ISR/SSR pages that use getStaticProps and getStaticPaths.
Brief explanation:
When I create ISR/SSR pages that reference a newly added API route (one that never existed before), the build on Vercel fails because the pages are built before the API routes (the /pages/api folder).
Detailed explanation:
A. Creating next SSR page with code (/pages/item/[pid].tsx)
export async function getStaticProps(context) {
  const pid = context.params.pid;
  // newly created API route
  const res = await fetch(process.env.APIpath + '/api/getItem?pid=' + (pid));
  const data = await res.json();
  return {
    props: {item: data}
  }
}

export async function getStaticPaths(context) {
  // newly created API route
  let res = await fetch(process.env.APIpath + '/api/getItemsList')
  const items = await res.json()
  let paths = []
  // multi-language support for the pages
  for (const item of items) {
    for (const locale of context.locales) {
      paths.push({params: {pid: item.url}, locale: locale})
    }
  }
  return { paths, fallback: false }
}
B. Local checks pass, so I deploy to Vercel.
C. During deployment, Vercel throws an error because the build tries to get data from an API route that doesn't exist yet (Vercel deploys /pages/item/[pid].tsx first and the /api/getItemsList file after). Vercel tries to fetch data from https://yourwebsite.com/api/getItemsList, which does not exist yet.
The only way I've found to avoid this error:
Create the API routes needed
Deploy the project to Vercel
Create the [pid].tsx page(s)
Deploy the final version of the code
The big issue with my approach is that it requires a deployment you don't actually need, and the problem reappears whenever you rework the code for your API routes.
Question: is there a way to force Vercel to deploy the API routes first and the pages afterwards?
Any help appreciated
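One way to sidestep the ordering problem entirely is to skip the HTTP round-trip at build time: getStaticProps and getStaticPaths run on the server, so they can call the same data-access code that backs /api/getItemsList directly instead of fetching the not-yet-deployed route. The path-building part of that is plain JavaScript and can be sketched on its own (the items and locales below are made up):

```javascript
// Build localized static paths from data obtained directly (no fetch to
// an API route), matching the nested loop used in getStaticPaths above.
function buildPaths(items, locales) {
  const paths = [];
  for (const item of items) {
    for (const locale of locales) {
      paths.push({ params: { pid: item.url }, locale });
    }
  }
  return paths;
}

const paths = buildPaths([{ url: "chair" }, { url: "table" }], ["en", "de"]);
console.log(paths.length); // 4
```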

Web Scraping in React & MongoDB Stitch App

I'm moving a MERN project into React + MongoDB Stitch after seeing it allows for easy user authentication, quick deployment, etc.
However, I am having a hard time understanding where and how I can call a site-scraping function. Previously, I scraped in Express.js with cheerio like this:
app.post("/api/getTitleAtURL", (req, res) => {
  if (req.body.url) {
    request(req.body.url, function(error, response, body) {
      if (!error && response.statusCode == 200) {
        const $ = cheerio.load(body);
        const webpageTitle = $("title").text();
        const metaDescription = $("meta[name=description]").attr("content");
        const webpage = {
          title: webpageTitle,
          metaDescription: metaDescription
        };
        res.send(webpage);
      } else {
        res.status(400).send({ message: "THIS IS AN ERROR" });
      }
    });
  }
});
But with Stitch, no Node & Express server is needed. Is there a way to fetch another site's content without having to host a Node.js application just to serve that one function?
Thanks
Turns out you can build Functions in MongoDB Stitch that allow you to upload external dependencies.
However, there are limitations: for example, cheerio didn't work as an uploaded external dependency while request did. A solution, therefore, is to create a serverless function in AWS Lambda and connect MongoDB Stitch to it (Stitch can connect to many third-party services, including AWS services like Lambda, S3, Kinesis, etc.).
AWS Lambda lets you upload any external dependency; if MongoDB Stitch allowed the same, we wouldn't need Lambda, but Stitch still lacks much of that support. In my case, I had a Node function with cheerio & request as external dependencies. To upload it to Lambda: make an account, create a new Lambda function, and pack your node modules & code into a zip file to upload. Your zip should look like this:
and your file containing the function should look like:
const cheerio = require("cheerio");
const request = require("request");

exports.rss = function(event, context, callback) {
  request(event.requestURL, function(error, response, body) {
    if (!error && response.statusCode == 200) {
      const $ = cheerio.load(body);
      const webpageTitle = $("title").text();
      const metaDescription = $("meta[name=description]").attr("content");
      const webpage = {
        title: webpageTitle,
        metaDescription: metaDescription
      };
      callback(null, webpage);
      return webpage;
    } else {
      callback(null, {message: "THIS IS AN ERROR"})
      return {message: "THIS IS AN ERROR"};
    }
  });
};
And in MongoDB Stitch, connect to a third-party service: choose AWS and enter the secret keys you got when creating an IAM user. Under Rules → Actions, choose Lambda as your API and allow all actions. Now your MongoDB Stitch functions can invoke Lambda; in my case that function looks like this:
exports = async function(requestURL) {
  const lambda = context.services.get('getTitleAtURL').lambda("us-east-1");
  const result = await lambda.Invoke({
    FunctionName: "getTitleAtURL",
    Payload: JSON.stringify({requestURL: requestURL})
  });
  console.log(result.Payload.text());
  return EJSON.parse(result.Payload.text());
};
Note: this slowed performance down considerably, though; in general, calls took about twice as long to finish.

NuxtJS state changes and firebase authentication

I am still a Nuxt beginner, so please excuse any faults.
I am using the "official" Firebase module for Nuxt (https://firebase.nuxtjs.org/) to access Firebase services such as auth signIn and signOut.
This works.
However, I am using Nuxt in universal mode and I cannot access `this` inside my page's fetch function. So my solution is to save this info in the Vuex store and update it as it changes.
So, once a user logs in or the Firebase auth state changes, a state change needs to happen in the Vuex store.
Currently, when a user logs in or the Firebase auth state changes (and the user is still logged in), I save the state to my store like so:
const actions = {
  // note: actions receive the context object first, so commit is
  // destructured here rather than taking `state` as the first argument
  async onAuthStateChangedAction({ commit }, { authUser, claims }) {
    if (!authUser) {
      // claims = null
      // TODO: perform logout operations
    } else {
      // Do something with the authUser and the claims object...
      const { uid, email } = authUser
      const token = await authUser.getIdToken()
      commit('SET_USER', { uid, email, token })
    }
  }
}
I also have a mutation that sets the user, a getter that reads it, and the state object itself holding the initial state:
const mutations = {
  SET_USER(state, user) {
    state.user = user
  }
}

const state = () => ({
  user: null
})

const getters = {
  getUser(state) {
    return state.user
  }
}
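Stripped of the Vuex wiring, the mutation/getter flow above reduces to plain functions, which makes the timing issue easier to see: until SET_USER runs, getUser returns null. A simulation (the user object here is invented):

```javascript
// The store pieces above, simulated in plain JS: state starts with user null,
// SET_USER fills it in, and the getter reads whatever is there *right now*.
const state = { user: null };
const mutations = { SET_USER(state, user) { state.user = user; } };
const getters = { getUser: (state) => state.user };

console.log(getters.getUser(state)); // null — this is what fetch() sees too early
mutations.SET_USER(state, { uid: "u1", email: "a@b.c", token: "t0k" });
console.log(getters.getUser(state).token); // "t0k"
```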
My problem is, on many of my pages, I use the fetch method to fetch data from an API and then I store this data in my vuex store.
This fetch method uses axios to make the api call, like so:
async fetch({ store }) {
  const token = store.getters['getUser'] // This is null for a few seconds
  const tempData = await axios
    .post(
      my_api_url,
      {
        my_post_body
      },
      {
        headers: {
          'Content-Type': 'application/json',
          Authorization: token
        }
      }
    )
    .then((res) => {
      return res.data
    })
    .catch((err) => {
      console.log('error', err)
      return {
        error: err
      }
    })
  store.commit('my_model/setData', tempData)
}
Axios needs my firebase user id token as part of the headers sent to the API for authorization.
When the fetch method runs, the state has not always been updated yet, so the user in the store is still null until the state change lands, usually about a second later. That is a problem, because I need that token from the store to make my API call.
How can I wait for the store's user state to finish updating (i.e. not be null) before making my axios API call inside my fetch method?
I have considered using cookies to store this information when a user logs in. Then, inside the fetch method, I could read the token from a cookie instead of waiting for the state to change. The problem with this approach is that the cookie also needs to wait for a state change before its token updates, which means it would use an old token on the initial page load. I might still opt for this solution; it just feels like the wrong way to approach this. Is there a better way to handle this kind of conundrum?
Also, inside fetch, the first load is made from the server, so I can grab the token from a cookie there; but the next load happens on the client, so how do I retrieve the token then, if the store value is still null while loading?
EDIT:
I have opted for SPA mode. After thinking long and hard about it, I don't really need the Nuxt server, and SPA mode has "server-like" behaviour: you can still use asyncData and fetch to load data before pages render, middleware works similarly, and authentication actually works without having to keep the client and server in sync with access tokens, etc. I would still like to see a better solution for this in the future, but for now SPA mode works fine.
I came across this question while looking for a solution to a similar problem. I had a solution in mind similar to the one mentioned in the other answer before coming to this question; what I was looking for was the implementation details.
I use Nuxt.js. The first approach that came to my mind was to make a layout component and render the <Nuxt/> directive only when the user is authenticated. But with that approach I can have only one layout file, and if I have more than one, I would have to implement the same pre-auth mechanism in every layout. This is doable, since it means implementing it per layout rather than per page, which is considerably less work.
I found an even better solution: middleware in Nuxt. You can return a promise, or use async/await, in the middleware to pause the application mounting process until that promise resolves. Here is the sample code:
// middleware/auth.js
export default async function ({ store, redirect, $axios, app }) {
  if (!store.state.auth) { // if the user is not authenticated
    if (!localStorage.getItem("token")) // if no token is set, just redirect to the login page
      return redirect(app.localePath('/login'))
    try {
      const token = localStorage.getItem("token");
      const res = await $axios.$get("/auth/validate", { // you can use your firebase auth mechanism code here
        headers: {
          'Authorization': `Bearer ${token}`
        }
      });
      store.commit('login', { token, user: res.user }); // now just commit a login mutation to the vuex store
    }
    catch (err) {
      store.commit('logout'); // pre-auth failed, so commit logout, which clears localStorage and the store
      return redirect(app.localePath('/login'))
    }
  }
}
Now you can use this middleware in your page/layout component, like so:
<template>
  ...
</template>

<script>
export default {
  middleware: "auth",
  ...
}
</script>
One way of fixing this is to do the Firebase login before mounting the app:
get the token from Firebase, save it in Vuex, and only then mount the app.
This ensures that by the time the pages load, the Firebase token is saved in the store.
Then add checks on the routes you don't want accessible without login: look in the store for the token (the Firebase one or another) and redirect to another route if none is present.
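The "log in before mounting" idea boils down to waiting for the first auth callback as a promise. A sketch with a simulated subscription standing in for Firebase's onAuthStateChanged (the fake user and timing below are invented):

```javascript
// Wrap the first emission of an auth subscription in a promise, so the app
// can await it before mounting. `subscribe` mimics onAuthStateChanged:
// it calls back with the user and returns an unsubscribe function.
function waitForFirstAuthState(subscribe) {
  return new Promise((resolve) => {
    const unsubscribe = subscribe((user) => {
      unsubscribe();
      resolve(user);
    });
  });
}

// Simulated subscription that fires asynchronously, like Firebase does.
const fakeSubscribe = (cb) => {
  const t = setTimeout(() => cb({ uid: "abc123", token: "fake-token" }), 10);
  return () => clearTimeout(t);
};

waitForFirstAuthState(fakeSubscribe).then((user) => {
  console.log(user.uid); // safe to commit to the store and mount the app now
});
```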

Why does dynamoose store the data only for a very short time?

I use the simple setup from the Dynamoose page.
const startUpAndReturnDynamo = async () => {
  const dynaliteServer = dynalite();
  await dynaliteServer.listen(8000);
  return dynaliteServer;
};

const createDynamooseInstance = () => {
  dynamoose.AWS.config.update({
    accessKeyId: 'AKID',
    secretAccessKey: 'SECRET',
    region: 'us-east-1'
  });
  dynamoose.local(); // This defaults to "http://localhost:8000"
}

const bootStrap = async () => {
  await startUpAndReturnDynamo();
  createDynamooseInstance();
}

bootStrap();
I can save data and get it back with Model.get(hashKey), but the data seems to be saved for less than a minute; after that, the query returns undefined.
There is a separate TTL (time to live) setting, but since I didn't use it, my data should stay in DynamoDB permanently, right?
I found the problem.
I was supposed to be using the remote DynamoDB, not the local one, so dynamoose.local() should be changed to dynamoose.ddb().
dynamoose.local() configures Dynamoose to use a local DynamoDB.
dynamoose.ddb() configures and returns the AWS.DynamoDB object.
The Dynamoose documentation is very detailed but somehow wasn't easily comprehensible to me.
I'm posting the answer in case another Dynamoose newbie faces the same problem.
