Next.js Auto Kills Process When Fetching too much data - next.js

I have been using Next.js and fetching posts from the server with getServerSideProps().
It worked fine for a while, but then it started throwing this warning:
Warning: data for page "/" is 1.57 MB which exceeds the threshold of 128 kB
Right now there are only 27 posts on the server, but that number will keep growing from hundreds into thousands. Next.js complains because it has to fetch the entire posts route and load every one of those posts before the page is rendered, which is not a good fit for data that may change constantly (e.g. if someone likes a post).
So it favours speed over content and kills the Next.js process.
Is there any way to limit the data coming from getServerSideProps(), or is there any workaround for this issue?
Here is my code if needed:
pages/index.tsx
import { fetchPosts } from '../utils';
// ...
export async function getServerSideProps() {
  const { data } = await fetchPosts();
  return {
    props: { posts: data },
  };
}
api.ts
import axios from 'axios';

const API = axios.create({ baseURL: "http://localhost:5000" });
export const fetchPosts: FetchPostsAPI = () => API.get("/posts");
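One common workaround is to paginate on the server and only send the first page of posts from getServerSideProps. A minimal sketch, assuming the /posts endpoint accepts hypothetical page and limit query parameters:
// api.ts — hypothetical paginated variant of fetchPosts
export const fetchPosts = (page = 1, limit = 20) =>
  API.get("/posts", { params: { page, limit } });

// pages/index.tsx — only the requested page of posts is serialized into the page props
export async function getServerSideProps({ query }) {
  const page = Number(query.page) || 1;
  const { data } = await fetchPosts(page);
  return {
    props: { posts: data },
  };
}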

Related

Trying to implement shopify webhooks but getting 'InternalServerError: stream is not readable'

I'm building an app for Shopify and need to add the GDPR webhooks. My back end is handled using Next.js, and I'm writing a webhook handler to verify them. The docs haven't been very helpful because they don't show how to do it with Node. This is my verification function.
import crypto from 'crypto';
import getRawBody from 'raw-body'; // assuming getRawBody comes from the 'raw-body' package
import { NextApiHandler } from 'next';

export function verifiedShopifyWebhookHandler(
  next: (req, res, body) => Promise<unknown>
): NextApiHandler {
  return async (req, res) => {
    const hmacHeader = req.headers['x-shopify-hmac-sha256'];
    const rawBody = await getRawBody(req);
    const digest = crypto
      .createHmac('sha256', process.env.SHOPIFY_API_SECRET)
      .update(rawBody)
      .digest('base64');
    if (digest === hmacHeader) {
      return next(req, res, rawBody);
    }
    const webhookId = req.headers['x-shopify-webhook-id'];
    return res.status(401).end();
  };
}
But I get this error: InternalServerError: stream is not readable
I think it has to do with how Next.js parses the incoming requests before they are sent to my API. Any ideas?
I discovered the answer. Next.js was pre-parsing the body in the context, which meant I couldn't use the raw body parser on it. By setting this:
export const config = {
  api: {
    bodyParser: false
  }
};
above the API function in the API file, it prevents Next.js from parsing the body and causing the issue. I found the answer because people had the same issue integrating Stripe while using the bodyParser.
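For reference, a minimal sketch of how the config export and the raw-body verification can sit together in one API route file (the file path is hypothetical, and getRawBody is assumed to come from the raw-body package):
// pages/api/webhooks/shopify.js (hypothetical path)
import crypto from 'crypto';
import getRawBody from 'raw-body'; // assumption: the 'raw-body' package

// Tell Next.js not to parse the body, so the raw request stream stays readable.
export const config = {
  api: {
    bodyParser: false,
  },
};

export default async function handler(req, res) {
  const rawBody = await getRawBody(req); // works because bodyParser is disabled
  const digest = crypto
    .createHmac('sha256', process.env.SHOPIFY_API_SECRET)
    .update(rawBody)
    .digest('base64');
  if (digest !== req.headers['x-shopify-hmac-sha256']) {
    return res.status(401).end();
  }
  // ...handle the verified webhook payload (rawBody) here...
  return res.status(200).end();
}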

NextJS dev server response time

My main question is: is there a difference in response time for fetching on localhost vs live/production?
I have a project I'm building in NextJS with GraphCMS, and I'm using GraphQL/graphql-request to fetch the data. When I first start up localhost and the page loads, I click a link in the page navigation to go to the about page and it literally takes 2 seconds for the data to fetch and the page to change. I'm watching the network tab in Chrome DevTools and the .json file status is (pending), then switches to 200 once the content is downloaded. Here is a screenshot from DevTools:
When I hover over the waterfall, it says the waiting for server response is 1.86s and the content download is 0.46ms. So is the waiting for server response down to me being on localhost, or is it something to do with the GraphCMS server the data is fetched from?
Also note that the JSON file size is only 5.2 kB, so it's not a large fetch.
To give you a little context on the code, my queries & client are stored in the /lib folder:
// /lib/client.js
import { GraphQLClient } from 'graphql-request'
export const graphcmsClient = () =>
  new GraphQLClient(process.env.NEXT_PUBLIC_GRAPHCMS_URL, {
    headers: {
      authorization: `Bearer ${process.env.GRAPHCMS_TOKEN}`,
    },
  })
// example of a query in ./lib/queries.js
import { gql } from 'graphql-request'
const blogPageQuery = gql`
  fragment BlogPostFields on BlogPost {
    id
    category
    content
    coverImage {
      id
      height
      url
      width
    }
    excerpt
    published
    slug
    title
  }
`
and here is an example where I'm using getStaticProps and fetching the query data:
export async function getStaticProps({ params, preview = false }) {
  const client = graphcmsClient(preview)
  const collectionCards = await getAllCollections()
  const { page, navigation } = await client.request(pageQuery, {
    slug: params.slug
  })
  if (!page) {
    return {
      notFound: true
    }
  }
  const parsedPageData = await parsePageData(page)
  return {
    props: {
      page: parsedPageData,
      navigation,
      collectionCards,
      preview
    },
    revalidate: 60
  }
}

NextJS generate page getServerSideProps to getStaticProps [duplicate]

This question already has answers here: Internal API fetch with getServerSideProps? (Next.js) (3 answers). Closed last year.
Using getServerSideProps to fetch internal API data, the TTFB is really high and my page runs slowly.
So I'm looking at other fetching strategies. My MongoDB data is not large (database size: 33.84 KB) and it does not change often, so the best fit seems to be a statically generated page; only about 25 pages would be generated in total. The problem is that getStaticProps() can't fetch the internal API (it works in development, but not in production).
Things I tried:
useEffect: slower than getServerSideProps
export the MongoDB data to a data.js file and use it as a fake API inside the project: works with getStaticProps, but I still want to store the data in the database
host the API on another domain as an external API: getStaticProps works, but the approach feels weird
hard-code each of the 25 pages (no)
Questions:
how to improve the code and the TTFB
why getStaticProps can't fetch an internal API, and why it is designed that way
I saw an article on MongoDB, here is the link: just don't use an internal API; fetch the data directly from MongoDB inside getStaticProps, as in my code here.
BEFORE
export async function getServerSideProps() {
  const response = await fetch(`${server}/api/gallery`);
  const data = await response.json();
  if (!data) {
    return {
      notFound: true,
    };
  }
  return {
    props: { data },
  };
}
AFTER
export async function getStaticProps() {
  await dbConnect() // connect to MongoDB
  const gallery = await art.find() // use the Mongoose model to fetch the data
  return {
    props: {
      data: JSON.parse(JSON.stringify(gallery))
    }
  }
}
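Since the data is small and rarely changes, Incremental Static Regeneration could keep the statically generated pages in sync with the database. A sketch adding a revalidate interval to the code above (the 60-second interval is an arbitrary choice):
export async function getStaticProps() {
  await dbConnect() // connect to MongoDB directly, no internal API call
  const gallery = await art.find()
  return {
    props: {
      data: JSON.parse(JSON.stringify(gallery))
    },
    // Re-generate the page in the background at most once every 60 seconds,
    // so database changes show up without a full rebuild.
    revalidate: 60
  }
}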

NextJS special characters routes do not work from browser

Using NextJS, I am defining some routes in getStaticPaths by making an API call:
/**
 * @dev Fetches the article route and exports the title and id to define the available routes
 */
const getAllArticles = async () => {
  const result = await fetch("https://some_api_url");
  const articles = await result.json();
  return articles.results.map((article) => {
    const articleTitle = `${article.title}`;
    return {
      params: {
        title: articleTitle,
        id: `${article.id}`,
      },
    };
  });
};
/**
 * @dev Defines the paths available to reach directly
 */
export async function getStaticPaths() {
  const paths = await getAllArticles();
  return {
    paths,
    fallback: false,
  };
}
Everything works most of the time: I can access most of the articles, Router.push works with all URLs defined.
However, when the article name includes a special character such as &, Router.push keeps working, but copying the URL that worked inside the app and pasting it into another tab returns a page saying:
An unexpected error has occurred.
In the Network tab of the inspector, the GET request shows a 404 error.
The component code mostly consists of API calls such as:
await API.put(`/set_article/${article.id}`, { object });
with API being an axios instance.
Any idea why it happens and how to make the getStaticPaths work with special characters?
When you transport values in URLs, they need to be URL-encoded. (When you transport values in HTML, they need to be HTML encoded. In JSON, they need to be JSON-encoded. And so on. Any text-based system that can transport structured data has an encoding scheme that you need to apply to data. URLs are not an exception.)
Turn your raw values in your client code
await API.put(`/set_article/${article.id}`)
into encoded ones
await API.put(`/set_article/${encodeURIComponent(article.id)}`)
It might be tempting, but don't pre-encode the values on the server-side. Do this on the client end, at the time you actually use them in a URL.
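For example, a title containing & would otherwise break the URL structure; encoding keeps the character as data rather than as a delimiter (a small illustration with a made-up title and route):
// '&' would otherwise be read as a query-string separator
const title = 'Tips & Tricks';
const url = `/articles/${encodeURIComponent(title)}`;
// -> '/articles/Tips%20%26%20Tricks'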

fetching mp3 file from MeteorJS and trying to convert it into a Blob so that I can play it

I am playing around with downloading and serving MP3 files in Meteor.
I am trying to download an MP3 file (https://www.sample-videos.com/audio/mp3/crowd-cheering.mp3) on my MeteorJS server side (to circumvent CORS issues) and then pass it back to the client to play it in an <audio> tag.
In Meteor you use the Meteor.call function to call a server method. There is not much to configure, it's just a method call and a callback.
When I run the method I receive this:
content:
"ID3���#K `�)�<H� e0�)������1������J}��e����2L����������fȹ\�CO��ȹ'�����}$A�Lݓ����3D/����fijw��+�LF�$?��`R�l�YA:A��#�0��pq����4�.W"�P���2.Iƭ5��_I�d7d����L��p0��0A��cA�xc��ٲR�BL8䝠4���T��..etc..", data:null,
headers: {
accept-ranges:"bytes",
connection:"close",
content-length:"443926",
content-type:"audio/mpeg",
date:"Mon, 20 Aug 2018 13:36:11 GMT",
last-modified:"Fri, 17 Jun 2016 18:16:53 GMT",
server:"Apache",
statusCode:200
which is the working Mp3 file (the content-length is exactly the same as the file I write to disk on the MeteorJS Server side, and it is playable).
However, the following code doesn't let me convert the response into a Blob:
MeteorObservable.call( 'episode.download', episode.url.url ).subscribe( ( result: any ) => {
  console.log( 'response', result);
  let URL = window.URL;
  let blob = new Blob([ result.content ], {type: 'audio/mpeg'} );
  console.log('blob', blob);
  let audioUrl = URL.createObjectURL(blob);
  let audioElement: any = document.getElementsByTagName('audio')[0];
  audioElement.setAttribute("src", audioUrl);
  audioElement.play();
})
When I run the code, the Blob has the wrong size and is not playable
Blob(769806) {size: 769806, type: "audio/mpeg"}
size:769806
type:"audio/mpeg"
__proto__:Blob
Uncaught (in promise) DOMException: Failed to load because no supported source was found.
On the backend I just return HTTP.get( url ) in the method, using import { HTTP } from 'meteor/http'.
I have been trying to use btoa or atob, but that doesn't work, and as far as I know it is already a base64-encoded file, right?
I am not sure why the Blob constructor creates a larger file than the source returned from the backend, and I am not sure why it is not playing.
Can anyone point me to the right direction?
Finally found a solution that uses request instead of Meteor's HTTP:
First you need to install request and request-promise-native in order to make it easy to return your result to clients.
$ meteor npm install --save request request-promise-native
Now you just return the promise of the request in a Meteor method:
server/request.js
import { Meteor } from 'meteor/meteor'
import request from 'request-promise-native'
Meteor.methods({
  getAudio (url) {
    return request.get({ url, encoding: null })
  }
})
Notice the encoding: null flag, which causes the result to be binary. I found this in a comment on an answer about downloading binary data via Node. It makes the request return a binary representation of the data instead of a string (I don't know exactly how, but it presumably falls back to a Node Buffer).
Now it gets interesting. On your client you won't receive a complex result anymore but either an Error or a Uint8Array, which makes sense because Meteor uses EJSON to send data over the wire with DDP, and the representation of binary data is a Uint8Array, as described in the documentation.
Because you can pass a Uint8Array straight into a Blob, you can now easily create the blob like so:
const blob = new Blob([uint8Array], {type: 'audio/mpeg'})
Summarizing all this into a small template, it could look like this:
client/fetch.html
<template name="fetch">
  <button id="fetchbutton">Fetch Mp3</button>
  {{#if source}}
    <audio id="player" src={{source}} preload="none" content="audio/mpeg" controls></audio>
  {{/if}}
</template>
client/fetch.js
import { Meteor } from 'meteor/meteor'
import { Template } from 'meteor/templating'
import { ReactiveVar } from 'meteor/reactive-var'
import './fetch.html'

Template.fetch.onCreated(function helloOnCreated () {
  // reactive var that will hold the blob URL for the audio element
  this.source = new ReactiveVar(null)
})

Template.fetch.helpers({
  source () {
    return Template.instance().source.get()
  },
})

Template.fetch.events({
  'click #fetchbutton' (event, instance) {
    Meteor.call('getAudio', 'https://www.sample-videos.com/audio/mp3/crowd-cheering.mp3', (err, uint8Array) => {
      const blob = new Blob([uint8Array], {type: 'audio/mpeg'})
      instance.source.set(window.URL.createObjectURL(blob))
    })
  },
})
An alternative solution is adding a REST endpoint (using Express) to your Meteor backend.
Instead of HTTP we use request and request-progress to send the data chunked in case of large files.
On the frontend I catch the chunks using https://angular.io/guide/http#listening-to-progress-events to show a loader and deal with the response.
I could listen to the download via
this.http.get( 'the URL to a mp3', { responseType: 'arraybuffer' } ).subscribe( ( res: any ) => {
  var blob = new Blob( [res], { type: 'audio/mpeg' });
  var url = window.URL.createObjectURL(blob);
  window.open(url);
} );
The above example doesn't show progress, by the way; you need to implement the progress events as explained in the Angular article. Happy to update the example with my final code when finished.
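A sketch of what listening to those progress events could look like with Angular's HttpClient (reportProgress, observe: 'events' and HttpEventType are standard @angular/common/http APIs; the loader logic is omitted):
import { HttpEventType } from '@angular/common/http';

this.http.get('the URL to a mp3', {
  responseType: 'arraybuffer',
  reportProgress: true,
  observe: 'events',
}).subscribe((event) => {
  if (event.type === HttpEventType.DownloadProgress) {
    // event.total is only defined when the server sends a content-length header
    console.log(`downloaded ${event.loaded} of ${event.total ?? '?'} bytes`);
  } else if (event.type === HttpEventType.Response) {
    const blob = new Blob([event.body], { type: 'audio/mpeg' });
    window.open(window.URL.createObjectURL(blob));
  }
});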
The Express setup on the Meteor Server:
/*
Source:http://www.mhurwi.com/meteor-with-express/
## api.class.ts
*/
import { WebApp } from 'meteor/webapp';
const express = require('express');
const trackRoute = express.Router();
const request = require('request');
const progress = require('request-progress');
export function api() {
  const app = express();

  app.use(function(req, res, next) {
    res.header("Access-Control-Allow-Origin", "*");
    res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
    next();
  });

  app.use('/episodes', trackRoute);

  trackRoute.get('/:url', (req, res) => {
    res.set('content-type', 'audio/mp3');
    res.set('accept-ranges', 'bytes');
    // The options argument is optional so you can omit it
    progress(request(req.params.url), {
      // throttle: 2000,  // Throttle the progress event to 2000ms, defaults to 1000ms
      // delay: 1000,     // Only start to emit after 1000ms delay, defaults to 0ms
      // lengthHeader: 'x-transfer-length' // Length header to use, defaults to content-length
    })
      .on('progress', function (state) {
        // The state is an object that looks like this:
        // {
        //   percent: 0.5,           // Overall percent (between 0 to 1)
        //   speed: 554732,          // The download speed in bytes/sec
        //   size: {
        //     total: 90044871,      // The total payload size in bytes
        //     transferred: 27610959 // The transferred payload size in bytes
        //   },
        //   time: {
        //     elapsed: 36.235,      // The total elapsed seconds since the start (3 decimals)
        //     remaining: 81.403     // The remaining seconds to finish (3 decimals)
        //   }
        // }
        console.log('progress', state);
      })
      .on('error', function (err) {
        // Do something with err
      })
      .on('end', function () {
        console.log('DONE');
        // Do something after request finishes
      })
      .pipe(res);
  });

  WebApp.connectHandlers.use(app);
}
and then add this to your Meteor startup:
import { Meteor } from 'meteor/meteor';
import { api } from './imports/lib/api.class';

Meteor.startup( () => {
  api();
});
