Next.js – Using aws-sdk to upload to DigitalOcean Spaces – works in dev mode but not in production

I'm using aws-sdk to upload images to a DigitalOcean Spaces bucket. On localhost it works 100%, but in production the function appears to complete without an error while the file never lands in the bucket.
I cannot figure out what is going on and can't think of a way to debug this. I also tried executing the POST request with Postman (multipart/form-data with the file added to the request body) and the result is the same: localhost works, production does not.
My API endpoint:
import AWS from 'aws-sdk'
import formidable from "formidable"
import fs from 'fs'

const s3Client = new AWS.S3({
  endpoint: process.env.DO_SPACES_URL,
  region: 'fra1',
  credentials: {
    accessKeyId: process.env.DO_SPACES_KEY,
    secretAccessKey: process.env.DO_SPACES_SECRET
  }
})

export const config = {
  api: {
    bodyParser: false
  }
}

export default async function uploadFile(req, res) {
  const { method } = req
  const form = formidable()
  const now = new Date()
  const fileGenericName = `${now.getTime()}`
  const allowedFileTypes = ['jpg', 'jpeg', 'png', 'webp']

  switch (method) {
    case "POST":
      try {
        form.parse(req, async (err, fields, files) => {
          const fileType = files.file?.originalFilename?.split('.').pop().toLowerCase()
          if (!files.file) {
            return res.status(400).json({
              status: 400,
              message: 'no files'
            })
          }
          if (allowedFileTypes.indexOf(fileType) === -1) {
            return res.status(400).json({
              message: 'bad file type'
            })
          }
          const fileName = `${fileGenericName}.${fileType}`
          try {
            s3Client.putObject({
              Bucket: process.env.DO_SPACES_BUCKET,
              Key: `${fileName}`,
              Body: fs.createReadStream(files.file.filepath),
              ACL: "public-read"
            }, (err, data) => {
              console.log(err)
              console.log(data)
            })
            const url = `${process.env.FILE_URL}/${fileName}`
            return res.status(200).json({ url })
          } catch (error) {
            console.log(error)
            throw new Error('Error Occured While Uploading File')
          }
        });
        return res.status(200)
      } catch (error) {
        console.log(error)
        return res.status(500).end()
      }
    default:
      return res.status(405).end('Method is not allowed')
  }
}
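A note on the handler above: res.status(200).json({ url }) is sent as soon as putObject is called (its callback is never awaited), and the outer return res.status(200) races with it as well. Locally the Node process stays alive long enough for the upload to finish anyway, but on a serverless host the function can be frozen once the response is sent, which would match the dev/prod difference described. A minimal sketch of awaiting the upload before responding, reusing the same env variables and formidable file shape as above and aws-sdk v2's .promise() helper (not a confirmed fix):

          // Inside form.parse, replacing the fire-and-forget putObject call:
          try {
            await s3Client.putObject({
              Bucket: process.env.DO_SPACES_BUCKET,
              Key: fileName,
              Body: fs.createReadStream(files.file.filepath),
              ACL: 'public-read'
            }).promise() // resolves only after Spaces has accepted the object
            const url = `${process.env.FILE_URL}/${fileName}`
            return res.status(200).json({ url })
          } catch (error) {
            console.log(error)
            return res.status(500).json({ message: 'upload failed' })
          }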

Related

Axios store.getState() is undefined in Next.js API calls (Redux, Next.js, JWT)

I am trying to set up authentication for a project. Once a user signs up for our app they get sent to our home page with an id in the query. This id is then used to submit the user, and the JWT token gets saved inside Redux state.
All our calls now go through an axios client where the JWT token is passed on every request. The token gets read with store.getState(injectStore).
This all works fine inside getServerSideProps, but the issue comes in with calls from the frontend that go through Next.js's built-in 'pages/api' folder. Any call inside those routes causes store.getState() to be undefined. I do not understand why, since it uses the exact same client as getServerSideProps.
Example getServerSideProps (working):
try {
  const response = await serverApiClient.get('v1/config');
  return {
    props: {},
  };
} catch ({ error: { statusCode = 500, message = 'Internal Server Error' } }) {
  if (statusCode === 401) {
    return {
      redirect: {
        permanent: false,
        destination: '/',
      },
    };
  }
  throw new Error(message as string);
}
};
Example frontend BFF call (not working):
try {
  // The call below gets sent to the Next.js built-in API
  const players = await apiClient.get(`/defenders?sortBy=${statId}&team_id=${teamShortName}`);
  return players;
} catch (error) {
  return { error };
}
};

export default async function handler(req: NextApiRequest) {
  console.log('Start request')
  try {
    const { sortBy, team_id: teamId } = req.query;
    const response = await serverApiClient.get(`/v1/players/picks?position=DEF&sort_by=${sortBy}&team_id=${teamId}`);
API Client:
mergeConfigs(
  params: Record<string, string>,
  headers: Record<string, string>,
  configs: Record<string, string>,
): AxiosRequestConfig {
  const defaultConfigs = ApiClient.getDefaultConfigs();
  const token = store?.getState()?.jwtToken?.value
  // ISSUE ABOVE - this store.getState() is only undefined in Next.js api folder calls.
  return {
    ...defaultConfigs,
    ...configs,
    params,
    headers: {
      ...defaultConfigs.headers,
      ...headers,
      ...(token ? { Authorization: `Bearer ${token}` } : {}),
    },
  };
}

get(
  uri: string,
  params = {},
  headers = {},
  configs = {},
): Promise<AxiosResponse | any> {
  return this.client
    .get(uri, this.mergeConfigs(params, headers, configs))
    .then((response) => {
      return (response.data ? response.data : response);
    })
    .catch((error) => {
      const errorObject = {
        error: error?.response?.data,
      };
      throw Object.assign(errorObject);
    });
}
If anyone has advice on why store.getState() is undefined in these frontend-to-backend calls, please assist. Thanks all!
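A likely reason for the difference, offered as an assumption rather than a confirmed diagnosis: the Redux store wired up via injectStore exists in the browser bundle (and in the request that renders getServerSideProps), but a pages/api route runs in its own server context where that store was never created, so store?.getState() comes back undefined there. One workaround is to stop reading the store inside the API route and forward the token with the incoming request instead; a rough sketch using the question's get(uri, params, headers, configs) signature (the route file, import path, and header forwarding are all assumptions, not the asker's code):

// pages/api/defenders.js - hypothetical sketch
import { serverApiClient } from '../../lib/apiClient'; // import path is assumed

export default async function handler(req, res) {
  // The browser-side apiClient attaches Authorization while it can still read the store,
  // so the API route simply forwards that header instead of calling store.getState() here.
  const token = req.headers.authorization ?? '';
  const { sortBy, team_id: teamId } = req.query;
  const response = await serverApiClient.get(
    `/v1/players/picks?position=DEF&sort_by=${sortBy}&team_id=${teamId}`,
    {},                        // params
    { Authorization: token }   // headers, passed through explicitly
  );
  return res.status(200).json(response);
}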

How to upload a FormData file to Pinata?

I am trying to upload a base64 file to Pinata, but my FormData seems malformed for an unknown reason.
I create the FormData in index.js and send it to the Next.js API like so:
// file is a base64 string
_createNFTFormDataFile = async (name, description, file) => {
  try {
    const formData = new FormData()
    formData.append('name', name)
    formData.append('description', description)
    formData.append('file', file)
    // formdata logs correctly
    for (let pair of formData.entries()) {
      console.log(pair[0] + ', ' + pair[1]);
    }
    const { data } = await axios.post('/api/upload', formData, {
      headers: { 'Content-Type': 'multipart/form-data' }
    })
  } catch (ex) {
    console.error(ex)
  }
}
The call goes through page/api/middleware/middleware.js
import nextConnect from 'next-connect'
import multiparty from 'multiparty'

const middleware = nextConnect()

middleware.use((req, res, next) => {
  const form = new multiparty.Form()
  form.parse(req, function (err, fields, files) {
    if (err) {
      console.log(err)
      next()
    }
    req.body = fields
    req.files = files
    next()
  })
})

export default middleware
It is then passed to the handler in ./page/api/upload.js.
handler.post(async function handlePost ({ body, files }, response) {
  try {
    const fileUrl = await uploadFileToIPFS(files.file[0]) //
    const metadata = {
      name: body.name[0],
      description: body.description[0],
      image: fileUrl
    }
    const metadaUrl = await uploadJsonToIPFS(metadata, body.name[0])
    return response.status(200).json({
      url: metadaUrl
    })
  } catch (error) {
    console.log('Error uploading file: ', error)
  }
})
However, I can't retrieve the files here and get this error: Error uploading file: TypeError: Cannot read properties of undefined (reading '0').
console.log(body, files) here gives: {"undefined":["[object Promise]"]} {}.
Why can't I retrieve the FormData here?
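Two things stand out here, offered as assumptions rather than a verified fix: the logged value "[object Promise]" suggests file is still an unresolved Promise at the moment it is appended to the FormData, and the upload route also needs Next.js's bodyParser disabled (as in the other snippets on this page) so multiparty can read the raw multipart stream. A minimal sketch of both points:

// index.js - await the base64 string before appending it (sketch)
_createNFTFormDataFile = async (name, description, file) => {
  const resolvedFile = await Promise.resolve(file) // no-op if file is already a string
  const formData = new FormData()
  formData.append('name', name)
  formData.append('description', description)
  formData.append('file', resolvedFile)
  return axios.post('/api/upload', formData) // let axios set the multipart boundary itself
}

// pages/api/upload.js - so Next.js hands the raw body to multiparty
export const config = {
  api: {
    bodyParser: false
  }
}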

What's the proper way to return a response using Formidable in a Next.js API route?

I'm sending an uploaded file to a Next.js API route using FormData. The file is then processed on the API route using formidable and passed to the Sanity client in order to upload the asset, but I can't return the data to the client. I get this message in the console:
API resolved without sending a response for /api/posts/uploadImage, this may result in stalled requests.
When console-logging the document inside the API, everything is there; I just can't send that response back to the client side. Here's my client upload function:
const addPostImage = (e) => {
  const selectedFile = e.target.files[0];
  if (
    selectedFile.type === "image/jpeg" ||
    selectedFile.type === "image/png" ||
    selectedFile.type === "image/svg" ||
    selectedFile.type === "image/gif" ||
    selectedFile.type === "image/tiff"
  ) {
    const form = new FormData();
    form.append("uploadedFile", selectedFile);
    axios
      .post("/api/posts/uploadImage", form, {
        headers: { "Content-Type": "multipart/form-data" },
      })
      .then((image) => {
        setPostImage(image);
        toast.success("Image uploaded!");
      })
      .catch((error) => {
        toast.error(`Error uploading image ${error.message}`);
      });
  } else {
    setWrongImageType(true);
  }
};
This is my API:
import { client } from "../../../client/client";
import formidable from "formidable";
import { createReadStream } from "fs";

export const config = {
  api: {
    bodyParser: false,
  },
};

export default async (req, res) => {
  const form = new formidable.IncomingForm();
  form.keepExtensions = true;
  form.parse(req, async (err, fields, files) => {
    const file = files.uploadedFile;
    const document = await client.assets.upload(
      "image",
      createReadStream(file.filepath),
      {
        contentType: file.mimetype,
        filename: file.originalFilename,
      }
    );
    console.log(document);
    res.status(200).json(document);
  });
};
Solution:
As stated in the comments by @juliomalves, I had to promisify the form-parsing function and await its result, like so:
import { client } from "../../../client/client";
import formidable from "formidable";
import { createReadStream } from "fs";

export const config = {
  api: {
    bodyParser: false,
  },
};

export default async (req, res) => {
  const form = new formidable.IncomingForm();
  form.keepExtensions = true;
  const formPromise = await new Promise((resolve, reject) => {
    form.parse(req, async (err, fields, files) => {
      if (err) reject(err);
      const file = files.uploadedFile;
      const document = await client.assets.upload(
        "image",
        createReadStream(file.filepath),
        {
          contentType: file.mimetype,
          filename: file.originalFilename,
        }
      );
      resolve(document);
    });
  });
  res.json(formPromise);
};
Then I checked for the response's status on the client-side.
Your code is not working because, by default, formidable saves files to disk, which is not available on Vercel. The following works:
import { Writable } from 'stream'
import formidable from 'formidable'

const chunks = []
let buffer;

const form = formidable({
  fileWriteStreamHandler: (/* file */) => {
    const writable = new Writable();
    // eslint-disable-next-line no-underscore-dangle
    writable._write = (chunk, enc, next) => {
      chunks.push(chunk);
      next();
    };
    return writable;
  },
})

form.parse(req, (err, fields) => {
  if (err) {
    res.end(String(err));
    return;
  }
  buffer = Buffer.concat(chunks);
  res.end();
});
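Tying that back to the Sanity upload earlier in this question: once the file is held in memory, the concatenated buffer can replace the createReadStream(file.filepath) call. A short sketch, assuming assets.upload accepts a Node Buffer and with placeholder contentType/filename values:

// Inside form.parse (the callback needs to be async), after buffer = Buffer.concat(chunks):
const document = await client.assets.upload('image', buffer, {
  contentType: 'image/png',       // placeholder - use the uploaded file's real mimetype
  filename: 'uploaded-image.png'  // placeholder - use the real original filename
});
res.status(200).json(document);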

FormData using BusBoy for Firebase works with serve but not with deploy

Situation
I have a Firebase function that updates the user image.
Problem
When I run my function locally using firebase serve, I successfully upload the image to Firestore using Postman. However, when I run firebase deploy and try to upload the image using Postman, I get a 500 Internal Server Error. The other functions (which don't deal with FormData, just JSON) work perfectly when I deploy them.
I don't understand why it works locally but not on deploy when I am doing the exact same thing. I'm not sure if there is something in the config I am missing, or if I am doing something wrong. Any help would be appreciated!
Code
users.js
const { admin, db, firebase } = require('../util/admin');
const config = require('../util/config');

exports.postUserImage = (req, res) => {
  const BusBoy = require('busboy');
  const path = require('path');
  const os = require('os');
  const fs = require('fs');

  let imgFileName;
  let imgToBeUploaded = {};

  const busboy = new BusBoy({ headers: req.headers });

  busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
    // Invalid file type
    if (mimetype !== 'image/jpeg' && mimetype !== 'image/png') {
      return res.status(400).json({ error: 'Invalid file type' });
    }
    // Extract img extension
    const imgDotLength = filename.split('.').length;
    const imgExtension = filename.split('.')[imgDotLength - 1];
    // Create img file name
    imgFileName = `${Math.round(Math.random() * 1000000)}.${imgExtension}`;
    // Create img path
    const filepath = path.join(os.tmpdir(), imgFileName);
    // Create img object to be uploaded
    imgToBeUploaded = { filepath, mimetype };
    // Use file system to create the file
    file.pipe(fs.createWriteStream(filepath));
  });

  busboy.on('finish', () => {
    admin
      .storage()
      .bucket()
      .upload(imgToBeUploaded.filepath, {
        resumable: false,
        metadata: {
          metadata: {
            contentType: imgToBeUploaded.mimetype
          }
        }
      })
      .then(() => {
        // Create img url to add to our user
        const imgUrl = `https://firebasestorage.googleapis.com/v0/b/${config.storageBucket}/o/${imgFileName}?alt=media`;
        // Add img url to user document
        return db.doc(`/users/${req.user.handle}`).update({ imgUrl });
      })
      .then(() => {
        return res.json({ message: 'Image uploaded successfully' });
      })
      .catch((err) => {
        console.error(err);
        return res.status(500).json({ error: err });
      });
  });

  busboy.end(req.rawBody);
};
index.js
const { app, functions } = require('./util/admin');
const FirebaseAuth = require('./util/firebaseAuth');
const {
  postUserImage,
} = require('./handlers/users');

app.post('/user/image', FirebaseAuth, postUserImage);

Uploading a form-posted image buffer to Cloud Storage with Firebase Functions

Here's my Cloud Function. It's supposed to receive an HTTP-posted image and upload it to Storage, returning the URL.
exports.uploadImageToEditor = functions.https.onRequest((req, res) => {
  const img = JSON.parse(JSON.stringify(req.body));
  const bucket = admin.storage().bucket();
  return bucket.file('blog/foo.jpg').save(img.data, {
    resumable: false,
    metadata: {
      contentType: 'image/jpeg'
    }
  })
    .then(() => {
      return cors(req, res, () => {
        res.status(200).send({ "url": bucket.file('foo.jpg').getSignedUrl() });
      });
    });
});
This is how the image is actually sent in the client:
uploadImage(file, endPoint) {
  if (!endPoint) {
    throw new Error('Image Endpoint isn`t provided or invalid');
  }
  const formData = new FormData();
  if (file) {
    formData.append('file', file);
    const req = new HttpRequest('POST', endPoint, formData, {
      reportProgress: true
    });
    return this._http.request(req);
  } else {
    throw new Error('Invalid Image');
  }
}
I think you're probably looking for the save() method on File in the Admin SDK.
const bucket = admin.storage().bucket()
  .file('my-file.jpg').save(blob)
  .then(() => { /* ... */ });
You can also get back information about the file this way.
export const uploadImage = async (destination: string, image: Buffer) => {
  const file = storage.bucket().file(destination);
  await file.save(image, { contentType: yourContentType });
  return file.publicUrl();
};
If you get a permission error, go to the Firebase Storage rules and add this rule to allow writing to the directory:
service firebase.storage {
  match /b/{bucket}/o {
    match /blog/{anyPath=**} {
      allow read;
      allow write;
    }
  }
}
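A possible way to wire the uploadImage helper above into the original onRequest function, purely as a sketch: the field name and base64 decoding are assumptions about what the client actually posts, and cors, functions, and storage are the same objects used earlier in this question.

// Hypothetical usage of the helper above (request shape is an assumption):
exports.uploadImageToEditor = functions.https.onRequest(async (req, res) => {
  const img = req.body;                            // e.g. { data: '<base64 string>' }
  const buffer = Buffer.from(img.data, 'base64');  // turn the posted data into a Buffer
  const url = await uploadImage('blog/foo.jpg', buffer);
  return cors(req, res, () => res.status(200).send({ url }));
});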
