Firebase functions deploy failed: illegal operation on a directory, read [duplicate]

I'm trying to download an image from a URL and then upload it to my Firebase Cloud Storage.
This is the code I'm using:
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
const download = require('image-downloader');
const tmp = require('tmp');

export const downloadFunction = functions.https.onCall(async (data, context) => {
  var bucket = admin.storage().bucket();
  await tmp.dir(async function _tempDirCreated(err: any, path: any) {
    if (err) throw err;
    const options = {
      url: 'theUrlIWantToPutInTheStorage',
      dest: path,
    }
    console.log('Dir: ', path);
    await download.image(options)
      .then(async () => {
        console.log('Saved');
        await bucket.upload(path, {
          destination: "testfolder/test.jpg",
          metadata: "metadata",
        });
      })
      .catch((err2: any) => console.error(err2))
  });
});
But from the Firebase console (logs) I get this error:
{ Error: EISDIR: illegal operation on a directory, read errno: -21, code: 'EISDIR', syscall: 'read' }
What am I doing wrong?
Thanks in advance!

The path that you provide to the upload method should point to a file, not a directory.
upload(pathString[, options][, callback]) → Promise&lt;UploadResponse&gt;
Upload a file to the bucket. This is a convenience method that wraps File#createWriteStream.
Example:
const options = {
  destination: 'new-image.png',
  resumable: true,
  validation: 'crc32c',
  metadata: {
    metadata: {
      event: 'Fall trip to the zoo'
    }
  }
};

bucket.upload('local-image.png', options, function(err, file) {
  // Your bucket now contains:
  // - "new-image.png" (with the contents of `local-image.png`)
  // `file` is an instance of a File object that refers to your new file.
});
https://googleapis.dev/nodejs/storage/latest/Bucket.html
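For reference, here is a minimal sketch of how the function in the question could be adjusted so that upload receives a file path rather than the temp directory itself. It uses os.tmpdir() with a fixed file name instead of the tmp package, assumes image-downloader resolves with the saved file's path (as its README describes), and drops the invalid string metadata option:
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
import * as os from 'os';
import * as path from 'path';
const download = require('image-downloader');

export const downloadFunction = functions.https.onCall(async (data, context) => {
  const bucket = admin.storage().bucket();
  // Download to a concrete file inside the OS temp directory (e.g. /tmp/test.jpg),
  // not to the directory itself.
  const options = {
    url: 'theUrlIWantToPutInTheStorage',
    dest: path.join(os.tmpdir(), 'test.jpg'), // hypothetical file name
  };
  const { filename } = await download.image(options); // resolves with the saved path
  console.log('Saved', filename);
  // Upload the downloaded file (a file path, not a directory) to the bucket.
  await bucket.upload(filename, { destination: 'testfolder/test.jpg' });
  return { uploaded: true };
});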

Related

How to upload a file from UE4 to a JS server using multer

I'm currently working on a project that requires me to send a PNG image from Unreal Engine to a Next.js server, which uses multer to pass the file on to another server.
When sending my file as binary, the JS server (the intermediate server) does not receive a file from Unreal.
I've tried the following two methods:
TArray<uint8> rawFileData;
FFileHelper::LoadFileToArray(rawFileData, *media);
Request->SetURL(API_HP_URL + "nude_upload");
Request->SetHeader(TEXT("Content-Type"), TEXT("multipart/form-data; boundary=----WebKitFormBoundarywpp9S2IUDici8hpI"));
Request->SetHeader(TEXT("Connection"), TEXT("keep-alive"));
Request->SetHeader(TEXT("accept"), TEXT("application/json, text/plain, */*"));
Request->SetContent(rawFileData);
Request->SetVerb("POST");
Request->OnProcessRequestComplete().BindUObject(this, &AHttpCommunicator::OnPostNudeSSResponse);
Request->ProcessRequest();
and
FString JsonString;
TArray<uint8> rawFileData;
TSharedRef<TJsonWriter<TCHAR>> JsonWriter = JsonWriterFactory<TCHAR>::Create(&JsonString);
JsonWriter->WriteObjectStart();
JsonWriter->WriteValue("fileName", pPathToFile);
JsonWriter->WriteValue("file", FBase64::Encode(rawFileData));
JsonWriter->WriteObjectEnd();
JsonWriter->Close();
Request->SetURL(API_HP_URL + "nude_upload");
Request->SetHeader(TEXT("Content-Type"), TEXT("multipart/form-data; boundary=----WebKitFormBoundarywpp9S2IUDici8hpI"));
Request->SetHeader(TEXT("Connection"), TEXT("keep-alive"));
Request->SetHeader(TEXT("accept"), TEXT("application/json, text/plain, */*"));
Request->SetContentAsString(JsonString);
Request->SetVerb("POST");
Request->OnProcessRequestComplete().BindUObject(this, &AHttpCommunicator::OnPostNudeSSResponse);
Request->ProcessRequest();
Both of these methods result in the server receiving an undefined file object.
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import path from 'path';
import MulterGoogleCloudStorage from "multer-google-storage";
import nextConnect from 'next-connect';
const Multer = require('multer');
const { Storage } = require('@google-cloud/storage');

const CLOUD_BUCKET = 'nude_locks';
const PROJECT_ID = 'hp-production-338902';
const KEY_FILE = path.resolve('./hp-production-key.json')

const storage = new Storage({
  projectId: PROJECT_ID,
  keyFilename: KEY_FILE
});
const bucket = storage.bucket(CLOUD_BUCKET);

const upload = Multer({
  storage: Multer.memoryStorage(),
  limits: {
    fileSize: 5 * 1024 * 1024,
  }
}).single('file');

const apiRoute = nextConnect({
  onNoMatch(req, res) {
    res.status(405).json({ error: `Method '${req.method}' Not Allowed` });
  },
});
apiRoute.use(upload);
apiRoute.post((req, res) => {
  console.log(req.file);
  if (!req.file) {
    res.status(400).send("No file uploaded.");
    return;
  }
  const blob = bucket.file(req.file.originalname);
  // Make sure to set the contentType metadata for the browser to be able
  // to render the image instead of downloading the file (default behavior)
  const blobStream = blob.createWriteStream({
    metadata: {
      contentType: req.file.mimetype
    }
  });
  blobStream.on("error", err => {
    next(err);
    return;
  });
  blobStream.on("finish", () => {
    console.log('finish');
    console.log(blob);
    // The public URL can be used to directly access the file via HTTP.
    const publicUrl = `https://storage.googleapis.com/${bucket.name}/${blob.name}`;
    // Make the image public to the web (since we'll be displaying it in browser)
    blob.makePublic().then(() => {
      res.status(200).send(`Success!\n Image uploaded to ${publicUrl}`);
    });
  });
  blobStream.end(req.file.buffer);
});
export default apiRoute;

export const config = {
  api: {
    bodyParser: false,
  },
}
const fileSelectedHandler = e => {
  const file = new File("D:/_Spectre/VHS/P210107_VHS_Configurator/P04_Unreal/Human_Configurator/Saved/Screenshots/Windows/-1-nude-2022-2-13.png");
  console.log(file);
  const formData = new FormData();
  formData.append('file', file);
  axios.post('/api/nude_upload', formData, {
    headers: {
      'Content-Type': 'multipart/form-data',
    }
  })
    .then(res => {
      console.log(res);
    });
}
Is there a way to create a file object from UE4?
Alternatively, is there a way to retrieve Google Cloud Storage access tokens from UE4?
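For the second (JSON + base64) attempt, one server-side option is to skip multer entirely and decode the payload before writing it to the bucket. A minimal sketch, assuming the UE4 client sends { fileName, file } as application/json (not the multipart Content-Type shown above); the route name is hypothetical, while the bucket and key file are taken from the question:
// pages/api/nude_upload_base64.ts -- hypothetical route; relies on Next.js's
// default JSON bodyParser (raise its ~1 MB sizeLimit for larger screenshots).
import type { NextApiRequest, NextApiResponse } from 'next';
import path from 'path';
const { Storage } = require('@google-cloud/storage');

const storage = new Storage({
  projectId: 'hp-production-338902',
  keyFilename: path.resolve('./hp-production-key.json'),
});
const bucket = storage.bucket('nude_locks');

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== 'POST') {
    res.status(405).json({ error: `Method '${req.method}' Not Allowed` });
    return;
  }
  // Expecting the body produced by the second UE4 snippet: { fileName, file }
  const { fileName, file } = req.body;
  if (!fileName || !file) {
    res.status(400).send('No file uploaded.');
    return;
  }
  const buffer = Buffer.from(file, 'base64');
  const blob = bucket.file(path.basename(fileName));
  // file.save() wraps createWriteStream and writes the whole buffer in one call.
  await blob.save(buffer, { metadata: { contentType: 'image/png' } });
  res.status(200).send(`Uploaded ${blob.name}`);
}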

What might cause the firebase console to show the 'waiting' icon (spinning blue loop icon) after adding a file via cloud functions to cloud storage?

So, I can now use a Google Cloud Function to store a file in Cloud Storage, and after uploading a file I can see its name in the Firebase console, i.e. within the Storage area as you drill down through the various folders.
I can also query, retrieve and view the file, so it seems to be stored OK.
However, I don't understand why I cannot click on the file in the Firebase console and see a preview of it.
My only wild guess is that the Cloud Function that uploads the file didn't release a file handle or something, so the Firebase console thinks the file is locked.
The file attributes shown in the Firebase console seem correct.
The Cloud Function code that uploads the file is as follows:
const imageBuffer = Buffer.from(arr[1], 'base64');
const bufferStream = new stream.PassThrough();
bufferStream.end(imageBuffer);
const fullPath = `${filePath}/${data.filename}`
console.log('saving to fullPath', fullPath)
const myFile = bucket.file(fullPath)
bufferStream.pipe(myFile.createWriteStream({
    metadata: {
      contentType: mime
    },
    public: true,
    validation: "md5"
  }))
  .on('error', function (err) {
    console.log('error from image upload', err);
  })
  .on('finish', function () {
    console.log('!!!!!! finished')
  })
where arr[1] is the base64 portion of a data string, i.e. with the MIME type prefix removed, so arr[1] is the pure file data as base64.
So, basically everything seems to work perfectly, except that the Firebase console can't preview the file unless I redeploy the app, i.e. npm run build && firebase deploy && npm start. That seems (?) to free up the lock, and I can then view the image preview in the Firebase console for Firebase Storage.
Sending null at the end of the data as per the comments fixes this.
Revisiting this: rather than using the PassThrough stream, which adds unnecessary overhead, you can write your buffer to the write stream directly.
const imageBuffer = Buffer.from(arr[1], 'base64');
const fullPath = `${filePath}/${data.filename}`
console.log('saving to fullPath', fullPath)
const myFile = bucket.file(fullPath)
const fileWriteStream = myFile
  .createWriteStream({
    metadata: {
      contentType: mime
    },
    public: true,
    validation: "md5"
  })
  .on('error', function (err) {
    console.log('error from image upload', err);
  })
  .on('finish', function () {
    console.log('!!!!!! finished')
  });
fileWriteStream.end(imageBuffer);
Rewritten as a promise-based utility function:
import { File } from "@google-cloud/storage";
import { storage } from "firebase-admin";

async function uploadDataURI(
  dataURI: string,
  options: { bucket?: string, filename: string, filepath: string, public?: boolean, contentType?: string }
): Promise<File> {
  // "public" is a reserved word in strict mode, so rename it while destructuring
  const { bucket, filename, filepath, public: isPublic, contentType } = options;
  // split the data URI into scheme, media type and data
  const [, mediaType, data] = dataURI.split(/[:,]/);
  let contentTypeFromURI: string, buffer: Buffer;
  if (mediaType.endsWith(";base64")) {
    contentTypeFromURI = mediaType.slice(0, -7);
    buffer = Buffer.from(data, "base64");
  } else {
    contentTypeFromURI = mediaType;
    buffer = Buffer.from(decodeURIComponent(data));
  }
  const storageRef = storage()
    .bucket(bucket)
    .file(`${filepath}/${filename}`);
  return new Promise((resolve, reject) => {
    try {
      const fileWriteStream = storageRef
        .createWriteStream({
          metadata: {
            contentType: contentType || contentTypeFromURI || undefined // coerce falsy values to undefined
          },
          public: isPublic,
          validation: "md5"
        })
        .on('error', reject)
        .on('finish', () => resolve(storageRef));
      fileWriteStream.end(buffer);
    } catch (err) {
      reject(err);
    }
  });
}
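For example, the utility could then be called from the original function like this (a sketch; it assumes the un-split data URI string, filePath and data.filename are available as in the question):
// Hypothetical call site inside the original Cloud Function.
const uploaded = await uploadDataURI(dataURI, {
  filepath: filePath,
  filename: data.filename,
  public: true,
});
console.log('Uploaded to', uploaded.name);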

FormData using BusBoy for firebase works in serve but not in deploy

Situation
I have a Firebase function that updates the user's image.
Problem
When I run my function locally using firebase serve, I can successfully upload the image to Firestore using Postman. However, when I run firebase deploy and try to upload the image using Postman, I get a 500 Internal Server Error. The other functions (which deal only with JSON, not FormData) work perfectly when I deploy them.
I don't understand why it works locally but not when deployed, since I am doing the exact same thing. I'm not sure whether I'm missing something in the config or doing something wrong. Any help would be appreciated!
Code
users.js
const { admin, db, firebase } = require('../util/admin');
const config = require('../util/config');

exports.postUserImage = (req, res) => {
  const BusBoy = require('busboy');
  const path = require('path');
  const os = require('os');
  const fs = require('fs');

  let imgFileName;
  let imgToBeUploaded = {};

  const busboy = new BusBoy({ headers: req.headers });
  busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
    // Invalid file type
    if (mimetype !== 'image/jpeg' && mimetype !== 'image/png') {
      return res.status(400).json({ error: 'Invalid file type' });
    }
    // Extract img extension
    const imgDotLength = filename.split('.').length;
    const imgExtension = filename.split('.')[imgDotLength - 1];
    // Create img file name
    imgFileName = `${Math.round(Math.random() * 1000000)}.${imgExtension}`;
    // Create img path
    const filepath = path.join(os.tmpdir(), imgFileName);
    // Create img object to be uploaded
    imgToBeUploaded = { filepath, mimetype };
    // Use file system to create the file
    file.pipe(fs.createWriteStream(filepath));
  });
  busboy.on('finish', () => {
    admin
      .storage()
      .bucket()
      .upload(imgToBeUploaded.filepath, {
        resumable: false,
        metadata: {
          metadata: {
            contentType: imgToBeUploaded.mimetype
          }
        }
      })
      .then(() => {
        // Create img url to add to our user
        const imgUrl = `https://firebasestorage.googleapis.com/v0/b/${config.storageBucket}/o/${imgFileName}?alt=media`;
        // Add img url to user document
        return db.doc(`/users/${req.user.handle}`).update({ imgUrl });
      })
      .then(() => {
        return res.json({ message: 'Image uploaded successfully' });
      })
      .catch((err) => {
        console.error(err);
        return res.status(500).json({ error: err });
      });
  });
  busboy.end(req.rawBody);
};
index.js
const { app, functions } = require('./util/admin');
const FirebaseAuth = require('./util/firebaseAuth');
const {
  postUserImage,
} = require('./handlers/users');

app.post('/user/image', FirebaseAuth, postUserImage);
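One thing worth checking when behaviour differs between the local emulator and the deployed runtime is the installed busboy major version: with busboy 1.x the constructor is a plain factory function and the 'file' event signature changed, so code written against the 0.x API breaks at runtime. This is only a possible cause, not confirmed for the code above; a sketch of the 1.x-style usage, for comparison:
import * as functions from 'firebase-functions';
const busboy = require('busboy');

// busboy 1.x sketch for comparison only; the handler above targets the 0.x API.
export const postUserImageV1 = functions.https.onRequest((req, res) => {
  const bb = busboy({ headers: req.headers }); // 1.x: lowercase factory, no `new`
  bb.on('file', (fieldname: string, file: NodeJS.ReadableStream, info: any) => {
    const { filename, encoding, mimeType } = info; // 1.x: metadata moved into `info`
    file.resume(); // drained here; pipe to a temp file as in the code above
  });
  bb.on('close', () => { // 1.x: 'close' replaces the old 'finish' event
    res.json({ message: 'parsed' });
  });
  bb.end(req.rawBody);
});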

How to create a folder in Firebase Storage using Admin API

Aim: to upload a file into a folder within Firebase Storage
E.g.
default_bucket/folder1/file1
default_bucket/folder1/file2
default_bucket/folder2/file3
Using Firebase client-side I am able to upload a file to a folder within Firebase Storage like this:
const storageRef = firebase.storage().ref();
const fileRef = storageRef.child(`${folder}/${filename}`);
const metadata = {
  contentType: file.type,
  customMetadata: { }
};
return fileRef.put(file, metadata);
If the folder does not exist, it gets created.
However, I have not managed to do the same server-side using the Admin SDK.
The code below uploads the file into the default bucket.
But I want to upload the file into a named folder within the default bucket.
The client side makes a POST request to the Cloud Function, sending the file and a folder name.
Busboy is used to extract the folder name and file and pass them to the upload function, which uploads the file and then returns a download link for it.
index.js
const task = require('./tasks/upload-file-to-storage');

app.post('/upload', (req, res, next) => {
  try {
    let uploadedFilename;
    let folder;
    if (req.method === 'OPTIONS') {
      optionsHelper.doOptions(res);
    } else if (req.method === 'POST') {
      res.set('Access-Control-Allow-Origin', '*');
      const busboy = new Busboy({ headers: req.headers });
      const uploads = [];
      busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
        uploadedFilename = `${folder}^${filename}`;
        const filepath = path.join(os.tmpdir(), uploadedFilename);
        uploads.push({ file: filepath, filename: filename, folder: folder });
        file.pipe(fs.createWriteStream(filepath));
      });
      busboy.on('field', (fieldname, val) => {
        if (fieldname === 'folder') {
          folder = val;
        }
      });
      busboy.on('finish', () => {
        if (uploads.length === 0) {
          res.end('no files found');
        }
        for (let i = 0; i < uploads.length; i++) {
          const upload = uploads[i];
          const file = upload.file;
          task.uploadFile(helpers.fbAdmin, upload.folder, upload.file, uploadedFilename).then(downloadLink => {
            res.write(`${downloadLink}\n`);
            fs.unlinkSync(file);
            res.end();
          });
        }
      });
      busboy.end(req.rawBody);
    } else {
      // Client error - only support POST
      res.status(405).end();
    }
  } catch (e) {
    console.error(e);
    res.sendStatus(500);
  }
});

const api = functions.https.onRequest(app);

module.exports = {
  api
};
upload-file-to-storage.js
exports.uploadFile = (fbAdmin, folder, filepath, filename) => {
  // get the bucket to upload to
  const bucket = fbAdmin.storage().bucket(); //`venture-spec-sheet.appspot.com/${folder}`
  const uuid = uuid();
  // Uploads a local file to the bucket
  return bucket
    .upload(filepath, {
      gzip: true,
      metadata: {
        //destination: `/${folder}/${filename}`,
        cacheControl: 'public, max-age=31536000',
        firebaseStorageDownloadTokens: uuid
      }
    })
    .then(() => {
      const d = new Date();
      const expires = d.setFullYear(d.getFullYear() + 50);
      // get file from the bucket
      const myFile = fbAdmin
        .storage()
        .bucket()
        .file(filename);
      // generate a download link and return it
      return myFile.getSignedUrl({ action: 'read', expires: expires }).then(urls => {
        const signedUrl = urls[0];
        return signedUrl;
      });
    });
};
I've tried a few things:
Setting the bucket name to default and a folder. This resulted in a server error.
const bucket = fbAdmin.storage().bucket(`${defaultName}/${folder}`);
Setting the bucket name to the folder. This resulted in a server error.
const bucket = fbAdmin.storage().bucket(folder);
And I've also tried using the destination property of uploadOptions, but this still puts the file in the default bucket:
.upload(filepath, {
  gzip: true,
  metadata: {
    destination: `${folder}/${filename}`, // and /${folder}/${filename}
  }
})
Is it possible to upload to a folder using the Admin SDK?
E.g. I want to upload a file so that it is placed in a named "folder".
I.e. so I can reference the file at the path: bucket/folder/file.jpg
In the example below, each "folder" is named with a firebase key.
Found the problem.
I stupidly declared the destination option in the wrong place.
Instead of in the metadata object:
return bucket
  .upload(filepath, {
    gzip: true,
    metadata: {
      destination: `${folder}/${filename}`,
      cacheControl: 'public, max-age=31536000',
      firebaseStorageDownloadTokens: uuid
    }
  })
It should have been on the options object:
return bucket
  .upload(filepath, {
    gzip: true,
    destination: `${folder}/${filename}`,
    metadata: {
      cacheControl: 'public, max-age=31536000',
      firebaseStorageDownloadTokens: uuid
    }
  })
With this change, the file now gets uploaded into a named "folder".
There is a "Create folder" option beside the "Upload file" button for a bucket in the Storage console, so you can create folders in a bucket and upload files to them from the console. To create such folders in a bucket using the Admin SDK, add the folder path before the file reference, e.g.
const blob = bucket.file('folder1/folder2/' + req.file.originalname);
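As a sketch of the same idea when writing a buffer with the Admin SDK (the bucket, object path and content type are placeholders): any prefix segments in the object name show up as "folders" in the Storage console.
import * as admin from 'firebase-admin';

// Placeholder names; nothing needs to be created up front, the "folders"
// appear implicitly from the prefix of the object name.
async function saveToFolder(buffer: Buffer): Promise<void> {
  const file = admin.storage().bucket().file('folder1/folder2/test.jpg');
  await file.save(buffer, { metadata: { contentType: 'image/jpeg' } });
}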

How to use a hbs template on Firebase Storage with nodemailer-express-handlebars in a database onCreate trigger function?

I am trying to use a Handlebars template file uploaded to my Firebase project's storage (appId.appspot.com/templates/testTemplate.hbs) with nodemailer to send an email when an onCreate function is triggered on a Realtime Database node.
I can send emails successfully using an HTML-formatted string, but I really need to use a template to add dynamic data to the email.
Here is my function:
import * as functions from "firebase-functions";
const admin = require("firebase-admin");
const hbs = require("nodemailer-express-handlebars");
const nodemailer = require("nodemailer");

const smtpConfig = {
  host: "mailHost",
  port: 465,
  secure: true,
  auth: {
    user: "xxxxxxxx",
    pass: "xxxxxxxx"
  }
};
const transporter = nodemailer.createTransport(smtpConfig);

exports.sendEmail = functions.database
  .ref("/databasePath/{pushId}")
  .onCreate(async (snapshot, context) => {
    const userData = snapshot.val();
    admin.initializeApp({
      storageBucket: "appId.appspot.com"
    });
    const bucket = admin.storage().bucket();
    const templatesFolder = bucket.name + "/templates/"; // path to storage folder with templates
    transporter.use(
      "compile",
      hbs({
        viewPath: templatesFolder,
        extName: ".hbs"
      })
    );
    const uniqueCode = "generated by a function";
    const uniqueLink = "https://appId.firebaseapp.com/?id=" + uniqueCode;
    const message = {
      from: "fromEmail",
      to: "toEmail",
      subject: "Subject",
      template: "testTemplate", // name of the template file
      context: {
        user: "User name",
        link: uniqueLink
      }
    };
    try {
      await transporter.sendMail(message);
      console.log("Email sent to:", "toEmail");
    } catch (error) {
      console.error("Error sending email:", error);
    }
    return null;
  });
When the function is triggered I get the following error in the logs:
There was an error while sending the email: { Error: ENOENT: no such file or directory, open '/user_code/appId.appspot.com/templates/testTemplate.hbs'
at Error (native)
errno: -2,
code: 'ENOENT',
syscall: 'open',
path: '/user_code/appId.appspot.com/templates/testTemplate.hbs' }
The resolved path has '/user_code' prepended to bucket.name, so hbs can't find the template. How can I get the right path to the templates folder?
It doesn't look like you've actually written any code that downloads a file from Cloud Storage. You can't just build a path to a file in Cloud Storage, pass it off to some other component, and hope it knows what to do with the path. All you've done is pass it the name of a local file that doesn't exist. You're going to have to actually download the file to a temp folder in order to make use of it locally.
Or better yet, just skip Cloud Storage and deploy the template along with your functions. You can just read the file directly off disk at no additional cost. (Each Cloud Storage download costs money.)
Here's the updated function:
import * as functions from "firebase-functions";
const admin = require("firebase-admin");
const hbs = require("nodemailer-express-handlebars");
const nodemailer = require("nodemailer");

const smtpConfig = {
  host: "mailHost",
  port: 465,
  secure: true,
  auth: {
    user: "xxxxxxxx",
    pass: "xxxxxxxx"
  }
};
const transporter = nodemailer.createTransport(smtpConfig);

exports.sendEmail = functions.database
  .ref("/databasePath/{pushId}")
  .onCreate(async (snapshot, context) => {
    const userData = snapshot.val();
    const templatesFolder = __dirname + "/templates"; // <-- templates deployed alongside the function
    transporter.use(
      "compile",
      hbs({
        viewPath: templatesFolder,
        extName: ".handlebars"
      })
    );
    const uniqueCode = "generated by a function";
    const uniqueLink = "https://appId.firebaseapp.com/?id=" + uniqueCode;
    const message = {
      from: "fromEmail",
      to: userData.email, // from the snapshot
      subject: "Subject",
      template: "testTemplate", // name of the template file
      context: {
        user: userData.name, // from the snapshot
        link: uniqueLink
      }
    };
    try {
      await transporter.sendMail(message);
      console.log("Email sent to:", userData.email);
    } catch (error) {
      console.error("Error sending email:", error);
    }
    return null;
  });
Add the template files to "functions/lib/templates/testTemplate.handlebars"
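Alternatively, if the template really does have to live in Cloud Storage, it must be downloaded to the local filesystem before nodemailer-express-handlebars can use it. A minimal sketch, assuming the object is stored at templates/testTemplate.handlebars in the default bucket:
import * as os from "os";
import * as path from "path";
import * as fs from "fs";
const admin = require("firebase-admin");

// Download the template into the writable temp directory and return that
// directory, which can then be passed to hbs() as viewPath.
async function fetchTemplatesDir(): Promise<string> {
  const tmpDir = path.join(os.tmpdir(), "templates");
  await fs.promises.mkdir(tmpDir, { recursive: true });
  await admin
    .storage()
    .bucket()
    .file("templates/testTemplate.handlebars") // assumed object path
    .download({ destination: path.join(tmpDir, "testTemplate.handlebars") });
  return tmpDir;
}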
