How to create a folder in Firebase Storage using Admin API - firebase

Aim: to upload a file into a folder within Firebase Storage
E.g.
default_bucket/folder1/file1
default_bucket/folder1/file2
default_bucket/folder2/file3
Using Firebase client-side I am able to upload a file to a folder within Firebase Storage like this:
const storageRef = firebase.storage().ref();
const fileRef = storageRef.child(`${folder}/${filename}`);
const metadata = {
  contentType: file.type,
  customMetadata: {}
};
return fileRef.put(file, metadata);
If the folder does not exist, it gets created.
However, I have not managed to do the same server-side using the Admin SDK.
The code below uploads the file into the default bucket, but I want to upload the file into a named folder within the default bucket.
The client side makes a POST request to the GCF, sending the file and a folder name.
Busboy is used to extract the folder name and file and pass them to the upload function, which uploads the file and then returns a download link for it.
index.js
const task = require('./tasks/upload-file-to-storage');
app.post('/upload', (req, res, next) => {
  try {
    let uploadedFilename;
    let folder;
    if (req.method === 'OPTIONS') {
      optionsHelper.doOptions(res);
    } else if (req.method === 'POST') {
      res.set('Access-Control-Allow-Origin', '*');
      const busboy = new Busboy({ headers: req.headers });
      const uploads = [];
      busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
        uploadedFilename = `${folder}^${filename}`;
        const filepath = path.join(os.tmpdir(), uploadedFilename);
        uploads.push({ file: filepath, filename: filename, folder: folder });
        file.pipe(fs.createWriteStream(filepath));
      });
      busboy.on('field', (fieldname, val) => {
        if (fieldname === 'folder') {
          folder = val;
        }
      });
      busboy.on('finish', () => {
        if (uploads.length === 0) {
          res.end('no files found');
        }
        for (let i = 0; i < uploads.length; i++) {
          const upload = uploads[i];
          const file = upload.file;
          task.uploadFile(helpers.fbAdmin, upload.folder, upload.file, uploadedFilename).then(downloadLink => {
            res.write(`${downloadLink}\n`);
            fs.unlinkSync(file);
            res.end();
          });
        }
      });
      busboy.end(req.rawBody);
    } else {
      // Client error - only support POST
      res.status(405).end();
    }
  } catch (e) {
    console.error(e);
    res.sendStatus(500);
  }
});
const api = functions.https.onRequest(app);
module.exports = {
  api
};
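(Note, as an aside: the finish handler above calls res.end() inside the loop, so the response closes as soon as the first upload resolves and only one download link is written when several files are posted. A minimal sketch of one way to respond once after all uploads complete, reusing the same task.uploadFile helper:)

busboy.on('finish', () => {
  if (uploads.length === 0) {
    return res.end('no files found');
  }
  // Wait for every upload before ending the response.
  Promise.all(
    uploads.map(u =>
      task.uploadFile(helpers.fbAdmin, u.folder, u.file, uploadedFilename).then(downloadLink => {
        fs.unlinkSync(u.file);
        return downloadLink;
      })
    )
  )
    .then(links => res.end(links.join('\n')))
    .catch(next);
});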
upload-file-to-storage.js
exports.uploadFile = (fbAdmin, folder, filepath, filename) => {
  // get the bucket to upload to
  const bucket = fbAdmin.storage().bucket(); //`venture-spec-sheet.appspot.com/${folder}`
  const token = uuid(); // note: `const uuid = uuid();` would shadow the imported function and throw
  // Uploads a local file to the bucket
  return bucket
    .upload(filepath, {
      gzip: true,
      metadata: {
        //destination: `/${folder}/${filename}`,
        cacheControl: 'public, max-age=31536000',
        firebaseStorageDownloadTokens: token
      }
    })
    .then(() => {
      const d = new Date();
      const expires = d.setFullYear(d.getFullYear() + 50);
      // get file from the bucket
      const myFile = fbAdmin
        .storage()
        .bucket()
        .file(filename);
      // generate a download link and return it
      return myFile.getSignedUrl({ action: 'read', expires: expires }).then(urls => {
        const signedUrl = urls[0];
        return signedUrl;
      });
    });
};
I've tried a few things:
Setting the bucket name to default and a folder. This resulted in a server error.
const bucket = fbAdmin.storage().bucket(`${defaultName}/${folder}`);
Setting the bucket name to the folder. This resulted in a server error.
const bucket = fbAdmin.storage().bucket(folder);
I've also tried using the destination property of uploadOptions, but this still puts the file in the default bucket rather than in the folder.
.upload(filepath, {
  gzip: true,
  metadata: {
    destination: `${folder}/${filename}`, // and /${folder}/${filename}
  }
})
Is it possible to upload to a folder using the Admin SDK?
E.g. I want to upload a file so that it is placed in a named "folder", i.e. so I can reference the file at the path bucket/folder/file.jpg.
In my case, each "folder" is named with a Firebase key.

Found the problem.
I stupidly declared the destination option in the wrong place.
Instead of in the metadata object:
return bucket
  .upload(filepath, {
    gzip: true,
    metadata: {
      destination: `${folder}/${filename}`,
      cacheControl: 'public, max-age=31536000',
      firebaseStorageDownloadTokens: token
    }
  })
It should have been on the options object:
return bucket
  .upload(filepath, {
    gzip: true,
    destination: `${folder}/${filename}`,
    metadata: {
      cacheControl: 'public, max-age=31536000',
      firebaseStorageDownloadTokens: token
    }
  })
With this change made, the file now gets uploaded into a named "folder".
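For reference, here is a minimal end-to-end sketch of the corrected helper. It assumes the uuid package for the token, and it nests firebaseStorageDownloadTokens under custom metadata (metadata.metadata), which is where Cloud Storage keeps arbitrary key/value pairs:

const { v4: uuid } = require('uuid');

exports.uploadFile = (fbAdmin, folder, filepath, filename) => {
  const bucket = fbAdmin.storage().bucket();
  const token = uuid();
  return bucket
    .upload(filepath, {
      gzip: true,
      destination: `${folder}/${filename}`, // the "folder" is just a prefix in the object name
      metadata: {
        cacheControl: 'public, max-age=31536000',
        metadata: { firebaseStorageDownloadTokens: token }
      }
    })
    .then(([file]) => file.getSignedUrl({ action: 'read', expires: '01-01-2070' }))
    .then(urls => urls[0]);
};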

There is a "Create folder" option beside the "Upload File" button for a bucket in the Storage console, so you can create folders in a bucket and upload files to them from the console. To create such "folders" in a bucket using the Admin API, put the folder path in front of the file reference; in Cloud Storage a folder is not a real object, just a prefix of the object name. E.g.
const blob = bucket.file('folder1/folder2/' + req.file.originalname);
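For example, a minimal sketch of writing to a nested path, assuming an initialized bucket and a multer-style req.file (memory storage) as in the related question below:

const blob = bucket.file(`folder1/folder2/${req.file.originalname}`);
// save() accepts a Buffer; the "folders" come into existence implicitly,
// because they are just prefixes of the object name.
blob
  .save(req.file.buffer, { resumable: false, metadata: { contentType: req.file.mimetype } })
  .then(() => console.log(`Wrote gs://${bucket.name}/${blob.name}`));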

Related

How to upload a file from UE4 to a JS server using multer

I'm currently making a project that requires me to send a PNG image from Unreal Engine to a Next.js server, which uses multer to pass the file on to another server.
When sending my file as a binary, the JS (intermediate) server does not receive a file from Unreal.
I've tried the two following methods:
TArray<uint8> rawFileData;
FFileHelper::LoadFileToArray(rawFileData, *media);
Request->SetURL(API_HP_URL + "nude_upload");
Request->SetHeader(TEXT("Content-Type"), TEXT("multipart/form-data; boundary=----WebKitFormBoundarywpp9S2IUDici8hpI"));
Request->SetHeader(TEXT("Connection"), TEXT("keep-alive"));
Request->SetHeader(TEXT("accept"), TEXT("application/json, text/plain, */*"));
Request->SetContent(rawFileData);
Request->SetVerb("POST");
Request->OnProcessRequestComplete().BindUObject(this, &AHttpCommunicator::OnPostNudeSSResponse);
Request->ProcessRequest();
and
FString JsonString;
TArray<uint8> rawFileData;
TSharedRef<TJsonWriter<TCHAR>> JsonWriter = JsonWriterFactory<TCHAR>::Create(&JsonString);
JsonWriter->WriteObjectStart();
JsonWriter->WriteValue("fileName", pPathToFile);
JsonWriter->WriteValue("file", FBase64::Encode(rawFileData));
JsonWriter->WriteObjectEnd();
JsonWriter->Close();
Request->SetURL(API_HP_URL + "nude_upload");
Request->SetHeader(TEXT("Content-Type"), TEXT("multipart/form-data; boundary=----WebKitFormBoundarywpp9S2IUDici8hpI"));
Request->SetHeader(TEXT("Connection"), TEXT("keep-alive"));
Request->SetHeader(TEXT("accept"), TEXT("application/json, text/plain, */*"));
Request->SetContentAsString(JsonString);
Request->SetVerb("POST");
Request->OnProcessRequestComplete().BindUObject(this, &AHttpCommunicator::OnPostNudeSSResponse);
Request->ProcessRequest();
Both of these methods result in the server receiving an undefined file object.
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import path from 'path';
import MulterGoogleCloudStorage from "multer-google-storage";
import nextConnect from 'next-connect';
const Multer = require('multer');
const { Storage } = require('@google-cloud/storage');
const CLOUD_BUCKET = 'nude_locks';
const PROJECT_ID = 'hp-production-338902';
const KEY_FILE = path.resolve('./hp-production-key.json');
const storage = new Storage({
  projectId: PROJECT_ID,
  keyFilename: KEY_FILE
});
const bucket = storage.bucket(CLOUD_BUCKET);
const upload = Multer({
  storage: Multer.memoryStorage(),
  limits: {
    fileSize: 5 * 1024 * 1024,
  }
}).single('file');
const apiRoute = nextConnect({
  onNoMatch(req, res) {
    res.status(405).json({ error: `Method '${req.method}' Not Allowed` });
  },
});
apiRoute.use(upload);
apiRoute.post((req, res, next) => {
  console.log(req.file);
  if (!req.file) {
    res.status(400).send("No file uploaded.");
    return;
  }
  const blob = bucket.file(req.file.originalname);
  // Make sure to set the contentType metadata for the browser to be able
  // to render the image instead of downloading the file (default behavior)
  const blobStream = blob.createWriteStream({
    metadata: {
      contentType: req.file.mimetype
    }
  });
  blobStream.on("error", err => {
    next(err);
    return;
  });
  blobStream.on("finish", () => {
    console.log('finish');
    console.log(blob);
    // The public URL can be used to directly access the file via HTTP.
    const publicUrl = `https://storage.googleapis.com/${bucket.name}/${blob.name}`;
    // Make the image public to the web (since we'll be displaying it in browser)
    blob.makePublic().then(() => {
      res.status(200).send(`Success!\n Image uploaded to ${publicUrl}`);
    });
  });
  blobStream.end(req.file.buffer);
});
export default apiRoute;
export const config = {
  api: {
    bodyParser: false,
  },
}
const fileSelectedHandler = e => {
  const file = new File("D:/_Spectre/VHS/P210107_VHS_Configurator/P04_Unreal/Human_Configurator/Saved/Screenshots/Windows/-1-nude-2022-2-13.png");
  console.log(file);
  const formData = new FormData();
  formData.append('file', file);
  axios.post('/api/nude_upload', formData, {
    headers: {
      'Content-Type': 'multipart/form-data',
    }
  })
    .then(res => {
      console.log(res);
    });
}
Is there a way to create a file object from UE4?
Alternatively, is there a way to retrieve Google Cloud Storage access tokens from UE4?
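A likely issue in both attempts above is that the request body is never actually framed as multipart/form-data: the header advertises a boundary, but the content is raw bytes (or JSON), so multer finds no part named 'file'. The framing the server expects looks like the following, shown here as a Node sketch for reference (the filename is a placeholder); in UE4 the same three segments would be concatenated into the TArray<uint8> passed to Request->SetContent():

const fs = require('fs');
const rawFileData = fs.readFileSync('screenshot.png'); // placeholder for the PNG bytes
// Must match the boundary declared in the Content-Type header.
const boundary = '----WebKitFormBoundarywpp9S2IUDici8hpI';
const head = Buffer.from(
  `--${boundary}\r\n` +
  `Content-Disposition: form-data; name="file"; filename="screenshot.png"\r\n` +
  `Content-Type: image/png\r\n\r\n`
);
const tail = Buffer.from(`\r\n--${boundary}--\r\n`);
// The field name "file" must match the multer .single('file') configuration.
const body = Buffer.concat([head, rawFileData, tail]);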

What might cause the firebase console to show the 'waiting' icon (spinning blue loop icon) after adding a file via cloud functions to cloud storage?

So, I can now use a Google Cloud Function to store a file in Cloud Storage, and after uploading a file I can see its name in the Firebase console, i.e. within the Storage area as you drill down through the various folders.
I can also query and retrieve+view the file, so it seems like it's stored ok.
However, I don't understand why I cannot click on the file in the firebase console and see a preview of it.
My only wild guess is that the cloud function to upload the file didn't release a file handle or something, and so firebase console thinks it's locked.
The file attributes shown in firebase console seem correct.
The cloud function code to upload the file is as follows:
const imageBuffer = Buffer.from(arr[1], 'base64');
const bufferStream = new stream.PassThrough();
bufferStream.end(imageBuffer);
const fullPath = `${filePath}/${data.filename}`;
console.log('saving to fullPath', fullPath);
const myFile = bucket.file(fullPath);
bufferStream.pipe(myFile.createWriteStream({
  metadata: {
    contentType: mime
  },
  public: true,
  validation: "md5"
}))
  .on('error', function (err) {
    console.log('error from image upload', err);
  })
  .on('finish', function () {
    console.log('!!!!!! finished');
  });
where arr[1] is the base64 portion of a data-URI string, i.e. with the mime-type prefix removed, so arr[1] is the pure file data as base64.
So basically everything seems to be working perfectly, except that the Firebase console can't preview the file unless I do an app redeploy (npm run build && firebase deploy && npm start); then it seems (?) to free up the lock and I can view the image preview in the Firebase console for Firebase Storage.
Sending null at the end of the data, as per the comments, fixes this.
Revisiting this: rather than using the PassThrough stream, which adds unnecessary overhead, you can write your buffer to the write stream directly.
const imageBuffer = Buffer.from(arr[1], 'base64');
const fullPath = `${filePath}/${data.filename}`;
console.log('saving to fullPath', fullPath);
const myFile = bucket.file(fullPath);
const fileWriteStream = myFile
  .createWriteStream({
    metadata: {
      contentType: mime
    },
    public: true,
    validation: "md5"
  })
  .on('error', function (err) {
    console.log('error from image upload', err);
  })
  .on('finish', function () {
    console.log('!!!!!! finished');
  });
fileWriteStream.end(imageBuffer);
Rewritten as a promise-based utility function:
import { File } from "@google-cloud/storage";
import { storage } from "firebase-admin";

async function uploadDataURI(dataURI: string, options: { bucket?: string, filename: string, filepath: string, public?: boolean, contentType?: string }): Promise<File> {
  const { bucket, filename, filepath, public: isPublic, contentType } = options; // "public" is reserved in strict mode, so rename it on destructure
  const commaIndex = dataURI.indexOf(",");
  const mediaType = dataURI.slice(5, commaIndex); // strip the leading "data:"
  const data = dataURI.slice(commaIndex + 1);
  let contentTypeFromURI: string, buffer: Buffer;
  if (mediaType.endsWith(";base64")) {
    contentTypeFromURI = mediaType.slice(0, -7);
    buffer = Buffer.from(data, "base64");
  } else {
    contentTypeFromURI = mediaType;
    buffer = Buffer.from(decodeURIComponent(data));
  }
  const storageRef = storage()
    .bucket(bucket)
    .file(`${filepath}/${filename}`);
  return new Promise((resolve, reject) => {
    try {
      const fileWriteStream = storageRef
        .createWriteStream({
          metadata: {
            contentType: contentType || contentTypeFromURI || undefined // coerce falsy values to undefined
          },
          public: isPublic,
          validation: "md5"
        })
        .on('error', reject)
        .on('finish', () => resolve(storageRef));
      fileWriteStream.end(buffer);
    } catch (err) {
      reject(err);
    }
  });
}
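A hypothetical usage example (the folder, file name, and truncated data URI are placeholders), called from inside an async function:

const file = await uploadDataURI("data:image/png;base64,iVBORw0KGgo...", {
  filepath: "screenshots",
  filename: "capture.png",
  public: true
});
console.log(`Uploaded to gs://${file.bucket.name}/${file.name}`);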

FormData using BusBoy for firebase works in serve but not in deploy

Situation
I have a firebase function that updates the user image.
Problem
When I run my function locally using firebase serve, I successfully upload the image to Firebase Storage using Postman. However, when I run firebase deploy and try to upload the image using Postman, I get a 500 Internal Server Error. The other functions (not dealing with FormData, just JSON) work perfectly when I deploy them.
I don't understand why it works locally, but not on deploy when I am doing the exact same thing. Not sure if this is something in the config I am missing, or if I am doing something wrong. Any help would be appreciated!
Code
users.js
const { admin, db, firebase } = require('../util/admin');
const config = require('../util/config');
exports.postUserImage = (req, res) => {
  const BusBoy = require('busboy');
  const path = require('path');
  const os = require('os');
  const fs = require('fs');
  let imgFileName;
  let imgToBeUploaded = {};
  const busboy = new BusBoy({ headers: req.headers });
  busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
    // Invalid file type
    if (mimetype !== 'image/jpeg' && mimetype !== 'image/png') {
      return res.status(400).json({ error: 'Invalid file type' });
    }
    // Extract img extension
    const imgDotLength = filename.split('.').length;
    const imgExtension = filename.split('.')[imgDotLength - 1];
    // Create img file name
    imgFileName = `${Math.round(Math.random() * 1000000)}.${imgExtension}`;
    // Create img path
    const filepath = path.join(os.tmpdir(), imgFileName);
    // Create img object to be uploaded
    imgToBeUploaded = { filepath, mimetype };
    // Use file system to create the file
    file.pipe(fs.createWriteStream(filepath));
  });
  busboy.on('finish', () => {
    admin
      .storage()
      .bucket()
      .upload(imgToBeUploaded.filepath, {
        resumable: false,
        metadata: {
          metadata: {
            contentType: imgToBeUploaded.mimetype
          }
        }
      })
      .then(() => {
        // Create img url to add to our user
        const imgUrl = `https://firebasestorage.googleapis.com/v0/b/${config.storageBucket}/o/${imgFileName}?alt=media`;
        // Add img url to user document
        return db.doc(`/users/${req.user.handle}`).update({ imgUrl });
      })
      .then(() => {
        return res.json({ message: 'Image uploaded successfully' });
      })
      .catch((err) => {
        console.error(err);
        return res.status(500).json({ error: err });
      });
  });
  busboy.end(req.rawBody);
};
index.js
const { app, functions } = require('./util/admin');
const FirebaseAuth = require('./util/firebaseAuth');
const {
  postUserImage,
} = require('./handlers/users');
app.post('/user/image', FirebaseAuth, postUserImage);
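A 500 that only appears after firebase deploy usually has its real cause in the deployed function logs (firebase functions:log). A minimal sketch, assuming the same Express-style handler and Busboy import as above, that surfaces parser errors instead of crashing:

exports.postUserImage = (req, res) => {
  try {
    const busboy = new BusBoy({ headers: req.headers });
    // Busboy emits 'error' on malformed multipart data; log it so it
    // shows up in the deployed function logs instead of a bare 500.
    busboy.on('error', (err) => {
      console.error('busboy error', err);
      res.status(500).json({ error: err.message });
    });
    // ...the same 'file' and 'finish' handlers as above...
    busboy.end(req.rawBody); // req.rawBody is populated by Cloud Functions
  } catch (err) {
    console.error(err);
    res.status(500).json({ error: err.message });
  }
};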

Firebase Functions get files from storage

I have to send a file to an API, therefore I have to use fs.readFileSync(). After uploading the picture to Storage, I call my function to execute the API call, but I cannot get the file from Storage. This is a section of the code, which always gets null in the result. I also tried .getFiles() without a parameter and then I got all files, but I don't want to filter them by iteration.
exports.stripe_uploadIDs = functions.https //.region("europe-west1")
  .onCall((data, context) => {
    const authID = context.auth.uid;
    console.log("request is authenticated? :" + authID);
    if (!authID) {
      throw new functions.https.HttpsError("not authorized", "not authorized");
    }
    let accountID;
    let result_fileUpload;
    let tempFile = path.join(os.tmpdir(), "id_front.jpg");
    const options_id_front_jpeg = {
      prefix: "/user/" + authID + "/id_front.jpg"
    };
    const storageRef = admin
      .storage()
      .bucket()
      .getFiles(options_id_front_jpeg)
      .then(results => {
        console.log("JPG" + JSON.stringify(results));
        // need to write this file to tempFile
        return results;
      });
    const paymentRef = storageRef.then(() => {
      return admin
        .database()
        .ref("Payment/" + authID)
        .child("accountID")
        .once("value");
    });
    const setAccountID = paymentRef.then(snap => {
      accountID = snap.val();
      return accountID;
    });
    const fileUpload = setAccountID.then(() => {
      return Stripe.fileUploads.create(
        {
          purpose: "identity_document",
          file: {
            data: tempFile, // Documentation says I should use fs.readFileSync("filepath")
            name: "id_front.jpg",
            type: "application/octet-stream"
          }
        },
        { stripe_account: accountID }
      );
    });
    const fileResult = fileUpload.then(result => {
      result_fileUpload = result;
      console.log(JSON.stringify(result_fileUpload));
      return result_fileUpload;
    });
    return fileResult;
  });
Result is:
JPG[[]]
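(One thing worth checking first: Cloud Storage object names do not start with a slash, so a prefix of "/user/..." matches nothing, which would explain the empty JPG[[]] result. The corrected options would be:)

const options_id_front_jpeg = {
  prefix: "user/" + authID + "/id_front.jpg" // no leading "/" in object names
};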
You need to download your file from the bucket to your function's local context first.
Once your Firebase function starts executing, you can call something like the below.
More or less the below should work; tweak it to your needs and call it within your .onCall context, you get the idea.
import admin from 'firebase-admin';
import * as path from 'path';
import * as os from 'os';
import * as fs from 'fs';

admin.initializeApp();
const { log } = console;

async function tempFile(fileBucket: string, filePath: string) {
  const bucket = admin.storage().bucket(fileBucket);
  const fileName = 'MyFile.ext';
  const tempFilePath = path.join(os.tmpdir(), fileName);
  const metadata = {
    contentType: 'DONT_FORGET_CONTENT_TYPE'
  };
  // Download the file to a local temp file
  // Do whatever you need with it
  await bucket.file(filePath).download({ destination: tempFilePath });
  log('File downloaded to', tempFilePath);
  // After you are done, and if you need to upload the modified file back
  // to your bucket, then upload it. This is optional.
  await bucket.upload(tempFilePath, {
    destination: filePath,
    metadata: metadata
  });
  // Free up disk space by releasing the file; otherwise you might be
  // charged extra for the memory the function instance keeps using.
  return fs.unlinkSync(tempFilePath);
}
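A hypothetical call from inside an onCall handler (the bucket and object path are placeholders):

import * as functions from 'firebase-functions';

export const processFile = functions.https.onCall(async (data, context) => {
  // Downloads the user's image to /tmp, then re-uploads it.
  await tempFile('my-project.appspot.com', `user/${context.auth.uid}/id_front.jpg`);
  return { done: true };
});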

Firebase cloud functions bucket upload no such file or directory

I'm writing a cloud function that triggers every time a file is uploaded to my default Firebase Storage bucket and, if it's an image, converts it to JPEG.
This is the official firebase example:
exports.processTripImage = functions.storage.object().onFinalize((object) => {
  const filePath = object.name;
  const baseFileName = path.basename(filePath, path.extname(filePath));
  const fileDir = path.dirname(filePath);
  const JPEGFilePath = path.normalize(path.format({ dir: fileDir, name: baseFileName, ext: JPEG_EXTENSION }));
  const tempLocalFile = path.join(os.tmpdir(), filePath);
  const tempLocalDir = path.dirname(tempLocalFile);
  const tempLocalJPEGFile = path.join(os.tmpdir(), JPEGFilePath);
  // Exit if this is triggered on a file that is not an image.
  if (!object.contentType.startsWith('image/')) {
    console.log('This is not an image.');
    return null;
  }
  // Exit if the image is already a JPEG.
  if (object.contentType.startsWith('image/jpeg')) {
    console.log('Already a JPEG.');
    return null;
  }
  const bucket = gcs.bucket(object.bucket);
  // Create the temp directory where the storage file will be downloaded.
  return mkdirp(tempLocalDir).then(() => {
    // Download file from bucket.
    return bucket.file(filePath).download({ destination: tempLocalFile });
  }).then(() => {
    console.log('The file has been downloaded to', tempLocalFile);
    // Convert the image to JPEG using ImageMagick.
    return spawn('convert', [tempLocalFile, tempLocalJPEGFile]);
  }).then(() => {
    console.log('JPEG image created at', tempLocalJPEGFile);
    // Uploading the JPEG image.
    return bucket.upload(tempLocalJPEGFile, { destination: JPEGFilePath });
  }).then(() => {
    console.log('JPEG image uploaded to Storage at', JPEGFilePath);
    // Once the image has been converted delete the local files to free up disk space.
    fs.unlinkSync(tempLocalJPEGFile);
    fs.unlinkSync(tempLocalFile);
    return;
  });
});
The problem is that it writes and converts the file correctly, but when it tries to upload the result back to the bucket it can't find the file in the tmp directory it just created; my cloud function log shows the "no such file or directory" error on the upload step.
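One common cause worth checking against the code above: if spawn is imported from Node's built-in child_process rather than the promise-returning child-process-promise package, the chain reaches the upload step before convert has finished writing the file. A sketch of how to pin this down:

const fs = require('fs');
// child-process-promise's spawn resolves when the process exits;
// require('child_process').spawn returns immediately instead.
const { spawn } = require('child-process-promise');

function convertAndVerify(tempLocalFile, tempLocalJPEGFile) {
  return spawn('convert', [tempLocalFile, tempLocalJPEGFile]).then(() => {
    // Sanity check: confirm ImageMagick actually produced the output file.
    if (!fs.existsSync(tempLocalJPEGFile)) {
      throw new Error(`convert did not produce ${tempLocalJPEGFile}`);
    }
    return tempLocalJPEGFile;
  });
}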
