Unzip file at Firebase Storage

I am currently learning Firebase.
From an HTML page I can already upload a .zip file to a Firebase Storage bucket.
My question: is it possible to unzip the file on the Firebase Storage side after the upload completes?
I can do this with PHP; I just wonder if the same is possible with Firebase without any server code.

I believe you can achieve this by leveraging Cloud Functions.
You can write a storage trigger for a path; once a zip file is uploaded, it fires the Cloud Function, which unzips the file to wherever you want it. To save space, you can also delete the zip file after unzipping it.
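For example, here is a minimal sketch of such a trigger (assuming the third-party unzipper package; the path handling and the cleanup step are illustrative, and a fuller version appears in a later answer):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const unzipper = require('unzipper'); // assumed third-party dependency

admin.initializeApp();

exports.onZipUpload = functions.storage.object().onFinalize(async (object) => {
  if (!object.name.endsWith('.zip')) return;

  const bucket = admin.storage().bucket(object.bucket);
  const targetDir = object.name.replace('.zip', '');

  // Stream the archive and write each entry back to the bucket
  const entries = bucket.file(object.name).createReadStream()
    .pipe(unzipper.Parse({ forceStream: true }));
  for await (const entry of entries) {
    if (entry.type === 'File') {
      await new Promise((resolve, reject) =>
        entry.pipe(bucket.file(`${targetDir}/${entry.path}`).createWriteStream())
          .on('finish', resolve)
          .on('error', reject));
    } else {
      entry.autodrain(); // skip directories
    }
  }

  // Optional cleanup: delete the original archive to save space
  await bucket.file(object.name).delete();
});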

No, the file you upload from the client gets stored as exactly the same file in your storage bucket. There is not currently a way to automatically change that stored file after it's been uploaded.

There is a way to do it with Firebase Functions. We can modify the code in Aeyrium's answer to this Stack Overflow question to suit our requirements, as follows:
const functions = require('firebase-functions');
const admin = require("firebase-admin");
const path = require('path');
const fs = require('fs');
const os = require('os');
const unzip = require('unzipper');

var serviceAccount = require("./serviceAccountKey.json");

const firebaseConfig = {
  apiKey: "*",
  authDomain: "*",
  databaseURL: "*.firebaseio.com",
  projectId: "*",
  storageBucket: "p*.appspot.com",
  messagingSenderId: "*",
  appId: "*",
  measurementId: "*",
  credential: admin.credential.cert(serviceAccount)
};

admin.initializeApp(firebaseConfig);
const storage = admin.storage();

const runtimeOpts = {
  timeoutSeconds: 540,
  memory: '256MB'
};

exports.unzip = functions.runWith(runtimeOpts).storage.object().onFinalize((object) => {
  return new Promise((resolve, reject) => {
    // console.log("object is:", object)
    if (object.contentType !== 'application/x-zip') {
      reject();
    } else {
      // const bucket = admin.storage.bucket(object.bucket)
      const bucket = admin.storage().bucket();
      const remoteFile = bucket.file(object.name);
      const remoteDir = object.name.replace('.zip', '');

      console.log(`Downloading ${object.name}`);

      remoteFile.createReadStream()
        .on('error', err => {
          console.error(err);
          reject(err);
        })
        .on('response', response => {
          // Server connected and responded with the specified status and headers.
        })
        .on('end', () => {
          // The file is fully downloaded.
          console.log("Finished downloading.");
          resolve();
        })
        .pipe(unzip.Parse())
        .on('entry', entry => {
          // Write each archive entry back to the bucket under the zip's directory
          const file = bucket.file(`${remoteDir}/${entry.path}`);
          entry.pipe(file.createWriteStream())
            .on('error', err => {
              console.log(err);
              reject(err);
            })
            .on('finish', () => {
              console.log(`Finished extracting ${remoteDir}/${entry.path}`);
            });
          // entry.autodrain();
        });
    }
  });
});
Also, there is a short tutorial about that here in TypeScript.

Related

Firebase Storage `uploadBytes`: "TypeError: Cannot read properties of undefined (reading 'byteLength')"

I'm trying to write a Firebase Cloud Function that uploads a file to Firebase Cloud Storage using uploadBytes. I'm following the documentation for web apps. Whatever I do throws this error:
TypeError: Cannot read properties of undefined (reading 'byteLength')
This error message isn't listed on the documentation page for error handling, but I've deduced that it means the file to upload can't be found. I'm getting this error with the emulator and with the cloud.
Let's start with uploadString, which works.
import { initializeApp } from "firebase/app";
import * as functions from "firebase-functions";
import { getStorage, ref, uploadBytes, uploadString, connectStorageEmulator } from "firebase/storage";

const firebaseConfig = {
  apiKey: "12345",
  authDomain: "my-awesome-app.firebaseapp.com",
  databaseURL: "https://my-awesome-app.firebaseio.com",
  projectId: "my-awesome-app",
  storageBucket: "my-awesome-app.appspot.com",
  messagingSenderId: "12345",
  appId: "12345"
};

initializeApp(firebaseConfig);

export const StringMe = functions.firestore.document('StringMe/{userID}').onUpdate((change, context) => {
  const storage = getStorage();
  // const storageRef = ref(storage, 'message.txt'); // location to write to
  // connectStorageEmulator(storage, "localhost", 9199); // comment out to write to the cloud
  const storageRef = ref(storage, 'gs://my-awesome-app.appspot.com/Pictures/message.txt');
  const message = "Hello world!";

  async function uploadMessage() {
    try {
      await uploadString(storageRef, message);
      console.log("Uploaded a string!");
    } catch (error) {
      console.error(error);
    }
  }

  return uploadMessage();
});
This uploads a string to my Cloud Storage and logs Uploaded a string! and then Finished "StringMe" in 417.150521ms. 60 seconds later it throws an error:
functions: Your function timed out after ~60s. To configure this timeout, see
https://firebase.google.com/docs/functions/manage-functions#set_timeout_and_memory_allocation.
⚠ Your function was killed because it raised an unhandled error.
That error seems to be a bug in the Firebase CLI; I ignore it.
Let's try this with the emulator. We'll comment out the storageRef and uncomment the two commented lines.
const storageRef = ref(storage, 'message.txt');
connectStorageEmulator(storage, "localhost", 9199);
That doesn't throw any errors (except the 60 second timeout), doesn't log anything, and nothing is written to Storage. Why doesn't the Storage Emulator work?
Now let's make a file to upload.
// my-module.js
export const file = "Hello world";
Then we'll upload it to Cloud Storage.
import { initializeApp } from "firebase/app";
import * as functions from "firebase-functions";
import { getStorage, ref, uploadBytes, uploadString, connectStorageEmulator } from "firebase/storage";
import { file } from "./my-module.js";

const firebaseConfig = {
  apiKey: "...",
  authDomain: "...",
  databaseURL: "...",
  projectId: "...",
  storageBucket: "...",
  messagingSenderId: "...",
  appId: "..."
};

const app = initializeApp(firebaseConfig);

export const ByteMe = functions.firestore.document('ByteMe/{userID}').onUpdate((change, context) => {
  const storage = getStorage(app);
  // const storageRef = ref(storage, 'hello.txt'); // location to write to
  // connectStorageEmulator(storage, "localhost", 9199); // comment out to write to the cloud
  const storageRef = ref(storage, 'gs://my-awesome-app.appspot.com/Pictures/hello.txt');
  const metadata = {
    contentType: 'text/plain',
  };

  async function uploadFile() {
    try {
      console.log(file);
      await uploadBytes(storageRef, file, metadata);
      console.log('Uploaded a file!');
    } catch (error) {
      console.error(error);
    }
  }

  return uploadFile();
});
This logs Hello world (we know that the file is available and readable within the function) and then throws this error:
TypeError: Cannot read properties of undefined (reading 'byteLength')
Something is undefined. Nothing has changed in the code except that the string became file. The error message must be saying that it can't read the file. byteLength seems to be a red herring, best ignored unless you like rabbit holes. Why can't uploadBytes read the file?
Switching to the emulator throws the same error message.
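For what it's worth, uploadBytes is typed to accept a Blob, Uint8Array, or ArrayBuffer, and a plain JavaScript string has no byteLength property, which is consistent with this error. In theory the string could be encoded first, something like the sketch below, although (as noted later) a Uint8Array still failed in this Node environment:
// Sketch: encode the string into bytes before uploading (same storageRef/metadata as above)
const bytes = new TextEncoder().encode(file); // Uint8Array, which does have a byteLength
await uploadBytes(storageRef, bytes, metadata);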
Let's try getting a file from an API and then uploading it to Storage.
import { initializeApp } from "firebase/app";
import * as functions from "firebase-functions";
import { getStorage, ref, uploadBytes, uploadString, connectStorageEmulator } from "firebase/storage";
import got from 'got';

const firebaseConfig = {
  apiKey: "...",
  authDomain: "...",
  databaseURL: "...",
  projectId: "...",
  storageBucket: "...",
  messagingSenderId: "...",
  appId: "..."
};

initializeApp(firebaseConfig);

export const ByteAPI = functions.firestore.document('ByteAPI/{userID}').onUpdate((change, context) => {
  const storage = getStorage();
  // const storageRef = ref(storage, 'picture.jpg'); // location to write to
  // connectStorageEmulator(storage, "localhost", 9199); // comment out to write to the cloud
  const storageRef = ref(storage, 'gs://my-awesome-app.appspot.com/Pictures/winter.mp3');
  const metadata = {
    contentType: 'audio/mpeg',
  };

  async function uploadFile() {
    try {
      let file = await got('https://audio.oxforddictionaries.com/en/mp3/winter__us_2.mp3');
      await uploadBytes(storageRef, file, metadata);
      console.log('Uploaded a file!');
    } catch (error) {
      console.error(error);
    }
  }

  return uploadFile();
});
You can click on https://audio.oxforddictionaries.com/en/mp3/winter__us_2.mp3 and listen to the audio file.
This throws the same error, with Cloud Storage or the emulator:
TypeError: Cannot read properties of undefined (reading 'byteLength')
I also tried uploading a Uint8Array, same error message. Is uploadBytes broken?
My answer below works but isn't best practice. I'm working on a best-practices tutorial: https://github.com/tdkehoe/Cloud-Functions-for-Firebase-Tutorial. The short answer is: use uploadBytes from the front end, not from Cloud Functions; use Node (the Admin SDK) in Cloud Functions.
--
I was able to download the audio file from the Oxford English Dictionary API and upload it to Cloud Storage by changing file to file['rawBody']:
await uploadBytes(storageRef, file['rawBody'], metadata);
This didn't work for uploading the Hello world text file. The documentation says that uploadBytes will handle "JavaScript File and Blob APIs". This has to do with JSON files and buffers, which I don't yet understand. I'll keep working on this.
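For the Cloud Functions side, a minimal sketch of the Admin SDK route hinted at above (the path and contents are placeholders, not from the original post); bucket.file(...).save() accepts a Buffer directly, so uploadBytes isn't involved at all:
const admin = require('firebase-admin');
admin.initializeApp();

// Write a text file to the project's default bucket from server-side code
async function saveText() {
  const bucket = admin.storage().bucket();
  await bucket.file('Pictures/hello.txt').save(Buffer.from('Hello world'), {
    metadata: { contentType: 'text/plain' },
  });
}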

Firebase storage - Download directory as ".zip"

If you use Firebase Storage, you may have noticed that it's not possible to download a folder (as a zip) directly from the Firebase UI or the gcloud UI. This makes it hard to back up your Firebase Storage bucket, which matters all the more if you use Firestore, since you can export collections into Firebase Storage.
I created a Node.js script based on firebase-admin and jszip that takes two arguments: first, the path to download from Firebase Storage, and second, the path where the zip file should be written.
package.json:
{
  "dependencies": {
    "firebase-admin": "^9.6.0",
    "jszip": "^3.6.0"
  }
}
index.js:
const fs = require("fs");
const JSZip = require('jszip');
const admin = require('firebase-admin');
const serviceAccount = require("./service-account-key.json");

async function main() {
  try {
    admin.initializeApp({
      credential: admin.credential.cert(serviceAccount),
      storageBucket: "stackd-56e96.appspot.com",
    });
    const bucket = admin.storage().bucket();

    const src_storage_path = process.argv[2];
    let dest_storage_path = process.argv[3];
    if (dest_storage_path.includes(":"))
      dest_storage_path = dest_storage_path.replace(":", "_");

    const jszip = new JSZip();
    const files = (await bucket.getFiles({
      prefix: `${src_storage_path}/`
    }))[0];
    const filesContent = await Promise.all(
      files.map(file => file.download())
    );
    filesContent.forEach((content, i) => {
      jszip.file(files[i].name, content[0]);
    });
    const content = await jszip.generateAsync({ type: 'nodebuffer' });
    await fs.promises.writeFile(dest_storage_path, content);
  } catch (error) {
    console.error(error);
  }
}

main();
Command line example:
node index.js 2021-04-16T11:47:46_54052 backup.zip

Saving a buffer in Google Cloud Bucket

I'm trying to find a solution that will let me stream an in-memory created zip to a Google Cloud Storage bucket (I'm using Firebase, but this seems to be beyond it, so I need to handle it through GCB).
I have the file creation part nailed down (code below), and when it runs locally on my machine it saves the zip in the main folder where the server files reside. So far so good.
Now I found this link that shows streamed transfers, but I'm not sure how to connect the two pieces. Should it happen after the zip is created? Instead? Any suggestions are welcome! (A sketch of one possibility follows the code below.)
const express = require('express')
var router = express.Router()
var archiver = require('archiver')
var admin = require("firebase-admin");
var serviceAccount = require("../servicekey.json")

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: "https://myName.firebaseio.com",
  storageBucket: "myName.appspot.com"
})
var bucket = admin.storage().bucket()

const {
  convertCSVtoJSON,
  generateDocuments,
  generateDocx,
  isCorrectTemplateFileType
} = require('./generateServices')

router.post('/', async (req, res) => {
  try {
    if (!isCorrectTemplateFileType(req.files.template))
      return res.status(403).send({
        message: 'Wrong file type. Please provide .docx file.'
      })

    const template = req.files.template.data
    const data = await convertCSVtoJSON(req.files.data1)

    let zip = archiver('zip')
    zip.on('warning', function (err) {
      console.log(err)
    });
    zip.on('error', function (err) {
      res.status(500).send({ error: err.message })
    });
    zip.on('entry', function (ars) {
      // console.log(ars)
    });
    zip.on('end', function () {
      console.log('Archive wrote %d bytes', zip.pointer())
    });
    // res.attachment('archive-name.zip')
    // zip.pipe(output)
    // zip.pipe(res)

    data.forEach((docData, index) => {
      let buff = generateDocx(template, docData, 'title')
      zip.append(buff, { name: `${index}.docx` })
    })
    zip.finalize()
    console.log(zip)

    const file = bucket.file("pliki.zip") // name to change
    file.save(zip, (err) => {
      if (!err) {
        console.log("cool");
      } else {
        console.log("error " + err);
      }
    });
    res.sendStatus(201)
  } catch (error) {
    console.log(error)
    res.send(error)
  }
})

module.exports = router
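A minimal sketch of one way to connect the two (an assumption, not a tested answer): rather than handing the archiver object to file.save(), open a write stream on the bucket file, pipe the zip into it before appending entries, and respond when the upload finishes:
// Stream the archive straight into the bucket instead of file.save(zip)
const file = bucket.file("pliki.zip")
const upload = file.createWriteStream({ metadata: { contentType: 'application/zip' } })

upload.on('finish', () => res.sendStatus(201))
upload.on('error', err => res.status(500).send({ error: err.message }))

zip.pipe(upload) // archiver's readable side feeds the GCS write stream

data.forEach((docData, index) => {
  let buff = generateDocx(template, docData, 'title')
  zip.append(buff, { name: `${index}.docx` })
})
zip.finalize() // 'finish' fires on the upload once the archive has fully streamed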

Uploading to Firebase Storage from a Google Cloud Function

I'm trying to create a Firebase Function that lets me pass in an array of image URLs to generate a montage, upload the file to Firebase Storage, and then return the generated download URL. This will be called from my app, so I'm using functions.https.onCall.
const functions = require("firebase-functions");
const admin = require('firebase-admin');
var gm = require('gm').subClass({imageMagick: true});

admin.initializeApp();

exports.createMontage = functions.https.onCall((data, context) => {
  var storageRef = admin.storage().bucket('gs://xyz-zyx.appspot.com');
  var createdMontage = storageRef.file('createdMontage.jpg');

  function generateMontage(list) {
    let g = gm()
    list.forEach(function (p) {
      g.montage(p);
    })
    g.geometry('+81+81')
    g.density(5000, 5000)
      .write(createdMontage, function (err) {
        if (!err) console.log("Written montage image.");
      });
    return true
  }

  generateMontage(data)
  return createdMontage.getDownloadURL();
});
The function generateMontage() works locally in Node.js (with a local write destination).
Thank you.
Have a look at this example from the docs:
https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-code-sample
2021-01-11 Update
Here's a working example. I'm using regular Cloud Functions, and it's limited in that srcObject, dstObject, and bucketName are constants, but it does create montages, which is your goal.
PROJECT=[[YOUR-PROJECT]]
BILLING=[[YOUR-BILLING]]
REGION=[[YOUR-REGION]]
FUNCTION=[[YOUR-FUNCTION]]
BUCKET=[[YOUR-BUCKET]]
OBJECT=[[YOUR-OBJECT]] # Path from ${BUCKET} root
gcloud projects create ${PROJECT}

gcloud beta billing projects link ${PROJECT} \
  --billing-account=${BILLING}

gcloud services enable cloudfunctions.googleapis.com \
  --project=${PROJECT}

gcloud services enable cloudbuild.googleapis.com \
  --project=${PROJECT}

gcloud functions deploy ${FUNCTION} \
  --memory=4gib \
  --max-instances=1 \
  --allow-unauthenticated \
  --entry-point=montager \
  --set-env-vars=BUCKET=${BUCKET},OBJECT=${OBJECT} \
  --runtime=nodejs12 \
  --trigger-http \
  --project=${PROJECT} \
  --region=${REGION}

ENDPOINT=$(\
  gcloud functions describe ${FUNCTION} \
    --project=${PROJECT} \
    --region=${REGION} \
    --format="value(httpsTrigger.url)")

curl \
  --request GET \
  ${ENDPOINT}
package.json:
{
  "name": "montage",
  "version": "0.0.1",
  "dependencies": {
    "@google-cloud/storage": "5.7.1",
    "gm": "^1.23.1"
  }
}
And index.js:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const gm = require('gm').subClass({ imageMagick: true });

const bucketName = process.env["BUCKET"];
const srcObject = process.env["OBJECT"];
const dstObject = "montage.png";

// Creates 2x2 montage
const list = [
  `/tmp/${srcObject}`,
  `/tmp/${srcObject}`,
  `/tmp/${srcObject}`,
  `/tmp/${srcObject}`
];

const montager = async (req, res) => {
  // Download GCS `srcObject` to `/tmp`
  const f = await storage
    .bucket(bucketName)
    .file(srcObject)
    .download({
      destination: `/tmp/${srcObject}`
    });

  // Creating GCS write stream for montage
  const obj = await storage
    .bucket(bucketName)
    .file(dstObject)
    .createWriteStream();

  let g = gm();
  list.forEach(f => {
    g.montage(f);
  });

  console.log(`Returning`);
  g
    .geometry('+81+81')
    .density(5000, 5000)
    .stream()
    .pipe(obj)
    .on(`finish`, () => {
      console.log(`finish`);
      res.status(200).send(`ok`);
    })
    .on(`error`, (err) => {
      console.log(`error: ${err}`);
      res.status(500).send(`uhoh!`);
    });
};

exports.montager = montager;
I have never used gm but, according to its npm page, it has a toBuffer function.
So maybe something like this could work:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const gm = require('gm').subClass({ imageMagick: true });

admin.initializeApp();

exports.createMontage = functions.https.onCall((data, _context) => {
  const bucketName = 'xyz-zyx'; // not sure, I've always used the default bucket
  const bucket = admin.storage().bucket(bucketName);
  const storagePath = 'createdMontage.jpg';
  const fileRef = bucket.file(storagePath);

  const generateMontage = async (list) => {
    const g = gm();
    list.forEach(function (p) {
      g.montage(p);
    });
    g.geometry('+81+81');
    g.density(5000, 5000);

    return new Promise(resolve => {
      g.toBuffer('JPG', (_err, buffer) => {
        const saveTask = fileRef.save(buffer, { contentType: 'image/jpeg' });
        const baseStorageUrl = `https://firebasestorage.googleapis.com/v0/b/${bucket.name}/o/`;
        const encodedPath = encodeURIComponent(storagePath);
        const postfix = '?alt=media'; // see stackoverflow.com/a/58443247/6002078
        const publicUrl = baseStorageUrl + encodedPath + postfix;
        saveTask.then(() => resolve(publicUrl));
      });
    });
  };

  return generateMontage(data);
});
But it seems it can be done more easily. As Methkal Khalawi commented:
here is a full example of how to use ImageMagick with Functions. They are using it for blurring an image, but the idea is the same. And here is a tutorial from the documentation.
I think you can pipe the output stream from the gm module into a Firebase Storage object write stream.
const functions = require("firebase-functions");
const admin = require('firebase-admin');
var gm = require('gm').subClass({ imageMagick: true });

admin.initializeApp();

exports.createMontage = functions.https.onCall(async (data, context) => {
  var storage = admin.storage().bucket('gs://xyz-zyx.appspot.com');
  var downloadURL = await new Promise((resolve, reject) => {
    let g = gm();
    data.forEach(function (p) {
      g.montage(p);
    });
    g.geometry('+81+81');
    g.density(5000, 5000)
      .stream((err, stdout, stderr) => {
        if (err) {
          return reject(err);
        }
        stdout.pipe(
          storage.file('generatedMontage.png').createWriteStream({
            metadata: {
              contentType: 'image/png',
            },
          })
        ).on('finish', () => {
          storage
            .file('generatedMontage.png')
            .getSignedUrl({
              action: 'read',
              expires: '03-09-2491', // non-expiring public URL
            })
            .then((url) => {
              resolve(url);
            });
        });
      });
  });
  return downloadURL;
});
FYI, the Firebase Admin SDK storage object does not have a getDownloadURL() function.
You should instead generate a non-expiring public signed URL from the storage object.
In addition, that will cause another problem after some period of time, according to this issue.
To keep that issue from happening, you should initialize the Firebase app with a permanent service account:
const admin = require('firebase-admin');
const serviceAccount = require('../your-service-account.json');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  projectId: JSON.parse(process.env.FIREBASE_CONFIG).projectId,
  databaseURL: JSON.parse(process.env.FIREBASE_CONFIG).databaseURL,
  storageBucket: JSON.parse(process.env.FIREBASE_CONFIG).storageBucket,
});

Cloud Storage API doesn't work when deploy on Google Cloud Functions using Firebase

This works perfectly when serving locally with Firebase:
const gCloudConfig = {
  projectId: 'XXXX-X1234',
  keyFilename: './key.json'
}
const Storage = require('@google-cloud/storage')(gCloudConfig);
const storageBucket = Storage.bucket(bucketUrl);

storageBucket.upload(file.path, {destination: file.name})
  .then(() => {
    //
  });
But this doesn't work when I deploy to Firebase:
const Storage = require('@google-cloud/storage')();
const storageBucket = Storage.bucket(bucketUrl);

storageBucket.upload(file.path, {destination: file.name})
  .then(() => {
    //
  });
I put this line after admin.initializeApp(...), since I saw that it fixed the problem for someone, but it still doesn't work.
I've tried a lot of things:
const gCloudConfig = { projectId: 'XXXX-X1234' };
const gCloudConfig = { key: API_KEY };
const gCloudConfig = { key: API_KEY, projectId: 'XXXX-X1234' };
const gCloudConfig = functions.config().firebase;
I'm kinda lost, please help me!
It's easier if you just initialize the Firebase Admin SDK with its default credentials, then access the Cloud Storage APIs via that. There's no need to initialize Storage on its own.
const admin = require('firebase-admin')
admin.initializeApp()

const bucket = admin.storage().bucket()

bucket.upload(localPath, {
  destination: remotePath
})
Here, bucket is your project default storage bucket, just like you would have gotten it from the Cloud Storage API.
Note that the no-argument init of the Admin SDK is available when using firebase-functions@1.0.0 or later (currently 1.0.2).
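If a non-default bucket is needed (as in the question's bucketUrl), its name can be passed explicitly; a one-line sketch, with a placeholder bucket name:
const other = admin.storage().bucket('my-other-bucket.appspot.com')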
