Firebase Function & Storage has stopped working

I wrote two functions (~07/20/21) using the Firebase Admin SDK to upload a file to Firebase Storage. It worked for 300+ uploads. I came back to it yesterday (08/11) and it no longer works, with no code changes that I know of.
The Functions logs say: ReferenceError: bucket is not defined at Object.writeFileToFirebase (/workspace/uploadUtils.js:12:18). The storage bucket is defined in my ./firebaseConfig.js in accordance with the documentation. My setup code:
const firebase = require('firebase')
const functions = require('firebase-functions')
const admin = require('firebase-admin')
const firebaseConfig = require('./firebaseConfig.js')
firebase.initializeApp(firebaseConfig)
admin.initializeApp(firebaseConfig)
const bucket = admin.storage().bucket()
And my two functions are exported from uploadUtils.js:
async function writeFileToFirebase(filename, mimetype, filebuffer) {
  const file = bucket.file(filename)
  const filestream = file.createWriteStream({
    metadata: {
      contentType: mimetype
    }
  })
  await filestream.end(filebuffer).catch(functions.logger.log(err))
  return
}

async function createThumbnail(newthumbname, mimetype, filebuffer) {
  const file = bucket.file(newthumbname)
  const thumbstream = file.createWriteStream({
    metadata: {
      contentType: mimetype
    }
  })
  const gm = require('gm').subClass({
    imageMagick: true
  })
  gm(filebuffer)
    .resize(240, 240)
    .toBuffer('jpg', (err, thumbbuffer) => {
      thumbstream.end(thumbbuffer).catch(console.log(err))
    })
  return
}

exports.writeFileToFirebase = writeFileToFirebase
exports.createThumbnail = createThumbnail
Line 12 in the error message is the const file = bucket.file(filename) call; bucket itself comes from admin.storage().bucket().
My declaration of bucket is at a higher scope than the function call. And I'm using the default bucket I've specified in the firebaseConfig.js.
Can anyone tell me what's wrong with my bucket declaration? Or is my problem elsewhere?

It looks like I needed to move these two lines into the file with the function declarations (uploadUtils.js); they're not inherited from the other module's scope. I have no idea why this used to work before this change.
const admin = require('firebase-admin')
const bucket = admin.storage().bucket()
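For reference, here is a minimal sketch of what the relocated code can look like in uploadUtils.js. It assumes admin.initializeApp() has already run in index.js (requiring firebase-admin again returns the same initialized app), and it uses file.save() instead of a write stream purely for brevity; this is one way to arrange it, not necessarily exactly what the asker did.

// uploadUtils.js (sketch)
const admin = require('firebase-admin')

async function writeFileToFirebase(filename, mimetype, filebuffer) {
  // Resolve the bucket in this module; variables declared in index.js are not visible here.
  const bucket = admin.storage().bucket()
  const file = bucket.file(filename)
  // file.save() uploads a Buffer and returns a promise.
  await file.save(filebuffer, { metadata: { contentType: mimetype } })
}

exports.writeFileToFirebase = writeFileToFirebase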

Related

update firebase upon cloud storage thumbnail completion

I have the following index.js that's triggered when a thumbnail is generated in cloud storage. It seems to work fine.
I'd like to replace the console.log line with code that adds a field like {"thumbnail_done": true} to the Firestore document docid found in the script. I'm not clear on how to do that.
exports.thumbComplete = (event, context) => {
  const fname = event.name;
  let suffix = "_200x200.jpeg"
  if (fname.endsWith(suffix)) {
    let docid = fname.substring(0, fname.length - suffix.length);
    console.log(`thumbnail processing complete: ${docid}`);
  }
};
Thanks!
Got it working with the following:
// The Cloud Functions for Firebase SDK to create Cloud Functions and set up triggers.
const functions = require('firebase-functions');
// The Firebase Admin SDK to access Firestore.
const admin = require('firebase-admin');
admin.initializeApp();

exports.generateThumbnail = functions.storage.object().onFinalize(async (object) => {
  const fname = object.name;
  let suffix = "_200x200.jpeg"
  if (fname.endsWith(suffix)) {
    let docid = fname.substring(0, fname.length - suffix.length);
    await admin.firestore().doc(`photo/${docid}`).update({ 'uploaded': true });
  }
});
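If you want the exact field name from the question and the document might not exist yet, a set() with merge is a safe variant (the photo collection and thumbnail_done field are carried over from the snippets above; this is a sketch, not part of the original answer):

await admin.firestore().doc(`photo/${docid}`)
  .set({ thumbnail_done: true }, { merge: true });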

Read from firebase storage and write to firestore using firebase functions

I tried this TypeScript code 👇
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";
import serviceAccount from "/Users/300041370/Downloads/serviceKey.json";

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
});

const buckObj = functions.storage.bucket("myBucket").object();

export const onWikiWrite = buckObj.onFinalize(async (object) => {
  const filePath = object.name ?? "test.json";
  const bucket = admin.storage().bucket("myBucket");
  bucket.file(filePath).download().then((data) => {
    const contents = data[0];
    data = {"key": "value"};
    const doc = admin.firestore().collection("myCollection").doc();
    doc.set(data);
  });
});
but this gave me the following error:
"status":{"code":7,"message":"Insufficient permissions to (re)configure a trigger (permission denied for bucket myBucket). Please, give owner permissions to the editor role of the bucket and try again.
I had asked this question here but it got closed as a duplicate of this question, which basically said the storage.bucket("myBucket") feature is not supported and that I'll have to instead use match to limit this operation to files in a specific bucket/folder. Hence, I tried this 👇
const buckObj = functions.storage.object();

export const onWikiWrite = buckObj.onFinalize(async (object) => {
  if (object.name.match(/myBucket\//)) {
    const fileBucket = object.bucket;
    const filePath = object.name;
    const bucket = admin.storage().bucket(fileBucket);
    bucket.file(filePath).download().then((data) => {
      const contents = data[0];
      const doc = admin.firestore().collection("myCollection").doc();
      const data = {content: contents}
      doc.set(data);
    });
  }
});
I am still facing the same issue. I'll repeat that here:
"status":{"code":7,"message":"Insufficient permissions to (re)configure a trigger (permission denied for bucket myBucket). Please, give owner permissions to the editor role of the bucket and try again.
Since version 1.0 of the Firebase SDK for Cloud Functions, firebase-admin should be initialized without any parameters within the Cloud Functions runtime.
The following should work (I've removed the check on filePath):
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';

admin.initializeApp();

export const onWikiWrite = functions.storage
  .object()
  .onFinalize(async (object) => {
    const fileBucket = object.bucket;
    const filePath = object.name;
    const bucket = admin.storage().bucket(fileBucket);
    return bucket
      .file(filePath)
      .download()
      .then((data) => {
        const contents = data[0];
        return admin
          .firestore()
          .collection('myCollection')
          .add({ content: contents });
      });
  });
Note that we return the chain of promises returned by the asynchronous Firebase methods. In a Cloud Function that performs asynchronous processing (also known as a "background function"), it is key to return a JavaScript promise that resolves when all the asynchronous processing is complete.
We also use the add() method instead of doing doc().set().
Finally, when checking the value of filePath, be aware that there is actually no concept of folders or subdirectories in Cloud Storage (see this answer); object names are just strings that can contain slashes.
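If you do want to restore the question's restriction to one "folder", a simple prefix test on the object name is enough. A sketch under the same imports as the answer above (the myBucket/ prefix is taken from the question):

export const onWikiWrite = functions.storage.object().onFinalize(async (object) => {
  const filePath = object.name ?? '';
  // "Folders" are just a slash-delimited prefix of the object name.
  if (!filePath.startsWith('myBucket/')) {
    return null; // ignore objects outside the prefix
  }
  const bucket = admin.storage().bucket(object.bucket);
  const data = await bucket.file(filePath).download();
  return admin.firestore().collection('myCollection').add({ content: data[0] });
});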

Uploading files from Firebase Cloud Functions to Cloud Storage

The documentation is too complex for me to understand. It shows how to download a file from Cloud Storage to Cloud Functions, manipulate the file, and then upload the new file to Cloud Storage. I just want to see the basic, minimum instructions for uploading a file from Cloud Functions to Cloud Storage. Why doesn't this work:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.storage = functions.firestore.document('Test_Value').onUpdate((change, context) => {
  var metadata = {
    contentType: 'text',
  };
  admin.storage().ref().put( {'test': 'test'}, metadata)
    .then(function() {
      console.log("Document written.");
    })
    .catch(function(error) {
      console.error(error);
    })
});
The error message is admin.storage(...).ref is not a function. I'm guessing that firebase-admin includes Firestore but not Storage? Instead of firebase-admin should I use @google-cloud/storage? Why doesn't this work:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const {Storage} = require('@google-cloud/storage')();
const storage = new Storage();
admin.initializeApp();

exports.storage = functions.firestore.document('Test_Value').onUpdate((change, context) => {
  storage.bucket().upload( {'test': 'test'} , {
    metadata: {
      contentType: 'text'
    }
  })
});
I can't even deploy this code; the error message is
Error parsing triggers: Cannot find module './clone.js'
Apparently an npm module dependency is missing? But the module isn't called clone.js. I tried requiring child-process-promise, path, os, and fs; none of them fixed the missing clone.js error.
Why does admin.initializeApp(); lack parameters, when in my index.html file I have:
firebase.initializeApp({
  apiKey: 'swordfish',
  authDomain: 'myapp.firebaseapp.com',
  databaseURL: "https://myapp.firebaseio.com",
  projectId: 'myapp',
  storageBucket: "myapp.appspot.com"
});
Another issue I'm seeing:
npm list -g --depth=0
/Users/TDK/.nvm/versions/node/v6.11.2/lib
├── child_process@1.0.2
├── UNMET PEER DEPENDENCY error: ENOENT: no such file or directory, open '/Users/TDK/.nvm/versions/node/v6.11.2/lib/node_modules/firebase-admin/package.json
├── firebase-functions@2.1.0
├── firebase-tools@6.0.1
├── firestore-backup-restore@1.3.1
├── fs@0.0.2
├── npm@6.4.1
├── npm-check@5.9.0
├── protractor@5.4.1
├── request@2.88.0
└── watson-developer-cloud@3.13.0
In other words, there's something wrong with firebase-admin, or with Node 6.11.2. Should I use a Node Version Manager to revert to an older version of Node?
Go to https://console.cloud.google.com/iam-admin/iam
Click the pencil icon next to your App Engine default service account
+ ADD ANOTHER ROLE
Add Cloud Functions Service Agent
In my specific use case, I needed to decode a base64 string into a byte array and then use that to save the image.
var serviceAccount = require("./../serviceAccountKey.json");

import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';

admin.initializeApp({
  projectId: serviceAccount.project_id,
  credential: admin.credential.cert(serviceAccount),
  databaseURL: "https://your_project_id_here.firebaseio.com", //update this
  storageBucket: "your_bucket_name_here.appspot.com" //update this
});

function uploadProfileImage(imageBytes64Str: string): Promise<any> {
  const bucket = admin.storage().bucket()
  const imageBuffer = Buffer.from(imageBytes64Str, 'base64')
  const imageByteArray = new Uint8Array(imageBuffer);
  const file = bucket.file(`images/profile_photo.png`);
  const options = { resumable: false, metadata: { contentType: "image/jpg" } }
  //options may not be necessary
  return file.save(imageByteArray, options)
    .then(stuff => {
      return file.getSignedUrl({
        action: 'read',
        expires: '03-09-2500'
      })
    })
    .then(urls => {
      const url = urls[0];
      console.log(`Image url = ${url}`)
      return url
    })
    .catch(err => {
      console.log(`Unable to upload image ${err}`)
    })
}
Then you can call the method like this and chain the calls.
uploadProfileImage(image_bytes_here)
  .then(url => {
    //Do stuff with the url here
  })
Note: You must initialize admin with a service account and specify the default bucket. If you simply do admin.initializeApp() then your image urls will expire in 10 days.
Steps to properly use a service account.
Go to Service Accounts and generate a private key
Put the JSON file in your functions folder (next to src and node_modules)
Go to Storage and copy the URL not including the "gs://" in the front. Use this for the storage bucket url when initializing admin.
Use your project ID above for the database URL.
See Introduction to the Admin Cloud Storage API for further details on how to use the Cloud Storage service in the Firebase Admin SDK.
var admin = require("firebase-admin");
var serviceAccount = require("path/to/serviceAccountKey.json");

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: "<BUCKET_NAME>.appspot.com"
});

var bucket = admin.storage().bucket();

// 'bucket' is an object defined in the @google-cloud/storage library.
// See https://googlecloudplatform.github.io/google-cloud-node/#/docs/storage/latest/storage/bucket
// for more details.
Regarding uploading objects, see the Cloud Storage documentation's Uploading Objects sample code:
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const filename = 'Local file to upload, e.g. ./local/path/to/file.txt';

// Uploads a local file to the bucket
await storage.bucket(bucketName).upload(filename, {
  // Support for HTTP requests made with `Accept-Encoding: gzip`
  gzip: true,
  metadata: {
    // Enable long-lived HTTP caching headers
    // Use only if the contents of the file will never change
    // (If the contents will change, use cacheControl: 'no-cache')
    cacheControl: 'public, max-age=31536000',
  },
});

console.log(`${filename} uploaded to ${bucketName}.`);
I uploaded a file from my hard drive to Firebase Cloud Storage via Google Cloud Functions. First, I found the documentation for Google Cloud Functions bucket.upload.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  const bucket = storage.bucket('myapp.appspot.com');
  const options = {
    destination: 'Test_Folder/hello_world.dog'
  };
  bucket.upload('hello_world.ogg', options).then(function(data) {
    const file = data[0];
  });
  return 0;
});
The first three lines are Cloud Functions boilerplate. The next line
exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
creates the Cloud Function and sets the trigger. The next three lines are more Google Cloud boilerplate.
The rest of the code locates the file hello_world.ogg in the functions folder of my project directory (the folder that gets deployed with the function), uploads it to the directory Test_Folder in my Firebase Cloud Storage, and renames the file to hello_world.dog. bucket.upload() returns a promise, and the next line, const file = data[0];, is unnecessary unless you want to do something else with the file.
Lastly, we return 0;. This line does nothing except prevent the error message
Function returned undefined, expected Promise or Value
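A cleaner variant, in line with the promise advice in the earlier answer, is to return the upload promise itself instead of 0, so Cloud Functions waits for the upload to finish. This is a sketch using the same names as the code above, not the answerer's original:

exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  const bucket = storage.bucket('myapp.appspot.com');
  // Returning the promise lets Cloud Functions wait for the upload before terminating the instance.
  return bucket.upload('hello_world.ogg', { destination: 'Test_Folder/hello_world.dog' });
});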
If you are handling an HTTP upload with Busboy, use req.rawBody when it is available (Cloud Functions buffers the request body for you) and fall back to piping the request:
if (req.rawBody) {
  busboy.end(req.rawBody);
} else {
  req.pipe(busboy);
}
As described in this issue: https://github.com/GoogleCloudPlatform/cloud-functions-emulator/issues/161#issuecomment-376563784

Firebase storage upload string with Cloud function

I would like to upload a string of text to Cloud Storage. I've built it in plain JS, but I'm having issues hacking it into a Cloud Function.
function download(exportObj) {
  var databuk = gcs.bucket('******.appspot.com');
  // var bucket = admin.storage().bucket();
  //var tocfileloc = storageRef.child('toctest.json');
  // const name = "toctest.json";
  // const bucketdes = bucket.name;
  var dataStr = "data:text/json;charset=utf-8," + encodeURIComponent(JSON.stringify(exportObj));
  databuk.putString(dataStr, 'data_url').then(snapshot => {
    console.log('Uploaded a data_url string!');
    return true;
  }).catch(err => {
    console.log("error", err);
  })
}
I have some code above! The string is "exportObj"
You'll want to use the Admin SDK for this. It'll be something along the lines of:
const admin = require('firebase-admin');
admin.initializeApp();

// ... then later, in your function
const file = admin.storage().bucket().file('path/to/your/file.txt');
return file.save('This will get stored in my storage bucket.', {
  gzip: true,
  contentType: 'text/plain'
}).then(() => {
  console.log('all done!');
});
The specific "save" method is documented here.

How to store strings into firebase storage via cloud function

I did Google around and tried some code, but it didn't work. Since every time I deploy cloud functions to Firebase it takes about 30 seconds to 1 minute, I think it's a complete waste of time to keep trying code from the internet.
So, I need to write a cloud function like this:
const admin = require('firebase-admin');

module.exports = function (request, response) {
  const { message } = request.body;
  // Now, store `message` into firebase storage
  // path is: /messages/new_message, where `new_message`
  // is NOT a folder, but the file that contains `message`
}
I do have a solution, but obviously it's not a wise choice: I could install the firebase client package, call initializeApp(...), and then use firebase.storage().ref().... Is there another way to do this? Could you please write a little code to elaborate?
You'll want to use the @google-cloud/storage module.
// Creates a GCS client
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

module.exports = function (req, res) {
  const { message } = req.body;
  const bucket = storage.bucket('projectid.appspot.com');
  const file = bucket.file('myFolder/myFilename');
  // gcloud supports upload(file), not upload(bytes), so we need to stream.
  const uploadStream = file.createWriteStream();
  uploadStream
    .on('error', (err) => {
      res.send(err);
    })
    .on('finish', () => {
      res.send('ok');
    });
  uploadStream.write(message);
  uploadStream.end();
}
See my parse-server GCS adapter for an example.
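A simpler alternative under the same assumptions, using file.save() as shown in the previous answer instead of a manual write stream (the bucket and path names are the same placeholders):

const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

module.exports = function (req, res) {
  const { message } = req.body;
  const file = storage.bucket('projectid.appspot.com').file('myFolder/myFilename');
  // save() accepts a string or Buffer and returns a promise.
  file.save(message, { contentType: 'text/plain' })
    .then(() => res.send('ok'))
    .catch((err) => res.status(500).send(err.toString()));
}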
