Firebase Emulator Cloud Functions getFiles() retrieves no files - firebase

From cloud functions I want to list all the files in the storage bucket.
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();

const { Storage } = require("@google-cloud/storage");
const gcs = new Storage();

exports.listFiles = functions.https.onRequest(async (req, res) => {
  const [files] = await gcs.bucket("default-bucket").getFiles();
  const filesArray = [];
  files.forEach((file) => {
    filesArray.push(file.name);
  });
  res.status(200).send({ filesArrayName: filesArray });
});
This works fine when the function runs remotely, but with the Firebase Emulator I can list the buckets and upload a file, yet getFiles() always returns an empty array.
Is it an issue with credentials? My GOOGLE_APPLICATION_CREDENTIALS variable is set on my local machine, and I added the service account JSON file when calling admin.initializeApp(), but the result is still the same. It's really curious that I can list the buckets but not the files.
Here are my local storage rules:
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write;
    }
  }
}
On my local machine: (screenshot of the empty result)
On the cloud: (screenshot of the populated file list)
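One thing worth checking (an assumption about the client setup, not a confirmed fix): a bare Storage() client from @google-cloud/storage does not automatically target the Storage emulator, so it may be querying a different backend than the one the file was uploaded to. The client honors the STORAGE_EMULATOR_HOST environment variable; a minimal sketch, assuming the emulator's default port 9199:
// Sketch: point @google-cloud/storage at the local Storage emulator.
// Adjust host/port to whatever firebase.json configures (9199 is the default).
process.env.STORAGE_EMULATOR_HOST = "http://127.0.0.1:9199";

const { Storage } = require("@google-cloud/storage");
const gcs = new Storage();

async function listEmulatedFiles() {
  // With the env var set, this queries the emulated bucket.
  const [files] = await gcs.bucket("default-bucket").getFiles();
  return files.map((file) => file.name);
}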

Related

Read from firebase storage and write to firestore using firebase functions

I tried this TypeScript code 👇
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";
import serviceAccount from "/Users/300041370/Downloads/serviceKey.json";

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
});

const buckObj = functions.storage.bucket("myBucket").object();

export const onWikiWrite = buckObj.onFinalize(async (object) => {
  const filePath = object.name ?? "test.json";
  const bucket = admin.storage().bucket("myBucket");
  bucket.file(filePath).download().then((data) => {
    const contents = data[0];
    const docData = { key: "value" };
    const doc = admin.firestore().collection("myCollection").doc();
    doc.set(docData);
  });
});
but this gave me following error
"status":{"code":7,"message":"Insufficient permissions to (re)configure a trigger (permission denied for bucket myBucket). Please, give owner permissions to the editor role of the bucket and try again.
I had asked this question here, but it got closed as a duplicate of this question. It basically said that the storage.bucket("myBucket") feature is not supported, and that I'll have to use match instead to limit this operation to files in a specific bucket/folder. Hence, I tried this 👇
const buckObj = functions.storage.object();

export const onWikiWrite = buckObj.onFinalize(async (object) => {
  if (object.name.match(/myBucket\//)) {
    const fileBucket = object.bucket;
    const filePath = object.name;
    const bucket = admin.storage().bucket(fileBucket);
    bucket.file(filePath).download().then((data) => {
      const contents = data[0];
      const doc = admin.firestore().collection("myCollection").doc();
      doc.set({ content: contents });
    });
  }
});
I am still facing the same issue. I'll repeat that here:
"status":{"code":7,"message":"Insufficient permissions to (re)configure a trigger (permission denied for bucket myBucket). Please, give owner permissions to the editor role of the bucket and try again.
Since version 1.0 of the Firebase SDK for Cloud Functions, firebase-admin can be initialized without any parameters within the Cloud Functions runtime.
The following should work (I've removed the check on filePath):
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
admin.initializeApp();

export const onWikiWrite = functions.storage
  .object()
  .onFinalize(async (object) => {
    const fileBucket = object.bucket;
    const filePath = object.name;
    const bucket = admin.storage().bucket(fileBucket);
    return bucket
      .file(filePath)
      .download()
      .then((data) => {
        const contents = data[0];
        return admin
          .firestore()
          .collection('myCollection')
          .add({ content: contents });
      });
  });
Note that we return the chain of promises returned by the asynchronous Firebase methods. In a Cloud Function that performs asynchronous processing (also known as a "background function"), it is key to return a JavaScript promise that resolves when all the asynchronous processing is complete.
We also use the add() method instead of doing doc().set().
Finally, when checking the value of the filePath, be aware of the fact that there is actually no concept of folder or subdirectory in Cloud Storage (See this answer).
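For readers who prefer async/await, here is an equivalent sketch of the same function (same logic, only the syntax differs):
export const onWikiWrite = functions.storage
  .object()
  .onFinalize(async (object) => {
    const bucket = admin.storage().bucket(object.bucket);
    // download() resolves to a one-element array containing a Buffer
    const [contents] = await bucket.file(object.name).download();
    // The returned promise tells Cloud Functions when the work is done
    return admin.firestore().collection('myCollection').add({ content: contents });
  });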

Firestore denies add operation in web worker with completely open security rules (web-app)

So, I have set up the following web worker to add data to a Firestore collection.
The first part of the web worker code initializes Firebase and gets some references:
importScripts("https://www.gstatic.com/firebasejs/8.3.1/firebase-app.js");
importScripts("https://www.gstatic.com/firebasejs/8.3.1/firebase-auth.js");
importScripts("https://www.gstatic.com/firebasejs/8.3.1/firebase-firestore.js");

// firebase configuration object
const firebaseConfig = {
  apiKey: "XXX",
  authDomain: "XXX",
  projectId: "XXX",
  storageBucket: "XXX",
  messagingSenderId: "XXX",
  appId: "XXX"
};

// initialize firebase
firebase.initializeApp(firebaseConfig);
// get reference to db
const db = firebase.firestore();
// get reference to collection
const collection = db.collection("data");
The following function performs the actual upload inside the web worker (at least, that is the intention):
// function to handle document upload to firestore
const uploadData = async (document) => {
  try {
    // sign in anonymously to firebase
    await firebase.auth().signInAnonymously();
    // upload document
    const docRef = await collection.add(document);
    // output docRef to console
    console.log(docRef);
    // sign out
    firebase.auth().signOut();
  }
  // on error
  catch (error) {
    // output error to console
    const errorCode = error.code;
    const errorMessage = error.message;
    console.log(error, errorCode, errorMessage);
  }
};
Finally, this is the event listener that starts the upload in the web worker:
// event handler
onmessage = async function (e) {
  // cache data
  const workerData = e.data;
  // upload data to firestore
  await uploadData(workerData);
  // send upload finished
  postMessage("upload finished");
};
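For completeness, the main-thread code that drives this worker might look like the following (a sketch; worker.js is an assumed filename for the worker code above):
// Spawn the worker, send it a document, and listen for the reply.
const worker = new Worker("worker.js"); // hypothetical path to the worker above
worker.onmessage = (e) => {
  console.log(e.data); // "upload finished"
};
worker.postMessage({ name: "example", value: 42 });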
The Firestore rules are completely open (just for testing purposes):
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /{document=**} {
      allow read, write;
    }
  }
}
Still, when my web worker runs, I receive the following error in the console:
"permission-denied" "Missing or insufficient permissions"
What am I doing wrong?
For some strange reason, the config object in the current Firebase project was missing the "databaseURL" prop.
I deleted the project, created a new one, and the code worked as expected. As a matter of fact, the new Firebase config object had the "databaseURL" prop correctly set.
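For comparison, a config object with the prop in place looks like this (a sketch with placeholder values; your-project stands in for a real project ID):
const firebaseConfig = {
  apiKey: "XXX",
  authDomain: "your-project.firebaseapp.com",
  databaseURL: "https://your-project.firebaseio.com", // the prop that was missing
  projectId: "your-project",
  storageBucket: "your-project.appspot.com",
  messagingSenderId: "XXX",
  appId: "XXX"
};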

Error: ****@appspot.gserviceaccount.com does not have storage.objects.get access

I have a simple Firebase function that triggers on a file being uploaded to Firebase Storage. It was working on the non-main bucket, but once I changed it to listen to the main bucket I began receiving these error messages:
Error: *****@appspot.gserviceaccount.com does not have storage.objects.get access to *****.appspot.com/ff-icon-01.png.
The function is in the same project as the storage bucket.
const admin = require('firebase-admin');
admin.initializeApp();
const functions = require('firebase-functions');
const { Storage } = require('@google-cloud/storage');
const gcs = new Storage();

import { tmpdir } from 'os';
import { join, dirname } from 'path';
import * as sharp from 'sharp';
import * as fs from 'fs-extra';

export const makeThumbnail = functions.storage
  .object()
  .onFinalize(async object => {
    const bucket = gcs.bucket(object.bucket);
    const filePath = object.name;
    const fileName = filePath.split('/').pop();
    const bucketDir = dirname(filePath);

    const workingDir = join(tmpdir(), 'thumbs');
    const tmpFilePath = join(workingDir, 'source.png');

    if (fileName.includes('thumb@') || !object.contentType.includes('image')) {
      console.log('exiting function');
      return false;
    }

    // 1. Ensure thumbnail dir exists
    await fs.ensureDir(workingDir);

    // 2. Download Source File
    await bucket.file(filePath).download({
      destination: tmpFilePath
    });

    // 3. Resize the images and define an array of upload promises
    const sizes = [64, 128, 256];
    const uploadPromises = sizes.map(async size => {
      const thumbName = `thumb@${size}_${fileName}`;
      const thumbPath = join(workingDir, thumbName);

      // Resize source image
      await sharp(tmpFilePath)
        .resize(size, size)
        .toFile(thumbPath);

      // Upload to GCS
      return bucket.upload(thumbPath, {
        destination: join(bucketDir, thumbName)
      });
    });

    // 4. Run the upload operations
    await Promise.all(uploadPromises);

    // 5. Cleanup remove the tmp/thumbs from the filesystem
    return fs.remove(workingDir);
  });
They have the same rules. Not sure what's up.
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}
For someone else who runs into this issue:
The problem for me was that I was using the wrong project in my gcloud setup when uploading my functions. I was using one project in the Firebase CLI while using another project in the gcloud CLI.
It worked for me once I deleted all the functions, changed the gcloud CLI project to the right one, and uploaded the functions again.
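A quick way to compare the two configurations (a sketch; my-project is a placeholder project ID):
# Show which project the Firebase CLI is using
firebase use

# Show which project the gcloud CLI is configured for
gcloud config get-value project

# Align them if they differ
gcloud config set project my-project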

Uploading files from Firebase Cloud Functions to Cloud Storage

The documentation is too complex for me to understand. It shows how to download a file from Cloud Storage to Cloud Functions, manipulate the file, and then upload the new file to Cloud Storage. I just want to see the basic, minimum instructions for uploading a file from Cloud Functions to Cloud Storage. Why doesn't this work:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.storage = functions.firestore.document('Test_Value').onUpdate((change, context) => {
  var metadata = {
    contentType: 'text',
  };
  admin.storage().ref().put({ 'test': 'test' }, metadata)
    .then(function() {
      console.log("Document written.");
    })
    .catch(function(error) {
      console.error(error);
    });
});
The error message is admin.storage(...).ref is not a function. I'm guessing that firebase-admin includes Firestore but not Storage? Instead of firebase-admin should I use @google-cloud/storage? Why doesn't this work:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
admin.initializeApp();

exports.storage = functions.firestore.document('Test_Value').onUpdate((change, context) => {
  storage.bucket().upload({ 'test': 'test' }, {
    metadata: {
      contentType: 'text'
    }
  });
});
I can't even deploy this code; the error message is
Error parsing triggers: Cannot find module './clone.js'
Apparently an npm module dependency is missing? But none of my modules are called clone.js. I tried requiring child-process-promise, path, os, and fs; none fixed the missing clone.js error.
Why does admin.initializeApp(); lack parameters, when in my index.html file I have:
firebase.initializeApp({
  apiKey: 'swordfish',
  authDomain: 'myapp.firebaseapp.com',
  databaseURL: "https://myapp.firebaseio.com",
  projectId: 'myapp',
  storageBucket: "myapp.appspot.com"
});
Another issue I'm seeing:
npm list -g --depth=0
/Users/TDK/.nvm/versions/node/v6.11.2/lib
├── child_process@1.0.2
├── UNMET PEER DEPENDENCY error: ENOENT: no such file or directory, open '/Users/TDK/.nvm/versions/node/v6.11.2/lib/node_modules/firebase-admin/package.json'
├── firebase-functions@2.1.0
├── firebase-tools@6.0.1
├── firestore-backup-restore@1.3.1
├── fs@0.0.2
├── npm@6.4.1
├── npm-check@5.9.0
├── protractor@5.4.1
├── request@2.88.0
└── watson-developer-cloud@3.13.0
In other words, there's something wrong with firebase-admin, or with Node 6.11.2. Should I use a Node Version Manager to revert to an older version of Node?
Go to https://console.cloud.google.com/iam-admin/iam
Click the pencil icon next to your App Engine default service account
+ ADD ANOTHER ROLE
Add Cloud Functions Service Agent
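The same grant can be made from the command line (a sketch; PROJECT_ID is a placeholder, and the member address assumes the App Engine default service account):
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:PROJECT_ID@appspot.gserviceaccount.com" \
  --role="roles/cloudfunctions.serviceAgent"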
In my specific use case, I needed to decode a base64 string into a byte array and then use that to save the image.
var serviceAccount = require("./../serviceAccountKey.json");

import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';

admin.initializeApp({
  projectId: serviceAccount.project_id,
  credential: admin.credential.cert(serviceAccount),
  databaseURL: "https://your_project_id_here.firebaseio.com", // update this
  storageBucket: "your_bucket_name_here.appspot.com" // update this
});
function uploadProfileImage(imageBytes64Str: string): Promise<any> {
  const bucket = admin.storage().bucket();
  const imageBuffer = Buffer.from(imageBytes64Str, 'base64');
  const imageByteArray = new Uint8Array(imageBuffer);
  const file = bucket.file(`images/profile_photo.png`);
  const options = { resumable: false, metadata: { contentType: "image/jpg" } };
  // options may not be necessary

  return file.save(imageByteArray, options)
    .then(stuff => {
      return file.getSignedUrl({
        action: 'read',
        expires: '03-09-2500'
      });
    })
    .then(urls => {
      const url = urls[0];
      console.log(`Image url = ${url}`);
      return url;
    })
    .catch(err => {
      console.log(`Unable to upload image ${err}`);
    });
}
Then you can call the method like this and chain the calls:
uploadProfileImage(image_bytes_here)
  .then(url => {
    // Do stuff with the url here
  });
Note: You must initialize admin with a service account and specify the default bucket. If you simply do admin.initializeApp(), then your image URLs will expire in 10 days.
Steps to properly use a service account:
Go to Service Accounts and generate a private key
Put the JSON file in your functions folder (next to src and node_modules)
Go to Storage and copy the URL, not including the "gs://" at the front. Use this for the storage bucket URL when initializing admin.
Use your project ID above for the database URL.
See Introduction to the Admin Cloud Storage API for further details on how to use the Cloud Storage service in the Firebase Admin SDK.
var admin = require("firebase-admin");
var serviceAccount = require("path/to/serviceAccountKey.json");

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: "<BUCKET_NAME>.appspot.com"
});

var bucket = admin.storage().bucket();
// 'bucket' is an object defined in the @google-cloud/storage library.
// See https://googlecloudplatform.github.io/google-cloud-node/#/docs/storage/latest/storage/bucket
// for more details.
Regarding uploading objects, see the Uploading Objects sample code in the Cloud Storage documentation:
// Imports the Google Cloud client library
const { Storage } = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const filename = 'Local file to upload, e.g. ./local/path/to/file.txt';

// Uploads a local file to the bucket (note: await must run inside an async function)
await storage.bucket(bucketName).upload(filename, {
  // Support for HTTP requests made with `Accept-Encoding: gzip`
  gzip: true,
  metadata: {
    // Enable long-lived HTTP caching headers
    // Use only if the contents of the file will never change
    // (If the contents will change, use cacheControl: 'no-cache')
    cacheControl: 'public, max-age=31536000',
  },
});

console.log(`${filename} uploaded to ${bucketName}.`);
I uploaded a file from my hard drive to Firebase Cloud Storage via Google Cloud Functions. First, I found the documentation for Google Cloud Functions bucket.upload.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
  const { Storage } = require('@google-cloud/storage');
  const storage = new Storage();
  const bucket = storage.bucket('myapp.appspot.com');
  const options = {
    destination: 'Test_Folder/hello_world.dog'
  };
  bucket.upload('hello_world.ogg', options).then(function(data) {
    const file = data[0];
  });
  return 0;
});
The first three lines are Cloud Functions boilerplate. The next line
exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
creates the Cloud Function and sets the trigger. The next three lines are more Google Cloud boilerplate.
The rest of the code locates the file hello_world.ogg on my computer's hard drive, in the functions folder of my project directory, uploads it to the directory Test_Folder, and renames the file to hello_world.dog in my Firebase Cloud Storage. This returns a promise, and the next line, const file = data[0];, is unnecessary unless you want to do something else with the file.
Lastly we return 0;. This line does nothing except prevent the error message
Function returned undefined, expected Promise or Value
A related gotcha when handling uploads in HTTP functions with Busboy: deployed Cloud Functions expose the already-consumed request body as req.rawBody, while in other environments you pipe the request stream:
if (req.rawBody) {
  // Deployed Cloud Functions: the body has been pre-read into req.rawBody
  busboy.end(req.rawBody);
} else {
  // Local/other environments: stream the request into busboy
  req.pipe(busboy);
}
As described in this issue: https://github.com/GoogleCloudPlatform/cloud-functions-emulator/issues/161#issuecomment-376563784

Get Public URL from file uploaded with firebase-admin

I use firebase-admin and firebase-functions to upload a file to Firebase Storage.
I have these rules in storage:
service firebase.storage {
  match /b/{bucket}/o {
    match /images {
      allow read;
      allow write: if false;
    }
  }
}
And I want to get a public URL with this code:
const config = functions.config().firebase;
const firebase = admin.initializeApp(config);
const bucketRef = firebase.storage();

server.post('/upload', async (req, res) => {
  // UPLOAD FILE
  await stream.on('finish', async () => {
    const fileUrl = bucketRef
      .child(`images/${fileName}`)
      .getDownloadUrl()
      .getResult();
    return res.status(200).send(fileUrl);
  });
});
But I get this error: .child is not a function.
How can I get the public URL of a file with firebase-admin?
From the sample application code in the Using Cloud Storage documentation, you should be able to implement the following code to obtain the public download URL after the upload is successful:
// Create a new blob in the bucket and upload the file data.
const blob = bucket.file(req.file.originalname);
const blobStream = blob.createWriteStream();

blobStream.on('finish', () => {
  // The public URL can be used to directly access the file via HTTP.
  // (format here is Node's util.format: const { format } = require('util'))
  const publicUrl = format(`https://storage.googleapis.com/${bucket.name}/${blob.name}`);
  res.status(200).send(publicUrl);
});
Alternatively, if you need a publicly accessible download URL, see this answer, which suggests using getSignedUrl() from the Cloud Storage NPM module, because the Admin SDK doesn't support this directly:
You'll need to generate a signed URL using getSignedURL via the @google-cloud/storage NPM module.
Example:
const gcs = require('@google-cloud/storage')({ keyFilename: 'service-account.json' });
// ...
const bucket = gcs.bucket(bucket);
const file = bucket.file(fileName);
return file.getSignedUrl({
  action: 'read',
  expires: '03-09-2491'
}).then(signedUrls => {
  // signedUrls[0] contains the file's public URL
});
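Note that in newer releases of @google-cloud/storage the module is no longer callable as above; a sketch of the equivalent setup with the current export style (same keyFilename assumption):
// Newer @google-cloud/storage versions export a Storage class
// instead of a callable module; keyFilename works the same way.
const { Storage } = require('@google-cloud/storage');
const gcs = new Storage({ keyFilename: 'service-account.json' });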
What worked for me is to compose a URL like this:
https://storage.googleapis.com/<bucketName>/<pathToFile>
Example: https://storage.googleapis.com/mybucket.appspot.com/public/myFile.png
How did I find it? I went to the GCP Console, Storage, located the uploaded file, and clicked "Copy URL".
You may want to make the file public first. I did it like this:
const bucket = seFirebaseService.admin().storage().bucket();
await bucket.file(`public/myFile.png`).makePublic();
I've been tinkering with this for days and realized:
A) Correct access rights on the bucket are key:
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read;
      allow write: if request.auth != null;
    }
  }
}
B) The functional public URL is right there in the metadata (tested and works). Notice the access rights.
const pdfDoc = printer.createPdfKitDocument(docDefinition);
const pdfFile = admin
  .storage()
  .bucket()
  .file(newId + '.pdf');

pdfDoc.pipe(
  pdfFile.createWriteStream({
    contentType: 'application/pdf',
    public: true,
  })
);
pdfDoc.end();

console.log('Get public URL');
const publicUrl = pdfFile.metadata.mediaLink;
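Note (an assumption about stream timing, not something the answer above confirms): pdfFile.metadata is only populated once the write stream finishes, so reading mediaLink immediately after pdfDoc.end() may return undefined. A sketch that waits for the finish event first:
// Sketch: read metadata.mediaLink only after the upload stream finishes.
const stream = pdfFile.createWriteStream({
  contentType: 'application/pdf',
  public: true,
});
stream.on('finish', () => {
  // metadata (including mediaLink) is refreshed once the upload completes
  console.log(pdfFile.metadata.mediaLink);
});
pdfDoc.pipe(stream);
pdfDoc.end();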
