UPDATE: The code below works fine as its own "test.js" file. It only fails within the context of Firebase Cloud Functions.
============================
I'm trying to upload a file to Firebase Storage from a Firebase Cloud Function. I'm testing this locally, using
firebase serve --only functions
The snippet below is just to show the issue...I'm actually calling "uploadFile" from within a Cloud Functions endpoint. Everything looks good up to the "bucket.upload(...)" line. I can echo out the options right before that line, but then I get a nondescript error that it "finished with status: 'crash'".
Any ideas? This seems pretty straightforward to me!
const firebase = require('firebase-admin');
var serviceAccount = require("./certs/key-name.json");
firebase.initializeApp({
credential: firebase.credential.cert(serviceAccount),
databaseURL: "https://my-app.firebaseio.com",
storageBucket: "my-app.appspot.com"
});
var storage = firebase.storage();
var metadata = {
id: '1234'
};
uploadFile('myFile.pdf', metadata);
function uploadFile(file, metadata) {
var bucket = storage.bucket();
var options = {
destination: file,
resumable: false,
metadata: {
metadata: metadata
}
};
bucket.upload(file, options, function(err, remoteFile) {
if (!err) {
console.log("Uploaded!");
} else {
console.log(err);
}
});
}
Related
The documentation is too complex for me to understand. It shows how to download a file from Cloud Storage to Cloud Functions, manipulate the file, and then upload the new file to Cloud Storage. I just want to see the basic, minimum instructions for uploading a file from Cloud Functions to Cloud Storage. Why doesn't this work:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
exports.storage = functions.firestore.document('Test_Value').onUpdate((change, context) => {
var metadata = {
contentType: 'text',
};
admin.storage().ref().put( {'test': 'test'}, metadata)
.then(function() {
console.log("Document written.");
})
.catch(function(error) {
console.error(error);
})
});
The error message is admin.storage(...).ref is not a function. I'm guessing that firebase-admin includes Firestore but not Storage? Instead of firebase-admin should I use @google-cloud/storage? Why doesn't this work:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const {Storage} = require('@google-cloud/storage')();
const storage = new Storage();
admin.initializeApp();
exports.storage = functions.firestore.document('Test_Value').onUpdate((change, context) => {
storage.bucket().upload( {'test': 'test'} , {
metadata: {
contentType: 'text'
}
})
});
I can't even deploy this code; the error message is
Error parsing triggers: Cannot find module './clone.js'
Apparently an npm module dependency is missing? But there is no module called clone.js in my code. I tried requiring child-process-promise, path, os, and fs; none fixed the missing clone.js error.
Why does admin.initializeApp(); lack parameters, when in my index.html file I have:
firebase.initializeApp({
apiKey: 'swordfish',
authDomain: 'myapp.firebaseapp.com',
databaseURL: "https://myapp.firebaseio.com",
projectId: 'myapp',
storageBucket: "myapp.appspot.com"
});
Another issue I'm seeing:
npm list -g --depth=0
/Users/TDK/.nvm/versions/node/v6.11.2/lib
├── child_process@1.0.2
├── UNMET PEER DEPENDENCY error: ENOENT: no such file or directory, open '/Users/TDK/.nvm/versions/node/v6.11.2/lib/node_modules/firebase-admin/package.json'
├── firebase-functions@2.1.0
├── firebase-tools@6.0.1
├── firestore-backup-restore@1.3.1
├── fs@0.0.2
├── npm@6.4.1
├── npm-check@5.9.0
├── protractor@5.4.1
├── request@2.88.0
└── watson-developer-cloud@3.13.0
In other words, there's something wrong with firebase-admin, or with Node 6.11.2. Should I use a Node Version Manager to revert to an older version of Node?
Go to https://console.cloud.google.com/iam-admin/iam
Click the pencil icon next to your App Engine default service account
Click + ADD ANOTHER ROLE
Add the Cloud Functions Service Agent role
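If you prefer the command line, the same role can be granted with gcloud; a minimal sketch, where the project ID and service account address are placeholders for your own values:
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:YOUR_PROJECT_ID@appspot.gserviceaccount.com" \
  --role="roles/cloudfunctions.serviceAgent"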
In my specific use case, I needed to decode a base64 string into a byte array and then use that to save the image.
var serviceAccount = require("./../serviceAccountKey.json");
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
admin.initializeApp({
projectId: serviceAccount.project_id,
credential: admin.credential.cert(serviceAccount),
databaseURL: "https://your_project_id_here.firebaseio.com", //update this
storageBucket: "your_bucket_name_here.appspot.com" //update this
});
function uploadProfileImage(imageBytes64Str: string): Promise<any> {
const bucket = admin.storage().bucket()
const imageBuffer = Buffer.from(imageBytes64Str, 'base64')
const imageByteArray = new Uint8Array(imageBuffer);
const file = bucket.file(`images/profile_photo.png`);
const options = { resumable: false, metadata: { contentType: "image/jpg" } }
//options may not be necessary
return file.save(imageByteArray, options)
.then(stuff => {
return file.getSignedUrl({
action: 'read',
expires: '03-09-2500'
})
})
.then(urls => {
const url = urls[0];
console.log(`Image url = ${url}`)
return url
})
.catch(err => {
console.log(`Unable to upload image ${err}`)
})
}
Then you can call the method like this and chain the calls.
uploadProfileImage(image_bytes_here)
.then(url => {
//Do stuff with the url here
})
Note: You must initialize admin with a service account and specify the default bucket. If you simply do admin.initializeApp(), then your image URLs will expire in 10 days.
Steps to properly use a service account.
Go to Service Accounts and generate a private key
Put the JSON file in your functions folder (next to src and node_modules)
Go to Storage and copy the URL, not including the "gs://" at the front. Use this for the storage bucket URL when initializing admin.
Use your project ID above for the database URL.
See Introduction to the Admin Cloud Storage API for further details on how to use the Cloud Storage service in the Firebase Admin SDK.
var admin = require("firebase-admin");
var serviceAccount = require("path/to/serviceAccountKey.json");
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
storageBucket: "<BUCKET_NAME>.appspot.com"
});
var bucket = admin.storage().bucket();
// 'bucket' is an object defined in the @google-cloud/storage library.
// See https://googlecloudplatform.github.io/google-cloud-node/#/docs/storage/latest/storage/bucket
// for more details.
Regarding uploading objects, see Cloud Storage Documentation Uploading Objects sample code:
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');
// Creates a client
const storage = new Storage();
/**
* TODO(developer): Uncomment the following lines before running the sample.
*/
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const filename = 'Local file to upload, e.g. ./local/path/to/file.txt';
// Uploads a local file to the bucket
await storage.bucket(bucketName).upload(filename, {
// Support for HTTP requests made with `Accept-Encoding: gzip`
gzip: true,
metadata: {
// Enable long-lived HTTP caching headers
// Use only if the contents of the file will never change
// (If the contents will change, use cacheControl: 'no-cache')
cacheControl: 'public, max-age=31536000',
},
});
console.log(`${filename} uploaded to ${bucketName}.`);
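Note that the await in that sample only works inside an async function; a minimal wrapper sketch, where the bucket name and file path are placeholders:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

async function uploadLocalFile() {
  // Placeholders; substitute your own bucket and file.
  const bucketName = 'my-bucket';
  const filename = './local/path/to/file.txt';
  await storage.bucket(bucketName).upload(filename, { gzip: true });
  console.log(`${filename} uploaded to ${bucketName}.`);
}

uploadLocalFile().catch(console.error);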
I uploaded a file from my hard drive to Firebase Cloud Storage via Google Cloud Functions. First, I found the documentation for Google Cloud Functions bucket.upload.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const bucket = storage.bucket('myapp.appspot.com');
const options = {
destination: 'Test_Folder/hello_world.dog'
};
bucket.upload('hello_world.ogg', options).then(function(data) {
const file = data[0];
});
return 0;
});
The first three lines are Cloud Functions boilerplate. The next line
exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
creates the Cloud Function and sets the trigger. The next three lines are more Google Cloud boilerplate.
The rest of the code locates the file hello_world.ogg (on my computer's hard drive, in the functions folder of my project directory), uploads it to the directory Test_Folder, and renames it to hello_world.dog in my Firebase Cloud Storage. The upload returns a promise, and the next line, const file = data[0];, is unnecessary unless you want to do something else with the file.
Lastly we return 0;. This line does nothing except prevent the error message
Function returned undefined, expected Promise or Value
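Alternatively, returning the upload promise itself (rather than 0) also satisfies that check and lets Cloud Functions wait for the upload to finish before shutting the instance down; a minimal variation of the same function:
exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  const bucket = storage.bucket('myapp.appspot.com');
  const options = { destination: 'Test_Folder/hello_world.dog' };
  // Returning the promise chain instead of 0.
  return bucket.upload('hello_world.ogg', options).then(data => {
    const file = data[0];
    console.log(`Uploaded ${file.name}`);
  });
});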
if (req.rawBody) {
busboy.end(req.rawBody);
}
else {
req.pipe(busboy);
}
As described in this issue: https://github.com/GoogleCloudPlatform/cloud-functions-emulator/issues/161#issuecomment-376563784
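For context, that snippet belongs inside an HTTP function that parses a multipart upload with Busboy. A minimal sketch, assuming the older (pre-1.0) Busboy constructor that the Cloud Functions docs used at the time:
const functions = require('firebase-functions');
const Busboy = require('busboy');

exports.upload = functions.https.onRequest((req, res) => {
  const busboy = new Busboy({ headers: req.headers });
  const chunks = [];

  busboy.on('file', (fieldname, file, filename) => {
    file.on('data', chunk => chunks.push(chunk));
  });
  busboy.on('finish', () => {
    const fileBuffer = Buffer.concat(chunks);
    res.status(200).send(`Received ${fileBuffer.length} bytes`);
  });

  // In production req.rawBody is populated; in the local emulator it may not be,
  // which is what the linked issue is about.
  if (req.rawBody) {
    busboy.end(req.rawBody);
  } else {
    req.pipe(busboy);
  }
});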
This works perfectly locally with firebase serve:
const gCloudConfig = {
projectId: 'XXXX-X1234',
keyFilename: './key.json'
}
const Storage = require('@google-cloud/storage')(gCloudConfig);
const storageBucket = Storage.bucket(bucketUrl);
storageBucket.upload(file.path, {destination: file.name})
.then(() => {
//
});
But this doesn't work when I deploy to Firebase:
const Storage = require('@google-cloud/storage')();
const storageBucket = Storage.bucket(bucketUrl);
storageBucket.upload(file.path, {destination: file.name})
.then(() => {
//
});
I put this line after the admin.initializeApp(...), since I saw that it fixed the problem for someone, but it still doesn't work.
I've tried a lot of things:
const gCloudConfig = { projectId: 'XXXX-X1234' };
const gCloudConfig = { key: API_KEY };
const gCloudConfig = { key: API_KEY, projectId: 'XXXX-X1234' };
const gCloudConfig = functions.config().firebase;
I'm kinda lost, please help me!
It's easier if you just initialize the Firebase Admin SDK with its default credentials, then access the Cloud Storage APIs via that. There's no need to initialize Storage on its own.
const admin = require('firebase-admin')
admin.initializeApp()
const bucket = admin.storage().bucket()
bucket.upload(localPath, {
destination: remotePath
})
Here, bucket is your project default storage bucket, just like you would have gotten it from the Cloud Storage API.
Note that the no-argument init of the Admin SDK is available when using firebase-functions@1.0.0 or later (currently 1.0.2).
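Putting it together, a minimal sketch of a complete deployable function using that approach (the HTTP trigger, file name, and destination path are placeholders, not taken from the original answer):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.uploadSample = functions.https.onRequest((req, res) => {
  const bucket = admin.storage().bucket();
  // 'sample.txt' is assumed to be deployed alongside the function code.
  return bucket.upload('sample.txt', { destination: 'uploads/sample.txt' })
    .then(() => res.status(200).send('Uploaded'))
    .catch(err => {
      console.error(err);
      res.status(500).send('Upload failed');
    });
});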
Quick question. Long story short, I am getting this error in my Google Cloud Functions log:
Firestore (4.10.1): Could not reach Firestore backend.
Here is my code in my functions file:
// pull in firebase-functions and cors, which are used by the endpoint below
const functions = require('firebase-functions');
const cors = require('cors')({ origin: true });
// pull in firebase
const firebase = require('firebase');
// firestore must be required separately
require("firebase/firestore");
// initialize firebase
const firebaseApp = firebase.initializeApp({
// Firebase configuration.
apiKey: "<Key Here>",
authDomain: "<Auth Domain>",
databaseURL: "<database url>",
projectId: "<project id>",
storageBucket: "<storage bucket>",
messagingSenderId: "<messaging sender id>"
});
// setup the firestore
var fs = firebaseApp.firestore();
exports.search = functions.https.onRequest((request, response) => {
cors(request, response, () => {
// set a reference to the foo table in firestore
var docRef = fs.collection("foo");
// check for the foo in the firestore
docRef.where('bar', '==', <something>).get().then(function(doc) {
if (!doc.docs) {
return db.collection("foo").add({
bar: <something>
})
.then(function(docRef) {
console.log("Document written with ID: ", docRef.id);
})
.catch(function(error) {
console.error("Error adding document: ", error);
});
}
});
});
});
At this point I am stuck. As far as I can tell, I have things set up, but maybe not? I have searched the docs and googled the issue, without much success. Do you see anything wrong?
All right. So the answer to my question is that I was not being very smart. A big thank you to Taha Azzabi for pointing me in the right direction. It turns out my problem was here:
docRef.where('bar', '==', <something>).get().then(function(doc) {
if (!doc.docs) {
return db.collection("foo").add({
bar: <something>
})
This would never work. My query was correct, but the check on doc.docs was incorrect. My code is now:
// setup the firestore
const fs = firebase.firestore();
// set a reference to the document for the searched summoner
var docRef = fs.collection("bars").doc("snickers");
// check for the bar in the firestore
return docRef.get()
.then(function(doc) {
if (!doc.exists) {
return fs.collection("bars").doc("snickers").set({
name: "snickers"
})
.then(function(reference) {
console.log("Document written");
return response.status(200).send('');
})
This is what I was looking for, so I am good to go. Long story short, I was grabbing a collection of results and then trying to check whether a single result existed. What I needed to do was grab a single doc from Firestore and check whether that doc existed. However, the error:
Firestore (4.10.1): Could not reach Firestore backend.
didn't really do a very good job of pointing me in that direction.
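For anyone hitting the same confusion: a where(...).get() resolves to a QuerySnapshot, which you check with .empty or .docs.length, while a doc(...).get() resolves to a DocumentSnapshot, which you check with .exists. A minimal sketch, reusing the fs reference from above:
// QuerySnapshot: the result of a query over a collection.
fs.collection('bars').where('name', '==', 'snickers').get().then(snap => {
  if (snap.empty) {
    console.log('No matching documents');
  }
});

// DocumentSnapshot: the result of fetching one document by ID.
fs.collection('bars').doc('snickers').get().then(doc => {
  if (!doc.exists) {
    console.log('Document does not exist');
  }
});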
Did you install the Firebase CLI?
npm install -g firebase-tools
Did you log in to the Firebase console through the CLI?
firebase login
Did you initialize Firebase Cloud Functions?
firebase init functions
You don't need to reinitialize the app then; you do need to initialize an admin app instance, though.
Here's an example; hope that will help:
const functions = require('firebase-functions')
const admin = require('firebase-admin')
admin.initializeApp(functions.config().firebase)
//trigger a function to fire when new user document created
exports.createUser = functions.firestore
.document('users/{userId}')
.onCreate(event => {
// perform desired operations ...
});
To deploy your functions:
firebase deploy --only functions
Read more here: https://firebase.google.com/docs/functions/get-started
I am trying to store images in Firebase Storage; I am using Node.js.
import * as admin from 'firebase-admin';
const serviceAccount = require('./serviceAccountKey.json'); // path to your key file (placeholder)
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: 'https://xxx-app.firebaseio.com',
storageBucket: 'xxxx-app.appspot.com'
});
function storeHeadShots(){
const bucket = admin.storage().bucket();
bucket.upload('./headshots/Acker.W.png', function(err, file){
if (!err){
console.log('file uploaded')
} else {
console.log('error uploading image: ', err)
}
});
}
storeHeadShots();
Now the above works fine, but I have a folder users/ in Storage, and I am storing images inside that folder. How do I access this folder?
For anybody who was trying to solve the same problem.
The example is taken from the official documentation here but it doesn't say anything about how to put a file inside a folder. The default example puts the file under the root of the bucket with the same name as the local file.
I had to dig a little deeper into the Cloud Storage class documentation here, and I found that the options accept a destination parameter that describes the path of the file inside the bucket.
In the following example, the local file images/bar.png is uploaded to foo/sub/bar.png in the default bucket.
const bucket = admin.storage().bucket();
bucket.upload('images/bar.png', {
destination: 'foo/sub/bar.png',
gzip: true,
metadata: {
cacheControl: 'public, max-age=31536000'
}
}).then(() => {
console.log('file uploaded.');
}).catch(err => {
console.error('ERROR:', err);
});
Hopefully that saved some time for others. Happy uploading!
I'm trying to understand how to upload files to Firebase Storage using Node.js. My first try was to use the Firebase library:
"use strict";
var firebase = require('firebase');
var config = {
apiKey: "AIz...kBY",
authDomain: "em....firebaseapp.com",
databaseURL: "https://em....firebaseio.com",
storageBucket: "em....appspot.com",
messagingSenderId: "95...6"
};
firebase.initializeApp(config);
// Error: firebase.storage is undefined, so not a function
var storageRef = firebase.storage().ref();
var uploadTask = storageRef.child('images/octofez.png').put(file);
// Register three observers:
// 1. 'state_changed' observer, called any time the state changes
// 2. Error observer, called on failure
// 3. Completion observer, called on successful completion
uploadTask.on('state_changed', function(snapshot){
...
}, function(error) {
console.error("Something nasty happened", error);
}, function() {
var downloadURL = uploadTask.snapshot.downloadURL;
console.log("Done. Enjoy.", downloadURL);
});
But it turns out that Firebase cannot upload files from the server side, as it clearly states in the docs:
Firebase Storage is not included in the server side Firebase npm module. Instead, you can use the gcloud Node.js client.
$ npm install --save gcloud
In your code, you can access your Storage bucket using:
var gcloud = require('gcloud')({ ... });
var gcs = gcloud.storage();
var bucket = gcs.bucket('<your-firebase-storage-bucket>');
Can we use gcloud without having an account on Google Cloud Platform? How?
If not, how come uploading files to Firebase Storage from the client side is possible?
Can't we just create a library that makes the same requests from the server side?
How is Firebase Storage connected with Google Cloud Platform at all? Why does Firebase allow us to upload images only from the client side?
My second try was to use the gcloud library, like mentioned in the docs:
var gcloud = require("gcloud");
// The following environment variables are set by app.yaml when running on GAE,
// but will need to be manually set when running locally.
// The storage client is used to communicate with Google Cloud Storage
var storage = gcloud.storage({
projectId: "em...",
keyFilename: 'auth.json'
});
storage.createBucket('octocats', function(err, bucket) {
// Error: 403, accountDisabled
// The account for the specified project has been disabled.
// Create a new blob in the bucket and upload the file data.
var blob = bucket.file("octofez.png");
var blobStream = blob.createWriteStream();
blobStream.on('error', function (err) {
console.error(err);
});
blobStream.on('finish', function () {
var publicUrl = `https://storage.googleapis.com/${bucket.name}/${blob.name}`;
console.log(publicUrl);
});
fs.createReadStream("octofez.png").pipe(blobStream);
});
When using the firebase library on a server, you would typically authorize using a service account, as this gives you admin access to the Realtime Database, for instance. You can use the same service account's credentials file to authorize gcloud.
By the way: a Firebase project is essentially also a Google Cloud Platform project; you can access your Firebase project on both https://console.firebase.google.com and https://console.cloud.google.com and https://console.developers.google.com
You can see your Project ID on the Firebase Console > Project Settings or in the Cloud Console Dashboard
When using the gcloud SDK make sure that you use the (already existing) same bucket that Firebase Storage is using. You can find the bucket name in the Firebase web config object or in the Firebase Storage tab. Basically your code should start like this:
var gcloud = require('gcloud');
var storage = gcloud.storage({
projectId: '<projectID>',
keyFilename: 'service-account-credentials.json'
});
var bucket = storage.bucket('<projectID>.appspot.com');
...
Firebase Storage is now supported by the Admin SDK for Node.js:
https://firebase.google.com/docs/reference/admin/node/admin.storage
// Get the Storage service for the default app
var defaultStorage = firebaseAdmin.storage();
var bucket = defaultStorage.bucket('bucketName');
...
The Firebase Admin SDK allows you to directly access your Google Cloud Storage bucket.
For more detail, visit Introduction to the Admin Cloud Storage API.
var admin = require("firebase-admin");
var serviceAccount = require("path/to/serviceAccountKey.json");
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
storageBucket: "<BUCKET_NAME>.appspot.com"
});
var bucket = admin.storage().bucket();
bucket.upload('Local file to upload, e.g. ./local/path/to/file.txt')
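bucket.upload returns a promise, so in practice you would chain on it; a minimal sketch with a placeholder path, reusing the bucket reference from above:
bucket.upload('./local/path/to/file.txt', { destination: 'uploads/file.txt' })
  .then(data => {
    const file = data[0];
    console.log(`Uploaded as ${file.name}`);
  })
  .catch(err => console.error('Upload failed:', err));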
I hope it will be useful for you. I uploaded a local file, added an access token generated with UUID, and then uploaded the file to Firebase Storage. After that I generate a download URL. If we hit that generated URL, the file is downloaded automatically.
const keyFilename = "./xxxxx.json"; // replace this with your service account key file
const projectId = "xxxx"; // replace with your project id
const bucketName = "xx.xx.appspot.com"; // add your bucket name
var mime = require('mime-types');
const { Storage } = require('@google-cloud/storage');
const uuidv1 = require('uuid/v1'); // this is for unique id generation
const gcs = new Storage({
projectId: projectId,
keyFilename: keyFilename
});
const bucket = gcs.bucket(bucketName);
const filePath = "./sample.odp";
const remotePath = "/test/sample.odp";
const fileMime = mime.lookup(filePath);
//we need to pass those parameters for this function
var upload = (filePath, remoteFile, fileMime) => {
let uuid = uuidv1();
return bucket.upload(filePath, {
destination: remoteFile,
uploadType: "media",
metadata: {
contentType: fileMime,
metadata: {
firebaseStorageDownloadTokens: uuid
}
}
})
.then((data) => {
let file = data[0];
return Promise.resolve("https://firebasestorage.googleapis.com/v0/b/" + bucket.name + "/o/" + encodeURIComponent(file.name) + "?alt=media&token=" + uuid);
});
}
// This call uploads the file and logs the generated download URL
upload(filePath, remotePath, fileMime).then( downloadURL => {
console.log(downloadURL);
});
Note that gcloud is deprecated; use @google-cloud/storage instead.
You can find SERVICE_ACCOUNT_KEY_FILE_PATH at project settings->Service Accounts.
var storage = require('@google-cloud/storage');
var gcs = storage({
projectId: PROJECT_ID,
keyFilename: SERVICE_ACCOUNT_KEY_FILE_PATH
});
// Reference an existing bucket.
var bucket = gcs.bucket(PROJECT_ID + '.appspot.com');
...
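With version 2 and later of @google-cloud/storage the module exports a Storage class instead of a callable function; a minimal sketch of the equivalent setup:
const {Storage} = require('@google-cloud/storage');
var gcs = new Storage({
  projectId: PROJECT_ID,
  keyFilename: SERVICE_ACCOUNT_KEY_FILE_PATH
});

// Reference the existing bucket that Firebase Storage uses.
var bucket = gcs.bucket(PROJECT_ID + '.appspot.com');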
Or you could simply polyfill XMLHttpRequest like so:
const XMLHttpRequest = require("xhr2");
global.XMLHttpRequest = XMLHttpRequest
and import
require('firebase/storage');
That's it. All firebase.storage() methods should now work.
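A minimal sketch of how that might look end to end, reusing the web config object from the question above (the file path is a placeholder):
global.XMLHttpRequest = require('xhr2');

const firebase = require('firebase');
require('firebase/storage');

firebase.initializeApp(config); // the web config object shown earlier

// A Node Buffer is a Uint8Array, which put() accepts.
const data = require('fs').readFileSync('./octofez.png');
firebase.storage().ref().child('images/octofez.png').put(data)
  .then(snapshot => console.log('Uploaded', snapshot.metadata.fullPath))
  .catch(err => console.error(err));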