I am trying to store images in Firebase Storage using Node.js.
import * as admin from 'firebase-admin';

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: 'https://xxx-app.firebaseio.com',
  storageBucket: 'xxxx-app.appspot.com'
});
function storeHeadShots() {
  const bucket = admin.storage().bucket();
  bucket.upload('./headshots/Acker.W.png', function(err, file) {
    if (!err) {
      console.log('file uploaded');
    } else {
      console.log('error uploading image: ', err);
    }
  });
}
storeHeadShots();
Now the above works fine, but I have a folder users/ in Storage, and inside this folder I am storing images. How do I upload into this folder?
For anybody trying to solve the same problem:
The example is taken from the official documentation here but it doesn't say anything about how to put a file inside a folder. The default example puts the file under the root of the bucket with the same name as the local file.
I had to dig a little deeper into the Cloud Storage class documentation here and I found that for the options a destination parameter exists that describes the file inside the bucket.
In the following example the local file images/bar.png will be uploaded to the default bucket's file /foo/sub/bar.png.
const bucket = admin.storage().bucket();

bucket.upload('images/bar.png', {
  destination: 'foo/sub/bar.png',
  gzip: true,
  metadata: {
    cacheControl: 'public, max-age=31536000'
  }
}).then(() => {
  console.log('file uploaded.');
}).catch(err => {
  console.error('ERROR:', err);
});
Hopefully that saved some time for others. Happy uploading!
Related
I'm trying to setup automatic backup of my Firestore using instructions here: https://firebase.google.com/docs/firestore/solutions/schedule-export
I get error:
firestoreExpert
g2o6pmdwatdp
TypeError: Cannot read properties of undefined (reading 'charCodeAt')
at peg$parsetemplate (/workspace/node_modules/google-gax/build/src/pathTemplateParser.js:304:17)
at Object.peg$parse [as parse] (/workspace/node_modules/google-gax/build/src/pathTemplateParser.js:633:18)
at new PathTemplate (/workspace/node_modules/google-gax/build/src/pathTemplate.js:55:54)
Any suggestions to debug this?
I've tried looking for errors in my permissions. For example, I don't know how to check whether the service account has access to the specific bucket, although the gcloud commands ran OK.
I've also tried looking for errors in the script.
index.js
const firestore = require('@google-cloud/firestore');
const client = new firestore.v1.FirestoreAdminClient();

// Replace BUCKET_NAME
const bucket = 'gs://EDITEDHERE.appspot.com'

exports.scheduledFirestoreExport = (event, context) => {
  const databaseName = client.databasePath(
    process.env.GCLOUD_PROJECT,
    '(default)'
  );

  return client
    .exportDocuments({
      name: databaseName,
      outputUriPrefix: bucket,
      // Leave collectionIds empty to export all collections
      // or define a list of collection IDs:
      // collectionIds: ['users', 'posts']
      collectionIds: [],
    })
    .then(responses => {
      const response = responses[0];
      console.log(`Operation Name: ${response['name']}`);
      return response;
    })
    .catch(err => {
      console.error(err);
    });
};
and package.json
{
  "dependencies": {
    "@google-cloud/firestore": "^1.3.0"
  }
}
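One thing worth checking for the charCodeAt error above: on newer Cloud Functions runtimes (Node 10 and later), process.env.GCLOUD_PROJECT is no longer set automatically, so client.databasePath(undefined, '(default)') fails inside google-gax with exactly this TypeError. A minimal sketch of a guard, where the fallback project ID is a placeholder you would replace with your own:

```javascript
// Resolve the project ID with fallbacks; GCLOUD_PROJECT is not set
// automatically on Node 10+ runtimes, which makes databasePath()
// throw "Cannot read properties of undefined (reading 'charCodeAt')".
function resolveProjectId(env, fallback) {
  return env.GCLOUD_PROJECT || env.GCP_PROJECT || fallback;
}

// 'my-project-id' is a hypothetical placeholder.
const projectId = resolveProjectId(process.env, 'my-project-id');
console.log(projectId);
```

You would then pass projectId to client.databasePath(projectId, '(default)') instead of reading the env var directly.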
I found these great video tutorials
How to schedule Firestore backups and
How To Transfer Firestore Data From One Project To Another
The documentation is too complex for me to understand. It shows how to download a file from Cloud Storage to Cloud Functions, manipulate the file, and then upload the new file to Cloud Storage. I just want to see the basic, minimum instructions for uploading a file from Cloud Functions to Cloud Storage. Why doesn't this work:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.storage = functions.firestore.document('Test_Value').onUpdate((change, context) => {
  var metadata = {
    contentType: 'text',
  };
  admin.storage().ref().put( {'test': 'test'}, metadata)
  .then(function() {
    console.log("Document written.");
  })
  .catch(function(error) {
    console.error(error);
  })
});
The error message is admin.storage(...).ref is not a function. I'm guessing that firebase-admin includes Firestore but not Storage? Instead of firebase-admin should I use #google-cloud/storage? Why doesn't this work:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const {Storage} = require('@google-cloud/storage')();
const storage = new Storage();
admin.initializeApp();

exports.storage = functions.firestore.document('Test_Value').onUpdate((change, context) => {
  storage.bucket().upload( {'test': 'test'} , {
    metadata: {
      contentType: 'text'
    }
  })
});
I can't even deploy this code, the error message is
Error parsing triggers: Cannot find module './clone.js'
Apparently a npm module dependency is missing? But the module isn't called clone.js? I tried requiring child-process-promise, path, os, and fs; none fixed the missing clone.js error.
Why does admin.initializeApp(); lack parameters, when in my index.html file I have:
firebase.initializeApp({
  apiKey: 'swordfish',
  authDomain: 'myapp.firebaseapp.com',
  databaseURL: "https://myapp.firebaseio.com",
  projectId: 'myapp',
  storageBucket: "myapp.appspot.com"
});
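On the question of why admin.initializeApp() takes no parameters: inside Cloud Functions, the Admin SDK reads its configuration from the FIREBASE_CONFIG environment variable, a JSON string containing the project ID, database URL, and storage bucket, so the options you pass explicitly in index.html are already supplied by the environment on the server. A rough sketch of what the SDK does under the hood (the config values below are placeholders, not your real project's):

```javascript
// Sketch: the Admin SDK parses FIREBASE_CONFIG when initializeApp()
// is called with no arguments inside Cloud Functions.
function readFirebaseConfig(env) {
  return env.FIREBASE_CONFIG ? JSON.parse(env.FIREBASE_CONFIG) : null;
}

// Simulated environment with placeholder values:
const cfg = readFirebaseConfig({
  FIREBASE_CONFIG: '{"projectId":"myapp","storageBucket":"myapp.appspot.com"}'
});
console.log(cfg.storageBucket); // → myapp.appspot.com
```

Credentials come from the runtime's default service account, which is why no apiKey or cert is needed server-side.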
Another issue I'm seeing:
npm list -g --depth=0
/Users/TDK/.nvm/versions/node/v6.11.2/lib
├── child_process@1.0.2
├── UNMET PEER DEPENDENCY error: ENOENT: no such file or directory, open '/Users/TDK/.nvm/versions/node/v6.11.2/lib/node_modules/firebase-admin/package.json'
├── firebase-functions@2.1.0
├── firebase-tools@6.0.1
├── firestore-backup-restore@1.3.1
├── fs@0.0.2
├── npm@6.4.1
├── npm-check@5.9.0
├── protractor@5.4.1
├── request@2.88.0
└── watson-developer-cloud@3.13.0
In other words, there's something wrong with firebase-admin, or with Node 6.11.2. Should I use a Node Version Manager to revert to an older version of Node?
Go to https://console.cloud.google.com/iam-admin/iam
Click the pencil icon next to your App Engine default service account
+ ADD ANOTHER ROLE
Add Cloud Functions Service Agent
In my specific use case, I needed to decode a base64 string into a byte array and then use that to save the image.
var serviceAccount = require("./../serviceAccountKey.json");

import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';

admin.initializeApp({
  projectId: serviceAccount.project_id,
  credential: admin.credential.cert(serviceAccount),
  databaseURL: "https://your_project_id_here.firebaseio.com", // update this
  storageBucket: "your_bucket_name_here.appspot.com" // update this
});
function uploadProfileImage(imageBytes64Str: string): Promise<any> {
  const bucket = admin.storage().bucket()
  const imageBuffer = Buffer.from(imageBytes64Str, 'base64')
  const imageByteArray = new Uint8Array(imageBuffer);
  const file = bucket.file(`images/profile_photo.png`);
  const options = { resumable: false, metadata: { contentType: "image/jpg" } }

  // options may not be necessary
  return file.save(imageByteArray, options)
    .then(stuff => {
      return file.getSignedUrl({
        action: 'read',
        expires: '03-09-2500'
      })
    })
    .then(urls => {
      const url = urls[0];
      console.log(`Image url = ${url}`)
      return url
    })
    .catch(err => {
      console.log(`Unable to upload image ${err}`)
    })
}
Then you can call the method like this and chain the calls.
uploadProfileImage(image_bytes_here)
  .then(url => {
    // Do stuff with the url here
  })
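The decode step inside uploadProfileImage can be exercised on its own. This minimal sketch uses a hard-coded base64 string standing in for real image data, and shows the Buffer-to-Uint8Array conversion whose result file.save() accepts:

```javascript
// 'aGVsbG8=' is base64 for "hello"; real image bytes decode the same way.
const imageBuffer = Buffer.from('aGVsbG8=', 'base64');
const imageByteArray = new Uint8Array(imageBuffer);

console.log(imageByteArray.length); // → 5
```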
Note: You must initialize admin with a service account and specify the default bucket. If you simply do admin.initializeApp(), then your image URLs will expire in 10 days.
Steps to properly use a service account.
Go to Service Accounts and generate a private key
Put the JSON file in your functions folder (next to src and node_modules)
Go to Storage and copy the URL not including the "gs://" in the front. Use this for the storage bucket url when initializing admin.
Use your project ID above for the database URL.
See Introduction to the Admin Cloud Storage API for further details on how to use the Cloud Storage service in the Firebase Admin SDK.
var admin = require("firebase-admin");
var serviceAccount = require("path/to/serviceAccountKey.json");

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: "<BUCKET_NAME>.appspot.com"
});

var bucket = admin.storage().bucket();

// 'bucket' is an object defined in the @google-cloud/storage library.
// See https://googlecloudplatform.github.io/google-cloud-node/#/docs/storage/latest/storage/bucket
// for more details.
Regarding uploading objects, see Cloud Storage Documentation Uploading Objects sample code:
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const filename = 'Local file to upload, e.g. ./local/path/to/file.txt';

// Uploads a local file to the bucket
// (note: `await` must be called from inside an async function)
await storage.bucket(bucketName).upload(filename, {
  // Support for HTTP requests made with `Accept-Encoding: gzip`
  gzip: true,
  metadata: {
    // Enable long-lived HTTP caching headers
    // Use only if the contents of the file will never change
    // (If the contents will change, use cacheControl: 'no-cache')
    cacheControl: 'public, max-age=31536000',
  },
});

console.log(`${filename} uploaded to ${bucketName}.`);
I uploaded a file from my hard drive to Firebase Cloud Storage via Google Cloud Functions. First, I found the documentation for Google Cloud Functions bucket.upload.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  const bucket = storage.bucket('myapp.appspot.com');
  const options = {
    destination: 'Test_Folder/hello_world.dog'
  };
  bucket.upload('hello_world.ogg', options).then(function(data) {
    const file = data[0];
  });
  return 0;
});
The first three lines are Cloud Functions boilerplate. The next line
exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
creates the Cloud Function and sets the trigger. The next three lines are more Google Cloud boilerplate.
The rest of the code locates the file hello_world.ogg on my computer's hard drive in the functions folder of my project directory and uploads it to the directory Test_Folder and changes the name of the file to hello_world.dog in my Firebase Cloud Storage. This returns a promise, and the next line const file = data[0]; is unnecessary unless you want to do something else with the file.
Lastly we return 0;. This line does nothing except prevent the error message
Function returned undefined, expected Promise or Value
if (req.rawBody) {
  busboy.end(req.rawBody);
}
else {
  req.pipe(busboy);
}
As described in this issue: https://github.com/GoogleCloudPlatform/cloud-functions-emulator/issues/161#issuecomment-376563784
The Google Cloud Storage documentation for download() suggests that a destination folder can be specified:
file.download({
  destination: '/Users/me/Desktop/file-backup.txt'
}, function(err) {});
No matter what value I put in, my file is always downloaded to Firebase Cloud Storage at the root level. This question says that the path can't have an initial slash, but changing the example to
file.download({
  destination: 'Users/me/Desktop/file-backup.txt'
}, function(err) {});
doesn't make a difference.
Changing the destination to
file.download({
  destination: ".child('Test_Folder')",
})
resulted in an error message:
EROFS: read-only file system, open '.child('Test_Folder')'
What is the correct syntax for a Cloud Storage destination (folder and filename)?
Changing the bucket from myapp.appspot.com to myapp.appspot.com/Test_Folder resulted in an error message:
Cannot parse JSON response
Also, the example path appears to specify a location on a personal computer's hard drive. It seems odd to set up a Cloud Storage folder for Desktop. Does this imply that there's a way to specify a destination somewhere other than Cloud Storage?
Here's my code:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  const bucket = storage.bucket('myapp.appspot.com');
  bucket.upload('./hello_world.ogg')
    .then(function(data) {
      const file = data[0];
      file.download({
        destination: 'Test_Folder/hello_dog.ogg',
      })
      .then(function(data) {
        const contents = data[0];
        console.log("File uploaded.");
      })
      .catch(error => {
        console.error(error);
      });
    })
    .catch(error => {
      console.error(error);
    });
  return 0;
});
According to the documentation:

The only writeable part of the filesystem is the /tmp directory, which you can use to store temporary files in a function instance. This is a local disk mount point known as a "tmpfs" volume in which data written to the volume is stored in memory. Note that it will consume memory resources provisioned for the function.

The rest of the file system is read-only and accessible to the function.
You should use os.tmpdir() to get the best writable directory for the current runtime.
Thanks Doug, the code is working now:
exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage();
  const bucket = storage.bucket('myapp.appspot.com');
  const options = {
    destination: 'Test_Folder/hello_world.dog'
  };
  bucket.upload('hello_world.ogg', options)
    .then(function(data) {
      const file = data[0];
    });
  return 0;
});
The function gets the file hello_world.ogg from the functions folder of my project, then writes it to Test_Folder in my Firebase Cloud Storage, and changes the name of the file to hello_world.dog. I copied the download URL and audio file plays perfectly.
Yesterday I thought it seemed odd that writing a file to Cloud Storage was called download(), when upload() made more sense. :-)
You can download files from Google Cloud Storage to your computer using the following code or command.

Install Python on your PC.
Install the Cloud Storage client library:
pip install google-cloud-storage
Create a service account key for your project (kesaktopdi in this example) at
https://console.cloud.google.com/apis/credentials/serviceaccountkey?project=kesaktopdi
then download the .json file and save it in the /home/login/ folder.

import os
from google.cloud import storage

ACCOUNT_ID = 'kesaktopdi'
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/home/login/" + ACCOUNT_ID + ".json"

def download_blob(bucket_name, source_blob_name, destination_file_name):
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)
    # print('Blob {} downloaded to {}.'.format(source_blob_name, destination_file_name))

download_blob(ACCOUNT_ID + '.appspot.com',  # bucket name
              'user.txt',                   # file location on the server
              '/home/login/kesaktopdi.txt') # destination on your computer

You can also download files from the Google Cloud Storage server to your computer using the following command, where the first path is the file location on the server and the second is the destination on your computer:

gsutil -m cp -r gs://kesaktopdi.appspot.com/text.txt /home/login
UPDATE: The code below works fine as it's own "test.js" file. It only doesn't work within the context of Firebase Cloud Functions.
============================
I'm trying to upload a file to Firebase Storage from a Firebase Cloud Function. I'm testing this locally, using
firebase serve --only functions
The snippet below is just to see the issue...I'm actually calling "uploadFile" from within a Cloud Functions endpoint. Everything looks good up to the "bucket.upload(...)" line. I can echo out the options right before that line, then I get a non-descript error that it "finished with status: 'crash'".
Any ideas? This seems pretty straightforward to me!
const firebase = require('firebase-admin');
var serviceAccount = require("./certs/key-name.json");

firebase.initializeApp({
  credential: firebase.credential.cert(serviceAccount),
  databaseURL: "https://my-app.firebaseio.com",
  storageBucket: "my-app.appspot.com"
});

var storage = firebase.storage();

var metadata = {
  id: '1234'
};

uploadFile('myFile.pdf', metadata);

function uploadFile(file, metadata) {
  var bucket = storage.bucket();
  var options = {
    destination: file,
    resumable: false,
    metadata: {
      metadata: metadata
    }
  };

  bucket.upload(file, options, function(err, remoteFile) {
    if (!err) {
      console.log("Uploaded!");
    } else {
      console.log(err);
    }
  });
}
I am having the opposite as this issue:
issues deleting an image using Cloud Functions for Firebase and #google-cloud/storage
(for the record, I have tried all things suggested there).
Basically I have a known file path, then a cloud function triggered by a database event.
I can initialise a bucket, get a file as well as its name, but then when I try and download it I get API Error: not found.
Here is my code:
module.exports = (orgID, reportID) => {
  const bucket = gcs.bucket("MY_PROJECT.appspot.com");
  const filePath = `/safety_hotline/${orgID}/${reportID}`;
  const file = bucket.file(filePath);
  // the name is shown correctly in the console
  console.log(file.name);
  const tempLocalFile = path.join(os.tmpdir(), filePath);
  const tempLocalDir = path.dirname(tempLocalFile);
  return mkdirp(tempLocalDir)
    .then(() => {
      // Download file from bucket.
      return file.download({ destination: tempLocalFile });
    })
    .then(() => {
      console.log("file downloaded successfully");
    })
    .catch(err => {
      console.log(err);
    });
}
You can see I get the console log of the file name, so I don't understand why I can't then download it?
Any advice would be amazing, thanks!
Edit: edited code a bit for clarity
I see you have this line:
const filePath = `/safety_hotline/${orgID}/${reportID}`;
I am guessing that you may have named your objects with the pattern safety_hotline/org/report, but as written above the first character of the object name would be a slash. That's also a legal object name but it's usually unintentional. Try removing the slash?
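A small sketch of that fix, stripping any leading slashes before building the file reference (the path shown is the pattern from the question, with placeholder IDs):

```javascript
// Object names in Cloud Storage should not start with '/';
// strip leading slashes before calling bucket.file().
function normalizeObjectPath(p) {
  return p.replace(/^\/+/, '');
}

console.log(normalizeObjectPath('/safety_hotline/orgID/reportID'));
// → safety_hotline/orgID/reportID
```

You would then call bucket.file(normalizeObjectPath(filePath)) so the name matches the object actually stored in the bucket.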
Did you try following the functions-samples repository? I followed it and it downloaded the file successfully.