Firebase cloud function cannot upload simple txt file - firebase

I'm trying to upload a simple test file from my working directory to Firebase Storage. To do so I created this code:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.Storage = functions.https.onCall((data, context) => {
    const {Storage} = require('@google-cloud/storage');
    const storage = new Storage();
    const bucket = storage.bucket('myapp.appspot.com');
    const options = {
        destination: 'Test_Folder/hello_world.dog'
    };
    bucket.upload('./tst.txt', options).then(function(data) {
        const file = data[0];
        if (file) {
            return file;
        } else {
            throw new Error("Something's not working"); // translated from German: "Irgendwas geht ned"
        }
    }).catch(e);
    return 200;
});
Unfortunately firebase is always saying:
Error: ENOENT: no such file or directory, stat '/tst.txt'
My file is located at ./tst.txt in my project.
My goal is to generate a text file inside my Cloud Function and upload it to Firebase Storage. No files are currently stored in Storage. For now I want to be able to upload a file which already exists.
This is how my files are organized: (screenshot of the project tree; tst.txt sits one folder above the functions folder)
The Firebase CLI will deploy all of the files in the functions folder, except for node_modules. Your tst.txt file isn't in there - it's one folder higher. So it's not even being deployed. You will have to move it into functions in order to make it available to the function at runtime.
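If the real goal is to generate the text file inside the function (as stated above) rather than deploy it, a minimal sketch would write to /tmp and upload from there; the bucket name and file contents here are placeholders:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
const fs = require('fs');
const os = require('os');
const path = require('path');
admin.initializeApp();

exports.Storage = functions.https.onCall(async (data, context) => {
    // /tmp is the only writable directory in Cloud Functions.
    const tempPath = path.join(os.tmpdir(), 'tst.txt');
    fs.writeFileSync(tempPath, 'hello world'); // placeholder contents
    const bucket = admin.storage().bucket('myapp.appspot.com'); // placeholder bucket
    await bucket.upload(tempPath, {destination: 'Test_Folder/hello_world.dog'});
    fs.unlinkSync(tempPath); // tmpfs consumes function memory, so clean up
    return 200;
});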

Related

Generating a PDF when a document is created in Firebase Cloud Firestore

I'm developing an app that creates a PDF based on a web form.
I am currently attempting to use pdfmake to generate the PDFs from a Firestore document onCreate trigger:
import * as functions from 'firebase-functions';
const admin = require('firebase-admin');
admin.initializeApp();

const PdfPrinter = require('pdfmake');
const fs = require('fs');

export const createPDF = functions.firestore
    .document('pdfs/{pdf}')
    .onCreate(async (snap, context) => {
        var pdfName = context.params.pdf;
        var printer = new PdfPrinter();

        var docDefinition = {
            // Pdf Definitions
        };
        var options = {
            // Pdf Options
        };

        var pdfDoc = printer.createPdfKitDocument(docDefinition, options);
        pdfDoc.pipe(fs.createWriteStream('tempDoc.pdf'));
        await pdfDoc.end();

        // Upload to Firebase Storage
        const bucket = admin.storage().bucket('myproject.appspot.com');
        bucket.upload('tempDoc.pdf', {
            destination: pdfName + '.pdf',
        });

        return fs.unlinkSync('document.pdf');
    });
The trigger is called, however I get the error "Error: ENOENT: no such file or directory, stat 'document.pdf'".
I have tried it with the onCreate function being async and without.
Any help is greatly appreciated.
It's not possible to write to any file location in Cloud Functions outside of /tmp. If your code needs to write a file, it should build paths off of os.tmpdir() as described in the documentation:
The only writeable part of the filesystem is the /tmp directory, which
you can use to store temporary files in a function instance. This is a
local disk mount point known as a "tmpfs" volume in which data written
to the volume is stored in memory. Note that it will consume memory
resources provisioned for the function.
The rest of the file system is read-only and accessible to the
function.
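Applied to the pdfmake code above, a sketch of the fix (reusing pdfDoc, pdfName, fs, and bucket from the question) builds the temp path off os.tmpdir() and waits for the write stream to finish before uploading:

const os = require('os');
const path = require('path');

// Inside the onCreate handler, replacing the write/upload/unlink steps:
const tempPath = path.join(os.tmpdir(), `${pdfName}.pdf`);
const stream = fs.createWriteStream(tempPath);
pdfDoc.pipe(stream);
pdfDoc.end();

// pdfDoc.end() does not return a promise; wait for the stream instead.
await new Promise((resolve, reject) => {
    stream.on('finish', resolve);
    stream.on('error', reject);
});

await bucket.upload(tempPath, {destination: `${pdfName}.pdf`});
fs.unlinkSync(tempPath); // /tmp is in-memory, so free it when done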

How to include storage service account key in function

I'm trying to include a service account key in my storage function so I can get a long-lived signed URL, following the out-of-date example here:
https://github.com/firebase/functions-samples/blob/b404482342906ee1b46dddb4c75667685ab098a1/generate-thumbnail/functions/index.js#L21
I have downloaded my key from IAM which is in JSON format. I have tried to save it right next to my function
-functions/storage/resizeProfileImg.js
-functions/storage/service-account-credentials.json
-functions/index.js
-functions/admin.js
where resizeProfileImg.js is my function, and I call it like this:

const { Storage } = require('@google-cloud/storage');
const storage = new Storage({ projectId: projectId, keyFilename: './service-account-credentials.json' });
but after deployment, when the function is triggered, I get an error:
Error: ENOENT: no such file or directory, open '/srv/service-account-credentials.json'
I have even tried to load it into a constant like this:

const serviceAccountCredentials = require('./accountKey/service-account-credentials.json');
const { Storage } = require('@google-cloud/storage');
const storage = new Storage({ projectId: projectId, keyFilename: serviceAccountCredentials });
but then I get an error
TypeError: Path must be a string. Received { type: 'service_account',...
Any idea how to do this properly?
In Cloud Functions, the current directory . isn't where your source file is located. It's where the functions folder was deployed. Since your credentials file is in a subdirectory called "storage", you will need to use that in the path.
const serviceAccountCredentials = require('./storage/service-account-credentials.json')
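Alternatively, resolving against __dirname removes any ambiguity about the working directory. A sketch, assuming this runs inside functions/storage/resizeProfileImg.js with the key file next to it (projectId as in the question):

const path = require('path');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage({
    projectId: projectId,
    // __dirname is the deployed directory of this source file.
    keyFilename: path.join(__dirname, 'service-account-credentials.json'),
});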

Cloud Functions: zip multiple documents from Cloud Storage

I already searched through a lot of questions on stack overflow but couldn't find a fitting answer from which I can derive the answer I need:
I want to zip multiple files from a folder within Google Cloud Storage/Firebase Storage with a Cloud Function.
I already found the solution for zipping documents from the local filesystem but could not derive how to do it within a Cloud Function for Cloud Storage.
Google Cloud Storage supports the decompressive form of transcoding, but not a compressive form; however, any user can store a gzip-compressed file in Cloud Storage.
To zip multiple documents from Cloud Storage using Cloud Functions, you can download the files from Cloud Storage to the function instance using gcs.bucket.file(filePath).download(), zip them, and re-upload the result to Cloud Storage. Here you will find an example of downloading, transforming, and uploading a file. You can find an example of zipping multiple files in this StackOverflow thread. This document explains how you can upload objects to Cloud Storage using the Console, gsutil, a code sample, or the REST APIs.
A bit late, but I had the same problem to solve.
The following Firebase Function:
Runs with 1 GB / 120 seconds timeout (for good measure)
Is triggered by WRITE calls (do this only if you have few calls!)
Ignores all paths except background_thumbnail/
Creates a random working directory and deletes it afterwards
Downloads images from Firebase Storage
Zips these images in a folder: background_thumbnail/<IMAGE>
Uploads created ZIP to Firebase Storage
Creates a signed URL for the ZIP file at Firebase Storage
Stores the signed URL in Firestore.
The code can probably be improved and made more elegant, but it works (for now).
// Requires implied by the snippet (functions, admin, fs, os, path, mkdirp, db)
// are added here so the example is self-contained.
const functions = require("firebase-functions");
const admin = require("firebase-admin");
const fs = require("fs");
const os = require("os");
const path = require("path");
const mkdirp = require("mkdirp");
const {v4: uuidv4} = require("uuid"); // for random working dir
const JSZip = require("jszip");

admin.initializeApp();
const db = admin.firestore();

exports.generateThumbnailZip = functions
    .runWith({memory: "1GB", timeoutSeconds: 120})
    .region("europe-west3")
    .storage.object()
    .onFinalize(async (object) => {
      // background_thumbnail/ is the watched folder
      if (!object.name.startsWith("background_thumbnail/")) {
        return functions.logger.log(`Aborting, got: ${object.name}.`);
      }

      const jszip = new JSZip();
      const bucket = admin.storage().bucket();
      const fileDir = path.dirname(object.name);
      const workingDir = path.join(os.tmpdir(), uuidv4());
      const localZipPath = path.join(workingDir, `${fileDir}.zip`);
      const remoteZipPath = `${fileDir}.zip`;

      await mkdirp(workingDir);

      // -------------------------------------------------------------------
      // DOWNLOAD and ZIP
      // -------------------------------------------------------------------
      const [files] = await bucket.getFiles({prefix: `${fileDir}/`});
      for (let index = 0; index < files.length; index++) {
        const file = files[index];
        const name = path.basename(file.name);
        const tempFileName = path.join(workingDir, name);
        functions.logger.log("Downloading tmp file", tempFileName);
        await file.download({destination: tempFileName});
        jszip.folder(fileDir).file(name, fs.readFileSync(tempFileName));
      }

      const content = await jszip.generateAsync({
        type: "nodebuffer",
        compression: "DEFLATE",
        compressionOptions: {level: 9},
      });

      functions.logger.log("Saving zip file", localZipPath);
      fs.writeFileSync(localZipPath, content);

      // -------------------------------------------------------------------
      // UPLOAD ZIP
      // -------------------------------------------------------------------
      functions.logger.log("Uploading zip to storage at", remoteZipPath);
      const uploadResponse = await bucket
          .upload(path.resolve(localZipPath), {destination: remoteZipPath});

      // -------------------------------------------------------------------
      // GET SIGNED URL FOR ZIP AND STORE IT IN DB
      // -------------------------------------------------------------------
      functions.logger.log("Getting signed URLs.");
      const signedResult = await uploadResponse[0].getSignedUrl({
        action: "read",
        expires: "03-01-2500",
      });
      const signedUrl = signedResult[0];

      functions.logger.log("Storing signed URL in db", signedUrl);
      // Stores the signed URL under "zips/<WATCHED DIR>.signedUrl"
      await db.collection("zips").doc(fileDir).set({
        signedUrl: signedUrl,
      }, {merge: true});

      // -------------------------------------------------------------------
      // CLEAN UP
      // -------------------------------------------------------------------
      functions.logger.log("Unlinking working dir", workingDir);
      fs.rmSync(workingDir, {recursive: true, force: true});

      functions.logger.log("DONE");
      return null;
    });

Uploading files from Firebase Cloud Functions to Cloud Storage

The documentation is too complex for me to understand. It shows how to download a file from Cloud Storage to Cloud Functions, manipulate the file, and then upload the new file to Cloud Storage. I just want to see the basic, minimum instructions for uploading a file from Cloud Functions to Cloud Storage. Why doesn't this work:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.storage = functions.firestore.document('Test_Value').onUpdate((change, context) => {
    var metadata = {
        contentType: 'text',
    };
    admin.storage().ref().put( {'test': 'test'}, metadata)
        .then(function() {
            console.log("Document written.");
        })
        .catch(function(error) {
            console.error(error);
        });
});
The error message is admin.storage(...).ref is not a function. I'm guessing that firebase-admin includes Firestore but not Storage? Instead of firebase-admin should I use @google-cloud/storage? Why doesn't this work:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const {Storage} = require('@google-cloud/storage')();
const storage = new Storage();
admin.initializeApp();

exports.storage = functions.firestore.document('Test_Value').onUpdate((change, context) => {
    storage.bucket().upload( {'test': 'test'} , {
        metadata: {
            contentType: 'text'
        }
    });
});
I can't even deploy this code; the error message is
Error parsing triggers: Cannot find module './clone.js'
Apparently an npm module dependency is missing? But my code doesn't require a module called clone.js. I tried requiring child-process-promise, path, os, and fs; none fixed the missing clone.js error.
Why does admin.initializeApp(); lack parameters, when in my index.html file I have:
firebase.initializeApp({
    apiKey: 'swordfish',
    authDomain: 'myapp.firebaseapp.com',
    databaseURL: "https://myapp.firebaseio.com",
    projectId: 'myapp',
    storageBucket: "myapp.appspot.com"
});
Another issue I'm seeing:
npm list -g --depth=0
/Users/TDK/.nvm/versions/node/v6.11.2/lib
├── child_process@1.0.2
├── UNMET PEER DEPENDENCY error: ENOENT: no such file or directory, open '/Users/TDK/.nvm/versions/node/v6.11.2/lib/node_modules/firebase-admin/package.json'
├── firebase-functions@2.1.0
├── firebase-tools@6.0.1
├── firestore-backup-restore@1.3.1
├── fs@0.0.2
├── npm@6.4.1
├── npm-check@5.9.0
├── protractor@5.4.1
├── request@2.88.0
└── watson-developer-cloud@3.13.0
In other words, there's something wrong with firebase-admin, or with Node 6.11.2. Should I use a Node Version Manager to revert to an older version of Node?
Go to https://console.cloud.google.com/iam-admin/iam
Click the pencil icon next to your App Engine default service account
+ ADD ANOTHER ROLE
Add Cloud Functions Service Agent
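If you prefer the command line, a hedged equivalent of those console steps (the project ID is a placeholder):

gcloud projects add-iam-policy-binding MY_PROJECT_ID \
    --member="serviceAccount:MY_PROJECT_ID@appspot.gserviceaccount.com" \
    --role="roles/cloudfunctions.serviceAgent"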
In my specific use case, I needed to decode a base64 string into a byte array and then use that to save the image.
var serviceAccount = require("./../serviceAccountKey.json");

import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';

admin.initializeApp({
    projectId: serviceAccount.project_id,
    credential: admin.credential.cert(serviceAccount),
    databaseURL: "https://your_project_id_here.firebaseio.com", // update this
    storageBucket: "your_bucket_name_here.appspot.com" // update this
});

function uploadProfileImage(imageBytes64Str: string): Promise<any> {
    const bucket = admin.storage().bucket();
    const imageBuffer = Buffer.from(imageBytes64Str, 'base64');
    const imageByteArray = new Uint8Array(imageBuffer);
    const file = bucket.file(`images/profile_photo.png`);
    // 'image/png' matches the .png file above (the original had 'image/jpg')
    const options = { resumable: false, metadata: { contentType: "image/png" } };
    // options may not be necessary
    return file.save(imageByteArray, options)
        .then(stuff => {
            return file.getSignedUrl({
                action: 'read',
                expires: '03-09-2500'
            });
        })
        .then(urls => {
            const url = urls[0];
            console.log(`Image url = ${url}`);
            return url;
        })
        .catch(err => {
            console.log(`Unable to upload image ${err}`);
        });
}
Then you can call the method like this and chain the calls.
uploadProfileImage(image_bytes_here)
    .then(url => {
        // Do stuff with the url here
    });
Note: You must initialize admin with a service account and specify the default bucket. If you simply do admin.initializeApp() then your image urls will expire in 10 days.
Steps to properly use a service account.
Go to Service Accounts and generate a private key
Put the JSON file in your functions folder (next to src and node_modules)
Go to Storage and copy the URL not including the "gs://" in the front. Use this for the storage bucket url when initializing admin.
Use your project ID above for the database URL.
See Introduction to the Admin Cloud Storage API for further details on how to use the Cloud Storage service in the Firebase Admin SDK.
var admin = require("firebase-admin");
var serviceAccount = require("path/to/serviceAccountKey.json");

admin.initializeApp({
    credential: admin.credential.cert(serviceAccount),
    storageBucket: "<BUCKET_NAME>.appspot.com"
});

var bucket = admin.storage().bucket();
// 'bucket' is an object defined in the @google-cloud/storage library.
// See https://googlecloudplatform.github.io/google-cloud-node/#/docs/storage/latest/storage/bucket
// for more details.
Regarding uploading objects, see the Cloud Storage documentation's Uploading Objects sample code:
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const filename = 'Local file to upload, e.g. ./local/path/to/file.txt';

// Uploads a local file to the bucket
await storage.bucket(bucketName).upload(filename, {
    // Support for HTTP requests made with `Accept-Encoding: gzip`
    gzip: true,
    metadata: {
        // Enable long-lived HTTP caching headers
        // Use only if the contents of the file will never change
        // (If the contents will change, use cacheControl: 'no-cache')
        cacheControl: 'public, max-age=31536000',
    },
});

console.log(`${filename} uploaded to ${bucketName}.`);
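Note that the await in that sample only compiles inside an async function; a minimal wrapper, assuming bucketName and filename are set as the TODO comments describe:

async function uploadFile() {
    await storage.bucket(bucketName).upload(filename, {gzip: true});
    console.log(`${filename} uploaded to ${bucketName}.`);
}

uploadFile().catch(console.error);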
I uploaded a file from my hard drive to Firebase Cloud Storage via Google Cloud Functions. First, I found the documentation for Google Cloud Functions bucket.upload.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
    const {Storage} = require('@google-cloud/storage');
    const storage = new Storage();
    const bucket = storage.bucket('myapp.appspot.com');
    const options = {
        destination: 'Test_Folder/hello_world.dog'
    };
    bucket.upload('hello_world.ogg', options).then(function(data) {
        const file = data[0];
    });
    return 0;
});
The first three lines are Cloud Functions boilerplate. The next line
exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
creates the Cloud Function and sets the trigger. The next three lines are more Google Cloud boilerplate.
The rest of the code locates the file hello_world.ogg on my computer's hard drive in the functions folder of my project directory and uploads it to the directory Test_Folder and changes the name of the file to hello_world.dog in my Firebase Cloud Storage. This returns a promise, and the next line const file = data[0]; is unnecessary unless you want to do something else with the file.
Lastly we return 0;. This line does nothing except prevent the error message
Function returned undefined, expected Promise or Value
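A variant that returns the upload promise instead of 0 would make the function wait for the upload to actually finish before the instance is torn down; a sketch based on the same code:

exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
    const {Storage} = require('@google-cloud/storage');
    const storage = new Storage();
    const bucket = storage.bucket('myapp.appspot.com');
    // Returning the promise keeps the instance alive until the upload completes.
    return bucket.upload('hello_world.ogg', {
        destination: 'Test_Folder/hello_world.dog'
    });
});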
From another answer, on parsing multipart requests with busboy in HTTP functions (the framework has already consumed the request stream, so feed busboy req.rawBody when it's present):

if (req.rawBody) {
    busboy.end(req.rawBody);
} else {
    req.pipe(busboy);
}

As described in this issue: https://github.com/GoogleCloudPlatform/cloud-functions-emulator/issues/161#issuecomment-376563784

How to set the destination for file.download() in Google Cloud Storage?

The Google Cloud Storage documentation for download() suggests that a destination folder can be specified:
file.download({
    destination: '/Users/me/Desktop/file-backup.txt'
}, function(err) {});

No matter what value I put in, my file is always downloaded to Firebase Cloud Storage at the root level. This question says that the path can't have an initial slash, but changing the example to

file.download({
    destination: 'Users/me/Desktop/file-backup.txt'
}, function(err) {});

doesn't make a difference.
Changing the destination to

file.download({
    destination: ".child('Test_Folder')",
})

resulted in an error message:

EROFS: read-only file system, open '.child('Test_Folder')'
What is the correct syntax for a Cloud Storage destination (folder and filename)?
Changing the bucket from myapp.appspot.com to myapp.appspot.com/Test_Folder resulted in an error message:
Cannot parse JSON response
Also, the example path appears to specify a location on a personal computer's hard drive. It seems odd to set up a Cloud Storage folder for Desktop. Does this imply that there's a way to specify a destination somewhere other than Cloud Storage?
Here's my code:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
    const {Storage} = require('@google-cloud/storage');
    const storage = new Storage();
    const bucket = storage.bucket('myapp.appspot.com');

    bucket.upload('./hello_world.ogg')
        .then(function(data) {
            const file = data[0];
            file.download({
                destination: 'Test_Folder/hello_dog.ogg',
            })
            .then(function(data) {
                const contents = data[0];
                console.log("File uploaded.");
            })
            .catch(error => {
                console.error(error);
            });
        })
        .catch(error => {
            console.error(error);
        });

    return 0;
});
According to the documentation:
The only writeable part of the filesystem is the /tmp directory, which
you can use to store temporary files in a function instance. This is a
local disk mount point known as a "tmpfs" volume in which data written
to the volume is stored in memory. Note that it will consume memory
resources provisioned for the function.
The rest of the file system is read-only and accessible to the
function.
You should use os.tmpdir() to get the best writable directory for the current runtime.
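For file.download() that means the destination is a local path on the function instance, not a Storage path; a sketch, reusing the file object from the question's code:

const os = require('os');
const path = require('path');

// download() copies FROM Cloud Storage TO the local filesystem,
// so the destination must be a writable local path such as /tmp.
const tempPath = path.join(os.tmpdir(), 'file-backup.txt');
await file.download({destination: tempPath});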
Thanks Doug, the code is working now:
exports.Storage = functions.firestore.document('Storage_Value').onUpdate((change, context) => {
    const {Storage} = require('@google-cloud/storage');
    const storage = new Storage();
    const bucket = storage.bucket('myapp.appspot.com');
    const options = {
        destination: 'Test_Folder/hello_world.dog'
    };
    bucket.upload('hello_world.ogg', options)
        .then(function(data) {
            const file = data[0];
        });
    return 0;
});
The function gets the file hello_world.ogg from the functions folder of my project, writes it to Test_Folder in my Firebase Cloud Storage, and changes the name of the file to hello_world.dog. I copied the download URL and the audio file plays perfectly.
Yesterday I thought it seemed odd that writing a file to Cloud Storage was called download(), when upload() made more sense. :-)
You can download files from Google Cloud Storage to your computer using the following code or command.

Install Python on your PC.
Install the GCS client library: pip install google-cloud-storage
Create a service account key for your project (kesaktopdi in this example; change it to your own account):
https://console.cloud.google.com/apis/credentials/serviceaccountkey?project=kesaktopdi
Download the .json key file and save it in the /home/login/ folder.

import os
from google.cloud import storage  # import implied by storage.Client() below

ACCOUNT_ID = 'kesaktopdi'
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/home/login/" + ACCOUNT_ID + ".json"

def download_blob(bucket_name, source_blob_name, destination_file_name):
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)
    # print('Blob {} downloaded to {}.'.format(source_blob_name, destination_file_name))

download_blob(ACCOUNT_ID + '.appspot.com',   # bucket (kesaktopdi.appspot.com)
              'user.txt',                    # file location on the server
              '/home/login/kesaktopdi.txt')  # destination on your computer

You can also download files from the Google Cloud Storage server to your computer using the following command (source on the server, destination on your computer):

gsutil -m cp -r gs://kesaktopdi.appspot.com/text.txt /home/login
