Function execution took 60002 ms, finished with status: 'timeout' [duplicate] - firebase

This question already has an answer here:
Google Cloud Function Environment Timing out on every function
(1 answer)
Closed 4 years ago.
I have a Firebase cloud function that is loosely based on this example
https://github.com/firebase/functions-samples/tree/master/image-sharp
Today, when deploying some small changes it gave me this warning
$ firebase deploy --only functions
⚠ functions: package.json indicates an outdated version of firebase-functions.
Please upgrade using npm install --save firebase-functions@latest in your functions directory.
So I did the upgrade, which bumped firebase-functions from ^1.0.3 to ^2.0.0.
Since then, I have been getting this when running the function:
Function execution took 60002 ms, finished with status: 'timeout'
instead of the usual
Function execution took 10 ms, finished with status: 'ok'
I started stripping down my function, but even at bare bones it still produced the error.
I then started a new project, used the example function as-is, and it behaves exactly the same way: with firebase-functions ^2.0.0 it gives the timeout error, but with ^1.0.0 it works fine.
Is this a known issue?
Thanks
Here is the example code:
exports.generateThumbnail = functions.storage.object().onFinalize((object) => {
  const fileBucket = object.bucket; // The Storage bucket that contains the file.
  const filePath = object.name; // File path in the bucket.
  const contentType = object.contentType; // File content type.
  // Exit if this is triggered on a file that is not an image.
  if (!contentType.startsWith('image/')) {
    console.log('This is not an image.');
    return null;
  }
  // Get the file name.
  const fileName = path.basename(filePath);
  // Exit if the image is already a thumbnail.
  if (fileName.startsWith('thumb_')) {
    console.log('Already a Thumbnail.');
    return null;
  }
  // Download file from bucket.
  const bucket = gcs.bucket(fileBucket);
  const metadata = {
    contentType: contentType,
  };
  // We add a 'thumb_' prefix to the thumbnail's file name. That's where we'll upload the thumbnail.
  const thumbFileName = `thumb_${fileName}`;
  const thumbFilePath = path.join(path.dirname(filePath), thumbFileName);
  // Create write stream for uploading thumbnail.
  const thumbnailUploadStream = bucket.file(thumbFilePath).createWriteStream({metadata});
  // Create Sharp pipeline for resizing the image and use pipe to read from bucket read stream.
  const pipeline = sharp();
  pipeline
    .resize(THUMB_MAX_WIDTH, THUMB_MAX_HEIGHT)
    .max()
    .pipe(thumbnailUploadStream);
  bucket.file(filePath).createReadStream().pipe(pipeline);
  const streamAsPromise = new Promise((resolve, reject) =>
    thumbnailUploadStream.on('finish', resolve).on('error', reject));
  return streamAsPromise.then(() => {
    console.log('Thumbnail created successfully');
    return null;
  });
});

I was going to comment, but that requires 50+ reputation...
Anyway, I am experiencing the same problem:
exports.sendNotificationForMessage = functions.firestore.document('chatrooms/{chatroomId}/messages/{messageId}').onCreate((snap, context) => {
  const newMessage = snap.data();
  const messageContent = newMessage.text;
  const senderName = newMessage.senderDisplayName;
  const senderId = newMessage.senderId;
  const chatroomId = context.params.chatroomId;
  console.log(newMessage);
  return true;
});
It finishes with status timeout.
If it's a problem with firebase-functions 2.0, what is the command to downgrade back to version 1.x? I Googled it but had no luck.
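For reference, npm can pin an older release directly. A minimal sketch, run inside the functions directory (the exact 1.x version is your choice; ^1.1.0 here is only an example):
# Installs a 1.x release of firebase-functions and records it in package.json.
npm install --save firebase-functions@^1.1.0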

Try calling resolve and reject with ():
exports.generateThumbnail = functions.storage.object().onFinalize((object) => {
  const fileBucket = object.bucket; // The Storage bucket that contains the file.
  const filePath = object.name; // File path in the bucket.
  const contentType = object.contentType; // File content type.
  // Exit if this is triggered on a file that is not an image.
  if (!contentType.startsWith('image/')) {
    console.log('This is not an image.');
    return null;
  }
  // Get the file name.
  const fileName = path.basename(filePath);
  // Exit if the image is already a thumbnail.
  if (fileName.startsWith('thumb_')) {
    console.log('Already a Thumbnail.');
    return null;
  }
  // Download file from bucket.
  const bucket = gcs.bucket(fileBucket);
  const metadata = {
    contentType: contentType,
  };
  // We add a 'thumb_' prefix to the thumbnail's file name. That's where we'll upload the thumbnail.
  const thumbFileName = `thumb_${fileName}`;
  const thumbFilePath = path.join(path.dirname(filePath), thumbFileName);
  // Create write stream for uploading thumbnail.
  const thumbnailUploadStream = bucket.file(thumbFilePath).createWriteStream({metadata});
  // Create Sharp pipeline for resizing the image and use pipe to read from bucket read stream.
  const pipeline = sharp();
  pipeline.resize(THUMB_MAX_WIDTH, THUMB_MAX_HEIGHT).max().pipe(thumbnailUploadStream);
  bucket.file(filePath).createReadStream().pipe(pipeline);
  const streamAsPromise = new Promise((resolve, reject) =>
    thumbnailUploadStream.on('finish', resolve()).on('error', reject()));
  return streamAsPromise.then(() => {
    console.log('Thumbnail created successfully');
    return null;
  });
});

Related

Firebase Cloud Function - Container worker exceeded memory limit of 256 MiB with 258 MiB used after servicing 29 requests - GenerateThumbnail

I am developing an Android app with Firebase as a backend. Still in the prototyping phase: single user, no heavy traffic at all. I have deployed 10 Cloud Functions so far, with no tweaking of memory (256MB) or other settings. One of them is
generateThumbnail from the samples (slightly modified). As I test my app, new images are uploaded to the bucket and thumbnails are created in the same folder. Basically, the function worked as expected. However, yesterday I got this last log statement before an error:
Container worker exceeded memory limit of 256 MiB with 258 MiB used after servicing 29 requests total. Consider setting a larger instance class and
and then actual error:
Function invocation was interrupted. Error: function terminated. Recommended action: inspect logs for termination reason. Additional troubleshooting documentation can be found at https://cloud.google.com/functions/docs/troubleshooting#logging
Again, I am currently a single user, and the function has been triggered probably around 50 times so far. Obviously something is not working as expected.
This is the function:
exports.generateThumbnail = functions.storage.object().onFinalize(async (object) => {
  // File and directory paths.
  const filePath = object.name;
  const contentType = object.contentType; // This is the image MIME type
  const fileDir = path.dirname(filePath);
  const fileName = path.basename(filePath);
  const thumbFilePath = path.normalize(path.join(fileDir, `${THUMB_PREFIX}${fileName}`));
  const tempLocalFile = path.join(os.tmpdir(), filePath);
  const tempLocalDir = path.dirname(tempLocalFile);
  const tempLocalThumbFile = path.join(os.tmpdir(), thumbFilePath);
  // Folder name is the doc ID from the pozes-test collection.
  const folderName = path.basename(fileDir);
  const docIdFromFolderName = path.basename(fileDir);
  // Exit if this is triggered on a file that is not an image.
  if (!contentType.startsWith('image/')) {
    return functions.logger.log('This is not an image.');
  }
  // Exit if the image is already a thumbnail.
  if (fileName.startsWith(THUMB_PREFIX)) {
    return functions.logger.log('Already a Thumbnail.');
  }
  // Cloud Storage files.
  const bucket = admin.storage().bucket(object.bucket);
  const file = bucket.file(filePath);
  const thumbFile = bucket.file(thumbFilePath);
  const metadata = {
    contentType: contentType,
    // To enable client-side caching you can set the Cache-Control headers here.
    'Cache-Control': 'public,max-age=3600',
  };
  // Create the temp directory where the storage file will be downloaded.
  await mkdirp(tempLocalDir);
  // Download file from bucket.
  await file.download({destination: tempLocalFile});
  functions.logger.log('The file has been downloaded to', tempLocalFile);
  // Generate a thumbnail using ImageMagick.
  await spawn('convert', [tempLocalFile, '-thumbnail', `${THUMB_MAX_WIDTH}x${THUMB_MAX_HEIGHT}>`, tempLocalThumbFile], {capture: ['stdout', 'stderr']});
  functions.logger.log('Thumbnail created at', tempLocalThumbFile);
  // Uploading the Thumbnail.
  await bucket.upload(tempLocalThumbFile, {destination: thumbFilePath, metadata: metadata});
  functions.logger.log('Thumbnail uploaded to Storage at', thumbFilePath);
  // Once the image has been uploaded delete the local files to free up disk space.
  fs.unlinkSync(tempLocalFile);
  fs.unlinkSync(tempLocalThumbFile);
  // Get the Signed URLs for the thumbnail and original image.
  const results = await Promise.all([
    thumbFile.getSignedUrl({
      action: 'read',
      expires: '03-01-2500',
    }),
    file.getSignedUrl({
      action: 'read',
      expires: '03-01-2500',
    }),
  ]);
  functions.logger.log('Got Signed URLs.');
  const thumbResult = results[0];
  const originalResult = results[1];
  const thumbFileUrl = thumbResult[0];
  const fileUrl = originalResult[0];
  // Add the URLs to the Database.
  if (fileName == "image_0") {
    await admin.firestore().collection('testCollection').doc(docIdFromFolderName).update({thumbnail: thumbFileUrl});
    return functions.logger.log('Thumbnail URLs saved to database.');
  } else {
    return ("fileName: " + fileName + " , nothing written to firestore");
  }
});
This is from my package.json:
"dependencies": {
"firebase-admin": "^10.0.2",
"firebase-functions": "^3.22.0",
"googleapis": "^105.0.0",
"child-process-promise": "^2.2.1",
"mkdirp": "^1.0.3"
Can someone please explain what could be causing this? Why is this function exceeding 256MB of memory with so little traffic? Is this working memory? Could it be that files are not getting deleted from the tmp folder?
I have recreated the setup from your given code and data, but I am not getting any kind of error like you are experiencing. I have also tried with a large image of more than 20MB, but memory consumption for the function still hovers around 60MB per call.
More specifically, I have also tried to hammer the function by invoking it more than 40 times, but still no error pops up for me.
I think it is best to create a GitHub issue about this under the same repository you linked (generateThumbnail from samples), so the actual engineers behind the product can support you, or you can also try to contact Firebase support.
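On the tmp-folder hypothesis from the question: /tmp in Cloud Functions is an in-memory filesystem, so files left behind do count against the instance's RAM, and the unlinkSync calls in the function above are skipped whenever an earlier await throws. A minimal sketch of making the cleanup unconditional (createThumbnail and its paths argument are hypothetical names; the calls mirror the question's code):
// Sketch: guarantee temp-file cleanup even if a step throws, so leftovers
// in /tmp (which count against function memory) cannot accumulate.
async function createThumbnail(file, bucket, paths, metadata) {
  const {tempLocalDir, tempLocalFile, tempLocalThumbFile, thumbFilePath} = paths;
  try {
    await mkdirp(tempLocalDir);
    await file.download({destination: tempLocalFile});
    await spawn('convert', [tempLocalFile, '-thumbnail', `${THUMB_MAX_WIDTH}x${THUMB_MAX_HEIGHT}>`, tempLocalThumbFile], {capture: ['stdout', 'stderr']});
    await bucket.upload(tempLocalThumbFile, {destination: thumbFilePath, metadata: metadata});
  } finally {
    // existsSync guards against a failure before a given file was created.
    if (fs.existsSync(tempLocalFile)) fs.unlinkSync(tempLocalFile);
    if (fs.existsSync(tempLocalThumbFile)) fs.unlinkSync(tempLocalThumbFile);
  }
}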

Generate thumbnail cloud function error with GCS

I have an app in which I want to generate a thumbnail for every image uploaded to Storage. I'm trying to use the generate-thumbnail Cloud Function, but when an image is uploaded to Storage the Cloud Function logs an error in Firebase:
TypeError: gcs(...).bucket is not a function
at exports.generateThumbnail.functions.storage.object.onFinalize (/user_code/index.js:77:73)
at cloudFunctionNewSignature (/user_code/node_modules/firebase-functions/lib/cloud-functions.js:105:23)
at cloudFunction (/user_code/node_modules/firebase-functions/lib/cloud-functions.js:135:20)
at /var/tmp/worker/worker.js:768:24
at process._tickDomainCallback (internal/process/next_tick.js:135:7)
Here is my index.js file (only the GCS-related parts are included):
const functions = require("firebase-functions");
const gcs = require("@google-cloud/storage");
admin.initializeApp();
const THUMB_MAX_HEIGHT = 200;
const THUMB_MAX_WIDTH = 200;
// Thumbnail prefix added to file names.
const THUMB_PREFIX = 'thumb_';
exports.generateThumbnail = functions.storage.object().onFinalize((object) => {
  // File and directory paths.
  const filePath = object.name;
  const contentType = object.contentType; // This is the image MIME type
  const fileDir = path.dirname(filePath);
  const fileName = path.basename(filePath);
  const thumbFilePath = path.normalize(path.join(fileDir, `${THUMB_PREFIX}${fileName}`));
  const tempLocalFile = path.join(os.tmpdir(), filePath);
  const tempLocalDir = path.dirname(tempLocalFile);
  const tempLocalThumbFile = path.join(os.tmpdir(), thumbFilePath);
  // Exit if this is triggered on a file that is not an image.
  if (!contentType.startsWith('image/')) {
    console.log('This is not an image.');
    return null;
  }
  // Exit if the image is already a thumbnail.
  if (fileName.startsWith(THUMB_PREFIX)) {
    console.log('Already a Thumbnail.');
    return null;
  }
  // Cloud Storage files.
  const bucket = gcs({keyFilename: 'service-account-credentials.json'}).bucket(object.bucket);
  const file = bucket.file(filePath);
  const thumbFile = bucket.file(thumbFilePath);
  const metadata = {
    contentType: contentType,
    // To enable client-side caching you can set the Cache-Control headers here. Uncomment below.
    // 'Cache-Control': 'public,max-age=3600',
  };
  // Create the temp directory where the storage file will be downloaded.
  return mkdirp(tempLocalDir).then(() => {
    // Download file from bucket.
    return file.download({destination: tempLocalFile});
  }).then(() => {
    console.log('The file has been downloaded to', tempLocalFile);
    // Generate a thumbnail using ImageMagick.
    return spawn('convert', [tempLocalFile, '-thumbnail', `${THUMB_MAX_WIDTH}x${THUMB_MAX_HEIGHT}>`, tempLocalThumbFile], {capture: ['stdout', 'stderr']});
  }).then(() => {
    console.log('Thumbnail created at', tempLocalThumbFile);
    // Uploading the Thumbnail.
    return bucket.upload(tempLocalThumbFile, {destination: thumbFilePath, metadata: metadata});
  }).then(() => {
    console.log('Thumbnail uploaded to Storage at', thumbFilePath);
    // Once the image has been uploaded delete the local files to free up disk space.
    fs.unlinkSync(tempLocalFile);
    fs.unlinkSync(tempLocalThumbFile);
    // Get the Signed URLs for the thumbnail and original image.
    const config = {
      action: 'read',
      expires: '03-01-2500',
    };
    return Promise.all([
      thumbFile.getSignedUrl(config),
      file.getSignedUrl(config),
    ]);
  }).then((results) => {
    console.log('Got Signed URLs.');
    const thumbResult = results[0];
    const originalResult = results[1];
    const thumbFileUrl = thumbResult[0];
    const fileUrl = originalResult[0];
    console.log('Got Signed URLs. ' + thumbFileUrl);
    return results;
  }).then(() => console.log('Thumbnail URLs saved to database.'));
});
Unable to understand what the issue is!
I made changes according to the suggestion in an answer:
require(...) is not a function
at Object.<anonymous> (D:\mercury_two\mercury\functions\index.js:20:45)
at Module._compile (internal/modules/cjs/loader.js:678:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:689:10)
at Module.load (internal/modules/cjs/loader.js:589:32)
at tryModuleLoad (internal/modules/cjs/loader.js:528:12)
at Function.Module._load (internal/modules/cjs/loader.js:520:3)
at Module.require (internal/modules/cjs/loader.js:626:17)
at require (internal/modules/cjs/helpers.js:20:18)
at C:\Users\Harsha\AppData\Roaming\npm\node_modules\firebase-tools\lib\triggerParser.js:15:15
at Object.<anonymous> (C:\Users\Harsha\AppData\Roaming\npm\node_modules\firebase-tools\lib\triggerParser.js:53:3)
Then I can't even deploy the function; 'firebase deploy' throws the above error.
You should not do
const bucket = gcs({keyFilename: 'service-account-credentials.json'}).bucket(object.bucket);
but only
const bucket = gcs.bucket(object.bucket);
You need to use the service-account-credentials.json (i.e. the Service Account Key JSON file) only when you require the gcs module, at the top of your Cloud Function, as follows:
const gcs = require('@google-cloud/storage')({keyFilename: 'service-account-credentials.json'});
See the official Firebase sample that shows that in detail: https://github.com/firebase/functions-samples/blob/master/generate-thumbnail/functions/index.js
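Note that on v2+ of @google-cloud/storage the module exports a Storage class instead of a callable function, so the require style above only applies to v1.x of the library. A sketch of the equivalent setup on the newer API:
// Sketch for @google-cloud/storage v2+: the module exports a Storage class.
const {Storage} = require('@google-cloud/storage');
const gcs = new Storage({keyFilename: 'service-account-credentials.json'});
const bucket = gcs.bucket(object.bucket);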

Cloud function generating thumbnails doesn't overwrite old thumbnail with the same name

I'm using Cloud Functions to resize and generate thumbnails for uploaded images in Firebase Storage. On the first upload the thumbnails are generated, but I also want to be able to edit those images while keeping the same name.
This is how I'm doing it:
I upload an image with this function on the client:
uploadImage(imageFile, folderName, imageName) {
  const storageRef = firebase.storage().ref();
  // Need to prefix image name with "slot_".
  storageRef.child(`${folderName}/slot_${imageName}`).put(imageFile);
}
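As written, put() is fire-and-forget, so the caller never learns when the upload (and therefore thumbnail generation) has finished. A minimal sketch that surfaces completion, assuming the same arguments as above:
// Sketch: return the upload promise so callers can await completion
// before expecting the thumbnails to exist.
async uploadImage(imageFile, folderName, imageName) {
  const storageRef = firebase.storage().ref();
  // put() returns an UploadTask; awaiting it yields the UploadTaskSnapshot.
  const snapshot = await storageRef.child(`${folderName}/slot_${imageName}`).put(imageFile);
  return snapshot.ref.getDownloadURL();
}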
Thumbnails are then generated with this Cloud Function:
export const generateThumbs = functions.storage.object().onFinalize(async object => {
  const bucket = gcs.bucket(object.bucket);
  const filePath = object.name;
  const fileName = filePath.split('/').pop();
  const bucketDir = dirname(filePath);
  const slotSizes = [64, 250];
  const temporaryDirectory = join(tmpdir(), `thumbs_${fileName}`);
  const temporaryFilePath = join(temporaryDirectory, 'source.png');
  // Avoid loops; include only images.
  if (fileName.includes('thumb_') || !object.contentType.includes('image')) {
    return false;
  }
  await fileSystem.ensureDir(temporaryDirectory);
  // Original file in temp directory.
  await bucket.file(filePath).download({
    destination: temporaryFilePath
  });
  const slotUploadPromises = slotSizes.map(async size => {
    const thumbName = `thumb_${size}_${fileName}`;
    const thumbPath = join(temporaryDirectory, thumbName);
    await sharp(temporaryFilePath).resize(size, size).toFile(thumbPath);
    return bucket.upload(thumbPath, {
      destination: join(bucketDir, thumbName),
      metadata: {
        contentType: 'image/jpeg',
      }
    });
  });
  await Promise.all(slotUploadPromises);
  // Removes temp directory.
  return fileSystem.remove(temporaryDirectory);
});
So if I call uploadImage(appleImage, 'MyImages', 'test') I'll have 3 images in my storage folder (naming IS important):
slot_test
thumb_250_slot_test
thumb_64_slot_test
At this point, if I call uploadImage(avocadoImage, 'MyImages', 'test') again, I'd expect the same "naming structure" in storage, but with the updated image in place of the old ones: the new thumbnails should just overwrite the old ones. What actually happens is that the "base" image gets updated while both thumbnails don't, ending up with:
slot_test (displaying the UPDATED image)
thumb_250_slot_test (displaying the thumbnail of the OLD image)
thumb_64_slot_test (displaying the thumbnail of the OLD image)
I've logged the cloud function extensively; no errors are thrown during execution, thumbnails are created normally, and the Firebase console also updates the creation date of the thumbnails, but I still get the old thumbnail images. I've tried removing the temporary directory using fs-extra's emptyDir(), and I've tried removing every single thumbnail first (via the client) and then uploading again, with no luck.
EDIT: I found a solution to my problem by NOT creating any temporary folder or temporary files and using a sharp pipeline instead. That said, I'm still missing the underlying problem in the code above. I'm fairly convinced that, for whatever reason, the function didn't remove the temporary folder, and that was causing problems whenever I tried to overwrite the images. This function works:
export const generateThumbs = functions.storage.object().onFinalize(async object => {
  const bucket = gcs.bucket(object.bucket);
  const filePath = object.name;
  const fileName = filePath.split('/').pop();
  const bucketDir = dirname(filePath);
  // Metadata for the thumbnail files.
  const metadata = {
    contentType: 'image/jpeg',
  };
  if (fileName.includes('thumb_') || !object.contentType.includes('image')) {
    return false;
  }
  if (fileName.includes('slot')) {
    // Array of promises, one per thumbnail size.
    const slotUploadPromises = slotSizes.map(async size => {
      const thumbName = `thumb_${size}_${fileName}`;
      const thumbPath = join(path.dirname(filePath), thumbName);
      const thumbnailUploadStream = bucket.file(thumbPath).createWriteStream({ metadata });
      const pipeline = sharp();
      pipeline.resize(size, Math.floor((size * 9) / 16)).max()
        .pipe(thumbnailUploadStream);
      bucket.file(filePath).createReadStream().pipe(pipeline);
      return new Promise((resolve, reject) =>
        thumbnailUploadStream
          .on('finish', resolve)
          .on('error', reject));
    });
    await Promise.all(slotUploadPromises);
  }
});

How to increase the max http request size limit for HTTP triggers in Cloud Functions

I'm trying to invoke a Google Cloud Function by sending images larger than 50MB. The purpose of the Cloud Function is to resize the images and upload them to Google Cloud Storage.
However, when I send the HTTP POST to my Cloud Function I get the following error: 413 Request Entity Too Large
Does anyone have a workaround for this error? Can I increase the HTTP request size limit?
The limit for HTTP trigger upload and download payload size is documented at 10MB. There is no way to get this limit increased, but you can always file a feature request explaining why it should be increased.
You can let the client upload directly to Storage, authenticated into their own user folder, with security rules limiting the file size to whatever size you wish, into a temp folder.
Then have a Cloud Function trigger that resizes the image,
and delete the original image when finished.
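A minimal sketch of such a rule, assuming the client writes into uploads/{uid}/ and a 50MB cap (both the path and the limit are placeholders):
// storage.rules sketch: only the owner may write, and only files under 50MB.
service firebase.storage {
  match /b/{bucket}/o {
    match /uploads/{uid}/{fileName} {
      allow write: if request.auth != null
                   && request.auth.uid == uid
                   && request.resource.size < 50 * 1024 * 1024;
    }
  }
}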
I'm attaching a code example of mine below;
you should add the deletion of the file after conversion...
/**
 * When an image is uploaded in the Storage bucket we generate a thumbnail automatically using
 * ImageMagick.
 * After the thumbnail has been generated and uploaded to Cloud Storage,
 * we write the public URL to the Firebase Realtime Database.
 */
exports.generateThumbnail = functions.storage.object().onFinalize((object) => {
  console.log('Generation started');
  // File and directory paths.
  const filePath = object.name;
  const contentType = object.contentType; // This is the image MIME type
  const fileDir = path.dirname(filePath);
  const fileName = path.basename(filePath);
  const thumbFilePath = path.normalize(path.join(fileDir, `${THUMB_PREFIX}${fileName}`));
  const tempLocalFile = path.join(os.tmpdir(), filePath);
  const tempLocalDir = path.dirname(tempLocalFile);
  const tempLocalThumbFile = path.join(os.tmpdir(), thumbFilePath);
  // Exit if this is triggered on a file that is not an image.
  if (!contentType.startsWith('image/')) {
    console.log('This is not an image.');
    deleteImage(fileName);
    return null;
  }
  // Exit if the image is already a thumbnail.
  if (fileName.startsWith(THUMB_PREFIX)) {
    console.log('Already a Thumbnail.');
    deleteImage(fileName);
    return null;
  }
  // Cloud Storage files.
  const bucket = gcs.bucket(object.bucket);
  const file = bucket.file(filePath);
  const thumbFile = bucket.file(thumbFilePath);
  const metadata = {
    contentType: contentType,
    // To enable client-side caching you can set the Cache-Control headers here.
    'Cache-Control': 'public,max-age=3600',
  };
  // Create the temp directory where the storage file will be downloaded.
  return mkdirp(tempLocalDir).then(() => {
    console.log('DL Started');
    // Download file from bucket.
    return file.download({
      destination: tempLocalFile
    });
  }).then(() => {
    console.log('The file has been downloaded to', tempLocalFile);
    // Generate a thumbnail using ImageMagick.
    return spawn('convert', [tempLocalFile, '-thumbnail', `${THUMB_MAX_WIDTH}x${THUMB_MAX_HEIGHT}>`, tempLocalThumbFile], {
      capture: ['stdout', 'stderr']
    });
  }).then(() => {
    console.log('Thumbnail created at', tempLocalThumbFile);
    // Uploading the Thumbnail.
    return bucket.upload(tempLocalThumbFile, {
      destination: thumbFilePath,
      metadata: metadata
    });
  }).then(() => {
    console.log('Thumbnail uploaded to Storage at', thumbFilePath);
    // Once the image has been uploaded delete the local files to free up disk space.
    fs.unlinkSync(tempLocalFile);
    fs.unlinkSync(tempLocalThumbFile);
    // Get the Signed URLs for the thumbnail and original image.
    const config = {
      action: 'read',
      expires: '03-01-2500',
    };
    return Promise.all([
      thumbFile.getSignedUrl(config),
      // file.getSignedUrl(config),
    ]);
  }).then((results) => {
    console.log('Got Signed URLs.');
    const thumbResult = results[0];
    // const originalResult = results[1];
    const thumbFileUrl = thumbResult[0];
    // const fileUrl = originalResult[0];
    // Add the URLs to the Database.
    const uid = getUidFromFilePath(fileDir);
    if (!uid) return null;
    return Promise.all([
      admin.auth().updateUser(uid, {
        photoURL: thumbFileUrl
      }),
      admin.database().ref(`/users/${uid}/profile/photoURL`).set(thumbFileUrl)
    ]);
  }).then(() => console.log('Thumbnail URLs saved to database.'));
});
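The deletion of the original mentioned above is not shown in the example; a minimal sketch of a helper that could be chained as a final .then() (bucket and filePath are the same handles used in the function above):
// Sketch: remove the original full-size upload once the thumbnail pipeline finishes,
// as the answer suggests. Chain as .then(() => deleteOriginal(bucket, filePath)).
function deleteOriginal(bucket, filePath) {
  return bucket.file(filePath).delete()
    .then(() => console.log('Deleted original image at', filePath));
}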
As of 2022, the HTTP request size limit for second-generation Cloud Functions is 32MB.

Getting a thumbnail from a video using Cloud Functions for Firebase

The code I currently have:
exports.generateThumbnail = functions.storage.object().onChange(event => {
  ...
  .then(() => {
    console.log('File downloaded locally to', tempFilePath);
    // Generate a thumbnail using ImageMagick.
    if (contentType.startsWith('video/')) {
      return spawn('convert', [tempFilePath + '[0]', '-quiet', `${tempFilePath}.jpg`]);
    } else if (contentType.startsWith('image/')) {
      return spawn('convert', [tempFilePath, '-thumbnail', '200x200', tempFilePath]);
The error I get in the console:
Failed AGAIN! { Error: spawn ffmpeg ENOENT
at exports._errnoException (util.js:1026:11)
at Process.ChildProcess._handle.onexit (internal/child_process.js:193:32)
at onErrorNT (internal/child_process.js:359:16)
at _combinedTickCallback (internal/process/next_tick.js:74:11)
at process._tickDomainCallback (internal/process/next_tick.js:122:9)
code: 'ENOENT',
errno: 'ENOENT',
syscall: 'spawn ffmpeg',
path: 'ffmpeg',
spawnargs: [ '-t', '1', '-i', '/tmp/myVideo.m4v', 'theThumbs.jpg' ] }
I also tried ImageMagick:
return spawn('convert', [tempFilePath + '[0]', '-quiet',`${tempFilePath}.jpg`]);
Also without any success.
Can anyone point me in the right direction here?
@andrew-robinson's post was a good start.
The following will generate a thumbnail for both images and videos.
Add the following to your npm packages:
@ffmpeg-installer/ffmpeg
@google-cloud/storage
child-process-promise
mkdirp
mkdirp-promise
Use the following to generate a thumbnail from a larger image:
function generateFromImage(file, tempLocalThumbFile, fileName) {
  const tempLocalFile = path.join(os.tmpdir(), fileName);
  // Download file from bucket.
  return file.download({destination: tempLocalFile}).then(() => {
    console.info('The file has been downloaded to', tempLocalFile);
    // Generate a thumbnail using ImageMagick with constant width and variable height (maintains ratio).
    return spawn('convert', [tempLocalFile, '-thumbnail', THUMB_MAX_WIDTH, tempLocalThumbFile], {capture: ['stdout', 'stderr']});
  }).then(() => {
    fs.unlinkSync(tempLocalFile);
    return Promise.resolve();
  });
}
Use the following to generate a thumbnail from a video:
function generateFromVideo(file, tempLocalThumbFile) {
  return file.getSignedUrl({action: 'read', expires: '05-24-2999'}).then((signedUrl) => {
    const fileUrl = signedUrl[0];
    const promise = spawn(ffmpegPath, ['-ss', '0', '-i', fileUrl, '-f', 'image2', '-vframes', '1', '-vf', `scale=${THUMB_MAX_WIDTH}:-1`, tempLocalThumbFile]);
    // promise.childProcess.stdout.on('data', (data) => console.info('[spawn] stdout: ', data.toString()));
    // promise.childProcess.stderr.on('data', (data) => console.info('[spawn] stderr: ', data.toString()));
    return promise;
  });
}
The following will execute when a video or image is uploaded to storage.
It determines the file type, generates the thumbnail to a temp file, uploads the thumbnail to storage, then calls updateDatabase(), which should return a promise that updates your database (if necessary):
const functions = require('firebase-functions');
const mkdirp = require('mkdirp-promise');
const gcs = require('@google-cloud/storage');
const admin = require('firebase-admin');
const spawn = require('child-process-promise').spawn;
const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path;
const path = require('path');
const os = require('os');
const fs = require('fs');
const db = admin.firestore();
// Max height and width of the thumbnail in pixels.
const THUMB_MAX_WIDTH = 384;
const SERVICE_ACCOUNT = '<your firebase credentials file>.json';
const adminConfig = JSON.parse(process.env.FIREBASE_CONFIG);
module.exports = functions.storage.bucket(adminConfig.storageBucket).object().onFinalize(object => {
  const fileBucket = object.bucket; // The Storage bucket that contains the file.
  const filePathInBucket = object.name;
  const resourceState = object.resourceState; // The resourceState is 'exists' or 'not_exists' (for file/folder deletions).
  const metageneration = object.metageneration; // Number of times metadata has been generated. New objects have a value of 1.
  const contentType = object.contentType; // This is the image MIME type
  const isImage = contentType.startsWith('image/');
  const isVideo = contentType.startsWith('video/');
  // Exit if this is a move or deletion event.
  if (resourceState === 'not_exists') {
    return Promise.resolve();
  }
  // Exit if file exists but is not new and is only being triggered
  // because of a metadata change.
  else if (resourceState === 'exists' && metageneration > 1) {
    return Promise.resolve();
  }
  // Exit if the image is already a thumbnail.
  else if (filePathInBucket.indexOf('.thumbnail.') !== -1) {
    return Promise.resolve();
  }
  // Exit if this is triggered on a file that is not an image or video.
  else if (!(isImage || isVideo)) {
    return Promise.resolve();
  }
  const fileDir = path.dirname(filePathInBucket);
  const fileName = path.basename(filePathInBucket);
  const fileInfo = parseName(fileName);
  const thumbFileExt = isVideo ? 'jpg' : fileInfo.ext;
  let thumbFilePath = path.normalize(path.join(fileDir, `${fileInfo.name}_${fileInfo.timestamp}.thumbnail.${thumbFileExt}`));
  const tempLocalThumbFile = path.join(os.tmpdir(), thumbFilePath);
  const tempLocalDir = path.join(os.tmpdir(), fileDir);
  const generateOperation = isVideo ? generateFromVideo : generateFromImage;
  // Cloud Storage files.
  const bucket = gcs({keyFilename: SERVICE_ACCOUNT}).bucket(fileBucket);
  const file = bucket.file(filePathInBucket);
  const metadata = {
    contentType: isVideo ? 'image/jpeg' : contentType,
    // To enable client-side caching you can set the Cache-Control headers here. Uncomment below.
    // 'Cache-Control': 'public,max-age=3600',
  };
  // Create the temp directory where the storage file will be downloaded.
  return mkdirp(tempLocalDir).then(() => {
    return generateOperation(file, tempLocalThumbFile, fileName);
  }).then(() => {
    console.info('Thumbnail created at', tempLocalThumbFile);
    // Get the thumbnail dimensions.
    return spawn('identify', ['-ping', '-format', '%wx%h', tempLocalThumbFile], {capture: ['stdout', 'stderr']});
  }).then((result) => {
    const dim = result.stdout.toString();
    const idx = thumbFilePath.indexOf('.');
    thumbFilePath = `${thumbFilePath.substring(0, idx)}_${dim}${thumbFilePath.substring(idx)}`;
    console.info('Thumbnail dimensions:', dim);
    // Uploading the Thumbnail.
    return bucket.upload(tempLocalThumbFile, {destination: thumbFilePath, metadata: metadata});
  }).then(() => {
    console.info('Thumbnail uploaded to Storage at', thumbFilePath);
    const thumbFilename = path.basename(thumbFilePath);
    return updateDatabase(fileDir, fileName, thumbFilename);
  }).then(() => {
    console.info('Thumbnail generated.');
    fs.unlinkSync(tempLocalThumbFile);
    return Promise.resolve();
  });
});
parseName() should parse your filename format. At the very least it should return the file's basename and extension.
updateDatabase() should return a promise that updates your database with the newly generated thumbnail (if necessary).
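For illustration only, a parseName() that produces the fields the function above reads (name, timestamp, ext) might look like the sketch below, assuming file names of the form photo_1234567890.jpg; adapt it to your own naming scheme:
// Hypothetical parseName(): assumes names like "photo_1234567890.jpg".
function parseName(fileName) {
  const fullExt = path.extname(fileName);          // ".jpg"
  const base = path.basename(fileName, fullExt);   // "photo_1234567890"
  const [name, timestamp] = base.split('_');
  return {name: name, timestamp: timestamp, ext: fullExt.replace('.', '')};
}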
Note that @ffmpeg-installer/ffmpeg removes the need of directly including a ffmpeg binary in your cloud function.
To use ffmpeg or any other system command-line tool that is not pre-installed on the firebase cloud function container, you can add a pre-compiled binary to the functions folder (alongside index.js) and it will upload it along with your cloud function code in the deploy step. You can then execute the binary using child-process-promise spawn as you were doing with ImageMagick (which is already installed).
You can get the ffmpeg binary here https://johnvansickle.com/ffmpeg/
I used the x86_64 build https://johnvansickle.com/ffmpeg/builds/ffmpeg-git-64bit-static.tar.xz
Untar with
tar -xJf ffmpeg-release-64bit-static.tar.xz
and just add the one ffmpeg file to the functions folder.
This link explains how you can extract the thumbnail from the video with just the URL, so there is no need to download the file fully.
https://wistia.com/blog/faster-thumbnail-extraction-ffmpeg
The command to extract the thumbnail with width 512px and keeping the aspect ratio is
const spawn = require('child-process-promise').spawn;
const extractThumbnailFromVideoUrl = (fileUrl, tempThumbnailFilePath) => {
  return spawn('./ffmpeg', ['-ss', '0', '-i', fileUrl, '-f', 'image2', '-vframes', '1', '-vf', 'scale=512:-1', tempThumbnailFilePath]);
};
Note the ./ in ./ffmpeg
For more details on the scale arguments you can see here https://trac.ffmpeg.org/wiki/Scaling%20(resizing)%20with%20ffmpeg
If the spawn command fails, then as you have seen, you will not get very helpful error output. To get better output you can listen to the stdout and stderr event streams on the ChildProcess:
const extractThumbnailFromVideoUrl = (fileUrl, tempThumbnailFilePath) => {
  const promise = spawn('./ffmpeg', ['-ss', '0', '-i', fileUrl, '-f', 'image2', '-vframes', '1', '-vf', 'scale=512:-1', tempThumbnailFilePath]);
  promise.childProcess.stdout.on('data', (data: any) => console.log('[spawn] stdout: ', data.toString()));
  promise.childProcess.stderr.on('data', (data: any) => console.log('[spawn] stderr: ', data.toString()));
  return promise;
};
The output of the ffmpeg call will then be displayed in your cloud function logs as it would be if you ran the command locally in the terminal. For more info on that you can see https://www.npmjs.com/package/child-process-promise
http://node.readthedocs.io/en/latest/api/child_process/
The following is a complete version of the cloud function, assuming only video files. If you want to handle images or other files as well, you can add code to exit early or call different methods, as you were doing. It makes calls to create temp directories and cleans them up at the end of the method, but I've omitted the details of those functions.
import * as functions from 'firebase-functions';
import * as gcs from '@google-cloud/storage';
import {cleanupFiles, makeTempDirectories} from '../services/system-utils';
const spawn = require('child-process-promise').spawn;
const storageProjectId = `${functions.config().project_id}.appspot.com`;
export const videoFileThumbnailGenerator = functions.storage.bucket(storageProjectId).object().onChange(event => {
  const object = event.data;
  const fileBucket = object.bucket; // The Storage bucket that contains the file.
  const filePathInBucket = object.name; // File path in the bucket.
  const resourceState = object.resourceState; // The resourceState is 'exists' or 'not_exists' (for file/folder deletions).
  const metageneration = object.metageneration; // Number of times metadata has been generated. New objects have a value of 1.
  // Exit if this is a move or deletion event.
  if (resourceState === 'not_exists') {
    console.log('This is a deletion event.');
    return Promise.resolve();
  }
  // Exit if file exists but is not new and is only being triggered
  // because of a metadata change.
  if (resourceState === 'exists' && metageneration > 1) {
    console.log('This is a metadata change event.');
    return Promise.resolve();
  }
  const bucket = gcs({keyFilename: `${functions.config().firebase_admin_credentials}`}).bucket(fileBucket);
  const filePathSplit = filePathInBucket.split('/');
  const filename = filePathSplit.pop();
  const filenameSplit = filename.split('.');
  const fileExtension = filenameSplit.pop();
  const baseFilename = filenameSplit.join('.');
  const fileDir = filePathSplit.join('/') + (filePathSplit.length > 0 ? '/' : '');
  const file = bucket.file(filePathInBucket);
  const tempThumbnailDir = '/tmp/thumbnail/';
  const jpgFilename = `${baseFilename}.jpg`;
  const tempThumbnailFilePath = `${tempThumbnailDir}${jpgFilename}`;
  const thumbnailFilePath = `${fileDir}thumbnail/${jpgFilename}`;
  return makeTempDirectories([tempThumbnailDir])
    .then(() => file.getSignedUrl({action: 'read', expires: '05-24-2999'}))
    .then(signedUrl => signedUrl[0])
    .then(fileUrl => extractThumbnailFromVideoUrl(fileUrl, tempThumbnailFilePath))
    .then(() => bucket.upload(tempThumbnailFilePath, {destination: thumbnailFilePath}))
    .then(() => cleanupFiles([
      {directoryName: tempThumbnailFilePath},
    ]))
    .catch(err => console.error('Video upload error: ', err));
});
const extractThumbnailFromVideoUrl = (fileUrl, tempThumbnailFilePath) => {
  return spawn('./ffmpeg', ['-ss', '0', '-i', fileUrl, '-f', 'image2', '-vframes', '1', '-vf', 'scale=512:-1', tempThumbnailFilePath]);
};
