Firebase-Admin storage uploaded text file can't be opened - firebase

I'm trying to create a simple function that can upload docs, PDFs, etc. I have tried a couple of ways, but the uploaded file always opens as undefined. I have now set up a test function that uploads a simple text file, but it still fails. What part am I missing?
exports.testUpload = functions.region('europe-west1').https.onRequest(async (req, res) => {
  const fileName = 'test1.txt';
  const buffer = new Buffer('dGVzdGluZyBkYXRhCg==', 'base64');
  const file = admin.storage().bucket().file('testing/' + fileName);
  await file.save(buffer, {
    metadata: {contentType: 'text/*'},
  }, ((error) => {
    if (error) {
      res.status(500).json(error);
    }
    res.status(200).json('Uploaded');
  }));
});
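For reference, here is a minimal sketch of what a working version might look like, assuming the default bucket is configured on the admin app: file.save() returns a promise when no callback is passed, so the mixed await/callback pattern above can be dropped, Buffer.from replaces the deprecated new Buffer constructor, and a concrete content type is used instead of 'text/*'.
exports.testUpload = functions.region('europe-west1').https.onRequest(async (req, res) => {
  try {
    const fileName = 'test1.txt';
    // Buffer.from replaces the deprecated new Buffer(...) constructor
    const buffer = Buffer.from('dGVzdGluZyBkYXRhCg==', 'base64');
    const file = admin.storage().bucket().file('testing/' + fileName);
    // With no callback, save() returns a promise that can simply be awaited
    await file.save(buffer, {
      metadata: { contentType: 'text/plain' },
    });
    res.status(200).json('Uploaded');
  } catch (error) {
    res.status(500).json(error);
  }
});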

Related

What might cause the firebase console to show the 'waiting' icon (spinning blue loop icon) after adding a file via cloud functions to cloud storage?

So, I can now use a Google Cloud Function to store a file in Cloud Storage, and after uploading a file I can see its name in the Firebase console, i.e. within the Storage area as you drill down through the various folders.
I can also query and retrieve/view the file, so it seems like it's stored OK.
However, I don't understand why I cannot click on the file in the firebase console and see a preview of it.
My only wild guess is that the cloud function to upload the file didn't release a file handle or something, and so firebase console thinks it's locked.
The file attributes shown in firebase console seem correct.
The cloud function code to upload the file is as follows:
const imageBuffer = Buffer.from(arr[1], 'base64');
const bufferStream = new stream.PassThrough();
bufferStream.end(imageBuffer);

const fullPath = `${filePath}/${data.filename}`;
console.log('saving to fullPath', fullPath);

const myFile = bucket.file(fullPath);

bufferStream.pipe(myFile.createWriteStream({
    metadata: {
      contentType: mime
    },
    public: true,
    validation: "md5"
  }))
  .on('error', function (err) {
    console.log('error from image upload', err);
  })
  .on('finish', function () {
    console.log('!!!!!! finished');
  });
where arr[1] is the base64 portion of a string, i.e. the MIME-type prefix has been removed, so arr[1] is the pure file data as base64.
So, basically everything seems to be working perfectly except that the Firebase console can't view the file, unless I do an app redeploy, i.e. npm run build && firebase deploy && npm start, which then seems (?) to free up the lock and lets me view the image preview in the Firebase console for Firebase Storage.
Sending null at the end of the data as per the comments fixes this.
Revisiting this: rather than using the PassThrough stream, which adds unnecessary overhead, you can write your buffer directly to the write stream.
const imageBuffer = Buffer.from(arr[1], 'base64');

const fullPath = `${filePath}/${data.filename}`;
console.log('saving to fullPath', fullPath);

const myFile = bucket.file(fullPath);

const fileWriteStream = myFile
  .createWriteStream({
    metadata: {
      contentType: mime
    },
    public: true,
    validation: "md5"
  })
  .on('error', function (err) {
    console.log('error from image upload', err);
  })
  .on('finish', function () {
    console.log('!!!!!! finished');
  });

fileWriteStream.end(imageBuffer);
Rewritten as a promise-based utility function:
import { File } from "@google-cloud/storage";
import { storage } from "firebase-admin";

async function uploadDataURI(
  dataURI: string,
  options: { bucket?: string, filename: string, filepath: string, public?: boolean, contentType?: string }
): Promise<File> {
  // "public" is a reserved word, so alias it in the destructuring
  const { bucket, filename, filepath, public: isPublic, contentType } = options;

  // split "data:<media type>[;base64],<data>" on the colon and comma
  const [, mediaType, data] = dataURI.split(/[:,]/);

  let contentTypeFromURI: string, buffer: Buffer;

  if (mediaType.endsWith(";base64")) {
    contentTypeFromURI = mediaType.slice(0, -7);
    buffer = Buffer.from(data, "base64");
  } else {
    contentTypeFromURI = mediaType;
    buffer = Buffer.from(decodeURIComponent(data));
  }

  const storageRef = storage()
    .bucket(bucket)
    .file(`${filepath}/${filename}`);

  return new Promise((resolve, reject) => {
    try {
      const fileWriteStream = storageRef
        .createWriteStream({
          metadata: {
            contentType: contentType || contentTypeFromURI || undefined // coerce falsy values to undefined
          },
          public: isPublic,
          validation: "md5"
        })
        .on('error', reject)
        .on('finish', () => resolve(storageRef));

      fileWriteStream.end(buffer);
    } catch (err) {
      reject(err);
    }
  });
}
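A hedged usage sketch from inside an async function (the data URI and paths are made up for illustration, and it assumes the admin app was initialized with a default storage bucket):
// Hypothetical call site: upload a small plain-text data URI under testing/test1.txt.
const file = await uploadDataURI("data:text/plain;base64,dGVzdGluZyBkYXRhCg==", {
  filename: "test1.txt",
  filepath: "testing",
  public: false,
});
console.log(`Uploaded to ${file.name}`);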

How to create a folder in Firebase Storage using Admin API

Aim: to upload a file into a folder within Firebase Storage
E.g.
default_bucket/folder1/file1
default_bucket/folder1/file2
default_bucket/folder2/file3
Using Firebase client-side I am able to upload a file to a folder within Firebase Storage like this:
const storageRef = firebase.storage().ref();
const fileRef = storageRef.child(`${folder}/${filename}`);
const metadata = {
  contentType: file.type,
  customMetadata: { }
};
return fileRef.put(file, metadata);
If the folder does not exist, it gets created.
However I have not managed to do the same server-side using the Admin SDK.
The code below, uploads the file into the default bucket.
But, I want to upload the file into a named folder within the default bucket.
The client side makes a POST request to the GCF, sending the file and a folder name.
Busboy is used to extract the folder name and file and pass them to the upload function, which uploads the file and then returns a download link for it.
index.js
const task = require('./tasks/upload-file-to-storage');

app.post('/upload', (req, res, next) => {
  try {
    let uploadedFilename;
    let folder;

    if (req.method === 'OPTIONS') {
      optionsHelper.doOptions(res);
    } else if (req.method === 'POST') {
      res.set('Access-Control-Allow-Origin', '*');

      const busboy = new Busboy({ headers: req.headers });
      const uploads = [];

      busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
        uploadedFilename = `${folder}^${filename}`;
        const filepath = path.join(os.tmpdir(), uploadedFilename);
        uploads.push({ file: filepath, filename: filename, folder: folder });
        file.pipe(fs.createWriteStream(filepath));
      });

      busboy.on('field', (fieldname, val) => {
        if (fieldname === 'folder') {
          folder = val;
        }
      });

      busboy.on('finish', () => {
        if (uploads.length === 0) {
          res.end('no files found');
        }

        for (let i = 0; i < uploads.length; i++) {
          const upload = uploads[i];
          const file = upload.file;

          task.uploadFile(helpers.fbAdmin, upload.folder, upload.file, uploadedFilename).then(downloadLink => {
            res.write(`${downloadLink}\n`);
            fs.unlinkSync(file);
            res.end();
          });
        }
      });

      busboy.end(req.rawBody);
    } else {
      // Client error - only support POST
      res.status(405).end();
    }
  } catch (e) {
    console.error(e);
    res.sendStatus(500);
  }
});

const api = functions.https.onRequest(app);

module.exports = {
  api
};
upload-file-to-storage.js
exports.uploadFile = (fbAdmin, folder, filepath, filename) => {
  // get the bucket to upload to
  const bucket = fbAdmin.storage().bucket(); //`venture-spec-sheet.appspot.com/${folder}`
  const uuid = uuid();

  // Uploads a local file to the bucket
  return bucket
    .upload(filepath, {
      gzip: true,
      metadata: {
        //destination: `/${folder}/${filename}`,
        cacheControl: 'public, max-age=31536000',
        firebaseStorageDownloadTokens: uuid
      }
    })
    .then(() => {
      const d = new Date();
      const expires = d.setFullYear(d.getFullYear() + 50);

      // get file from the bucket
      const myFile = fbAdmin
        .storage()
        .bucket()
        .file(filename);

      // generate a download link and return it
      return myFile.getSignedUrl({ action: 'read', expires: expires }).then(urls => {
        const signedUrl = urls[0];
        return signedUrl;
      });
    });
};
I've tried a few things
Setting the bucket name to default and a folder. This resulted in a server error.
const bucket = fbAdmin.storage().bucket(`${defaultName}/${folder}`);
Setting the bucket name to the folder. This resulted in a server error.
const bucket = fbAdmin.storage().bucket(folder);
And, I've also tried using the destination property of uploadOptions.
But this still puts the file in the default bucket.
.upload(filepath, {
  gzip: true,
  metadata: {
    destination: `${folder}/${filename}`, // and /${folder}/${filename}
  }
})
Is it possible to upload to a folder using the Admin SDK?
E.g. I want to upload a file so that it is placed in a named "folder".
I.e. so I can reference the file at the path: bucket/folder/file.jpg
In the example below, each "folder" is named with a firebase key.
Found the problem.
I stupidly declared the destination option in the wrong place.
Instead of in the metadata object:
return bucket
  .upload(filepath, {
    gzip: true,
    metadata: {
      destination: `${folder}/${filename}`,
      cacheControl: 'public, max-age=31536000',
      firebaseStorageDownloadTokens: uuid
    }
  })
It should have been on the options object:
return bucket
  .upload(filepath, {
    gzip: true,
    destination: `${folder}/${filename}`,
    metadata: {
      cacheControl: 'public, max-age=31536000',
      firebaseStorageDownloadTokens: uuid
    }
  })
With this change made the file now gets uploaded into a named "folder".
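One hedged follow-up, since the getSignedUrl step in the original upload function still references .file(filename): once the file is uploaded under a destination, the download link has to be generated for that same object path (the "folder" is just a prefix in the object name). A sketch, assuming the same folder and filename variables:
// Reference the uploaded object by its full path (the "folder" is just a prefix
// in the object name), then sign a read URL for it.
const uploadedFile = fbAdmin.storage().bucket().file(`${folder}/${filename}`);
return uploadedFile
  .getSignedUrl({ action: 'read', expires: '03-09-2491' })
  .then(urls => urls[0]);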
There is also a "Create folder" option next to the "Upload file" button for a bucket in the Storage console, so you can create folders in a bucket and upload files to them from the console. To create such folders in a bucket using the Admin API, add the folder path before the file name in the file reference, e.g.
const blob = bucket.file('folder1/folder2/' + req.file.originalname);

Expo/Firebase: Image chosen from camera roll uploading as octet-stream instead of .jpg

I've been having trouble viewing the image files I've uploaded to Firebase and just noticed the issue is with the file type in Firebase Storage.
There are two files in my Firebase Storage console: one uploaded from my iOS simulator (application/octet-stream) and the other uploaded directly into the console from the browser, which uploads properly and is viewable.
Here are my select and upload functions:
_selectPhoto = async () => {
  const status = await getPermission(Permissions.CAMERA_ROLL);
  if (status) {
    let imageName = "pic";
    const result = await ImagePicker.launchImageLibraryAsync(options);
    if (!result.cancelled) {
      Animated.timing(this.animatedWidth, {
        toValue: 600,
        duration: 15000
      }).start();

      this.uploadImage(result.uri, imageName)
        .then(() => {
          this.props.navigation.navigate('Profile');
        })
        .catch((error) => {
          Alert.alert('Must Sign In');
          this.props.navigation.navigate('Login');
          console.log(error);
        });
    }
  }
};
uploadImage = async (uri, imageName) => {
  const user = firebase.auth().currentUser;
  const response = await fetch(uri);
  const blob = await response.blob();

  let storageRef = firebase.storage().ref().child('images/' + user.displayName + '/' + imageName + '.jpg');
  const snapshot = await storageRef.put(blob);
  blob.close();

  snapshot.ref.getDownloadURL().then(function (downloadURL) {
    console.log("File available at", downloadURL);

    user.updateProfile({
      photoURL: downloadURL.toString(),
    }).then(function () {
      console.log('saved photo');
    }).catch(function (error) {
      console.log('failed photo');
    });
  });
};
When I get the link in my console, it also has the media&token:
... .appspot.com/o/profile-pic.jpg?alt=media&token=56eb9c36-b5cd-4dbb-bec1-3ea5c3a74bdd
If I CMD+Click in VS Code I receive an error:
{
  error: {
    code: 400,
    message: "Invalid HTTP method/URL pair."
  }
}
So naturally, when I put that link in the browser it downloads a file with that name but says:
The file “pic.jpg” could not be opened.
It may be damaged or use a
file format that Preview doesn’t recognize.
Maybe it could be something with mediaTypes, but I'm not exactly sure how to use it.
mediaTypes (String) -- Choose what type of media to pick. Usage: ImagePicker.MediaTypeOptions.<Type>, where <Type> is one of: Images, Videos, All.
Thanks!
I've been fighting with this same issue for the past few days. I was finally able to get images to upload and render as expected by following the Firebase Upload example in the Expo repo. I don't fully understand why it works, but it seems like Firebase doesn't like the blob that's generated by
const blob = await response.blob();
Try replacing the above with:
const blob = await new Promise((resolve, reject) => {
  const xhr = new XMLHttpRequest();
  xhr.onload = function () {
    resolve(xhr.response);
  };
  xhr.onerror = function (e) {
    console.log(e);
    reject(new TypeError('Network request failed'));
  };
  xhr.responseType = 'blob';
  xhr.open('GET', uri, true);
  xhr.send(null);
});
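As an additional hedged tweak (not part of the original answer), you can also pass an explicit content type as metadata when uploading, so Storage doesn't fall back to application/octet-stream:
// Hypothetical variant of the put() call with explicit metadata.
const metadata = { contentType: 'image/jpeg' };
const snapshot = await storageRef.put(blob, metadata);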

Cloud function generating thumbnails doesn't overwrite old thumbnail with the same name

I'm using Cloud Functions to resize and generate thumbnails for uploaded images in Firebase Storage. On the first upload the thumbnails are generated, but I also want to be able to edit those images while keeping the same name.
This is how I'm doing it:
I upload an image with this function on the client:
uploadImage (imageFile, folderName, imageName) {
  const storageRef = firebase.storage().ref();
  // need to prefix image name with "slot_"
  storageRef.child(`${folderName}/slot_${imageName}`).put(imageFile);
}
Thumbnails are then generated with this cloud function:
export const generateThumbs = functions.storage.object().onFinalize(async object => {
  const bucket = gcs.bucket(object.bucket);
  const filePath = object.name;
  const fileName = filePath.split('/').pop();
  const bucketDir = dirname(filePath);

  const slotSizes = [64, 250];

  const temporaryDirectory = join(tmpdir(), `thumbs_${fileName}`);
  const temporaryFilePath = join(temporaryDirectory, 'source.png');

  // avoid loops, include only images
  if (fileName.includes('thumb_') || !object.contentType.includes('image')) {
    return false;
  }

  await fileSystem.ensureDir(temporaryDirectory);

  // download the original file into the temp directory
  await bucket.file(filePath).download({
    destination: temporaryFilePath
  });

  const slotUploadPromises = slotSizes.map(async size => {
    const thumbName = `thumb_${size}_${fileName}`;
    const thumbPath = join(temporaryDirectory, thumbName);
    await sharp(temporaryFilePath).resize(size, size).toFile(thumbPath);

    return bucket.upload(thumbPath, {
      destination: join(bucketDir, thumbName),
      metadata: {
        contentType: 'image/jpeg',
      }
    });
  });

  await Promise.all(slotUploadPromises);

  // remove the temp directory
  return fileSystem.remove(temporaryDirectory);
});
So if I call uploadImage(appleImage, 'MyImages', 'test') I'll have 3 images in my storage folder (naming IS important):
slot_test
thumb_250_slot_test
thumb_64_slot_test
At this point, if I call uploadImage(avocadoImage, 'MyImages', 'test') again, I'd expect the same "naming structure" in storage but with the updated image in place of the old one, so the new thumbnails should just overwrite the old ones. What actually happens is that the "base" image gets updated while both thumbnails don't, ending up with:
slot_test (displaying the UPDATED image)
thumb_250_slot_test (displaying the thumbnail of the OLD image)
thumb_64_slot_test (displaying the thumbnail of the OLD image)
I've logged the cloud function extensively: no errors are thrown during execution, the thumbnails are created normally, and the Firebase console even updates the creation date of the thumbnails, but I still get the old thumbnail images. I've tried removing the temporary directory using fs-extra's emptyDir(), and I've tried removing every single thumbnail first (via the client) and then uploading again, with no luck.
EDIT: Found a solution to my problem by NOT creating any temporary folder or temporary files and using a sharp pipeline instead. That said, I'm still missing the underlying problem in the code above. I'm quite convinced that, for whatever reason, the function didn't remove the temporary folder, and that was generating problems whenever I tried to overwrite the images. This function works:
export const generateThumbs = functions.storage.object().onFinalize(async object => {
  const bucket = gcs.bucket(object.bucket);
  const filePath = object.name;
  const fileName = filePath.split('/').pop();
  const bucketDir = dirname(filePath);
  const slotSizes = [64, 250]; // same sizes as in the original function

  // file metadata
  const metadata = {
    contentType: 'image/jpeg',
  };

  if (fileName.includes('thumb_') || !object.contentType.includes('image')) {
    return false;
  }

  if (fileName.includes('slot')) {
    // array of promises
    const slotUploadPromises = slotSizes.map(async size => {
      const thumbName = `thumb_${size}_${fileName}`;
      const thumbPath = join(bucketDir, thumbName);
      const thumbnailUploadStream = bucket.file(thumbPath).createWriteStream({ metadata });

      const pipeline = sharp();
      pipeline.resize(size, Math.floor((size * 9) / 16)).max()
        .pipe(thumbnailUploadStream);

      bucket.file(filePath).createReadStream().pipe(pipeline);

      return new Promise((resolve, reject) =>
        thumbnailUploadStream
          .on('finish', resolve)
          .on('error', reject));
    });

    await Promise.all(slotUploadPromises);
  }
});

database triggers firebase function to download images from URL and save it to storage

I want to download the image and save it to storage when my database is updated with the 'photo_url' field
exports.saveToStorage = functions.database.ref(`/images/${itemImageRef}`)
  .onWrite(event => {
    const filePath = event.data.val();
    const filename = filePath.split('/').pop();

    var download = request.get(filePath).on('error', (err) => {
        console.log(err);
      })
      .pipe(fs.createWriteStream(filename));

    download.on('finish', () => {
      const bucket = gcs.bucket('id.appspot.com');
      const storagePath = `images/${filename}`;

      return bucket.upload(download, { destination: storagePath })
        .then(() => {
          console.log('success upload');
        });
    });
  });
it logs "Error: EROFS: read-only file system, open 'image.jpg' at Error (native)." I suppose I cannot retrieve the file saved by createWriteStream?
So how should I download images from the web?
With the post suggested by @Jobsamuel, the code now works:
exports.saveToStorage = functions.database.ref(`/images/${itemImageRef}`)
  .onWrite(event => {
    const filePath = event.data.val();
    const filename = filePath.split('/').pop();

    const bucket = gcs.bucket('id.appspot.com');
    const remoteWriteStream = bucket.file(filename).createWriteStream({
      metadata: { contentType: 'image/jpeg' }
    });

    request(filePath).pipe(remoteWriteStream)
      .on('error', (err) => console.log(err))
      .on('finish', () => console.log('success save image'));
  });
By piping the request result directly to the bucket, it solves the problem by skipping the step of writing to a local file, which I suspect is the reason my original code failed: a Cloud Functions instance's filesystem is read-only apart from /tmp. Also, don't forget to set contentType for images.
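For completeness, here is a hedged sketch of the other workable approach implied above: write the download to /tmp (the only writable location in Cloud Functions) and then upload that temporary file, cleaning it up afterwards. It reuses the filePath, filename, and bucket variables from the handler above; the requires would normally sit at the top of the file.
const os = require('os');
const path = require('path');
const fs = require('fs');

// Hypothetical alternative: buffer the download in /tmp, then upload the temp file.
const tempPath = path.join(os.tmpdir(), filename);
request(filePath)
  .pipe(fs.createWriteStream(tempPath))
  .on('finish', () => {
    bucket.upload(tempPath, { destination: `images/${filename}` })
      .then(() => fs.unlinkSync(tempPath))
      .catch((err) => console.log(err));
  });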
