Asking Google Forms to populate multiple file upload questions into one folder for each submission - google-forms

I have this script for my Google Form, which has multiple file upload questions. By default, Google Forms automatically creates a separate Drive folder for each upload question.
My aim is to move all files into one folder per response, rather than having to look through multiple folders for a single response. I have written this script so that the files are placed in a subfolder named after the answer to the second question, inside a designated parent folder (added by folder ID). However, there are times when the trigger does not run for a form response (it fails).
I hope someone can help with the script, because there is no indication of the error and I have to check manually from time to time.
This is my script:
const PARENT_FOLDER_ID = "1Za2tSQzjjmT4spA7azrI9hVerRLrPcz2";

const initialize = () => {
  const form = FormApp.getActiveForm();
  ScriptApp.newTrigger("onFormSubmit").forForm(form).onFormSubmit().create();
};

const onFormSubmit = ({ response } = {}) => {
  try {
    // Get some useful data to create the subfolder name
    const firstItemAnswer = response.getItemResponses()[0].getResponse(); // text of the first answer
    const secondItemAnswer = response.getItemResponses()[1].getResponse(); // text of the second answer
    const user = response.getRespondentEmail(); // email (Collect email addresses must be enabled)
    const time = response.getTimestamp(); // when the response was submitted
    const subfolderName = secondItemAnswer;
    // Get a list of all files uploaded with the response
    const files = response
      .getItemResponses()
      // We are only interested in File Upload type questions
      .filter(
        (itemResponse) =>
          itemResponse.getItem().getType().toString() === "FILE_UPLOAD"
      )
      .map((itemResponse) => itemResponse.getResponse())
      // The response includes the file ids in an array that we can flatten
      .reduce((a, b) => [...a, ...b], []);
    if (files.length > 0) {
      // Each form response has a unique Id
      const parentFolder = DriveApp.getFolderById(PARENT_FOLDER_ID);
      const subfolder = parentFolder.createFolder(subfolderName);
      files.forEach((fileId) => {
        // Move each file into the custom folder
        DriveApp.getFileById(fileId).moveTo(subfolder);
      });
    }
  } catch (error) {
    Logger.log(error);
  }
};
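One way to make such failures visible (a minimal sketch, not part of the original script; MailApp is a built-in Apps Script service and the address is a placeholder):
// Sketch: email yourself from the catch block instead of only logging,
// so a failed submission no longer goes unnoticed.
const notifyFailure = (error) => {
  Logger.log(error);
  MailApp.sendEmail(
    "you@example.com", // placeholder: replace with your own address
    "Form upload script failed",
    "onFormSubmit failed at " + new Date() + ": " + error
  );
};
// In onFormSubmit, replace Logger.log(error) in the catch block with notifyFailure(error).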


Get values from SvelteKit's $app/stores outside of the lifecycle of a component

My Svelte components import readable stores like this:
import { classes, locations, schedule } from 'stores.ts'
In stores.ts, I want to build the URL for fetch dynamically using page.host from $app/stores.
// Note: this is not a Svelte component; it's stores.ts
import { readable } from 'svelte/store'
import { getStores } from '$app/stores'
const { page } = getStores()
let FQDN
page.subscribe(({ host }) => {
  FQDN = host
})
const getArray = async (url) => {
  const response: Response = await fetch(url)
  if (!response.ok) throw new Error(`Bad response trying to retrieve from ${url}.`)
  return await response.json()
}
const getReadableStore = (url: string) => readable([], set => {
  getArray(`http://${FQDN}${url}`)
    .then(set)
    .catch(err => console.error('Failed API call:', err))
  return () => {}
})
export const classes = getReadableStore('/api/class/public.json')
export const locations = getReadableStore('/api/location/public.json')
export const schedule = getReadableStore('/api/schedule/public.json')
The sixth line throws this error...
Error: Function called outside component initialization
    at get_current_component (/Users/nates/dev/shy-svelte/node_modules/svelte/internal/index.js:652:15)
    at Proxy.getContext (/Users/nates/dev/shy-svelte/node_modules/svelte/internal/index.js:685:12)
    at Module.getStores (/.svelte-kit/dev/runtime/app/stores.js:17:26)
    at eval (/src/stores.ts:6:38)
    at instantiateModule (/Users/nates/dev/shy-svelte/node_modules/@sveltejs/kit/node_modules/vite/dist/node/chunks/dep-e9a16784.js:68197:166)
Two questions...
What is the correct way to get page values from $app/stores outside of the context of a component? Is this possible? Answer from below: No, this is not possible outside the context of a component.
If I'm accessing a SvelteKit site, let's say http://localhost:3000/something or https://example.com and a Svelte component loads a readable store from stores.ts, is there a way in stores.ts to determine whether the original page request that loaded the component (which loaded from stores.ts) was http or https? Answer from below: No, this is not possible in stores.ts - only from a component.
UPDATE: Based on the feedback, I'm going to set a value in my .env called VITE_WEB_URL=http://localhost:3000 and change it for the production system. This cuts down on the number of lines of code and may be a better practice (comments welcome)...
// revised stores.ts
import { readable } from 'svelte/store'
const { VITE_WEB_URL } = import.meta.env
const getArray = async (url) => {
  const response: Response = await fetch(url)
  if (!response.ok) throw new Error(`Bad response trying to retrieve from ${url}.`)
  return await response.json()
}
const getReadableStore = (url: string) => readable([], set => {
  getArray(`${VITE_WEB_URL}${url}`)
    .then(set)
    .catch(err => console.error('Failed API call:', err))
  return () => {}
})
export const classes = getReadableStore('/api/class/public.json')
export const locations = getReadableStore('/api/location/public.json')
export const schedule = getReadableStore('/api/schedule/public.json')
Extract from https://kit.svelte.dev/docs#modules-$app-stores
Because of that, the stores are not free-floating objects: they must be accessed during component initialisation, like anything else that would be accessed with getContext.
Therefore, since the page store is bound to the context of a Svelte component, I suggest you subscribe to it (via $ or .subscribe) inside a component of the SvelteKit site, and have that subscription pass the protocol value (http or https) to stores.ts whenever it updates, so stores.ts can keep it in a variable.
However, it looks like SvelteKit does not expose the protocol value, so parse the client-side window.location.href inside the page subscription and pass that along instead.
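A minimal sketch of that idea (setBaseUrl is a hypothetical setter you would add to and export from stores.ts, e.g. export const setBaseUrl = (url) => { BASE = url }):
<!-- SomeComponent.svelte: sketch of the suggestion above. -->
<script>
  import { page } from '$app/stores'
  import { setBaseUrl } from './stores' // hypothetical setter in stores.ts

  // Accessing $app/stores here is fine: we are inside component initialisation.
  // $page does not expose the protocol, so read it from window.location instead.
  page.subscribe(() => {
    if (typeof window !== 'undefined') {
      setBaseUrl(window.location.protocol + '//' + window.location.host)
    }
  })
</script>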
Referencing a Svelte store can be done anywhere.
Using the $ shorthand syntax, however, only works within a component:
$: BASE = `http://${$page.host}`
As for the protocol, SvelteKit does appear to delegate this to fetch.

Firebase Storage - Image preview is permanently loading

I've started working with Firebase Storage and Firebase Functions recently. Right now I'm developing file upload from Functions to Storage.
I've got it working (the upload completes and the file appears in the Storage section), yet the image preview stays like this forever (loading forever on the right side):
I thought it was an error in my code, yet if I open Google Cloud Platform - Storage, the image appears and I can open it and preview it.
In Firebase Storage, if I open the image (select it and click open), it returns the following URL: https://console.firebase.google.com/u/0/undefined
What might I be doing wrong? Here's the code I'm using:
function uploadImage() {
const newImageData = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAOEAAADhCAMAAAAJbSJIAAAAgVBMVEX///8AAAAEBASAgIDr6+vw8PBYWFjU1NTGxsbz8/P29vb8/Py1tbVhYWHd3d1ra2vk5OS/v78pKSlTU1NOTk6Tk5OpqanNzc13d3dKSkplZWWbm5s5OTkfHx+NjY2GhoYcHBw9PT0TExOioqJ7e3soKCiurq5CQkI6OjoXFxcwMDAuPQWoAAAIJ0lEQVR4nO2daXuqPBCGVfYtIbKLgorLKf//B75ga2sPAdmS8F5n7m/VXgyPISGZzExWKwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAOAVy6wwlCZG/YUl+vamYLoy0lAkORWbdZNN/YUUVf8ju6bomx2Kq+mEBL5K0UVD9QNCdM0Vfds9QSQMnYTWaN1sEicMCRJ9+29QSCqp28HiftiqUkoU0TLaMDLsl8PbrtGWpY8zQ7QYCoE/pe3+ZusHogW9Yln2nxnVPfljW8t4l8go2zPQV7PPkCxa3kqOvOldr52NF4nVKNuYobxPsC1OoxVilu33ZINDQf1Rw2cO+mrOWBMhMD5w0ldziLnrQyeO+mpOfOdzZspZX03KcQGC2I+gNDC3ZkQ7IQLX6x0niW7fZd83ief7Pg7tH0JcfeIlQy+kcllCyk7/O/LUWIsiDbkVv9/bcv0Rqr+MVa//BR0Ob/++AstdKLuK8n4ZZCiKK4e7cikSez2iTjxm2DPjPj8e8wf1zWviXPqpPuHyeuqXb2ZK6WxaqJCuZe7m7qTTp1da6ty7prtbMoOOVro64VktonlmyFZUqB0NybQrkna7l1ndZIhc2k0xbESjdTEf63M7yBQ9bjO2Z+enamvCXcTCphG1zZ3YNSJ9OroJmRkM6UMOZmXPoppzWE4WEX1oY7Xmv1Fs7Zn9nl9gWt+/MTImNU2dprze+6GfmmYlRrYoHZ9dF/whbJrdMTLVdMzEPDZRlOZr48DIVMMQ2wnUN5SpIiNLjaF7z74X1uiNwWbDyNI/qHCd8XBGW1nDLj+F64K9RKtomuWocMvq3fvDjbIk5ahwvc0YGXuS0dbcPBWu1zlLx4mbU23yVbi+s3srkjvdJGeF1WqGTTO6rZsH3BVWjyqae1C1EP0BFaVwvQ60OR1EshZ0GROisJqG23Pt1Gp2q49GqML1OsmK6aMOKbK3OzbCFFZsLwEZ75YySHDpE1olUmFl/XrP02jE5aM0v1972phd2ydDQkvOnjRk4aFL3pDAjiUo/LoTHOuKLMtGzet+lPn4pPpG0eMRQTnLUfi8oTreWc0j7UmUq22x0f9PhbMDCkEhKBQPKJxNYTkgEmYCXiMShZvCXEkHTUXGcPZSpbFS5KZQWq3Qu4XORGIbUTa9eCqsFqt6OGemxSvbUH8sqwUrXK0sl/gM9PnE/XKNCFdYYZozB7VvsPkzU1+Cwhrrph7mGHfOB/X226+1FIU1duztr+Nlnq97L7YbV12SworITqXdvmcw5QvlfielNtU5sDCFNYpO0iJTk2MvbcdEzYqUtIdVLVDhAwvVubIk8D3Po421j88DUufHvvEnL1XhF2Yd7+xqTR6f9wq1XbjCGQCFs/EPKsR8sq6Vxi4bvxVwzCOJBTVXLxzX+BxSrgPKm4arFwM3J1lzQk/D5eunOTjsIqN0h57Gyd0TlbBJYtVa9xFF+NpUedZSM5Ypd6UfifEm4tts5QFkdOvO4RTlLz3GoTZDVpAWxu/WJAI9wn/uRaiPj65x9bC496ixIdjnvXek7Db8HWLfMsnpWWFjAV79w9ZxqlVtzwtXq2XH2Q7I71+Aws/72FyvVz/IsrzlgnmWBX71P5vBV16IQhr3uCUYbxBLVjgPoBAUgkLxgEJQCArFAwpBISgUDzeF1+E7vWMor8IUZi5mFUrzwxa7jRRLnntP0YCqUaNwItG7a3LBUqNT1B5K0fuHboj7RSQM5YjDT7edaIV1HlbeGAwmc82/c8XEK1ytTDkcXKeuEzWUlxf1JRdzOJlq7sXv/YGlKKxwi91pQtBXHfZ12hUNn/mCFNZEaYxPY6YC5QnH9JSwhSl8qCRpFu9Pfb3Yh9M+zlLSmvAmTmGbL/uBoUfkFhY48dvDa30/wUV4I5HembMoLs7b6xGKISOE9Ce5p6qql3//XX3XY88RNVICllvbpM7Hk+WhOab8aps0nzIeRZSoZZQYWaIkyLMNNfnEbprtHAAmQCtlyL6OEq3Q34WRLcqPyb4V+RqlGVuHLMP3FEofXDPrhqsVPYIHM6xtQg86SZgZ1Kj21iWjEjVu3jL5Y3eOgEw3WK3l4vnrRVlx65qTXRFaWlGqJ/Fq1qivVUdKHMvyW3JXMJZPBs9W6BhyZ6YYZlrSW++wXHVIiaCpIg1EpO61F9syePK7fMokt8noekMWInb+rnRLzLgqu/ved3jw8tQePrq6dpp771eUDvPDA+TT25uo2Ho4v4V9n1gjvOXY6+U+P3E4OqC7K75w/DglSZymQftNyUGaxkly+ujtcOVSi7K3xF987KRXdh+jrsKn2OZK57Pl1KTkJLCaD7PejqHjcDwYsXXGyBBWs98WrBtvieWN89FkJucTkTDif9ysgTgKnDwZHMfwY5FGwucQJAqdaS2zKhR1AiIoBIWgEBSCQlAICkEhKASFi1DIokobDV/Y2ePjvG7D4eaA+gcVRnyi9zdj6p/Pg8vncOedqCX+amXxOYA85exle8Xg4XDDYrxQX2g9iyBMYM8uLqEXiHVKyZbbkfFt5GwLCZ9ZhbANIJwrfp3GnU/04xsidpM3X9yb8BcoYJQzEwjvg09M++3ZNyNIbP6bMa1YCjVIchK2IvBFT8PaDaig85bDbmHyHljhaZ7+eDyFS9RXo9jx9K3hMrY57tcPh6SX/fiWPO4vKZ8TeKdg6SRTx4g8qhnRl/p4/oWJIiKpQx7YUpVIJGCrfgqG7CIb9zjSaeNgG7kzhaZyx7JM07Ljy4XWnuXlEtv1P7B9Mv8DltyUV+hIpoIAAAAASUVORK5CYII="
var mimeTypes = require('mimetypes');
var image = newImageData,
mimeType = image.match(/data:([a-zA-Z0-9]+\/[a-zA-Z0-9-.+]+).*,.*/)![1],
fileName = 'test.' + mimeTypes.detectExtension(mimeType),
base64EncodedImageString = image.replace(/^data:image\/\w+;base64,/, ''),
imageBuffer = Buffer.from(base64EncodedImageString, 'base64');
// Instantiate the GCP Storage instance
const { Storage } = require('@google-cloud/storage');
const googleCloudStorage = new Storage(firebaseSettings);
const bucket = googleCloudStorage.bucket('projectID.appspot.com');
var file = bucket.file(fileName);
return file.save(imageBuffer, {
metadata: { contentType: mimeType, cacheControl: "public, max-age=300" },
public: true,
validation: 'md5'
}, function (error: any) {
if (error) {
throw 'error';
}
return "https://storage.googleapis.com/share-expanses-dcc9f.appspot.com/" + fileName;
});
}
Thanks for the help
Haven't been able to test the solution given by Firebase, but here's the transcript of the response:
The problem that you are facing could be because of two reasons. The first one is how you are uploading the files: via the Firebase Console, using any Admin SDK, or via the gsutil command. If using the Admin SDK option, the problem is a known issue where the required metadata doesn't exist; fortunately there is a workaround, you can try this script to solve this issue.
Now, the second one is related to the network: if you are using Comcast, please try on a different network to see if this issue is related to that.
When you save an image to Firebase Storage, you need to provide an access token in the metadata: firebaseStorageDownloadTokens. It has to be a UUID.
More info can be found here: https://www.sentinelstand.com/article/guide-to-firebase-storage-download-urls-tokens
const { v4: uuid } = require("uuid")
function uploadImage() {
const newImageData = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAOEAAADhCAMAAAAJbSJIAAAAgVBMVEX///8AAAAEBASAgIDr6+vw8PBYWFjU1NTGxsbz8/P29vb8/Py1tbVhYWHd3d1ra2vk5OS/v78pKSlTU1NOTk6Tk5OpqanNzc13d3dKSkplZWWbm5s5OTkfHx+NjY2GhoYcHBw9PT0TExOioqJ7e3soKCiurq5CQkI6OjoXFxcwMDAuPQWoAAAIJ0lEQVR4nO2daXuqPBCGVfYtIbKLgorLKf//B75ga2sPAdmS8F5n7m/VXgyPISGZzExWKwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAOAVy6wwlCZG/YUl+vamYLoy0lAkORWbdZNN/YUUVf8ju6bomx2Kq+mEBL5K0UVD9QNCdM0Vfds9QSQMnYTWaN1sEicMCRJ9+29QSCqp28HiftiqUkoU0TLaMDLsl8PbrtGWpY8zQ7QYCoE/pe3+ZusHogW9Yln2nxnVPfljW8t4l8go2zPQV7PPkCxa3kqOvOldr52NF4nVKNuYobxPsC1OoxVilu33ZINDQf1Rw2cO+mrOWBMhMD5w0ldziLnrQyeO+mpOfOdzZspZX03KcQGC2I+gNDC3ZkQ7IQLX6x0niW7fZd83ief7Pg7tH0JcfeIlQy+kcllCyk7/O/LUWIsiDbkVv9/bcv0Rqr+MVa//BR0Ob/++AstdKLuK8n4ZZCiKK4e7cikSez2iTjxm2DPjPj8e8wf1zWviXPqpPuHyeuqXb2ZK6WxaqJCuZe7m7qTTp1da6ty7prtbMoOOVro64VktonlmyFZUqB0NybQrkna7l1ndZIhc2k0xbESjdTEf63M7yBQ9bjO2Z+enamvCXcTCphG1zZ3YNSJ9OroJmRkM6UMOZmXPoppzWE4WEX1oY7Xmv1Fs7Zn9nl9gWt+/MTImNU2dprze+6GfmmYlRrYoHZ9dF/whbJrdMTLVdMzEPDZRlOZr48DIVMMQ2wnUN5SpIiNLjaF7z74X1uiNwWbDyNI/qHCd8XBGW1nDLj+F64K9RKtomuWocMvq3fvDjbIk5ahwvc0YGXuS0dbcPBWu1zlLx4mbU23yVbi+s3srkjvdJGeF1WqGTTO6rZsH3BVWjyqae1C1EP0BFaVwvQ60OR1EshZ0GROisJqG23Pt1Gp2q49GqML1OsmK6aMOKbK3OzbCFFZsLwEZ75YySHDpE1olUmFl/XrP02jE5aM0v1972phd2ydDQkvOnjRk4aFL3pDAjiUo/LoTHOuKLMtGzet+lPn4pPpG0eMRQTnLUfi8oTreWc0j7UmUq22x0f9PhbMDCkEhKBQPKJxNYTkgEmYCXiMShZvCXEkHTUXGcPZSpbFS5KZQWq3Qu4XORGIbUTa9eCqsFqt6OGemxSvbUH8sqwUrXK0sl/gM9PnE/XKNCFdYYZozB7VvsPkzU1+Cwhrrph7mGHfOB/X226+1FIU1duztr+Nlnq97L7YbV12SworITqXdvmcw5QvlfielNtU5sDCFNYpO0iJTk2MvbcdEzYqUtIdVLVDhAwvVubIk8D3Po421j88DUufHvvEnL1XhF2Yd7+xqTR6f9wq1XbjCGQCFs/EPKsR8sq6Vxi4bvxVwzCOJBTVXLxzX+BxSrgPKm4arFwM3J1lzQk/D5eunOTjsIqN0h57Gyd0TlbBJYtVa9xFF+NpUedZSM5Ypd6UfifEm4tts5QFkdOvO4RTlLz3GoTZDVpAWxu/WJAI9wn/uRaiPj65x9bC496ixIdjnvXek7Db8HWLfMsnpWWFjAV79w9ZxqlVtzwtXq2XH2Q7I71+Aws/72FyvVz/IsrzlgnmWBX71P5vBV16IQhr3uCUYbxBLVjgPoBAUgkLxgEJQCArFAwpBISgUDzeF1+E7vWMor8IUZi5mFUrzwxa7jRRLnntP0YCqUaNwItG7a3LBUqNT1B5K0fuHboj7RSQM5YjDT7edaIV1HlbeGAwmc82/c8XEK1ytTDkcXKeuEzWUlxf1JRdzOJlq7sXv/YGlKKxwi91pQtBXHfZ12hUNn/mCFNZEaYxPY6YC5QnH9JSwhSl8qCRpFu9Pfb3Yh9M+zlLSmvAmTmGbL/uBoUfkFhY48dvDa30/wUV4I5HembMoLs7b6xGKISOE9Ce5p6qql3//XX3XY88RNVICllvbpM7Hk+WhOab8aps0nzIeRZSoZZQYWaIkyLMNNfnEbprtHAAmQCtlyL6OEq3Q34WRLcqPyb4V+RqlGVuHLMP3FEofXDPrhqsVPYIHM6xtQg86SZgZ1Kj21iWjEjVu3jL5Y3eOgEw3WK3l4vnrRVlx65qTXRFaWlGqJ/Fq1qivVUdKHMvyW3JXMJZPBs9W6BhyZ6YYZlrSW++wXHVIiaCpIg1EpO61F9syePK7fMokt8noekMWInb+rnRLzLgqu/ved3jw8tQePrq6dpp771eUDvPDA+TT25uo2Ho4v4V9n1gjvOXY6+U+P3E4OqC7K75w/DglSZymQftNyUGaxkly+ujtcOVSi7K3xF987KRXdh+jrsKn2OZK57Pl1KTkJLCaD7PejqHjcDwYsXXGyBBWs98WrBtvieWN89FkJucTkTDif9ysgTgKnDwZHMfwY5FGwucQJAqdaS2zKhR1AiIoBIWgEBSCQlAICkEhKASFi1DIokobDV/Y2ePjvG7D4eaA+gcVRnyi9zdj6p/Pg8vncOedqCX+amXxOYA85exle8Xg4XDDYrxQX2g9iyBMYM8uLqEXiHVKyZbbkfFt5GwLCZ9ZhbANIJwrfp3GnU/04xsidpM3X9yb8BcoYJQzEwjvg09M++3ZNyNIbP6bMa1YCjVIchK2IvBFT8PaDaig85bDbmHyHljhaZ7+eDyFS9RXo9jx9K3hMrY57tcPh6SX/fiWPO4vKZ8TeKdg6SRTx4g8qhnRl/p4/oWJIiKpQx7YUpVIJGCrfgqG7CIb9zjSaeNgG7kzhaZyx7JM07Ljy4XWnuXlEtv1P7B9Mv8DltyUV+hIpoIAAAAASUVORK5CYII="
var mimeTypes = require('mimetypes');
var image = newImageData,
mimeType = image.match(/data:([a-zA-Z0-9]+\/[a-zA-Z0-9-.+]+).*,.*/)![1],
fileName = 'test.' + mimeTypes.detectExtension(mimeType),
base64EncodedImageString = image.replace(/^data:image\/\w+;base64,/, ''),
imageBuffer = Buffer.from(base64EncodedImageString, 'base64');
// Instantiate the GCP Storage instance
const { Storage } = require('@google-cloud/storage');
const googleCloudStorage = new Storage(firebaseSettings);
const bucket = googleCloudStorage.bucket('projectID.appspot.com');
var file = bucket.file(fileName);
return file.save(imageBuffer, {
metadata: {
contentType: mimeType,
cacheControl: "public, max-age=300",
// THIS IS THE LINE YOU NEED TO ADD
firebaseStorageDownloadTokens: uuid(),
},
public: true,
validation: 'md5'
}, function (error: any) {
if (error) {
throw 'error';
}
return "https://storage.googleapis.com/share-expanses-dcc9f.appspot.com/" + fileName;
});
}
After that you'll need to click on "Create access token"
@jean-smaug's answer is almost complete. Based on the page he linked (https://www.sentinelstand.com/article/guide-to-firebase-storage-download-urls-tokens), the only missing thing is to wrap the firebaseStorageDownloadTokens property inside a metadata object. I've just tested it and it's working fine 👌 No need to create an access token afterwards.
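In other words, a minimal sketch of the corrected options object for the save call above (everything else unchanged):
// Sketch: firebaseStorageDownloadTokens sits in a nested metadata object,
// one level deeper than contentType and cacheControl.
file.save(imageBuffer, {
  metadata: {
    contentType: mimeType,
    cacheControl: "public, max-age=300",
    metadata: {
      firebaseStorageDownloadTokens: uuid(),
    },
  },
  public: true,
  validation: 'md5'
});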
In my case I added the metadata while uploading, and the preview kept loading as shown in the image, but when I refreshed the page after 3 minutes I found that it had uploaded correctly. So, as Cafn explained, if it is not a matter of metadata, you just need to wait until it has loaded.
$uploadedObject = $bucket->upload($imageFile, [
  'name' => 'Image_Name',
  'metadata' => ['contentType' => 'image/png'],
]);

How to get public download link within a firebase storage trigger function: "onFinalize"?

I am writing a Firebase Cloud Function that records the download link of a recently uploaded file to the Realtime Database:
exports.recordImage = functions.storage.object().onFinalize((object) => {
});
"object" gives me access to two variables "selfLink" and "mediaLink" but both of them when entered in a browser they return the following:
Anonymous caller does not have storage.objects.get access to ... {filename}
So, they are not public links. How can I get the public download link within this trigger function?
You have to use the asynchronous getSignedUrl() method, see the doc of the Cloud Storage Node.js library: https://cloud.google.com/nodejs/docs/reference/storage/2.0.x/File#getSignedUrl.
So the following code should do the trick:
.....
const defaultStorage = admin.storage();
.....

exports.recordImage = functions.storage.object().onFinalize(object => {
  const bucket = defaultStorage.bucket();
  const file = bucket.file(object.name);
  const options = {
    action: 'read',
    expires: '03-17-2025'
  };
  // Get a signed URL for the file
  return file
    .getSignedUrl(options)
    .then(results => {
      const url = results[0];
      console.log(`The signed url for ${object.name} is ${url}.`);
      return true;
    });
});
Note that, in order to use the getSignedUrl() method, you need to initialize the Admin SDK with the credentials for a dedicated service account, see this SO Question & Answer firebase function get download url after successfully save image to firebase cloud storage.
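For reference, a minimal sketch of that initialization (the key file path is a placeholder; the key is downloaded from Project settings > Service accounts in the Firebase console):
const admin = require('firebase-admin');
// Placeholder path: a service account key downloaded from the Firebase console.
// Keep this file out of version control.
const serviceAccount = require('./serviceAccountKey.json');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
});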
Use this function:
function mediaLinkToDownloadableUrl(object) {
  var firstPartUrl = object.mediaLink.split("?")[0] // 'https://storage.googleapis.com/download/storage/v1/b/abcbucket.appspot.com/o/songs%2Fsong1.mp3.mp3'
  var secondPartUrl = object.mediaLink.split("?")[1] // 'generation=123445678912345&alt=media'
  firstPartUrl = firstPartUrl.replace("https://storage.googleapis.com/download/storage", "https://firebasestorage.googleapis.com")
  firstPartUrl = firstPartUrl.replace("v1", "v0")
  firstPartUrl += "?" + secondPartUrl.split("&")[1]; // 'alt=media'
  firstPartUrl += "&token=" + object.metadata.firebaseStorageDownloadTokens
  return firstPartUrl
}
This is how your code might look:
export const onAddSong = functions.storage.object().onFinalize((object) => {
  console.log("object: ", object);
  var url = mediaLinkToDownloadableUrl(object);
  // Do anything with url, like send it via email or save it in your database in a playlist table.
  // In my case I'm saving it in a mongodb database
  return new playlistModel({
    name: storyName,
    mp3Url: url,
    ownerEmail: ownerEmail
  })
    .save() // I'm doing nothing on save complete
    .catch(e => {
      console.log(e) // log if an error occurs in the database write
    })
})
I have tested this method on mp3 files and I'm sure it will work on all types of files, but in case it doesn't work for you, simply go to the Firebase Storage dashboard, open any file and copy its download URL, then try to generate the same URL in your code (and edit this answer too if possible).

Error occurred while parsing your function triggers

The following error is shown while deploying a Firebase function.
I tried initializing the Firebase functions.
I also double-checked the index.js file.
I'm new to deploying Firebase functions, so please help me with this.
index.js is as follows:
const functions = require('firebase-functions');

// replaces keywords with emoji in the "text" key of messages
// pushed to /messages
exports.emojify =
  functions.database.ref('/messages/{pushId}/text')
    .onWrite(event => {
      // Database write events include new, modified, or deleted
      // database nodes. All three types of events at the specific
      // database path trigger this cloud function.
      // For this function we only want to emojify new database nodes,
      // so we'll first check to exit out of the function early if
      // this isn't a new message.
      // !event.data.val() is a deleted event
      // event.data.previous.val() is a modified event
      if (!event.data.val() || event.data.previous.val()) {
        console.log("not a new write event");
        return;
      }
      // Now we begin the emoji transformation
      console.log("emojifying!");
      // Get the value from the 'text' key of the message
      const originalText = event.data.val();
      const emojifiedText = emojifyText(originalText);
      // Return a JavaScript Promise to update the database node
      return event.data.ref.set(emojifiedText);
    });

// Returns text with keywords replaced by emoji
// Replacing with the regular expression /.../ig does a case-insensitive
// search (i flag) for all occurrences (g flag) in the string
function emojifyText(text) {
  var emojifiedText = text;
  emojifiedText = emojifiedText.replace(/\blol\b/ig, "😂");
  emojifiedText = emojifiedText.replace(/\bcat\b/ig, "😸");
  return emojifiedText;
}
Please check the current documentation on triggers, and specifically on migration from Beta to Version 1.0.
event.data.previous.val() has changed to change.before.val()
event.data.val() has changed to change.after.val()
Also, the Promise statement changes to:
return change.after.ref.parent.child('text').set(emojifiedText);
The complete index.js looks like:
const functions = require('firebase-functions');

// replaces keywords with emoji in the "text" key of messages
// pushed to /messages
exports.emojify =
  functions.database.ref('/messages/{pushId}/text')
    .onWrite((change, context) => {
      // Database write events include new, modified, or deleted
      // database nodes. All three types of events at the specific
      // database path trigger this cloud function.
      // For this function we only want to emojify new database nodes,
      // so we'll first check to exit out of the function early if
      // this isn't a new message.
      // Only edit data when it is first created.
      if (change.before.exists()) {
        return null;
      }
      // Exit when the data is deleted.
      if (!change.after.exists()) {
        return null;
      }
      // Now we begin the emoji transformation
      console.log("emojifying!");
      // Get the value from the 'text' key of the message
      const originalText = change.after.val();
      const emojifiedText = emojifyText(originalText);
      // Return a JavaScript Promise to update the database node
      return change.after.ref.parent.child('text').set(emojifiedText);
    });

// Returns text with keywords replaced by emoji
// Replacing with the regular expression /.../ig does a case-insensitive
// search (i flag) for all occurrences (g flag) in the string
function emojifyText(text) {
  var emojifiedText = text;
  emojifiedText = emojifiedText.replace(/\blol\b/ig, "😂");
  emojifiedText = emojifiedText.replace(/\bcat\b/ig, "😸");
  return emojifiedText;
}
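Once the code is migrated to the 1.0 API, redeploying should succeed:
firebase deploy --only functions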

Get storage space used in a folder in Firebase Storage

I'm creating a Firebase app that you can use to upload files. How can I get the amount of space used by a user in his folder (users/{userId}/{allPaths=**}) ?
Great question. In short, there's no easy way to do this (even for us!) since this effectively requires that we recurse over an entire set of files and sum them all up. It's a pretty big mapreduce that isn't efficient to run every time a file is uploaded.
We do, however, return the size of an individual file in the metadata.size property, so you can perform your own list call on a server (look at gcloud), which will give you a list of files and "folders". Take the sizes of the files and add them up, then recurse and do the same for all subfolders. Sum them up and write the totals to something like the Firebase Realtime Database, where you can easily grab the folder sizes from clients.
Here's a little script I wrote that calculates the count of files and bytes used for each of your "folders" and outputs to console.
function main(bucketName = 'YOUR_BUCKET_NAME') {
  /**
   * TODO(developer): Uncomment the following line before running the sample.
   */
  // const bucketName = 'Name of a bucket, e.g. my-bucket';

  // Imports the Google Cloud client library
  const {Storage} = require('@google-cloud/storage');

  // Creates a client
  const storage = new Storage();

  async function listFiles() {
    // Lists files in the bucket
    const [files] = await storage.bucket(bucketName).getFiles();
    console.log('Files:');
    let bucketList = {};
    files.forEach(file => {
      let folder = file.name.split('/')[0];
      if (!bucketList[folder]) {
        bucketList[folder] = {};
        bucketList[folder]['bytes'] = 0;
        bucketList[folder]['count'] = 0;
      }
      bucketList[folder]['bytes'] += Number(file.metadata.size);
      bucketList[folder]['count'] += 1;
    });
    console.log(bucketList);
  }

  listFiles().catch(console.error);
  // [END storage_list_files]
}

main(...process.argv.slice(2));
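If you then want to publish those totals to the Firebase Realtime Database, as suggested above, a minimal sketch using the Admin SDK could look like this (the /folderSizes path is an arbitrary choice):
const admin = require('firebase-admin');
admin.initializeApp(); // assumes credentials are available, e.g. via a service account

// Hypothetical helper: store the per-folder totals computed above under
// /folderSizes so clients can read them without listing the whole bucket.
async function saveFolderSizes(bucketList) {
  await admin.database().ref('folderSizes').set(bucketList);
}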
A slightly improved version based on @Mike's answer that also outputs the size recursively for all subfolders and prints sizes in "human readable" format, i.e. MB, kB, etc.
It also writes the output to a JSON file that you can explore using a JSON viewer like https://jsoneditoronline.org/.
Note that you also need to pass a serviceaccount.json as credentials to Storage.
function main(bucketName) {
  // Imports the Google Cloud client library
  const {Storage} = require('@google-cloud/storage');
  const fs = require('fs').promises;

  // Creates a client
  const storage = new Storage({
    credentials: //your service account json key
  });

  async function listFiles() {
    // Lists files in the bucket
    const [files] = await storage.bucket(bucketName).getFiles();
    console.log('Files:');
    let bucketList = {};
    files.forEach(file => {
      let folders = file.name.split('/');
      let curFolder = bucketList;
      folders.forEach(subFolder => {
        if (!curFolder[subFolder]) {
          curFolder[subFolder] = {};
          curFolder[subFolder]['bytes'] = 0;
          curFolder[subFolder]['count'] = 0;
        }
        curFolder[subFolder]['bytes'] += Number(file.metadata.size);
        curFolder[subFolder]['count'] += 1;
        curFolder[subFolder]['size'] = humanFileSize(curFolder[subFolder]['bytes']);
        curFolder = curFolder[subFolder];
      });
    });
    console.log(bucketList);
    await fs.writeFile("sizes.json", JSON.stringify(bucketList));
  }

  function humanFileSize(bytes, si = true, dp = 1) {
    const thresh = si ? 1000 : 1024;
    if (Math.abs(bytes) < thresh) {
      return bytes + ' B';
    }
    const units = si
      ? ['kB', 'MB', 'GB', 'TB', 'PB', 'EB', 'ZB', 'YB']
      : ['KiB', 'MiB', 'GiB', 'TiB', 'PiB', 'EiB', 'ZiB', 'YiB'];
    let u = -1;
    const r = 10 ** dp;
    do {
      bytes /= thresh;
      ++u;
    } while (Math.round(Math.abs(bytes) * r) / r >= thresh && u < units.length - 1);
    return bytes.toFixed(dp) + ' ' + units[u];
  }

  listFiles().catch(console.error);
  // [END storage_list_files]
}

// TODO: change the name of the bucket to yours
main("my-bucket-name");
