I'm developing locally with the Firebase emulator.
The front-end app creates a project with some files uploaded to a 'tmp/' folder in the storage bucket.
I need to move the uploaded files into the project folder once the project is saved. But when I do this in a Cloud Function:
const storage = new Storage();
// also tried:
// const storage = admin.storage();
const bucket = storage.bucket(DEFAULT_BUCKET);
const destRoot = `${project.id}/files`;
// `tmpFiles` is an array of file paths
const movedFiles = await Promise.all(
  tmpFiles.map((file: string) =>
    bucket
      .file(`tmp/${file}`)
      .move(`${destRoot}/${file}`)
      .then((result) => result[0].publicUrl())
      .catch(console.log)
  )
);
I got this error:
ApiError: file#copy failed with an error - Not Implemented
So is it not possible to move storage files for now?
The Firebase emulators typically only (or at least first) implement operations that are also possible with the client-side SDKs. Since the client-side SDK for Cloud Storage doesn't have an API to move files, that operation is (currently) also not implemented in the emulator when using the Admin SDK.
This specific feature is being tracked on the GitHub repo as #3751, so I recommend checking there for updates, or submitting a PR that adds the functionality.
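If you just need something that works against the emulator in the meantime, one possible workaround is to approximate the move with a download, re-upload, and delete. This is only a sketch, assuming plain download/upload/delete are supported by your emulator version; it reuses bucket, destRoot, and tmpFiles from the question:
const movedFiles = await Promise.all(
  tmpFiles.map(async (file) => {
    const src = bucket.file(`tmp/${file}`);
    const dest = bucket.file(`${destRoot}/${file}`);
    // Emulate a move: read the object, write it under the new path, delete the original.
    const [contents] = await src.download();
    await dest.save(contents);
    await src.delete();
    return dest.publicUrl();
  })
);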
I'm trying to trigger a Firebase Cloud Function on a Realtime Database owned by a different app: the Hacker News API. I'm passing databaseUrl: https://hacker-news.firebaseio.com/ to admin.initializeApp(), and it deploys fine, but it never triggers, even though new items in this RTDB are regularly getting created.
const functions = require('firebase-functions')
const admin = require('firebase-admin')
admin.initializeApp({databaseUrl: 'https://hacker-news.firebaseio.com/'})
exports.itemUpdated = functions.database.ref('/v0/item/{id}').onCreate((snapshot, context) => {
  functions.logger.log('Item: ', context.params.id, snapshot.val())
  return null
})
I've tried it without databaseUrl in the local emulator, and it works fine. I've also tried a number of variations, e.g. functions.database.instance('hacker-news') and databaseUrl: 'https://hacker-news.firebaseio.com/v0', but no luck: it never triggers in production. Any idea what I'm missing?
It's just not possible to write functions in one project for a database in another project. The events generated from database changes always stay within the project.
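If the goal is simply to react to changes in the public Hacker News database, a workaround (not a database trigger, and only a rough sketch) is to listen to it from your own process using the client SDK; the v8-style syntax and the /v0/updates path from the HN API are the assumptions here:
const firebase = require('firebase/app');
require('firebase/database');

// Second app instance pointed at the public Hacker News RTDB (readable without auth).
const hn = firebase.initializeApp(
  { databaseURL: 'https://hacker-news.firebaseio.com' },
  'hacker-news'
);

// /v0/updates is periodically overwritten with the ids of recently changed items.
hn.database().ref('/v0/updates/items').on('value', (snapshot) => {
  console.log('Recently changed items:', snapshot.val());
});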
I needed a way to quickly clear an entire Firestore DB but couldn't find great documentation for how to do it. Eventually I found this answer on Stack Overflow (Clear Firestore database of all data?), but I was a bit nervous about how long it would take to clear my entire DB of millions of documents, so I wanted a way to just recursively delete one collection at a time.
Background: I've been running some tests migrating large amounts of data from an old DB to Firestore, and after each run I want a clean slate to work with in Firestore. Do NOT use this on production data!
This is now documented via the recursiveDelete function:
https://googleapis.dev/nodejs/firestore/latest/Firestore.html#recursiveDelete
Note that this is a relatively new feature, so you need to make sure your Firebase libraries are updated.
// Setup
const admin = require('firebase-admin');
const serviceAccount = require("./files/my-file.json");
admin.initializeApp({
  credential: admin.credential.cert(serviceAccount)
});
const firestore = admin.firestore();

// Delete
const documentRef = firestore
  .collection("users")
  .doc("M3S2iPhsiu2ZQmOK8ZcC");
await firestore.recursiveDelete(documentRef);
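recursiveDelete also accepts a CollectionReference, so if you want to wipe an entire collection (including nested subcollections) rather than a single document, a minimal sketch using the same setup would be:
// Delete every document in the collection, plus everything nested under them.
const collectionRef = firestore.collection("users");
await firestore.recursiveDelete(collectionRef);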
I was able to do this by running the Firestore CLI delete command, specifying each collection by path. Make sure not to start the path with a leading slash, or it will think you are referring to a directory on your computer. Example of how to run it:
firebase firestore:delete "path/to/collection" --recursive
The firestore:delete command is sort of documented here as well: https://firebase.google.com/docs/firestore/manage-data/delete-data#delete_data_with_the_firebase_cli
Update
Please note that the command may fail after deleting about 20,000 documents; I think it hits some kind of limit. I just re-run it with the same collection path a few times to fully delete collections that are larger than 20k docs.
From the Fireship website (https://fireship.io/snippets/delete-firestore-collection/), there are two options to delete collections in Firestore:
Option 1:
You can manually delete a collection or subcollection from the Firebase Console, OR by using the CLI:
firebase firestore:delete path-to-delete
Option 2:
It is possible to interact with Firebase Tools from a Cloud Function. This works especially well with callable functions because you almost certainly want to enforce some form of user authorization.
First, obtain a CI token to authenticate firebase-tools:
cd functions
npm i firebase-tools -D
firebase login:ci
# your_token
firebase functions:config:set ci_token='your_token'
The function should validate that the user has permission to run the operation. If allowed, it runs the CLI command recursively on the collection and its nested subcollections.
const functions = require('firebase-functions');
const firebase_tools = require('firebase-tools');

const project = process.env.GCLOUD_PROJECT;
const token = functions.config().ci_token;

exports.deleteCollection = functions
  .runWith({ timeoutSeconds: 540 })
  .https.onCall((data, context) => {
    const path = data.path;
    const allowed = context.auth.uid === path.split('/')[0]; // TODO your own logic
    if (!allowed) {
      throw new functions.https.HttpsError(
        'permission-denied',
        'Hey, that is not cool buddy!'
      );
    }
    return firebase_tools.firestore
      .delete(path, {
        project,
        token,
        recursive: true,
        yes: true,
      })
      .then(() => ({ result: 'all done!' }));
  });
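On the client, the callable can then be invoked with the path to delete. A minimal sketch using the v8-style web SDK (the path shown is hypothetical; whatever you pass has to satisfy the permission check above):
// Ask the Cloud Function to recursively delete a collection owned by the current user.
const deleteCollection = firebase.functions().httpsCallable('deleteCollection');
deleteCollection({ path: `${firebase.auth().currentUser.uid}/myCollection` })
  .then((result) => console.log(result.data))
  .catch((err) => console.error('Delete failed:', err));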
I want to upload some data to Google Cloud Storage and then run a Firebase Function on the server to process the data.
My reason for this is that the data (representing information I want to import into Firestore) could be quite large – maybe 50MB. I could easily have data that represents 30,000 documents. I don't want to process the data on my local machine because it would possibly take several hours.
My problem is that I can't find a way to access Cloud Storage from the server itself. I can easily download and upload files between my local machine and the server. I've spent the last several hours reading through the docs, but everything seems to be based on accessing Cloud Storage from a local instance.
You can trigger Cloud Functions to run when a change is made in Cloud Storage. Docs here. The file uploaded or changed is already included in the metadata, so you don't need to "access storage" per se. From the example:
/**
 * Generic background Cloud Function to be triggered by Cloud Storage.
 *
 * @param {object} data The event payload.
 * @param {object} context The event metadata.
 */
exports.helloGCSGeneric = (data, context) => {
  const file = data;
  console.log(`  Event ${context.eventId}`);
  console.log(`  Event Type: ${context.eventType}`);
  console.log(`  Bucket: ${file.bucket}`);
  console.log(`  File: ${file.name}`);
  console.log(`  Metageneration: ${file.metageneration}`);
  console.log(`  Created: ${file.timeCreated}`);
  console.log(`  Updated: ${file.updated}`);
};
If you want direct access to Storage from within the function (if for example you want to write back to Storage with new/altered data), you use the Storage SDKs. Examples for all supported languages are here. Example from the docs:
// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();
// Creates a client from a Google service account key.
// const storage = new Storage({keyFilename: "key.json"});

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// const bucketName = 'bucket-name';

async function createBucket() {
  // Creates the new bucket
  await storage.createBucket(bucketName);
  console.log(`Bucket ${bucketName} created.`);
}

createBucket();
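Putting the two pieces together for the import use case in the question, a rough sketch could look like the following. The importUpload name, the assumption that the upload is a JSON array, and the items collection are all mine, not from the docs:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Sketch: when a file lands in the default bucket, download it and import it into Firestore.
exports.importUpload = functions.storage.object().onFinalize(async (object) => {
  // Assumption: the uploaded file is a JSON array of plain objects.
  const [contents] = await admin.storage().bucket(object.bucket)
    .file(object.name)
    .download();
  const docs = JSON.parse(contents.toString());

  // Firestore batched writes are limited to 500 operations each.
  const firestore = admin.firestore();
  for (let i = 0; i < docs.length; i += 500) {
    const batch = firestore.batch();
    docs.slice(i, i + 500).forEach((doc) => {
      batch.set(firestore.collection('items').doc(), doc);
    });
    await batch.commit();
  }
});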
A very important caveat! Cloud Functions has a default timeout of 60 seconds and a maximum of 9 minutes (limits doc). Long-running processes of up to 15 minutes should use Cloud Run instead, and anything longer should run in an environment like App Engine (which you could kick off from your function, delegating the work to the App Engine instance).
I am trying to write a function that updates a node in the database and then creates a directory in the default storage bucket.
admin.database().ref('messages').push({ original: original })
  .then(() => {
    // looking for something like this
    // functions.storage.object().mkdir('myFolder');
  })
Function samples from the Firebase docs use const gcs = require('@google-cloud/storage')(); but I am having a hard time importing this package using TypeScript.
Importing it this way does not work; instead of having access to gcs.bucket(...) I have access to gcs.Bucket:
import * as gcs from '@google-cloud/storage';
I am looking for a way to get this import working, or for other approaches I can use in TypeScript.
Thanks.
Google Cloud Storage does not have a concept of "folders" -- as a pure object store, you can store a file with any arbitrary key. Various UIs (including the Firebase console) look for slashes in the object names to provide a virtual structure, but no actual structure exists.
On the import issue -- as of version 5.2.0 of the Admin SDK, you can just do:
admin.storage().bucket()
to get a reference to the Cloud Storage bucket.
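For the original goal of "creating a folder", a hedged sketch: since folders are virtual, it is usually enough to just write your files under the desired prefix. If you want an empty "folder" to show up in the console before any real files exist, a zero-byte placeholder object with a trailing slash is a common convention (this appears to be what the console's own "Create folder" does):
const bucket = admin.storage().bucket();

// Writing any object under the prefix makes "myFolder/" appear in the console.
await bucket.file('myFolder/hello.txt').save('hello world');

// Or create an empty placeholder so the prefix exists before any real files do.
await bucket.file('myFolder/').save('');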
How can I integrate Firebase with a Java desktop application?
Is there a .jar file I could download?
I've seen plain Java docs on the old Firebase version but I'm unable to find any documentation on the latest version of Firebase.
My IDE is Netbeans.
Thanks.
According to the documentation website, Firebase will not work just like that; its client SDKs are designed to run on only three platforms, namely:
Android
iOS
Web
You can try using the Maven repository for the integration, with any build script. I'm not exactly sure what you expect to do.
For Firebase Storage on the server, I recommend using gcloud-java:
// Authenticate using a service account
Storage storage = StorageOptions.builder()
    .authCredentials(AuthCredentials.createForJson(new FileInputStream("/path/to/my/key.json")))
    .build()
    .service();

// Create blob
BlobId blobId = BlobId.of("bucket", "blob_name");
// Add metadata to the blob
BlobInfo blobInfo = BlobInfo.builder(blobId).contentType("text/plain").build();
// Upload blob to GCS (same as Firebase Storage)
Blob blob = storage.create(blobInfo, "Hello, Cloud Storage!".getBytes(UTF_8));
You can use firebase-server-sdk-3.0.1.jar (the current version).
In NetBeans I would recommend creating a Maven project and using the artifact with GroupId com.google.firebase and ArtifactId firebase-server-sdk.
It works perfectly for me.
You can find some documentation here.
To initialize the SDK, just follow the documentation: add a service account (I use the Owner role; I haven't tried weaker roles), download the private key, and use this snippet:
FirebaseOptions options = new FirebaseOptions.Builder()
    .setServiceAccount(new FileInputStream("path/to/downloaded private key.json"))
    .setDatabaseUrl("https://your database name.firebaseio.com/")
    .build();
FirebaseApp.initializeApp(options);