Google sent me an email explaining that they will delete my Cloud Shell home directory:
Deletion notice for your Google Cloud Shell home directory
It's been over 120 days since you opened Cloud Shell from the Google Cloud Platform console. In 7 days, your Cloud Shell home directory will be automatically scheduled for deletion.
I have several Firebase projects that store large amounts of important data in Firebase Storage and Firebase Firestore. Does this mean they will delete the Firebase databases as well?
No data will be removed from Cloud Storage or Cloud Firestore when your Cloud Shell home directory is deleted.
As the email says:
your Cloud Shell home directory will be automatically scheduled for deletion.
So only data in that home directory will be deleted.
I am writing files to someone else's GCP storage bucket, for which they have made a service account for me, and I'm using the service account key (a .json file) as credentials in my code.
I made a Firebase Cloud Function to run my code, and when I test it locally everything works as expected. It also works when I run the cloud emulator.
However, when I deploy the function and try to execute it, I get an error like this in the logs:
ApiError: my-project@appspot.gserviceaccount.com does not have storage.objects.create access to the Google Cloud Storage object
Why might I have permissions locally, but not in the cloud? I thought the service account key should be all I need for credentials here.
As mentioned in the error message, the service account my-project@appspot.gserviceaccount.com doesn't have permission to create buckets or add objects in Google Cloud Storage.

As you are writing into someone else's project, please ask the project owner to verify that the service account my-project@appspot.gserviceaccount.com has the necessary permission to write to the Google Cloud Storage bucket. To find an appropriate role for the service account to be able to access Google Cloud Storage, I would recommend going through this link.
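Separately, it may be worth double-checking which credentials the deployed function actually uses: the account in the error, my-project@appspot.gserviceaccount.com, is the App Engine default service account, which can indicate that the deployed code fell back to application default credentials rather than the supplied key. A minimal sketch of passing the key file explicitly (the file name, bucket, and object path below are placeholders, not your actual values):

    // Minimal sketch (TypeScript): explicitly load the provided key instead of
    // relying on application default credentials. File, bucket, and object
    // names are placeholders.
    import { Storage } from "@google-cloud/storage";

    const storage = new Storage({
      keyFilename: "./their-service-account.json", // key supplied by the bucket owner
    });

    export async function writeToTheirBucket(contents: string): Promise<void> {
      // Fails with a storage.objects.create ApiError if the service account
      // in the key file lacks a role such as roles/storage.objectCreator.
      await storage
        .bucket("their-bucket-name")
        .file("output/result.json")
        .save(contents);
    }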
I want to extend an existing GCP project with a Firebase Realtime Database.
According to the Firebase documentation,
all Firebase projects are internally hosted on GCP
and it is possible to use Firebase features directly from a GCP project,
so I went to the Marketplace and found the corresponding product page there.
Unlike other product pages, this one doesn't have an "Enable" button on it but rather "Get started for free".
This button does nothing; there is no response at all. Any ideas?
You can just go to the Firebase console, and from there you will be able to use the Firebase Realtime Database for the same project.
It is not possible to use all of Firebase's features directly on Google Cloud Platform, but you will find common products, such as Storage and Cloud Functions, that are shared between Firebase and the Cloud Platform.
I have an application on Firebase, but I need a backup on another server, either cloud hosting or a VPS. Please help me.
Both Firebase Realtime Database and Firestore offer backup capability to a GCS bucket.
Firebase Realtime Database:
Use the "Backups" tab in the console and follow the wizard to configure the backup. There's more about how to restore and how files are named here. There is no additional cost for the backup operation itself, but you are obviously charged for the storage of the backups in GCS.
For Firestore:
You can use the gcloud firestore export gs://[BUCKET_NAME] command to export either the entire database or just one collection (via the --collection-ids flag) to a GCS bucket. Full documentation about restores, partial exports, etc. is here. Note that you are charged for the document reads performed by the backup, in addition to the GCS storage costs.
There does not appear to be a built-in way to automate this, but the Admin SDK includes a FirestoreAdminClient.exportDocuments call (that link is for Node) that you could presumably invoke from a scheduled Cloud Function to do the work.
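A minimal sketch of such a scheduled function in Node (the schedule and bucket name below are placeholders; an empty collectionIds exports everything):

    // Minimal sketch (TypeScript): schedule a Firestore export from a Cloud
    // Function. The schedule and bucket name are placeholders.
    import * as functions from "firebase-functions";
    import * as firestore from "@google-cloud/firestore";

    const client = new firestore.v1.FirestoreAdminClient();

    export const scheduledFirestoreExport = functions.pubsub
      .schedule("every 24 hours")
      .onRun(async () => {
        // GCP_PROJECT / GCLOUD_PROJECT are set in the Cloud Functions runtime.
        const projectId = process.env.GCP_PROJECT || process.env.GCLOUD_PROJECT || "";
        await client.exportDocuments({
          name: client.databasePath(projectId, "(default)"),
          outputUriPrefix: "gs://my-backup-bucket",
          collectionIds: [], // empty = export all collections
        });
      });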
You must be on the Blaze (pay as you go) plan for both databases.
I have created/forked a lil Google Apps Script Library to manage Firebase Firestore and Firebase Remote Config called FirebaseGoogleAppsScript. The goal is to simply manage the contents of your collections in an apps script as well as update your remote config.
My issue is I can't get a single service account to do both.
Firebase creates two service accounts upon creating a project:
The first is listed in the Firebase Console -> Project Settings -> Service Accounts. I use this one within my Cloud Functions to retrieve the Remote Config just fine. However, in the Apps Script project it is unable to retrieve any data from Firestore. I tried adding all kinds of roles, including Owner and Editor, yet got no Firestore data, though I can still get the Remote Config.
The second is only visible in the GCP service accounts list and has the title Firebase Admin SDK Service Agent, with the roles Firebase Admin SDK Administrator Service Agent and Service Account Token Creator. This one is able to retrieve all the data from Firestore within an Apps Script project. However, in the Apps Script project I can't get it to retrieve the Remote Config, even if I add the role Firebase Remote Config Admin.
I have also made my own service account, which was able to get the Remote Config and just about everything else from Firebase except the Firestore data. It seems only the one service account created by Firebase is able to get any Firestore data.
To recreate the issue, simply deploy my lil FirebaseGoogleAppsScript project and associate it with the same GCP project Firebase is connected to. There is a test file in it which can recreate the issue, assuming you have some data in Remote Config and a collection called posts with some docs.
What the heck is going on here? Why can't I make a service account that can access both Firestore and Remote Config? Any ideas on how to create a proper role that can do both? Do I really have to use two separate service accounts?
TL;DR:
Does my Firebase project incur charges or use up quota by running Firebase functions locally using firebase serve --only functions?
Details:
I have a collection in my Firestore database called locations, with thousands of documents that include location data such as address and postCode. I want to add some geolocation data to each document using an external API service (Open Postcode Geo API). I've written an HTTP-triggered Firebase function, triggered by hitting the endpoint /api/geocodeLocations. This cloud function runs through each document in the locations collection and uses the postCode data to make an API call to Open Postcode Geo, which responds with the geolocation data (along with other data). The function then takes the data it needs from the response and writes it to the document in Firestore.
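For context, here is roughly what the function looks like (a trimmed sketch; the exact endpoint URL and response field names for Open Postcode Geo are approximations, not my actual code):

    // Trimmed sketch (TypeScript) of the HTTP-triggered function. The geo API
    // URL and the response field names are approximations.
    import * as functions from "firebase-functions";
    import * as admin from "firebase-admin";
    import fetch from "node-fetch";

    admin.initializeApp();

    export const app = functions.https.onRequest(async (req, res) => {
      const snapshot = await admin.firestore().collection("locations").get();
      for (const doc of snapshot.docs) {
        const { postCode } = doc.data();
        // One external request per document.
        const response = await fetch(
          `http://api.getthedata.com/postcode/${encodeURIComponent(postCode)}`
        );
        const body: any = await response.json();
        // Write only the fields we need back to the same document.
        await doc.ref.update({
          latitude: body.data?.latitude,
          longitude: body.data?.longitude,
        });
      }
      res.send("geocoding complete");
    });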
I am aware that if I deploy this function and call it by hitting https://my-project/api/geocodeLocations, I will start racking up my quotas and may incur charges for the number of external requests it makes (and it will more than likely time out...). However, if I run firebase serve --only functions locally on my dev machine to emulate functions and then hit the URL http://localhost:5000/my-project/us-central1/app/api/geocodeLocations, will I still rack up quotas for external requests?
I don't think I will, as the requests are between my local machine and the external API, but I can't find any Firebase documentation on quotas when running locally.
Thanks in advance!
Running functions locally in any way doesn't cost anything in terms of billing or quota for the Cloud Functions product.
If you read data out of Firestore or Realtime Database in a function running locally, you will still be billed for that access, but it won't be in relation to Cloud Functions billing or quotas.