It looks like querying Google Cloud Bigtable data is possible with BigQuery, using a URL like:
https://googleapis.com/bigtable/projects/[PROJECT_ID]/instances/[INSTANCE_ID]/tables/[TABLE_NAME]
Even though Google Datastore is built on Google Bigtable, there's no indication of what the PROJECT_ID, INSTANCE_ID, or TABLE_NAME would be for Datastore, where:
[PROJECT_ID] is the project containing your Cloud Bigtable instance
[INSTANCE_ID] is the Cloud Bigtable instance ID
[TABLE_NAME] is the name of the table you're querying
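With hypothetical names filled in (my-project, my-instance, and my-table are placeholders, not real resources), such a URI would look like:
https://googleapis.com/bigtable/projects/my-project/instances/my-instance/tables/my-table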
Is a live connection from BigQuery to Datastore possible (i.e. not just via a Datastore backup)?
BigQuery allows you to query the following sources:
CSV files
Google sheets
Newline-delimited JSON
Avro files
Google Cloud Datastore backups
[Beta] Google Cloud Bigtable
So BigQuery can query Google Cloud Datastore backups, but to do that you need to create a table in BigQuery from the Datastore backup.
Follow the steps:
Step 1 - Create a bucket on GCS to store your backup. (link)
Step 2 - Take a backup of Datastore. (link)
Step 3 - On BigQuery, load your backup to create a table. (link)
Some considerations about Step 3:
You need to import table by table.
The source location will be the file ending in [Entity Name].backup_info.
Ex:
gs://bck-kanjih/ag9zfmdvb2dsLWNpdC1nY3ByQQsSHF9BRV9EYXRhc3RvcmVBZG1pbl9PcGVyYXRpb24YwZ-rAwwLEhZfQUVfQmFja3VwX0luZm9ybWF0aW9uGAEM.Conference.backup_info
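Loading that backup is then a regular bq load with DATASTORE_BACKUP as the source format; a minimal sketch, assuming a hypothetical my_dataset.conference destination table and using the backup_info file above as the source:
# Load the Conference entity backup into a BigQuery table
bq load --source_format=DATASTORE_BACKUP \
  my_dataset.conference \
  gs://bck-kanjih/ag9zfmdvb2dsLWNpdC1nY3ByQQsSHF9BRV9EYXRhc3RvcmVBZG1pbl9PcGVyYXRpb24YwZ-rAwwLEhZfQUVfQmFja3VwX0luZm9ybWF0aW9uGAEM.Conference.backup_info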
Related
I want to transfer data in Firestore in a Firebase project to Firestore in another (GCP only) project.
What is the best solution for this task?
Does the document ID change after transfer?
Project one:
Google Cloud console > Firestore > Import/Export > Export
Export the Firestore data into a Storage bucket.
Google Cloud console > Cloud Storage > select bucket
Download the exported Firestore folder.
Project two:
Google Cloud console > Cloud Storage > select bucket
Upload the exported Firestore folder.
Google Cloud console > Firestore > Import/Export > Import
Import the Firestore data from the Storage bucket.
Does the document ID change after transfer?
No, the document IDs do not change.
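The same flow can be scripted with the gcloud and gsutil CLIs instead of the console; a minimal sketch, assuming hypothetical project IDs, bucket names, and export folder name:
# Export the source project's Firestore data into its bucket
gcloud firestore export gs://source-project-bucket --project=source-project
# Copy the export folder across buckets (replaces the manual download/upload)
gsutil -m cp -r gs://source-project-bucket/2020-01-01T00:00:00_12345 gs://dest-project-bucket
# Import the copied export into the destination project
gcloud firestore import gs://dest-project-bucket/2020-01-01T00:00:00_12345 --project=dest-project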
To move Cloud Firestore data from one project to another, follow these four steps:
Create a Cloud Storage bucket to hold the data from your source project
Export the data from your source project to the bucket.
Give your destination project permission to read from the bucket (see the gsutil sketch below).
Import the data from the bucket into your destination project.
The data in the destination project remains exactly as it was in the source project, and, to answer your question specifically, document IDs do not change after the transfer.
For the detailed process of moving Cloud Firestore data from one project to another, you can follow this document.
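Step 3, granting read access, is the step most often missed; a minimal gsutil sketch, assuming a hypothetical bucket name and destination project number (the exact service-account form used for imports is an assumption here, so verify it against the document above):
# Grant the destination project's Firestore service agent access to the source bucket
gsutil iam ch \
  serviceAccount:service-DEST_PROJECT_NUMBER@gcp-sa-firestore.iam.gserviceaccount.com:roles/storage.admin \
  gs://source-project-bucket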
Firebase Firestore and Google Cloud Firestore are the same thing. Firebase provides the client SDKs and security rules, while Google Cloud mostly provides server-side SDKs. You can view the same database in both the Firebase console and the Google Cloud console.
Also check out this article for detailed information.
Does anyone know how I can manually copy/transfer data from Firestore database to Bigquery using Cloud Shell Terminal?
I did this in the past but I'm unable to find the documentation/video that I used. I find a lot of material stating that once BigQuery is connected to Firebase it should be automatic, but mine is not.
When I ran code in the Cloud Shell Terminal to pull data from Firebase, the collection was copied as a table into a BigQuery dataset. Two tables, "raw_latest" and "raw_changelog", were created.
I'm not sure how to transfer another collection now.
I specifically need to transfer data from a subcollection in the Firestore database.
You can now export data from Cloud Firestore to BigQuery with a Firebase Extension. To bring over all of your previous data, install the extension before running the import: any writes made while the import is running will be lost unless the extension is already installed to capture them.
See: https://firebase.google.com/products/extensions/firestore-bigquery-export
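The extension can be installed from the Firebase console or from the CLI; a minimal sketch, assuming a hypothetical project ID:
# Install the official firestore-bigquery-export extension into your project
firebase ext:install firebase/firestore-bigquery-export --project=my-project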
Firestore also allows exporting data to BigQuery via a GCS bucket: the data is exported to a Cloud Storage bucket and from there it can be loaded into BigQuery.
The gcloud and bq commands for this are:
export the data:
gcloud beta firestore export --collection-ids=users gs://mybucket/users
load the backup into BigQuery:
bq load --source_format=DATASTORE_BACKUP mydataset.users gs://mybucket/users/all_namespaces/kind_users/all_namespaces_kind_users.export_metadata
Here are some links that might be helpful:
https://firebase.google.com/docs/firestore/manage-data/export-import
https://cloud.google.com/bigquery/docs/loading-data-cloud-datastore
https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md
I have installed a Firebase extension, "collections to bigquery", to export the data in my Firebase collection to a BigQuery table. I installed the extension an hour ago and I still cannot see any data in BigQuery. I am new to Firebase, so can anyone tell me how long it usually takes until data finally starts showing up in BigQuery?
Thanks!
The Firestore BigQuery extension will automatically sync future writes to a collection to BigQuery - if your database isn't actively being written to, nothing will happen just from installing the extension.
To bring existing documents into the BigQuery table, see this guide that runs through how to run the import script.
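The import script ships as an npm package; a minimal sketch of invoking it (when run this way it prompts for the project ID, source collection path, BigQuery dataset, and table name prefix):
# Backfill existing Firestore documents into the extension's BigQuery tables
npx @firebaseextensions/fs-bq-import-collection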
I have an existing Google Cloud project which uses Datastore. I'm excited by Firestore and would like to switch. According to https://www.youtube.com/watch?v=SYG-BgXoJFQ it is recommended to create a new project.
Is it possible to just nuke Datastore somehow (I don't care about the data) and start from scratch with Firestore?
If not, what are the implications of creating a new project?
If no data has been written to Cloud Datastore, it's possible to convert from Datastore to Firestore with the following steps. I tried this on two projects and each took about three minutes. The user should be the project owner to attempt this:
Disable the Cloud Datastore API
Disable the Cloud Firestore API
gcloud firestore databases create --region=us-central --project $PROJECT_ID
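The two disable steps can also be done from the CLI before running the create command above; a minimal sketch, assuming $PROJECT_ID is already set:
# Disable both APIs before recreating the database
gcloud services disable datastore.googleapis.com --project $PROJECT_ID
gcloud services disable firestore.googleapis.com --project $PROJECT_ID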
I had another project with a few Datastore entities. I deleted them and executed the steps, but received the following error. While disabling the Datastore API, I saw this warning: "Disable Cloud Datastore API? If any resources were created by Cloud Datastore API, they may be deleted soon after Cloud Datastore API is disabled. All code that uses this project's credentials to call Cloud Datastore API will fail."
ERROR: (gcloud.firestore.databases.create) Error Response: [9] The "database_type" field cannot be modified for this application. Note: If data has already been written for this application, then the "database_type" may not be modified.
In that case, the solution is to contact Google Cloud support to convert from Datastore to Firestore. You need to make sure no entities exist and that nothing is still creating entities.
In the case where no Datastore entities have been written, the operation should succeed:
Success! Selected Google Cloud Firestore Native database for $PROJECT_ID
It is not possible to switch from Datastore to Firestore within the same project as yet, but you may operate Firestore in Datastore mode. By creating a Cloud Firestore database in Datastore mode, you can access Cloud Firestore's improved storage layer while keeping Cloud Datastore system behavior. You may find more information of relevance by reading the "Automatic Upgrade to Cloud Firestore" documentation page.
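Creating a database in Datastore mode can also be done from gcloud; a minimal sketch, assuming a hypothetical project ID and noting that the --type flag's availability depends on your gcloud version:
# Create a Firestore database in Datastore mode (flag support varies by gcloud version)
gcloud firestore databases create --location=nam5 --type=datastore-mode --project=my-project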
How do I automate the process of loading .CSV files from Google Cloud Storage into BigQuery?
Google Cloud Storage provides access and storage log files in CSV format which can be directly imported into BigQuery for analysis. In order to access these logs, you must first set up log delivery and enable logging. The schemas are available online, in JSON format, for both the storage access logs and storage bucket data. More information is available in the Cloud Storage access logs and storage data documentation.
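Enabling usage logging on a bucket can be done with gsutil; a minimal sketch, assuming hypothetical bucket names:
# Deliver usage/storage logs for example-bucket into log-bucket
gsutil logging set on -b gs://log-bucket gs://example-bucket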
In order to load storage and access logs into BigQuery from the command line, use a command such as:
bq load --schema=cloud_storage_usage_schema.json my_dataset.usage_2012_06_18_v0 gs://my_logs/bucket_usage_2012_06_18_14_v0
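To automate recurring loads, that bq load can be wrapped in a small script and run on a schedule (for example from cron or Cloud Scheduler); a minimal sketch, assuming hypothetical bucket, dataset, and schema-file names and the usage-log naming pattern shown above:
#!/bin/bash
# Load yesterday's Cloud Storage usage logs into a date-suffixed BigQuery table
DAY=$(date -d yesterday +%Y_%m_%d)
bq load --skip_leading_rows=1 \
  --schema=cloud_storage_usage_schema.json \
  my_dataset.usage_${DAY}_v0 \
  "gs://my_logs/bucket_usage_${DAY}_*_v0"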