Creating a firebase database duplicate - firebase

I am trying to create a replica of my main database for the sake of testing. However, I am having a hard time figuring out how to do that.
What I have tried is exporting the entire main-database into a bucket. Then I downloaded the 2022-10-24T16-etc.overall_export_metadata file from that bucket and uploaded it to a bucket for test-database. However, when I try to import that file, I get an error:
Google Cloud Storage file does not exist: /database-copy/database-copy.overall_export_metadata
I'm a little confused as to why it's looking for /database-copy/database-copy.overall_export_metadata when the file I'm trying to upload looks more like /database-copy/2022-10-24T16-etc.overall_export_metadata.
Any help would be appreciated. Thanks!

I just found a document that explains how to do this:
https://cloud.google.com/firestore/docs/manage-data/move-data
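For anyone hitting the same error: the import appears to expect the metadata file to be named after the last path component of the prefix you point it at (hence it looks for /database-copy/database-copy.overall_export_metadata), so rather than copying the .overall_export_metadata file on its own, point the import at the untouched export folder. Below is a minimal sketch of driving the export and import from Node.js with the Firestore admin client; the project IDs, bucket and folder names are placeholders, not taken from the question.

// Minimal sketch, assuming the @google-cloud/firestore package and credentials
// with the Firestore import/export permissions. Names below are placeholders.
const firestore = require('@google-cloud/firestore');
const client = new firestore.v1.FirestoreAdminClient();

async function copyDatabase() {
  // Export the source project's default database into a bucket.
  const [exportOp] = await client.exportDocuments({
    name: client.databasePath('main-project-id', '(default)'),
    outputUriPrefix: 'gs://main-project-export-bucket/2022-10-24-export',
    collectionIds: [],                 // empty array = all collections
  });
  await exportOp.promise();            // wait for the export to finish

  // Import the whole export folder into the test project. inputUriPrefix must
  // point at the folder that contains the *.overall_export_metadata file,
  // left exactly as the export wrote it.
  const [importOp] = await client.importDocuments({
    name: client.databasePath('test-project-id', '(default)'),
    inputUriPrefix: 'gs://main-project-export-bucket/2022-10-24-export',
  });
  await importOp.promise();
}

copyDatabase().catch(console.error);

Note that for a cross-project import the test project's service account also needs read access to the export bucket; the linked document describes the permissions required.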

Related

Does admin.storage.object contain/process multiple files if multiple files are uploaded to Firebase Storage?

I am trying to move files into separate folders in Firebase Storage once they have been uploaded. As it turns out, you cannot achieve this with the JavaScript Web Client SDK for Storage. However, it appears that you could do so with the Admin SDK for Storage using Firebase Functions. So that is what I am trying to do. I understand that you need to first download a file into your Firebase Functions environment and then re-upload it into a new folder in Storage.
To download a file, I need to pass its reference from the client, and here is where it gets confusing to me. I am currently getting all the uploaded files in the client via the listAll() function, which returns items and prefixes. I am wondering whether I can use either the items or the prefixes to download the files in Firebase Functions. Alternatively, I can pass the URLs. The question is, which of these should I pass, and how do I use it to download the files in Functions afterwards?
I know of admin.storage.object as explained in https://firebase.google.com/docs/storage/extend-with-functions#trigger_a_function_on_changes. However, does it handle multiple files? In other words, the object, as I understand it, is one file that is uploaded to Storage, and you can use its attributes such as object.bucket or object.name to access more information. But what if multiple files are uploaded at the same time, does it handle them one by one? Also, if I am passing the references or URLs of the files that need to be downloaded from the client, is admin.storage.object the right choice? It seems to simply process all files uploaded to Storage, instead of getting any references from the client.
Further, there is a description of how to download a file (https://firebase.google.com/docs/storage/extend-with-functions#example_image_transformation) which is this code: await bucket.file(filePath).download({destination: tempFilePath});
I understand that the filePath is basically the name of the file that is already in Storage (e.g. /someimage). But what if there are other files with the same name? Might the wrong file be downloaded? And how do I make sure that the filePath is the file that I passed from the client?
Let me know what your thoughts are and whether or not I am heading in the right direction. If you include code in your answer, please write it in JavaScript for the Web. Thank you.
Here are some points that could help:
In GCP Storage there are technically no folders; GCS emulates a directory structure by using / in the names of objects.
When you set up a Cloud Function triggered by a GCS object change, each object change is an event, and each event triggers one invocation of the function (you might have a bucket for unprocessed files that triggers the function, and move them to a different bucket once processed); see the sketch after these points.
You might consider using the REST API to move/copy/rename the objects without having to download them.
As a side note, the question is a little too broad; hopefully these points help clarify things for you.
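To make the first two points concrete, here is a minimal sketch of a Storage-triggered Cloud Function that is invoked once per uploaded object and moves it into another folder without downloading it first. The folder names and the firebase-functions v1 trigger syntax are assumptions for illustration, not something taken from your setup.

// Runs once for every object finalized in the bucket; if ten files are uploaded,
// the function is invoked ten times, each time with a single `object`.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.sortUpload = functions.storage.object().onFinalize(async (object) => {
  const filePath = object.name;                  // e.g. "uploads/someimage.png"
  if (!filePath.startsWith('uploads/')) return;  // skip files that are already sorted

  const bucket = admin.storage().bucket(object.bucket);
  const destination = filePath.replace('uploads/', 'processed/');

  // move() copies the object to the new name and deletes the original,
  // so nothing needs to be downloaded to the Functions instance.
  await bucket.file(filePath).move(destination);
});

Note that a trigger like this never receives anything from the client; it reacts to whatever lands in the bucket. If the client should decide which files get moved (for example from the paths returned by listAll()), an HTTPS or callable function that accepts those paths and calls move() on each would be closer to what you describe.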

Accessing files from Google cloud storage in RStudio

I have been trying to create a connection between Google Cloud Storage and an RStudio server (the one I spun up in Google Cloud), so that I can access the files in R to run some analysis on them.
I have found three different ways to do it on the web, but I don't see much clarity around these ways so far.
Access the file by using the public URL specific to the file [This is not an option for me]
Mount the Google Cloud Storage bucket as a disk on the RStudio server and access it like any other file on the server [I saw someone post about this method but could not find any guides or materials that show how it's done]
Using the googleCloudStorageR package to get full access to the Cloud Storage bucket.
Step 3 looks like the pretty standard way to do it, but I get the following error when I run the gcs_auth() command:
Error in gar_auto_auth(required_scopes, new_user = new_user, no_auto = no_auto, : Cannot authenticate -
options(googleAuthR.scopes.selected) needs to be set to include
https://www.googleapis.com/auth/devstorage.full_control or
https://www.googleapis.com/auth/devstorage.read_write or
https://www.googleapis.com/auth/cloud-platform
The guide on how to connect using this is found at
https://github.com/cloudyr/googleCloudStorageR
but it says it requires a service-auth.json file to set the environment variables, along with other keys and secret keys, and it does not really specify what these are.
If someone could help me understand how this is actually set up, or point me to a good guide on setting up the environment, I would be very grateful.
Thank you.
Before using any Google Cloud services you have to attach your card.
So, I am assuming that you have created the account. After creating the account, go to the Console; if you have not created a project yet, create one, then in the sidebar find APIs & Services > Credentials.
Then,
1) Create a service account key and save the file as JSON; you can only download it once.
2) Create an OAuth 2.0 client ID, give the name of the app, select the type as web application, and download the JSON file.
Now, for Storage, go to the sidebar, find Storage and click on it.
Create a bucket and give it a name.
I have added a single image to the bucket; you can add one too for the purposes of the code below.
Let's look at how to download this image from Storage; for other things you can follow the link that you have given.
First create an environment file named .Renviron in your working directory so that the JSON files are picked up automatically.
In the .Renviron file, reference the two downloaded JSON files like this:
GCS_AUTH_FILE="serviceaccount.json"
GAR_CLIENT_WEB_JSON="Oauthclient.json"
# R part
library(googleCloudStorageR)
library(googleAuthR)

# set the OAuth client and scopes before authenticating
gar_set_client(scopes = c("https://www.googleapis.com/auth/devstorage.read_write",
                          "https://www.googleapis.com/auth/cloud-platform"))
gcs_auth()  # authenticate using the service account file from GCS_AUTH_FILE

gcs_get_bucket("your_bucket_name")     # name of the bucket that you have created
gcs_global_bucket("your_bucket_name")  # set it as the global bucket
gcs_get_global_bucket()                # check that your bucket is set as global; it should print your bucket name

objects <- gcs_list_objects()          # list the objects in the bucket
names(objects)
gcs_get_object(objects$name[[1]], saveToDisk = "abc.jpeg")  # save the first object to disk

Note: if the JSON files do not get loaded, restart the session using .rs.restartR() and check with
Sys.getenv("GCS_AUTH_FILE")
Sys.getenv("GAR_CLIENT_WEB_JSON")
# both should show the file names
You probably want the FUSE adapter - this will allow you to mount your GCS bucket as a directory on your server.
Install gcsfuse on the R server.
Create a mount directory, e.g. mkdir /path/to/mnt.
Run gcsfuse your-bucket /path/to/mnt.
Be aware though that read/write performance isn't great via FUSE.
Full documentation:
https://cloud.google.com/storage/docs/gcs-fuse

Alfresco accepting only .txt files while uploading and giving 500 internal server error for other files

Today I found very weird behavior in Alfresco. When I upload any .txt file via the Share UI, it gets uploaded successfully, but if I upload any other type of file, it gives a 500 internal server error, as shown in the attached screenshot; you can also see in the image that the .txt file got uploaded successfully.
The strange thing is that there is no error in the server logs.
Has anyone faced a similar issue?
Also, since it works for .txt, is it an issue with transformation?
Please suggest possible causes of the error.
Thanks in Advance.
Error while uploading through CMIS workbench:
It does sound like a transformation problem to me, but it is hard to be sure.
Because TXT files are working, that means there isn't a problem with your repo being read-only or something like that. If these were office files, especially if they were large, you might be hitting a configurable transformer limit.
I would try uploading the problem files using something other than Share, such as:
Alfresco FTP
Alfresco WebDAV
Alfresco CIFS/SMB
Apache Chemistry Workbench
Using any (or all) of these will give you a clue as to whether the problem is in Share or lower in the stack.

How to add a push key when manually entering data via the Firebase console?

I would like to create data in a new Firebase database and I need to include push keys. I know I can import data via a JSON file, but I do not know how to create a push key in a JSON file.
I have tried using the Firebase console data entry tool but could not figure out how to do it. Googling the problem did not turn up any results.
If anyone can help I would prefer to use the firebase data entry tool.
You can write any key; just be sure that your key is not duplicated, then upload your JSON file. There is no problem doing that. If you want keys in the same format the SDK generates, see the sketch below.
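One option is to generate real push keys with push() in a throwaway script and paste them into your JSON before importing. A minimal sketch, assuming the namespaced (v8-style) web SDK; the "items" path and field name are just examples:

// Generates a push key locally; nothing is written until set() is called.
const ref = firebase.database().ref('items');
const newRef = ref.push();   // reference with a fresh push key
console.log(newRef.key);     // copy this key into your JSON, e.g. "items": { "<key>": { ... } }
// or skip the JSON import entirely and write directly:
// newRef.set({ name: 'example item' });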

kibana 5 upload CSV

I am trying to use Kibana 5 to upload CSV.
As per this link - https://www.elastic.co/blog/kibana-5-0-0-alpha4 - there is a provision to upload a CSV file in Kibana 5.
I tried to access the CSV upload under Management as mentioned in the post, and I am looking for an 'Upload CSV' option or some way to add a file.
Can anyone help with this? I am not seeing the provision to add a CSV file there.
See https://github.com/elastic/kibana/pull/8497. Apparently they didn't think it was production ready. Real shame because I think it would work for 99% of the cases (as far as I fiddled around with it).
As per the link you've provided in the question, it seems you need Kibana 5.0.1 in order to upload any CSV.
