How to push files to Firebase Storage using the Firebase CLI

I have used the Firebase CLI to host a website. Today I tried to push files from my local machine to Firebase Storage using the Firebase CLI, but when I run the command firebase deploy, nothing happens. Can anyone tell me how to push my files to Firebase Storage?

Install gsutil using this tutorial: https://cloud.google.com/storage/docs/gsutil_install#deb
Example for Ubuntu:
echo "deb [signed-by=/usr/share/keyrings/cloud.google.gpg] https://packages.cloud.google.com/apt cloud-sdk main" | sudo tee -a /etc/apt/sources.list.d/google-cloud-sdk.list
sudo apt-get install apt-transport-https ca-certificates -y
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key --keyring /usr/share/keyrings/cloud.google.gpg add -
sudo apt-get update && sudo apt-get install google-cloud-sdk
Login to google cloud
gcloud auth login
Open the displayed link, log in, and paste the verification code back into the console. Then select your project:
gcloud config set project PROJECT_ID
Upload the file. For example:
gsutil cp backup.$(date +%F).gz.gpg gs://PROJECT_ID.appspot.com/backups
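If you want to confirm the upload, you can list the destination path (this assumes the default PROJECT_ID.appspot.com bucket that Firebase projects are given):
# list the objects under the backups prefix
gsutil ls gs://PROJECT_ID.appspot.com/backups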

Related

Can I run Firebase Functions locally?

Is it possible to use firebase-functions from my laptop? If not, is firebase-admin the only option remaining?
Here are some related questions for context:
How can I rent and use my own servers for cloud functions?
Listen only to additions to a cloud firestore collection?
Does Firebase Admin SDK perform any caching?
I am able to create an index.js file on my laptop, npm install the firebase-admin module, connect to my Firestore database, and make changes to data just fine using admin credentials. But when I also npm install firebase-functions and try to make use of the event triggers onCreate/onWrite/onUpdate/onDelete, they never receive any updates.
To my understanding, the only way to make use of event triggers is by uploading the code to Cloud Functions, since you need Google's infrastructure to run them; you can't use them on your local machine the way you can with the firebase-admin package. You can use the local emulator(?), but it isn't production ready and is not meant for that use case(?).
So, in order to listen for new events on my Firestore database using only my laptop (not the Google Cloud Functions platform or some other server-hosted option), I have to use .onSnapshot() from the firebase-admin npm package.
However, that module does not cache, so you are left querying the whole Firestore database and downloading every document.
Is this correct? Or is there any way to make firebase-functions work from my laptop using firebase-admin plus admin credentials, almost as if I had uploaded the file to the cloud platform? I don't require this part of the data to be on the cloud, so I want to make changes and adjust the Firestore database from my laptop's terminal.
You will need to balance scalability against what you want to achieve. One approach uses a Parse Server in a Docker container that works with Express. This method's advantage is its flexibility as to where the Parse Server can run: you can run it on your laptop, or move it to Google Cloud if you need more processing power. However, it is worth noting that the container cannot access all Firebase trigger types.
I am not sure which operating system you are using, but on Ubuntu you can install Docker and run Parse Server like this:
# Update the apt package index first
$ sudo apt-get update
# install dependencies
$ sudo apt-get install \
apt-transport-https \
ca-certificates \
curl \
gnupg-agent \
software-properties-common \
git
# Add Docker’s GPG key:
$ curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
# add the Docker repository
$ sudo add-apt-repository \
"deb [arch=amd64] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) \
stable"
# Update the apt package index again
$ sudo apt-get update
# Install Docker
$ sudo apt-get install docker-ce docker-ce-cli containerd.io
$ git clone https://github.com/parse-community/parse-server
$ cd parse-server
$ docker build --tag parse-server .
$ docker run --name my-mongo -d mongo
To run the Parse Server:
$ docker run --name my-parse-server -v config-vol:/parse-server/config \
-p 1337:1337 --link my-mongo:mongo -d parse-server --appId APPLICATION_ID \
--masterKey MASTER_KEY --databaseURI mongodb://mongo/test
To link Firebase and the Dockerized Parse Server you will need an adapter. The container above is only an example, but it should be enough to get you started running from your laptop.
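As a quick smoke test of the container, you can create an object through the Parse REST API; this is a sketch assuming the default /parse mount path and the APPLICATION_ID passed above:
# create a test object via the Parse REST API
curl -X POST \
  -H "X-Parse-Application-Id: APPLICATION_ID" \
  -H "Content-Type: application/json" \
  -d '{"score": 1337}' \
  http://localhost:1337/parse/classes/TestObject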

How to mount Cloud Filestore in GCP AI platform Jupyter notebook?

I want to mount a Cloud Filestore instance in a GCP AI Platform Jupyter notebook instance so that I don't have to upload all of my data into the notebook.
I followed the instructions at https://cloud.google.com/filestore/docs/mounting-fileshares, but get these error messages:
root@0084329abd1b:/home# mount <IP_ADDRESS>:/streams cfs
mount.nfs: rpc.statd is not running but is required for remote locking.
mount.nfs: Either use '-o nolock' to keep locks local, or start statd.
root@0084329abd1b:/home# mount -o nolock <IP_ADDRESS>:/streams cfs
mount.nfs: Operation not permitted
From your terminal, you can do something like this.
mkdir des_bucket
gcsfuse --debug_gcs --implicit-dirs src_bucket des_bucket
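Note that gcsfuse mounts a Cloud Storage bucket rather than a Filestore share. When you are done, you can unmount it like any FUSE filesystem on Linux:
# unmount the gcsfuse mount point
fusermount -u des_bucket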
Create a Filestore instance link
Create a Google VM instance link
Create a Notebook AI instance link
On the VM instance run the commands:
sudo apt-get -y update
sudo apt-get -y install nfs-common
sudo mkdir test
# fileshare remote target
sudo mount 111.11.111.11:/fileshare test
sudo chmod go+rw test
echo 'This is a test' > test/testfile
ls test
#testfile
On the Notebook AI instance run the commands link:
sudo apt-get -y update
sudo apt-get -y install nfs-common
sudo mkdir test
# fileshare remote target
sudo mount 111.11.111.11:/fileshare test
ls test
#testfile
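If you want the share to come back after a reboot, a standard NFS entry in /etc/fstab works; this is a sketch where /mnt/fileshare is a hypothetical absolute mount point and 111.11.111.11 is the placeholder Filestore IP from above:
# create a persistent mount point and register it in fstab
sudo mkdir -p /mnt/fileshare
echo '111.11.111.11:/fileshare /mnt/fileshare nfs defaults 0 0' | sudo tee -a /etc/fstab
sudo mount -a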
You can also check this link.

How to persist/keep sqlite database in docker container application? [duplicate]

I'm new to Docker. Is it possible to embed a sqlite database in a docker container and have it updated every time my script in that container runs?
Dockerfile example to install sqlite3
FROM ubuntu:trusty
# no sudo needed: Dockerfile RUN steps already execute as root
RUN apt-get -y update
RUN apt-get -y upgrade
RUN apt-get install -y sqlite3 libsqlite3-dev
RUN mkdir /db
# run a no-op statement so sqlite3 actually writes the database file
RUN /usr/bin/sqlite3 /db/test.db "VACUUM;"
CMD /bin/bash
Persist the db file inside the host OS folder /home/dbfolder:
docker run -it -v /home/dbfolder/:/db imagename
If you want to persist the SQLite data, use a host directory or file as a data volume. Refer to the "Mount a host directory as a data volume" section in
https://docs.docker.com/storage/volumes/
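Alternatively, a Docker named volume keeps the database file under Docker's management rather than in a host folder (a sketch; dbvol is an arbitrary volume name):
# create a named volume and mount it over /db
docker volume create dbvol
docker run -it -v dbvol:/db imagename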

kibana-time-plugin downloaded but bower install not working and even kibana not working

I wanted a widget to view and edit the time range from within Kibana dashboards. After a lot of research I found the kibana-time-plugin. Ref: https://github.com/nreese/kibana-time-plugin
I am currently using Kibana 5.4.0 locally. After installing the plugin I ran bower install, as specified on the GitHub page, but I get this error:
$ bower install
/usr/bin/env: ‘node’: No such file or directory
Kibana itself also fails to start, showing the error in the attached image:
kibana5.4.0
Can anyone guide me on this? Thanks in advance!
I think the optimization failures may be due to file permissions; the plugin files need to be accessible by the kibana user. Specifically, check this instruction:
Installing plugins with Linux packages
Here is a complete script that worked for me. I am new to Kibana and Kibana plugins, so any feedback is appreciated. Two important notes:
1) I am pulling the zip file from S3, so you will need to edit that part.
2) Be sure to restart Kibana afterwards and check the logs.
#!/bin/bash
# install nodejs and npm
sudo curl --silent --location https://rpm.nodesource.com/setup_6.x | sudo bash -
sudo yum install -y nodejs
sudo npm install -g bower
# copy the plugin zip and unzip it and fix the name
cd /usr/share/kibana/plugins
sudo aws s3 cp s3://<YOUR-BUCKET>/kibana-time-plugin-master.zip .
sudo unzip kibana-time-plugin-master.zip
sudo mv kibana-time-plugin-master kibana-time-plugin
# install the plugin
cd /usr/share/kibana/plugins/kibana-time-plugin
sudo sed -i -e 's/5.0.0/5.4.2/' package.json
sudo chown -R kibana:kibana *
sudo mkdir -p /home/kibana
sudo chown -R kibana:kibana /home/kibana
sudo -u kibana bower install
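For note 2, on a systemd-based system (an assumption; adjust for your init system) the restart and log check look like this:
# restart kibana and watch the logs for optimization errors
sudo systemctl restart kibana
sudo journalctl -u kibana -f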

How to run the gcloud Datastore emulator in Travis CI?

I'm having some problems running gcloud's Datastore emulator in Travis CI. I am currently running it like this:
script:
- export CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s)"
- echo "deb http://packages.cloud.google.com/apt $CLOUD_SDK_REPO main" | sudo tee -a /etc/apt/sources.list.d/google-cloud-sdk.list
- curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
- sudo apt-get update && sudo apt-get install google-cloud-sdk
- nohup gcloud beta emulators datastore start &
But this seems less than ideal.
It is not clear what is wrong with this setup; as you say, it is 'less than ideal', which indicates that it does work.
If you want the setup steps to be cleaner, you can install google-cloud-sdk directly, because it is whitelisted by Travis:
dist: trusty
addons:
  apt:
    packages:
      - google-cloud-sdk
before_script:
  - gcloud beta emulators datastore start &
  - $(gcloud beta emulators datastore env-init)
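The env-init line evaluates the export statements the emulator prints, so later build steps talk to the emulator instead of the real Datastore. As far as I know, DATASTORE_EMULATOR_HOST is among the variables it sets, so you can sanity-check it in a later script step:
# confirm the emulator environment is in effect
echo $DATASTORE_EMULATOR_HOST   # e.g. localhost:8081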
