How can I add the --cert flag when my project runs on Deno Deploy?
I am using Deno Fresh and Supabase Postgres. In my local environment I pass the --cert flag to set the certificate, but I don't know how to set the --cert flag on Deno Deploy.
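For reference, this is roughly what the local invocation looks like; a minimal sketch, assuming the app's entry point is main.ts and the CA bundle is prod-ca-2021.crt (both names are illustrative):

$ deno run --allow-net --allow-read --cert ./prod-ca-2021.crt main.ts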
I'm developing some Firestore security rules locally. I use mocha to test the rules, and locally everything works. I have a Jenkins pipeline that publishes the rules to Firebase in the cloud every time I merge a PR into develop. What I want to do now is run my unit tests within Jenkins. However, every time Jenkins calls yarn test from the pipeline, I get an error that says
@firebase/firestore: Firestore (7.18.0): Could not reach Cloud Firestore backend. Connection failed 1 times. Most recent error: FirebaseError: [code=internal]: 13 INTERNAL: Received RST_STREAM with code 2 triggered by internal client error: Protocol error
This typically indicates that your device does not have a healthy Internet connection at the moment. The client will operate in offline mode until it is able to successfully connect to the backend.
Is there a way to run the firebase emulators from Jenkins?
Thanks!
I found a way to do that.
Using the firebase-tools-docker image, I can easily run my tests inside a Docker container that brings up the emulator suite.
The Jenkinsfile goes like this:
// UID to run the container as (matches the Jenkins user on the host)
def jenkinsUser = 1001
def firebaseDocker = 'andreysenov/firebase-tools:9.14.0'

stage('Pull docker image') {
    sh "docker pull $firebaseDocker"
}

stage('Unit tests') {
    // Start the emulator suite detached (-d), mounting the workspace so
    // the container sees firebase.json and the tests directory.
    sh "docker run -d --rm \
        --user $jenkinsUser:$jenkinsUser \
        -p 8080:8080 \
        -v ${pwd()}:/home/node \
        --name firebase-emulators \
        $firebaseDocker \
        firebase emulators:start"

    // Give the emulators a few seconds to boot before running the tests.
    sleep(5)

    sh "docker exec firebase-emulators /bin/bash -c 'cd tests && yarn test'"
    sh "docker stop firebase-emulators"
}
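The -p 8080:8080 mapping assumes the Firestore emulator listens on port 8080 inside the container. If your firebase.json doesn't already pin that, this is a minimal sketch of the relevant section (host 0.0.0.0 is needed so the published port is reachable from outside the container):

{
  "emulators": {
    "firestore": {
      "host": "0.0.0.0",
      "port": 8080
    }
  }
}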
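As a side note, firebase-tools also ships emulators:exec, which starts the emulators, runs a command against them, and shuts everything down again. That would replace the fixed sleep(5) and the separate exec/stop steps; a hedged sketch using the same image:

$ docker run --rm --user 1001:1001 \
    -v "$(pwd)":/home/node \
    andreysenov/firebase-tools:9.14.0 \
    firebase emulators:exec --only firestore "cd tests && yarn test"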
Hope this helps 😉
Is it possible to use firebase-functions from my laptop? If not, is firebase-admin the only option remaining?
Here are some examples:
How can I rent and use my own servers for cloud functions?
Listen only to additions to a cloud firestore collection?
Does Firebase Admin SDK perform any caching?
I am able to create an index.js file on my laptop, npm install the firebase-admin module, connect to my Firestore database, and make changes to data just fine using admin credentials. But when I also npm install firebase-functions and try to use the event triggers onCreate/onWrite/onUpdate/onDelete, they never receive any updates.
To my understanding, the only way to make use of the event triggers is by deploying to Cloud Functions, since you need Google's infrastructure to use them and you can't run them on your local machine, as you can with the firebase-admin package. You can use the local emulator(?), but it isn't production ready and is not meant for that use case(?).
So, in order to listen for new events in my Firestore database using only my laptop (not the Google Cloud Functions platform or some other server-hosted option), I have to use .onSnapshot() from the firebase-admin npm package.
However, that module is unable to cache, so you are left querying the whole Firestore database and downloading every document.
Is this correct? Or is there any way to make firebase-functions work from my laptop using firebase-admin + admin credentials, almost as if I had uploaded the file to the cloud platform? I don't require this part of the data to be in the cloud, so I want to make changes and adjust the Firestore database from my laptop's terminal.
You will need to balance scalability and what you want to achieve. One approach uses a Parse Server and a Docker container that works with Express. This method's advantage is its flexibility as to where the parse server can run. You can run this on your laptop or move it to Google Cloud if you need more processing power. However, it is worth noting that the container cannot access all Firebase trigger types.
I am not sure which operating system you are using, but on Ubuntu you can install Docker and run Parse Server like this:
# Update the apt package index first
$ sudo apt-get update
# install dependencies
$ sudo apt-get install \
apt-transport-https \
ca-certificates \
curl \
gnupg-agent \
software-properties-common \
git
# Add Docker’s GPG key:
$ curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
# add the Docker repository
$ sudo add-apt-repository \
"deb [arch=amd64] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) \
stable"
# Update the apt package index again
$ sudo apt-get update
# Install Docker
$ sudo apt-get install docker-ce docker-ce-cli containerd.io
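# (Optional sanity check, not part of the original steps: confirm Docker runs)
$ sudo docker run hello-world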
# Get the Parse Server source and build the image
$ git clone https://github.com/parse-community/parse-server
$ cd parse-server
$ docker build --tag parse-server .

# Start a MongoDB container for Parse Server to use as its database
$ docker run --name my-mongo -d mongo
To run the Parse Server:
$ docker run --name my-parse-server -v config-vol:/parse-server/config \
-p 1337:1337 --link my-mongo:mongo -d parse-server --appId APPLICATION_ID \
--masterKey MASTER_KEY --databaseURI mongodb://mongo/test
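To confirm the server is reachable, you can query its REST API (the class name below is illustrative; an empty result set still counts as success):

$ curl -H "X-Parse-Application-Id: APPLICATION_ID" \
    -H "X-Parse-Master-Key: MASTER_KEY" \
    http://localhost:1337/parse/classes/TestObject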
To link Firebase and the Docker Parse Server, you will need an adapter. The container above is an example, but it should be enough to get you started running from your laptop.
I'm trying to set some variables on Dokku for deployment. As far as I can see from the dev files, one should create a .env file in the directory and put the variables in there. But this is not updating anything.
.env file
DOKKU_NGINX_PORT=3000
MYSQL_URL=http://blabla
MYSQL_USER=mysqluser
I'm trying to map the port of the app to port 3000, and inject the mysql vars into the runtime environment.
I know I can set it with dokku config:set on the server, but I want to be able to automate it during deployment.
Any ideas? Or an example?
You'll need to install a Dokku client, or CLI, in order to interact locally with the remote application on your Dokku instance.
Here are a few options:
(node.js) dokku-toolbelt
Dokku toolbelt is a node-based CLI wrapper that proxies requests to
the Dokku command running on remote hosts.
You can install it via the following shell command (assuming you have node and npm installed):
$ npm install -g dokku-toolbelt
See documentation here for more information.
(python) dokku-client
Dokku client is an extensible python-based cli wrapper for remote
Dokku hosts.
You can install it via the following shell command (assuming you have python and pip installed):
$ pip install dokku-client
See documentation here for more information.
(ruby) Dokku CLI
Dokku CLI is a rubygem that acts as a client for your Dokku
installation.
You can install it via the following shell command (assuming you have ruby and rubygems installed):
$ gem install dokku-cli
See documentation here for more information.
After the Dokku client is installed locally, make sure that the dokku app remote is set inside the repository directory.
You can verify this by running $ git remote -v.
If the output doesn't show your dokku application instance, set it with the following command:
$ git remote add dokku dokku@example.com:your-app-name
Here's an example from my terminal with some information redacted for security purposes.
seth@linuxmint ~/repos/Adopt-a-Pet $ git remote -v
dokku dokku@example.com:adopt-a-pet (fetch)
dokku dokku@example.com:adopt-a-pet (push)
origin https://github.com/sethbergman/Adopt-a-Pet.git (fetch)
origin https://github.com/sethbergman/Adopt-a-Pet.git (push)
Then you can set environment variables with the following command:
$ dokku config:set DOKKU_NGINX_PORT=3000
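Since your .env also defines MYSQL_URL and MYSQL_USER, note that config:set accepts several KEY=VALUE pairs in one call:

$ dokku config:set DOKKU_NGINX_PORT=3000 MYSQL_URL=http://blabla MYSQL_USER=mysqluser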
You can optionally set environment variables with the .env file:
$ dokku config:set:file <path/to/.env>
If the .env file is in the root directory of the repository, then the command would be:
$ dokku config:set:file .env
If you're using Ruby, you can use the dokku-cli gem mentioned above. With it, you can set config from any file by issuing the command
dokku config:set:file <path/to/file>
See the Ruby documentation for more information.
We have built our first Node.js app and I want to integrate Jenkins for continuous integration. We are running the Node server behind Nginx as a proxy, with source control in GitLab. I need example configurations or steps.
I am looking for any doc or wiki link, or if you can point me in the right direction it will be helpful.
I have a CentOS server and managed to install and configure Jenkins, but I haven't found the proper way to connect it to my GitLab server. I also need to run npm commands after each build. If anyone has already done that, please let me know.
Thanks
Your question is still vague, but I will describe here how I set up Jenkins, Node.js and GitLab integration. I have tested this on CentOS 6.
Steps
OpenJDK (Java) should be installed beforehand.
sudo wget -O /etc/yum.repos.d/jenkins.repo http://pkg.jenkins-ci.org/redhat-stable/jenkins.repo
sudo rpm --import https://jenkins-ci.org/redhat/jenkins-ci.org.key
sudo yum install jenkins
sudo service jenkins start
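Jenkins listens on port 8080 by default; a quick optional check that it came up (these verification commands are my addition, not part of the original steps):

sudo service jenkins status
curl -I http://localhost:8080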
Login as jenkins
sudo -s -H -u jenkins
Now generate the SSH key in the folder /var/lib/jenkins/.ssh and copy that key to GitLab
ssh-keygen
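The public half is what you paste into GitLab (under your user's SSH keys or the project's deploy keys); assuming the default file name:

cat /var/lib/jenkins/.ssh/id_rsa.pub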
Install the Gitlab Hook Plugin and GitLab Plugin in Jenkins.
Create a new item (project) by accessing Jenkins in your browser, and add the Git repo URL.
After creating the project, go to its Configure page (left-side menu).
Most options there are self-explanatory - set up the Git repo URL and the git browser URL.
Under Build Triggers, check the option
Build when a change is pushed to GitLab
and note the GitLab CI Service URL shown beside it.
Paste that URL in your GitLab repo's webhooks in Settings.
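With the GitLab plugin, that service URL typically has the form below (host, port and job name are illustrative):

http://your-jenkins-host:8080/project/your-project-name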
To run npm commands after the build, there is a section called SSH Publishers.
In its Exec command field, put your commands (I have put my example; you can write your own):
cd project_dir
rm -rf public server package.json
tar -xvf projectname.tgz
ls
npm install --production
export NODE_ENV=production
forever restartall
jasmine-node spec/api/frisbyapi_spec.js
rm -rf projectname.tgz
I have written out most of the steps that I took to set up Jenkins, Node.js and GitLab.
I might have forgotten a step; if you face any error, please post that as well.
It's unclear to me how to get my build files from GitLab CI (hosted on https://ci.gitlab.com) over to my personal server using rsync.
I have set up 1 test and 1 deploy job.
Under the deploy tab I have entered the bash commands to:
install rsync,
update packages,
and finally run the rsync command to transfer files over SSH to my personal server.
When I enter the SSH credentials (with the verbose flag on) for my private personal server, it would appear that the SSH key is the issue. In GitLab, I have already established the deploy key (for hooks - I tested this and it works).
Where do I locate the public SSH key for the GitLab deploy instance so that I can install that key on my server?
Below is the exact script entered in Gitlab CI deploy job script pane:
# Run as root
(
set -e
set -u
set -x
apt-get update -y
apt-get -y install rsync
)
git clone https://github.com/bla/deployments.git $HOME/deploy/deployments
SVR_WEB1_WEBSERVER="000.11.22.333"
USER1="franklin"
GROUP1="team1"
FROM_DIR="/gitlab-ci-runner/tmp/builds/myrepo-1/"
DEST1="subdomains/gitlab/myrepo"
EXCLUSIONS_LIST="${HOME}/deploy/deployments/exclusions/exclusions.txt"
ssh -v "$USER1@$SVR_WEB1_WEBSERVER"
/usr/bin/rsync -avzh --progress --delete -e ssh --group=$GROUP1 -p --exclude-from "$EXCLUSIONS_LIST" "$FROM_DIR" "$USER1#$SVR_WEB1_WEBSERVER:$DEST1"
Providing your private SSH key is dangerous unless you use your own gitlab-ci runners for deployment. That's why it is better to use rsync modules.
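A minimal sketch of the module approach, with illustrative names: the personal server exposes a module through the rsync daemon and protects it with a secrets file, and the CI job authenticates with a password held in a CI variable (DEPLOY_PASSWORD here is hypothetical) instead of a private key on disk:

# /etc/rsyncd.conf on the personal server
[myrepo]
    path = /var/www/subdomains/gitlab/myrepo
    read only = false
    auth users = deploy
    # /etc/rsyncd.secrets contains "deploy:PASSWORD" and must be mode 600
    secrets file = /etc/rsyncd.secrets

# In the CI deploy job: the double colon means "talk to the rsync daemon, not SSH"
RSYNC_PASSWORD="$DEPLOY_PASSWORD" /usr/bin/rsync -avzh --progress --delete \
    --exclude-from "$EXCLUSIONS_LIST" \
    "$FROM_DIR" deploy@$SVR_WEB1_WEBSERVER::myrepo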