TCL command that waits for a string - networking

I am working on a script to upgrade firmware on a network switch. There are two commands that need to be run, but the second has to wait for the first to finish. I have used the after command, but I am just guessing at the timing. Is there a way that when the switch comes back with 'TFTP to Flash Done.' the script moves on to the next command?
Here is my script so far:
CLI enable
CLI $deviceLogin
CLI $deviceEnablePwd
CLI copy tftp flash $tftpIP kxz10105.bin bootrom
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
after 120000
CLI copy tftp flash $tftpIP ICX64S08030h.bin primary
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
CLI !
after 180000
CLI show clock
CLI reload after 00:00:01
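One way to move on as soon as the switch prints the completion banner, instead of guessing with after, is an Expect-style pattern match in Tcl. The sketch below does not use the CLI helper from the script above (its internals are not shown here); it assumes a plain Expect session over SSH, and $deviceIP, the prompt patterns, and the 10-minute timeout are placeholder assumptions to adapt.
package require Expect          ;# or run the file with the expect interpreter
# Open an interactive session to the switch ($deviceIP is a placeholder).
spawn ssh $deviceLogin@$deviceIP
expect ">"                      ;# assumed user-exec prompt
send "enable\r"
expect "assword:"               ;# matches "Password:" / "password:"
send "$deviceEnablePwd\r"
expect "#"
send "copy tftp flash $tftpIP kxz10105.bin bootrom\r"
# Wait for the completion banner rather than sleeping a fixed time;
# give up after 10 minutes.
set timeout 600
expect {
    "TFTP to Flash Done." {}
    timeout { puts "bootrom copy timed out"; exit 1 }
}
send "copy tftp flash $tftpIP ICX64S08030h.bin primary\r"
expect {
    "TFTP to Flash Done." {}
    timeout { puts "primary image copy timed out"; exit 1 }
}
send "show clock\r"
expect "#"
send "reload after 00:00:01\r"
expect eof
The key point is matching the literal string with a timeout branch rather than sleeping a fixed interval; the same pattern also works over telnet or a console server.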

Related

How do you access Airflow Web Interface?

Hi, I am taking a DataCamp class on how to use Airflow, and it shows how to create DAGs once you have access to an Airflow web interface.
Is there an easy way to create an account in the Airflow web interface? I am very lost on how to do this. Or is this just an enterprise tool where they provide you access once you pay?
You must do this in a terminal. Run these commands:
export AIRFLOW_HOME=~/airflow
AIRFLOW_VERSION=2.2.5
PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
airflow standalone
Then, in the output of that command, you can see the username and password that were generated.
Then open a browser and go to:
localhost:8080
and enter the username and password.
Airflow also has a web interface by default, and the default username/password is airflow/airflow.
You can run it using:
airflow webserver --port 8080
Then open the link: http://localhost:8080
If you want to create a new user, use this command:
airflow create_user [-h] [-r ROLE] [-u USERNAME] [-e EMAIL] [-f FIRSTNAME]
[-l LASTNAME] [-p PASSWORD] [--use_random_password]
Learn more about Running Airflow locally.
You should install it; it is a Python package, not a website to register on.
The easiest way to install Airflow is:
pip install apache-airflow
If you need extra packages with it:
pip install apache-airflow[postgres,gcp]
Finally, run the webserver and the scheduler in separate terminals:
airflow webserver # defaults to port 8080
airflow scheduler

Can I run Firebase Functions locally?

Is it possible to use firebase-functions from my laptop? If not, is firebase-admin the only option remaining?
Here are some examples:
How can I rent and use my own servers for cloud functions?
Listen only to additions to a cloud firestore collection?
Does Firebase Admin SDK perform any caching?
I am able to make an index.js file on my laptop, npm install the firebase-admin module, link to my Firestore database, and make changes to data just fine using admin credentials. When I also npm install firebase-functions and try to make use of the event triggers onCreate/onWrite/onUpdate/onDelete, they do not get any updates.
To my understanding, the only way to make use of event triggers is by uploading to Cloud Functions, since you need Google's infrastructure for those and you can't use them on your local machine, as you can with the firebase-admin package. You can use the local emulator(?), but it isn't production ready and is not for that use case(?).
So, in order to listen for new events on my Firestore database using only my laptop (not the Google Cloud Functions platform or some other server-hosted option), I have to use .onSnapshot() from the firebase-admin npm package.
However, that module is unable to cache, and you are left querying the whole Firestore database, downloading every document.
Is this correct? Or is there any way to make firebase-functions work from my laptop using firebase-admin + admin credentials, almost as if I had uploaded the file to the cloud platform? I don't require this part of the data to be in the cloud, so I want to make changes and adjust the Firestore database from my laptop's terminal.
You will need to balance scalability and what you want to achieve. One approach uses a Parse Server and a Docker container that works with Express. This method's advantage is its flexibility as to where the Parse Server can run. You can run it on your laptop or move it to Google Cloud if you need more processing power. However, it is worth noting that the container cannot access all Firebase trigger types.
I am not sure which Operating System you are using, but for Ubuntu, you can install Docker and run Parse Server like this:
# Update the apt package first
$ sudo apt-get update
# install dependencies
$ sudo apt-get install \
apt-transport-https \
ca-certificates \
curl \
gnupg-agent \
software-properties-common \
git
# Add Docker’s GPG key:
$ curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
# add the Docker repository
$ sudo add-apt-repository \
"deb [arch=amd64] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) \
stable"
# Update the apt package again
$ sudo apt-get update
# Install Docker
$ sudo apt-get install docker-ce docker-ce-cli containerd.io
$ git clone https://github.com/parse-community/parse-server
$ cd parse-server
$ docker build --tag parse-server .
$ docker run --name my-mongo -d mongo
To run the Parse Server:
$ docker run --name my-parse-server -v config-vol:/parse-server/config \
-p 1337:1337 --link my-mongo:mongo -d parse-server --appId APPLICATION_ID \
--masterKey MASTER_KEY --databaseURI mongodb://mongo/test
To link Firebase and the Docker Parse Server, you will need an adapter. The container above is an example, but it should be enough to get you started running from your laptop.

Airflow installation issue on Windows 7

How do I install Airflow on Windows 7? I am getting the error below while installing it using pip install apache-airflow:
----------------------------------------
Command "c:\users\shrgupta5\appdata\local\programs\python\python36-32\python.exe -u -c
"import setuptools, tokenize;__file__='C:\\Users\\SHRGUP~1\\AppData\\Local\\Temp\\pip-build-_yptw7sa\\psutil\\setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install
--record C:\Users\SHRGUP~1\AppData\Local\Temp\pip-_cwm0nu7-record\install-record.txt --single-version-externally-managed --compile"
failed with error code 1 in C:\Users\SHRGUP~1\AppData\Local\Temp\pip-build-_yptw7sa\psutil\
I wouldn't bother trying to install Airflow on Windows; even after you install it successfully, you cannot run the airflow script due to a dependency on the Unix-only module pwd.
You can run Airflow on Windows by using the Docker setup from puckel https://github.com/puckel/docker-airflow.
Use VirtualBox and Docker Toolbox (legacy) and set up a docker-machine on your Windows computer (docker-machine create -d virtualbox --virtualbox-cpu-count "2" --virtualbox-memory "2048" default)
Make sure to fork Puckel's git repo underneath c:/Users/yourusername/documents, otherwise mounting of DAGs won't work.
You should now be able to spin up Airflow, e.g. by using the Celery executor setup with docker compose -f docker-compose-CeleryExecutor.yml up -d
I have set up an environment where I develop the DAGs on Windows, test them within the Docker container, and then push the Docker image to Linux for production. I have added a more detailed tutorial here.
Airflow cannot be installed on Windows within the standard command prompt.
You need to use bash and afterwards change the config:
How to run Airflow on Windows
Download the source of Airflow from PyPI:
https://pypi.org/project/airflow/#files
Unzip it and edit setup.cfg, then go to the install_requires section and change the psutil entry to the following: 'psutil>=5.4.7',
Finally, run python setup.py install in the source directory.

openstack deployment using kolla erroring out

I have followed the document below to deploy OpenStack using Kolla, and I have built all the Docker images successfully. I am following this guide for an all-in-one installation.
http://docs.openstack.org/developer/kolla/quickstart.html
I have cloned the stable/liberty branch.
But while issuing kolla-ansible deploy I get the error below.
# kolla-ansible deploy
Deploying Playbook : ansible-playbook -i /usr/local/share/kolla/ansible/inventory/all-in-one -e #/etc/kolla/globals.yml -e #/etc/kolla/passwords.yml /usr/local/share/kolla/ansible/site.yml
ERROR: merge_configs is not a legal parameter in an Ansible task or handler
Command failed ansible-playbook -i /usr/local/share/kolla/ansible/inventory/all-in-one -e #/etc/kolla/globals.yml -e #/etc/kolla/passwords.yml /usr/local/share/kolla/ansible/site.yml
I have searched a lot about this error but could not find anything. Any idea about this error?
Please make sure you have the right version of Ansible on your deployment node. It expects a version > 1.9.4 but < 2.0.

symfony2 app console: no available command

After an HDD crash, I had to reimport my Symfony2 app into Eclipse from my SVN server.
After syncing everything, I can't use the console anymore. I only get two commands: list and help.
I tried:
php bin/vendors install --reinstall
At the end, I got the following message:
[InvalidArgumentException]
There are no commands defined in the "assets" namespace.
[InvalidArgumentException]
There are no commands defined in the "cache" namespace.
My configuration is pretty simple:
- Ubuntu Server 11.04 (64-bit)
- VirtualBox OSE
How can I fix it?
Here is the result of the app/console list command:
oc#ubuntu-server:/var/www/projets/Simoov2/src$ app/console list
Symfony version 2.0.0-RC4 - app/dev/debug
Usage:
[options] command [arguments]
Options:
--help -h Display this help message.
--quiet -q Do not output any message.
--verbose -v Increase verbosity of messages.
--version -V Display this program version.
--ansi Force ANSI output.
--no-ansi Disable ANSI output.
--no-interaction -n Do not ask any interactive question.
--shell -s Launch the shell.
--env -e The Environment name.
--no-debug Switches off debug mode.
Available commands:
help Displays help for a command
list Lists commands
OK. As I thought, this is not related to Symfony2 but to the VirtualBox mounting system.
