I have my Python package requirements in setup.py, and I simply run pip install . from the directory where setup.py lives. I don't have a requirements file and I don't want to have one. How do I tell Salt to use setup.py instead of requirements.txt?
Installing pip packages into a virtualenv and running setup.py are two different requirements.
I assume triggering setup.py is only meant for your own self-contained modules; external packages should be installed through SaltStack's virtualenv setup, so you can see whether a required external package failed to install. But it also depends on your own taste.
To run setup.py inside your virtualenv, create a script that activates the virtualenv and then runs setup.py, e.g. vi run-setup.sh:
#!/bin/bash
# Activate the virtualenv, then install the app from its setup.py
source $HOME/.virtualenv/xyz/bin/activate
cd $HOME/xyz_app
python setup.py install
Then use cmd.run in a state file to run it:
run-setup-for-my-xyz-app:
  cmd.run:
    - name: bash <xyz_app folder name>/run-setup.sh
    - user: <username>
    - group: <groupname>
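Alternatively, since pip install . does the same job as setup.py here, you can skip the wrapper script and call the virtualenv's pip directly from cmd.run. A minimal sketch, with placeholder paths matching the example above:

# Sketch only: adjust the placeholder paths to your virtualenv and app location.
install-xyz-app:
  cmd.run:
    - name: /home/<username>/.virtualenv/xyz/bin/pip install .
    - cwd: /home/<username>/xyz_app
    - user: <username>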
UPDATE:
Since you want to load particular Python packages into your virtualenv, you can do that straight away when the virtualenv is set up. Then use cmd.run only to run the script that launches setup.py (to make your custom app work) inside the virtualenv.
create-my-apps-virtualenv:
  virtualenv.managed:
    - name: /home/myapphome/.virtualenv/myapp
    - user: myappusername
    - no_chown: False
    # Install these packages into my virtualenv; package names are case sensitive.
    - pip_pkgs:
      - json
      - MySQL-python
      - SQLAlchemy
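Once the virtualenv exists, a follow-up cmd.run state can run the setup.py wrapper script inside it. A rough sketch (the script path is an assumption; the require ensures it runs only after the virtualenv is created):

# Sketch only: the script path is a placeholder.
run-myapp-setup:
  cmd.run:
    - name: bash /home/myapphome/run-setup.sh
    - user: myappusername
    - require:
      - virtualenv: create-my-apps-virtualenv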
Related
I have a few JAR files/packages in DBFS, and I want an init script (so that I can attach it to an automated cluster) to install the JAR packages every time the cluster starts.
I also want to install Maven packages using an init script.
I can do all of this through the Databricks UI, but the requirement is to install the libraries using an init script.
To install the JAR files, just put them somewhere on DBFS and copy them in the init script:
cp /dbfs/<some-location>/*.jar /databricks/jars/
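As a minimal sketch of the full init script (the DBFS upload location /dbfs/FileStore/jars is an assumption; use whatever folder you uploaded the JARs to):

#!/bin/bash
# Sketch only: /dbfs/FileStore/jars is an assumed upload location.
cp /dbfs/FileStore/jars/*.jar /databricks/jars/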
Installing Maven packages is trickier, because you also need to fetch their dependencies. But it's doable from the init script:
Download and unpack Maven
Execute:
mvn dependency:get -Dartifact=<maven_coordinates>
move the downloaded jars:
find ~/.m2/repository/ -name \*.jar -print0|xargs -0 mv -t /databricks/jars/
(optional) remove the directory that is no longer needed:
rm -rf ~/.m2/
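Putting those steps together, an init script for the Maven route might look like the sketch below; the Maven version/URL and the artifact coordinates are assumptions you would replace with your own:

#!/bin/bash
set -euo pipefail

# Sketch only: Maven version and artifact coordinates are assumptions.
MAVEN_VERSION=3.8.8
cd /tmp
wget -q "https://archive.apache.org/dist/maven/maven-3/${MAVEN_VERSION}/binaries/apache-maven-${MAVEN_VERSION}-bin.tar.gz"
tar -xzf "apache-maven-${MAVEN_VERSION}-bin.tar.gz"

# Resolve the artifact and its dependencies into the local repository (~/.m2)
"./apache-maven-${MAVEN_VERSION}/bin/mvn" dependency:get -Dartifact=com.example:my-library:1.0.0

# Move all downloaded jars to where Databricks picks them up, then clean up
find ~/.m2/repository/ -name \*.jar -print0 | xargs -0 mv -t /databricks/jars/
rm -rf ~/.m2/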
P.S. But really, I recommend automating this kind of thing via the Databricks Terraform Provider.
At this point I'm thinking about calling the bash command pip install fabric2 each time my operator executes, but this does not look like a good idea.
Create a requirements.txt file like the sample below and pass it when creating or updating the Cloud Composer environment.
Sample requirements.txt file:
scipy>=0.13.3
scikit-learn
nltk[machine_learning]
Pass the requirements.txt file to the gcloud composer environments update command to set your installation dependencies:
gcloud beta composer environments update ENVIRONMENT_NAME \
--update-pypi-packages-from-file requirements.txt \
--location LOCATION
Turns out you can use PythonVirtualenvOperator; it supports pip dependencies.
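A rough sketch of what that looks like (the DAG/task names, schedule, and callable body are assumptions; on Airflow 1.10 the import path is airflow.operators.python_operator):

# Sketch only: DAG/task names, schedule, and the callable body are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonVirtualenvOperator


def run_fabric_task():
    # Imports must live inside the callable; it runs in a freshly created virtualenv.
    import fabric2
    print(fabric2.__version__)


with DAG("fabric_example", start_date=datetime(2021, 1, 1), schedule_interval=None, catchup=False) as dag:
    run_in_venv = PythonVirtualenvOperator(
        task_id="run_fabric_task",
        python_callable=run_fabric_task,
        requirements=["fabric2"],
        system_site_packages=False,
    )

The requirements list is installed into a throwaway virtualenv each time the task runs, which avoids touching the worker's system Python.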
Another option available to Composer users is to install dependencies via Composer itself: https://cloud.google.com/composer/docs/how-to/using/installing-python-dependencies
I know how to package and deploy a Meteor application, but on one recent project I'm stuck on an error I couldn't resolve.
Steps I followed to package and deploy my Meteor app:
1. meteor build package
2. cd package
3. tar -xf inventoryTool.tar.gz
4. cd bundle/programs/server
5. npm install
6. cd ../..
7. PORT=<port> MONGO_URL=mongodb://127.0.0.1:27017/dbName ROOT_URL=http://<ip> node main.js
Here is the log for the error when I run the npm install (step 5) command.
Is there anything missing in my execution? I'm not using the fibers package anywhere in my project. Does anyone have a solution to this problem? Thanks in advance.
Why does this happen (a lot)?
Your local version of node is v8.9.4. When using the build command, you will export your application and build the code against this exact node version. Your server environment will require this exact version, too.
An excerpt from the custom deployment section of the guide:
Depending on the version of Meteor you are using, you should install
the proper version of node using the appropriate installation process
for your platform. To find out which version of node you should use,
run meteor node -v in the development environment, or check the
.node_version.txt file within the bundle generated by meteor build.
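In practice that means comparing these values, for example:

# In the development project: the Node version Meteor builds against
meteor node -v
# In the extracted bundle: the version the bundle expects
cat bundle/.node_version.txt
# On the server: the version that is actually installed
node -v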
Even if you don't use fibers explicitly, it is required to run your Meteor app on the server correctly.
So what to do?
In order to solve this, you need to
a) ensure that your local version of node exactly matches the version on the server
b) ensure that you build against the server's architecture (see build command)
For a), installing that very specific Node version on your server, you have two options:
Option I. Use n, as described here (see the sketch after these options). However, this works only if your server environment uses node and not nodejs (which depends on how you installed Node.js on the server).
Option II. To install a specific nodejs version from the NodeSource packages, you can do the following:
$ cd /tmp
$ wget https://deb.nodesource.com/node_8.x/pool/main/n/nodejs/nodejs_8.9.4-1nodesource1_amd64.deb
$ apt install ./nodejs_8.9.4-1nodesource1_amd64.deb
If you are not sure which of the two is installed on your server, check node -v and nodejs -v; one of them will return a version. If your npm install still fails, check the error output to see whether it involves node or nodejs, and install the desired distribution using the options above.
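A sketch of Option I using n (this assumes npm is already available on the server and that the server uses node rather than nodejs):

# Sketch only: assumes npm is available and the server uses "node", not "nodejs".
sudo npm install -g n   # install the n version manager
sudo n 8.9.4            # install and activate the exact version the bundle expects
node -v                 # should now print v8.9.4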
For b), building against your server's architecture, use the --architecture flag in your build command.
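For example, for a 64-bit Linux server (the output directory is arbitrary):

meteor build ../output --architecture os.linux.x86_64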
I built the Meteor package as follows:
meteor build /tmp
It built the tar.gz file.
Now how do I install this on a different server?
Thanks
1. Upload the tar file to the server using ftp or scp.
2. Untar it on the server.
3. cd bundle/programs/server
4. npm install
5. cd ../.. and run node main.js (with PORT, MONGO_URL and ROOT_URL set, as in the steps above).
Additional info:
use pm2 on the server so you can manage the Node process seamlessly,
OR
use forever to keep your app running even after you quit the terminal.
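A minimal pm2 sketch (the process name, port, database name, and IP are placeholders to adjust to your deployment):

# Sketch only: process name, port, DB name, and IP are placeholders.
npm install -g pm2
cd bundle
PORT=<port> MONGO_URL=mongodb://127.0.0.1:27017/dbName ROOT_URL=http://<ip> \
  pm2 start main.js --name inventoryTool
pm2 save   # remember the process list across pm2 restarts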
I am having problems compiling SQLite for use with node-webkit. After some research, it seems that I have the wrong versions of the programs. I have:
- Node
- NW
- SQLite
Apparently I need a specific version of each of these programs to make it work.
What versions of the programs must I have so that I can run this command?
npm install sqlite3 --build-from-source --runtime=node-webkit --target_arch=ia32 --target=0.12.3
This link suggests I should have NW version 0.8.x, but I can't find it for download. Or maybe that is not the problem at all...
I build on Mac against node-webkit v0.12.3 using the following commands:
sudo npm install nw-gyp -g
npm install sqlite3 --build-from-source --runtime=node-webkit --target_arch=ia32 --target=0.12.3
First, make sure you installed nw-gyp globally. Then, run the command either in the directory containing node-webkit executables (nwjs), or in a subfolder of that folder.
Running the command should then create a node_modules folder in the same directory as the binaries, containing the sqlite3 module.
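For example (the nwjs download folder name here is an assumption for the macOS 0.12.3 build; use whatever your extracted folder is called):

# Sketch only: the folder name is assumed.
cd nwjs-v0.12.3-osx-x64
sudo npm install nw-gyp -g
npm install sqlite3 --build-from-source --runtime=node-webkit --target_arch=ia32 --target=0.12.3
ls node_modules/sqlite3   # the rebuilt module should now be here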