The JIL file contains the below profile location:
profile: /apps/properties/autosys_env.rc
The following environment variables are set in the autosys_env.rc file:
JAVA_HOME=/apps/java/jdk1.7.0_51
export JAVA_HOME
ENV_MODE=DEV
export ENV_MODE
PATH=$JAVA_HOME/bin:$PATH
export PATH
But the environment variables are not detected when the JIL is executed.
It was not working because I was creating the autosys_env.rc file on Windows and then copying it to the Unix machine using WinSCP and PuTTY.
It seems Windows inserts some characters (likely carriage-return line endings) that are not supported on Unix.
So it works after creating the autosys_env.rc file in the vi editor.
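If you run into this again, a quick way to spot and strip the Windows line endings without recreating the file, assuming dos2unix is installed (otherwise the sed variant does the same job):
cat -v autosys_env.rc          # CRLF endings show up as ^M at the end of each line
dos2unix autosys_env.rc        # convert the file in place
# sed -i 's/\r$//' autosys_env.rc   # alternative if dos2unix is not available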
What is the mechanism that rustup uses to set the PATH variable? I couldn't find any entry in my .bashrc.
Apparently the PATH variable is exported in .profile in the user home directory:
export PATH="$HOME/.cargo/bin:$PATH"
The following command works on Mac:
source $HOME/.cargo/env
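If you want to confirm where rustup put the entry on your own machine, a quick check (the startup files listed below are the usual candidates, not guaranteed to all exist):
grep -n cargo ~/.profile ~/.bash_profile ~/.bashrc ~/.zshrc 2>/dev/null
cat ~/.cargo/env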
I am following this tutorial.
https://towardsdatascience.com/getting-started-with-apache-airflow-df1aa77d7b1b
When I run the export command as below:
export AIRFLOW_HOME='pwd' airflow_home
What is this export command doing? Does it create an environment variable AIRFLOW_HOME = pwd?
Is that the purpose?
When I run the next command, airflow initdb, it creates a folder called pwd inside my newly created project directory and puts the files in there.
Am I missing something here?
I am using a MacBook, Python 3.7, and Airflow 1.10.9.
You're missing the backtick `: you typed a single quote ' instead.
On *nix systems, `pwd` (in backticks) is evaluated to the current directory, while 'pwd' in single quotes is just the literal string pwd. That's why it creates a folder called pwd instead of using the current directory as the Airflow home.
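A minimal sketch of the corrected command, assuming the intent is for AIRFLOW_HOME to be an airflow_home subdirectory of the current directory (the tutorial's exact path may differ); $(pwd) is the modern equivalent of the backtick form:
export AIRFLOW_HOME=$(pwd)/airflow_home   # or: export AIRFLOW_HOME=`pwd`/airflow_home
airflow initdb                            # the files now land under ./airflow_home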
Hi, I'm using pyspark interactively. I think I'm failing to load a LOCAL file correctly.
How do I check the current directory, so that I can go to a browser and take a look at that actual file?
Or is the default directory wherever pyspark is? Thanks
You can't load a local file unless you have the same file on all workers under the same path. For example, if you want to read a data.csv file in Spark, copy this file to all workers under the same path (say /tmp/data.csv). Now you can use sc.textFile("file:///tmp/data.csv") to create an RDD.
The current working directory is the folder from which you started pyspark. You can start pyspark using ipython and run the pwd command to check the working directory.
[Set PYSPARK_DRIVER_PYTHON=/path/to/ipython in spark-env.sh to use ipython]
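For reference, that setting might look like this in spark-env.sh (the ipython path below is only an example for your machine):
# in $SPARK_HOME/conf/spark-env.sh
export PYSPARK_DRIVER_PYTHON=/usr/local/bin/ipython
You can also check the working directory from inside the session with plain Python: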
import os
cwd = os.getcwd()  # current working directory of the driver process
print(cwd)
I was using R installed on a Linux server over SSH. Everything was fine, but now I have been denied access to the temp folder, and when I start R it gives the error cannot create 'R_TempDir', as it can't create the temp folder.
Can you please tell me how to create my own local temp folder so that R can create its temporary directory there?
You can try to set one of these environment variables:
TMPDIR, TMP, TEMP:
Consulted (in that order) when setting the temporary directory for the session: see tempdir. TMPDIR is also used by some of the utilities; see the help for build.
by doing, for instance:
export TMPDIR=/tmp
(source: the R documentation)
Hope this answers.
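Since the problem here is that the system temp folder is no longer writable, a small sketch using a directory under your own home instead (the path is just an example):
mkdir -p "$HOME/r_tmp"        # create a temp directory you own
export TMPDIR="$HOME/r_tmp"   # point R (and other tools) at it
R                             # R_TempDir should now be created under ~/r_tmp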
From what I understand, you could just use the .bashrc file in your /home/username/ directory:
~# nano /home/username/.bashrc
You can put the command to create the folder inside this .bashrc file by just adding this line: mkdir /your/dir/path/yourDir
This file works like an autorun file which runs every time you start a shell on your Linux server.
But this only works as a per-user setting.
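For example, the lines added to ~/.bashrc might look like this (the paths are illustrative, and -p keeps the command from failing when the folder already exists):
# in /home/username/.bashrc
mkdir -p /your/dir/path/yourDir
export TMPDIR=/your/dir/path/yourDir   # so R picks it up in every new shell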
I am currently trying out the Docker link between my app and db containers. I've checked my app container, and the environment variables are automatically set when I link the containers together.
What I want is for my config file, which is packaged into a jar file, to receive the environment variables and set the required values from them. Any advice or help?
And this is how I create a config file in my jar file to connect to MySQL
database {
  url = "jdbc:mysql://${MYSQL_PORT_3306_TCP_ADDR}:${MYSQL_PORT_3306_TCP_PORT}/mydb"
  driver = "com.mysql.jdbc.Driver"
}
Updating the config file inside the jar could be quite overkill.
I think you have several choices:
read the environment variables directly in your program
use them either directly or generate the config file from them there
create a launch script (the details of this depend on your guest OS in the Docker image; sh/bash for Linux etc.)
that script can generate a new config file from the environment and put it on the classpath before the jar, so your program sees it.
EDIT: added example
You can save this kind of launcher script in the Docker image; it dynamically creates the configuration before launching the actual program.
#!/bin/bash
# some default values for testing, even without links to another container
MYSQL_PORT_3306_TCP_ADDR=${MYSQL_PORT_3306_TCP_ADDR:-127.0.0.1}
MYSQL_PORT_3306_TCP_PORT=${MYSQL_PORT_3306_TCP_PORT:-3306}
cat << EOF > /opt/yourprogram/dbconfig.conf
database { url="jdbc:mysql://${MYSQL_PORT_3306_TCP_ADDR}:${MYSQL_PORT_3306_TCP_PORT}/mydb" driver="com.mysql.jdbc.Driver"
}
EOF
scala -classpath /opt/yourprogram YourProgram
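To sanity-check the fallback defaults, you could run the script once without any linked container before baking it into the image (the script name below is a hypothetical one for the example above):
/opt/yourprogram/launch.sh           # falls back to the defaults when no container is linked
cat /opt/yourprogram/dbconfig.conf   # the url should contain 127.0.0.1:3306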
What I did was write the sh file in my directory /tmp/restcore-1.0-SNAPSHOT/bin like this:
#!/bin/bash
echo "database { url=\"jdbc:mysql://${MYSQL_PORT_3306_TCP_ADDR}:${MYSQL_PORT_3306_TCP_PORT}/mydb\" driver=\"com.mysql.jdbc.Driver\" }" > myconf.conf
jar uf /tmp/restcore-SNAPSHOT/lib/com.organization.restcore-1.0-SNAPSHOT.jar /tmp/restcore-1.0-SNAPSHOT/bin/myconf.conf
After building the Dockerfile and running the sh file in CMD, I used cat myconf.conf to check the config file, and I could see the environment variables filled in.
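If you also want to confirm that the file really made it into the jar (paths copied from the commands above), jar tf lists the archive contents:
jar tf /tmp/restcore-SNAPSHOT/lib/com.organization.restcore-1.0-SNAPSHOT.jar | grep myconf.conf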