Access a variable defined in a Jenkinsfile from a shell script within the Jenkinsfile - unix

I am defining a shell script in one of the stages in my Jenkinsfile. How can I access a variable that I define in my Jenkinsfile from the shell script?
In the scenario below, I am writing the value of the shell variable to a file and reading it back into a Groovy variable. Is there a way to pass data from the shell to Groovy without going through the file system?
unstash 'sources'
sh '''
source venv/bin/activate
export AWS_ROLE_ARN=arn:aws:iam::<accountid>:role/<role name>
layer_arn="$(awssume aws lambda list-layer-versions --layer-name dependencies --region us-east-1 --query \"LayerVersions[0].LayerVersionArn\" | tr -d '\"')"
echo $layer_arn > layer_arn
'''
layer_arn = readFile('layer_arn').trim()

You can pass it on the shell command line, providing the variable value:
sh "some stuff $my_var"
You can also define an environment variable and use it within your shell step:
withEnv(["MY_VAR=${my_var}"]) {
sh 'some stuff'
}
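If you want to avoid the temporary file altogether, the sh step can also return its standard output straight to Groovy via returnStdout (a sketch, reusing the lookup command from the question):
layer_arn = sh(
    script: '''
        source venv/bin/activate
        export AWS_ROLE_ARN=arn:aws:iam::<accountid>:role/<role name>
        awssume aws lambda list-layer-versions --layer-name dependencies --region us-east-1 --query "LayerVersions[0].LayerVersionArn" | tr -d '"'
    ''',
    returnStdout: true
).trim()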
Regards

Related

Anyone have an idea how to run a Linux command in Robot Framework in the backend?

I have to run this command:
./emsInventory.sh -s 10 -i EMS1004 -p EMS -v 10.2.0.15.1 -d "EMS Patch " -c ems10/pass_ems10#rac_ems10.agnity.com
in Robot Framework, to check whether the data is created or not.
Use the Run Process keyword in the Process library - this is precisely its purpose.
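A minimal sketch (the suite layout and assertions are illustrative; the command line is the one from the question):

*** Settings ***
Library    Process

*** Test Cases ***
Run EMS Inventory
    ${result} =    Run Process    ./emsInventory.sh    -s    10    -i    EMS1004    -p    EMS    -v    10.2.0.15.1    -d    EMS Patch    -c    ems10/pass_ems10#rac_ems10.agnity.com
    Should Be Equal As Integers    ${result.rc}    0
    Log    ${result.stdout}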
You can use the SSHLibrary in Robot Framework; more details can be found at http://robotframework.org/SSHLibrary/SSHLibrary.html.
The examples below could be useful:
Execute Command And Verify Return Code
    [Documentation]    Often getting the return code of the command is enough.
    ...                This behaviour can be adjusted as Execute Command arguments.
    ${rc}=    Execute Command    echo Success guaranteed.    return_stdout=False    return_rc=True
    Should Be Equal    ${rc}    ${0}

Executing Commands In An Interactive Session
    [Documentation]    Execute Command always executes the command in a new shell.
    ...                This means that changes to the environment are not persisted
    ...                between subsequent Execute Command keyword calls.
    ...                Write and Read Until variants can be used to operate in the same shell.
    Write    cd ..
    Write    echo Hello from the parent directory!
    ${output}=    Read Until    directory!
    Should End With    ${output}    Hello from the parent directory!

Unable to export env variable from script

I'm currently struggling with running a .sh script I'm trying to trigger from Jenkins.
Within the Jenkins "execute shell" section, I'm connecting to a remote server (the Jenkins agent does not have the right OS to build what I need), using:
cp -r . /to/shared/drive/to/have/access/on/remote
ssh -t -t username@servername << EOF
cd /to/shared/drive/to/have/access/on/remote
source build.sh dev
exit
EOF
Inside build.sh, I'm exporting R_LIBS to build a package for different R versions.
...
for path in "${!rVersionPaths[@]}"; do
export R_LIBS="${path}"
Rscript -e 'install.packages(c("someDependency", "someOtherDependency"), repos="http://cran.r-project.org");'
...
Setting R_LIBS should function here like setting lib within install.packages(...). For some reason the R_LIBS export doesn't get picked up. Setting other env variables like http_proxy is ignored as well, which causes any requests outside the network to fail.
Is there any particular way of achieving this?
Maybe pass those variables with env, like
env R_LIBS="${path}" Rscript -e 'install.packages(c("someDependency", .....
Well, I'm not able to comment on the question, so I'm posting this as an answer.
I had a similar problem when calling a remote shell script from Jenkins: somehow the bash_profile variables were not loaded when the script was called from Jenkins, but it worked locally. Loading the bash profile in the SSH connection solved it for me.
Source the bash_profile in build.sh:
. ~/.bash_profile OR source ~/.bash_profile
Or
Reload the bash_profile in the SSH connection:
ssh -t -t username@servername << EOF
. ~/.bash_profile
your commands here
exit
EOF
You can set that variable in the same command line like this:
R_LIBS="${path}" Rscript -e \
'install.packages(c("someDependency", "someOtherDependency"), repos="http://cran.r-project.org");'
It's possible to append more variables in this way. Note that this will set those environment variables only for the command being called after them (and its children processes as well).
You said that the "R_LIBS export doesn't get picked up". Question: is the value unset, or is it set to some other value that you are trying to override?
It is possible that SSH is invoking "/bin/sh -c". Based on the second answer to "Why does 'cd' command not work via SSH?", you can simplify the SSH command and explicitly invoke the build.sh script in Bash:
cp -r . /to/shared/drive/to/have/access/on/remote
ssh -t -t username@servername "cd /to/shared/drive/to/have/access/on/remote && bash -f build.sh dev"
This makes the SSH invocation more similar to invoking the command within a remote interactive shell. (You can avoid sourcing scripts and exporting variables.)
You don't need export R_LIBS or env R_LIBS when it is possible to prefix any command with local environment variable overrides (this agrees with Luis' answer):
...
for path in "${!rVersionPaths[@]}"; do
R_LIBS="${path}" Rscript -e 'install.packages(c("someDependency", "someOtherDependency"), repos="http://cran.r-project.org");'
...
The Rscript may be doing a lot with env vars. You can verify that you are setting the R_LIBS env var by replacing Rscript with the env command and observing the output:
...
for path in "${!rVersionPaths[@]}"; do
R_LIBS="${path}" env
...
According to this manual "Initialization at Start of an R Session", Rscript looks in several places to load "site and user files":
$R_PROFILE
$R_HOME/etc/Renviron
$R_HOME/etc/Renviron.site
$R_ENVIRON_USER
$R_PROFILE_USER
./.Rprofile
$HOME/.Rprofile
./.RData
The "Examples" section of that manual shows this:
## Not run:
## Example ~/.Renviron on Unix
R_LIBS=~/R/library
PAGER=/usr/local/bin/less
If you add the --vanilla command-line option to ignore all of these files, then you may get different results and know that something in the site/init/environ files is affecting your R_LIBS! I cannot run this system myself. Hopefully we have given you some areas to investigate.
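For example, you could combine the environment-variable prefix from above with --vanilla while testing (same loop as in build.sh):
...
for path in "${!rVersionPaths[@]}"; do
    R_LIBS="${path}" Rscript --vanilla -e 'install.packages(c("someDependency", "someOtherDependency"), repos="http://cran.r-project.org")'
...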
You probably don't want to source build.sh, just invoke it directly (i.e. remove the source command).
By sourcing the file, your script is executed by the SSH shell (likely sh) rather than by bash, which it sounds like is what you intended.
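For example, the SSH block from the question would then look like this (same paths as in the question):
cp -r . /to/shared/drive/to/have/access/on/remote
ssh -t -t username@servername << EOF
cd /to/shared/drive/to/have/access/on/remote
bash build.sh dev
exit
EOF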

How to set shell used by system() to invoke bash scripts in R? [duplicate]

I have to run a shell script inside R. I've considered using R's system function.
However, my script involves source activate and other commands that are not available in the /bin/sh shell. Is there a way I can use /bin/bash instead?
Thanks!
Invoke /bin/bash, and pass the commands via the -c option in one of the following ways:
system(paste("/bin/bash -c", shQuote("Bash commands")))
system2("/bin/bash", args = c("-c", shQuote("Bash commands")))
If you only want to run a Bash file, supply it with a shebang, e.g.:
#!/bin/bash -
builtin printf %q "/tmp/a b c"
and call it by passing script's path to the system function:
system("/path/to/script.sh")
It is implied that the current user/group has sufficient permissions to execute the script.
Rationale
Previously I suggested setting the SHELL environment variable. But it probably won't work, since the implementation of the system function in R calls the C function of the same name (see src/main/sysutils.c):
int R_system(const char *command)
{
/*... */
res = system(command);
And
The system() library function uses fork(2) to create a child process that executes the shell command specified in command using execl(3) as follows:
execl("/bin/sh", "sh", "-c", command, (char *) 0);
(see man 3 system)
Thus, you should invoke /bin/bash, and pass the script body via the -c option.
Testing
Let's list the top-level directories in /tmp using the Bash-specific mapfile:
test.R
script <- '
mapfile -t dir < <(find /tmp -mindepth 1 -maxdepth 1 -type d)
for d in "${dir[@]}"
do
builtin printf "%s\n" "$d"
done > /tmp/out'
system2("/bin/bash", args = c("-c", shQuote(script)))
test.sh
Rscript test.R && cat /tmp/out
Sample Output
/tmp/RtmpjJpuzr
/tmp/fish.ruslan
...
Original Answer
Try to set the SHELL environment variable:
Sys.setenv(SHELL = "/bin/bash")
system("command")
Then the commands passed to system or system2 functions should be invoked using the specified shell.

Use remote server env variable in ksh via an SSH command

I can't find any answer to this "easy looking" problem.
I would like to execute an ssh command, using a ksh shell or script, which uses an env variable of the SERVER.
Example:
ssh user@server "ls $DIR"
Where $DIR is an env variable defined on the server (in this case, a directory path) and not the $DIR defined in my client env.
In the worst-case scenario I can use something like env | grep DIR | cut -d "=" -f 2
to get the var, but it looks weird.
Thanks for any help.
ssh user@server "ls $DIR"
Double-quoted strings undergo variable interpolation. So "$DIR" is being replaced on the local system, then the shell invokes the ssh command with the resulting string.
To pass the literal command through to the remote system, use single quotes:
ssh user@server 'ls $DIR'
The SSH command execution shell is a non-interactive shell, whereas your normal shell is either a login shell or an interactive shell.
In fact, not all environment variables are available in a non-interactive shell. You need to check the ksh manual to figure out which configuration files ksh reads when running in non-interactive mode; in the case of bash, those are roughly the following:
/etc/profile
~/.bash_profile
~/.bash_login
~/.profile
Just find out the corresponding ones for ksh and move/copy your DIR environment variable definition there on the server side.
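As a workaround that doesn't depend on which startup files the non-interactive shell reads, you can source the profile explicitly in the remote command (a sketch; it assumes DIR is exported from ~/.profile on the server, and the single quotes keep $DIR from being expanded locally):
ssh user@server '. ~/.profile; ls "$DIR"'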

Crontab and testing a command to be executed

I'm quite new to cron and crontab.
I've edited the crontab file, and I need to execute one of the commands manually so I can try it and test it beforehand. How do I do that? If it fails, is there a mode that shows the errors?
Write a shell script that you can test.
Execute that shell script from the crontab.
Remember that cron provides barely any environment - so your script may have to fix that. In particular, your profile will not be used.
Do not get fancy with what you put in the crontab.
Build a debug mode into your shell script.
No, there isn't specifically a mode that shows errors. Usually, if the cron job witters, the output is emailed to you. That is, it sends standard output and standard error information to you if the executed command writes anything to either standard output or standard error.
On MacOS X (10.6.7), the environment I got was (via a crontab entry like 37 12 17 5 * env >/tmp/cron.env):
SHELL=/bin/sh
USER=jleffler
PATH=/usr/bin:/bin
PWD=/Users/jleffler
SHLVL=1
HOME=/Users/jleffler
LOGNAME=jleffler
_=/usr/bin/env
Of those, PWD, _ and SHLVL are handled by the shell. So, to test your script reliably in a cron-like environment, use:
(cd $HOME
env -i \
SHELL=/bin/sh \
USER=$USER \
PATH=/usr/bin:/bin \
HOME=$HOME \
LOGNAME=$LOGNAME \
/path/to/script/you/execute ...
)
The -i option to env means 'ignore all inherited environment'; the script will see exactly the five values specified plus anything the shell specifies automatically. With no arguments, env reports on the environment; with arguments, it adjusts the environment and executes a command.
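Putting the earlier advice together, a minimal sketch of a crontab entry plus a testable wrapper script (the paths, schedule, and names here are made up):
# crontab entry: run the wrapper at 02:30 and keep its output somewhere you can read it
30 2 * * * /home/you/bin/nightly.sh >/tmp/nightly.log 2>&1

# /home/you/bin/nightly.sh
#!/bin/sh
# cron provides barely any environment, so set up what the job needs explicitly
PATH=/usr/bin:/bin
cd /home/you/project || exit 1
./real_job.sh "$@"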
To execute a script "manually" you first have to make it executable by doing:
$ chmod +x yourScriptName
Then do either
$ ./yourScriptName
if you execute it from its directory, or
$ /full/path/to/yourScriptName
from anywhere.
