I'm checking a few configuration files, and I want to stop the execution of the script with a suitable message if any configuration file is not available.
I'm trying cmd.run to print the message and then exit, but that's not working for me:
SALTSTACK:
{% if not that config is present %}
  cmd.run:
    - name: |
        echo SUITABLE MESSAGE
        exit
{% endif %}
There is the failhard option in Salt. If you use it as a global option, any failure within a state will terminate the execution of all following states, but you can also use it selectively.
Maybe you could add the failhard option to your existing states.
If you want a specific error message, a 1:1 translation of your example could look like this:
SALTSTACK:
  cmd.run:
    - name: bash -c test -e /tmp/notexist || (echo "unable to find file" && exit 1)
    - failhard: True
It will result in output like this; the error message is at the end.
local:
----------
          ID: SALTSTACK
    Function: cmd.run
        Name: bash -c test -e /tmp/notexist || (echo "unable to find file" && exit 1)
      Result: False
     Comment: Command "bash -c test -e /tmp/notexist || (echo "unable to find file" && exit 1)" run
     Started: 21:30:34.473132
    Duration: 7.353 ms
     Changes:
              ----------
              pid:
                  5499
              retcode:
                  1
              stderr:
              stdout:
                  unable to find file
Why would you exit in that manner? The execution will halt when any of the states fails.
You can use the file.exists state to check whether the particular configuration file is present; if it is not, the state will fail. Check the Salt file state documentation for file.exists, and see the sketch below.
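A minimal sketch of that approach, assuming a hypothetical config path /etc/myapp/config.yml and that the run should stop immediately when the file is missing:

check_config_present:
  file.exists:
    - name: /etc/myapp/config.yml
    - failhard: True

If the file is absent, this state fails, the comment in the output names the missing path, and failhard stops the rest of the run.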
Related
I am trying to cross-compile a Qt project from a recipe. I have created a recipe file, but when I try to bitbake it I am met with an error.
Here is my recipe file:
DESCRIPTION = "my_project File Transfer"
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://${COMMON_LICENSE_DIR}/MIT;md5=0835ade698e0bcf8506ecda2f7b4f302"
SRC_URI = "git://git@bitbucket.org/johndoe/my_ui.git;protocol=ssh;rev=master"
S = "${WORKDIR}/git/my_project"
RDEPENDS_${PN} = "bash"
inherit qmake5
require recipes-qt/qt5/qt5.inc
do_install_append() {
    ## Creating folder structure
    install -d ${D}/opt/my_project/bin
    install -d ${D}/home/root/my_project
    install -d ${D}/home/root/my_project/font
    install -d ${D}/home/root/my_project/Images
    install -d ${D}/home/root/my_project/Qml

    ### Compile the project
    oe_runmake INSTALL_ROOT=${D} install

    #### Copying files
    install -m 0755 ${S}/font/* ${D}/home/root/my_project/font/
    install -m 0755 ${S}/Images/* ${D}/home/root/my_project/Images/
    install -m 0755 ${S}/Qml/* ${D}/home/root/my_project/Qml/
}
FILES_${PN} = "/home/root/my_project"
The error that I see is
Sstate summary: Wanted 335 Found 327 Missed 8 Current 1958 (97% match, 99% complete)
NOTE: Executing SetScene Tasks
NOTE: Executing RunQueue Tasks
ERROR: myproject-project-1.0-r0 do_configure: Error calling /home/blue/yacto/rpi-qt5/build/tmp/work/all-poky-linux/myproject-project/1.0-r0/recipe-sysroot-native/usr/bin/qmake -makefile -o Makefile /home/blue/yacto/rpi-qt5/build/tmp/work/all-poky-linux/myproject-project/1.0-r0/git/myproject/myproject.pro --
ERROR: myproject-project-1.0-r0 do_configure: Function failed: do_configure (log file is located at /home/blue/yacto/rpi-qt5/build/tmp/work/all-poky-linux/myproject-project/1.0-r0/temp/log.do_configure.20982)
ERROR: Logfile of failure stored in: /home/blue/yacto/rpi-qt5/build/tmp/work/all-poky-linux/myproject-project/1.0-r0/temp/log.do_configure.20982
Log data follows:
| DEBUG: Executing shell function qmake5_base_preconfigure
| DEBUG: Shell function qmake5_base_preconfigure finished
| DEBUG: Executing shell function do_configure
| NOTE: qmake prevar substitution: ' '
| Could not find qmake spec 'linux-oe-g++'.
| Error processing project file: /home/blue/yacto/rpi-qt5/build/tmp/work/all-poky-linux/myproject-project/1.0-r0/git/myproject/myproject.pro
| ERROR: Error calling /home/blue/yacto/rpi-qt5/build/tmp/work/all-poky-linux/myproject-project/1.0-r0/recipe-sysroot-native/usr/bin/qmake -makefile -o Makefile /home/blue/yacto/rpi-qt5/build/tmp/work/all-poky-linux/myproject-project/1.0-r0/git/myproject/myproject.pro --
| WARNING: exit code 1 from a shell command.
| ERROR: Function failed: do_configure (log file is located at /home/blue/yacto/rpi-qt5/build/tmp/work/all-poky-linux/myproject-project/1.0-r0/temp/log.do_configure.20982)
ERROR: Task (/home/blue/yacto/poky-warrior-21.0.1/meta-rpi_custom/recipes-custom/myproject-project/myproject-project_1.0.bb:do_configure) failed with exit code '1'
NOTE: Tasks Summary: Attempted 4242 tasks of which 4241 didn't need to be rerun and 1 failed.
Summary: 1 task failed:
/home/blue/yacto/poky-warrior-21.0.1/meta-rpi_custom/recipes-custom/myproject-project/myproject-project_1.0.bb:do_configure
I know that in order to cross-compile, I had to run qmake from my cross-compile toolchain location and then run make on it.
I am guessing that is what's missing in my recipe. So my question is: do I add that in my do_configure?
If that's the case, can anyone help me or point me to how I should populate my do_configure?
Is it as simple as source /opt/poky/2.7.1/environment---- and then qmake?
I am drawing a blank at this step.
Please let me know what I am doing wrong.
Edit 1: removed inherit allarch from the recipe.
I would just leave inherit qmake5 and add DEPENDS += "qtbase qtxyz ...", where qtxyz stands for the list of other Qt modules your project depends on.
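As a sketch, the relevant part of the recipe could then look like this (qtdeclarative is only an example of an extra module; the require recipes-qt/qt5/qt5.inc line is dropped, since that include is meant for recipes that build Qt itself rather than applications):

inherit qmake5

# qtbase provides the native qmake and the linux-oe-g++ mkspec;
# add whichever further Qt modules the .pro file needs (qtdeclarative is just an example)
DEPENDS += "qtbase qtdeclarative"

Together with removing inherit allarch (which is what put the recipe under the all-poky-linux work directory seen in the log), this gives qmake a machine-specific toolchain to configure against.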
I need to check for the distribution of a file across an array of hosts programmatically. I log into a master server and then want to check for the file on the workers using simple ssh. So far I have:
ssh $HOSTNAME "[ -e '$HOSTNAME:/directory/filename' ] && echo 'Exists'"
Based on some of the logging output, I know the ssh is successful, but how can I get the test to return a message to the master server? Running the above returns nothing.
SSH will exit with the same exit code as the command that you run on the remote host. If that command is a test, then the exit code will match what you would normally expect from a test.
I would suggest the following:
- Simplify your command to only run the test over SSH.
- Run the echo on your local machine.
It doesn't seem correct that you have $HOSTNAME: in front of your path.
ssh "$HOSTNAME" "test -e '/directory/filename'" && echo 'Exists'
I personally find if statements much easier to read; this is an optional change if you are willing to go that route:
if ssh "$HOSTNAME" "test -e '/directory/filename'"; then
    echo "Exists"
else
    echo "Does not exist" >&2
    exit 1
fi
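Since the original goal was to check the file across a whole array of workers, a minimal sketch of that loop (the workers array and the /directory/filename path are placeholders for your own host list and file):

workers=(worker1 worker2 worker3)

for host in "${workers[@]}"; do
    if ssh "$host" "test -e '/directory/filename'"; then
        echo "$host: exists"
    else
        echo "$host: missing" >&2
    fi
done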
I'm trying to apply the state 'nettools.sls' and I'm receiving:
ERROR: Minions returned with non-zero exit code. ran command: salt 'minion*' state.apply nettools
I'm running Ubuntu Server 18.04 with the XFCE desktop, and I've tried having the .sls file in /etc/salt/ and in the home directory:
install_network_packages:
  pkg.installed:
    - pkgs:
      - rsync
      - lftp
      - curl
I receive: data failed to compile: no matching sls found for 'nettools' in env 'base'
Salt's states (sls files) go in /srv/salt/ unless you've modified the file_roots option in your master config file.
So if you place your file in /srv/salt/nettools.sls you should be able to run the following command:
salt <minion id> state.apply nettools
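For reference, the stock file_roots setting in the master configuration looks like this; if you have customised it, place nettools.sls under whichever directory is listed for the base environment instead:

file_roots:
  base:
    - /srv/salt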
I have the rsh code below as part of a script. This code runs in a loop within the main script. If the rsh fails, I want to capture the exit code in a log, which is what the if block below is for. But it does not seem to be working, as $? always returns 0 even when the remote server refuses connections.
I cannot use ssh, as it is not configured.
rsh ${machine} -l ${osusernm} nohup ${ScrDir}/${LoadJobNm}.scr ${osusernm} ${machine} ${SIDFile} ${logon_id} ${calling_machine} &
if [ $? -ne 0 ]
then
    echo "ERROR : Failed to execute ${LoadJobNm}.scr in ${machine} for file ${SIDFile}" >> ${LogDir}/${JobNm}.log
    break
fi
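One detail worth noting, as a sketch: because the rsh line ends with &, $? only reports whether the job was put into the background (which always succeeds), not the rsh exit status. One way to recover the real status is to wait on the background PID before testing it (variable names reused from the snippet above); note that classic rsh generally returns the status of the connection rather than of the remote command, so this mainly catches connection-level failures such as a refused connection:

rsh ${machine} -l ${osusernm} nohup ${ScrDir}/${LoadJobNm}.scr ${osusernm} ${machine} ${SIDFile} ${logon_id} ${calling_machine} &
rsh_pid=$!

# ... any work that should overlap with the remote job goes here ...

wait ${rsh_pid}
rsh_rc=$?
if [ ${rsh_rc} -ne 0 ]
then
    echo "ERROR : Failed to execute ${LoadJobNm}.scr in ${machine} for file ${SIDFile} (rc=${rsh_rc})" >> ${LogDir}/${JobNm}.log
    break
fi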
I need to execute a script on another minion. The best solution seems to be peer publishing, but the only documentation I have been able to find shows how to do it via the CLI.
How can I define the following in a module?
salt-call system.example.com publish.publish '*' cmd.run './script_to_run'
You want the salt.client.Caller() API.
#!/usr/bin/env python
import salt.client

salt_call = salt.client.Caller()
salt_call.function('publish.publish', 'web001',
                   'cmd.run', 'logger "publish.publish success"')
You have to run the above as the salt user (usually root).
Then scoot over to web001 and confirm the message is in /var/log/syslog. Worked for me.
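If you want this wrapped in an actual custom execution module rather than a standalone script, a minimal sketch (the module name remote_exec and the function run_script are hypothetical; drop the file into _modules/ under your file_roots and sync it with saltutil.sync_modules):

# _modules/remote_exec.py  (hypothetical custom execution module)

def run_script(target='*', script='./script_to_run'):
    '''
    Publish a cmd.run call to the targeted minions via peer publishing
    and return the per-minion results from publish.publish.
    '''
    return __salt__['publish.publish'](target, 'cmd.run', script)

Once synced, it can be called like any other function, e.g. salt-call remote_exec.run_script '*' './script_to_run'.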
The syntax for the .sls file:

salt-call publish.publish \* cmd.run 'cd /*directory* && ./script_to_run.sh':
  cmd.run
Alternative syntax:

execute script on other minion:
  cmd.run:
    - name: salt-call publish.publish \* cmd.run 'cd /*directory* && ./script_to_run.sh'
What I specifically did (I needed to execute a command, but only if a published command executed successfully; which command to publish depends on the role of the minion):
execute script:
  cmd.run:
    - name: *some shell command here*
    - cwd: /*directory*
    - require:
      - file: *some file here*
{% if 'role_1' in grains['roles'] -%}
    - onlyif: salt-call publish.publish \* cmd.run 'cd /*other_directory* && ./script_to_run_A.sh'
{% elif 'role_2' in grains['roles'] -%}
    - onlyif: salt-call publish.publish \* cmd.run 'cd /*other_directory* && ./script_to_run_B.sh'
{% endif %}
Remember to enable peer communication in /etc/salt/master under the section 'Peer Publish Settings':
peer:
  .*:
    - .*
This configuration is not secure, since it lets every minion execute any command on its fellow minions, but I have not yet figured out the correct syntax for selecting minions by role.
Another note: it would probably be better to create a custom command wrapping the cmd.run and enable only that, since letting all nodes run arbitrary scripts on each other is not secure; a sketch of such a whitelist follows at the end of this answer.
The essence of this answer is the same as Dan Garthwaite's, but what I needed was a solution for a .sls file.
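As a sketch of that more restrictive setup, assuming a hypothetical custom execution module function myscripts.run_deploy on the target minions that wraps the one cmd.run you actually need: the keys under peer are regexes matched against minion IDs and the values are the functions those minions are allowed to publish, so a naming convention such as web.* can stand in for role-based targeting:

peer:
  'web.*':                  # only minions whose ID matches this regex may publish
    - myscripts.run_deploy  # and they may only publish this one function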