Quick RSYNC code correction - rsync

What's wrong with this code?
sudo -u replicant rsync -av -e "ssh -o 'StrictHostKeyChecking no' -i /home/replicant/.ssh/id_rsa" --exclude 'media/' --exclude 'var/' --exclude '.svn' root@$ADMIN:/var/www/ /var/www/ &> /tmp/rsync
if [ $? -ne 0 ]; then
echo "date: Error rsync'ing code base from $ADMIN check /tmp/rsync" | mail -s "Rsync error!" $DEVEMAIL
echo "date: Error rsync'ing code base from $ADMIN check /tmp/rsync" >> $LOGFILE
echo "root#$ADMIN:/var/www /var/www" >> $LOGFILE
exit
fi
I keep getting this error:
Permission denied (publickey).
rsync: connection unexpectedly closed (0 bytes received so far) [Receiver]
rsync error: unexplained error (code 255) at io.c(605)
[Receiver=3.0.9]
Please help. Thanks.

Try logging in directly over SSH first to sort out your key problem, then move on to your rsync test. So start with:
ssh -o 'StrictHostKeyChecking no' -i /home/replicant/.ssh/id_rsa root@$ADMIN
Sidenotes:
don't use root for such a task
add set -eu at the start of your Bash script, so that errors abort your script and ease debugging (for example, if $ADMIN is not defined, the script will exit with an error); a sketch along these lines follows below
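A minimal sketch of what that could look like, assuming a dedicated non-root user on the remote side (deploy here is a placeholder, as are the host, e-mail address, and log path):
#!/bin/bash
set -eu   # abort on any failing command or undefined variable

# placeholders - adjust to your environment
ADMIN=admin.example.com
DEVEMAIL=dev@example.com
LOGFILE=/var/log/codesync.log

# run as a dedicated, non-root user on both ends
if ! sudo -u replicant rsync -av \
      -e "ssh -o 'StrictHostKeyChecking no' -i /home/replicant/.ssh/id_rsa" \
      --exclude 'media/' --exclude 'var/' --exclude '.svn' \
      "deploy@$ADMIN:/var/www/" /var/www/ &> /tmp/rsync; then
    echo "$(date): Error rsync'ing code base from $ADMIN, check /tmp/rsync" | mail -s "Rsync error!" "$DEVEMAIL"
    echo "$(date): Error rsync'ing code base from $ADMIN, check /tmp/rsync" >> "$LOGFILE"
    exit 1
fi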

Related

Get Return Code of SFTP command

I have the snippet below inside my ksh script. Is there a way I can get a return code indicating whether the sftp executed successfully and copied the files from the source to the target destination?
echo "sftp start" >> /test/logfile.log
sftp user@server <<EOF >> /test/logfile.log
cd /tgt/files
lcd /src/files
rm *.csv
put -p *.csv
exit
EOF
echo "sftp end" >> /test/logfile.log
The solution of Gilles Quenot will only work with the following three improvements. Without those improvements the exit status will always be 0, regardless of the result of the sftp commands.
sftp option -b - needs to be added to the sftp command. Only then will sftp exit with status 1 if some sftp command goes wrong. Otherwise the exit status is always 0.
I've added 2>&1 | tee -a to also log the errors (2>&1 redirects stderr to stdout, and -a makes tee append instead of overwriting the "sftp start" line).
You must use ${PIPESTATUS[0]} to read the exit status of sftp. $? gives the exit status of the last command in the pipeline, and here that is the tee writing the logfile.
echo "sftp start" >> /test/logfile.log
sftp -b - user@server <<EOF 2>&1 | tee -a /test/logfile.log
cd /tgt/files
lcd /src/files
rm *.csv
put -p *.csv
exit
EOF
exit_code=${PIPESTATUS[0]}
if [[ $exit_code != 0 ]]; then
echo "sftp error" >&2
exit 1
fi
echo "sftp end" >> /test/logfile.log
What I would do:
echo "sftp start" >> /test/logfile.log
sftp user@server <<EOF >> /test/logfile.log
cd /tgt/files
lcd /src/files
rm *.csv
put -p *.csv
exit
EOF
exit_code=$?
if [[ $exit_code != 0 ]]; then
echo "sftp error" >&2
exit 1
fi
echo "sftp end" >> /test/logfile.log
Instead of using sftp and writing so many intermediate commands to move to the proper folders, remove files, etc. before your transfer, you could use the following, much more compact commands:
echo "file transfer started" >> /test/logfile.log
ssh user@server 'rm /tgt/files/*.csv' >> /test/logfile.log 2>&1 && scp /src/files/*.csv user@server:/tgt/files/ >> /test/logfile.log 2>&1
rc=$?
if [[ $rc != 0 ]]; then
echo "ERROR: transfer failed" >> /test/logfile.log
exit 1
fi
echo "file transfer completed" >> /test/logfile.log
Explanations:
ssh user@server 'rm /tgt/files/*.csv' >> /test/logfile.log 2>&1 && scp /src/files/*.csv user@server:/tgt/files/ >> /test/logfile.log 2>&1
If the files are properly removed from the target folder, then and only then (&&) is the transfer done. Intermediate errors are also redirected to the log file.
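If you also want to know which of the two steps failed, one option is to run them separately and log a distinct message for each. A rough sketch, assuming the same paths and that user may delete files in /tgt/files:
#!/bin/bash
LOG=/test/logfile.log

echo "file transfer started" >> "$LOG"

# step 1: clean out the old CSVs on the target (-f so an empty folder is not an error)
if ! ssh user@server 'rm -f /tgt/files/*.csv' >> "$LOG" 2>&1; then
    echo "ERROR: remote cleanup failed" >> "$LOG"
    exit 1
fi

# step 2: copy the new CSVs over
if ! scp /src/files/*.csv user@server:/tgt/files/ >> "$LOG" 2>&1; then
    echo "ERROR: copy failed" >> "$LOG"
    exit 1
fi

echo "file transfer completed" >> "$LOG"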

Check if a file exists in Artifactory before downloading it in shell using curl/wget from Jenkins

This is a common question, but somehow I am not able to get it working. I have an Artifactory folder in which I store JSON files.
On Jenkins I am supposed to download one of these JSON files, e.g. Myfile.json, but this has to be done only if the file exists.
I tried the approach below:
Approach 1:
url="https://abc/folder/Myfile.json"
if curl --output /dev/null --silent --fail -r 0-0 "$url"; then
echo "URL exists: $url"
else
echo "URL does not exist: $url"
fi
Problem: it keeps entering the else branch even for valid URLs.
I'm executing this in a bash shell. What is the best way to achieve this?
You can check the HTTP status code of the curl request, i.e.
curl -s -o /dev/null -w "%{http_code}" http://www.example.org/
#!/bin/bash
url=$1
check_url=$(curl -s -o /dev/null -w "%{http_code}" ${url})
#echo $check_url
case $check_url in
200)
echo "URL exists: $url"
;;
404)
echo "URL does not exist: $url"
;;
*)
echo "URL error - HTTP error code $check_url: $url"
exit 1
;;
esac
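To download the file only when the check succeeds, one way is to combine the check with the download in a single conditional. A sketch, treating the URL as a placeholder and assuming the repository may require an Artifactory API key (passed here via the X-JFrog-Art-Api header); drop the -H option if anonymous access is allowed:
#!/bin/bash
url="https://abc/folder/Myfile.json"

# --fail makes curl return a non-zero exit status on HTTP errors (403, 404, ...)
if curl --silent --fail --head -H "X-JFrog-Art-Api: $ARTIFACTORY_API_KEY" "$url" > /dev/null; then
    echo "URL exists, downloading: $url"
    curl --silent --fail -H "X-JFrog-Art-Api: $ARTIFACTORY_API_KEY" -o Myfile.json "$url"
else
    echo "URL does not exist or is not accessible: $url"
fi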

Rsync failing with Env Variable

I am using the following script to rsync files. If I execute the commands one by one in the shell, they work, but when I execute them from the script it gives this error:
"rsync: link_stat "/home/tan/testnfs#015" failed: No such file or directory (2)"
015 is nowhere in the script. I have edited the script and verified that no blank space or stray character is left, but I still have the same problem.
#!/bin/bash
#========================================
#Environment variable settings
#========================================
username=test
codedir=/home/tan/testnfs
nfs=10.100.200.4::test
adminemail=backup@tan.com
errorlog=/home/tan/backuperror_log.txt
dat=$(date)
rm -fr $errorlog
echo $dat 2>&1>> $errorlog
echo $nfsserver
echo ========== Before rsync =================
rsync --stats -vr --exclude "*.png" --exclude "*.jpg" --exclude "*.jpeg" --exclude "*.zip" --exclude "*.pdf" --exclude "*.doc" --exclude "*.csv" --exclude "*.swf" $codedir $nfs
if [ $? = 0 ]; then
mail -s "$username sync--complete" $adminemail < $errorlog
else
mail -s "$username sync--Incomplete" $adminemail < $errorlog
fi
I figured it out. I was editing the script on Windows and it was adding its own line terminators (CRLF). I saved it as a Unix-format file with Notepad++ and it worked.
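For reference, a couple of common ways to strip the Windows carriage returns from such a script (assuming dos2unix or GNU sed is available; the file name backup.sh is a placeholder):
# convert the file in place with dos2unix
dos2unix backup.sh

# or strip the trailing carriage returns with GNU sed
sed -i 's/\r$//' backup.sh

# a quick way to spot the problem: CR characters show up as ^M
cat -v backup.sh | head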

logging unix "cp" (copy) command response

I am copying some files, so the result can go either way,
e.g.:
>cp -R bin/*.ksh ../backup/
>cp bin/file.sh ../backup/bin/
When I execute the above commands, the files get copied. There is no response from the system if the copy is successful; if not, it prints the error in the terminal itself, e.g. cp: file.sh: No such file or directory.
Now I want to log the error message, or, if it is successful, log my own custom message to a file. How can I do this?
Any help is appreciated.
Thanks
try writing this in a shell script:
#these three lines are to check if script is already running.
#got this from some site don't remember :(
ME=`basename "$0"`;
LCK="./${ME}.LCK";
exec 8>$LCK;
LOGFILE=~/mycp.log
if flock -n -x 8; then
# 2>&1 will redirect any error or other output to $LOGFILE
cp -R bin/*.ksh ../backup/ >> $LOGFILE 2>&1
# $? is shell variable that contains outcome of last ran command
# cp will return 0 if there was no error
if [ $? -eq 0 ]; then
echo 'copied successfully' >> $LOGFILE
else
echo 'copy failed, see the error above' >> $LOGFILE
fi
fi
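If you do not need the lock (it only prevents two copies of the script from running at the same time), the core pattern can be wrapped in a small helper; a sketch, where log_cp is a made-up function name:
#!/bin/bash
LOGFILE=~/mycp.log

# log_cp is a hypothetical helper: it runs cp, appends its output (including
# any error message) to the log, then appends a custom success or failure line
log_cp() {
    if cp "$@" >> "$LOGFILE" 2>&1; then
        echo "copied successfully: $*" >> "$LOGFILE"
    else
        echo "copy failed: $*" >> "$LOGFILE"
        return 1
    fi
}

log_cp -R bin/*.ksh ../backup/
log_cp bin/file.sh ../backup/bin/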

shell script help - checking for file exists

I'm not sure why this code isn't working. It's not getting to the copy command.
I can run this successfully by hand on the command line (without the check).
I don't think I'm performing a correct file check. Is there a better, cleaner way to write this?
I just want to make sure the file exists, if so, copy it over. Thanks.
#!/bin/bash
if [ $# != 1 ]; then
echo "Usage: getcnf.sh <remote-host>" 2>&1
exit 1
fi
#Declare variables
HOURDATE=`date '+%Y%m%d%H%M'`
STAMP=`date '+%Y%m%d-%H:%M'`
REMOTE_MYCNF=/var/log/mysoft/mysoft.log
BACKUP_DIR=/home/mysql/dev/logs/
export REMOTE_MYCNF HOURDATE STAMP
#Copy file over
echo "Checking for mysoft.log file $REMOTE_MYCNF $STAMP" 2>&1
if [ -f $REMOTE_MYCNF ]; then
echo "File exists lets bring a copy over...." 2>&1
/usr/bin/scp $1:$REMOTE_MYCNF $BACKUP_DIR$1.mysoft.log
echo "END CP" 2>&1
exit 0
else
echo "Unable to get file" 2>&1
exit 0
fi
Your check for the file has to run on the remote computer, so you should do:
ssh $host "test -f $file"
if [ $? = 0 ]; then
Use sh -x script.sh to see what is happening.
You are testing for the existence of a remote file
$1:$REMOTE_MYCNF
using the local name $REMOTE_MYCNF. The if test is never satisfied.
You don't check that $1 is set.
Your file check runs on the local machine - not on the remote.
Change your if to:
if [ ! -f $REMOTE_MYCNF -o ! -d $REMOTE_MYCNF ];
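Putting the remote check together with the original script, a rough corrected sketch based on the ssh "test -f" suggestion above (not a drop-in replacement, and it assumes key-based SSH access to the remote host):
#!/bin/bash
if [ $# != 1 ]; then
    echo "Usage: getcnf.sh <remote-host>" >&2
    exit 1
fi

REMOTE_MYCNF=/var/log/mysoft/mysoft.log
BACKUP_DIR=/home/mysql/dev/logs/
STAMP=$(date '+%Y%m%d-%H:%M')

echo "Checking for mysoft.log file $REMOTE_MYCNF $STAMP"
# run the existence test on the remote host, not locally
if ssh "$1" "test -f $REMOTE_MYCNF"; then
    echo "File exists, bringing a copy over..."
    /usr/bin/scp "$1:$REMOTE_MYCNF" "$BACKUP_DIR$1.mysoft.log"
    echo "END CP"
else
    echo "Unable to get file" >&2
    exit 1
fi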
