Trying to use rsync to back up a directory to an encrypted DMG on a USB external HD: hdiutil attach failed - plist

I am trying to do this:
When the "BCKUNIVERSITA" partition on an external USB HD mounts, launchd launches "backup.com".
backup.com mounts an encrypted DMG ("riassunti.sparsebundle") stored on "BCKUNIVERSITA". Then rsync copies the files in the directory "/DATI/UNIVERSITA" into the mounted DMG "riassunti.sparsebundle".
Here is the launchd plist:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>Label</key>
<string>com.rsync.backup</string>
<key>LowPriorityIO</key>
<true/>
<key>Program</key>
<string>/Users/ikar0/Library/Scripts/backup.com</string>
<key>ProgramArguments</key>
<array>
<string>backup.com</string>
</array>
<key>WatchPaths</key>
<array>
<string>/Volumes</string>
</array>
</dict>
</plist>
Here is the code inside "backup.com":
#!/bin/bash
folderToBackup="/Volumes/DATI/UNIVERSITA"
backupVolume="/Volumes/BCKUNIVERSITA"
backupTo="/Volumes/RIASSUNTI/"
# mount the encrypted image, reading the passphrase from stdin
echo -n 465tyu567 | hdiutil attach -stdinpass "${backupVolume}/riassunti.sparsebundle"
sleep 10
if [ ! -e "${backupVolume}" ]; then
    echo -n "[*]-- BackupVolume NOT connected - Exiting" | logger
    exit 0
else
    echo -n "[*]-- BackupVolume Connected - Continuing" | logger
fi
# Copy the files over. Here we are using rsync.
for i in "${folderToBackup}"; do
    echo -n "[*]-- Starting Rsync of ${i} to ${backupTo}" | logger
    rsync -aq "${i}" "${backupTo}"
    echo -n "[*]-- rsync of ${i} to ${backupTo} complete..." | logger
done
# once the rsync is done, unmount the image
hdiutil detach "${backupTo}"
exit 0
Here are the errors from the console:
16/06/12 01.13.51 com.rsync.backup[14139] rsync: recv_generator: mkdir "/Volumes/RIASSUNTI/UNIVERSITA" failed: Permission denied (13)
16/06/12 01.13.51 com.rsync.backup[14139] *** Skipping everything below this failed directory ***
16/06/12 01.13.51 com.rsync.backup[14139] rsync error: some files could not be transferred (code 23) at /SourceCache/rsync/rsync-40/rsync/main.c(992) [sender=2.6.9]
16/06/12 01.13.51 com.rsync.backup[14139] hdiutil: detach failed - No such file or directory
16/06/12 01.13.56 com.rsync.backup[14271] /dev/disk3 GUID_partition_scheme
16/06/12 01.13.56 com.rsync.backup[14271] /dev/disk3s1 EFI
16/06/12 01.13.56 com.rsync.backup[14271] /dev/disk3s2 Apple_HFS /Volumes/RIASSUNTI 1
16/06/12 01.14.06 com.rsync.backup[14271] rsync: recv_generator: mkdir "/Volumes/RIASSUNTI/UNIVERSITA" failed: Permission denied (13)
16/06/12 01.14.06 com.rsync.backup[14271] *** Skipping everything below this failed directory ***
16/06/12 01.14.06 com.rsync.backup[14271] rsync error: some files could not be transferred (code 23) at /SourceCache/rsync/rsync-40/rsync/main.c(992) [sender=2.6.9]
16/06/12 01.14.06 com.rsync.backup[14271] hdiutil: detach failed - No such file or directory

The error message says that the user running rsync does not have permission to do
mkdir "/Volumes/RIASSUNTI/UNIVERSITA"
You may want to set up your sudoers file so that you can change:
rsync -aq $folderToBackup $backupTo
to
sudo rsync -aq $folderToBackup $backupTo
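For launchd to run that rsync non-interactively under sudo, the sudoers file needs a passwordless rule for rsync. A minimal sketch, assuming the job runs as user ikar0 (per the plist above) and rsync lives at /usr/bin/rsync; edit with visudo:
# hypothetical rule: let ikar0 run rsync via sudo with no password prompt
ikar0 ALL=(ALL) NOPASSWD: /usr/bin/rsync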

Related

AWK - if file is not found, display "not found"

Trying to print "Not found" using awk
Here's what I've tried so far:
latestlogfilename=$(echo $latestlogline | awk 'END {print} { if (!NF) print "file not found" }')
echo "LATEST LOG = $(echo $latestlogline)"
echo "COUNT = $(wc -l $latestlogfilename)"
OUTPUT:
LATEST LOG = 24003651 Jun 8 14:17 /dir/foo.tx
wc: 24003651: No such file or directory
wc: Jun: No such file or directory
wc: 8: No such file or directory
wc: 14:17: No such file or directory
COUNT = 51877 /dir/foo.tx
51877 total
ls: cannot access /dir/foo2.tx: No such file or directory
LATEST LOG =
wc: file: No such file or directory
wc: not: No such file or directory
wc: found: No such file or directory
COUNT = 0 total
It does work, but it's processing every line. I just want it to print "file not found" if the file isn't there; if the file is found, just display the latest log and the count.
It's not completely clear what you want, but maybe this will get you pointed in the right direction.
# extract the 5th whitespace-separated field (the path) from the line
filename=$(echo $latestlogline | awk '{print $5}')
if [ -e "$filename" ]; then
    echo "COUNT = $(wc -l "$filename")"
else
    echo "file not found"
fi
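Putting it together with the LATEST LOG line as well, a small sketch (assuming $latestlogline holds the ls-style line shown in the output above):
# field 5 of "24003651 Jun 8 14:17 /dir/foo.tx" is the path
filename=$(echo "$latestlogline" | awk '{print $5}')
if [ -e "$filename" ]; then
    echo "LATEST LOG = $latestlogline"
    echo "COUNT = $(wc -l < "$filename")"   # redirect so wc prints only the number
else
    echo "file not found"
fi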

Creating a .sh file through Batch script and executing .sh file through batch

Below is part of a batch script that I have created:
{
REM ********* CONN SCRIPT CREATION ***************
echo #!/bin/sh >%conn_script%
echo >>%conn_script%
echo if [ %today% -eq 23 ] >>%conn_script%
echo then >>%conn_script%
echo find . -maxdepth 0 -type f -mtime +0 -exec rm -rf {} \;>>%conn_script%
echo else >>%conn_script%
echo echo Files are not from previous month >>%conn_script%
echo fi >>%conn_script%
type %conn_script%
::echo bye >>%conn_script%
echo The sftp_script is:
echo "command: call %executor%\RUN\plink -ssh %host% -batch -l %user% -pw ********** -m %conn_script%"
call %executor%\RUN\plink -ssh %host% -batch -l %user% -pw %password% -m %conn_script% >%logfile%
}
I have created a batch script that creates a .sh file. That .sh file deletes files from a Unix server. When the batch script executes the .sh file, it gets the error "find: bad option -maxdepth
find: [-H | -L] path-list predicate-list" from the find line above.
I also want to append a log of the deleted files to a .txt file on my local machine.
I have tried a lot but have not been able to append the log to the .txt file.
Please provide your valuable feedback on this issue.
Thanks
Have you tried /usr/xpg4/bin/find (available on Solaris)?
/usr/xpg4/bin/find . -maxdepth 0 -type f -mtime +0 | xargs rm -f
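If the server's find does not support -maxdepth at all, a POSIX-portable workaround is to -prune at the top level (note that -maxdepth 0 matches only . itself; files in the current directory are at depth 1, which this emulates). A sketch that also echoes each name before deleting it, so the plink output already redirected to %logfile% above records what was removed:
find . ! -name . -prune -type f -mtime +0 | while IFS= read -r f; do
    echo "deleting: $f"
    rm -f -- "$f"
done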

Copy a directory over http within a windows batch file

I need a command to use in a batch file which copies the contents of a remote directory to a local directory over http.
For example, to copy http://path/folder to C:\folder.
I need to do this without installing any additional tools.
Thanks in advance!
There's no standard way for an http server to list accessible directories.
For example, I took http://unomoralez.com/content/files/catalog2/source/ as one of the common ways to list a directory over http. Your site could look different, though there's no way for me to know... (there's a temporary list2.txt file - you can comment out its deletion to check the format of the directory page and tell me if it's not working. If it is IIS it could look like this: http://live.sysinternals.com/tools/)
The script downloads all content into .\download_dir (not a recursive download):
@if (@X)==(@Y) @end /****** jscript comment ******
@echo off
::::::::::::::::::::::::::::::::::::
::: compile the script ::::
::::::::::::::::::::::::::::::::::::
setlocal
if exist simpledownloader.exe goto :skip_compilation
set "frm=%SystemRoot%\Microsoft.NET\Framework\"
:: searching the latest installed .net framework
for /f "tokens=* delims=" %%v in ('dir /b /s /a:d /o:-n "%SystemRoot%\Microsoft.NET\Framework\v*"') do (
if exist "%%v\jsc.exe" (
rem :: the javascript.net compiler
set "jsc=%%~dpsnfxv\jsc.exe"
goto :break_loop
)
)
echo jsc.exe not found && exit /b 0
:break_loop
call %jsc% /nologo /out:"simpledownloader.exe" "%~dpsfnx0"
::::::::::::::::::::::::::::::::::::
::: end of compilation ::::
::::::::::::::::::::::::::::::::::::
:skip_compilation
:: download the file
:::::::::::::::::::::::::::::::::::::::::
::::just change the link and the file::::
:::::::::::::::::::::::::::::::::::::::::
::!!!!!!!!!!!!!!!!!!!!!!!!!:::
simpledownloader.exe "http://unomoralez.com/content/files/catalog2/source/" "list2.txt"
md download_dir >nul 2>&1
for /f "skip=1 tokens=4 delims=>< " %%a in ('type list2.txt^| find /i "href" ') do (
simpledownloader.exe "http://unomoralez.com/content/files/catalog2/source/%%a" .\download_dir\%%a
)
del /q /f list2.txt
exit /b 0
****** end of jscript comment ******/
import System;
var arguments:String[] = Environment.GetCommandLineArgs();
var webClient:System.Net.WebClient = new System.Net.WebClient();
print("Downloading " + arguments[1] + " to " + arguments[2]);
try {
webClient.DownloadFile(arguments[1], arguments[2]);
} catch (e) {
Console.BackgroundColor = ConsoleColor.Green;
Console.ForegroundColor = ConsoleColor.Red;
Console.WriteLine("\n\nProblem with downloading " + arguments[1] + " to " + arguments[2] + "Check if the internet address is valid");
Console.ResetColor();
Environment.Exit(5);
}
As you have PowerShell you also have .NET, so this code will execute without problems for you.
This was more or less code that I already had, but you can also check this -> https://code.google.com/p/curlie/ if you are familiar with cURL and want to create a hybrid jscript/.bat file.

Unix troubleshooting, missing /etc/init.d file

I am working through this tutorial on daemonizing php scripts. When I run the following script:
. /etc/init.d/functions

#startup values
log=/var/log/Daemon.log

#verify that the executable exists
test -x /home/godlikemouse/Daemon.php || exit 0

RETVAL=0
prog="Daemon"
proc=/var/lock/subsys/Daemon
bin=/home/godlikemouse/Daemon.php

start() {
    # Check if Daemon is already running
    if [ ! -f $proc ]; then
        echo -n $"Starting $prog: "
        daemon $bin --log=$log
        RETVAL=$?
        [ $RETVAL -eq 0 ] && touch $proc
        echo
    fi
    return $RETVAL
}
I get the following output:
./Daemon: line 12: /etc/init.d/functions: No such file or directory
Starting Daemon: daemon: unrecognized option `--log=/var/log/Daemon.log'
I looked at my file system and there was no /etc/init.d/functions file. Can anyone tell me what this is and where to obtain it? Also, is the absence of that file causing the other error?
Separate your args with their own double quotes:
args="--node $prog"
daemon "nohup ${exe}" "$args &" </dev/null 2>/dev/null
daemon "exe" "args"

Help with Unix tar and grep loop

I would like some help creating a loop that will take one of my files with extension .tar.gz,
gunzip it, untar it, and search the files inside (with extension .tlg) using grep -a >> output.text.
In output.text I need the matching data as well as the name of the file and the parent tar it came from.
Once the search has been performed I would like the untarred files to be deleted and the process to continue on the next tar file until all tars have been checked.
I can't untar them all at once as I don't have the disk space for this.
Can anyone help?
Thanks
To avoid creating temporary files, you can use GNU tar's --to-stdout option.
The code below is careful about spaces and other characters in paths that may confuse the shell:
#! /usr/bin/perl

use warnings;
use strict;

sub usage { "Usage: $0 pattern tar-gz-file ..\n" }

sub output_from {
    my($cmd,@args) = @_;
    my $pid = open my $fh, "-|";
    warn("$0: fork: $!"), return unless defined $pid;
    if ($pid) {
        my @lines = <$fh>;
        close $fh or warn "$0: $cmd @args exited " . ($? >> 8);
        wantarray ? @lines : join "" => @lines;
    }
    else {
        exec $cmd, @args or die "$0: exec $cmd @args: $!\n";
    }
}

die usage unless @ARGV >= 2;

my $pattern = shift;
foreach my $tgz (@ARGV) {
    chomp(my @toc = output_from "tar", "-ztf", $tgz);
    foreach my $tlg (grep /\.tlg\z/, @toc) {
        my $line = 0;
        for (output_from "tar", "--to-stdout", "-zxf", $tgz, $tlg) {
            ++$line;
            print "$tlg:$line: $_" if /$pattern/o;
        }
    }
}
Sample runs:
$ ./grep-tlgs hello tlgs.tar.gz
tlgs/another.tlg:2: hello
tlgs/file1.tlg:2: hello
tlgs/file1.tlg:3: hello
tlgs/third.tlg:1: hello
$ ./grep-tlgs ^ tlgs.tar.gz
tlgs/another.tlg:1: blah blah
tlgs/another.tlg:2: hello
tlgs/another.tlg:3: howdy
tlgs/file1.tlg:1: whoah
tlgs/file1.tlg:2: hello
tlgs/file1.tlg:3: hello
tlgs/file1.tlg:4: good-bye
tlgs/third.tlg:1: hello
tlgs/third.tlg:2: howdy
$ ./grep-tlgs ^ xtlgs.tar.gz
tar: xtlgs.tar.gz: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now
tar: Child returned status 2
tar: Exiting with failure status due to previous errors
./grep-tlgs: tar -ztf xtlgs.tar.gz exited 2 at ./grep-tlgs line 14.
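If Perl is not available, the same streaming idea can be sketched in plain shell, assuming GNU tar and a hypothetical pattern somestring; each match is prefixed with the parent tar and the member name, as requested:
for t in *.tar.gz; do
    # list the .tlg members, then stream each one through grep without extracting to disk
    tar -ztf "$t" | grep '\.tlg$' | while IFS= read -r f; do
        tar -zxf "$t" --to-stdout "$f" | grep -a "somestring" | sed "s|^|$t:$f: |" >> output.text
    done
done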
You could loop over the tars, extract them, then grep them; something like this should work:
match="somestring"
mkdir out/
for i in *.tar.gz; do
mkdir out/${i} # create outdir
tar -C out/${i} -xf ${i} # extract to sub-dir with same name as tar;
# this will show up in grep output
cd out
grep -r ${match} ${i} >> ../output.text
cd ..
rm -rf out/${i} # delete untarred files
done
Be careful: the contents of the $i variable are passed to rm -rf, which has the power to delete things for good.
