I want to display a message in my zsh prompt when a command exits with an error.
But my code does not work! :(
local return_code=""
local code="%?"
if [ "$code" = "130" ]; then
    return_code="%F{red}TERMINATED BY USER ↵%f"
elif [ "$code" = "0" ]; then
    return_code=""
else
    return_code="%F{red}${code} ↵%f" # <= it always ends up here
fi
I also tried another method:
local code=$? # always returns zero
if [ $code -eq 130 ]; then
    ...
I found the answer in the comments, thank you:
local e001="%(1?.(!) GENERAL ERROR%f ↵.)"
local e002="%(2?.(!) MISUSE OF SHELL BUILTINS%f ↵.)"
local e126="%(126?.(!) COMMAND INVOKED CANNOT EXECUTE%f ↵.)"
local e127="%(127?.(!) COMMAND NOT FOUND%f ↵.)"
local e128="%(128?.(!) INVALID ARGUMENT TO EXIT%f ↵.)"
local e130="%(130?.(!) TERMINATED BY USER%f ↵.)"
local e255="%(255?.(!) EXIT STATUS OUT OF RANGE%f ↵.)"
local return_code="%F{red}${e001}${e002}${e126}${e127}${e128}${e130}${e255}%(?..%?%f ↵)"
There was no need for if/elif conditions at all.
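For reference, a minimal sketch of how the result can be wired into the prompt (assuming the snippet lives in ~/.zshrc or a prompt theme). The %(N?.true-text.false-text) ternaries are evaluated each time the prompt is drawn, so no if/elif logic is needed:
# show a red message for exit status 130, or the raw code for any other non-zero status
e130="%(130?.(!) TERMINATED BY USER%f ↵.)"
RPROMPT="%F{red}${e130}%(?..%?%f ↵)"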
I have a problem in PHP/Laravel creating multiple zip files synchronously. If I copy all of the generated commands and paste them into the shell, they execute normally, but when I run them from PHP it only generates the first file =/.
Controller code:
foreach ($passwords as $p){
    if($i == 0){
        $command = 'zip -u -j -P '.$p.' '.$dir.'/'.$count.'.zip '.storage_path().'/app/'.$directory.'/'.$file1->getClientOriginalName();
        $commands->push($command);
    }else{
        $command = 'zip --quiet -j -P '.$p.' '.$dir.'/'.$count.'.zip '.storage_path().'/app/'.$directory.'/'.($count+1).'.zip';
        $commands->push($command);
    }
    $count--;
    $i++;
}
foreach ($commands as $p){
    echo $p.'<br/>';
}
foreach ($commands as $c){
    $process = new Process($c);
    $process->start();
    sleep(10);
    if($process->isTerminated()){
        sleep(1);
    }
    if ($errorOutput = $process->getErrorOutput()) {
        throw new RuntimeException('Process: ' . $errorOutput);
    }
}
The generated $commands are correct (pasting them into the shell works), but the script only generates the file 50.zip.
I'm not sure whether sleep() could interfere with the subprocess (the shell command). Please try:
foreach ($commands as $c){
    $process = new Process($c);
    // Set the timeout instead of sleeping
    $process->setTimeout(10);
    $process->start();
    // Wait for the process to finish
    $process->wait();
    if ($errorOutput = $process->getErrorOutput()) {
        throw new RuntimeException('Process: ' . $errorOutput);
    }
}
The wait() call polls with usleep() in a more fine-grained manner, which might help with that.
Does it work like this?
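As an aside, and depending on your symfony/process version (so treat this as an assumption about your setup): from version 4.2 passing a shell string to the Process constructor is deprecated, and such strings go through Process::fromShellCommandline() instead. A sketch of the loop in that style:
use Symfony\Component\Process\Process;

foreach ($commands as $c) {
    // fromShellCommandline() keeps the command as a shell string
    // (symfony/process 4.2+; older versions accept the string in the constructor)
    $process = Process::fromShellCommandline($c);
    $process->setTimeout(10);
    $process->run(); // run() = start() + wait()

    if (!$process->isSuccessful()) {
        throw new RuntimeException('Process failed: ' . $process->getErrorOutput());
    }
}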
I have an init.d script that starts my process on boot, and it requires networking to be initialized. I can use the nm-online utility which comes with the NetworkManager package, but the problem is that on the deployment machines NetworkManager may not be installed, so I need some other reliable way to tell that the network is set up and that I can connect to another server over the network. I could keep retrying until networking is up or the connection is established, but that causes other problems related to error reporting.
Here is a similar question asked by someone else:
How to detect when networking initialized in /etc/init.d script?
wait_for_network()
{
    [ -z "${LINKDELAY}" ] && LINKDELAY=10
    $INFO "Waiting for network..."
    if [ -f /usr/sbin/nm-online ]; then
        nm-online -q --timeout=$LINKDELAY || nm-online -q -x --timeout=30
    else
        check_for_network_up $LINKDELAY || check_for_network_up 30
    fi
    [ "$?" = "0" ] && success "network startup" || failure "network startup"
    echo
}
I was also trying another approach where I check the routing table. If the network is not up, the route command returns zero entries, but the problem is that I don't know the real number of route entries; it could be two on one machine and 10 on another.
check_for_network_up_old3() {
    let no_of_routes=`/bin/netstat -rn | wc -l`
    $INFO "netstat result $?"
    timeout=$1
    while [ "$timeout" != "0" ]; do
        let routes=`/sbin/ip route show | wc -l`
        $INFO "$routes"
        if [ $routes -gt 1 ]; then
            return 0
        fi
        timeout=$((timeout-1))
        sleep 1
        $INFO "check_for_network_up $timeout"
    done
    return 1
}
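A variant I am considering, which avoids guessing the number of routes, is to wait specifically for a default route to appear. This is only a sketch, under the assumption that "network up" means "a default gateway has been configured":
check_for_default_route() {
    timeout=$1
    while [ "$timeout" != "0" ]; do
        # the "default via ..." line only shows up once the interface is configured
        if /sbin/ip route show | grep -q "^default"; then
            return 0
        fi
        timeout=$((timeout-1))
        sleep 1
    done
    return 1
}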
In an effort to set up a cron-job-like scheduled task on Windows, I've set up a PowerShell script using code recommended in a previous Stack Overflow question.
I have some backups that I need to clean up daily by deleting old ones, so I created an ASP.NET script to perform this task. The file name is BackupCleanup.aspx, and I have confirmed that the ASP.NET script works when executed on its own by visiting its URL; however, I cannot get it to execute using the PowerShell script below.
The PowerShell script code I'm using is:
$request = [System.Net.WebRequest]::Create("http://127.0.0.1/BackupCleanup.aspx")
$response = $request.GetResponse()
$response.Close()
I have created this file with a .ps1 extension, and it shows up properly in my OS (Windows 2008). I have tried both manually executing it by right-clicking and choosing "Run with Powershell" and scheduling it as a task, both to no avail.
I cannot figure out why the script does not work - any help would be GREATLY appreciated.
Here is the PowerShell script I use to call up web pages using IE. Hopefully this will work for you as well.
Function NavigateTo([string] $url, [int] $delayTime = 100)
{
    Write-Verbose "Navigating to $url"
    $global:ie.Navigate($url)
    WaitForPage $delayTime
}
Function WaitForPage([int] $delayTime = 100)
{
    $loaded = $false
    while ($loaded -eq $false) {
        [System.Threading.Thread]::Sleep($delayTime)
        # If the browser is not busy, the page is loaded
        if (-not $global:ie.Busy)
        {
            $loaded = $true
        }
    }
    $global:doc = $global:ie.Document
}
Function SetElementValueByName($name, $value, [int] $position = 0) {
    if ($global:doc -eq $null) {
        Write-Error "Document is null"
        break
    }
    $elements = @($global:doc.getElementsByName($name))
    if ($elements.Count -ne 0) {
        $elements[$position].Value = $value
    }
    else {
        Write-Warning "Couldn't find any element with name ""$name"""
    }
}
Function ClickElementById($id)
{
    $element = $global:doc.getElementById($id)
    if ($element -ne $null) {
        $element.Click()
        WaitForPage
    }
    else {
        Write-Error "Couldn't find element with id ""$id"""
        break
    }
}
Function ClickElementByName($name, [int] $position = 0)
{
    if ($global:doc -eq $null) {
        Write-Error "Document is null"
        break
    }
    $elements = @($global:doc.getElementsByName($name))
    if ($elements.Count -ne 0) {
        $elements[$position].Click()
        WaitForPage
    }
    else {
        Write-Error "Couldn't find element with name ""$name"" at position ""$position"""
        break
    }
}
Function ClickElementByTagName($name, [int] $position = 0)
{
    if ($global:doc -eq $null) {
        Write-Error "Document is null"
        break
    }
    $elements = @($global:doc.getElementsByTagName($name))
    if ($elements.Count -ne 0) {
        $elements[$position].Click()
        WaitForPage
    }
    else {
        Write-Error "Couldn't find element with tag name ""$name"" at position ""$position"""
        break
    }
}
#Entry point
# Setup references to IE
$global:ie = New-Object -com "InternetExplorer.Application"
$global:ie.Navigate("about:blank")
$global:ie.visible = $true
# Call the page
NavigateTo "http://127.0.0.1/BackupCleanup.aspx"
# Release resources
$global:ie.Quit()
$global:ie = $null
I had the same issue. I manually opened PowerShell, executed my script, and received: "WebPage.ps1 cannot be loaded because running scripts is disabled on this system."
You have to allow scripts to run. Execute the following in PowerShell (run it as Administrator, since the LocalMachine scope requires elevation):
Set-ExecutionPolicy RemoteSigned -Scope LocalMachine
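If you would rather not change the machine-wide policy, the scheduled task can also bypass it for just this one script. A sketch, where the task name, schedule, and the path C:\Scripts\BackupCleanup.ps1 are placeholders:
schtasks /Create /TN "BackupCleanup" /SC DAILY /ST 01:00 /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\Scripts\BackupCleanup.ps1"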
Are there any tools or reports out there that, given a crontab file, can output which jobs run within a specified time frame?
Our crontab file has become very large, and our system administrators struggle to figure out which jobs need to be rerun when we have scheduled downtime on the server.
I was planning on writing my own script, but I'm wondering if there is something out there already.
One thing you can do is:
1. Get the Perl module Schedule::Cron.
2. Modify it to sleep only optionally: create a "fast-forward" mode, and where it does sleep($sleep), make it do nothing when fast-forwarding. This will also require changing the $now = time; call to do $now++ instead.
3. Modify it to be able to indicate start and end times for the emulation.
4. Create a Perl one-liner which takes the output of crontab -l and converts it into a similar crontab, but one which replaces a command cmd1 arg1 arg2 with a Perl subroutine sub { print "Execution: cmd1 arg1 arg2\n" } (see the sketch after this list).
5. Run the scheduler in fast-forward mode, as indicated in the POD. It will read in your modified crontab and emulate the execution.
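To make steps 4 and 5 concrete, here is a rough, untested sketch (the script name dry_run_cron.pl is hypothetical, and it assumes Schedule::Cron's documented add_entry()/run() interface). Instead of patching the module for fast-forwarding, it can also be run under a mocked clock such as Time::Mock (see below), e.g. perl -MTime::Mock=throttle,600 dry_run_cron.pl:
use strict;
use warnings;
use Schedule::Cron;

# Scheduler whose entries only print what they would have executed
my $cron = Schedule::Cron->new(sub { });   # default dispatcher, unused here

open my $fh, '-|', 'crontab -l' or die "crontab -l failed: $!";
while (my $line = <$fh>) {
    next if $line =~ /^\s*(#|$)/;
    chomp $line;
    my ($spec, $cmd) = $line =~ /^(\S+(?:\s+\S+){4})\s+(.+)$/ or next;
    # each entry gets a closure that prints instead of executing
    $cron->add_entry($spec, sub { print "Execution: $cmd\n" });
}
close $fh;

$cron->run();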
There is a fine and clean solution for a 'simulation mode' of Schedule::Cron (and for any other module using sleep, time, or alarm internally) without modifying Schedule::Cron itself: you can use Time::Mock for throttling. For example, with
perl -MTime::Mock=throttle,600 schedule.pl
you can speed up your 'time machine' by a factor of 600 (so instead of sleeping for 10 minutes it will only sleep for a second). Please refer to the manpage of Time::Mock for more details.
To use a crontab file directly with Schedule::Cron, you should be able to take the example from the README as-is:
use Schedule::Cron;
my $cron = new Schedule::Cron(sub { system(shift) },
                              file => "/var/spool/crontab.perl");
$cron->run();
The trick here is to use a default dispatcher method which calls system() with the stored parameters. Please let me know whether this works for you or whether it needs to be fixed. Instead of system, you could use print as well, of course.
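For completeness, the print-only variant of that same example might look like this (same file => option as above; purely a sketch for a dry run):
use Schedule::Cron;
# dry-run dispatcher: print the stored command instead of running it
my $cron = new Schedule::Cron(sub { print "Would run: " . shift() . "\n" },
                              file => "/var/spool/crontab.perl");
$cron->run();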
Here's a similar approach to DVK's but using Perl module Schedule::Cron::Events.
This is very much a "caveat user" posting - a starting point. Given this crontab file a_crontab.txt:
59 21 * * 1-5 ls >> $HOME/work/stack_overflow/cron_ls.txt
# A comment
18 09 * * 1-5 echo "wibble"
The below script cron.pl, run as follows, gives:
$ perl cron.pl a_crontab.txt "2009/11/09 00:00:00" "2009/11/12 00:00:00"
2009/11/09 09:18:00 "echo "wibble""
2009/11/09 21:59:00 "ls >> $HOME/work/stack_overflow/cron_ls.txt"
2009/11/10 09:18:00 "echo "wibble""
2009/11/10 21:59:00 "ls >> $HOME/work/stack_overflow/cron_ls.txt"
2009/11/11 09:18:00 "echo "wibble""
2009/11/11 21:59:00 "ls >> $HOME/work/stack_overflow/cron_ls.txt"
2009/11/12 09:18:00 "echo "wibble""
2009/11/12 21:59:00 "ls >> $HOME/work/stack_overflow/cron_ls.txt"
Prototype (!) script:
use strict;
use warnings;
use Schedule::Cron::Events;

my $crontab_file = shift || die "! Must provide crontab file name";
my $start_time   = shift || die "! Must provide start time YYYY/MM/DD HH:MM:SS";
my $stop_time    = shift || die "! Must provide stop time YYYY/MM/DD HH:MM:SS";

open my $fh, '<', $crontab_file or die "! Could not open file $crontab_file for reading: $!";

my $table = [];
while ( <$fh> ) {
    next if /^\s*$/;
    next if /^\s*#/;
    chomp;
    push @$table, new Schedule::Cron::Events( $_, Date => [ smhdmy_from_iso( $start_time ) ] );
}
close $fh;

my $events = [];
for my $cron ( @$table ) {
    my $event_time = $stop_time;
    while ( $event_time le $stop_time ) {
        my ( $sec, $min, $hour, $day, $month, $year ) = $cron->nextEvent;
        $event_time = sprintf q{%4d/%02d/%02d %02d:%02d:%02d}, 1900 + $year, 1 + $month, $day, $hour, $min, $sec;
        push @$events, qq{$event_time "} . $cron->commandLine . q{"};
    }
}
print join( qq{\n}, sort @$events ) . qq{\n};

sub smhdmy_from_iso {
    my $input = shift;
    my ( $y, $m, $d, $H, $M, $S ) = ( $input =~ m=(\d{4})/(\d\d)/(\d\d) (\d\d):(\d\d):(\d\d)= );
    ( $S, $M, $H, $d, --$m, $y - 1900 );
}
I hope you can adapt it to your needs.