How can I identify whether my cron job is processing successfully or failing in Unix?
e.g.
1 * * * * sh /usr/Script/Pri_Ip_Status.sh > /tmp/pri_log
Check the file /tmp/pri_log: the output of the cron job is written there on every run, and with a single > it is overwritten each time. Redirecting to /tmp/pri_log like this captures the script's output, so any errors the script prints will end up in that file.
Also, make sure the script sets its working directory explicitly at the start; otherwise cron runs every job with the user's home directory as the current directory, no matter where the script itself is located.
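For example, a minimal sketch (the path and schedule are the ones from the example above) that appends instead of overwriting, captures stderr as well, and records the exit status so success and failure are easy to tell apart:
# Append stdout and stderr, then record the exit status of each run
1 * * * * sh /usr/Script/Pri_Ip_Status.sh >> /tmp/pri_log 2>&1; echo "$(date) exit status: $?" >> /tmp/pri_log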
I'm trying to run a daily taskscheduleR script that pulls data into R from an API. It works when I run it as a one-time task, but for some reason it won't work as a daily task. I keep getting the following error in the log file:
<HEAD><TITLE>Authorization Required</TITLE></HEAD>
<BODY BGCOLOR=white FGCOLOR=black>
<H1>Authorization Required</H1><HR>
<FONT FACE=Helvetica,Arial>
<B>Description: Authorization is required for access to this proxy</B>
</FONT>
<HR>
<!-- default Authorization Required response (401) -->
Here's the code:
library(httr)
library(jsonlite)
library(tidyverse)
library(taskscheduleR)
# Url to feed into GET function
url<-"https://urldefense.com/v3/__http://files.airnowtech.org/airnow/yesterday/daily_data_v2.dat__;!!J30X0ZrnC1oQtbA!Yh5wIss-mzbpMRXugALJoWEKLKcg1-7VmERQwcx2ESK0PZpM5NWNml5s9MVgwHr5LD1i5w$ "
# Sends request to AirNow API to get access to data
my_raw_result<-httr::GET(url)
# Retrieve contents of a request
my_content<-httr::content(my_raw_result,as="text")
# Parse content into a dataframe
my_content_from_delim <- my_content %>% textConnection %>% readLines %>% read.delim(text = ., sep = "|",header = FALSE)
head(my_content_from_delim)
I have been using the Rstudio add-in to create the task.
If you are trying to access this on a work computer, you may need to allow downloads from that URL. Open a browser, paste the URL, click 'allow downloads', then run the script.
I am not sure whether the solution I will offer will work for you, but it won't hurt to try. If the problem is related to the task scheduler, the following might work. However, if the problem is an authorization issue, you may need to get some IT help from your workplace.
For the task scheduler issue, you can send your script directly to the Windows Task Scheduler with a batch file and create a schedule for it.
To make it easy, you can use the following code. First, create a new folder and copy your R script there. For the code below to work, your R script should be named My Script.r.
Then, in the same folder, create a batch file with the following code: copy it into Notepad and save it as Run R Script.bat in the same folder.
cd %~dp0
"C:\PROGRA~1\R\R-40~1.0\bin\R.exe" -e "setwd(%~dp0)" CMD BATCH --vanilla --slave "%~dp0My Script.r" Log.txt
Here, cd %~dp0 sets the working directory of the batch to the folder you run it from. "C:\PROGRA~1\R\R-40~1.0\bin\R.exe" specifies your R.exe; you may need to change the path to match your installation.
-e "setwd(%~dp0)" sets R's working directory to the same folder in which the batch file and script are run.
"%~dp0My Script.r" Log.txt defines the R script's path and the log file for the batch.
Second, to create a daily schedule, we are going to create another batch file. To do so, copy and paste the following code into Notepad and save it as Daily Schedule.bat.
When you double-click Daily Schedule.bat, it creates a daily task that runs for the first time one minute later and then repeats every day at that same time.
@echo off
for /F "tokens=1*" %%A in ('
powershell -NoP -C "(Get-Date).AddMinutes(1).ToString('MM/dd/yyyy HH:mm:ss')"
') do (
Set "MyDate=%%A"
set "MyTime=%%B"
)
::Change to the folder containing this batch file
cd %~dp0
::Create Task
SchTasks /Create /SC DAILY /TN "MY R TASK" /TR "%~dp0Run R Script.bat" /sd %MyDate% /st %MyTime%
This creates a task called "MY R TASK". To see whether it is scheduled, run taskschd.msc from the Windows prompt; this opens the Task Scheduler, where you can find your task. If you want to modify or delete it, you can use the Task Scheduler program; it has a nice GUI and is easy to navigate. You can also query or delete the task from the command line, as sketched below.
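For instance, a quick sketch using the same SchTasks program and the task name created above:
:: Check whether the task is registered and when it will next run
SchTasks /Query /TN "MY R TASK" /V /FO LIST
:: Remove the task without a confirmation prompt
SchTasks /Delete /TN "MY R TASK" /F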
For more details about the SchTasks syntax, see Microsoft's documentation.
If you have any questions, let me know.
I scheduled a data extraction with an XQuery query on ML 8.0.6 using the "scheduled tasks".
My XQuery query (it works if I copy/paste it into the ML web console, and I get a file on AWS S3):
xdmp:save("s3://XX.csv",let $nl := "
"
return
document {
for $book in collection("books")/books
return (root($book)/bookId||","||
$optin/updatedDate||$nl
)
})
My scheduled task :
Task enabled : yes
Task path : /home/bob/extraction.xqy
task root : /
task type: hourly
task period : 1
task start time: 8 past the hour
task database : mydatabase
task modules : file system
task user : admin
task host: XX
task priority : higher
Unfortunately, my script does not seem to be executed: no file is generated on AWS S3 (the storage used) and I do not have any logs.
Any idea how to:
1/ debug a job in the scheduled tasks?
2/ see whether the job runs at the expected time?
Thanks,
Romain.
First, I would take a look at ErrorLog.txt, because it will probably show you where to look for the problem.
xdmp:filesystem-file(concat(xdmp:data-directory(),"/","Logs","/","ErrorLog.txt"))
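If you have shell access to the host, you can also read the same log directly from the filesystem; the path below assumes a default Linux install, where the data directory is /var/opt/MarkLogic:
# Default MarkLogic data directory on Linux; adjust if installed elsewhere
tail -f /var/opt/MarkLogic/Logs/ErrorLog.txt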
Where is the script located logically: Has it been uploaded to the content database, modules database, or ./MarkLogic/Modules directory?
If this is a cluster, have you specified which host it runs on? If so, and you are using filesystem modules, ensure the script exists in the ./MarkLogic/Modules directory of the specified host. If not, and you are using filesystem modules, ensure the script exists in the ./MarkLogic/Modules directory of every host in the cluster.
As for seeing the job running, you can check http://servername:8002/dashboard/ and look at the Query Execution tab to see the running processes, or you can get a snapshot of the process by looking at the Status page of the Task Server (Configure > Groups > [group name] > Task Server > Status, and click the "show more" button).
I would like to set up a crontab that runs the emailSender.R script daily at 5pm Monday to Friday.
The script of emailSender.R is as follows:
library(rmarkdown)
rmarkdown::render("htmlmarkdown.Rmd")
library(gmailR)
gmailR::gmail(
to = c("recipient@email.com"),
subject = "Subject",
message = "Message",
username = "me@email.com",
password = "password",
attachment = "htmlmarkdown.html"
)
I then open up terminal to set up the crontab by first typing crontab -e.
Then a window pops up where I try to set up my cronjob using the following code.
0 17 * * * Rscript /Users/username/emailSender.R
Unfortunately, emailSender.R doesn't run as scheduled.
Would greatly appreciate any help on getting a crontab to schedule my R script
EDIT: After going back to my terminal and typing Rscript I am prompted:
-bash: Rscript: command not found
Perhaps I have to add Rscript to my PATH before cron can run the task. I'm unsure how to do that despite searching extensively.
Try installing the cronR package.
Once you do this, you should be able to navigate to Tools > Addins, where you can launch the add-in. It will bring up a scheduler that allows you to pick a time for your script to run.
If you have permission issues, go to System Preferences > Security & Privacy > Privacy. Click 'Full Disk Access' and grant RStudio / R access. This should allow you to schedule jobs to run in the future.
You need to include the path of Rscript:
0 17 * * * /usr/local/bin/Rscript /Users/username/emailSender.R
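If you are not sure where Rscript lives, you can locate it from a normal shell and, optionally, set PATH at the top of the crontab instead. A sketch: /usr/local/bin is an assumption for a typical macOS install, and 1-5 restricts the job to Monday through Friday as described in the question:
# Locate Rscript from an interactive shell
which Rscript        # e.g. /usr/local/bin/Rscript
# Or set PATH in the crontab so a bare 'Rscript' resolves; run weekdays at 17:00
PATH=/usr/local/bin:/usr/bin:/bin
0 17 * * 1-5 Rscript /Users/username/emailSender.R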
Add this code at the beginning:
Sys.setenv(RSTUDIO_PANDOC="/usr/lib/rstudio-server/bin/pandoc")
Is there a way (or a module) to write log files to disk instead of to the database? I really don't want my DB getting fatter just because of log lines.
Yes, there is. It's part of Drupal core and it's called syslog. However, it logs to the system log file by default.
I hope you have some fast disks... you could easily create a bottleneck by doing so. Instead, I would regularly dump the log tables to a file, say using a cron job.
You could add this to a file called drupal_logs.sh:
NOW=$(date +"%Y%m%d")_$(date +"%H%M.%S")
mysqldump --user=username -p dbname tableName1 tableName2 > /path/to/drupal_$NOW.sql
And schedule that to run every 15 minutes by adding the following cron job:
*/15 * * * * /path/to/drupal_logs.sh > /dev/null
And if you're worried about the log tables in the database getting too large, you can follow the mysqldump command in drupal_logs.sh with a TRUNCATE of the exported tables, as sketched below.
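A hedged sketch of that last step (table and user names are the placeholders from above; note that -p prompts for a password, so a real cron job would normally read credentials from an option file instead):
# Dump the log table, then empty it only if the dump succeeded
mysqldump --user=username -p dbname tableName1 > /path/to/drupal_$NOW.sql \
  && mysql --user=username -p dbname -e "TRUNCATE TABLE tableName1;"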
I have written an R script that pulls some data from a database, performs several operations on it, and posts the output to a new database.
I would like this script to run every day at a specific time, but I cannot find any effective way to do this.
Can anyone recommend a resource I could look at to solve this issue? I am running this script on a Windows machine.
Actually under Windows you do not even have to create a batch file first to use the Scheduler.
Open the scheduler: Start -> All Programs -> Accessories -> System Tools -> Task Scheduler
Create a new Task
under the Actions tab, create a new action
choose "Start a program"
browse to Rscript.exe, which should be located e.g. here:
"C:\Program Files\R\R-3.0.2\bin\x64\Rscript.exe"
enter the name of your script file in the "Add arguments" field
enter the path where the script is located in the "Start in" field
go to the Triggers tab
create a new trigger
choose whether the task should run each day, month, etc., repeated several times, or whatever you like
Supposing your R script is mytest.r, located in D:\mydocuments\, you can create a batch file including the following command:
C:\R\R-2.10.1\bin\Rcmd.exe BATCH D:\mydocuments\mytest.r
Then add it, as a new task, to windows task scheduler, setting there the triggering conditions.
You could also omit the batch file. Set C:\R\R-2.10.1\bin\Rcmd.exe in the program/script textbox in task scheduler, and give as Arguments the rest of the initial command: BATCH D:\mydocuments\mytest.r
Scheduling R Tasks via Windows Task Scheduler (Posted on February 11, 2015)
taskscheduleR: R package to schedule R scripts with the Windows task manager (Posted on March 17, 2016)
EDIT
I recently adopted the use of batch files again, because I wanted the cmd window to be minimized (I couldn't find another way).
Specifically, I fill in the Windows Task Scheduler Actions tab as follows:
Program/script:
cmd.exe
Add arguments (optional):
/c start /min D:\mydocuments\mytest.bat ^& exit
Contents of mytest.bat:
C:\R\R-3.5.2\bin\x64\Rscript.exe D:\mydocuments\mytest.r params
Now there is a built-in option in RStudio to do this. To run the scheduler, first install the packages below:
install.packages('data.table')
install.packages('knitr')
install.packages('miniUI')
install.packages('shiny')
install.packages("taskscheduleR", repos = "http://www.datatailor.be/rcube", type =
"source")
After installing, go to
**TOOLS -> ADDINS -> BROWSE ADDINS -> taskscheduleR -> Select it and execute it.**
Setting up the task scheduler
Step 1) Open the task scheduler (Start > search Task Scheduler)
Step 2) Click "Action" > "Create Task"
Step 3) Select "Run only when the user is logged on", uncheck "Run with highest privileges", name your task, and configure for "Windows Vista/Windows Server 2008"
Step 4) Under the "Triggers" tab, set when you would like the script to run
Step 5) Under the "Actions" tab, put the full location of the Rscript.exe file, e.g.
"C:\Program Files\R\R-3.6.2\bin\Rscript.exe" (include the quotes)
Put the name of your script with -e and source() in the arguments field, wrapping it like this:
-e "source('C:/location_of_my_script/test.R')"
Troubleshooting an Rscript scheduled in the Task Scheduler
When you run a script using the Task Scheduler, it is difficult to troubleshoot any issues because you don't get any error messages.
This can be resolved by using the sink() function in R which will allow you to output all error messages to a file that you specify. Here is how you can do this:
# Set up error log ------------------------------------------------------------
error_log <- file("C:/location_of_my_script/error_log.Rout", open="wt")
sink(error_log, type="message")
try({
  # insert your code here
})
# Stop diverting messages and close the log file once the script has finished
sink(type="message")
close(error_log)
The other thing that you will have to change to make your Rscript work is to use the full file path for any paths in your script.
This will not work in task scheduler:
source("./functions/import_function.R")
You will need to specify the full file path of any scripts you are sourcing within your Rscript:
source("C:/location_of_my_script/functions/import_function.R")
Additionally, I would remove any special characters from any file paths that you are referencing in your R script. For example:
df <- fread("C:/location_of_my_data/file#2342.csv")
may not run. Instead, try:
df <- fread("C:/location_of_my_data/file_2342.csv")
Changing Windows passwords
Beware: changing your Windows password will pause your Task Scheduler script(s). You will need to log back into the Task Scheduler and re-enter your password to get them started again.
I set up my tasks via the SCHTASKS program. For running scripts on startup, you would write something along the lines of
SCHTASKS /Create /SC ONSTART /TN MyProgram /TR "R CMD BATCH --vanilla d:\path\to\script.R"
See Microsoft's website for more details on SCHTASKS.
You can use Windows Task Scheduler.
If, after following any combination of these steps, you receive the "Argument Batch Ignored" error when R.exe runs, try this; it worked for me.
In Windows Task Scheduler:
Replace BATCH "C:\Users\desktop\yourscript.R" in the arguments field
with
CMD BATCH --vanilla --slave "C:\Users\desktop\yourscript.R"