setup_twitter_oauth, searchTwitter and Rscript

I run the following script using an installation of RStudio on a Linux server.
require(twitteR)
require(plyr)
setup_twitter_oauth(consumer_key='xxx', consumer_secret='xxx',
access_token='xxx', access_secret='xxx')
searchResults <- searchTwitter("#vds", n=15000, since = as.character(Sys.Date()-1), until = as.character(Sys.Date()))
head(searchResults)
tweetsDf = ldply(searchResults, function(t) t$toDataFrame())
write.csv(tweetsDf, file = paste("tweets_vds_", Sys.Date(), ".csv", sep = ""))
The script works fine when I run it from the user interface.
However, when I run it automatically via crontab from the terminal, I get the following error message:
[1] "Using direct authentication"
Error in twInterfaceObj$getMaxResults :
could not find function "loadMethod"
Calls: searchTwitter -> doRppAPICall -> $
Execution halted
Why?
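A likely explanation (an assumption based on the error, not confirmed in the thread): loadMethod() lives in the methods package, which interactive R and RStudio attach by default but Rscript did not before R 3.5.0. Attaching it explicitly at the top of the script should let the cron job find it:

```r
# Rscript (pre R 3.5.0) does not attach the "methods" package that
# twitteR's reference classes rely on; attach it before anything else.
library(methods)

# loadMethod() is now visible, so twInterfaceObj$getMaxResults can dispatch:
exists("loadMethod")
```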

Related

R code runs well on one computer but not on another (via Task Scheduler in RScript.exe)

My issue is: when I run the following code through RScript.exe via Task Scheduler on one laptop, I get the desired output, that is, the email is sent. But when I run the same code through RScript.exe via Task Scheduler on another machine (machine 2), it doesn't run. Machine 2 is able to send emails when only the email code is run, so I think the issue is with the following part:
results <- get_everything(query = q, page = 1, page_size = 2, language = "en", sort_by = "popularity", from = Yest, to = Today)
I am unable to find what is the issue here. Can someone please help me with this?
My code is:
library(readxl)
library(float)
library(tibble)
library(stringr)
library(data.table)
library(gt)
library(tidyquant)
library(condformat)
library(xtable)
library(plyr)
library(dplyr)
library(newsanchor)
library(blastula)
Today <- Sys.Date()
Yest <- Sys.Date()-1
results <- get_everything(query = "Inflation", page = 1, page_size = 2,
                          language = "en", sort_by = "popularity",
                          from = Yest, to = Today,
                          api_key = Sys.getenv("NEWS_API_KEY"))
OP <- results$results_df
OP <- OP[-c(1, 5:9)]
colnames(OP) <- c("News Title", "Description", "URL")
W <- print(xtable(OP), type="html", print.results=FALSE, align = "l")
email1 <-
compose_email(
body = md(
c("<tr>", "<td>", "<table>", "<tr>", "<td>", "<b>", "Losers News", "</b>", W,
"</td>", "</tr>", "</table>","</td>", "<td>")
)
)
email1 %>%
smtp_send(
from = "abc@domain.com",
to = "pqr@domain.com",
subject = "Hello",
credentials = creds_key(
"XYZ"
)
)
Whenever you schedule jobs, consider using a command-line shell such as PowerShell or Bash to handle the automation steps and to capture and log errors and messages. Rscript fails on the second machine for some unknown reason, which you cannot determine because Task Scheduler does not surface any console error messages.
Therefore, consider using PowerShell to run all needed Rscript.exe calls and other commands, capturing all errors to a date-stamped log file. The script below redirects all console output to a .log file with messages. When the Rscript command fails, the log will show the error and any other console output (e.g., head, tail) below it. Check the logs regularly after scheduled jobs run.
PowerShell script (save as .ps1 file)
cd "C:\path\to\scripts"
& {
echo "`nAutomation Start: $(Get-Date -format 'u')"
echo "`nSTEP 1: myscript.R - $(Get-Date -format 'u')"
Rscript myscript.R
# ... ADD ANY OTHER COMMANDS ...
echo "`nAutomation End: $(Get-Date -format 'u')"
} 3>&1 2>&1 > "C:\path\to\logs\automation_run_$(Get-Date -format 'yyyyMMdd').log"
Command Line (to be used in Task Scheduler)
Powershell.exe -executionpolicy remotesigned -File myscheduler.ps1
Note: either set the start-in directory in the Task Scheduler job settings to the folder where myscheduler.ps1 resides, or pass an absolute path in the -File argument.
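Beyond logging, one machine-specific difference worth checking (a hypothetical diagnostic, not part of the original answer) is whether the scheduled session on machine 2 actually sees NEWS_API_KEY: Task Scheduler runs under a user profile whose .Renviron and library paths may differ from the interactive session's.

```r
# Run this under Task Scheduler on both machines and compare the log output.
key_set <- nzchar(Sys.getenv("NEWS_API_KEY"))   # FALSE => key not visible
message("NEWS_API_KEY visible: ", key_set)
message("R_ENVIRON_USER: ", Sys.getenv("R_ENVIRON_USER"))
message("Library paths: ", paste(.libPaths(), collapse = "; "))
```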

RStudio fails when run through "Source as Local Job"

The code consists of two files:
caller.R:
a <- 1
source("s1.R", encoding = "UTF-8")
b <- fa()
s1.R:
fa <- function() {
a*2
}
This code runs smoothly when caller.R is sourced (Ctrl+Shift+S) in the RStudio IDE, giving the correct expected result b = 2.
However, when caller.R is sourced through "Source as Local Job...", it throws an error (in Portuguese) meaning that execution was interrupted because it was not able to find object 'a':
Error in fa() : objeto 'a' não encontrado
Calls: sourceWithProgress -> eval -> eval -> fa
Execução interrompida
I have tried all possible "Source as Local Job..." option combinations ("Run job with copy of global environment", etc.) without success.
What do I have to do to be able to run caller.R as a local job?
If you want to have fa available in the same environment as a, you can try using local = TRUE:
source("s1.R", encoding = "UTF-8", local = TRUE)
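The underlying scoping issue can be reproduced without any files (a minimal sketch in which a plain environment stands in for the local-job frame):

```r
# "Source as Local Job" evaluates caller.R in its own environment, not in
# globalenv(). source("s1.R") with the default local = FALSE still defines
# fa in globalenv(), so fa's lookup of the free variable a never sees the
# job-local a.
job <- new.env(parent = globalenv())     # stands in for the job frame
eval(quote(a <- 1), envir = job)         # caller.R: a <- 1

# default local = FALSE: fa is created in globalenv(), which has no a
eval(quote(fa <- function() a * 2), envir = globalenv())
r1 <- tryCatch(eval(quote(fa()), envir = job),
               error = function(e) "object 'a' not found")

# local = TRUE: fa is created next to a, so the lookup succeeds
eval(quote(fa <- function() a * 2), envir = job)
r2 <- eval(quote(fa()), envir = job)     # 2
```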

bookdown::publish_book - error when publishing R Markdown in RStudio

When I try to publish a book to bookdown by running the command:
bookdown::publish_book(render = "none", account="my_account", server="bookdown.org")
I get the following error:
Error in rsconnect::deploySite(siteDir = getwd(), siteName = name, account = account, :
index file with site entry not found in C:\Users\...\...
I have managed to connect to bookdown with the command rsconnect::connectUser(server = 'bookdown.org'),
and when I run rsconnect::accounts() I get a positive response:
name server
1 my_user bookdown.org
What could be causing this error? Thanks
In the end, I just used rsconnect directly instead:
library(rmarkdown)
library(rsconnect)
connectUser(account = "my_user", server = "bookdown.org", quiet = TRUE)
# render the document
render("script.Rmd")
deployApp(appFiles = "script.html")
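For completeness, the original error usually points at the working directory rather than the account setup: rsconnect::deploySite() looks for the book's index file in getwd(). A plausible pre-flight check (an assumption, not confirmed in the thread) before calling publish_book():

```r
# publish_book() deploys the current working directory, which must be the
# book's root and contain index.Rmd whose YAML header declares the site
# generator, e.g.:  site: bookdown::bookdown_site
has_index <- file.exists("index.Rmd")
has_site  <- has_index && any(grepl("^site:", readLines("index.Rmd")))
c(has_index = has_index, has_site_entry = has_site)   # both should be TRUE
```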

Passing extra arguments to devtools::build

Something seems to have changed in the devtools package, so that the following commands, which used to run, now give an error I can't decipher:
> Sys.setenv(R_GSCMD="C:/Program Files/gs/gs9.21/bin/gswin64c.exe")
> devtools::build(args = c('--resave-data','--compact-vignettes="gs+qpdf"'))
The filename, directory name, or volume label syntax is incorrect.
Error in (function (command = NULL, args = character(), error_on_status = TRUE, :
System command error
I've tried other alternatives with other devtools commands, like passing just a single argument, but I still get the same error:
args = '--compact-vignettes="gs+qpdf"'
devtools::check_win_devel(args=args)
I'm using devtools 2.2.0, under R 3.5.2
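A plausible culprit (an assumption: devtools 2.x shells out through the processx package, which passes arguments to R CMD build directly rather than through a shell) is the embedded quoting: the inner double quotes in '--compact-vignettes="gs+qpdf"' are no longer stripped by a shell and reach R CMD build literally, which then mangles the value. Dropping the inner quotes is the usual workaround:

```r
# With no shell in between, the quote characters stay inside the argument:
arg <- '--compact-vignettes="gs+qpdf"'
sub("^--compact-vignettes=", "", arg)    # value still carries literal quotes

# so pass the value unquoted (workaround, not verified against this setup):
# devtools::build(args = c('--resave-data', '--compact-vignettes=gs+qpdf'))
```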

Differences in calling `system()` from within RStudio or via Rscript?

I am trying to run external tools from the MEME suite; one of these tools (jaspar2meme) produces a text file that is then used as input for a second tool (fimo). Here is my code:
#!/usr/bin/Rscript
com1 <- "meme/bin/jaspar2meme"
arg1 <- "-bundle jaspar_plant_2014.pfm"
message("jaspar2meme command: ", com1, arg1)
system2(command = com1, args = arg1, stdout = "motif.fimo", wait = T)
com2 <- paste0("meme/bin/fimo")
arg2 <- paste0("--text --oc . --verbosity 1 --thresh 1.0E-4 --bgfile bg.fimo motif.fimo Genes_up_h16.ame")
message("FIMO command: ", com2, arg2)
system2(command = com2, args = arg2, stdout = "fimoresult.txt", wait = T)
When I run this code from within RStudio (via source), it works perfectly: the file motif.fimo is produced by jaspar2meme and used by fimo to produce the resulting file fimoresult.txt.
When I run the same script via Rscript from the shell (bash), motif.fimo is also produced as expected, but it is not picked up by fimo, and fimoresult.txt remains empty.
What I have tried so far: using either system() or system2(), with and without the wait = T option, and specifying the full path to motif.fimo, but without success.
I finally got it... The locale variables were different in RStudio and Rscript. The motif.fimo file produced by jaspar2meme looked the same in both cases but apparently was not. Changing the first call to system2() to:
system2(command = com1, args = arg1, stdout = "motif.fimo", wait = T, env = c("LC_NUMERIC=C"))
solved my problem.
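A quick way to spot such differences is to print the locale and the relevant environment variables from both launchers and compare the output line by line (a generic diagnostic, not from the original post):

```r
# Run once from RStudio's console and once via `Rscript locale_check.R`
# (hypothetical file name), then diff the two outputs.
cat("LC_NUMERIC locale:", Sys.getlocale("LC_NUMERIC"), "\n")
print(Sys.getenv(c("LANG", "LC_ALL", "LC_NUMERIC")))
```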
