Error in par(bg = "seashell") : could not find function "par"

I've put par(bg = "seashell") in the "Rprofile.site" file and got this message:
Error in par(bg = "seashell") : could not find function "par".
Then I put library(graphics) before it and that error stopped.
But now, RStudio opens extra graphics windows at startup.
How can I set par(bg) at startup?
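One way to avoid both problems (a sketch, not a definitive fix): par() needs the graphics package and an open graphics device, so calling it directly in Rprofile.site either fails or opens a window. Deferring the call with a "plot.new" hook means it only runs once a plot is actually drawn:

# Sketch for Rprofile.site: defer par() until a device exists, so no window
# opens at startup. Note that bg set from the "plot.new" hook is applied
# from the next frame onward, not necessarily the very first plot.
setHook("plot.new",
        function() graphics::par(bg = "seashell"),
        action = "append")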

Related

Error when trying to parse an HTTP request in R

I'm using the R package httr to get an HTTP response for a specific link.
When trying to parse the content of the response, I get the following error (my session reports errors in German; translated here):
Error in parse(text = script_content) : <text>:1:10: unexpected '['
1: {"lines":[
It seems there is a problem with the format/encoding of the text. Here is my code:
script <- GET(
  url = "https://my_url.which_origin_is_not_important/my_script.R",
  authenticate(username, pass)
)
script_content <- content(script, as = "text", encoding = "ISO-8859-1")
parsed_content <- parse(text = script_content)
The value of script_content looks like this:
"{\"lines\":[{\"text\":\"################## FUNCTION ##################\"},{\"text\":\"\"},{\"text\":\"library(log4r)\"}],\"start\":0,\"size\":32,\"isLastPage\":true,\"limit\":500,\"nextPageStart\":null}"
Some more background: I'm trying to source a script that currently sits in a private repository. I wrote the script myself and made sure the issue is not coming from within the code itself.
I took this approach from: Sourcing R files in a private github folder
Thanks for any advice!!
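The value of script_content suggests the URL is returning the repository API's JSON "browse" payload rather than the raw file, which is why parse() stops at the leading {"lines":[. A sketch (assuming that payload shape, and that the whole file fits in one page) that rebuilds the script text with jsonlite before parsing:

library(httr)
library(jsonlite)

script <- GET(
  url = "https://my_url.which_origin_is_not_important/my_script.R",
  authenticate(username, pass)
)
script_content <- content(script, as = "text", encoding = "ISO-8859-1")

payload <- fromJSON(script_content)                  # list with $lines$text, $isLastPage, ...
script_text <- paste(payload$lines$text, collapse = "\n")
parsed_content <- parse(text = script_text)
eval(parsed_content)                                 # equivalent to sourcing the file

If isLastPage is FALSE, further pages would have to be requested and concatenated before parsing.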

R targets and dataRetrieval return a connection error

I am using a targets workflow in my R project to download water quality data with the dataRetrieval package. In a fresh R session this works:
dataRetrieval::readWQPdata(siteid = "USGS-04024315", characteristicName = "pH")
To use this in targets, I have the following _targets.R file:
library(targets)
tar_option_set(packages = c("dataRetrieval"))
list(
  tar_target(
    name = wqp_data,
    command = readWQPdata(siteid = "USGS-04024315", characteristicName = "pH"),
    format = "feather",
    cue = tar_cue(mode = "never")
  )
)
When I run tar_make(), the following is returned:
* start target wqp_data
No internet connection.
The following url returned no data:
https://www.waterqualitydata.us/data/Result/search?siteid=USGS-04024315&characteristicName=pH&zip=yes&mimeType=tsv
x error target wqp_data
* end pipeline
Error : attempt to set an attribute on NULL
Error: callr subprocess failed: attempt to set an attribute on NULL
Visit https://books.ropensci.org/targets/debugging.html for debugging advice.
Run `rlang::last_error()` to see where the error occurred.
I have attempted debugging with tar_option_set(debug = "wqp_data") and tar_option_set(workspace_on_error = TRUE), but beyond isolating the error to readWQPdata() I didn't get anywhere.
I also had success using curl directly in targets, so I do not think my actual internet connection is the problem:
list(
  tar_target(
    name = wqp_data,
    command = {
      con <- curl::curl("https://httpbin.org/get")
      readLines(con)
      close(con)
    }
  )
)
tar_make()
* start target wqp_data
* built target wqp_data
* end pipeline
Any advice on how to diagnose the connection issue when using these two packages?
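One way to narrow it down (a diagnostic sketch, not a fix): targets runs the pipeline in a fresh callr subprocess, so try reproducing the call in one directly. If the subprocess also reports "No internet connection" while the same call works interactively, the likely culprit is environment the subprocess does not inherit (for example proxy settings or curl options) rather than targets itself.

# Run the same call in a clean callr subprocess, the way targets does.
callr::r(
  function() {
    dataRetrieval::readWQPdata(siteid = "USGS-04024315",
                               characteristicName = "pH")
  },
  show = TRUE
)

# For comparison, check the underlying endpoint interactively.
httr::status_code(httr::GET(
  "https://www.waterqualitydata.us/data/Result/search?siteid=USGS-04024315&characteristicName=pH&zip=yes&mimeType=tsv"
))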

Passing extra arguments to devtools::build

Something seems to have changed in the devtools package, so that the following commands, which used to run, now give an error I can't decipher:
> Sys.setenv(R_GSCMD="C:/Program Files/gs/gs9.21/bin/gswin64c.exe")
> devtools::build(args = c('--resave-data','--compact-vignettes="gs+qpdf"'))
The filename, directory name, or volume label syntax is incorrect.
Error in (function (command = NULL, args = character(), error_on_status = TRUE, :
System command error
I've tried other alternatives with other devtools commands, like passing just a single argument, but I still get the same error:
args = '--compact-vignettes="gs+qpdf"'
devtools::check_win_devel(args=args)
I'm using devtools 2.2.0 under R 3.5.2.
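One thing worth ruling out (a guess, not a confirmed diagnosis): recent devtools versions hand args to R CMD build through a subprocess rather than a shell, so shell-style quoting inside the string is passed through literally, and a value containing embedded quotes like "gs+qpdf" can produce exactly this kind of Windows path error. A sketch with the inner quotes removed:

Sys.setenv(R_GSCMD = "C:/Program Files/gs/gs9.21/bin/gswin64c.exe")
devtools::build(args = c("--resave-data", "--compact-vignettes=gs+qpdf"))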

Not able to execute all text files from one folder with an R script

This is an R script for array quality metrics. The first step goes well, but after the second step an error occurs.
library(arrayQualityMetrics)
library(limma)
library(tcltk)
X <- tk_choose.files(caption = "Choose X")
maData <- read.maimages(X, source = "agilent", other.columns = "g", green.only = TRUE)
eSet <- new("ExpressionSet", exprs = maData$other$g, annotation = maData$genes[, 7])
arrayQualityMetrics(eSet, outdir = "QC_C", force = TRUE, do.logtransform = TRUE)
The program runs now, but it shows these warning messages:
The directory 'QC_C' has been created.
Warning messages:
1: In svgStyleAttributes(style) :
Removing non-SVG style attribute name(s): subscripts, group.number, group.value
2: In svgStyleAttributes(style) :
Removing non-SVG style attribute name(s): subscripts, group.number, group.value
Where am I going wrong? Is the error in the files, or is the R script wrong?
Give the full path of the folder in the path argument, as below:
scanFiles <- dir(path = '/path/to/folder/', pattern = "\\.txt$")
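A sketch of the full-folder variant, assuming all the Agilent .txt files sit in one directory: list them with full.names = TRUE so read.maimages receives complete paths instead of files picked interactively with tk_choose.files():

library(arrayQualityMetrics)
library(limma)

# List every .txt file in the folder, with its full path.
scanFiles <- dir(path = "/path/to/folder/", pattern = "\\.txt$", full.names = TRUE)

maData <- read.maimages(scanFiles, source = "agilent",
                        other.columns = "g", green.only = TRUE)
eSet <- new("ExpressionSet", exprs = maData$other$g,
            annotation = maData$genes[, 7])
arrayQualityMetrics(eSet, outdir = "QC_C", force = TRUE, do.logtransform = TRUE)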

setup_twitter_oauth, searchTwitter and Rscript

I run the following script using an installation of RStudio on a Linux server.
require(twitteR)
require(plyr)
setup_twitter_oauth(consumer_key = 'xxx', consumer_secret = 'xxx',
                    access_token = 'xxx', access_secret = 'xxx')
searchResults <- searchTwitter("#vds", n=15000, since = as.character(Sys.Date()-1), until = as.character(Sys.Date()))
head(searchResults)
tweetsDf = ldply(searchResults, function(t) t$toDataFrame())
write.csv(tweetsDf, file = paste("tweets_vds_", Sys.Date(), ".csv", sep = ""))
The script works fine when I run it from the RStudio user interface.
However, when I run it automatically from the terminal via crontab, I get the following error message:
[1] "Using direct authentication"
Error in twInterfaceObj$getMaxResults :
could not find function "loadMethod"
Calls: searchTwitter -> doRppAPICall -> $
Execution halted
Why?
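One likely explanation (hedged, since cron environments vary): loadMethod() comes from the methods package, which is attached by default in an interactive R or RStudio session but not always when R is started non-interactively, for example via Rscript from cron. Attaching it explicitly at the top of the script is a common workaround:

# Attach methods explicitly so S4 dispatch (loadMethod) is available under cron.
library(methods)
require(twitteR)
require(plyr)
# ... rest of the script unchanged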
