bookdown::publish_book - error when publishing R Markdown in RStudio

When I try to publish a book to bookdown.org by running the command:
bookdown::publish_book(render = "none", account="my_account", server="bookdown.org")
I get the following error:
Error in rsconnect::deploySite(siteDir = getwd(), siteName = name, account = account, :
index file with site entry not found in C:\Users\...\...
I have managed to connect to bookdown.org with rsconnect::connectUser(server = 'bookdown.org'), and when I run rsconnect::accounts() I get the expected response:
name server
1 my_user bookdown.org
What could be causing this error? Thanks
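From the error text, rsconnect::deploySite() appears to be looking in the working directory for an index file whose YAML header declares the site generator. A minimal check you could run from the book's root directory (a sketch, assuming the conventional index.Rmd layout):
# publish_book() is called from the book's root directory; that directory
# should contain an index.Rmd whose YAML header has a line like
# `site: bookdown::bookdown_site`.
file.exists("index.Rmd")
any(grepl("bookdown::bookdown_site", readLines("index.Rmd")))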

In the end, I just used rsconnect directly instead:
library(rmarkdown)
library(rsconnect)
connectUser(account = "my_user", server = "bookdown.org", quiet = TRUE)
# render the R Markdown document to HTML
render("script.Rmd")
deployApp(appFiles = "script.html")
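A possibly simpler one-step alternative for a single document (a sketch, assuming script.Rmd renders to standalone HTML) is rsconnect::deployDoc(), which renders and uploads in one call:
# Render and publish the single document in one step (hedged alternative):
rsconnect::deployDoc("script.Rmd", account = "my_user", server = "bookdown.org")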

Related

DataSpell Remote R Console Failure

First of all, I use a remote R interpreter.
When I unselect "Disable .Rprofile execution on console start" in the DataSpell settings and save, the IDE throws a strange error when I try to start an R console, as below:
CompositeException (3 nested):
------------------------------
[1]: Cannot cast org.jetbrains.plugins.notebooks.jupyter.ui.remote.JupyterRemoteTreeModelServiceVfsListener to org.jetbrains.plugins.notebooks.jupyter.remote.vfs.JupyterVFileEvent$Listener
[2]: Cannot cast org.jetbrains.plugins.notebooks.jupyter.remote.modules.JupyterRemoteEphemeralModuleManagerVfsListener to org.jetbrains.plugins.notebooks.jupyter.remote.vfs.JupyterVFileEvent$Listener
[3]: Cannot cast org.jetbrains.plugins.notebooks.jupyter.ui.remote.JupyterRemoteVfsListener to org.jetbrains.plugins.notebooks.jupyter.remote.vfs.JupyterVFileEvent$Listener
------------------------------
I tried using an empty .Rprofile file, but nothing changed; it throws the same error. Anyway, here is my .Rprofile:
options(java.parameters = "-Xmx4G")
options(download.file.method = "wget")
project_base <- getwd()
print(paste("getwd:", getwd()))
Sys.setenv(R_PACKRAT_CACHE_DIR = "~/.rcache")
#### -- Packrat Autoloader (version 0.7.0) -- ####
source("packrat/init.R")
#### -- End Packrat Autoloader -- ####
# These ensure that the project uses its private library
p <- .libPaths()[[1]]
Sys.setenv(R_LIBS_SITE = p)
Sys.setenv(R_LIBS_USER = p)
Sys.setenv(R_PACKRAT_DEFAULT_LIBPATHS = p)
packrat::set_opts(use.cache = TRUE)
print(paste("whoami:", system("whoami", intern = TRUE)))
print(paste("libpaths:", .libPaths()))
print(paste0("cache_path: ", packrat:::cacheLibDir()))
restore_packrat <- function(restart = FALSE) {
  packrat::restore(
    overwrite.dirty = TRUE, prompt = FALSE,
    restart = restart, dry.run = FALSE
  )
}
snapshot_packrat <- function() {
  packrat::snapshot(
    ignore.stale = TRUE, snapshot.sources = FALSE,
    infer.dependencies = FALSE
  )
}
I would appreciate help from anyone who has faced this issue and solved it.
PS: I have also filed a bug report with the developers. If you have the same problem, please upvote the issue and this question.
https://youtrack.jetbrains.com/issue/R-1393

R targets and dataRetrieval return a connection error

I am attempting to use a targets workflow in my R project to download water quality data with the dataRetrieval package. In a fresh R session this works:
dataRetrieval::readWQPdata(siteid="USGS-04024315",characteristicName="pH")
To use this in targets, I have the following _targets.R file:
library(targets)
tar_option_set(packages = c("dataRetrieval"))
list(
  tar_target(
    name = wqp_data,
    command = readWQPdata(siteid = "USGS-04024315", characteristicName = "pH"),
    format = "feather",
    cue = tar_cue(mode = "never")
  )
)
When I run tar_make(), the following is returned:
* start target wqp_data
No internet connection.
The following url returned no data:
https://www.waterqualitydata.us/data/Result/search?siteid=USGS-04024315&characteristicName=pH&zip=yes&mimeType=tsv
x error target wqp_data
* end pipeline
Error : attempt to set an attribute on NULL
Error: callr subprocess failed: attempt to set an attribute on NULL
Visit https://books.ropensci.org/targets/debugging.html for debugging advice.
Run `rlang::last_error()` to see where the error occurred.
I have attempted debugging with tar_option_set(debug = "wqp_data") and tar_option_set(workspace_on_error = TRUE), but beyond isolating the error to readWQPdata() I didn't get anywhere.
I also had success using curl directly in targets, so I do not think it is my actual internet connection:
list(
  tar_target(
    name = wqp_data,
    command = {
      con <- curl::curl("https://httpbin.org/get")
      readLines(con)
      close(con)
    }
  )
)
tar_make()
* start target wqp_data
* built target wqp_data
* end pipeline
Any advice on how to diagnose the connection issue when using these two packages?
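One way to narrow this down (a sketch, not a guaranteed fix): run the pipeline in the current R session instead of a callr subprocess so the readWQPdata() error surfaces where you can debug it, and separately ask a fresh subprocess whether it can reach the internet at all, since the "No internet connection." message suggests a connectivity check is failing inside the subprocess:
library(targets)
# Run the pipeline in the current session rather than a callr subprocess,
# so the error from readWQPdata() is raised where it can be inspected.
tar_make(callr_function = NULL)

# Check what a fresh callr subprocess sees; FALSE here would point to the
# subprocess environment (e.g. proxy settings) rather than targets itself.
callr::r(function() curl::has_internet())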

Using RCurl, how to download the "clone from github" zip file?

If I want to download a repository clone as a zip, the URL does a redirect.
zip.url = "https://github.com/MonteShaffer/humanVerse/archive/refs/heads/main.zip"
redirects to:
<html><body>You are being redirected.</body></html>
I am trying to use the RCurl library:
require(RCurl)
curl.fun = basicTextGatherer()
curl.ch = getCurlHandle()
x = getBinaryURL(zip.url, curl = curl.ch, headerfunction = curl.fun$update)
On Windows 10, it throws this error:
Error in function (type, msg, asError = TRUE) :
error:1407742E:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version
I am assuming GitHub is doing multiple redirects. I want to download the file as a binary zip.
You have to set the curl option followlocation to TRUE, like this:
binary_blob <- RCurl::getBinaryURL(zip.url, .opts = list(followlocation = TRUE))
It might be easier to download the file instead, using one of the following two options:
utils::download.file() comes with R and works for this.
zip.url <- "https://github.com/MonteShaffer/humanVerse/archive/refs/heads/main.zip"
download.file(zip.url, "main.zip")
The curl package has curl_download().
library(curl)
curl::curl_download(zip.url, "main2.zip")

Unable to install github package with devtools::install_github()

When I try to install a package from GitHub, I get the following error:
devtools::install_github("<DeveloperName>/<PackageName>")
Error: HTTP error 404.
Not Found
Rate limit remaining: 24/60
Rate limit reset at: 2019-01-07 21:36:52 UTC
This doesn't make sense to me, as the "<DeveloperName>/<PackageName>" path is entered correctly and takes me to the repo when appended to github.com in my browser. (Apologies that the repo is private to my company, or I would share the actual developer and repo names.)
My traceback reads:
8.stop(github_error(res))
7.github_commit(username = remote$username, repo = remote$repo, host = remote$host, ref = remote$ref, pat = remote$auth_token %||% github_pat(), use_curl = use_curl)
6.remote_sha.github_remote(remote, local_sha)
5.remote_sha(remote, local_sha)
4.FUN(X[[i]], ...)
3.vapply(remotes, install_remote, ..., FUN.VALUE = character(1))
2.install_remotes(remotes, auth_token = auth_token, host = host, dependencies = dependencies, upgrade = upgrade, force = force, quiet = quiet, build = build, build_opts = build_opts, repos = repos, type = type, ...)
1.devtools::install_github("<DeveloperName>/<PackageName>")
I cannot find documentation explaining the final traceback error, "stop(github_error(res))". I have updated to the latest version of devtools, but the problem persists.
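Because the repo is private, one plausible cause (hedged) is missing authentication: the traceback shows install_github() falling back to github_pat() when no auth_token is supplied, and GitHub answers 404 rather than 403 for private repositories the request is not authorized to see. A minimal sketch, assuming you have a personal access token with repo scope (the token value and repo name are placeholders):
# Make the token visible to devtools (placeholder value):
Sys.setenv(GITHUB_PAT = "<your-personal-access-token>")
devtools::install_github("<DeveloperName>/<PackageName>")

# Or pass it explicitly:
devtools::install_github("<DeveloperName>/<PackageName>",
                         auth_token = Sys.getenv("GITHUB_PAT"))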

setup_twitter_oauth, searchTwitter and Rscript

I run the following script using an installation of RStudio on a Linux server.
require(twitteR)
require(plyr)
setup_twitter_oauth(consumer_key='xxx', consumer_secret='xxx',
access_token='xxx', access_secret='xxx')
searchResults <- searchTwitter("#vds", n=15000, since = as.character(Sys.Date()-1), until = as.character(Sys.Date()))
head(searchResults)
tweetsDf = ldply(searchResults, function(t) t$toDataFrame())
write.csv(tweetsDf, file = paste("tweets_vds_", Sys.Date(), ".csv", sep = ""))
The script works fine when I run it from the user interface.
However, when I run it automatically from the terminal using crontab, I get the following error message:
[1] "Using direct authentication"
Error in twInterfaceObj$getMaxResults :
could not find function "loadMethod"
Calls: searchTwitter -> doRppAPICall -> $
Execution halted
Why?
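A hedged guess at the cause: when R runs non-interactively (for example Rscript started from cron), the methods package is not necessarily attached, and loadMethod() lives in methods; that would explain why the script works in the RStudio session but fails under crontab. Attaching it explicitly at the top of the script may help:
# Attach methods explicitly so S4 dispatch (loadMethod) is available
# when the script is run via Rscript from cron:
library(methods)
require(twitteR)
require(plyr)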
