I am simply using:
extract_tables('/Users/ben/OneDrive/Utah Local Governments Trust/Underwriting - Documents/Data Analysis/Emod Calculation/Expected Loss Rates, D-Ratio, Etc.pdf')
after loading:
library(tabulizer)
library(tabulizerjars)
It says it needs Java SE 6 to open RStudio, but I can open RStudio just fine with the latest Java before running the function.
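On macOS that "Java SE 6" prompt often means rJava was linked against the old Apple legacy runtime rather than the Java you launch RStudio with. A hedged first diagnostic (assuming rJava is installed) is to check which Java the R session actually initializes:

```r
# Check which Java rJava/tabulizer will really use. If this prints an
# old version, point JAVA_HOME at a current JDK, run `R CMD javareconf`
# from a terminal, and restart R.
Sys.getenv("JAVA_HOME")
library(rJava)
.jinit()
J("java.lang.System")$getProperty("java.version")
```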
Has anyone run into the R haven package call generating a "recursive gc invocation" error in a shared environment?
Here is what I am running.
RStudio Ver. 2021.09.2 Build 382
R Ver. 4.1.2
Rtools Ver. 4.0
haven package release 2.4.3
The code works on my local install. I have users accessing a shared instance to work with large data files, and I can't get it to run consistently in that environment. The essence of the call is:
library(haven)
SFF05 <- read_sas("file/path/name/filename.sas7bdat")
Users in this space often get this error off the library statement:
*******recursive gc invocation
Adding gc() to the start of the code will fix it temporarily, but eventually stops working.
My users also get this error from the read_sas() call:
Error in df_parse_sas_file(...)
They occasionally give us other recursive errors or namespace errors. It’s not very consistent, and sometimes it just hangs until R crashes and reloads.
These programs worked flawlessly on R 3.6.1 and RStudio 1.1.456 (July 19, 2018). Also, everything works fine in the R 4.1.2 console, so it appears to be an issue with the combination of R 4.1.2 and RStudio 2021.09.2 Build 382 in a shared environment.
This could be related to a bug in your RStudio version; e.g., see:
https://github.com/rstudio/rstudio/issues/9868
https://github.com/rstudio/rstudio/issues/10565
https://github.com/rstudio/rstudio/issues/10040
It seems to be fixed in the latest RStudio release, which should arrive any day now:
https://github.com/rstudio/rstudio/milestone/20?closed=1
A temporary fix might be to set session-handle-offline-timeout-ms=0 in your rsession.conf. See: https://github.com/rstudio/rstudio/issues/10565#issuecomment-1035517692
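For reference, the workaround from that comment is a one-line addition to rsession.conf (path assumed to be the default /etc/rstudio/rsession.conf on Linux; restart rstudio-server afterwards):

```
# /etc/rstudio/rsession.conf
session-handle-offline-timeout-ms=0
```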
I used RStudio to submit a job to cloudml (AI Platform) a few months ago, and it was successful.
Today I tried to use an AI Platform notebook to submit the same job, but I get:
"ERROR: (gcloud.ai-platform.jobs.submit.training) INVALID_ARGUMENT: Field: runtime_version Error: The specified runtime version '1.9' with the Python version ''"
I even ran which python in the terminal and then, in the R environment:
library(reticulate)
use_python("result of the which python")
I tried running R from the terminal as well and got the same error.
I don't know if it helps or not but the previous run and this one were in different regions.
us-central was successful
australia-southeast1 was getting this error.
This error occurs because, as of March 16, 2020, you can no longer create training jobs that use runtime version 1.9. Try submitting the job with version 1.15, which is the only TensorFlow 1.x version currently supported for training jobs. You may still see errors, though, due to incompatibilities in the code.
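A minimal resubmission sketch from R, assuming the gcloud CLI is on the PATH; the job name, bucket, and trainer module/path are placeholders, not values from the original question:

```r
# Build the gcloud arguments with a supported runtime version.
# Runtime 1.15 pairs with Python 3.7 on AI Platform training.
args <- c("ai-platform", "jobs", "submit", "training", "my_job_v2",
          "--region=australia-southeast1",
          "--runtime-version=1.15",   # 1.9 is no longer accepted
          "--python-version=3.7",
          "--job-dir=gs://my-bucket/jobs/my_job_v2",
          "--module-name=trainer.task",
          "--package-path=trainer/")
# system2("gcloud", args)   # uncomment to actually submit
cat("gcloud", args, "\n")   # preview the command first
```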
I am using Windows 10 Pro and downloaded Microsoft R Open to run in RStudio. Microsoft R Open has built-in multithreaded processing (via Intel MKL) that I want to try out.
When I run the command setMKLthreads(2), R shows a fatal error and needs to restart.
When I run getMKLthreads(), R runs fine:
getMKLthreads()
[1] 8
R only crashes when I try to change the number of threads.
Please advise.
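One hedged workaround, assuming MKL honors its standard environment variable at startup: set the thread count before R launches instead of calling setMKLthreads() mid-session.

```r
# Put this line in ~/.Renviron (read before MKL initializes),
# then restart R:
#   MKL_NUM_THREADS=2
# After the restart, confirm the variable was picked up:
Sys.getenv("MKL_NUM_THREADS")
# getMKLthreads()   # should report 2 if MKL honored the variable
```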
I’m trying to run some modeling (random forest, using caret) in RStudio Server 1.1.423 (with R version 3.4.4, running on an Ubuntu 16.04 server), and it comes back with the following error:
Error: protect(): protection stack overflow
This error doesn't come up if I run the same analysis in an interactive R session. I seem to recall that in the past (in RStudio Server running an older version of R) I was able to resolve this error by issuing memory.limit(500000) in an interactive RStudio Server session, but these days this comes back with:
> memory.limit(500000)
[1] Inf
Warning message:
'memory.limit()' is Windows-specific
A solution that works and that I use routinely is to run my analysis from a script, like Rscript --max-ppsize=500000 --vanilla /location/of/the/script.R, but that’s not what I want to do, as in this particular case I need to run the analysis interactively.
I’ve also tried adding R_MAX_VSIZE=500000 at the end of my ~/.profile, setting rsession-memory-limit-mb=500000 in /etc/rstudio/rserver.conf, putting options(expressions = 5e5) in my ~/.Rprofile, and running options(expressions = 5e5) in an interactive RStudio Server session. No luck so far; the “protect()” error keeps popping up.
Any ideas as to how to remove this limit in RStudio Server?
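For what it's worth, the limits involved can be inspected from the running session. Note that the PROTECT() stack (the thing actually overflowing here) is a separate knob from the heap and expression limits, which may be why the options() and environment-variable attempts above don't reach it:

```r
# Inspect the limits of the current session:
Cstack_info()             # C stack size and current usage
getOption("expressions")  # evaluation depth limit (default 5000)
# The pointer-protection stack has no option() or environment
# variable; it can only be raised at startup, e.g.
#   R --max-ppsize=500000
```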
My R script worked fine in RStudio (Version 0.98.1091) on Windows 7. Then I restarted my laptop, entered again in RStudio and now it provides the following error messages each time I want to execute my code:
cl <- makeCluster(mc); # build the cluster
Error: could not find function "makeCluster"
> registerDoParallel(cl)
Error: could not find function "registerDoParallel"
> fileIdndexes <- gsub("\\.[^.]*","",basename(SF))
Error in basename(SF) : object 'SF' not found
These error messages are slightly different each time I run the code. It seems that RStudio cannot find any function that is used in the code.
I restarted R Session, cleaned Workspace, restarted RStudio. Nothing helps.
It should be noted that after many attempts to execute the code, it finally ran. However, after 100 iterations, it crashed with a message about localhost being unavailable.
Add library(<package where the function is defined>) for each of the packages you're using.
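A sketch matching the snippet in the question: makeCluster() lives in the parallel package and registerDoParallel() in doParallel, so both must be attached (or called with ::) before use.

```r
library(parallel)     # provides makeCluster(), stopCluster()
library(doParallel)   # provides registerDoParallel()

mc <- 2               # number of workers (placeholder value)
cl <- makeCluster(mc)
registerDoParallel(cl)
# ... run the parallel code here ...
stopCluster(cl)
```

Running the script top to bottom after a fresh restart (rather than relying on packages loaded in an earlier session) avoids the "could not find function" errors reappearing.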