My R Shiny app was running fine in RStudio Server, but today when I ran it I got an error:
Error in checkShinyVersion() : Please upgrade the 'shiny' package
to (at least) version 1.1
My app.R script has .libPaths(c('/home/jack/R/SLE_shinyApp/3.5')) at the beginning, because I intended to set up a dedicated library path for this app so that its packages wouldn't be altered by other R scripts.
So here are all the paths; the second one is the system default R library path, which I have no control over.
> .libPaths()
[1] "/home/jack/R/SLE_shinyApp/3.5" "/app/x86_64/R/v3.5.0/lib64/R/library"
It turns out that the htmlwidgets package, a dependency of shiny, is located only in the second path and had been updated by the system admin since my last run, which caused the error.
Here are my solutions and questions:
Make R search ONLY /home/jack/R/SLE_shinyApp/3.5, so I can install all needed packages and dependencies there. But how do I make R search ONLY that path in app.R?
Update the shiny package in /home/jack/R/SLE_shinyApp/3.5. If I run app.R line by line, it uses shiny v1.1 from /home/jack/R/SLE_shinyApp/3.5 and runs fine. But if I launch it with the "Run App" button, it uses the old shiny from the second path and still gives the error, even with library(shiny, lib.loc = .libPaths()[1]) placed right after .libPaths(c('/home/e0380702/R/SLE_shinyApp/3.5')).
Can anyone help?
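For option 1, a sketch of a partial answer: .libPaths() always re-appends the system library (.Library), so the second path cannot be fully excluded, but you can put the app library first and verify which copy of shiny will actually be picked up before loading it. find.package() and packageVersion() are standard base/utils functions; the path is the one from the question.

```r
# Put the app-specific library first in the search path.
.libPaths(c('/home/jack/R/SLE_shinyApp/3.5', .libPaths()))

# Check which copy of shiny R will load, and its version, before library().
find.package("shiny")     # should point into the app-specific library
packageVersion("shiny")   # should be >= 1.1
```

Because the system library can never be removed from the search list, installing the required shiny version into the first path (and loading it explicitly) is more reliable than trying to hide the system copy.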
About six weeks ago (early April 2022), I tested a Databricks workflow to make sure I could trigger jobs on Databricks remotely from Airflow, which was successful.
As part of the process, the workflow activates a pre-built compute and then loads various R libraries from DBFS onto it. One of the packages is 'arrow'; while all the other packages load without issue, this package fails to load and causes my workflow to crash.
When I look into the workflow I get the following error: 'DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: Command to install library [RCranPkgId(arrow,None,None)] on [0303-130414-840hkwxf] orgId [5132544506122561] failed inside Databricks infrastructure'. I have triggered the workflow directly inside Databricks and still get the same problem, so it clearly does not have an Airflow-related cause.
I tried deleting the arrow package from DBFS to see if I could run the test without it, but every time I delete it, it returns when I retry the workflow.
I then checked CRAN to see whether arrow had been updated recently; it had been, on 2022-05-09, so I loaded an older version instead (having first deleted everything relating to it from DBFS). This didn't work either; see the images attached.
# Databricks notebook source
.libPaths()
# COMMAND ----------
dir("/databricks/spark/R/lib")
# COMMAND ----------
## Add current working directory to library paths
.libPaths(c(getwd(), .libPaths()))
# COMMAND ----------
## The latest versions from CRAN
install.packages(c('arrow', 'tidyverse', 'aws.s3', 'sparklyr', 'cluster', 'sqldf', 'lubridate', 'ChannelAttribution'), repos = "http://cran.us.r-project.org")
# COMMAND ----------
dir("/tmp/Rserv/conn970")
# COMMAND ----------
## Copy from driver to site-library
system("cp -R /tmp/Rserv/conn970 /usr/lib/R/site-library")
# COMMAND ----------
dir("/usr/lib/R/site-library")
# COMMAND ----------
## Copy from driver to DBFS
system("cp -R /tmp/Rserv/conn970 /dbfs/r-libraries")
# COMMAND ----------
dir("/dbfs/r-libraries")
# COMMAND ----------
# Add packages to libPaths
.libPaths("/dbfs/r-libraries")
# COMMAND ----------
# Check that the dbfs libraries are in libPath
.libPaths()
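After copying the libraries, a quick sanity check along these lines (a sketch; the package names are a subset of those installed above) can confirm that each package actually resolves from the DBFS path and loads, so a broken package such as arrow fails fast with a readable error:

```r
# Put the DBFS library first, then check where each package resolves
# from and whether it loads. A failure here points at the broken package.
.libPaths(c("/dbfs/r-libraries", .libPaths()))

for (p in c("arrow", "tidyverse", "sparklyr")) {
  cat(p, "->", find.package(p), "\n")
  library(p, character.only = TRUE)
}
```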
I have also attached (above) the R script I use to load the packages to DBFS. I think it is fine, as every other package loads properly, but it may help in understanding what I am doing or why the error occurs.
What I'd like to know is:
Do you see anything that I might be doing incorrectly inside my attached libraries script?
Is there an issue with loading the arrow package, and if so, do you have a workaround that can prevent the failure?
Why does DBFS continue to re-install arrow despite my removal of it from the directory?
Can I permanently remove the arrow package from DBFS without it returning every time I trigger the workflow?
Many thanks in advance for any help you guys can offer.
I figured it out; the issue had two aspects. The first was plain to see once I looked at the JSON file for the job: in it I had specified that certain packages should be loaded when building the compute, and this configuration overrode my model's script simply because it ran first. Seeing this in the JSON file explained why the arrow library kept trying to load even though I had removed it from DBFS.
The second part of the solution was getting arrow to load without failing, which continued to happen when I tried to reinstall it on DBFS or independently in my model. By default Databricks seems to install packages from CRAN (https://cran.r-project.org/); I don't understand why that failed, perhaps an issue with Ubuntu after the last update.
The solution was to install it from an MRAN snapshot at 'https://cran.microsoft.com/snapshot/2022-02-24/'. Thank you user2554330, your comment got me rethinking and set me in the right direction.
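For reference, pinning the repository to the snapshot looks like this (a sketch; the snapshot date is the one that worked in my case):

```r
# Install arrow from a dated MRAN snapshot instead of live CRAN, so a
# later CRAN update cannot silently change what the cluster builds with.
install.packages(
  "arrow",
  repos = "https://cran.microsoft.com/snapshot/2022-02-24/"
)
```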
I hope that this helps someone else if they are having similar issues.
I've created a new package project with RStudio but when I open .R files the source pane remains empty.
The tab shows that the file name is being registered, but none of the script appears.
When I open the hello.R file in a 'normal' project the contents of hello.R appear as expected.
I have tried reinstalling both R and RStudio and checked that Windows 10 is up to date.
I'm using R 4.2.0, RStudio 2022.02.2 Build 485, and Rtools42 on Windows 10.
The same is true for a previous local package I developed, which I can no longer edit. The existing package works fine, but when I open its .R files none of the code is displayed.
I'm hoping someone has ideas on how to resolve this, or where to look to troubleshoot it.
My workaround is to use a previous version of R (4.1.3), in which case the scripts appear as expected in the source window.
Package project console
'Normal' project console
I am attempting to use the mread function to load a model (.cpp) file through R. However, when I run the script I get the following:
setwd("C:/Users/Gustavo/Documents/R/page-2018-mrgsolve-master/model")
getwd()
#> [1] "C:/Users/Gustavo/Documents/R/page-2018-mrgsolve-master/model"
library(mrgsolve)
mod <- mread("simple", "model")
#> Error: project directory 'model' must exist and be readable.
I am setting the working directory to "model" itself, so why isn't R able to read it? Any help would be appreciated, as I am still learning R and want to learn the mrgsolve package as well.
Additional info: R version 3.4.4, Rtools version 3.4.0, RStudio version 1.1.463.
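A likely explanation, sketched under the assumption that mread() resolves its project argument relative to the current working directory: the working directory here is already the 'model' folder, so passing "model" as the project makes mread look for model/model, which does not exist.

```r
library(mrgsolve)

# The working directory is already the 'model' folder, so point mread()
# at the current directory...
setwd("C:/Users/Gustavo/Documents/R/page-2018-mrgsolve-master/model")
mod <- mread("simple", ".")

# ...or set the working directory one level up and keep project = "model".
setwd("C:/Users/Gustavo/Documents/R/page-2018-mrgsolve-master")
mod <- mread("simple", "model")
```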
This is adapted from the email I sent to the colleagues who were assisting me with a similar issue:
To review, I was unable to open any files through RStudio because it returned error messages indicating that either the file itself or the working directory did not exist. I did multiple installations of different versions of R, RStudio, and Rtools in an attempt to resolve the issue. I also moved the files and programs of interest and changed the working directory to see if that made a difference. It turns out that when RStudio is first started on a computer, it creates a "hidden directory" that retains the program's initial settings. By deleting this folder, RStudio's state was wiped and I regained control of where files would be stored and read (more on this here: https://support.rstudio.com/hc/en-us/articles/200534577-Resetting-RStudio-Desktop-s-State). A combination of this and forcing Rtools to the front of the PATH also allowed me to resolve the 'status 127' errors I was receiving.
In short, the problem came down to a conflict between RStudio's initial settings on my computer and my attempts to control where it should read files. Regardless, I need to be more aware that RStudio keeps a folder that retains its initial settings.
My Shiny app works fine locally but gets disconnected from the server with the following error:
I have installed and loaded the rgdal library, but I still keep getting the error.
I'm guessing that when it works locally you're running it on Windows?
What does ldd /usr/local/lib/R/site-library/rgdal/libs/rgdal.so return?
The error message is telling you that a shared object that rgdal needs is missing.
This object should have been linked at install time, and the install should have failed if it didn't exist (assuming rgdal was installed through R).
At a guess, I'd say you're missing a system dependency, something like libgdal-dev or libgdal-devel, depending on your distro.
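On a Debian/Ubuntu server, the fix would look something like this (a sketch; the package names are the Debian ones and vary by distro, and the commands need sufficient privileges):

```r
# Install the GDAL/PROJ development headers that rgdal.so links against,
# then rebuild rgdal from source so it picks them up.
system("apt-get install -y libgdal-dev libproj-dev")
install.packages("rgdal", type = "source")
```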
Based on the error, it looks like you may not have uploaded your data files when you published your app. Click File > Publish and make sure all of the files, including your app.R (and/or ui & server files) AND your data files, are selected.
Or, do you call setwd() at the beginning of your code? If so, delete that line and try again.
I want to use the TensorFlow package for R on Windows.
I have installed Python 3.5.x from python.org and installed the tensorflow R package with devtools::install_github("rstudio/tensorflow"), as per the official guide https://rstudio.github.io/tensorflow/
I suspect I am not setting the environment variable correctly in Windows and/or in R's Sys.setenv() function.
The above link says it should be set with Sys.setenv(TENSORFLOW_PYTHON="/usr/local/bin/python").
See below the location of my Python35 folder which includes all the python stuff including the tensorflow library downloaded from the python side:
Python35 folder location: C:\Users\rgupta6\AppData\Local\Programs\Python\Python35
tensorflow folder location: C:\Users\rgupta6\AppData\Local\Programs\Python\Python35\Lib\site-packages\tensorflow
Code I used:
Sys.setenv(TENSORFLOW_PYTHON="C:\\Users\\rgupta6\\AppData\\Local\\Programs\\Python\\Python35")
Sys.setenv(TENSORFLOW_PYTHON="C:\\Users\\rgupta6\\AppData\\Local\\Programs\\Python\\Python35\\Lib\\site-packages\\tensorflow")
I use library(tensorflow) and get no error.
Then I run sess = tf$Session() and get an error:
Error in initialize_python(required_module) : Installation of Python not found, Python bindings not loaded.
What should I do to make it work?
If you are getting errors such as:
Error in initialize_python(required_module) : Installation of Python not found, Python bindings not loaded
Error: Installation of TensorFlow not found
Python environments searched for 'tensorflow' package:
C:\Users\rgupta6\AppData\Local\Programs\Python\Python35\python.exe
or some error saying that a file does not exist while trying to make the TensorFlow package work in RStudio, then the problem is with your environment variable in Windows.
What you need to do is:
From the desktop, right click the Computer icon.
Choose Properties from the context menu.
Click the Advanced system settings link.
Click Environment Variables. In the section System Variables, find the PATH environment variable and select it. Click Edit.
A new pop-up will open. The variable name remains Path. Change the variable value to the location of the folder where your tensorflow folder is located.
For example, I changed its value to:
C:\Users\rgupta6\AppData\Local\Programs\Python\Python35\Lib\site-packages\tensorflow
Close all remaining windows, open RStudio, and run your "Hello World" program to see if TensorFlow works in R:
library(tensorflow)
sess = tf$Session()
hello <- tf$constant('Hello, TensorFlow!')
sess$run(hello)
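An alternative sketch, not verified on this machine: point TENSORFLOW_PYTHON at the python.exe binary itself rather than at a folder, before loading the package. The path below reuses the install location from the question; adjust it to yours.

```r
# Point the R package at the Python interpreter itself (the binary,
# not the folder), then load tensorflow.
Sys.setenv(TENSORFLOW_PYTHON = "C:\\Users\\rgupta6\\AppData\\Local\\Programs\\Python\\Python35\\python.exe")

library(tensorflow)
sess <- tf$Session()
sess$run(tf$constant('Hello, TensorFlow!'))
```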