I have created an R Shiny app that seems to run fine on my computer. I now need to upload it so others can use it. I created an account at https://www.shinyapps.io/ and ran the following two lines within the default R GUI:
library(rsconnect)
rsconnect::deployApp('C:/Users/mark_/Documents/simple_RShiny_files/surplus10')
I get the following warning, where line 4 of app.R reads the external CSV file from the data subfolder, followed by the error below. The app.R file is in the folder surplus10:
The following potential problems were identified in the project files:
-----
app.R
-----
The following lines contain absolute paths:
4: policy.data <- read.csv('C:/Users/mark_/Documents/simple_RShiny_files/surplus10/data/policy.outputs_June6_2020.csv', header = TRUE, stringsAsFactors = FALSE)
Paths should be to files within the project directory.
Do you want to proceed with deployment? [Y/n]: Y
Preparing to deploy application...DONE
Uploading bundle for application: 2430142...--- Please select a CRAN mirror for use in this session ---
DONE
Deploying bundle: 3246501 for application: 2430142 ...
Waiting for task: 744319366
building: Building image: 3633169
building: Fetching packages
building: Installing packages
An error has occurred
The application failed to start (exited with code 1).
Warning in file(file, "rt") :
cannot open file 'C:/Users/mark_/Documents/simple_RShiny_files/surplus10/data/policy.outputs_June6_2020.csv': No such file or directory
Error in value[[3L]](cond) : cannot open the connection
Calls: local ... tryCatch -> tryCatchList -> tryCatchOne -> <Anonymous>
Execution halted
I imagine that once the app is uploaded, the path to the data file is no longer valid. If that is the case, which path should I use to read the CSV file? Which path do I use in the deployApp statement? I have never attempted to upload an app before and do not know what a project directory is. Sorry for my beginner's confusion.
When I changed line 4 in the app.R file that reads the external CSV data file to:
policy.data <- read.csv('data/policy.outputs_June6_2020.csv', header = TRUE, stringsAsFactors = FALSE)
the app uploaded without error.
I did not change my original path in the deployApp statement:
library(rsconnect)
rsconnect::deployApp('C:/Users/mark_/Documents/simple_RShiny_files/surplus10')
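To answer the original question directly: only paths inside app.R need to be relative to the app folder; the deployApp() call itself can keep the absolute path on your own machine. A quick pre-deployment check (a sketch, not from the original post) is to set the working directory to the app folder and confirm the relative path resolves before deploying:
library(rsconnect)
setwd('C:/Users/mark_/Documents/simple_RShiny_files/surplus10')
file.exists('data/policy.outputs_June6_2020.csv')  # should print TRUE before deploying
rsconnect::deployApp('C:/Users/mark_/Documents/simple_RShiny_files/surplus10')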
I've deployed a Shiny app on shinyapps.io that executes code entered within the app. For example, if you enter the following code within the app:
1 + 2
The Shiny app returns:
3
This works fine as long as all the packages used in the inserted code have been specified already during the deployment process of the app.
However, if an unknown package is used within the inserted code, the Shiny app doesn't work anymore. For example, the following input returns an error message:
install.packages("Hmisc")
1 + 2
Output:
Warning in install.packages("Hmisc") :
  'lib = ".../lib/R/library"' is not writable
Warning: Error in install.packages: unable to install packages
This could be solved by specifying all required packages (e.g. "Hmisc") during the deployment of the app. However, since I don't know all the required packages before deploying the app, I need to find a way to install and load packages AFTER the deployment. How could I do that?
As per my comment above: we need to add a writable library path at app or session start.
This can be done by placing the following line of code in the global (app start) or server part (session start) of the app:
.libPaths(c(tempdir(), .libPaths()))
PS: tempdir() can be replaced with any other writable directory, and a repository should be passed to install.packages()'s repos parameter, because the R session on the server isn't interactive().
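As a concrete illustration, here is a minimal sketch of how the writable library path and an on-demand install might be combined (the helper name install_if_missing is made up for this example; it is not part of the original answer):
.libPaths(c(tempdir(), .libPaths()))

install_if_missing <- function(pkg) {
  # install into the writable temp library with an explicit CRAN mirror,
  # because the R session on shinyapps.io is not interactive()
  if (!requireNamespace(pkg, quietly = TRUE)) {
    install.packages(pkg, lib = tempdir(), repos = "https://cloud.r-project.org")
  }
  library(pkg, character.only = TRUE)
}

# e.g. install_if_missing("Hmisc")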
I created an R Shiny app that automatically runs every day using a batch file.
Everything works fine when launching the app, but the next day it crashes and I get the following message:
Warning in file(open = "w+") :
cannot open file
'C:\Users\bertin\AppData\Local\Temp\RtmpKiBPOU\Rf3f835d1a66' : No such file or directory
Warning: Error in file: cannot open the connection
[No stack trace available]
Actually, this issue is related to the tempdir() folder created by the R session executing the Shiny app. This folder is automatically deleted after a certain time. Do I have to delete all temp files on each refresh? Or, on the contrary, do I need to prevent the Shiny temp files in the Temp folder from being deleted? Thanks!
Edit - Here is how to intentionally generate the error:
tempdir()
dir.exists(tempdir())

library(shiny)

# Windows shell required
shinyApp(
  ui = fluidPage("Please reload to see me fail."),
  server = function(input, output) {
    shell(paste("rmdir", dQuote(
      normalizePath(tempdir(), winslash = "/", mustWork = FALSE), q = FALSE
    ), "/s /q"))
  }
)
By now I've found a setting in Windows 10 (Storage Sense) concerning the deletion of temporary files, which seems to be active by default.
Navigate as follows and uncheck:
Settings
System > Storage
Storage Sense
Change how we free up space automatically
Delete temporary files that my apps aren't using
With the deletion of your temp directory, session data is also lost. But if I understand your question correctly, this is not relevant for your Shiny application.
So if you don't need any session data from yesterday, you could call .rs.restartR() to restart your R session and thus set a new temporary directory. You will probably get an error that your last session could not be saved (as the directory doesn't exist anymore).
After this you should be able to start your Shiny app again.
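An alternative sketch, not part of the original answer and assuming R >= 3.5.0: tempdir(check = TRUE) recreates the per-session temp directory if it has been deleted, so calling it at session start avoids the "cannot open file" error without restarting R:
library(shiny)

shinyApp(
  ui = fluidPage("temp directory is checked at session start"),
  server = function(input, output, session) {
    # recreates the session temp directory if it has been cleaned up
    tempdir(check = TRUE)
  }
)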
I have written a Shiny app which runs perfectly on my local machine. I have used RJDBC to connect to a DB2 database in IBM Cloud. The code is as follows.
#Load RJDBC
dyn.load('/Library/Java/JavaVirtualMachines/jdk-9.0.4.jdk/Contents/Home/lib/server/libjvm.dylib')
# dyn.load('/Users/parthamajumdar/Documents/Solutions/PriceIndex/libjvm.dylib')
library(rJava)
library(RJDBC)
As the path is hard-coded, I copied the file libjvm.dylib to the project directory and pointed to that. When I do this, R gives a fatal error.
I removed the absolute path, replaced it with "./libjvm.dylib", and deployed the application to the shinyapps.io website. When I run the program, it gives a fatal error.
#Values for you database connection
dsn_driver = "com.ibm.db2.jcc.DB2Driver"
dsn_database = "BLUDB" # e.g. "BLUDB"
dsn_hostname = "dashdb-entry-yp-lon02-01.services.eu-gb.bluemix.net" # e.g. replace <yourhostname> with your hostname, e.g., "Db2 Warehouse01.datascientstworkbench.com"
dsn_port = "50000" # e.g. "50000"
dsn_protocol = "TCPIP" # i.e. "TCPIP"
dsn_uid = "<UID>" # e.g. userid
dsn_pwd = "<PWD>" # e.g. password
#Connect to the Database
#jcc = JDBC("com.ibm.db2.jcc.DB2Driver", "/Users/parthamajumdar/lift-cli/lib/db2jcc4.jar");
jcc = JDBC("com.ibm.db2.jcc.DB2Driver", "db2jcc4.jar");
jdbc_path = paste("jdbc:db2://", dsn_hostname, ":", dsn_port, "/", dsn_database, sep="");
conn = dbConnect(jcc, jdbc_path, user=dsn_uid, password=dsn_pwd)
Similarly, I copied the file "db2jcc4.jar" to my local project directory. If I point to this file in the local project directory on my machine, the program works. However, when I deploy on shinyapps.io, it gives a fatal error.
Please let me know what I need to do so that the application runs properly on the shinyapps.io website.
The error is as follows when I run the application from Shiny server:
Attaching package: ‘lubridate’
The following object is masked from ‘package:base’:
date
Loading required package: nlme
This is mgcv 1.8-23. For overview type 'help("mgcv-package")'.
Error in value[[3L]](cond) :
unable to load shared object '/srv/connect/apps/ExpenseAnalysis/Drivers/libjvm.dylib':
/srv/connect/apps/ExpenseAnalysis/Drivers/libjvm.dylib: invalid ELF header
Calls: local ... tryCatch -> tryCatchList -> tryCatchOne -> <Anonymous>
Execution halted
What works for me is the following and it is independent of OS.
Create your own R package that contains the file you need somewhere in the extdata folder. As an example, your package could be yourpackage and the file would be something like extdata/drivers/mydriver.lib. In the package source this would typically be stored under inst/extdata/drivers. See http://r-pkgs.had.co.nz/inst.html for details.
Store this package on github and if you want privacy you will need to work out how to grant an access token.
Use the devtools package to install it. The command would be something like devtools::install_github("you/yourpackage", auth_token = "youraccesstoken"). Do this once before deploying to shinyapps.io. Ensure that you also call library(yourpackage) in your app. The deployment process will work out that it needs to fetch the package from GitHub.
Use the following R code to find the file:
system.file('extdata/drivers/mydriver.lib', package = 'yourpackage')
This will give you the full path to the file and you can use it.
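Putting the pieces together, a minimal sketch (the package name yourpackage and the bundled path are illustrative, not part of the original answer) of locating the bundled driver at run time and handing the resulting path to JDBC():
library(yourpackage)  # the hypothetical package built and installed above

# resolve the bundled jar wherever the package is installed
driver_jar <- system.file("extdata/drivers/db2jcc4.jar", package = "yourpackage")
jcc <- RJDBC::JDBC("com.ibm.db2.jcc.DB2Driver", driver_jar)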
I am having trouble with the XBRL library examples for reading XBRL documents, either from the SEC website or from my local hard drive.
This code first attempts to read from the SEC site, as written in the example in the PDF documentation for the XBRL library, and then tries to read a file saved locally:
# Following example from XBRL pdf doc - read xml file directly from sec web site
library(XBRL)
inst <- "http://www.sec.gov/Archives/edgar/data/1223389/000122338914000023/conn-20141031.xml"
options(stringsAsFactors = FALSE)
xbrl.vars <- xbrlDoAll(inst)
# attempt 2 - save the xml file to a local directory - so no web I/O
localdoc <- "~/R/StockTickers/XBRLdocs/aapl-20160326.xml"
xbrl.vars <- xbrlDoAll(localdoc)
Both of these throw an IO error. The first attempt to read from the SEC site results in this and crashes my RStudio instance:
error : Unknown IO error
I/O warning : failed to load external entity "http://www.sec.gov/Archives/edgar/data/1223389/000122338914000023/conn-20141031.xml"
So I restart RStudio, re-load the XBRL library, and try the second attempt; reading from the local file gives this error:
I/O warning : failed to load external entity "~/R/StockTickers/XBRLdocs/aapl-20160326.xml"
I am using R version 3.3.0 (2016-05-03)
I hope I am missing something obvious to somebody, I am just not seeing it. Any help would be appreciated.
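Not an answer from the original post, but a sketch of two things worth checking under these assumptions: the XBRL parser may not expand "~" in file paths, and the network fetch can be taken out of the equation by downloading the filing to a local file first and passing a fully expanded path:
library(XBRL)

inst <- "http://www.sec.gov/Archives/edgar/data/1223389/000122338914000023/conn-20141031.xml"
localdoc <- file.path(tempdir(), "conn-20141031.xml")

# download first, then parse the fully expanded local path
download.file(inst, destfile = localdoc, mode = "wb")
xbrl.vars <- xbrlDoAll(normalizePath(localdoc))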
I'm running a script with input parameters that are referenced in the code to automate the directory creation, the file download, and the untarring of the file. I would be fine with unzip; however, this particular file I want to analyze is a .tar.gz. When I manually unpacked it, the .tar.gz unpacked to a .tar file. Would that be the problem?
Full error: Error in untar2(tarfile, files, list, exdir) : unsupported entry type ‘’
Running Windows 10, 64 bit, R set to: [Default] [64-bit] C:\Program Files\R\R-3.2.2
My script notes one solution I found (see lines 28-31 of the script), but I don't really understand it.
I did install 7-Zip on my computer, restarted, and of course restarted R:
#DOWNLOADING AND UNZIPPING TAR FILE
#load required packages.
#If there is a load package error, use install.packages("[package]")
library(dplyr)
library(lubridate)
library(XML) # HTML processing
options(stringsAsFactors = FALSE)
#Set directory locations, data file and fetch data file from internet
#enter full url including file name between ' ' marks
mainDir<-"C:/R/BEES/"
subDir<-"C:/R/BEES/Killers"
Fetch<-'http://dds.cr.usgs.gov/pub/data/nationalatlas/afrbeep020_nt00218.tar.gz'
ArchFile<-basename(Fetch)
download.file<-(ArchFile)
#Check for file directories and create if directory if it doesn't exist
if(!file.exists(mainDir)){dir.create(mainDir)}
if(!file.exists(subDir)){dir.create(subDir)}
#set the working directory
setwd(file.path(subDir))
#check if file exists and download if it doesn't exist.
if(!file.exists(ArchFile))
{download.file (url=Fetch,destfile=ArchFile,method='auto')}
#unpack and view file list
untar(path.expand(ArchFile),list=TRUE,exdir=subDir,compressed="gzip")
list.files(subDir)
#Error: Error in untar2(tarfile, files, list, exdir) :
# unsupported entry type ‘’
#Need solution to use tar/untar app
#instructions here: https://stevemosher.wordpress.com/step-10-build/
Appreciate feedback - I've been lurking around StackOverflow for some time to use other people's solutions.
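Not from the original post, but two hedged workarounds for the untar2() error, shown as a sketch (the tar.exe path and the use of the R.utils package are assumptions, not part of the script above):
# (1) point untar() at an external tar program instead of R's internal untar2()
untar(ArchFile, exdir = subDir, tar = "C:/Rtools/bin/tar.exe")

# (2) or decompress the .gz first, then untar the plain .tar file
R.utils::gunzip(ArchFile, remove = FALSE)
untar(sub("\\.gz$", "", ArchFile), exdir = subDir)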