R opencpu root dir for reading csv

I have opencpu (single server) up and functioning. My first function will open a dataset from a csv file stored on my hard drive.
Where should I deploy the csv file? (I tried my app's www directory, but it doesn't work.)
In sum: within an opencpu app, where do I deploy a file so that this line of code will work?
indf <- read.csv(".\\nouns-categorical_R1.csv")

The answer is simple to find:
Add print(getwd()) to your opencpu function script.
First, call the function with a POST request.
The working directory can then be retrieved with a GET request to the session URL ending in "/console".
The answer is that the working directory is a temp directory:
.....AppData/Local/Temp/Rtmp0qr704/ocpu_session_3780fc520c8"
This means you cannot store CSVs in the working directory; the working directory changes every time you start opencpu.
It is possible to pass a full path to the csv in read.csv(). However, watch out for security issues and file permissions once the app is deployed on Ubuntu.
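For illustration, a minimal sketch of the inspection step described above (the function name inspect_wd is a placeholder, not part of the original package):
# Exported function in your opencpu package: POST to its endpoint, then GET the
# returned session URL with "/console" appended to see the printed output.
inspect_wd <- function() {
  print(getwd())   # prints something like .../Temp/RtmpXXXX/ocpu_session_...
  invisible(getwd())
}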

Related

Azure Databricks: How do we access R Scripts present on DBFS?

I'm new to Databricks. I am trying to access a .R file that is present in DBFS storage, but I cannot figure out how to do so. Any help is really appreciated.
I can read data from the storage using the /dbfs file path and can also source the script, but I want to make edits to the script.
You need an editor to do that. For example, you can set up RStudio on your cluster and connect to it via the RStudio UI; in that case you can edit R files directly on DBFS.
But really, the simplest approach would be to use the Databricks CLI fs command to copy the file to your local machine, make your changes in the editor of your choice, and upload the file back.

open a OneDrive file with r

I'm trying to create a Shiny app that reads an online OneDrive xlsx file and shows some things, but for the moment I'm unable to read the OneDrive xlsx file. I have already explored Microsoft365R; I can connect to my OneDrive and even open the file, but what it does is open a Chrome tab from R with the Excel file.
I need the file in the local environment of R, because the Shiny app must be deployed on a web server and every time the app runs it should read the updated file.
library(Microsoft365R)
odb <- get_business_onedrive()
odb$open_file("lcursos.xlsx")
Also, this is a business account, so I have to supply the username and key to access each file; that is why using the simple URL doesn't work, it returns Error 403 FORBIDDEN.
Any ideas?
Thank you so much!
Use the download_file() method to download the file to your local machine:
odb$download_file("lcursos.xlsx")
You can set the location of the download with the dest argument. Once it's downloaded, open it with the xlsx reader package of your choice. I suggest either openxlsx or readxl.
Note that if your file is password protected, your options are limited. See this question for possible solutions.
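A minimal sketch of that workflow with readxl, assuming the file name from the question (the tempfile destination is just an illustration):
library(Microsoft365R)
library(readxl)

odb <- get_business_onedrive()

# download to a temporary local path, then read it into R
local_path <- tempfile(fileext = ".xlsx")
odb$download_file("lcursos.xlsx", dest = local_path)
cursos <- read_excel(local_path)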

R - copy files from folder (Shiny App)

I'm working on getting an app up on shinyapps.io. I use an R package that downloads data into a subfolder of my directory and saves two .RData files.
I'm having issues on the shinyapps.io server. I need to load the two .RData files. Locally, I can set the relative path using (~project/source-data), but shinyapps.io does not respond to this.
I can set it as a working directory using a relative path, (./source-data); however, this is not ideal, as I have further data manipulation to do at the parent-level directory and I can't seem to set the working directory back to the parent level on shinyapps.io.
Here is what I moved forward with:
wd <- getwd()
sd <- file.path(wd, "source-data")        # sub-folder created by the package
sd2 <- list.files(sd, full.names = TRUE)
file.copy(from = sd2, to = wd)            # copy the .RData files up to the app directory
My solution, although not ideal, is to copy the two .RData files to the parent directory; at that point, the rest of my code runs smoothly. It works locally, but not on shinyapps.io.
Does anyone have experience with a similar problem and a solution? Either help me direct the shinyapps.io server to the sub-directory and back to the parent directory once I load the two .RData files, or help me copy the files to the parent directory.
I've seen solutions that work in a local R environment, but not one that satisfies the conditions of the shinyapps.io server environment.
Thank you in advance.

Uploading csv file to shinyApps.io

My app runs fine locally and I am able to successfully deploy it to the shinyapps.io server, but I get the following error message when I try to load the app in my browser using the shinyapps URL: "Error: object 'data' not found". I think this is because the 'data' variable reads from a csv file in my local directory. Is there a way I can upload this csv file to the shinyapps server? I've tried looking this up but I've found nothing.
Here's the code I am using to read in the files. I'm getting the file from the same working directory as my server.R and ui.R. Thanks
server.R
library(shiny)
college = read.csv("college.csv")
ui.R (I added this to see if it fixes the problem, but it doesn't)
library(shiny)
college = read.csv("college.csv")
I was recently facing a similar problem.
Reading here and there, I realized that you can create a script called global.R in the same directory as ui.R and server.R.
In this file (global.R) you can load libraries and, in this case, objects previously saved to a directory, for example one called data.
I created the object and saved it with saveRDS(df, "./data/df.RDS"), then loaded it from the data directory with something like
df <- readRDS("data/df.RDS")
in global.R.
That works for me.
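Put together, a minimal sketch of that layout (df.RDS and the data folder follow the example above; adapt the names to your app):
# global.R -- sits next to ui.R and server.R and is sourced before both,
# so objects defined here are visible to both the UI and the server
library(shiny)

df <- readRDS("data/df.RDS")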
Best practice would be to place your data in a folder, say ~/<application name>/data and then call your data from your server.R treating your application's directory (/<application name>/) as the current working directory.
e.g. I save my files as RDS objects in ~/ImputationApp/data/ and then read them in with:
foo.rds <- readRDS("data/foo.rds")
Even though what you describe should run, double check your filepaths for the datafiles you are trying to load and any stray setwd() commands that could be mucking up the works. A common misstep is to put the fully qualified path to your data on your machine in your server.R.
I know it's too late but I believe creating a folder named www in your directory and placing the csv there should solve the problem.
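If you go that route, the path is still relative to the app directory, e.g. (assuming the file name from the question):
college <- read.csv("www/college.csv")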

csv files in opencpu

If I put a very small csv file in my GitHub directory so that it gets copied to /ocpu/github/username/projectname/www/ , will I be able to access the contents of the csv for use in a R function? I tried to ajax the file, but I get a 404 error even though I can see the csv file sitting in the www directory of my local server. I need to have the csv on the server as a static file rather than being uploaded by a function. Thanks
You should be able to access them like any other file. Can you post an example that shows what you are doing and what error you are getting?
That said, if you just want to use this data in your R functions, it is better to include it in the R package as an actual data file. Also see section 1.1.6 of Writing R Extensions. An example is the mapapp package, which includes a dataset called countryExData. Also see the live app.
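As a rough sketch of the package-data route (mydata, mydata.csv, and mypackage below are placeholders, and usethis is only one way to create the data file):
# In the package source, run once to create data/mydata.rda:
#   mydata <- read.csv("mydata.csv")
#   usethis::use_data(mydata)   # assumes the usethis package is installed

# With LazyData: true in DESCRIPTION, package functions can use the dataset directly:
summarise_mydata <- function() {
  head(mydata)
}

# Without lazy loading, load it explicitly into the function's environment:
summarise_mydata2 <- function() {
  data("mydata", package = "mypackage", envir = environment())
  head(mydata)
}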
