RStudio: force Mac to download from iCloud

My Mac laptop is set up to upload large data files to iCloud, leave a name stub behind, and delete the file from my local hard drive. This is not necessarily a bad thing. When I go to the directory and click the name stub, the file starts downloading. But when I try to read one of these files in an R script, RStudio (and presumably R, although I haven't tried that) just says that the file is missing.
How can I tell RStudio to force the Mac to download the file?

Related

Open a OneDrive file with R

I'm trying to create a Shiny app that reads an online OneDrive xlsx file and displays some things, but at the moment I'm unable to read the OneDrive xlsx file. I have already explored Microsoft365R; I can connect to my OneDrive and even open the file, but what that does is open a tab in Chrome with the Excel file.
I need the file in the local environment of R. This is because the Shiny app must be deployed on a web server, and every time the app runs it should read the updated file.
library(Microsoft365R)
odb <- get_business_onedrive()
odb$open_file("lcursos.xlsx")
Also, this is a business account, so I have to supply the username and key to access each file; that is why using the plain URL doesn't work: it returns Error 403 FORBIDDEN.
Any ideas?
Thank you so much!
Use the download_file() method to download the file to your local machine:
odb$download_file("lcursos.xlsx")
You can set the location of the download with the dest argument. Once it's downloaded, open it with the Excel-reading package of your choice. I suggest either openxlsx or readxl.
Note that if your file is password protected, your options are limited. See this question for possible solutions.
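For example, a minimal sketch, assuming the file sits in the root of the OneDrive and the session is already authenticated (the local destination path is illustrative):
library(Microsoft365R)
library(readxl)
odb <- get_business_onedrive()
# download to a local path via the 'dest' argument, overwriting any stale copy
local_path <- file.path(tempdir(), "lcursos.xlsx")
odb$download_file("lcursos.xlsx", dest = local_path, overwrite = TRUE)
cursos <- read_excel(local_path)  # read the downloaded workbook into R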

R uploading files with a function

I usually use the upload button to upload a file, but I would like to do it with a function. So I need a function to upload a binary file from my desktop to a remote RStudio. The path is always the same. The size of the file is 70 MB. What is the easiest way?
We can use base R's file.copy.
E.g.
file.copy(from="path_to_local_source_file", to="path_to_RStudioServer_destination_file")
Edit:
For file.copy to work, the (local) source directory needs to be mounted on the (probably Ubuntu) system RStudio Server is running on. We can use a PAM session on RStudio Server Pro; I'm not sure about the non-Pro version.
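As a rough sketch, you could wrap this in a small helper, assuming the source directory is mounted on the server (the function name and both paths are placeholders):
upload_to_server <- function(src = "~/Desktop/mydata.bin",
                             dest = "/srv/data/mydata.bin") {
  ok <- file.copy(from = src, to = dest, overwrite = TRUE)
  if (!ok) stop("Copy failed: check that the source is mounted and the destination is writable")
  invisible(dest)
}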

Unable to use correct file paths in R/RStudio

Disclaimer: I am very new here.
I am trying to learn R via RStudio through a tutorial and very early on have encountered an extremely frustrating issue: when I try to use the read.table function, the program consistently resolves my file paths (written as "~/Desktop/R/FILENAME") to "C:/Users/Chris/Documents/Desktop/R/FILENAME". Note that the program is treating my Desktop folder as if it were inside my Documents folder, which is preventing me from reading any files. I have already set and re-set my working directory multiple times and even re-downloaded R and RStudio, and I still encounter this error.
When I enter the entire file path instead of using the "~" shortcut, the program is successfully able to access the files, but I don't want to have to type out the full file path every single time I need to access a file.
Does anyone know how to fix this issue? Is there any further internal issue with how my computer is viewing the desktop in relation to my other files?
I've attached a pic.
Best,
Chris L.
The ~ will tell R to look in your default directory, which on Windows is your Documents folder; this is why you are getting this error. You can change the default directory in the RStudio settings or in your R profile. It just depends on how you want to set up your project. For example:
Put all the files in the working directory (getwd() will tell you the working directory for the project). Then you can just call the files with the filename, and you will get tab completion (awesome!). You can change the working directory with setwd(), but remember to use the full path not just ~/XX. This might be the easiest for you if you want to minimise typing.
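For instance (the path below is a placeholder for wherever your files actually live):
setwd("C:/Users/Chris/Desktop/R")   # full path, not ~/Desktop/R
getwd()                             # confirm where bare file names will resolve
dat <- read.table("FILENAME", header = TRUE)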
If you use a lot of scripts, or work on multiple computers or cross-platform, the above solution isn't quite as good. In this situation, you can keep all your files in a base directory, and then in your script use the file.path function to construct the paths:
base_dir <- 'C:/Desktop/R'
read.table(file.path(base_dir, "FILENAME"))
I actually keep the base_dir assignment as a code snippet in RStudio, so I can easily insert it into scripts and know explicitly what is going on, as opposed to configuring it in RStudio or my R profile. There is a conditional in the code snippet which detects the platform and assigns the directory correctly.
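A sketch of what such a snippet might look like (both paths are placeholders):
# pick the base directory according to the platform the script is running on
base_dir <- if (.Platform$OS.type == "windows") {
  "C:/Users/Chris/Desktop/R"
} else {
  "~/Desktop/R"
}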
When R reports "cannot open the connection" it means one of two things:
The file does not exist at that location - you can verify whether the file is there by pasting the full path echoed back in the error message into Windows File Explorer. Sometimes the error is as simple as an extra subdirectory. (This seems to be the problem with your current code - the Windows Desktop is never nested in Documents.)
If the file exists at the location, then R does not have permission to access the folder. This requires changing Windows folder permissions to grant R read and write permission to the folder.
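A quick way to tell the two cases apart from inside R (the path is a placeholder):
path <- "C:/Users/Chris/Desktop/R/FILENAME"
file.exists(path)             # FALSE: there is no file at that location
file.access(path, mode = 4)   # -1: the file exists but R lacks read permission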
On Windows, if you launch RStudio from the folder you consider the "project workspace home", then all path references can use the dot as "relative to workspace home", e.g. "./data/inputfile.csv".
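For example, with RStudio launched from that folder (the file name is illustrative):
dat <- read.csv("./data/inputfile.csv")   # resolved relative to the workspace home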

Dropbox permissions on ggplot2 saved charts

I have an R script that I run on a regular basis with launchd (OS X 10.8.3 Mountain Lion), calling it with Rscript myscript.R
The script generates some ggplot2 plots and saves them into my Dropbox folder with the ggsave() function.
The problem I am having is that the saved plots don't sync to Dropbox properly - they get the little blue "syncing" icon and never upload. I can fix it by going into the Dropbox preferences and using "fix permissions", but I'd like the files to sync without any problems when I output them.
What could be the problem? If I run through the same script manually in RStudio, the plots save properly and sync to Dropbox without this happening.
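For context, the call in the script is roughly of this form (paths and object names are illustrative, not the actual script):
library(ggplot2)
p <- ggplot(mtcars, aes(wt, mpg)) + geom_point()
ggsave("~/Dropbox/charts/mpg_vs_wt.png", plot = p, width = 6, height = 4)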
It turns out that this was indeed a file ownership issue. I had launchd set up to run my script as root, and because the files had the root owner, the .png charts saved from ggplot2 would not sync to Dropbox, which runs under my user account.
The odd thing is that my script also output .html files, which do sync even with the root owner.
When I changed it to run under my user name, the output of the script synced to Dropbox as it should. Now, my only problem is that launchd will not run the script if I'm not logged in :/

Run R from dropbox

Often, in "restricted security" situations where programs can't be installed on a computer, I run R from a flash drive. Works like a charm. I've recently started using Dropbox and was thinking it could be used in a similar fashion to the flash drive. For anyone who has tried this: does it work?
I can test it myself but don't want to go to the bother if it's a dead end.
Thanks in advance.
PS: this has the advantage that you can store an .Rprofile, so that people you share the Dropbox folder with can then run your R code. This is particularly nice for people unfamiliar with R.
It should just work.
R is set up in such a way that all its files are relative to a given top-level directory. Whether that is an F:\ or Z:\ drive on your flash drive, or your Dropbox folder, should not matter.
By the same token, R can run happily off a shared folder, be it via Samba, NFS or another mechanism.
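To see the top-level directory a given R session resolves against, you can check (the output in the comments is illustrative):
R.home()      # e.g. "F:/R/R-4.3.1" when launched from a flash drive
.libPaths()   # package library paths, resolved relative to that installation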
It is fine if you want to share .Rprofile or .Rhistory. However, I see a problem with .RData, because it can be big (for example 4 GB). For me, saving a 100 MB file to Dropbox takes minutes, and .RData can be far bigger.
An alternative would be a remote server, where you could connect through ssh.
