Run R from Dropbox

Often, in "restricted security" situations where programs can't be installed on a computer, I run R from a flash drive. It works like a charm. I've recently started using Dropbox and was thinking it could be used in the same way as the flash drive. For anyone who has tried this: does it work?
I can test it myself but don't want to go to the bother if it's a dead end.
Thanks in advance.
PS: this has the added advantage that you can store an .Rprofile in the Dropbox folder, so that people you share the folder with can run your R code. This is particularly nice for people unfamiliar with R.

It should just work.
R is set up in such a way that all its files are relative to a given top-level directory. Whether that is an F:\ or Z:\ drive from your flash drive, or your Dropbox folder, should not matter.
By the same token, R can run happily off a shared folder, be it via Samba, NFS or another mechanism.
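As an illustration of the .Rprofile idea from the question (not part of the original answer), a portable setup might keep an .Rprofile in the shared folder; this is a minimal sketch, and the "library" subfolder name is a hypothetical choice:
# A minimal sketch of an .Rprofile kept in the shared Dropbox folder; the
# "library" subfolder is a hypothetical portable package library.
local({
  root <- normalizePath(".")                              # folder R was started in
  .libPaths(c(file.path(root, "library"), .libPaths()))   # prepend the portable library
  message("Running R from: ", root)
})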

It is fine if you want to share .Rprofile or .Rhistory. However, I see a problem with .RData, because it can be big (for example, 4 GB). Saving a 100 MB file to Dropbox takes me minutes, and an .RData file can be far bigger.
An alternative would be a remote server, to which you could connect through ssh.
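If the size of .RData is the bottleneck, one workaround (my suggestion, not part of the original answer) is to save only the objects you actually need with saveRDS instead of a full workspace image; "fit" here is a hypothetical object name:
# A minimal sketch: save a single object instead of the full workspace image.
# "fit" is a hypothetical object; adjust to whatever you actually need to share.
saveRDS(fit, "fit.rds")        # usually much smaller than a full .RData
fit <- readRDS("fit.rds")      # reload it later (or on another machine)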

Related

Change the directory where the SQLite *.db-journal file is created

Good day,
I have a small application created in Lazarus / Free Pascal. If I run this application from a folder on my computer, it starts and SQLite creates a temporary .db-journal file in the current directory. Since the application is portable, it can also run from a flash drive, and this is where the problem arises. Some computers (e.g. at work) do not allow writing to external media, so when I start the application it fails with an error saying the database cannot be opened (tested on a locked SD card). To avoid having to copy the application to the computer every time, I would like to know whether it is possible to redirect the creation of the temporary .db-journal file to another directory, for example the "C:\WINDOWS\USERS<user>" user directory. Is it usually always possible to write there?
Of course I searched the net, but so far I have not found anything helpful, so I am asking here. Thank you for any advice or guidance.
Jirka

Is it possible to define a cross-platform working directory for R?

I am teaching in-person R tutorials with a large number of undergraduate R novices. I am also trying to format my notes on RPubs so that they can easily be used by other people. Nothing derails things faster than people mis-specifying working directories or saving spreadsheet files somewhere other than their working directory.
Is it possible to define a working directory that is universal across platforms? For example, a line of code or a function like
setwd( someplace that is likely to exist on every computer)
This could involve a function that finds some place that usually exists on all computers, such as the desktop, downloads folder, or R directory.
In general your best bet is to go for the user's home area,
setwd("~")
path.expand("~")
Since you are teaching novices, a common problem is that students notice the R package directory ~/R/ and assume that they should put their scripts in this directory, thereby creating odd bugs. To avoid this, I would go for
dir.create("~/RCourse", FALSE)
setwd("~/RCourse")
If you use RStudio, you could get them to create an RStudio project.
In the past, I have come across situations where this doesn't work. For example, some people have their home area as a network drive, but can't connect to the internet or get through a firewall.
You asked about
someplace that is likely to exist on every computer
and yes, R works hard to ensure this returns a valid directory: tempdir().
The main danger, though, is that this directory vanishes at the end of the session (unless you override the default behaviour of removing the per-session temporary directory on exit). Until then, it works.
Still, this can be useful. I sometimes use it to write temporary files that I don't want cluttering the current directory, or ~.
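For example, a minimal sketch of that pattern (the file contents here are hypothetical):
# A minimal sketch: write throwaway files into the per-session temporary directory
tmp_csv <- tempfile(fileext = ".csv")   # e.g. "/tmp/RtmpXXXXXX/file1a2b3c.csv"
write.csv(head(iris), tmp_csv)
tempdir()                               # the directory that vanishes when the session ends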
Otherwise @csgillespie gave you a good answer pertaining to $HOME, aka ~.

Can I run R on an alternate hard drive in Windows?

I installed R in the C:/programfiles/users... directory. All my files are on a different hard drive, at N:/project/rasterimg/stacked/.... The files I want to use for my work are very large (126 GB), so I cannot move them to the C: drive.
Is there any way to run R without moving my files onto the C: drive?
run
setwd("directory_path")
as already suggested. Then check with
getwd()
whether you are really where you wanted to be.
Then run
dir()
to list the files in your working directory.
You can even access a file via its index in the dir() output if you want to avoid mistyping its name.
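Put together, a minimal sketch of that workflow (the path is taken from the question; adjust as needed):
# A minimal sketch of the workflow described above
setwd("N:/project/rasterimg/stacked")   # path from the question
getwd()                                 # confirm you are where you wanted to be
files <- dir()                          # list the files in the working directory
files
first_file <- files[1]                  # refer to a file by its index to avoid mistyping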
If I understood you correctly, you want to change your working directory to some kind of external hard drive. You can do that by using the command:
setwd("N:/project/rasterimg/stacked/...")
or you can just access your files by reading them through a command like (or similarly for the objects you work with):
read.table("N:/project/rasterimg/stacked/...")
Edit: My concern with 126 GB files is that you will run out of memory... but that's a totally different issue, which is somewhat beyond my knowledge. Others might be able to help you with that in a new thread (if needed).

How do I connect to the data in my folder without writing the specific path, like "C:\\Users\\Dima\\Desktop\\NewData\\..."?

I am writing a script that requires data stored in a folder on my computer.
But eventually this script will be used on another computer, by another person.
I can't tell them to change all the paths to the data in the script.
How do I connect to the data in my folder without writing the specific path,
like "C:\Users\Dima\Desktop\NewData\..."?
The best way of making your code shareable depends upon your use case.
As Carl Witthoft pointed out, most code should be encapsulated in functions. These functions can then be packaged into packages and easily redistributed on other people's machines. Writing packages is easier than you think.
For one-off analyses, scripts are appropriate. How you make them user-independent depends on who your users are. If you are sharing the script with colleagues, try to keep your data on a network drive; then the link to the data will be the same for everyone. If you are sharing your script with the world, then keep your data on the internet, and the link to the data will be a hyperlink, again the same for everyone.
If you are sharing your script with a few people who don't have access to a common drive, and you can't put your data on the internet, then some directory manipulation is acceptable.
Change your working directory to the root of where your project files are.
setwd("c:/Users/Dima/My Project")
Then you can reference the location of the data using relative paths.
data_file <- "Data/My data file.csv"
my_data <- read.csv(data_file)
Assuming that you keep the directory structure within your project the same, you only need to change the call to setwd on each machine.
Also note that the special location "~" refers to your user home directory. Try
normalizePath("~")
That way, if you keep your project in that location, you can avoid reference to "Dima" entirely.
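For instance, a minimal sketch of that idea, reusing the "My Project" and "Data/My data file.csv" names from above:
# A minimal sketch: build the path from "~" so no user name appears in the script
project_root <- file.path(normalizePath("~"), "My Project")
setwd(project_root)
my_data <- read.csv(file.path("Data", "My data file.csv"))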

Using R to save images & .csv's; can I use R to upload them to a website (I currently use FileZilla to do it manually)?

First I should say that a lot of this is over my head, so I apologize in advance for using incorrect terminology and potentially asking an unclear question. I'm doing my best.
Also, I saw this post; is RCurl the tool I want to use for this task?
Every day for 4 months I'll be analyzing new data and generating .csv files and .png's that need to be uploaded to a website so that other team members can check them. I've (nearly) automated all of the data collecting, data downloading, analysis, and file saving. The analysis is carried out in R, and R saves the files. Currently I use FileZilla to manually upload the new files to the website. Is there a way to use R to upload the files to the website, so that I don't have to open FileZilla and drag-and-drop files?
It'd be nice to run my R code and walk away, knowing that once it finishes running, the newly saved files will automatically be put on the website.
Thanks for any help!
You didn't specify which protocol you use to upload your files with FileZilla. I assume it is FTP. If so, you can use the ftpUpload function from RCurl:
library(RCurl)
ftpUpload("yourfile", "ftp://ftp.yourserver.foo/yourfile",
userpwd="username:passwd")
RCurl also has methods for scp and should also support sftp via ftpUpload.
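As an extension of that idea, a minimal sketch that uploads every .csv and .png at the end of a run; the "output" folder, server address, and credentials are placeholders, not from the original answer:
library(RCurl)
# A minimal sketch: upload all .csv and .png files from a local "output" folder
out_files <- list.files("output", pattern = "\\.(csv|png)$", full.names = TRUE)
for (f in out_files) {
  ftpUpload(f,
            paste0("ftp://ftp.yourserver.foo/", basename(f)),
            userpwd = "username:passwd")
}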

Resources