I installed R in the C:/programfiles/users... directory. All my files are on a different hard drive, at N:/project/rasterimg/stacked/.... The files I want to use for my work are very large (126 GB in total), so I cannot move them to the C: drive.
Is there any way to run R on these files without moving them to the C: drive?
run
setwd("directory_path")
as already suggested. Then check with
getwd()
that you really are where you wanted to be.
Then run
dir()
to list the files in your working directory.
You can even access a file via its index number in the dir() output if you want to avoid mistyping its name.
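For example, a minimal sketch (it assumes the third file listed happens to be a plain-text table):
files <- dir()                              # list files in the working directory
files                                       # printed with their index numbers
mydata <- read.table(files[3], header = TRUE)  # pick a file by index instead of typing its name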
If I understood you correctly you want to change your working directory to some kind of external hard drive. You can do that by using the command:
setwd("N:/project/rasterimg/stacked/...")
or you can just access your files by reading them through a command like (or similarly for the objects you work with):
read.table("N:/project/rasterimg/stacked/...")
Edit: My concern with working with 126 GB files is that you will run out of memory... but that is a separate issue, and somewhat beyond my knowledge. Others may be able to help you with that in a new thread (if needed).
I'm providing a .zip with a .R file and a .xlsx file to some people
I need to write code that can read this .xlsx file from any directory on any PC.
But as the directories vary from computer to computer, I couldn't find a solution.
IMPORTANT: I'm not using RStudio to run this .R file, so I can only use base functions.
Using R - How do I search for a file/folder on all drives (hard drives as well as USB drives)
This question doesn't solve my problem.
Take a look at the here package. When you load the library (library("here")) it sets a "base" working directory, and you can then use the package to construct relative file paths from that location. For example, if your .zip file contains an R script (e.g., My Data Analysis.R) that analyzes data kept in a folder called data, you could read the data in with read.csv(here("data", "my_csv_file.csv")), and it will construct the full, correct file path no matter what computer it is on. Of course, the file structure of the project needs to stay the same across computers.
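A minimal sketch of that layout (the folder and file names are just the ones used in the example above):
library(here)                                  # determines the project root when loaded
my_data <- read.csv(here("data", "my_csv_file.csv"))  # path built relative to that root, on any machine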
I would like to set my temporary directory to the scratch space of the cluster. I have tried various methods, including this one: How to change directory for temporary files - problems with huge temporary raster files, but nothing works.
I have to read a large file (12 GB) in R and run some code using it.
I would like to read the file in this way:
library(data.table)
mydata<-fread("path/file")
But first I believe it is necessary to set the temporary directory to scratch/, otherwise the job gets killed.
Feel free to suggest any other approach.
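One common setup, sketched with a placeholder scratch path: tempdir() is fixed when R starts, from the TMPDIR/TMP/TEMP environment variables, so the scratch location has to be set before R is launched (for example in the job script or in ~/.Renviron).
# in the job script, before starting R:   export TMPDIR=/scratch/username   (path is an assumption)
# or add the line  TMPDIR=/scratch/username  to ~/.Renviron, then verify inside R:
Sys.getenv("TMPDIR")   # should show the scratch path
tempdir()              # the per-session temporary directory created under it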
I am teaching R tutorials in person with a large number of undergraduate R novices. I am also trying to format my notes on RPubs so that they can easily be used by other people. Nothing derails things faster than people mis-specifying working directories or saving spreadsheet files somewhere other than their working directory.
Is it possible to define a working directory that is universal across platforms? For example, a line of code or a function like
setwd( someplace that is likely to exist on every computer)
This could involve a function that finds some place that usually exists on all computers, such as the desktop, downloads folder, or R directory.
In general your best bet is to go for the user's home area:
setwd("~")
You can see where this points with
path.expand("~")
Since you are teaching novices, a common problem is that students notice the R package directory ~/R/ and assume that they should put their scripts in this directory, thereby creating odd bugs. To avoid this, I would go for
dir.create("~/RCourse", FALSE)
setwd("~/RCourse")
If you use RStudio, you could get them to create an RStudio project.
In the past, I have come across situations where this doesn't work. For example, some people have their home area as a network drive, but can't connect to the internet or get through a firewall.
You asked about
someplace that is likely to exist on every computer
and yes, R works hard to ensure this returns a valid directory: tempdir().
The main danger, though, is that this directory will vanish after the session (unless you override the default behaviour of removing the per-session temporary directory at the end). Until then, it works.
Still, this can be useful. I sometimes use it to write temporary files that I don't want cluttering the current directory, or ~.
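A small sketch of that pattern (the built-in mtcars data is used only as an example):
scratch_file <- tempfile(fileext = ".csv")      # unique file name inside tempdir()
write.csv(mtcars, scratch_file, row.names = FALSE)
read.csv(scratch_file)                          # removed with the session directory by default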
Otherwise, @csgillespie gave you a good answer pertaining to $HOME, aka ~.
I am writing a script that requires data stored in a folder on my computer.
But eventually this script will be used on another computer, by another person.
I can't tell them to change all the paths to the data in the script.
How can I refer to the data in my folder without writing the specific path,
like "C:\Users\Dima\Desktop\NewData\..."?
The best way of making your code shareable depends upon your use case.
As Carl Witthoft pointed out, most code should be encapsulated in functions. These functions can then be packaged into packages and easily redistributed to other people's machines. Writing packages is easier than you think.
For one-off analyses, scripts are appropriate. How you make them user-independent depends on who your users are. If you are sharing the script with colleagues, try to keep your data on a network drive; then the link to the data will be the same for everyone. If you are sharing your script with the world, then keep your data on the internet, and the link to the data will be a hyperlink, again the same for everyone.
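For example, base read.csv can read straight from a URL, so the same line works for every user (the address below is just a placeholder for wherever you host the data):
my_data <- read.csv("https://example.com/shared/my_data.csv")  # hypothetical URL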
If you are sharing your script with a few people who don't have access to a common drive, and you can't put your data on the internet, then some directory manipulation is acceptable.
Change your working directory to the root of where your project files are.
setwd("c:/Users/Dima/My Project")
Then you can reference the location of the data using relative paths.
data_file <- "Data/My data file.csv"
my_data <- read.csv(data_file)
Assuming that you keep the directory structure within your project the same, then you only need to change the call to setwd on each machine.
Also note that the special location "~" refers to your user home directory. Try
normalizePath("~")
That way, if you keep your project in that location, you can avoid reference to "Dima" entirely.
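A sketch of that, assuming the project lives directly under the home directory (the folder and file names are taken from the example above):
normalizePath("~")                            # see where ~ points on this machine
setwd(file.path("~", "My Project"))           # no user name hard-coded
my_data <- read.csv("Data/My data file.csv")  # relative path inside the project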
Often in "restricted security" situations in which programs can't be installed on a computer I run R from a flash drive. Works like a charm. I've recently started using dropbox and was thinking it could be used in a similar fashion to the flash drive. For anyone who has tried this does it work?
I can test it myself but don't want to go to the bother if it's a dead end.
Thanks in advance.
PS: this has the advantage of storing an .Rprofile, so that people with whom you share the Dropbox folder can then run your R code. This is particularly nice for people unfamiliar with R.
It should just work.
R is set up in such a way that all its files are relative to a given top-level directory. Whether that is an F:\ or Z:\ drive for your flash drive, or your Dropbox folder, should not matter.
By the same token, R can run happily off a shared folder, be it via Samba, NFS or another mechanism.
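If you want to confirm which installation and library trees a portable session is actually using, a quick check is:
R.home()      # top-level directory of the running R installation
.libPaths()   # library trees this session installs to and loads from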
It is fine if you want to share .Rprofile or .Rhistory. However, I see a problem with .RData, because it can be big (for example 4 GB). Saving a 100 MB file to Dropbox takes me minutes, and .RData can be far bigger.
An alternative would be a remote server that you could connect to through SSH.