Passing directory name as a parameter - R

My current directory is C:/Users/akshay/Documents, but all my data is in the directory "specdata", whose path is C:/Users/akshay/Documents/specdata.
When I type these commands separately in the console, they work successfully:
path <- "C:/Users/akshay/Documents"
directory <- "specdata"
setwd(paste(path, directory, sep="/", collapse=NULL))
But when I use it in a function like this, it won't change my working directory:
pollutantmean <- function(directory){
  directory <- character(1)
  path <- character(1)
  path <- "C:/Users/akshay/Documents"
  setwd(paste(path, directory, sep="/", collapse=NULL))
}
But when I call
> pollutantmean("specdata")
it won't change my working directory. Why is that? What is the problem?

Maybe try returning the paste. Also, you don't need the character() calls: directory <- character(1) overwrites the directory argument with an empty string, so the function builds the wrong path.
pollutantmean <- function(directory){
  path <- "C:/Users/akshay/Documents"
  return(paste(path, directory, sep="/", collapse=NULL))
}
pollutantmean("specdata")
Output:
> pollutantmean("test")
[1] "C:/Users/akshay/Documents/test"
Change directory:
pollutantmean <- function(directory){
  path <- "C:/Users/akshay/Documents"
  setwd(paste(path, directory, sep="/", collapse=NULL))
}
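If the goal is the path itself rather than the side effect, file.path() inserts the separators for you; a minimal sketch, where the helper name is illustrative and the base path is the asker's:

```r
# Sketch: build the full path without changing the working directory
full_path <- function(directory, base = "C:/Users/akshay/Documents") {
  file.path(base, directory)  # joins the pieces with "/"
}
full_path("specdata")
```

The result can then be passed to setwd() at the call site if changing the directory is really required, which keeps the function itself side-effect free.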

Related

My R function is consuming too much memory. Can you help me optimize it?

I'm new to R and having trouble optimizing a function.
The function should:
create a directory specified in the function
download the zip file from the link inside the function and extract it to that directory
move the extracted files to the main directory if they are extracted under a new subfolder
delete the subfolder
It works, but it consumes a lot of memory and takes 30 minutes to do such an easy job on a 2.7 MB zip file.
Thank you in advance!
create_dir <- function(directory) {
  path <- file.path(getwd(), directory)
  if (!file.exists(path)) {
    dir.create(path)
  }
  link <- "https://d396qusza40orc.cloudfront.net/rprog%2Fdata%2Fspecdata.zip"
  temp <- tempfile()
  download.file(link, temp, mode = "wb")
  unzip(temp, exdir = path)
  unlink(temp)
  # Move files that were extracted into a subfolder up to the main directory
  existing_loc <- list.files(path, recursive = TRUE)
  for (loc in existing_loc) {
    if (length(grep("/", loc))) {
      file.copy(file.path(path, loc), path)
      file.remove(file.path(path, loc))
    }
  }
  # Delete any leftover subfolders
  dirs <- list.dirs(path)
  rm_dirs <- dirs[dirs != path]
  if (length(rm_dirs)) {
    unlink(rm_dirs, recursive = TRUE)  # unlink() is vectorized; no loop needed
  }
}
create_dir("testDirectory")
Thanks, I found the problem. It's because I set the working directory inside OneDrive, which synced every extraction, move, and deletion of the 332 files processed by the function. Antivirus also ran alongside OneDrive and froze my PC for 30 minutes at 70% CPU.
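Independent of the OneDrive issue, the copy-then-remove step can be collapsed into a single file.rename(), which moves each file in one operation instead of two; a sketch of just the move step, with a hypothetical helper name:

```r
# Sketch: move files extracted into subfolders up to the top of `path`
move_up <- function(path) {
  locs <- list.files(path, recursive = TRUE)
  for (loc in locs[grepl("/", locs)]) {
    file.rename(file.path(path, loc),            # old location inside a subfolder
                file.path(path, basename(loc)))  # new location at the top level
  }
}
```

On the same filesystem a rename is nearly instant, so this halves the disk traffic of the copy/remove pair.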

How to use Plumber in an API to upload multiple images to a sub directory?

I have a sub directory at 'data/images/', and I need my API service to upload images into it. I am using R and plumber. I understand the basic setup, but I can't seem to get my code to deliver the uploaded files into my directory.
This is my attempt:
library(plumber)
library(Rook)

#* Upload file
#* @param req:[file]
#* @post /uploadfile
function(req, res){
  names(req)
  print(names(req))
  fileInfo <- list(formContents = Rook::Multipart$parse(req))
  print(fileInfo)
  ## The file is downloaded in a temporary folder
  tmpfile <- fileInfo$formContents$upload$tempfile
  ## Copy the file to a new folder, with its original name
  fn <- file.path(paste0("data/images/", req, sepp=''))
  file.copy(tmpfile, fn)
  print(fn)
  ## Send a message with the location of the file
  res$body <- paste0("Your file is now stored in ", fn, "\n")
  res
}
Any help would be greatly appreciated.
So after a bunch of pulled-out hair, this is essentially the code that allows you to upload pictures to a designated sub directory. It needs the plumber and Rook packages to work:
library(plumber)
library(Rook)

#* @param req:[file]
#* @post /upload_test27
function(req, res) {
  # Required for multiple file uploads
  names(req)
  # Parse into a Rook multipart file type; needed for API conversions
  fileInfo <- list(formContents = Rook::Multipart$parse(req))
  # This is where the file name is stored
  file_name <- fileInfo$formContents$req$filename
  # The file is downloaded into a temporary folder
  tmpfile <- fileInfo$formContents$req$tempfile
  # Create a file path
  fn <- paste0("data/images/", file_name)
  # Copy the file into the designated folder
  file.copy(tmpfile, fn)
  res$body <- paste0("Your file is now stored in ", fn, "\n")
  res
}
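For completeness, here is how the endpoint above might be served; the file name plumber.R and the port are assumptions:

```r
library(plumber)
pr <- plumb("plumber.R")  # parse the #* annotations into routes
pr$run(port = 8000)       # blocks and serves the API
# A client could then upload with, for example:
#   curl -F "req=@photo.jpg" http://localhost:8000/upload_test27
```

The field name in the curl call (req) must match the parameter name used in the #* @param annotation and in the Multipart$parse() lookup.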

Download files from FTP folder using Loop

I am trying to download all the files inside an FTP folder:
temp <- tempfile()
destination <- "D:/test"
url <- "ftp://XX.XX.net/"
userpwd <- "USER:Password"
filenames <- getURL(url, userpwd = userpwd,ftp.use.epsv = FALSE,dirlistonly = TRUE)
filenames <- strsplit(filenames, "\r*\n")[[1]]
When I print "filenames", I get all the file names inside the FTP folder - correct output up to this point:
[1] "2018-08-28-00.gz" "2018-08-28-01.gz"
[3] "2018-08-28-02.gz" "2018-08-28-03.gz"
[5] "2018-08-28-04.gz" "2018-08-28-05.gz"
[7] "2018-08-28-08.gz" "2018-08-28-09.gz"
[9] "2018-08-28-10.gz" "2018-08-28-11.gz"
[11] "2018-08-28-12.gz" "2018-08-28-13.gz"
[13] "2018-08-28-14.gz" "2018-08-28-15.gz"
[15] "2018-08-28-16.gz" "2018-08-28-17.gz"
[17] "2018-08-28-18.gz" "2018-08-28-23.gz"
for (i in filenames) {
  download.file(paste0(url, i), paste0(destination, i), mode = "w")
}
I got this error
trying URL 'ftp://XXX.net/2018-08-28-00.gz'
Error in download.file(paste0(url, i), paste0(destination, i), mode = "w") :
cannot open URL 'ftp://XXX.net/2018-08-28-00.gz'
In addition: Warning message:
In download.file(paste0(url, i), paste0(destination, i), mode = "w") :
InternetOpenUrl failed: 'The login request was denied'
I modified the code to:
for (i in filenames) {
  # download.file(paste0(url, i), paste0(destination, i), mode = "w")
  download.file(getURL(paste(url, filenames[i], sep = ""), userpwd = "USER:PASSWORD"),
                paste0(destination, i), mode = "w")
}
After that, I got this error:
Error in function (type, msg, asError = TRUE) : RETR response: 550
Without a minimal, complete, and verifiable example it is a challenge to directly replicate your problem. Assuming the file names don't include the URL, you'll need to combine them to access the files.
download.file() requires a file to be read, an output file, as well as additional flags regarding whether you want a binary download or not.
For example, I have data from Alberto Barradas' Pokémon Stats kaggle.com data set stored on my Github site. To download some of the files to the test subdirectory of my R Working Directory, I can use the following code:
filenames <- c("gen01.csv","gen02.csv","gen03.csv")
fileLocation <- "https://raw.githubusercontent.com/lgreski/pokemonData/master/"
# use ./ for subdirectory of current directory, end with / to work with paste0()
destination <- "./test/"
# note that these are character files, so use mode="w"
for (i in filenames){
  download.file(paste0(fileLocation, i),
                paste0(destination, i),
                mode = "w")
}
The paste0() function concatenates text without spaces, which allows the code to generate a fully qualified path name for the url of each source file, as well as the subdirectory where the destination file will be stored.
To illustrate what's happening with paste0() in the for() loop, we can use message() to print to the R console.
> # illustrate what paste0() does
> for (i in filenames){
+ message(paste("Source is: ",paste0(fileLocation,i)))
+ message(paste("Destination is:",paste0(destination,i)))
+ }
Source is: https://raw.githubusercontent.com/lgreski/pokemonData/master/gen01.csv
Destination is: ./test/gen01.csv
Source is: https://raw.githubusercontent.com/lgreski/pokemonData/master/gen02.csv
Destination is: ./test/gen02.csv
Source is: https://raw.githubusercontent.com/lgreski/pokemonData/master/gen03.csv
Destination is: ./test/gen03.csv
>
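Coming back to the original FTP failure ("The login request was denied"): a common fix is to embed the credentials directly in the URL, so download.file() can authenticate without getURL(). A sketch, reusing the asker's placeholder host and credentials:

```r
url <- "ftp://USER:Password@XX.XX.net/"  # credentials embedded in the URL
destination <- "D:/test/"                # note the trailing slash, so paste0() builds a valid path
filenames <- c("2018-08-28-00.gz", "2018-08-28-01.gz")
sources <- paste0(url, filenames)        # fully qualified, authenticated URLs
targets <- paste0(destination, filenames)
# .gz archives are binary, so mode = "wb" rather than "w":
# for (k in seq_along(sources)) download.file(sources[k], targets[k], mode = "wb")
```

Note that the original loop also pasted "D:/test" (no trailing slash) directly onto each file name, producing paths like "D:/test2018-08-28-00.gz".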

R Studio-0.99.451: how to unzip folder and paste files into destination folder

I am trying to extract (unzip) a folder (namely "pakistan.zip", which contains 5 files: Pak_admin0.shp, Pak_admin0.shx, Pak_admin0.dbf, Pak_admin0.prj, Pak_admin0.qpj) and copy the .shp, .shx, and .dbf files from that folder to a destination folder, using RStudio 0.99.451 with the following code:
for(j in list(".shp", ".shx", ".dbf"))
{
  fname <- unzip(file=paste("pakistan", j, sep=""), zipfile= "pakistan.zip")
  file.copy(fname, paste("./pakistan", j, sep="/"), overwrite=TRUE)
}
unlink("pakistan.zip")
unlink("pakistan.zip")
but it gives me the following error:
Warning messages:
1: In unzip(file = paste("zupanije", j, sep = ""), zipfile = "pakistan.zip") : requested file not found in the zip file
2: In unzip(file = paste("zupanije", j, sep = ""), zipfile = "pakistan.zip") : requested file not found in the zip file
3: In unzip(file = paste("zupanije", j, sep = ""), zipfile = "pakistan.zip") : requested file not found in the zip file
Please suggest any possible solution to this error.
This is the original code I found, but the zip.file.extract function is no longer part of R:
for(j in list(".shp", ".shx", ".dbf")){
  fname <- zip.file.extract(file=paste("zupanije", j, sep=""),
                            zipname="zupanije.zip")
  file.copy(fname, paste("./zupanije", j, sep=""), overwrite=TRUE)
}
unlink("zupanije.zip")
I want to automate downloading the shape file from a website, unzipping it, and placing it into another folder, then displaying it using readShapePoly() from the maptools library.
Your code works for me for a zip file that contains those files. The error suggests those files are not contained in the zip file. Since you say you are trying to extract a "directory" perhaps they are in a subdirectory in the zipfile? For example, if I put the files in a "temp" directory and then create a zip file of that directory, I must add the directory to the file path, like this:
f <- "test.zip"
for(j in list(".shp", ".shx", ".dbf"))
{
  # note "pakistan" directory added to path:
  # unzip pakistan/zupanije.shp (or .shx or .dbf) out of test.zip
  fname <- unzip(file=paste("pakistan/zupanije", j, sep=""), zipfile= f)
  # copy extracted file to destination directory
  file.copy(fname, paste("./destination", j, sep="/"), overwrite=TRUE)
}
If you are in a Linux like environment, you could try the following command to inspect the zip file and ensure it contains what you think it contains and at the path you expect:
unzip -vl pakistan.zip
By the way, your code will output the file "./pakistan/.dbf", "./pakistan/.shx" and "./pakistan/.shp". Is that what you want? Or do you perhaps want "pakistan.shx", etc. in which case this change is needed:
-file.copy(fname, paste("./pakistan", j, sep="/"), overwrite=TRUE)
+file.copy(fname, paste("./pakistan", j, sep=""), overwrite=TRUE)
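If a Linux shell isn't handy, the same inspection can be done from R itself: unzip() with list = TRUE returns the archive's contents as a data frame without extracting anything. This assumes pakistan.zip sits in the working directory:

```r
# List the archive's contents without extracting anything
contents <- unzip("pakistan.zip", list = TRUE)  # columns: Name, Length, Date
contents$Name  # shows whether entries sit under a subdirectory such as "pakistan/..."
```

Whatever paths appear in the Name column are exactly what must be passed to unzip()'s files argument.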

Always set working directory to Dropbox folder on any machine

I wanted to have a simple code at the beginning of my scripts to set the working directory to my Dropbox folder, regardless of which machine I run my code on:
setdir <- function(){
  wandir <- paste(path.expand("~"), "/Dropbox/_R", sep = "")
  curdir <- getwd()
  if(curdir != wandir){
    setwd(wandir)
  }
}
setdir()
The trick with the path.expand("~") works on Linux machines, but it doesn't on Windows machines, because it leads to C:/Users/username/Documents instead of C:/Users/username/. Is there a function that would work globally?
Here is a hacky workaround, which is far from a global one:
setdir <- function(){
  wandir <- paste(path.expand("~"), "/Dropbox/_R", sep = "")
  wandir <- sub("/Documents", "", wandir)
  curdir <- getwd()
  if(curdir != wandir){
    setwd(wandir)
  }
}
setdir()
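A slightly more portable variant, sketched under the assumption that Dropbox lives directly under the user profile: on Windows, Sys.getenv("USERPROFILE") points at C:/Users/username rather than the Documents subfolder that path.expand("~") returns there.

```r
# Sketch: resolve the user's real home directory on both Windows and Unix
dropbox_dir <- function() {
  home <- if (.Platform$OS.type == "windows") {
    Sys.getenv("USERPROFILE")  # C:/Users/username, not .../Documents
  } else {
    path.expand("~")           # /home/username or /Users/username
  }
  file.path(home, "Dropbox", "_R")
}

setdir <- function() {
  wandir <- dropbox_dir()
  if (getwd() != wandir) setwd(wandir)
}
```

This avoids the string surgery on "/Documents" and works the same way on both platforms, provided the Dropbox folder really sits at the profile root.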
