I've just begun to use the R-ArcGIS bridge package arcgisbinding and am running into a problem when I try to join feature class data with the dplyr package. Here is an example where I'm trying to get the ozone columns from two shapefiles into a single data frame, then export back as a shapefile.
library(dplyr)
library(arcgisbinding)
arc.check_product()
fc <- arc.open(system.file("extdata", "ca_ozone_pts.shp",
package="arcgisbinding"))
d <- arc.select(fc, fields=c('FID', 'ozone'))
p <- arc.select(fc, fields=c('FID', 'ozone'))
p$ozone <- p$ozone * 2
p <- left_join(p, d, by="FID")
arc.write(tempfile("ca_new", fileext=".shp"), p)
# original dataframe has shape attributes
str(d)
# new dataframe does not
str(p)
From the arcgisbinding package, p and d above are data frame objects with shape attributes. The problem is that once I use left_join, I lose the spatial attribute data in the joined data frame. Is there a way around this?
So apparently this is a known problem (see GitHub here).
A workaround using the spdplyr package comes by way of Shaun Wallbridge on ESRI GeoNet (link to thread). Basically, convert the arc.data data frame into an sp object, perform analyses, then export as a feature class or shapefile.
library(spdplyr)
library(arcgisbinding)
arc.check_product()
fc <- arc.open(system.file("extdata", "ca_ozone_pts.shp", package="arcgisbinding"))
d <- arc.select(fc, fields=c('FID', 'ozone'))
d.sp <- arc.data2sp(d)
p <- arc.select(fc, fields=c('FID', 'ozone'))
p.sp <- arc.data2sp(p)
p.sp$ozone <- p.sp$ozone * 2
joined <- left_join(p.sp, d.sp, by="FID", copy=TRUE)
joined.df <- arc.sp2data(joined)
arc.write(tempfile("ca_ozone_pts_joined", fileext=".shp"), joined.df)
I am new to working with Spatial Data in R, and I'm having trouble accessing the polygons and relating the polygons to the data. Here's the thing:
I have municipalities, a SpatialPolygonsDataFrame, and I can access the data section with municipalities@data. This, of course, is a data frame, and I can do the usual manipulation with sapply and similar functions. I want, however, to create a new variable in data that depends on the polygons. So, for example, I would like to do:
municipalities$pointInPolygon <- sapply(POLYGON, function(x){ getPointInPolygon(x) })
or something similar. The thing is, I still don't understand how the data and polygon sections of my SPDF are related, and how to access the latter.
Any answers, tips, or resources to understand SpatialPolygonsDataFrames better are appreciated.
PS: Here is the data I'm using and how I am reading it
municipalities <- rgdal::readOGR("México_Estados/México_Estados.shp")
Why not use the sf package for simple features?
municipalities_sf <- sf::read_sf("México_Estados.shp")
class(municipalities_sf)
#> [1] "sf" "tbl_df" "tbl" "data.frame"
Then you can use dplyr, purrr, etc. on the data frame directly:
municipalities_sf |>
  dplyr::mutate(pointInPolygon = purrr::map(geometry, function(x) myFantasticFunction(x)))
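As a more concrete variant of the sketch above (not from the original answer), sf's st_point_on_surface() returns a point guaranteed to lie inside each polygon, which seems close to what the question's getPointInPolygon() is after; this assumes the geometry column is named geometry:
library(sf)
library(dplyr)
# add a point-in-polygon column; st_point_on_surface() may warn for lon/lat data
municipalities_sf <- municipalities_sf |>
  dplyr::mutate(pointInPolygon = sf::st_point_on_surface(geometry))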
I have a spatial dataset, a polyline, that contains 115 line features, and I am trying to figure out if it is possible to select and save each line feature out into individual shapefiles using a loop or similar function.
I understand how to do this individually using subset (example below); however, repeating this process 115 times seems like a waste of time and of the power of R.
I am including an example of the data below:
trailname <- ("trail1", "trail2", "trail3")
trailtype <- ("mountain", "flat", "hilly")
parking <- ("no", "yes", "no")
shapelength <- ("835", "5728", "367")
trails <- data.frame(accessname, trailtype, parking, shapelength)
Here is a single subset example:
trail1 <- subset(trails, trailname == "trail1")
I would like to select each trail and save it out under the name that appears in the "trailname" column, e.g., trail1.shp.
In base R, couldn't you use the assign function in a for loop to do this?
trailname <- c("trail1", "trail2", "trail3")
trailtype <- c("mountain", "flat", "hilly")
parking <- c("no", "yes", "no")
shapelength <- c("835", "5728", "367")
trails <- data.frame(trailname, trailtype, parking, shapelength)
for(i in 1:nrow(trails)){
  name <- as.character(trails$trailname[[i]])
  assign(name, subset(trails, trailname == trails$trailname[[i]]))
}
EDITED TO ANSWER OP'S COMMENT
This should be do-able with a few tweaks. One item to note is that the example you provided is a data frame, while the writeOGR function takes...
SpatialPointsDataFrame, SpatialLinesDataFrame, or SpatialPolygonsDataFrame objects as defined in the sp package.
These types of objects have data frames, but also other attributes that are likely of interest. Let's assume your data is in one of these accepted types. I'll use the rgdal cities data as an example. If all we care about is saving the files outside of our R session, then skip the assign function and drop the subset into the writeOGR function:
library('rgdal')
# loading in data
cities <- readOGR(system.file("vectors", package = "rgdal")[1], "cities")
# taking only the first two rows for this example
shap <- cities[1:2, ]
# where you want to save these files; this places them in your current working directory
location <- getwd()
for(i in 1:nrow(shap)){
  # name of the file
  name <- as.character(shap$NAME[[i]])
  # change shap to your 'SpatialPointsDataFrame'
  writeOGR(subset(shap, NAME == shap$NAME[[i]]), location, name, driver = "ESRI Shapefile")
}
There is an R package called ShapePattern; look up the function shpsplitter, which seems to do what you want. Otherwise, you can do it in other GIS software, see here https://gis.stackexchange.com/questions/25709/splitting-shapefile-into-separate-files-for-each-feature-using-qgis-gdal-saga
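If you would rather stay in R without an extra package, here is a minimal sketch using the sf package instead (not the ShapePattern route); it assumes a polyline shapefile trails.shp with a trailname column matching the question's example, and both names are hypothetical:
library(sf)
# read the polyline layer (path is hypothetical)
trails_sf <- st_read("trails.shp")
# write one shapefile per feature, named after the trailname column
for (nm in unique(trails_sf$trailname)) {
  st_write(trails_sf[trails_sf$trailname == nm, ], paste0(nm, ".shp"))
}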
I'm working with a feature class dataset extracted from a geodatabase, which I've filtered to my area of interest and intersected with a SpatialPointsDataFrame. In order to export it to a shapefile with writeOGR, I need to format the attribute names, and I also want to select only specific columns to export in my final shapefile. I have been running into a lot of errors using standard select or base R subsetting techniques. For some reason R doesn't seem to recognize the column names when I try to select. I've tried lots of different methods and can't figure out where I'm going wrong.
bfcln %>%
  select(STATEFP, DP2_HC03_V, DP2_HC03V.1)
Error in tolower(use) : object 'STATEFP' not found
# create a spatial join between bf_pop and or_acs
# check CRS
crsbf <- bf_pop@proj4string
# change acs CRS to match bf_pop
oracs_reprj <- spTransform(or_acs, crsbf)
# join by spatial attributes
bf_int <- raster::intersect(bf_pop, oracs_reprj)
# truncate field names to 10 characters for ESRI formatting
names(bf_int) <- strtrim(names(bf_int), 10)
# remove duplicates from attribute table
bfcln <- bf_int[which(!duplicated(bf_int$id)), ]
After failing with the select() method multiple times, I tried renaming columns.
# rename variables of interest
bfcln1 <- bfcln %>%
  select(DP2_HC03_V) %>%
  rename(DP2_HC03_V = pcntunmar) %>%
  select(DP2_HC03_V.1) %>%
  rename(DP2_HC03_V.1 = pcntirsh)
Error in tolower(use) : object 'DP2_HC03_V' not found
To use dplyr verbs like rename() on sp spatial objects, you'll need to install the spdplyr package.
Then, just as in dplyr, you'd do:
df <- df %>%
  rename(newName = oldName)
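An option not covered above: convert the sp object to sf with st_as_sf(), after which dplyr's select() and rename() work on it directly. A minimal sketch, assuming bfcln is the SpatialPolygonsDataFrame from the question (the output file name is just illustrative):
library(sf)
library(dplyr)
# sp -> sf; the attribute table becomes ordinary data frame columns
bfcln_sf <- st_as_sf(bfcln)
bfcln_sf <- bfcln_sf %>%
  select(STATEFP, DP2_HC03_V) %>%
  rename(pcntunmar = DP2_HC03_V)
# write straight to a shapefile, or convert back with as(bfcln_sf, "Spatial")
st_write(bfcln_sf, "bfcln_cleaned.shp")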
In R, how can I export a khrud object from the function kernelUD in the package adehabitat to a raster file (GeoTIFF)?
I tried following this thread (R: how to create raster layer from an estUDm object) using the code here:
writeRaster(raster(as(udbis1,"SpatialPixelsDataFrame")), "udbis1.tif")
where udbis1 is a khrud object, but I get: Error in as(udbis1, "SpatialPixelsDataFrame") : no method or default for coercing "khrud" to "SpatialPixelsDataFrame".
I think the issue may be that the old thread was before an update to the adehabitat package changed the data format from estUD to khrud. Maybe?
You do not provide a reproducible example. The following works for me:
library(adehabitatHR)
library(raster)
data(puechabonsp)
loc <- puechabonsp$relocs
ud <- kernelUD(loc[, 1])
r <- raster(as(ud[[1]], "SpatialPixelsDataFrame"))
writeRaster(r, filename = file.path(tempdir(), "ud1.tif"))
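If the estUDm holds several animals, a small loop building on the code above writes one GeoTIFF per animal (the file names are just illustrative):
for (i in seq_along(ud)) {
  r_i <- raster(as(ud[[i]], "SpatialPixelsDataFrame"))
  writeRaster(r_i, filename = file.path(tempdir(), paste0("ud_", names(ud)[i], ".tif")),
              overwrite = TRUE)
}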
adehabitatHR solutions work well for data that are in the required format or when using multiple animals, but when you want to create a KDE from data organized differently, or for only one animal, it can be frustrating. For some reason, @johaness's answer doesn't work for my case, so here is an alternative solution that avoids the headaches of going into adehabitatHR's innards.
library(adehabitatHR)
library(raster)
# Recreating an example for only one animal
# with a basic xy dataset like one would get from tracking
data(puechabonsp)
loc <- puechabonsp$relocs
loc <- as.data.frame(loc)
# keep only Brock's relocations and just the coordinate columns,
# so that kernelUD() returns a single estUD
loc <- loc[loc$Name == "Brock", c("X", "Y")]
coordinates(loc) <- ~X+Y
ud <- kernelUD(loc)
# Extract the UD values and coordinates into a data frame
udval <- data.frame("value" = ud$ud, "lon" = ud@coords[, 1], "lat" = ud@coords[, 2])
coordinates(udval) <- ~lon+lat
# coerce to SpatialPixelsDataFrame
gridded(udval) <- TRUE
# coerce to raster
udr <- raster(udval)
plot(udr)
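To finish what the question asked for, the raster can then be written out as a GeoTIFF (the output path is illustrative):
writeRaster(udr, filename = file.path(tempdir(), "ud_brock.tif"), overwrite = TRUE)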
I want to convert two .shp files into one database that would allow me to draw the maps together.
Also, is there a way to convert .shp files into .csv files? I want to be able to personalize and add some data, which is easier for me in .csv format. What I have in mind is to overlay yield data and precipitation data on the maps.
Here are the shapefiles for Morocco and Western Sahara.
Code to plot the two files:
# This is code for mapping of CGE_Morocco results
# Loading administrative coordinates for Morocco maps
library(sp)
library(maptools)
library(mapdata)
# Loading shape files
Mor <- readShapeSpatial("F:/Purdue University/RA_Position/PhD_ResearchandDissert/PhD_Draft/Country-CGE/MAR_adm1.shp")
Sah <- readShapeSpatial("F:/Purdue University/RA_Position/PhD_ResearchandDissert/PhD_Draft/Country-CGE/ESH_adm1.shp")
# Plotting the maps (raw)
png("Morocco.png")
Morocco <- readShapePoly("F:/Purdue University/RA_Position/PhD_ResearchandDissert/PhD_Draft/Country-CGE/MAR_adm1.shp")
plot(Morocco)
dev.off()
png("WesternSahara.png")
WesternSahara <- readShapePoly("F:/Purdue University/RA_Position/PhD_ResearchandDissert/PhD_Draft/Country-CGE/ESH_adm1.shp")
plot(WesternSahara)
dev.off()
After looking into suggestions from @AriBFriedman and @PaulHiemstra, and subsequently figuring out how to merge .shp files, I have managed to produce the following map using the code and data below (for the .shp data, cf. the links above).
code:
# Merging Mor and Sah .shp files into one .shp file
library(ggplot2) # for fortify() and ggplot()
MoroccoData <- rbind(Mor@data, Sah@data) # First, 'stack' the attribute list rows using rbind()
MoroccoPolys <- c(Mor@polygons, Sah@polygons) # Next, combine the two polygon lists into a single list using c()
summary(MoroccoData)
summary(MoroccoPolys)
offset <- length(MoroccoPolys) # Next, generate a new polygon ID for the new SpatialPolygonsDataFrame object
for (i in 1:offset) {
  sNew <- as.character(i)
  MoroccoPolys[[i]]@ID <- sNew
}
ID <- c(as.character(1:length(MoroccoPolys))) # Create an identical ID field and append it to the merged Data component
MoroccoDataWithID <- cbind(ID, MoroccoData)
MoroccoPolysSP <- SpatialPolygons(MoroccoPolys, proj4string = CRS(proj4string(Sah))) # Promote the merged list to a SpatialPolygons object
Morocco <- SpatialPolygonsDataFrame(MoroccoPolysSP, data = MoroccoDataWithID, match.ID = FALSE) # Combine the merged Data and Polygon components into a new SpatialPolygonsDataFrame
Morocco@data$id <- rownames(Morocco@data)
Morocco.fort <- fortify(Morocco, region = 'id')
Morocco.fort <- Morocco.fort[order(Morocco.fort$order), ]
MoroccoMap <- ggplot(data = Morocco.fort, aes(long, lat, group = group)) +
  geom_polygon(colour = 'black', fill = 'white') +
  theme_bw()
Results:
New Questions:
1- How to eliminate the boundary data that cuts the map in half?
2- How to combine different regions within a .shp file?
Thank you all.
P.S: The community at stackoverflow.com is wonderful and very helpful, especially toward beginners like me :) Just thought of emphasizing it.
Once you have loaded your shapefiles into Spatial{Lines/Polygons}DataFrames (classes from the sp package), you can use the fortify generic function to transform them into flat data.frame format. The specific methods for the fortify generic are included in the ggplot2 package, so you'll need to load that first. A code example:
library(ggplot2)
polygon_dataframe = fortify(polygon_spdf)
where polygon_spdf is a SpatialPolygonsDataFrame. A similar approach works for SpatialLinesDataFrames.
The difference between my solution and that of @AriBFriedman is that mine includes the x and y coordinates of the polygons/lines, in addition to the data associated with those polygons/lines. I really like visualising my spatial data with the ggplot2 package.
Once you have your data in a normal data.frame, you can simply use write.csv to generate a .csv file on disk.
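For example (a minimal illustration; the file name is made up), using the fortified data frame from the snippet above:
write.csv(polygon_dataframe, file = "polygon_data.csv", row.names = FALSE)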
I think you mean you want the associated data.frame from each?
If so, it can be accessed with the @ slot-access operator. The slot is called data:
write.csv(WesternSahara@data, file = "/home/wherever/myWesternSahara.csv")
Then when you read it back in with read.csv, you can try assigning:
myEdits <- read.csv("/home/wherever/myWesternSahara_modified.csv")
WesternSahara@data <- myEdits
You may need to do some massaging of row names and so forth to get it to accept the new data.frame as valid. I'd probably try to merge the existing data.frame with a csv you read into R, rather than making edits destructively.
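A minimal sketch of that non-destructive route, assuming both tables share a key column (here called NAME; the column name and file path are hypothetical):
extra <- read.csv("/home/wherever/yield_and_precip.csv")
# match() keeps the polygon row order intact, which merge() would not guarantee
idx <- match(WesternSahara@data$NAME, extra$NAME)
WesternSahara@data <- cbind(WesternSahara@data,
                            extra[idx, setdiff(names(extra), "NAME"), drop = FALSE])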