How to calculate the area of a multipart polygon in R

I'm having trouble calculating, in R, the area of an imported shapefile that contains a multipart polygon (one feature containing two separate polygons). I noticed that ArcMap and raster::area gave me different values for the shapefile's area. To figure out which program was giving me the correct area, I broke the shapefile into single parts and recalculated the areas of the two separate polygons:
> library(raster)
> single_part <- shapefile("../Desktop/test/test_sp.shp")
> area(single_part)
[1] 575924.0 433409.8
> sum(area(single_part))
[1] 1009334
>
> multi_part <- shapefile("../Desktop/test/test_mp.shp")
> area(multi_part)
[1] 1018390
Now that I know about this problem, I realize I should always break polygon feature classes into single parts, but does anyone know how raster::area calculates the area of multipart polygons? I also tried rgeos::gArea but got the same result. Is there a way to correctly calculate the area of multipart polygons in R?
I'd love to know, because they're pretty common and I'm trying to switch from doing all my analyses in ArcMap to R.
In case it's helpful, here's an image of the shapefile:
[image: multipart poly shapefile]
EDIT ADDED 9/21/2018 -------------------------------------------------------
Here's a link to the shapefile test_mp.shp
From what I can tell, the problem seems to stem from how R (vs. ArcMap) interprets the holes. See the difference between the ArcMap display and the R display. For some reason R is filling in those holes as part of the shapefile, which must be why I'm getting different area calculations. Is there something wrong with the shapefile, or with how I'm importing it?
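For reference, one way to check how R read the rings (a minimal sketch using the multi_part object loaded above) is to inspect each ring's hole flag:
library(sp)
# TRUE means the ring was read as a hole; FALSE means a filled ring
sapply(multi_part@polygons[[1]]@Polygons, function(p) p@hole)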

Clearly your object named 'multi_part' has only one (multi?) polygon, as area returns a single value. I illustrate here how to investigate what you are after:
library(raster)
d <- getData('GADM', country='Isle of Man', level=0)
area(d)
[1] 579672897
# Split into 4 polygons (islands)
dd <- disaggregate(d)
a <- area(dd)
a
[1] 19424.12 2705442.41 25629.79 576922400.90
sum(a)
[1] 579672897
The same area, and there is no reason why they would be different. Except perhaps if there is confusion with polygon holes. It is difficult to comment without your data.
You can write these objects to disk (see below) and see what ArcGIS gives you as area (but note that this example uses lon/lat coordinates, I am not sure if ArcGIS can compute areas on those).
shapefile(d, "man.shp")
Here is a case with and without a hole:
p1 <- rbind(c(-180,-20), c(-140,55), c(10, 0), c(-140,-60), c(-180,-20))
p2 <- rbind(c(-150,-20), c(-100,-10), c(-110,20), c(-150,-20))
# two (overlapping) polygons (no hole)
pol1 <- spPolygons(p1, p2, crs="+proj=utm +zone=1 +datum=WGS84")
# single polygon with hole
pol2 <- spPolygons(list(p1, p2), crs="+proj=utm +zone=1 +datum=WGS84")
a <- area(pol1) / 10e+9
b <- area(pol2) / 10e+9
a
#[1] 10925 800
sum(a)
#[1] 11725
a[1]-a[2]
#[1] 10125
b matches a[1] - a[2], as expected
b
#[1] 10125
I get exactly the same results with ArcGIS, using "calculate geometry" for a field in the attribute tables.

Related

Impossible NAs in wrld_simpl

When I create a raster of the world's land based on wrld_simpl (or any other environmental layer, e.g. from worldclim), there always appear to be some "impossible" NAs on land. Why would that happen? I need a clean mask of the world's land to extract records that did not fall in the ocean. However, many records that are on land are still considered NA.
My script goes like this:
require(raster)
require(maptools)
data(wrld_simpl)
x=read.csv("https://www.dropbox.com/s/ncvu64r2fxgfd4e/NAlocations.csv?dl=0")
r=raster(ncols=360,nrows=(180))
extent(r)=extent(wrld_simpl)
r=rasterize(wrld_simpl,r,wrld_simpl$AREA)
plot(r)
x=x[-which(is.na(extract(r,x$lon,x$lat))),]# This should eliminate all locations not on land.
points(x$lon,x$lat, col="red", cex=.3)
How is that possible? And would it be a way to create a clean raster for the world land?
The direct read.csv from dropbox does not work for me.
If I do
z <- extract(r, x)
# NOT z <- extract(r, x[,1], x[,2]) !!!
i <- which(is.na(z))
points(x[i,])
I see a bunch of points in the water off the coast of Mozambique.
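So the fix, keeping the names from the question (a minimal sketch), is to pass the coordinates as a single two-column object rather than as two separate vectors:
z <- extract(r, x[, c("lon", "lat")])
x <- x[!is.na(z), ] # keep only the records that fall on a land cell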

Query raster brick layer based on another raster in R

I have a NetCDF file of global oceanographic (OmegaA) data at relatively coarse spatial resolution, with 33 depth levels. I also have a global bathymetry raster at much finer resolution. My goal is to get the seabed OmegaA data from the NetCDF file, using the bathymetry data to determine the desired depth. My code so far:
library(raster)
library(rgdal)
library(ncdf4)
# Aragonite data. Defaults to CRS WGS84
ncin <- nc_open("C:/..../GLODAPv2.2016b.OmegaA.nc")
ncin.depth <- ncvar_get(ncin, "Depth")# 33 depth levels
omegaA.brk <- brick("C:/.../GLODAPv2.2016b.OmegaA.nc")
omegaA.brk <- rotate(omegaA.brk) # because the netCDF longitudes run 0-360
# depth raster. CRS WGS84
r<-raster("C:/....GEBCO.tif")
# resample the raster brick to the resolution that matches the bathymetry raster
omegaA.brk <-resample(omegaA.brk, r, method="bilinear")
# create blank final raster
omegaA.rast <- raster(ncol = r@ncols, nrow = r@nrows)
extent(omegaA.rast) <- extent(r)
omegaA.rast[] <- NA_real_
# create vector of indices of desired depth values
depth.values<-getValues(r)
depth.values.index<-which(!is.na(depth.values))
# loop to find appropriate raster brick layer, and extract the value at the desired index, and insert into blank raster
for (p in depth.values.index) {
dep.index <- which(abs(ncin.depth+depth.values[p]) == min(abs(ncin.depth+depth.values[p]))) ## this sometimes results in multiple levels being selected
brk.level <- omegaA.brk[[dep.index]] # can be more than one level if multiple layers were selected above
omegaA.rast[p] <- brk.level[[1]][p] ## here I choose the first level if multiple levels were selected above
print(paste(p, "of", length(depth.values.index))) # counter to look at progress.
}
The problem: the result is a raster with massive gaps (NAs) where there should be data. The gaps often take a distinctive shape, e.g. they follow a contour or run along a long straight line. I've pasted a cropped example.
[image: cropped example of the gaps]
I think this could be because either 1) for some reason the 'which' statement in the loop is not finding a match, or 2) the projections have become misaligned, which I've read can happen when using rotate.
I've tried to make sure all the extents, resolutions, number of cells, and CRS's are all the same, which they seem to be.
To speed up the process I've cropped the global brick and bathy raster to my area of interest, again checking that all the spatial resolutions, etc etc match - I've not included those steps here for simplicity.
At a loss. Any help welcome!
Without a reproducible example, this kind of problem is hard to solve. I can't tell where your problem is, but I'll present the approach I would try. Maybe it's good, maybe it's bad, I don't know, but it may inspire you to find a way around your problem.
To my understanding, you have a brick of OmegaA (33 layers/depths) and a bathymetry raster. You want to get the OmegaA value at the bottom of the sea. Here is how I would do it:
Resample the OmegaA brick to the same resolution and extent as the bathymetry raster.
Transform the bathymetry raster into a raster brick of 33 layers of 0/1 values. E.g. if the sea bottom is at 200 m for a particular pixel, that pixel is 0 on every depth layer except the 200 m one, where it is 1. To program this I would go the long way, something like:
r_1 <- r
values(r_1) <- values(r)==10 # where 10 is the depth (it could be a range with < or >)
r_2 <- r
values(r_2) <- values(r)==20
...
r_33 <- r
values(r_33) <- values(r)==250
r_brick <- brick(r_1, r_2, ..., r_33)
Then you multiply the two raster bricks. They have the same dimensions, so it should be easy. The output should be a raster brick of 33 layers that is 0 everywhere except at the bottom of the sea, where it holds the OmegaA value.
Combine all the layers of that brick into a single raster with a sum.
This should work. If you have problems dealing with raster bricks, you could pull the data into base R arrays; it might be simpler.
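For what it is worth, here is a compact sketch of that multiply-and-sum idea; the names bottom.layer and mask.brk are mine, and it assumes omegaA.brk and r already share extent and resolution, with ncin.depth holding the 33 layer depths:
# index of the depth level nearest the seabed, per cell
# (the bathymetry depths are negative, hence ncin.depth + z)
bottom.layer <- calc(r, fun = function(d)
  sapply(d, function(z) if (is.na(z)) NA else which.min(abs(ncin.depth + z))))
# one 0/1 layer per depth level: 1 where that level is the seabed
mask.brk <- stack(lapply(seq_along(ncin.depth), function(i) bottom.layer == i))
# multiply and sum across layers to pull out the seabed OmegaA value
seabed.omegaA <- sum(omegaA.brk * mask.brk, na.rm = TRUE)
(raster::stackSelect(omegaA.brk, bottom.layer) would do the same layer lookup in one call, without building the mask brick.)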
Good luck.

ggmap - merging two satellite ggmaps into 1 ggmap

I am trying to map data onto higher-resolution Google satellite imagery. I could use a lower-resolution image (e.g. zoom 13 and limit the scales, as suggested here: ggmap extended zoom or boundaries), but the resulting image is not clear enough for my purpose. So basically I would like to combine two zoom-14 images into one ggmap:
library(ggmap)
library(gridExtra)
g1 <- get_googlemap(center = c(-83.986927, 33.955656), maptype="satellite", zoom=14)
g2 <- get_googlemap(center = c(-83.938079, 33.955656), maptype="satellite", zoom=14)
gmap1 <- ggmap(g1)
gmap2 <- ggmap(g2)
grid.arrange(gmap1, gmap2, ncol =2)
but end up with one ggmap object that combines gmap1 and gmap2.
You can (and probably should) convert them to raster objects. You should really use them independently from then on, like tiles, since their pixels don't seem to be on the same grid, so mosaicking them might not be perfect. You can bodge this by adjusting the tolerance.
The objects from get_googlemap are matrices of colour values in hex ("#FF0000" etc.) with some attributes defining the extent. The following code converts such an object to a three-band RGB raster with the right extent and CRS:
library(raster)
ggmap2raster <- function(g){
rgb = col2rgb(g)
bands = apply(rgb, 1, function(band){
raster(t(matrix(band,ncol=ncol(g), nrow=nrow(g))))
})
s = stack(bands)
bb = attr(g, "bb")
extent(s) = extent(bb$ll.lon,bb$ur.lon, bb$ll.lat, bb$ur.lat)
crs(s) <- "+init=epsg:4326"
s
}
To merge a bunch of them, this code uses mosaic, but because the layers don't seem to line up quite right (possibly because the data are really in Web Mercator rather than WGS84) you need to up the tolerance and hope:
mergegg <- function(glist){
m = function(...){
mosaic(...,tolerance=0.5, fun=min)
}
do.call(m,
lapply(glist, function(g){
ggmap2raster(g)
})
)
}
> r = mergegg(list(g1, g2))
> plotRGB(r)
I suspect the tolerance problem would disappear if I converted the corner coords back to Web Mercator, but that's too much bother for a Friday morning; ggmap and its handling of coordinate systems is not something I want to get into right now. You could try binding the two g1 and g2 matrix objects together, but you would probably have to do the reverse transform first, and to be honest, given the restrictions on using Google satellite images (you have read the license conditions?), I suspect it's a bad idea.
To visualise raster objects, use the tmap package instead of ggmap.
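For example, a minimal sketch, assuming a tmap version that provides tm_rgb and using the merged raster r from above:
library(tmap)
tm_shape(r) + tm_rgb() # draw the three bands as an RGB image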

Check if point is in spatial object which consists of multiple polygons/holes

I have a SpatialPolygonsDataFrame with 11589 objects of class "polygons". 10699 of those objects consist of exactly 1 polygon, but the rest consist of multiple polygons (2 to 22).
If an object consists of multiple polygons, three scenarios are possible:
Sometimes, those additional polygons describe a "hole" in the geographic area described by the first polygon in the object of class "polygons".
Sometimes, those additional polygons describe additional geographic areas, i.e. the shape of the region is quite complex and is described by putting together multiple parts.
Sometimes, it might be a mix of both, 1) and 2).
Stack Overflow helped me plot such a spatial object properly (Plot spatial area defined by multiple polygons).
However, I am still not able to work out how to determine whether a point (defined by longitude/latitude) is in such a polygon.
Below is my code. I tried to apply the function point.in.polygon from the sp package, but found no way for it to handle an object that consists of multiple polygons/holes.
# Load packages
# ---------------------------------------------------------------------------
library(maptools)
library(rgdal)
library(rgeos)
library(ggplot2)
library(sp)
# Get data
# ---------------------------------------------------------------------------
# Download shape information from the internet
URL <- "http://www.geodatenzentrum.de/auftrag1/archiv/vektor/vg250_ebenen/2012/vg250_2012-01-01.utm32s.shape.ebenen.zip"
td <- tempdir()
setwd(td)
temp <- tempfile(fileext = ".zip")
download.file(URL, temp)
unzip(temp)
# Get shape file
shp <- file.path(tempdir(),"vg250_0101.utm32s.shape.ebenen/vg250_ebenen/vg250_gem.shp")
# Read in shape file
map <- readShapeSpatial(shp, proj4string = CRS("+init=epsg:25832"))
# Transform the geocoding from UTM to Longitude/Latitude
map <- spTransform(map, CRS("+proj=longlat +datum=WGS84"))
# Pick an geographic area which consists of multiple polygons
# ---------------------------------------------------------------------------
# Output a frequency table of areas with N polygons
nPolys <- sapply(map@polygons, function(x) length(x@Polygons))
# Get geographic area with the most polygons
polygon.with.max.polygons <- which(nPolys==max(nPolys))
# Get shape for the geographic area with the most polygons
Poly.coords <- map[which(nPolys==max(nPolys)),]
# Plot
# ---------------------------------------------------------------------------
# Plot region without Google maps (ggplot2)
plot(Poly.coords, col="lightgreen")
# Find if a point is in a polygon
# ---------------------------------------------------------------------------
# Define points
points_of_interest <- data.frame(long=c(10.5,10.51,10.15,10.4),
lat =c(51.85,51.72,51.81,51.7),
id =c("A","B","C","D"), stringsAsFactors=F)
# Plot points
points(points_of_interest$long, points_of_interest$lat, pch=19)
You can do this simply with gContains(...) in the rgeos package.
gContains(sp1,sp2)
returns a logical depending on whether sp2 is contained within sp1. The only nuance is that sp2 has to be a SpatialPoints object, and it has to have the same projection as sp1. To do that, you would do something like this:
point <- data.frame(lon=10.2, lat=51.7)
sp2 <- SpatialPoints(point,proj4string=CRS(proj4string(sp1)))
gContains(sp1,sp2)
Here is a working example based on the answer to your previous question.
library(rgdal) # for readOGR(...)
library(rgeos) # for gContains(...)
library(ggplot2)
setwd("< directory with all your files >")
map <- readOGR(dsn=".", layer="vg250_gem", p4s="+init=epsg:25832")
map <- spTransform(map, CRS("+proj=longlat +datum=WGS84"))
nPolys <- sapply(map@polygons, function(x) length(x@Polygons))
region <- map[which(nPolys==max(nPolys)),]
region.df <- fortify(region)
points <- data.frame(long=c(10.5,10.51,10.15,10.4),
lat =c(51.85,51.72,51.81,51.7),
id =c("A","B","C","D"), stringsAsFactors=F)
ggplot(region.df, aes(x=long,y=lat,group=group))+
geom_polygon(fill="lightgreen")+
geom_path(colour="grey50")+
geom_point(data=points,aes(x=long,y=lat,group=NULL, color=id), size=4)+
coord_fixed()
Here, point A is in the main polygon, point B is in a lake (hole), point C is on an island, and point D is completely outside the region. So this code checks all of the points using gContains(...)
sapply(1:4,function(i)
list(id=points[i,]$id,
gContains(region,SpatialPoints(points[i,1:2],proj4string=CRS(proj4string(region))))))
# [,1] [,2] [,3] [,4]
# id "A" "B" "C" "D"
# TRUE FALSE TRUE FALSE
Since you can use the "point in polygon" routine, and this apparently isn't already designed to handle the multi-polygon case in R (which I find a bit odd, actually), you are left with cycling through each of the multiple polygons. Now the trick is: if you are inside an odd number of polygons, you are inside the multi-polygon; if you are inside an even number of polygons, then you are actually outside the shape.
Point-in-polygon testing that uses ray crossings should ALREADY be able to handle this, just by making sure you pass all the vertices to the original point.in.polygon test, but I am not sure which mechanism R is using, so I can only give you the even/odd advice above.
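Here is a minimal sketch of that even/odd rule, checking a single (arbitrary) point against the region object built in the previous answer:
library(sp)
rings <- region@polygons[[1]]@Polygons
# ray-crossing test per ring: 0 = outside, 1 = inside, 2/3 = on edge/vertex
crossings <- sapply(rings, function(ring)
  point.in.polygon(10.5, 51.85, ring@coords[, 1], ring@coords[, 2]))
# inside an odd number of rings means inside the multi-polygon
sum(crossings > 0) %% 2 == 1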
I also found this code, not sure if it will help:
require(sp)
require(rgdal)
require(maps)
# read in bear data, and turn it into a SpatialPointsDataFrame
bears <- read.csv("bear-sightings.csv")
coordinates(bears) <- c("longitude", "latitude")
# read in National Parks polygons
parks <- readOGR(".", "10m_us_parks_area")
# tell R that bear coordinates are in the same lat/lon reference system
# as the parks data -- BUT ONLY BECAUSE WE KNOW THIS IS THE CASE!
proj4string(bears) <- proj4string(parks)
# combine is.na() with over() to do the containment test; note that we
# need to "demote" parks to a SpatialPolygons object first
inside.park <- !is.na(over(bears, as(parks, "SpatialPolygons")))
# what fraction of sightings were inside a park?
mean(inside.park)
## [1] 0.1720648
# use 'over' again, this time with parks as a SpatialPolygonsDataFrame
# object, to determine which park (if any) contains each sighting, and
# store the park name as an attribute of the bears data
bears$park <- over(bears, parks)$Unit_Name
# draw a map big enough to encompass all points (but don't actually plot
# the points yet), then add in park boundaries superimposed upon a map
# of the United States
plot(coordinates(bears), type="n")
map("world", region="usa", add=TRUE)
plot(parks, border="green", add=TRUE)
legend("topright", cex=0.85,
c("Bear in park", "Bear not in park", "Park boundary"),
pch=c(16, 1, NA), lty=c(NA, NA, 1),
col=c("red", "grey", "green"), bty="n")
title(expression(paste(italic("Ursus arctos"),
" sightings with respect to national parks")))
# now plot bear points with separate colors inside and outside of parks
points(bears[!inside.park, ], pch=1, col="gray")
points(bears[inside.park, ], pch=16, col="red")
# write the augmented bears dataset to CSV
write.csv(bears, "bears-by-park.csv", row.names=FALSE)
# ...or create a shapefile from the points
writeOGR(bears, ".", "bears-by-park", driver="ESRI Shapefile")

Having trouble calculating Home Range area

I am having a lot of trouble calculating the area of an animal's home range in R. I thought that once I had produced a home range (if I've done it correctly), calculating the area would be easy, but no luck.
I've pasted some of the code I've been trying below. Would anyone have any insight?
# Load package
library(adehabitat)
#Load file Frodo
dd <- read.csv(file.choose(), header = T)
# Plot the home range
xy <- dd[,c("X","Y")]
id <- dd[,"name"]
hr<- mcp(xy,id,percent=95)
plot(hr)
points(xy[xy$id=="frodo",])
#Great. Home range produced. Now calculate area
area <- mcp.area(xy, id, percent = 95)
# Result 2.287789e-09 Ha. Way too small. Maybe it doesn't like lat/long.
# Will try and convert coordinates into M or Km
# Load map project
library(mapproj)
x <- mapproject(dd$X, dd$Y, projection="mercator")
# It converted the coordinates to something, but it's not m or km.
# I'll try and run it anyway
xy <- x[,c("X","Y")]
# incorrect number of dimensions
# I'll try proj4
library(proj4)
xy <- dd[,c("X","Y")]
tr <- ptransform(xy/180*pi, '+proj=latlong +ellps=sphere',
'+proj=merc +ellps=sphere')
View(tr)
# There seems to be a Z column filled with 0's.
# Is that going to affect anything?
# Let's look at the data
plot(tr)
# Looks good, Lets try and create a home range
xy <- tr[,c("x","y")]
# 'incorrect number of dimensions'
No idea what the problem is. I don't know if I'm on the right track or doing something completely wrong.
In order to calculate area you need your points in a projected coordinate system (area in long/lat would just be in units of square degrees). The type of projection you use is going to have a big effect on the resulting area. For instance, the Mercator projection distorts area away from the Equator, so you might want to look into the best equal-area projection for your location. I am going to answer the programming part of your question; once you find the right projection to use, you can substitute it in.
require(sp)
require(rgdal)
orig.points <- dd[,c("X","Y")]
# geographic coordinate system of your points
c1 <- CRS("+proj=latlong +ellps=sphere")
# define as SpatialPoints
p1 <- SpatialPoints(orig.points, proj4string=c1)
# define projected coordinate system of your choice, I am using the one you
# defined above, but see:
# http://www.remotesensing.org/geotiff/proj_list/mercator_1sp.html
# to make sure your definition of the mercator projection is appropriate
c2 <- CRS("+proj=merc +ellps=sphere")
p2 <- spTransform(p1, c2) # project points
# convert to Polygon (this automatically computes the area as an attribute)
poly <- Polygon(p2)
poly@area # will print out the area
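Note that Polygon() simply treats the points as ring vertices in the order given, so poly@area is the area of that ring, not necessarily of the 95% minimum convex polygon. To get the MCP area in projected units you could feed the projected coordinates back through mcp.area; a sketch, assuming the projected units are metres and that dd has the name column used in the question:
xy.proj <- as.data.frame(coordinates(p2))
# recompute the 95% MCP on the projected coordinates; output in hectares
hr.area <- mcp.area(xy.proj, dd[, "name"], percent = 95, unin = "m", unout = "ha")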
