I want to return the dimensions of some gridded polygons. I know the overall area of each polygon, but I would also like the height and width for some calculations. The polygons are sf objects; my idea was to convert them into multiline objects and then take the length of each line. I can't figure out how to do this, but I assume there is a built-in function in the sf package for it.
For some sample code:
library(sf)
nc <- st_read(system.file("shape/nc.shp", package="sf"))
poly <- nc[5,] # object five chosen at random for testing
Now I just want poly to be converted to a series of lines so that I can take the length of each of those lines.
Any help is appreciated.
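A sketch of one possible approach (not from the thread itself): sf can cast a polygon boundary to line geometry with st_cast() and measure it with st_length(), while st_bbox() gives the overall width and height directly. This assumes the nc example above; for a lon/lat CRS the bbox differences are in degrees, so you may want to project first.

```r
library(sf)
nc <- st_read(system.file("shape/nc.shp", package = "sf"))
poly <- nc[5, ]
# cast the polygon boundary to a MULTILINESTRING and measure its length
boundary <- st_cast(st_geometry(poly), "MULTILINESTRING")
st_length(boundary)  # perimeter of the polygon
# width and height from the bounding box
bb <- st_bbox(poly)
width  <- bb["xmax"] - bb["xmin"]
height <- bb["ymax"] - bb["ymin"]
```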
I have a shapefile with around 400 polygons of different sizes. I have been trying to create a random point layer with a specific number of points within each of the 400 polygons.
I tried the spsample function, but it generates a random layer of points that does not respect the individual polygons (it is a general random point layer over the whole shapefile).
As an example:
Shape file
Download: biogeo.ucdavis.edu/data/diva/adm/USA_adm.zip
it is the file "USA_adm1"
This shapefile contains 52 polygons. I am looking to put a specific number of random points inside each of the 52 polygons: for example, a random distribution of 100 points inside each state.
I hope you can help me. Thank you.
From the sf package using st_sample on an sf object:
library(sf)
library(ggplot2)
# using data included with sf package,
# it contains 100 polygons
nc <- st_read(system.file("shape/nc.shp", package="sf"))
# st_sample needs a vector telling it how many samples for each polygon
# here we're using 3 for each polygon
samples_per_polygon <- rep(3, nrow(nc))
samples <- st_sample(nc, samples_per_polygon)
ggplot() + geom_sf(data = nc) + geom_sf(data = samples)
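To get exactly 100 points per polygon, as asked, the same pattern applies. One caveat: for some sample sizes st_sample can return slightly more or fewer points than requested; recent versions of sf accept an exact = TRUE argument to force the exact count (a sketch, assuming a reasonably current sf):

```r
library(sf)
nc <- st_read(system.file("shape/nc.shp", package = "sf"))
# 100 random points inside each of the 100 polygons
pts <- st_sample(nc, rep(100, nrow(nc)), type = "random", exact = TRUE)
```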
I would like to assign small polygons nested in larger polygons the same values as those of larger polygons.
In figure 1 you can see the small polygons in raster format:
and in figure 2 in SpatialPolygons as individual polygons:
These polygons are the result of k-means classification: I generated a raster from the classified coordinates using the rasterFromXYZ function (code below):
mydata.26.raster <- rasterFromXYZ(as.data.frame(mydata.26.coord[,c("x", "y", "cls_26.cluster")]),res=5, crs=crs)
and then, with the rasterToPolygons function, I was able to separate the polygons (code below):
zona.26.pol<- rasterToPolygons(zona.26.raster$cls_26.cluster,dissolve=TRUE)
zona.26.pol <- disaggregate(zona.26.pol)
Here's zona.26.pol if you want to take a look; it is in .shp format.
I then manually reclassified the polygons and merged them using the same classes. The result I would like to achieve automatically (by creating rules), instead of assigning the values by hand, is in figure 3:
Every help is welcome!
This will remove the small nested polygons based on their size alone and then remove the holes left in the larger, remaining polygons. This works for your example but may fail if you have larger nested polygons you want to remove. In that case, we would have to figure out how to identify the geometries that are 'nested'.
library(sf)
library(units)
library(nngeo)
min_polygon_area <- 10000 #set minimum size of a nested polygon you would like to remove
units(min_polygon_area) <- as_units('m^2') #as defined below by st_area
zona.26.pol <- st_read(file.path(workDir, 'zona.26.pol.shp'))
st_crs(zona.26.pol) <- '+proj=utm +zone=22 +south +ellps=WGS84 +datum=WGS84 +units=m +no_defs' #define crs
zona.26.pol$area <- st_area(zona.26.pol)
zona.26.pol$area #note area is in m^2
plot(zona.26.pol[,'cls_26_']) #as-is plot
zona.26.pol <- zona.26.pol[zona.26.pol$area>min_polygon_area, ]
plot(zona.26.pol[,'cls_26_']) #small polygons removed; holes remaining
zona.26.pol_no_holes <- st_remove_holes(zona.26.pol)
plot(zona.26.pol_no_holes[,'cls_26_']) #holes removed
Note that I used the sf package to read in the shapefile in order to use the st_remove_holes function from the nngeo package, though I typically use the raster and sp packages.
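For the 'nested' case mentioned above, one hedged sketch (an assumption on my part, not from the answer): treat a polygon as nested if it is covered by the hole-filled version of some *other* polygon. This reuses st_remove_holes and the zona.26.pol object from the code above.

```r
library(sf)
library(nngeo)
# filled versions of every polygon: holes become part of the parent
filled <- st_remove_holes(zona.26.pol)
# for each polygon, which filled geometries cover it?
hits <- st_covered_by(zona.26.pol, filled)
# nested = covered by a filled geometry other than its own
nested <- mapply(function(h, i) any(h != i), hits, seq_along(hits))
zona.26.pol_clean <- zona.26.pol[!nested, ]
```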
It's my first time using the spatstat package, so I would like some advice. I am attempting to plot coordinate data inside an irregular polygon area (in .shp format) to calculate spatial statistics like Ripley's K. How can I use the irregular polygon as the plot window? How can I merge the .ppp data from the coordinates with the polygon area?
I have used the following codes:
Converting the coordinate data to .ppp format
library(spatstat)
library(sp)
library(maptools)
tree.simu <- read.table("simulation.txt", h=T)
tree.simu.ppp <- ppp(x = tree.simu$X, y = tree.simu$Y,
                     window = owin(c(min(tree.simu$X), max(tree.simu$X)),
                                   c(min(tree.simu$Y), max(tree.simu$Y))))
plot(tree.simu.ppp)
With this call I am using the min and max values of the coordinates as the plot window. I would like to use the polygon boundary as the window instead.
Plotting the irregular polygon area
area <- readShapePoly("Area/Fragment.shp")
plot(area)
plot(tree.simu.ppp, add=T)
or
points(tree.simu.ppp)
The package accepts the last function, but when I try to plot both files together, it seems like the .shp file fills the whole area and I can't see the coordinate data.
Thank you, I really appreciate your help!
PS: if you know of any material covering these questions, I would be happy to take a look.
This is indeed due to inconsistent bounding boxes, as conjectured in the comment by @jlhoward. Your points are in [273663.9, 275091.45] x [7718635, 7719267], while the polygon is contained in [-41.17483, -41.15588] x [-20.619647, -20.610134].
Assuming the coordinates were indeed consistent with the window, the correct way of getting them into a ppp object would be:
library(spatstat)
library(sp)
library(maptools)
area <- readShapePoly("Area/Fragment.shp")
area <- as(area, "owin")
tree.simu <- read.table("simulation.txt", h=T)
tree.simu.ppp <- ppp(x = tree.simu$X, y = tree.simu$Y, window = area)
However, you will get a warning about your points being rejected since they are outside the window, and the object will contain no points.
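One hedged way forward (my assumption, not confirmed by the thread): if the polygon is stored in WGS84 lon/lat and the points in a UTM zone, reproject the polygon into the points' CRS before building the window. The EPSG code below is a placeholder guess and must be checked against the data source.

```r
library(sf)
library(sp)
library(maptools)
library(spatstat)
# assumptions: polygon is WGS84 lon/lat; points are UTM;
# EPSG 32724 (UTM zone 24S) is hypothetical and must be verified
area_sf <- st_read("Area/Fragment.shp")
st_crs(area_sf) <- 4326                   # declare lon/lat if missing
area_utm <- st_transform(area_sf, 32724)  # reproject to the points' CRS
area_win <- as(as(area_utm, "Spatial"), "owin")  # via maptools, as above
tree.simu <- read.table("simulation.txt", h = TRUE)
tree.simu.ppp <- ppp(x = tree.simu$X, y = tree.simu$Y, window = area_win)
```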
I've been running into all sorts of issues using ArcGIS ZonalStats and thought R could be a great alternative. That said, I'm fairly new to R, though I have a coding background.
The situation is that I have several rasters and a polygon shape file with many features of different sizes (though all features are bigger than a raster cell and the polygon features are aligned to the raster).
I've figured out how to get the mean value for each polygon feature using the raster library with extract:
#load packages required
require(rgdal)
require(sp)
require(raster)
require(maptools)
# ---Set the working directory-------
datdir <- "/test_data/"
#Read in a ESRI grid of water depth
ras <- readGDAL("test_data/raster/pl_sm_rp1000/w001001.adf")
#convert it to a format recognizable by the raster package
ras <- raster(ras)
#read in polygon shape file
proxNA <- readShapePoly("test_data/proxy/PL_proxy_WD_NA_test")
#plot raster and shp
plot(ras)
plot(proxNA)
#calc mean depth per polygon feature
#unweighted - only assigns grid to district if centroid is in that district
proxNA@data$RP1000 <- extract(ras, proxNA, fun = mean, na.rm = TRUE, weights = FALSE)
#check results
head(proxNA)
#plot depth values
spplot(proxNA[,'RP1000'])
The issue I have is that I also need an area-based ratio between the area of each polygon and all non-NA cells in that polygon. I know the cell size of the raster and I can get the area of each polygon, but the missing link is the count of non-NA cells in each feature. I managed to get the cell numbers of all the cells in each polygon with proxNA@data$Cnumb1000 <- cellFromPolygon(ras, proxNA), and I'm sure there is a way to get the actual value of each raster cell, which would then require a loop to count all the non-NA cells, etc.
BUT, I'm sure there is a much better and quicker way to do that! If any of you has an idea or can point me in the right direction, I would be very grateful!
I do not have access to your files, but based on what you described, this should work:
library(raster)
# read the polygon mask and the raster
mask_layer <- shapefile(paste0(shapedir, "AOI.shp"))
original_raster <- raster(paste0(template_raster_dir, "temp_raster_DecDeg250.tif"))
# logical raster: TRUE where the original raster is not NA
nonNA_raster <- !is.na(original_raster)
# mask to the polygon (based on centroid location of cells)
masked_img <- mask(nonNA_raster, mask_layer)
# count the non-NA cells
nonNA_count <- cellStats(masked_img, sum)
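To get a per-polygon ratio rather than a single count, a sketch along the same lines (assuming the ras and proxNA objects from the question; coverage_ratio is a name I made up): extract() over a logical non-NA raster counts the non-NA cells inside each feature.

```r
library(raster)
# count non-NA cells inside each polygon feature
nonNA_counts <- extract(!is.na(ras), proxNA, fun = sum)
# covered area = cell count * cell area; divide by polygon area
cell_area <- prod(res(ras))
proxNA$coverage_ratio <- (nonNA_counts * cell_area) / area(proxNA)
```

Note that raster::area on a SpatialPolygons object assumes planar coordinates unless the CRS is lon/lat.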
I am working with shapefiles in R, one is point.shp the other is a polygon.shp.
Now, I would like to intersect the points with the polygon, meaning that all the values from the polygon should be attached to the table of the point.shp.
I tried overlay() and spRbind from the sp package, but neither did what I expected.
Could anyone give me a hint?
With the new sf package this is now fast and easy:
library(sf)
out <- st_intersection(points, poly)
Additional options
If you do not want all fields from the polygon added to the point feature, just call dplyr::select() on the polygon feature before:
library(magrittr)
library(dplyr)
library(sf)
poly <- poly %>%
  select(column_name1, column_name2, ...)
out <- st_intersection(points, poly)
If you encounter issues, make sure that your polygon is valid:
st_is_valid(poly)
If you see some FALSE outputs here, try to make it valid:
poly <- st_make_valid(poly)
Note that these 'valid' functions depend on a sf installation compiled with liblwgeom.
If you do overlay(pts, polys), where pts is a SpatialPointsDataFrame object and polys is a SpatialPolygonsDataFrame object, you get back a vector the same length as the points giving, for each point, the row of the polygons data frame it falls in. So all you then need to do to combine the polygon data onto the points data frame is:
o <- overlay(pts, polys)
pts@data <- cbind(pts@data, polys@data[o, ])
HOWEVER! If any of your points fall outside all of your polygons, then overlay returns an NA, which will cause the indexing step to fail, so either make sure all your points are inside polygons or you'll have to think of another way to assign values for points outside the polygons...
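As a side note (not from the original answer): overlay() was later deprecated in sp in favour of over(), which returns the matching polygon attributes directly as a data.frame, so the index lookup can be skipped. A sketch, assuming the same pts and polys objects:

```r
library(sp)
# over() gives one row of polygon attributes per point,
# with NA-filled rows for points outside every polygon
o <- over(pts, polys)
pts@data <- cbind(pts@data, o)
```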
You can do this in one line with point.in.poly from the spatialEco package.
library(spatialEco)
new_shape <- point.in.poly(pts, polys)
From the documentation: point.in.poly "intersects point and polygon feature classes and adds polygon attributes to points".