I'm trying to summarize raster cell values within overlapping polygons in ArcMap. This can be done in Geospatial Modelling Environment (GME), an extension for ArcMap, which has a command called isectpolyrst that summarizes raster values within overlapping polygons. My problem is that my version of ArcGIS (10.6.2) doesn't support GME, so I can't use this function. I've heard that what isectpolyrst does can still be accomplished with an R script, but I haven't found one anywhere.
I have a number of GPS points with 10 km buffers around them (these buffers overlap a lot). I'm trying to calculate the proportions of different vegetation types within these buffer zones. I'm using ArcMap 10.6.2, and zonal statistics can't handle overlapping polygons.
You can use raster::extract for that. As you seem entirely new to R, you will need to study it a bit first. You can start here: https://rspatial.org/
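To make that concrete, here is a minimal sketch of what raster::extract could look like for this task, assuming a categorical vegetation raster and a polygon shapefile of the 10 km buffers (both file names are placeholders):

library(raster)

veg     <- raster("vegetation.tif")       # categorical vegetation raster (placeholder path)
buffers <- shapefile("buffers_10km.shp")  # overlapping 10 km buffer polygons (placeholder path)

# extract() returns the cell values inside each polygon separately,
# so overlap between buffers is not a problem
vals <- extract(veg, buffers)

# proportion of each vegetation class per buffer
props <- lapply(vals, function(v) prop.table(table(v)))
props[[1]]   # proportions for the first buffer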
I have a relatively large (>4,000 records) dataset of species records that needs to be mapped in R; however, the only spatial information with them is 6-figure Ordnance Survey National Grid (NGR) grid references (e.g. SD311124, see also A beginners guide to finding grid references). The file format is CSV.
How can I get R to plot the points with this information?
Is there a better way to do it within R rather than bulk converting and adding lat/long coords to the spreadsheet before loading it into R?
I know how to do this in QGIS, but my supervisors would like it in R instead!
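One possible approach (a sketch, assuming the CSV has a column of grid references named "gridref", which is a made-up column name): the rnrfa package has an osg_parse() function that converts OS grid references such as "SD311124" into British National Grid coordinates, which can then be plotted directly.

library(rnrfa)
library(ggplot2)

recs <- read.csv("species_records.csv", stringsAsFactors = FALSE)  # placeholder file name

xy <- osg_parse(recs$gridref)   # easting/northing on the British National Grid
recs$easting  <- xy$easting
recs$northing <- xy$northing

ggplot(recs, aes(easting, northing)) +
  geom_point() +
  coord_equal()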
The landscapemetrics package can calculate the area of each patch in a given raster file, the shape of that patch, and so on. I want not only a tibble with the calculated patch metrics, but also a new raster in which each pixel within a specific patch has the value of that patch's area, shape indicator, and so on. We can do this with the function spatialize_lsm() (it produces a large nested list, presumably containing RasterLayer objects):
library(landscapemetrics)
plot(podlasie_ccilc) # this raster data is provided with package
podlasie.metrics.area <- spatialize_lsm(podlasie_ccilc, what = 'lsm_p_area') # creates a list
plot(podlasie.metrics.area) # produces an error...
How do I get the desired raster file with patch metrics out of that list? I guess it is a question about the raster package or something else, since the landscapemetrics documentation says nothing about this step.
I note that this data and the new raster do not have a pixel resolution in meters (e.g. 30, 30 for a Landsat satellite image). So we cannot plot the new raster produced:
podlasie.metrics.area[[1]]
plot(podlasie.metrics.area[[1]])
So I guess landscapemetrics cannot deal with such rasters; we can even use its checking function to verify whether the original raster is suitable for patch detection:
check_landscape(podlasie_ccilc)
Update: I did it for a Landsat dataset with a resolution of 30, 30 and it produced the patch-area raster, but again I cannot open/show/save it as a raster because of the same error.
The package maintainer helped solve the problem (yes, it is just related to the structure of the list):
plot(podlasie.metrics.area[[1]]$lsm_p_area)
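Once pulled out of the nested list like that, the layer behaves like any other RasterLayer, so it can for instance be written to disk (a short sketch; the output file name is arbitrary):

library(raster)

area.raster <- podlasie.metrics.area[[1]]$lsm_p_area  # the patch-area RasterLayer
plot(area.raster)
writeRaster(area.raster, "patch_area.tif", overwrite = TRUE)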
Summary:
I'm trying to calculate the area of a large number of polygons in R. I've read a few posts about how I might do this (Example #1 & Example #2), but the problem I'm having is that my shapefile is too large (1.7 GB) to import. Given I can't import the file, I can't calculate the area of the polygons.
Extended Explanation:
I'm actually trying to calculate the area of properties in Victoria, Australia. The polygons represent these properties. I downloaded the simplified models 1 and 2 of VicMaps from Spatial Datamart for all of Victoria.
However, given the size of the shapefiles, I had to narrow my search to just one local government area (LGA) and calculate the polygon areas (just for testing). That shapefile was 15.5 MB.
library(raster)
# read the property polygons for one LGA
x <- shapefile("D:/Downloads/SDM616230/ll_gda94/shape/lga_polygon/ballarat/VMPROP/PROPERTY_PRIMARY_APPROVED.shp")
crs(x)                             # check the coordinate reference system
x$area_sqkm <- area(x) / 1000000   # area() returns m^2 for lat/long data; convert to km^2
This worked, but it's not a practical solution to my problem given there are many LGAs in Victoria and I plan to eventually follow the same process for Queensland and NSW.
However, trying to load a larger shapefile doesn't work and results in the error "Error: memory exhausted (limit reached?)".
I've tried using readShapePoly, readOGR, st_read and read_sf to get the large shapefile into R, but they don't work. I think the file is just too large. I tried using a select query within read_sf in an effort to reduce the size of the file I was reading, but that didn't work either. I've read online that I should split the shapefile into just the data I need to reduce the size, but I have no idea how to do that.
Hope you can help.
Obviously the file is too big for a single box. I think the options then are either
1) split the file into smaller ones and process them one by one (a sketch of doing this from inside R follows this list). See
https://gis.stackexchange.com/questions/195508/split-a-shapefile-into-smaller-files-on-linux-command-line
2) use some DBMS or data warehouse to do it; they do such batching automatically.
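For option 1, a rough sketch of reading the big shapefile in blocks of features from R with sf, using an OGR SQL query on the FID so the whole file never sits in memory at once (the file path and block size are assumptions, and FID filtering relies on the GDAL OGR SQL dialect):

library(sf)

shp   <- "PROPERTY_PRIMARY_APPROVED.shp"   # placeholder path to the large shapefile
layer <- st_layers(shp)$name[1]            # layer name (usually matches the file name)
n     <- st_layers(shp)$features[1]        # total number of features
block <- 50000                             # features per chunk

area_sqkm <- unlist(lapply(seq(0, n - 1, by = block), function(offset) {
  qry   <- sprintf("SELECT * FROM %s WHERE FID >= %d AND FID < %d",
                   layer, offset, offset + block)
  chunk <- st_read(shp, query = qry, quiet = TRUE)
  as.numeric(st_area(chunk)) / 1e6         # m^2 to km^2
}))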
I'm trying to do some GIS work using R. Specifically, I have a SpatialPointsDataFrame (called 'points') and a SpatialLinesDataFrame (called 'lines'). I want to know the closest line to each point. I do this:
library(rgeos)   # provides gDistance()

# make a new field to hold the line ID
points@data$nearest_line <- as.character('')

# Loop through data. For each point, get ID of nearest line and store it
for (i in 1:nrow(points)){
  points@data[i, "nearest_line"] <-
    lines[which.min(gDistance(points[i,], lines, byid = TRUE)), ]@data$line_id
}
This works fine. My issue is the size of my data. I have 4.5 million points and about 100,000 lines. It's been running for about a day so far and has only done 200,000 of the 4.5 million points (despite a fairly powerful computer).
Is there something I can do to speed this up? For example if I was doing this in PostGIS I would add a spatial index, but this doesn't seem to be an option in R.
Or maybe I'm approaching this totally wrong?
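One alternative worth sketching (this swaps the sp/rgeos approach for sf, so it is not the code above made faster): sf's st_nearest_feature() uses a spatial index internally, so the whole lookup becomes a single vectorised call instead of a 4.5-million-iteration loop.

library(sf)

points_sf <- st_as_sf(points)   # convert the sp objects to sf
lines_sf  <- st_as_sf(lines)

# index of the nearest line for every point, computed with a spatial index
idx <- st_nearest_feature(points_sf, lines_sf)
points_sf$nearest_line <- lines_sf$line_id[idx]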
I'm trying to use a kriging function to create vertical maps of chemical parameters in an ocean transect, and I'm having a hard time getting started.
My data look like this:
horiz=rep(1:5, 5)
depth=runif(25)
value = horiz+runif(25)/5
df <- data.frame(horiz, depth, value)
The autoKrige function in the automap package looks like it should do the job for me but it takes an object of class SpatialPointsDataFrame. As far as I can tell, the function spTransform in package rgdal creates SpatialPointsDataFrame objects, but there are two problems:
OS X binaries of this aren't available from CRAN, and my copy of RStudio running on OS X 10.7 doesn't seem to be able to install it, and
This function seems to work on lat/long data and correct distance values for the curvature of the Earth. Since I'm dealing with a vertical plane (and short distances, scale of hundreds of meters) I don't want to correct my distances.
There's an excellent discussion of kriging in R here, but due to the issues listed above I don't quite understand how to apply it to my specific problem.
I want a matrix or dataframe describing a grid of points with interpolated values for my chemical parameters, which I can then plot (ideally using ggplot2). I suspect that the solution to my problem is considerably easier than I'm making it out to be.
So there are a few questions you want answered:
1) The spTransform function does not create SPDFs, but transforms between projections. To create a SPDF you can use a simple data.frame as a start. To transform df to a SPDF:
library(sp)   # provides coordinates()
coordinates(df) = c("horiz", "depth")
2) OS X binaries of rgdal can be found at http://www.kyngchaos.com. But I doubt if you need rgdal.
3) spTransform can operate on lat/long data, but also on projected data. But I do not think you need rgdal, or spTransform; see also point 1.
After you create the SPDF using point 1, you can use the info in the post you mentioned to go on.
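Putting those points together with the example data from the question, a rough sketch of the full workflow could look like this (the grid spacing and the ordinary-kriging formula value ~ 1 are assumptions):

library(sp)
library(automap)
library(ggplot2)

coordinates(df) <- c("horiz", "depth")   # promote the data.frame to a SpatialPointsDataFrame

# a regular prediction grid covering the transect
grd <- expand.grid(horiz = seq(1, 5, by = 0.1),
                   depth = seq(0, 1, by = 0.02))
coordinates(grd) <- c("horiz", "depth")
gridded(grd) <- TRUE

kr <- autoKrige(value ~ 1, input_data = df, new_data = grd)

# back to a plain data.frame for ggplot2
out <- as.data.frame(kr$krige_output)
ggplot(out, aes(horiz, depth, fill = var1.pred)) +
  geom_raster() +
  scale_y_reverse()                      # depth increasing downwards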