Using R to obtain slope raster from DEM GRID raster - r

After some extensive googling, I wasn't able to find my answer (the first time I couldn't surmount an issue by looking at others' questions and answers). I am new to asking questions, so forgive any missteps.
I am attempting to perform what ArcGIS or QGIS performs with the slope tool, just within R. To do so I have been importing a raster that I exported from ArcGIS in GRID format with the following characteristics:
class : RasterLayer
dimensions : 821, 581, 477001 (nrow, ncol, ncell)
resolution : 4.996121, 4.996121 (x, y)
extent : 2832147, 2835049, 14234048, 14238150 (xmin, xmax, ymin, ymax)
crs : +proj=tmerc +lat_0=34.75 +lon_0=-118.583333333333 +k=0.9999 +x_0=800000.000000001 +y_0=3999999.99999999 +datum=NAD83 +units=us-ft +no_defs
source : rr_2020_shell
names : rr_2020_shell
values : 5623.253, 6401.356 (min, max)
It is already projected in the correct coordinate system (EPSG:3423), but when I go to find the slope using the following code:
RR_2020_Slope = terrain(RR_2020_St1_Raster,'slope', units = 'degrees', neighbors = 8, filename = 'RR_2020_Slope.grd', overwrite = T)
The result is a slope raster that ranges from 0 to 1.28°, which is very different from what I calculated in ArcGIS with the slope tool. Using the same DEM raster in the same projection, with 'Degree' for the output measurement, 'Planar' for the method, and 1 for the Z factor, the ArcGIS slope raster ranges from 0.001 to 73.396°.
Overall, I am wondering where my mistake in R originates. Is it an elevation resolution problem? Are there issues with my projection? Forgive me, I can't include the data as they are sensitive materials, but perhaps there is a clear and obvious mistake in my approach or in my assumptions about the functions I have used?

The only red flag I see is that you say "it is already projected in the correct coordinate system". Projecting raster data degrades the quality. As cell values get smoothed, the slopes will get smaller. This may be particularly pronounced if the relief is at the scale of the cell size (e.g. sand dunes vs mountain chains). Have you compared with what you get with the original data?
Another source of error could be that the units of the values are different from the units of the coordinate reference system. But it would appear that in your case both are in feet.
Can you also try this with terra::terrain()?
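For what it's worth, here is a minimal sketch of the terra equivalent (the file name below is taken from the asker's printout, so adjust the path as needed). Note that the argument is spelled unit (singular) in both raster::terrain() and terra::terrain(), and raster::terrain() defaults to radians; 0 to 1.28 radians is roughly 0 to 73 degrees, which matches the ArcGIS range, so it is worth double-checking that the degrees option is actually being picked up.
library(terra)
rr <- rast("rr_2020_shell")    # read the exported DEM (path is an assumption)
rr_slope <- terrain(rr, v = "slope", unit = "degrees", neighbors = 8)
rr_slope                       # the printed min/max should now be in degrees
plot(rr_slope)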

Related

Create polygon from large raster in R

I have a large raster file (5GB) containing only 1's and NA's. I would like to convert this into a multipolygon of the areas with 1's, with adjacent cells dissolved into one polygon.
I have imported the file to R using
r = raster::raster(my_filename)
r
class : RasterLayer
dimensions : 17452, 45000, 785340000 (nrow, ncol, ncell)
resolution : 0.008, 0.008 (x, y)
extent : -180, 180, -55.9875, 83.6285 (xmin, xmax, ymin, ymax)
crs : NA
source : C://...binary_X01_januarysnow.asc
names : binary_X01_januarysnow
I have tried several methods to create the polygon:
rasterToPolygons from the raster package with the dissolve=TRUE option (R crashes)
isoband from the isoband package (R crashes)
Both approaches worked as expected when I tried them on a subset of my raster covering approximately the area of Spain, so I assume the problem is with the size of the data and not with my code.
Then I tried to read my raster with read_stars and to use stars::st_as_sf(st, as_points = FALSE, merge = TRUE, connect8 = TRUE). This returned an empty polygon, possibly because the file was read as a stars proxy object, but I'm not sure; I couldn't find any information about that online.
Then I force-read the raster as stars rather than as a stars proxy with read_stars(my_filename, proxy = FALSE) and tried the st_as_sf command as above, but got the message "Error: cannot allocate vector of size 2.9 Gb".
I know that in the worst case I can probably just decrease the raster resolution (and therefore the size) and be able to create the polygons I want, just with a coarser resolution, but I was wondering if anyone has another suggestion I could try? Both the 1's and the NA's are located in large continuous areas, so it would be enough to have high resolution at the edges, if that helps.
PS This is my very first question on StackOverflow so I apologize if my problem is not clearly described. I don't know how to provide a reproducible example of a large dataset.
What you are looking for is as.polygons() from the terra package, the raster package's successor. terra handles large data sets better than raster does.
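For completeness, a minimal sketch of that approach (my_filename is the asker's path; converting to sf at the end is optional and assumes the sf package is installed):
library(terra)
r <- rast(my_filename)     # read as a SpatRaster; the values are 1 or NA
p <- as.polygons(r)        # by default dissolves adjacent cells with equal values; NA cells are dropped
p_sf <- sf::st_as_sf(p)    # convert the SpatVector to an sf (multi)polygon object if needed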

random sampling from a large raster using clusterR

I need to take a 5% random sample from a very large raster and return a new raster. I am trying to use sampleRandom from the raster package, but the process is very slow (I only have 8GB RAM on my machine, running 64-bit R). The raster has been cropped/masked to match an irregularly shaped study area boundary, so it has NA values in the rectangular extent around the polygon boundary as well as some internal NA values, and I'm trying to sample only from the non-NA values. I've tried both sampling 5% and reversing that to sampling 95%; both ran for more than 2 hours without producing a result, at which point I terminated the process.
I am trying to speed it up by running it in parallel with the clusterR command, but I'm new to both the sampleRandom command and clusterR. My code runs, but all of the non-NA pixels are returned, so the sampling doesn't seem to be working. Is this a problem with my code, or can sampleRandom not run with clusterR?
Here is a description of my raster layer:
conv.mod
class : RasterLayer
dimensions : 23828, 19095, 454995660 (nrow, ncol, ncell)
resolution : 56, 56 (x, y)
extent : -1220192, -150872, 87580, 1421948 (xmin, xmax, ymin, ymax)
coord. ref. : +proj=aea +lat_1=44.75 +lat_2=55.75 +lat_0=40 +lon_0=-96 +x_0=0 +y_0=0 +datum=WGS84 +units=m +no_defs +ellps=WGS84 +towgs84=0,0,0
data source : C:\GIS\carbon_cows\Intact\conv_mod.tif
names : conv_mod
values : 1, 1 (min, max)
And here is the code I have tried:
library(raster)
library(parallel)
tot<-cellStats(conv.mod,'sum', na.rm=TRUE) #get the total pixels in conv.mod
sampsize<-tot * 0.05 #calculate how many pixels would represent 5%
removeTmpFiles() #clear some memory
numcores<-detectCores() -1
start<-Sys.time()
beginCluster(numcores)
cl<-getCluster()
clusterExport(cl,"sampsize", envir = .GlobalEnv)
conv.perc <- clusterR(conv.mod,sampleRandom,args=list(size=sampsize,na.rm=TRUE,asRaster=TRUE))
endCluster()
end<-Sys.time()
difftime(end,start)
Here are the total non-NA cells in the original raster layer:
tot<-cellStats(conv.mod,'sum', na.rm=TRUE)
tot
105193858
and the number that should be a 5% sample:
sampsize<-tot * 0.05
sampsize
5259693
But, the resulting raster has the same number of non-NA pixels as the original raster:
tot_convperc<-cellStats(conv.perc,'sum',na.rm=T)
tot_convperc
105193858
I've also tried reversing the sample size calculation and running sampleRandom, so that I'm requesting a 95% sample. But, I get the same result.
I'd appreciate any help in understanding why this code is not running as expected. Thanks!
Never mind. I was able to take advantage of this post: https://gis.stackexchange.com/questions/17255/random-sampling-of-raster-using-r and the reply by whuber.
The following code solved my problem, without the use of a cluster:
col.conv <- ncol(conv.mod)
row.conv<-nrow(conv.mod)
r<-conv.mod
start<-Sys.time()
r[runif(col.conv*row.conv) >= 0.05] <- NA # Randomly *unselect* ~95% of the data, keeping a ~5% sample
end<-Sys.time()
difftime(end,start)
That code ran in ~3 minutes, as opposed to over an hour when wrapping sampleRandom in the clusterR command. I still wonder why sampleRandom was not actually taking a sample, and also why this new code is so much more efficient, but I am happy to have the problem solved.
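For what it's worth, the same idea can be wrapped into a small function (a sketch, assuming conv.mod is loaded as above); it is fast because it is a single vectorised pass over the cell values:
library(raster)
sample_fraction <- function(r, p) {
  drop <- runif(ncell(r)) >= p   # TRUE for roughly (1 - p) of the cells
  r[drop] <- NA                  # cells that are already NA stay NA
  r
}
conv.samp <- sample_fraction(conv.mod, 0.05)   # retain roughly 5% of the non-NA cells
cellStats(conv.samp, 'sum', na.rm = TRUE)      # should be about 5% of 105193858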

Raster stack incorrectly plotting latitude and longitude coordinates

I've downloaded precipitation data from the TRMM (rainfall across the tropics) satellite as a netCDF file and have been trying to plot the data in R as a RasterStack. However, R plots the latitude and longitude axes incorrectly: longitude is on the x-axis (as it should be) but carries the latitude coordinates, while latitude is on the y-axis but carries the longitude coordinates. I've tried using both the plot() and levelplot() functions but neither seems to work. Can anyone help me correct this?
These are the characteristic of the stack:
class : RasterStack
dimensions : 1440, 186, 267840, 12 (nrow, ncol, ncell, nlayers)
resolution : 0.25, 0.25 (x, y)
extent : -23.25, 23.25, -180, 180 (xmin, xmax, ymin, ymax)
coord. ref. : +proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0
names : X2016.01.16, X2016.02.15, X2016.03.16, X2016.04.15, X2016.05.16, X2016.06.15, X2016.07.16, X2016.08.16, X2016.09.15, X2016.10.16, X2016.11.15, X2016.12.16
Date : 2016-01-16, 2016-02-15, 2016-03-16, 2016-04-15, 2016-05-16, 2016-06-15, 2016-07-16, 2016-08-16, 2016-09-15, 2016-10-16, 2016-11-15, 2016-12-16
The output should show rainfall over the tropics from -23 to 23 degrees latitude and -180 to 180 degrees longitude, but instead the axes come out swapped as described above.
It's odd if the coordinates are switched prior to any processing. You may want to assess the source you downloaded the data from and check whether there's a better one.
Anyway, in the meantime the raster package can help you, specifically the transpose function t(). Here's an example:
# data before transpose
x <- getData('worldclim',var='tmean',res=10)
plot(x)
# data after transpose
y <- t(x)
plot(y)
There are also a couple of other functions in raster which could be of interest to you: flip() and rotate().
HTH
Thanks for your response. It does seem strange that the coordinates are messed up right out of the box, and I did try downloading a fresh set of data, with the same problem occurring. However, thanks to your input, I was able to rectify the problem using the t() (transpose) and flip() functions. I had to transpose the data and then flip it along both the x and y dimensions, as the image was 'mirrored'. Here's the code I used, in case anyone else encounters this problem with the TRMM data sets:
a.t = t(test.rasterstack)                    # swap rows and columns
a.t.flipy = flip(a.t, direction = 2)         # flip along one axis
a.t.flipxy = flip(a.t.flipy, direction = 1)  # flip along the other axis
levelplot(a.t.flipxy)                        # levelplot() is from the rasterVis package

Mapping temperature data from an .nc file in R

I downloaded temperature data from NARR (https://www.esrl.noaa.gov/psd/data/gridded/data.narr.monolevel.html), specifically the monthly mean of "Air temperature at 2m".
I opened the file using the "ncdf4" package. The data has 4 dimensions - time, x, y, nbnds - where y corresponds to lat and x corresponds to lon. There is a variable (not a dimension) called air, which holds the temperature information, but I do not know how to use it.
My end goal is to map the temperature data on a map of North America, using averaged temperature data for each month for each year (12 maps, one for each month).
I am having trouble identifying how to use the data, as all of the dimensions are just really long lists of numbers that don't seem to have meaning (e.g. the x coordinates look like this: 6232896 6265359 6297822 6330285 6362748 6395211 6427674 6460137 6492600 6525063 6557526 6589989, and so do the y values and time).
Here is the code I am using to view the dimensions:
library(ncdf4)

temp2m <- nc_open("air.2m.mon.mean.nc")
time  <- temp2m$dim$time$vals
lon   <- temp2m$dim$x$vals      # the x dimension (treated as lon)
lat   <- temp2m$dim$y$vals      # the y dimension (treated as lat)
nbnds <- temp2m$dim$nbnds$vals
If someone could help me view the data as well as map temperature data onto North America that would be great.
Thank you!
You can use the raster package to read these into a stack:
> library(raster)
> air = stack("./air.2m.mon.mean.nc")
(Note, you may need a raster package compiled with netcdf drivers...)
You can then plot them by slice or by time-name:
> plot(air[[23]])
> plot(air[["X1979.10.01.01.01.15"]])
The stack prints like this:
> air
class : RasterStack
dimensions : 277, 349, 96673, 450 (nrow, ncol, ncell, nlayers)
resolution : 32462.99, 32463 (x, y)
extent : -16231.49, 11313351, -16231.5, 8976020 (xmin, xmax, ymin, ymax)
coord. ref. : +proj=lcc +x_0=5632642.22547 +y_0=4612545.65137 +lat_0=50 +lon_0=-107 +lat_1=50 +lat_2=50 +ellps=WGS84
names : X1979.01.01.00.01.15, X1979.02.01.00.01.15, X1979.03.01.00.01.15, X1979.04.01.01.01.15, X1979.05.01.01.01.15, X1979.06.01.01.01.15, X1979.07.01.01.01.15, X1979.08.01.01.01.15, X1979.09.01.01.01.15, X1979.10.01.01.01.15, X1979.11.01.00.01.15, X1979.12.01.00.01.15, X1980.01.01.00.01.15, X1980.02.01.00.01.15, X1980.03.01.00.01.15, ...
Those coordinates are not really lat-long; they are in a transformed coordinate system described by that "coord. ref." string. If you want to put the data on a lat-long map you need to warp it:
> air_ll = projectRaster(air[[1]],crs="+init=epsg:4326")
> plot(air_ll)
It might be better for you to transform any other data into this system and keep the temperature grid unwarped. Just look up how to deal with spatial data in R for more info on projections and transformations.
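To get to the end goal of one averaged map per calendar month, a sketch along these lines should work (it assumes air was created with stack("air.2m.mon.mean.nc") as above and that the layer names follow the X<year>.<month>.<day>... pattern shown, starting in January):
library(raster)
months <- as.integer(substr(names(air), 7, 8))                 # month number of each layer
air_monthly <- stackApply(air, indices = months, fun = mean)   # 12 layers: mean over all years
names(air_monthly) <- month.name
plot(air_monthly)              # one map per month, still in the Lambert conformal conic projection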

Resolution values for Rasters in R

I was just wondering if anyone has a conversion table for the resolution of rasters in R. I would like to know what numbers like these refer to in meters:
resolution : 0.08333333, 0.08333333
resolution : 0.009398496, 0.009398496
resolution : 0.002349, 0.002349 = 250m (I think)
I would really like to know what resolution to set a raster object to so that the cell size is 1 km². I am using rasters that span the country of Australia.
Thanks in advance everyone.
Cheers,
Adam
It all depends on the units of your raster, and that depends on the projection. Rasters might not even be square grids in metres - they might be square in degrees which aren't square in metres!
One degree of longitude at the equator is 1/360 of the earth's circumference. Near the north pole one degree of longitude covers a much smaller distance, and at the pole it's pretty much zero. Degrees of latitude, however, are essentially constant.
You could take the corner points of your raster, convert them to lat-long coordinates if they aren't already, and then work out the great-circle distance between them (I recall there's an rdist function somewhere that does this). However, this won't work if your raster spans the whole globe, since then your NW corner and your NE corner are at the same point... Ummm. Anyway, the answer is... 42.
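As a rough illustration of that corner-point idea, here is a sketch using raster::pointDistance() for the great-circle distance (the 0.08333-degree cell size is taken from the question; the latitude of -25 is just an arbitrary mid-Australian value):
library(raster)
cell <- 0.08333333                 # cell size in degrees
lat  <- -25                        # roughly central Australia
dx <- pointDistance(c(0, lat), c(cell, lat), lonlat = TRUE)       # east-west cell size in metres
dy <- pointDistance(c(0, lat), c(0, lat + cell), lonlat = TRUE)   # north-south cell size in metres
c(east_west_m = dx, north_south_m = dy)
The east-west size shrinks towards the poles, which is exactly why a degree-based resolution has no single answer in metres.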
If you want to make 1 km rasters of Australia then... you need a metric coordinate system for Australia. In the UK we have a system called the OSGB National Grid, which is close enough to a metric grid. Australia might be trickier because it is slightly bigger than the UK... Australia seems to have a few grid systems. See here:
http://www.spatialreference.org/ref/?search=AGD84
So you might want to use the system that is in the middle of the country to avoid the worst distortions, then work out the bounds of Australia in lat-long, convert to epsg:20353 and create a raster based on that:
In lat-long I reckon Australia is roughly:
> xtll
         [,1]       [,2]
[1,] 112.5162 -43.906900
[2,] 155.8425  -7.585619
make this into a SpatialPoints object:
> library(sp); library(rgdal)   # for SpatialPoints(), CRS() and spTransform()
> xtll=SpatialPoints(xtll,CRS("+init=epsg:4326"))
convert to that AGD84 in the middle of the country:
> spTransform(xtll,CRS("+init=epsg:20353"))
SpatialPoints:
     coords.x1 coords.x2
[1,]  -1306200   4886041
[2,]   2849956   9103124
Make a raster extent object rounded to km:
> ext = extent(-1306000,2850000,4886000,9103000)
How many rows and columns do we need?
> length(-1306:2850)
[1] 4157
> length(4886:9103)
[1] 4218
Create a raster:
> r = raster(ext,ncol=4156,nrow=4217,crs="+init=epsg:20353")
> r
class : RasterLayer
dimensions : 4217, 4156, 17525852 (nrow, ncol, ncell)
resolution : 1000, 1000 (x, y)
extent : -1306000, 2850000, 4886000, 9103000 (xmin, xmax, ymin, ymax)
coord. ref. : +init=epsg:20353 +proj=utm +zone=53 +south +ellps=aust_SA +units=m +no_defs
values : none
Note that the ncol and nrow values are one less than the lengths computed from the bounds above; putting those lengths in directly would be a fencepost error (4157 grid lines bound only 4156 cells).
See how my resolution is 1000? This is a 1km grid. The problem is that this is possibly going to be a bit distorted on the coasts. You could work out how distorted by converting to lat-long (epsg:4326), then to the proper AGD zone for points on the coast, and seeing how different they are. They might be very close, except for an offset.
Anyway, nuff said.
Finding the meta-data that gives meaning to your raster can be a bit of a challenge. I have spent lots of time hunting for this. If the raster was published by a government agency, then I would hope that this information is posted somewhere prominently.
The good news is that once you know the projection used on the various rasters, you can convert them to a common projection using projectRaster() in the raster package. You need to find the proj.4 string describing the original and the desired projections in each case. You can get this from: http://www.spatialreference.org.
Once you know your projection, the resolution information you seek will have meaning.
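A minimal sketch of that last step (aus here is a hypothetical RasterLayer of Australia in lat-long, EPSG:4326; the target CRS follows the zone-53 example from the answer above):
library(raster)
aus_km <- projectRaster(aus, crs = "+init=epsg:20353", res = 1000)
res(aus_km)    # 1000, 1000 -> cells are 1 km on a side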
