I have a dataframe with two sets of UTM (zone 32N) coordinates, origin and destination, and I need to compute the distance in km between each pair, from origin to destination.
I'm trying with the sf library, using "by_element", but I get the error message "Error in st_distance(data, by_element = TRUE) : !missing_y non è TRUE" (Italian for "!missing_y is not TRUE").
What's wrong?
If I use it without the "by_element" option, it works and the distance matrix between all coordinates is created, but this is not what I need.
library(sf)
df <- data.frame(id = c(1,2,3), x_origin = c(642683.2, 373775, 383881), y_origin = c(5082920, 4997274, 4994504), x_dest = c(642683.3, 1126050, 942763.9), y_dest = c(5082920, 4374481, 4534235))
data <- st_as_sf(df, coords = c("x_origin", "y_origin"), crs = "4326")
distances <- st_distance(data, by_element = TRUE)
You have provided x (origin) to st_distance(), but no y (destination).
And that CRS can't be right: sf doesn't recognise an EPSG code passed as a string, and 4326 would suggest the coordinates are in WGS84 lat/long anyway. Assuming EPSG:32632, WGS 84 / UTM zone 32N:
library(sf)
df <- data.frame(id = c(1,2,3), x_origin = c(642683.2, 373775, 383881), y_origin = c(5082920, 4997274, 4994504), x_dest = c(642683.3, 1126050, 942763.9), y_dest = c(5082920, 4374481, 4534235))
origin <- st_as_sf(df, coords = c("x_origin", "y_origin"), crs = 32632)
dest <- st_as_sf(df, coords = c("x_dest", "y_dest"), crs = 32632)
(distances <- st_distance(origin, dest, by_element = TRUE))
#> Units: [m]
#> 1 2 3
#> 0.1 976621.1 724015.0
Created on 2023-01-25 with reprex v2.0.2
First, sf can't know that you are calculating distances between origins and destinations from the way you have input the data.
Second, the EPSG code for UTM Zone 32N is not 4326.
Third, you should have used crs = st_crs(4326) instead of crs = "4326".
Use the following piece of code to create the objects needed to calculate the distances you are interested in:
library(sf)
df <- data.frame(id = c(1, 2, 3),
                 x_origin = c(642683.2, 373775, 383881),
                 y_origin = c(5082920, 4997274, 4994504),
                 x_dest = c(642683.3, 1126050, 942763.9),
                 y_dest = c(5082920, 4374481, 4534235))
origin <- st_as_sf(df, coords = c("x_origin", "y_origin"),
                   crs = st_crs(32632))
dest <- st_as_sf(df, coords = c("x_dest", "y_dest"),
                 crs = st_crs(32632))
Note the different EPSG code for the CRS.
Next, we calculate the distances (the unit of this projection is meters):
distances <- st_distance(origin, dest, by_element = TRUE)
If you use by_element = FALSE, you get all the pairwise distances.
Lastly, we can use the units package to convert the distances to km (or we can simply divide them by 1000).
units::set_units(distances, "km")
> Units: [km]
> [1] 0.0001 976.6211 724.0150
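For completeness, the divide-by-1000 route mentioned above looks like this; as.numeric() drops the units class:
as.numeric(distances) / 1000
> [1]   0.0001 976.6211 724.0150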
I want to assess if the observations in my data are spatially randomly distributed over the sampling area (Sweden). I wanted to reproduce the example given in this answer: Spatial Autocorrelation Analysis (Global Moran's I) in R
Here is a small subset of my data, and the spatial polygon I used. Note that the coordinates are in SWEREF99 TM (EPSG:3006).
## spatial polygon of Sweden
library(rworldmap)
library(sp)
worldmap <- getMap(resolution = "high")
sweden <- worldmap[which(worldmap$SOVEREIGNT == "Sweden"),]
plot(sweden)
sweden
## conversion to EPSG: 3006 (SWEREF99 TM) (https://spatialreference.org/ref/epsg/3006/)
crs.laea <- CRS("+proj=utm +zone=33 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs")
sweden_proj <- spTransform(sweden, crs.laea)
## Data subset
x <- c(669894, 669894, 669894, 671088, 671117, 671117, 671117, 670513, 670513, 670513, 669921, 669310, 669310, 669310, 669303, 629720, 630318, 630925, 630925, 630925)
y <- c(7116684, 7116684, 7116684, 7116706, 7114900, 7114900, 7114900, 7114896, 7114896, 7114896, 7114888, 7115473, 7115473, 7115473, 7116075, 7131172, 7131180, 7131190, 7131190, 7131190)
library(spatstat)
coords.ppp_1 <- ppp(x, y, xrange = c(280227, 911417), yrange = c(6142436, 7605020))
coords.ppp <- unique(coords.ppp_1)
### plot data and Sweden map for check
plot(coords.ppp_1)
plot(sweden_proj, add=T)
So far it seems ok. Then I convert the spatial polygon to an owin object, simulate random data for comparison, and do the analysis.
library(maptools)
sw <- as.owin.SpatialPolygons(sweden_proj)
# Generate completely spatially random point patterns to compare against the observed
n <- coords.ppp_1$n
ex <- expression(runifpoint(n, sw))
# Compute a simulation envelope using Gest, which estimates the nearest neighbour distance distribution function G(r)
set.seed(1)
res <- envelope(coords.ppp, Gest, nsim = 99, simulate = ex, verbose = FALSE, savefuns = TRUE)
plot(res)
With envelope() I get the following error message:
"In envelopeEngine(X = X, fun = fun, simul = simrecipe, nsim = nsim, :
Window containing simulated patterns is not a subset of data window"
I suspect that there is a problem with the conversion between sp and owin, but I couldn't figure out what the issue really is.
Any advice?
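One hedged guess at the cause, sketched below (untested against this exact data): envelope() requires the window of the simulated patterns (the polygon sw) to be a subset of the data window, which here is the rectangle passed to ppp(). Building the observed pattern directly in the polygonal window makes the two windows identical:
library(spatstat)
library(maptools)
sw <- as.owin.SpatialPolygons(sweden_proj)
# window is now the polygon, not a bounding box; points outside it are dropped with a warning
coords.ppp <- unique(ppp(x, y, window = sw))
n <- coords.ppp$n
set.seed(1)
res <- envelope(coords.ppp, Gest, nsim = 99,
                simulate = expression(runifpoint(n, sw)),
                verbose = FALSE, savefuns = TRUE)
plot(res)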
I am trying to interpolate an irregular raster grid to a regular grid using the akima library in R. However, after I define the regular grid and interpolate the values onto it, the resulting raster ends up in a strange position. I'm doing something wrong, but I don't see where. If anyone has a solution (or knows a different approach), please let me know. Thank you very much.
library(raster)
library(akima)
library(rgdal)
library(sp)
# download the file
url <- 'https://downloads.psl.noaa.gov/Datasets/NARR/Derived/monolevel/air.2m.mon.ltm.nc'
file <- paste0(getwd(), "/airtemp.nc")
download.file(url, file, quiet = TRUE, mode = "wb") # less than 4 mb
# define the grid edges according to https://psl.noaa.gov/data/gridded/data.narr.monolevel.html
y <- c(12.2, 14.3, 57.3, 54.5)
x <- c(-133.5, -65.1, -152.9, -49.4)
xym <- cbind(x, y)
p = Polygon(xym)
ps = Polygons(list(p),1)
sps = SpatialPolygons(list(ps))
# create a spatial grid to 0.3 cell size
xy <- makegrid(sps, cellsize = 0.3)
xy$first <- 1
names(xy) <- c('x', 'y', 'first')
coordinates(xy) <- ~x+y
gridded(xy) <- TRUE
# read the netcdf file and extract the values
cape <- brick(file)[[1]] #get the first layer only
rp <- rasterToPoints(cape)
rp <- na.exclude(rp)
# inverse-project the points from the North America Lambert Conformal Conic CRS to lon/lat
r2 <- project(rp[,1:2], '+proj=lcc +lat_1=20 +lat_2=60 +lat_0=40 +lon_0=-96 +x_0=0 +y_0=0 +datum=NAD83 +units=m +no_defs', inv=TRUE, use_ob_tran=TRUE)
# add the transformed coordinates
rp[,1:2] <- r2
rp <- as.data.frame(rp)
# create a spatial points object and plot it
coordinates(rp)<-~x+y
spplot(rp, scales=list(draw = T))
# interpolate the points to the coordinates (takes a while)
akima.sp <- interpp(x = coordinates(rp)[,1], y = coordinates(rp)[,2],
                    z = rp@data[,names(rp)[1]],
                    xo = coordinates(xy)[,1],
                    yo = coordinates(xy)[,2],
                    linear = FALSE, extrap = FALSE)
# create a raster file
r.a <- rasterFromXYZ(as.matrix(data.frame(akima.sp)))
plot(r.a)
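If it helps, here is a hedged alternative sketch (an assumption, not a verified fix): akima::interp() can build the regular grid itself and returns a list(x, y, z) that raster() accepts directly, avoiding the separate makegrid()/interpp()/rasterFromXYZ() steps. The 0.3-degree spacing mirrors the makegrid() call above:
library(akima)
library(raster)
pts <- coordinates(rp)
fld <- interp(x = pts[, 1], y = pts[, 2], z = rp@data[[1]],
              xo = seq(min(pts[, 1]), max(pts[, 1]), by = 0.3),
              yo = seq(min(pts[, 2]), max(pts[, 2]), by = 0.3),
              linear = TRUE, extrap = FALSE, duplicate = "mean")
r.a2 <- raster(fld)  # raster() accepts interp()'s list(x, y, z) output
plot(r.a2)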
I have a Digital Elevation Model for the Alps (containing x, y, z). I would like to extract polygons representing two elevation levels (<=1000 m, >1000 m). The polygons then need to be applied to a more elaborate registration of the Alps with different x, y points (x, y, snowdepth), and that second registration must be filtered to the points above 1000 m. So it would be nice if I could store the polygon and apply it to different spatial data frames. My question is: how can I capture the polygon representing the elevation level above 1000 m and use it to filter the spatial data frame in R?
#packages
packages <- c("RCurl", "RColorBrewer","ggmap","rgeos","fields","dismo","rgdal", "deldir", "dplyr","tidyr","ggplot2","contoureR",
"maptools","raster","gstat", "magick","lubridate", "ggplot2","viridis","scales","inlmisc")
has_available <- packages %in% rownames(installed.packages())
if(any(!has_available)) install.packages(packages[!has_available])
lapply(packages,library,character.only = TRUE)
for(pkg in packages) {
library(pkg, character.only = TRUE)
}
geo_proj = "+proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0"
alps <- shapefile("Alpine_Convention_Perimeter_2018.shp")
proj4string(alps)
bbalps <- bbox(alps)
alps <- spTransform(alps,geo_proj)
r <- raster(alps,ncols=990,nrows=990)
srtm <- getData('SRTM', lon=15, lat=47)
srtm2 <- getData('SRTM', lon=12, lat=47)
srtm3 <- getData('SRTM', lon=9, lat=47)
srtm4 <- getData('SRTM', lon=6, lat=45)
#Mosaic/merge srtm tiles
srtmmosaic <- mosaic(srtm, srtm2, srtm3, srtm4, fun=mean)
srtmmosaic <- spTransform(srtmmosaic,geo_proj)
rm(srtm, srtm2, srtm3, srtm4)
rst0 <- projectRaster(srtmmosaic, r, crs=geo_proj)
#set the z-factor, which increases contrast
mosaic <- rst0 * 10
#then create the hillshade
slope <- terrain(mosaic, opt="slope", unit='radians')
aspect <- terrain(mosaic, opt="aspect", unit='radians')
hillshade <- hillShade(slope, aspect, angle=45, direction=315)
hillshade <- crop(hillshade, extent(alps))
hillshade <- mask(hillshade, alps)
slope <- crop(slope, extent(alps))
slope <- mask(slope, alps)
rst0 <- crop(rst0, extent(alps))
rst0 <- mask(rst0,alps)
#elevation
elev.df <- rasterToPoints(rst0)
elev.df <- data.frame(elev.df)
colnames(elev.df) <- c("lon", "lat", "elevation")
#ggplot makes me turn the raster into points
hills.df <- rasterToPoints(hillshade)
hills.df <- data.frame(hills.df)
colnames(hills.df) <- c("lon", "lat", "hills")
#merging the slope shade with the hillshade
slope.df <- rasterToPoints(slope)
slope.df <- data.frame(slope.df)
colnames(slope.df) <- c("lon", "lat", "slope")
slope.df$slope <- 1 - slope.df$slope # invert the scale so that more slope is darker
#elevation normalised
elev.df$val_norm <- (elev.df[,3]-min(elev.df[,3]))/diff(range(elev.df[,3]))
mnt.df<-hills.df %>%
left_join(slope.df) %>%
full_join(elev.df)
df = getContourLines(mnt.df, binwidth=1000)
ggplot(df,aes(x,y,group=Group,colour=z)) + geom_path()
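For the actual question of capturing a reusable >1000 m polygon, here is a minimal sketch (assumptions: the DEM raster rst0 from above, and a hypothetical SpatialPointsDataFrame snow_spdf of snow depths in the same CRS):
# reclassify the DEM into NA (<= 1000 m) and 1 (> 1000 m)
elev_class <- reclassify(rst0, matrix(c(-Inf, 1000, NA,
                                        1000, Inf, 1),
                                      ncol = 3, byrow = TRUE))
# dissolve the upper band into one polygon (uses rgeos) and store it for reuse
poly1000 <- rasterToPolygons(elev_class, dissolve = TRUE)
shapefile(poly1000, "elev_above_1000.shp")
# keep only the points that fall inside the >1000 m polygon
# (snow_spdf is hypothetical; it must share the polygon's CRS)
snow_high <- snow_spdf[!is.na(over(snow_spdf, poly1000)[, 1]), ]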
I have WRF output netCDF files with 1499 × 749 grid dimensions, produced with a "Mercator" projection over the Horn of Africa. I would like to convert the netCDF files into a raster stack to undertake further analysis. I have been trying different options, but nothing has worked for me: I am getting values in the wrong locations. I need help in this regard, and any help is much appreciated.
Here is the code :
library(ncdf4)
library(raster)
ro_rast <- nc_open("wrf_CAM0_daily_pre.nc")
pre <- ncvar_get(ro_rast, "pre")
lon <- ro_rast$dim$lon$vals
lat <- ro_rast$dim$lat$vals
time <- ro_rast$dim$ncl2$vals
rm(ro_rast)
r1_brick <- brick(pre, xmn=min(lat), xmx=max(lat), ymn=min(lon), ymx=max(lon), crs=CRS("+proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs +towgs84=0,0,0"))
# convert names of layers into dates
names(r1_brick) <- seq(as.Date('2018-06-01'), as.Date('2018-08-31'), 'days')
par(mar = c(2, 2, 2, 2))
# seasonal sum of precipitation
cam1_mean <- t(calc(r1_brick, sum))
cam1 <- flip(cam1_mean, direction = 2)
library(akima) # interpolation
lonlat_reg <- expand.grid(lon = seq(min(lon), max(lon), length.out = 1499),
lat = seq(min(lat), max(lat), length.out = 749))
test <- interp(x = as.vector(lon), y = as.vector(lat), z = as.vector(pre),
xo = unique(lonlat_reg[,"lon"]), yo = unique(lonlat_reg[,"lat"]),
duplicate = "error", linear = FALSE, extrap = FALSE)
test <- interp(x = as.vector(lon), y = as.vector(lat), z = as.vector(pre),
nx = 1499, ny = 749, linear = FALSE, extrap = FALSE)
# turn into a raster
test_ras <- raster(test)
The standard approach would be
library(raster)
b <- brick("wrf_CAM0_daily_pre.nc")
If that does not work, can you point us to the file you are using?
I get this error message (you should have added that to your question).
Error in .rasterObjectFromCDF(x, type = objecttype, band = band, ...) :
cells are not equally spaced; you should extract values as points
I checked the file, and in this case the raster is not a regular grid: the size of the cells changes with latitude. The file also does not provide the x and y values of the coordinate reference system used. So the best you can do is extract the values as points, as you were doing, using the interface of ncdf4 or another package. You cannot then directly make a RasterBrick, but you can create one using rasterize or interpolate.
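A minimal sketch of that points-then-rasterize route (the variable names come from the question's code, and the 0.05-degree target resolution and the lon × lat × time ordering of pre are assumptions; adjust to the actual file):
library(ncdf4)
library(raster)
nc  <- nc_open("wrf_CAM0_daily_pre.nc")
pre <- ncvar_get(nc, "pre")
lon <- nc$dim$lon$vals
lat <- nc$dim$lat$vals
nc_close(nc)
# one point per cell, first time step; x varies fastest, matching R's array order
pts <- data.frame(expand.grid(x = lon, y = lat), z = as.vector(pre[, , 1]))
# a regular target grid over the same extent (0.05 degrees is an assumed resolution)
r <- raster(extent(min(pts$x), max(pts$x), min(pts$y), max(pts$y)),
            res = 0.05, crs = "+proj=longlat +datum=WGS84")
r1 <- rasterize(as.matrix(pts[, c("x", "y")]), r, field = pts$z, fun = mean)
plot(r1)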