Set maximum length for generating Delaunay graph in R

I was trying to generate a Delaunay triangulation in R using the spatstat function 'delaunay'. However, I checked the documentation and it seems there is no argument to set a maximum edge length.
I noticed this post:
How to set maximum length of triangle side in Delaunay triangulation in R?
This seems to do what I want, but my point pattern is large, so I would prefer a simple and fast solution. Thank you!
Here is my code:
library(R.matlab)   # readMat()
library(spatstat)   # ppp(), owin(), delaunay()

pts <- data.frame(readMat(paste('./TMA - Coordinates/HE_Rescaled_Coords/', core,
                                '/HE.mat', sep = '')))
colnames(pts) <- c('x', 'y')
pts_ppp <- ppp(pts$x, pts$y, owin(poly = Region_HE))
delaunay_ppp <- delaunay(pts_ppp)
I am also open to solutions using functions from other packages, as long as it's fast.
Here is the region data: https://livejohnshopkins-my.sharepoint.com/:u:/g/personal/hmi1_jh_edu/EU5YeWiKzXlIohj7WIbfE_kB52Nbh2soXSNdHwQVukYnLA?e=t9tCf9
Here is the points data: https://livejohnshopkins-my.sharepoint.com/:u:/g/personal/hmi1_jh_edu/EaIuRF913rtBpg3VHvlp6TkB1FUomrgUc3eeUeHbVPJ50g?e=cwg1os

The Delaunay triangulation is a mathematically defined triangulation that does not involve the concept of a maximum segment length. If you want to constrain the maximum length of the segments in the triangulation, then it's not a Delaunay triangulation any more, and the algorithm for computing the Delaunay triangulation is not applicable.
You will have to specify what you want to happen when you impose a limit on the segment length. Should the algorithm just delete the edges which are too long? Delete the triangles that have an edge which is too long? If you delete stuff then the result is no longer a triangulation of the original points. Do you want to produce a different triangulation?
If X is your point pattern, then
Dtess <- delaunay(X)
Dnet <- delaunayNetwork(X)
give the Delaunay triangulation as a tessellation Dtess and as a network Dnet.
To remove edges from Dnet that are longer than lmax:
len <- lengths_psp(as.psp(Dnet))
Net <- thinNetwork(Dnet, retainedges = (len <= lmax))
To remove triangles from Dtess that have at least one edge longer than lmax:
hypotenuse <- function(p) { max(lengths_psp(edges(p))) }
h <- sapply(tiles(Dtess), hypotenuse)
Tess <- Dtess[h <= lmax]
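For a self-contained illustration, here is a minimal sketch (my own example, not part of the original answer) applying both approaches to a random pattern; lmax is an arbitrary threshold:
library(spatstat)
set.seed(1)
X <- runifpoint(100)       # 100 uniform random points in the unit square
lmax <- 0.1                # hypothetical maximum edge length
## edge-based: remove long edges from the Delaunay network
Dnet <- delaunayNetwork(X)
len <- lengths_psp(as.psp(Dnet))
Net <- thinNetwork(Dnet, retainedges = (len <= lmax))
## triangle-based: remove tiles that have any edge longer than lmax
Dtess <- delaunay(X)
h <- sapply(tiles(Dtess), function(p) max(lengths_psp(edges(p))))
Tess <- Dtess[h <= lmax]
plot(Net)
plot(Tess)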

Related

Fastest cartesian distance (R) from each point in SpatialPointsDataFrame to closest points/lines in 2nd shapefile

I want to know the fastest algorithms for obtaining the cartesian distances between each point in a SpatialPointsDataFrame (X) and either (a) the closest point in a second SpatialPointsDataFrame (Y), or (b) the closest line segment in a SpatialLinesDataFrame (Y). So this is basically 2 questions, with perhaps the same answer.
For the lines, I know I can use dist2Line(X, Y, distfun=distGeo), but this is insanely slow. I also tried using nncross, after converting X to a ppp object and Y to a psp object, as below. This did NOT work; heat mapping the new distance measure showed that it does not radiate from Y.
X_ppp <- as(X, "ppp")
Y_psp <- as(Y, "psp")
distR <- nncross(X_ppp, Y_psp, what = "dist", k = 1)
X$dist2road <- distR
For lines, I also tried using gDistance(X, Y) but was met with the error, for i=1,2: Spatial object i is not projected; GEOS expects planar coordinates. I think this is because I'm using lat-lon and it needs a true projection. But all the files I'm working with are lat-lon, and I'm not sure how to choose and specify a projection (for Tanzania) without copying it from another file.
For points, again using the nncross approach resulted in definitely wrong distances. (In both the point and line case, is this because the output vector is not ordered in the same way that the points within X are? If so, I see no way of outputting an ID for the point within X.)
Also for points, the knn code below did work. But it's clearly not in cartesian distance, so I'd like to convert it or find some other algorithm that provides cartesian distance.
knn.results <- knn(data = coordinates(market.shp),
                   query = coordinates(tzprice.shp), k = 1)
knn.results <- data.frame(knn.results)
tzprice.shp$dist2market <- knn.results[, 2]
Basically, my hope is to find the fastest algorithm for each purpose (distance to nearest point, distance to nearest line), with output either in cartesian distance or convertible to cartesian distance. Thanks!
Somebody pointed me towards one possible answer for finding the cartesian distance between each point in a SpatialPointsDataFrame (X) and the closest point in a second SpatialPointsDataFrame (call it Y). So that's the first half of my question... perhaps there's a faster method out there, but this way is quite fast, and it DOES return answers in km, at least if proj=longlat.
library(SearchTrees)   # createTree(), knnLookup()
library(sp)            # spDists()
tree <- createTree(coordinates(Y))
inds <- knnLookup(tree, newdat = coordinates(X), k = 1)
distkm <- sapply(seq_len(nrow(inds)), function(i) spDists(X[i, ], Y[inds[i, ], ]))
Still looking for an algorithm that (quickly) finds meters/km from each point in X to the nearest line in a SpatialLinesDataFrame.
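One possible route (my own sketch, not from the original thread): reproject both layers to a planar CRS first; then spatstat's nncross() can measure point-to-segment distances directly, and it returns them in the same order as the points in X. EPSG:32736 (UTM zone 36S) is just an example zone covering much of Tanzania:
library(sp)
library(rgdal)       # spTransform() methods for sp objects
library(maptools)    # as(x, "ppp") / as(x, "psp") coercions
library(spatstat)
## hypothetical reprojection -- choose the UTM zone that fits your data
X_utm <- spTransform(X, CRS("+init=epsg:32736"))
Y_utm <- spTransform(Y, CRS("+init=epsg:32736"))
X_ppp <- as(X_utm, "ppp")
Y_psp <- as(Y_utm, "psp")
## metres from each point of X to the nearest segment of Y
X_utm$dist2road <- nncross(X_ppp, Y_psp, what = "dist")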

Calculate shortest distance between 2 points in a Delaunay triangulation in R

Currently I'm working with spatial data and have applied a Delaunay triangulation to my data points. I additionally calculated the geodesic distances on the WGS84 ellipsoid for every edge (connection between 2 points) in the triangulation. Now I want to find the shortest path between every pair of points in the generated graph and calculate the path distance, i.e. the sum of the edge distances along the path.
Below is a minimal working example:
library(deldir)
set.seed(31)
x <- runif(100)
y <- runif(100)
d <- deldir(x, y)  # performs tessellation & Delaunay triangulation
# Calculate edge distances (for reasons of simplicity I calculate Euclidean distances here)
geodists <- numeric(nrow(d$delsgs))
for (i in 1:nrow(d$delsgs)) {
  geodists[i] <- sqrt((x[d$delsgs[i, 5]] - x[d$delsgs[i, 6]])^2 +
                      (y[d$delsgs[i, 5]] - y[d$delsgs[i, 6]])^2)
}
#Plot data
plot(d, wlines="triang")
However, I have no idea how I can perform the shortest path search on the deldir object I created. Thus, I'd be very happy if you could provide some solutions for my problem:
How can I identify which edges are involved in the shortest path between point A and B?
How can I then efficiently calculate the path distance matrix?
Thanks a lot in advance for your help!
There are some path-finding algorithms; one of them is A* (Wikipedia link).
Maybe this helps you.
You can replace the regularly ordered points of a Euclidean metric with the Delaunay points of your collection of points.
Then always go to the next neighbour that is closest to the finish point.
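For a concrete route (my own sketch, not part of the answer above): the deldir edge list feeds straight into igraph, which already implements weighted shortest-path search, so you don't need to hand-roll A*:
library(deldir)
library(igraph)
set.seed(31)
x <- runif(100)
y <- runif(100)
d <- deldir(x, y)
## columns 5 and 6 of d$delsgs hold the indices of the two edge endpoints
ends <- as.matrix(d$delsgs[, 5:6])
w <- sqrt((x[ends[, 1]] - x[ends[, 2]])^2 + (y[ends[, 1]] - y[ends[, 2]])^2)
g <- graph_from_edgelist(ends, directed = FALSE)
E(g)$weight <- w
## all-pairs matrix of path distances (sums of edge lengths)
D <- distances(g)
## edges forming the shortest path between points 1 and 10
ep <- shortest_paths(g, from = 1, to = 10, output = "epath")$epath[[1]]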

spatial filtering by proximity in R

I have occurrence points for a species, and I'd like to remove potential sampling bias (where some regions might have a much greater density of points than others). One way to do this would be to keep a maximal subset of points that are no less than a certain distance X from each other. Essentially, I would prevent points from being too close to each other.
Are there any existing R functions to do this? I've searched through various spatial packages, but haven't found anything, and can't figure out exactly how to implement this myself.
An example occurrence point dataset can be downloaded here.
Thanks!
I've written a new version of this function that no longer really follows rMaternII.
The input can either be a SpatialPoints, SpatialPointsDataFrame or matrix object.
Seems to work well, but suggestions welcome!
filterByProximity <- function(xy, dist, mapUnits = F) {
  # xy can be either a SpatialPoints or SPDF object, or a matrix
  # dist is in km if mapUnits=F, in mapUnits otherwise
  if (!mapUnits) {
    d <- spDists(xy, longlat = T)
  }
  if (mapUnits) {
    d <- spDists(xy, longlat = F)
  }
  diag(d) <- NA
  close <- (d <= dist)
  diag(close) <- NA
  closePts <- which(close, arr.ind = T)
  discard <- matrix(nrow = 2, ncol = 2)
  if (nrow(closePts) > 0) {
    while (nrow(closePts) > 0) {
      if ((!paste(closePts[1, 1], closePts[1, 2], sep = '_') %in% paste(discard[, 1], discard[, 2], sep = '_')) &
          (!paste(closePts[1, 2], closePts[1, 1], sep = '_') %in% paste(discard[, 1], discard[, 2], sep = '_'))) {
        discard <- rbind(discard, closePts[1, ])
        # drop = FALSE keeps closePts a matrix even when a single row remains
        closePts <- closePts[-union(which(closePts[, 1] == closePts[1, 1]),
                                    which(closePts[, 2] == closePts[1, 1])), , drop = FALSE]
      }
    }
    discard <- discard[complete.cases(discard), , drop = FALSE]
    return(xy[-discard[, 1], ])
  }
  if (nrow(closePts) == 0) {
    return(xy)
  }
}
Let's test it:
require(rgeos)
require(sp)
pts <- readWKT("MULTIPOINT ((3.5 2), (1 1), (2 2), (4.5 3), (4.5 4.5), (5 5), (1 5))")
pts2 <- filterByProximity(pts,dist=2, mapUnits=T)
plot(pts)
axis(1)
axis(2)
apply(as.data.frame(pts), 1, function(x)
  plot(gBuffer(SpatialPoints(coords = matrix(c(x[1], x[2]), nrow = 1)), width = 2), add = T))
plot(pts2,add=T,col='blue',pch=20,cex=2)
There is also an R package called spThin that performs spatial thinning on point data. It was developed for reducing the effects of sampling bias for species distribution models, and does multiple iterations for optimization. The function is quite easy to implement---the vignette can be found here. There is also a paper in Ecography with details about the technique.
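A hedged usage sketch (argument names recalled from the spThin vignette; check ?thin for the authoritative signature), where occ_df is a hypothetical data frame of occurrence records:
library(spThin)
thinned <- thin(loc.data = occ_df,
                lat.col = "LAT", long.col = "LONG",
                spec.col = "SPECIES",
                thin.par = 10,    # minimum nearest-neighbour distance, in km
                reps = 100,       # number of random-removal iterations
                locs.thinned.list.return = TRUE,
                write.files = FALSE)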
Following Josh O'Brien's advice, I looked at spatstat's rMaternI function, and came up with the following. It seems to work pretty well.
The distance is in map units. It would be nice to incorporate one of R's distance functions that always returns distances in meters, rather than input units, but I couldn't figure that out...
require(spatstat)
require(maptools)
occ <- readShapeSpatial('occurrence_example.shp')
filterByProximity <- function(occ, dist) {
  pts <- as.ppp.SpatialPoints(occ)
  d <- nndist(pts)
  z <- which(d > dist)
  return(occ[z, ])
}
occ2 <- filterByProximity(occ, dist = 0.2)
plot(occ)
plot(occ2, add = T, col = 'blue', pch = 20)
Rather than removing data points, you might consider spatial declustering. This involves giving points in clusters a lower weight than outlying points. The two simplest ways to do this involve a polygonal segmentation, like a Voronoi diagram, or some arbitrary grid. Both methods will weight points in each region according to the area of the region.
For example, if we take the points in your test (1,1),(2,2),(4.5,4.5),(5,5),(1,5) and apply a regular 2-by-2 mesh, where each cell is three units on a side, then the five points fall into three cells. The points ((1,1),(2,2)) falling into the cell [0,3]X[0,3] would each have weights 1/( no. of points in current cell TIMES tot. no. of occupied cells ) = 1 / ( 2 * 3 ). The same thing goes for the points ((4.5,4.5),(5,5)) in the cell (3,6]X(3,6]. The "outlier", (1,5) would have a weight 1 / ( 1 * 3 ). The nice thing about this technique is that it is a quick way to generate a density based weighting scheme.
A polygonal segmentation involves drawing a polygon around each point and using the area of that polygon to calculate the weight. Generally, the polygons completely cover the entire region, and the weights are calculated as the inverse of the area of each polygon. A Voronoi diagram is usually used for this, but polygonal segmentations may be calculated using other techniques, or may be specified by hand.
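As an illustration of the grid scheme just described, here is a small sketch (my own helper, not from the original answer; cell boundaries are handled slightly differently at the edges). It reproduces the worked example's weights of 1/6 for the clustered points and 1/3 for the outlier:
declusterWeights <- function(x, y, cellsize) {
  ## assign each point to a grid cell
  cell <- paste(floor(x / cellsize), floor(y / cellsize), sep = "_")
  nInCell <- table(cell)
  nOccupied <- length(nInCell)
  ## weight = 1 / (points in this cell * number of occupied cells)
  1 / (as.numeric(nInCell[cell]) * nOccupied)
}
x <- c(1, 2, 4.5, 5, 1)
y <- c(1, 2, 4.5, 5, 5)
declusterWeights(x, y, cellsize = 3)   # 1/6 1/6 1/6 1/6 1/3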

Finding the coordinates of points from distance matrix

I have a set of points (with unknown coordinates) and the distance matrix. I need to find the coordinates of these points in order to plot them and show the solution of my algorithm.
I can set one of these points at the coordinate (0,0) to simplify, and find the others. Can anyone tell me if it's possible to find the coordinates of the other points, and if yes, how?
Thanks in advance!
EDIT
Forgot to say that I need the coordinates on x-y only
The answers based on angles are cumbersome to implement and can't be easily generalized to data in higher dimensions. A better approach is that mentioned in my and WimC's answers here: given the distance matrix D(i, j), define
M(i, j) = 0.5*(D(1, j)^2 + D(i, 1)^2 - D(i, j)^2)
which should be a positive semi-definite matrix with rank equal to the minimal Euclidean dimension k in which the points can be embedded. The coordinates of the points can then be obtained from the k eigenvectors v(i) of M corresponding to non-zero eigenvalues q(i): place the vectors sqrt(q(i))*v(i) as columns in an n x k matrix X; then each row of X is a point. In other words, sqrt(q(i))*v(i) gives the ith component of all of the points.
The eigenvalues and eigenvectors of a matrix can be obtained easily in most programming languages (e.g., using GSL in C/C++, using the built-in function eig in Matlab, using Numpy in Python, etc.)
Note that this particular method always places the first point at the origin, but any rotation, reflection, or translation of the points will also satisfy the original distance matrix.
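Since the rest of this thread is R-centric, here is a small sketch of the eigendecomposition recipe above, translated into R (my own illustration):
coordsFromDist <- function(D, tol = 1e-8) {
  n <- nrow(D)
  ## M(i, j) = 0.5*(D(1, j)^2 + D(i, 1)^2 - D(i, j)^2)
  M <- 0.5 * (outer(rep(1, n), D[1, ]^2) + outer(D[, 1]^2, rep(1, n)) - D^2)
  e <- eigen(M)
  k <- sum(e$values > tol)               # minimal embedding dimension
  ## place sqrt(q(i)) * v(i) as columns; each row of the result is a point
  e$vectors[, 1:k, drop = FALSE] %*% diag(sqrt(e$values[1:k]), k)
}
## quick check: a 3-4-5 rectangle is recovered up to rotation/reflection
P <- cbind(c(0, 3, 3, 0), c(0, 0, 4, 4))
D <- as.matrix(dist(P))
max(abs(as.matrix(dist(coordsFromDist(D))) - D))   # ~0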
Step 1, arbitrarily assign one point P1 as (0,0).
Step 2, arbitrarily assign one point P2 along the positive x axis, at (Dp1p2, 0).
Step 3, find a point P3 that is not collinear with P1 and P2, i.e. a point for which none of the following hold:
Dp1p2 ≈ Dp1p3 + Dp2p3
Dp1p3 ≈ Dp1p2 + Dp2p3
Dp2p3 ≈ Dp1p3 + Dp1p2
and set that point in the "positive" y domain (if a point meets any of these criteria, it lies on the P1P2 axis and should be placed there instead).
Use the law of cosines to determine the angle A:
cos (A) = (Dp1p2^2 + Dp1p3^2 - Dp2p3^2)/(2*Dp1p2* Dp1p3)
P3 = (Dp1p3 * cos (A), Dp1p3 * sin(A))
You have now successfully built an orthonormal space and placed three points in that space.
Step 4: to determine each remaining point n, repeat step 3 to get a tentative coordinate (Xn, Yn).
Compare the distance {(Xn, Yn), (X3, Y3)} to Dp3pn in your matrix. If it is identical, you have successfully identified the coordinates of point n. Otherwise, point n is at (Xn, -Yn).
Note there is an alternative to step 4, but it is too much math for a Saturday afternoon
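To make steps 1-3 concrete, here is a tiny R sketch (my own hypothetical helper) that places the first three points from their pairwise distances:
place3 <- function(d12, d13, d23) {
  A <- acos((d12^2 + d13^2 - d23^2) / (2 * d12 * d13))  # law of cosines
  rbind(P1 = c(0, 0),
        P2 = c(d12, 0),                       # P2 on the positive x axis
        P3 = c(d13 * cos(A), d13 * sin(A)))   # P3 in the positive y domain
}
place3(3, 4, 5)   # right triangle: P3 lands at (0, 4)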
If for points p, q, and r you have pq, qr, and rp in your matrix, you have a triangle.
Wherever you have a triangle in your matrix you can compute one of two solutions for that triangle (independent of a Euclidean transform of the triangle on the plane). That is, for each triangle you compute, its mirror image is also a triangle that satisfies the distance constraints on p, q, and r. The fact that there are two solutions even for a triangle leads to the chirality problem: you have to choose the chirality (orientation) of each triangle, and not all choices may lead to a feasible solution to the problem.
Nevertheless, I have some suggestions. If the number of entries is small, consider using simulated annealing. You could incorporate chirality into the annealing step. This will be slow for large systems, and it may not converge to a perfect solution, but for some problems it's the best you can do.
The second suggestion will not give you a perfect solution either, but it will distribute the error: the method of least squares. In your case the objective function will be the error between the distances in your matrix and the actual distances between your points.
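A small sketch of the least-squares idea using stats::optim (my own illustration; note the objective has local minima, so multiple random restarts may be needed):
fitCoords <- function(D, k = 2, maxit = 1000) {
  n <- nrow(D)
  obj <- function(v) {
    X <- matrix(v, n, k)
    sum((as.matrix(dist(X)) - D)^2)   # squared error over all pairwise distances
  }
  res <- optim(rnorm(n * k), obj, method = "BFGS", control = list(maxit = maxit))
  matrix(res$par, n, k)
}
set.seed(1)
P <- cbind(runif(5), runif(5))
Xhat <- fitCoords(as.matrix(dist(P)))
max(abs(as.matrix(dist(Xhat)) - as.matrix(dist(P))))   # near 0 if converged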
This is a classical math problem: deriving the coordinate matrix X given only its distance matrix.
However, there is an efficient solution to this -- multidimensional scaling (MDS), which involves some linear algebra. Simply put, it takes a pairwise Euclidean distance matrix D, and the output is the estimated coordinate matrix Y (perhaps rotated), which is an approximation to X. For a ready-made implementation, use sklearn.manifold.MDS in Python.
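In R, the same thing is available out of the box as classical MDS via cmdscale():
## recover coordinates (up to rotation/reflection/translation)
P <- cbind(runif(10), runif(10))   # hidden 2-D points
D <- dist(P)                       # pairwise Euclidean distances
Y <- cmdscale(D, k = 2)
max(abs(as.matrix(dist(Y)) - as.matrix(D)))   # ~0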
The "eigenvector" method given by the favourite replies above is very general and automatically outputs a set of coordinates as the OP requested, however I noticed that that algorithm does not even ask for a desired orientation (rotation angle) for the frame of the output points, the algorithm chooses that orientation all by itself!
People who use it might want to know at what angle the frame will be tipped before hand so I found an equation which gives the answer for the case of up to three input points, however I have not had time to generalize it to n-points and hope someone will do that and add it to this discussion. Here are the three angles the output sides will form with the x-axis as a function of the input side lengths:
angle side a = arcsin(sqrt(((c+b+a)*(c+b-a)*(c-b+a)*(-c+b+a)*(c^2-b^2)^2)/(a^4*((c^2+b^2-a^2)^2+(c^2-b^2)^2))))*180/Pi/2
angle side b = arcsin(sqrt(((c+b+a)*(c+b-a)*(c-b+a)*(-c+b+a)*(c^2+b^2-a^2)^2)/(4*b^4*((c^2+b^2-a^2)^2+(c^2-b^2)^2))))*180/Pi/2
angle side c = arcsin(sqrt(((c+b+a)*(c+b-a)*(c-b+a)*(-c+b+a)*(c^2+b^2-a^2)^2)/(4*c^4*((c^2+b^2-a^2)^2+(c^2-b^2)^2))))*180/Pi/2
Those equations also lead directly to a solution of the OP's problem of finding the coordinates of each point: the side lengths are given as input, and my equations give the slope of each side relative to the x-axis of the solution, thus revealing the vector for each side of the polygon; summing those side vectors up to a desired vertex produces the coordinates of that vertex. So if anyone can extend my angle equations to handle more than three input lengths (though I note that might be impossible?), it could be a very fast route to the general solution of the OP's question, since the slow parts of the algorithms given above, like least-squares fitting or solving matrix equations, might be avoidable.

Identify a linear feature on a raster map and return a linear shape object using R

I would like to identify linear features, such as roads and rivers, on raster maps and convert them to a linear spatial object (SpatialLines class) using R.
The raster and sp packages can be used to convert features from rasters to polygon vector objects (SpatialPolygons class). rasterToPolygons() will extract cells of a certain value from a raster and return a polygon object. The product can be simplified using the dissolve=TRUE option, which calls routines in the rgeos package to do this.
This all works just fine, but I would prefer it to be a SpatialLines object. How can I do this?
Consider this example:
## Produce a sinuous linear feature on a raster as an example
library(raster)
r <- raster(nrow=400, ncol=400, xmn=0, ymn=0, xmx=400, ymx=400)
r[] <- NA
x <- seq(1, 100, by=0.01)
r[cellFromRowCol(r, round((sin(0.2*x) + cos(0.06*x)+2)*100), round(x*4))] <- 1
## Quick trick to make it three cells wide
r[edge(r, type="outer")] <- 1
## Plot
plot(r, legend=FALSE, axes=FALSE)
## Convert linear feature to a SpatialPolygons object
library(rgeos)
rPoly <- rasterToPolygons(r, fun=function(x) x==1, dissolve=TRUE)
plot(rPoly)
Would the best approach be to find a centre line through the polygon?
Or is there existing code available to do this?
EDIT: Thanks to @mdsumner for pointing out that this is called skeletonization.
Here's my effort. The plan is:
densify the lines
compute a delaunay triangulation
take the midpoints, and take those points that are in the polygon
build a distance-weighted minimum spanning tree
find its graph diameter path
The densifying code for starters:
densify <- function(xy, n = 5){
  ## densify a 2-col matrix
  cbind(dens(xy[, 1], n = n), dens(xy[, 2], n = n))
}

dens <- function(x, n = 5){
  ## densify a vector
  out <- rep(NA, 1 + (length(x) - 1) * (n + 1))
  ss <- seq(1, length(out), by = (n + 1))
  out[ss] <- x
  for (s in 1:(length(x) - 1)) {
    out[(1 + ss[s]):(ss[s + 1] - 1)] <- seq(x[s], x[s + 1], len = (n + 2))[-c(1, n + 2)]
  }
  out
}
And now the main course:
simplecentre <- function(xyP, dense){
  require(deldir)
  require(splancs)
  require(igraph)
  require(rgeos)
  ### optionally add extra points
  if (!missing(dense)) {
    xy <- densify(xyP, dense)
  } else {
    xy <- xyP
  }
  ### compute triangulation
  d <- deldir(xy[, 1], xy[, 2])
  ### find midpoints of triangle sides
  mids <- cbind((d$delsgs[, 'x1'] + d$delsgs[, 'x2'])/2,
                (d$delsgs[, 'y1'] + d$delsgs[, 'y2'])/2)
  ### get points that are inside the polygon
  sr <- SpatialPolygons(list(Polygons(list(Polygon(xyP)), ID = 1)))
  ins <- over(SpatialPoints(mids), sr)
  ### select the points
  pts <- mids[!is.na(ins), ]
  dPoly <- gDistance(as(sr, "SpatialLines"), SpatialPoints(pts), byid = TRUE)
  pts <- pts[dPoly > max(dPoly/1.5), ]
  ### now build a minimum spanning tree weighted on the distance
  G <- graph.adjacency(as.matrix(dist(pts)), weighted = TRUE, mode = "upper")
  T <- minimum.spanning.tree(G, weighted = TRUE)
  ### get a diameter
  path <- get.diameter(T)
  if (length(path) != vcount(T)) {
    stop("Path not linear - try increasing dens parameter")
  }
  ### path should be the sequence of points in order
  list(pts = pts[path + 1, ], tree = T)
}
Instead of the buffering of the earlier version, I compute the distance from each midpoint to the boundary of the polygon, and only take points that are (a) inside, and (b) further from the edge than 1/1.5 of the distance of the inside point that is furthest from the edge.
Problems can arise if the polygon kinks back on itself, with long segments and no densification. In that case the minimum spanning tree branches instead of forming a single path, and the code reports it.
As a test, I digitized a line (s, SpatialLines object), buffered it (p), then computed the centreline and superimposed them:
s = capture()
p = gBuffer(s,width=0.2)
plot(p,col="#cdeaff")
plot(s,add=TRUE,lwd=3,col="red")
scp = simplecentre(onering(p))
lines(scp$pts,col="white")
The 'onering' function just gets the coordinates of one ring from a SpatialPolygons thing that should only be one ring:
onering <- function(p){ p@polygons[[1]]@Polygons[[1]]@coords }
Capture spatial lines features with the 'capture' function:
capture <- function(){
  p <- locator(type = "l")
  SpatialLines(list(Lines(list(Line(cbind(p$x, p$y))), ID = 1)))
}
Thanks to @klewis at gis.stackexchange.com for linking to this elegant algorithm for finding the centre line (in response to a related question I asked there).
The process requires finding the coordinates on the edge of a polygon describing the linear feature and performing a Voronoi tessellation of those points. The vertices of the Voronoi tiles that fall within the polygon of the linear feature lie on the centre line. Turn these points into a line.
Voronoi tessellation is done really efficiently in R using the deldir package, and intersections of polygons and points with the rgeos package.
## Find points on boundary of rPoly (see question)
rPolyPts <- coordinates(as(as(rPoly, "SpatialLinesDataFrame"),
"SpatialPointsDataFrame"))
## Perform Voronoi tessellation of those points and extract coordinates of tiles
library(deldir)
rVoronoi <- tile.list(deldir(rPolyPts[, 1], rPolyPts[,2]))
rVoronoiPts <- SpatialPoints(do.call(rbind,
lapply(rVoronoi, function(x) cbind(x$x, x$y))))
## Find the points on the Voronoi tiles that fall inside
## the linear feature polygon
## N.B. That the width parameter may need to be adjusted if coordinate
## system is fractional (i.e. if longlat), but must be negative, and less
## than the dimension of a cell on the original raster.
library(rgeos)
rLinePts <- gIntersection(gBuffer(rPoly, width=-1), rVoronoiPts)
## Create SpatialLines object
rLine <- SpatialLines(list(Lines(Line(rLinePts), ID="1")))
The resulting SpatialLines object: [figure not reproduced here]
You can get the boundary of that polygon as SpatialLines by direct coercion:
rLines <- as(rPoly, "SpatialLinesDataFrame")
Summarizing the coordinates down to a single "centre line" would be possible, but nothing immediate that I know of. I think that process is generally called "skeletonization":
http://en.wikipedia.org/wiki/Topological_skeleton
I think the ideal solution would be to build a negative buffer that dynamically shrinks toward the minimum width without breaking when the value is too large: it would keep the object connected and eventually draw a line once the limit is reached. Unfortunately, this could be very computationally demanding, because it would probably have to proceed in steps, checking at each point whether the buffer has shrunk enough to yield a point of the centre line. Possibly it would need an effectively infinite number of steps, or at least some parametrised step size.
I don't know how to implement this for now.
