A simple algorithm for polygon intersection

I'm looking for a very simple algorithm for computing the polygon intersection/clipping.
That is, given polygons P, Q, I wish to find polygon T which is contained in P and in Q, and I wish T to be maximal among all possible polygons.
I don't mind the run time (I have a few very small polygons), I can also afford getting an approximation of the polygons' intersection (that is, a polygon with less points, but which is still contained in the polygons' intersection).
But it is really important to me that the algorithm be simple (cheaper testing) and preferably short (less code).
Edit: please note, I wish to obtain a polygon that represents the intersection. I don't need only a boolean answer to the question of whether the two polygons intersect.

I understand the original poster was looking for a simple solution, but unfortunately there really is no simple solution.
Nevertheless, I've recently created an open-source clipping library (written in Delphi, C++ and C#) which clips all kinds of polygons (including self-intersecting ones). This library is pretty simple to use: https://github.com/AngusJohnson/Clipper2

You could use a polygon clipping algorithm to find the intersection between two polygons. However, these tend to be complicated algorithms when all of the edge cases are taken into account.
One well-known polygon clipping algorithm is Weiler–Atherton (see the Wikipedia article on Weiler–Atherton).
Alan Murta has a complete implementation of a polygon clipper, GPC.
Edit:
Another approach is to first divide each polygon into a set of triangles, which are easier to deal with. The Two-Ears Theorem by Gary H. Meisters does the trick. This page at McGill does a good job of explaining triangle subdivision.

If you use C++, and don't want to create the algorithm yourself, you can use Boost.Geometry. It uses an adapted version of the Weiler-Atherton algorithm mentioned above.

You have not given us your representation of a polygon, so I am choosing (more like suggesting) one for you :)
Represent each polygon as one big convex polygon plus a list of smaller convex polygons which need to be 'subtracted' from that big convex polygon.
Now given two polygons in that representation, you can compute the intersection as follows:
Compute the intersection of the two big convex polygons to form the big polygon of the intersection. Then 'subtract' the intersections of all the smaller ones of both to get the list of subtracted polygons.
You get a new polygon in the same representation.
Since convex polygon intersection is easy, this intersection finding should be easy too.
This seems like it should work, but I haven't given it deeper thought as regards correctness/time/space complexity.
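For the convex step, here is a minimal sketch in Python of the classic Sutherland–Hodgman clip (my choice of algorithm, not necessarily what this answer had in mind), assuming both polygons are convex with vertices in counter-clockwise order:

def clip_convex(subject, clip):
    # Sutherland-Hodgman: successively clip 'subject' against each
    # directed edge of the convex polygon 'clip' (both CCW).
    def inside(p, a, b):
        # p is on the left of (or on) the directed edge a->b
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0

    def intersect(p, q, a, b):
        # crossing point of segment p->q with the infinite line through a->b
        d0 = (p[0] - a[0]) * (b[1] - a[1]) - (p[1] - a[1]) * (b[0] - a[0])
        d1 = (q[0] - a[0]) * (b[1] - a[1]) - (q[1] - a[1]) * (b[0] - a[0])
        t = d0 / (d0 - d1)
        return (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))

    output = list(subject)
    for a, b in zip(clip, clip[1:] + clip[:1]):
        inputs, output = output, []
        for p, q in zip(inputs, inputs[1:] + inputs[:1]):
            if inside(q, a, b):
                if not inside(p, a, b):
                    output.append(intersect(p, q, a, b))
                output.append(q)
            elif inside(p, a, b):
                output.append(intersect(p, q, a, b))
        if not output:
            break
    return output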

Here's a simple-and-stupid approach: on input, discretize your polygons into bitmaps. To intersect, AND the bitmaps together. To produce output polygons, trace out the jaggy borders of the bitmap and smooth the jaggies using a polygon-approximation algorithm. (I don't remember if that link gives the most suitable algorithms; it's just the first Google hit. You might check out one of the tools out there that convert bitmap images to vector representations. Maybe you could call one of them without reimplementing the algorithm?)
The most complex part would be tracing out the borders, I think.
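A rough sketch of the discretize-and-AND steps in Python (numpy and matplotlib assumed; the border tracing and smoothing are left out):

import numpy as np
from matplotlib.path import Path

def rasterize(polygon, xmin, ymin, xmax, ymax, n=256):
    # n x n boolean grid over the common bounding box of both polygons
    xs = np.linspace(xmin, xmax, n)
    ys = np.linspace(ymin, ymax, n)
    gx, gy = np.meshgrid(xs, ys)
    pts = np.column_stack([gx.ravel(), gy.ravel()])
    return Path(polygon).contains_points(pts).reshape(n, n)

# the intersection bitmap is then a plain logical AND:
#   mask = rasterize(P, ...) & rasterize(Q, ...)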
Back in the early 90s I faced something like this problem at work, by the way. I muffed it: I came up with a (completely different) algorithm that would work on real-number coordinates, but seemed to run into a completely unfixable plethora of degenerate cases in the face of the realities of floating-point (and noisy input). Perhaps with the help of the internet I'd have done better!

I have no very simple solution, but here are the main steps for the real algorithm:
1. Make a custom doubly linked list for the polygon vertices and edges. Using std::list won't do, because you must swap next and previous pointers/offsets yourself for a special operation on the nodes. This is the only way to have simple code, and it will give good performance.
2. Find the intersection points by comparing each pair of edges. Comparing each pair of edges gives O(N²) time, but improving the algorithm to O(N·log N) is easy afterwards. For a pair of edges (say a→b and c→d), the intersection point is found by using the parameter (from 0 to 1) on edge a→b, which is given by tₐ = d₀/(d₀−d₁), where d₀ is (a−c)×(d−c), d₁ is (b−c)×(d−c), and × is the 2D cross product, p×q = pₓ·qᵧ−pᵧ·qₓ. After having found tₐ, the intersection point is obtained by using it as a linear interpolation parameter on segment a→b: P = a + tₐ(b−a). (A code sketch is given at the end of this answer.)
3. Split each edge, adding vertices (and nodes in your linked list) where the segments intersect.
4. Cross the nodes at the intersection points. This is the operation for which you needed the custom doubly linked list: you must swap some pairs of next pointers (and update the previous pointers accordingly).
Then you have the raw result of the polygon intersection resolving algorithm. Normally, you will want to select some region according to the winding number of each region; search for "polygon winding number" for an explanation of this.
If you want to make an O(N·log N) algorithm out of this O(N²) one, you must do exactly the same thing, except inside a line sweep algorithm (look up the Bentley–Ottmann algorithm). The inner algorithm will be the same, with the only difference being that you will have a reduced number of edges to compare inside the loop.
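As a concrete illustration of the formula in step 2, here is a minimal Python sketch (the extra check that the hit lies on both segments is my addition):

def cross2d(p, q):
    # 2D cross product: p x q = px*qy - py*qx
    return p[0] * q[1] - p[1] * q[0]

def sub(p, q):
    return (p[0] - q[0], p[1] - q[1])

def segment_intersection(a, b, c, d):
    # signed areas of a and b relative to the line through c->d
    d0 = cross2d(sub(a, c), sub(d, c))
    d1 = cross2d(sub(b, c), sub(d, c))
    if d0 == d1:                       # parallel (or degenerate) edges
        return None
    t = d0 / (d0 - d1)                 # parameter on a->b
    # parameter on c->d, to confirm the hit lies on both segments
    e0 = cross2d(sub(c, a), sub(b, a))
    e1 = cross2d(sub(d, a), sub(b, a))
    s = e0 / (e0 - e1)
    if 0 <= t <= 1 and 0 <= s <= 1:
        return t, (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
    return None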

The way I worked around the same problem:
1. Break the polygons into line segments.
2. Find the intersecting lines using interval trees or a line sweep algorithm.
3. Find a closed path with adjacent vertices using a Graham scan.
4. Cross-reference the result of step 3 with Dinic's algorithm to dissolve them.
Note: my scenario was different, given that the polygons had a common vertex, but I hope this can help.

If you do not care about predictable run time, you could try first splitting your polygons into unions of convex polygons and then pairwise computing the intersections between the sub-polygons.
This would give you a collection of convex polygons such that their union is exactly the intersection of your starting polygons.

If the polygons are not aligned then they have to be aligned. I would do this by finding the centre of each polygon (average in X, average in Y), then incrementally rotating the polygon by matrix transformation, projecting the points onto one of the axes, and using the angle of minimum standard deviation to align the shapes (you could also use principal components).
For finding the intersection, a simple algorithm is to define a grid of points. For each point, maintain a count of points inside one polygon, the other polygon, or both (there are simple and fast point-in-polygon tests for this, e.g. http://wiki.unity3d.com/index.php?title=PolyContainsPoint). Count the points inside both polygon1 and polygon2, divide by the number of points inside polygon1 or polygon2, and you have a rough (depending on the grid sampling) estimate of the proportion by which the polygons overlap. The intersection area is given by the points corresponding to an AND operation.
E.g.:
function get_polygon_intersection($arr, $user_array)
{
    $maxx = -999; // choose sensible limits for your application
    $maxy = -999;
    $minx = 999;
    $miny = 999;
    $intersection_count = 0;
    $not_intersected = 0;
    $sampling = 20;
    // find min, max values of both polygons (min/max variables passed by reference)
    get_array_extent($arr, $maxx, $maxy, $minx, $miny);
    get_array_extent($user_array, $maxx, $maxy, $minx, $miny);
    // note the parentheses: without them only $minx/$miny would be divided
    $inc_x = ($maxx - $minx) / $sampling;
    $inc_y = ($maxy - $miny) / $sampling;
    // see if x,y is within poly1 and poly2 and count
    for ($i = $minx; $i <= $maxx; $i += $inc_x)
    {
        for ($j = $miny; $j <= $maxy; $j += $inc_y)
        {
            $in_arr = pt_in_poly_array($arr, $i, $j);
            $in_user_arr = pt_in_poly_array($user_array, $i, $j);
            if ($in_arr && $in_user_arr)
            {
                $intersection_count++;
            }
            else
            {
                $not_intersected++;
            }
        }
    }
    // return score as percentage intersection
    return 100.0 * $intersection_count / ($not_intersected + $intersection_count);
}

This can be a huge approximation depending on your polygons, but here's one:
Compute the center of mass for each polygon.
Compute the min, max, or average distance from each point of the polygon to its center of mass.
If C1C2 <= D1 + D2 (where C1/C2 is the center of the first/second polygon, and D1/D2 is the distance you computed for the first/second polygon), then the two polygons may "intersect". With the max distance this is a conservative test: if C1C2 > D1 + D2, the polygons definitely do not intersect.
This should be very efficient, as any transformation applied to a polygon applies in the very same way to its center of mass, and the center-to-vertex distances can be computed only once.
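A minimal sketch of this test in Python, using the vertex average as the center of mass and the max center-to-vertex distance, so that a failed test guarantees the polygons are disjoint:

def may_intersect(poly1, poly2):
    def center(poly):
        n = len(poly)
        return (sum(x for x, y in poly) / n, sum(y for x, y in poly) / n)

    def radius(poly, c):
        return max(((x - c[0]) ** 2 + (y - c[1]) ** 2) ** 0.5 for x, y in poly)

    c1, c2 = center(poly1), center(poly2)
    dist = ((c1[0] - c2[0]) ** 2 + (c1[1] - c2[1]) ** 2) ** 0.5
    return dist <= radius(poly1, c1) + radius(poly2, c2)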

Related

How to compute Voronoi tessellation based on Manhattan distance in R

I am trying to compute a Voronoi tessellation in 2D with the Manhattan distance in R.
Ideally this would be a function that takes a set of two-dimensional points and outputs a list of polygons that partition the space. I am not certain what representations of Voronoi tessellations are standard.
There are of course many ways to do this with the Euclidean metric (packages like deldir and qhull make this very easy), but I haven't found a way to do this for the Manhattan distance. A search using sos's findFn('voronoi') also yielded no results.
Info: taxicabgeometry.net
Interactive: Manhattan-metric Voronoi diagram (click version)
I've been rolling my own in Python, and can sum up the basics here:
Between neighboring centroids is a perpendicular bisector which, in the Manhattan metric, most likely consists of two rays joined by a 45-degree diagonal if the centroids are randomly generated, though a straight horizontal, vertical, or 45-degree diagonal line may also occur. Given the set of such lines for every centroid pair, the edges separating the regions are among them. Collect the intersection points of each pair of lines that are equidistant (within an epsilon), in the Manhattan metric, to their three nearest centroids. Also collect the two midpoints of the 45-degree diagonals that are similarly equidistant to their nearest two centroids. The outer polygons won't be closed; how to deal with them depends on what you need. The polygon borders and border vertices will need sorting, so your polygons aren't a zigzagged mess. Winding order can be determined if they should be clockwise or otherwise. More can be done; it just depends on what you need.
Unfortunately, this gets quadratically slower the more points are involved: intersecting every bisector with every other bisector is the bottleneck. I've been attempting an insertion method, with some success, but… Now I'm thinking to try first creating a nearest-neighbor linkage between the centroids. If the neighbors are known, the bisectors to intersect will be minimal, and many centroids can be computed quickly.
Anyway, the brute-force approach does work. The point near the cursor is actually two points of a tiny diagonal. It's a precise method, but more complicated than it first seems. The Java code from the interactive link above may be faster, but it was difficult to get solid and precise geometry from it.
Sorry, I don't know R.
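As a hedged illustration of the brute-force idea in Python (not R): instead of constructing the bisectors, simply label a grid of cells by the index of the L1-nearest centroid. Resolution and tie-breaking are arbitrary choices here, and region polygons would still have to be traced from the label image:

import numpy as np

def manhattan_voronoi_labels(centroids, xmin, ymin, xmax, ymax, n=400):
    # label each grid cell with the index of its L1-nearest centroid
    xs = np.linspace(xmin, xmax, n)
    ys = np.linspace(ymin, ymax, n)
    gx, gy = np.meshgrid(xs, ys)
    cx = np.asarray([c[0] for c in centroids])
    cy = np.asarray([c[1] for c in centroids])
    dist = np.abs(gx[..., None] - cx) + np.abs(gy[..., None] - cy)
    return dist.argmin(axis=-1)        # (n, n) array of region indices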
Maybe the question is about finding the maximum-area square that fits inside a circumcircle (of a triangle). The equation for such a square is |x| + |y| = r (www.mathematische-basteleien.de/taxicabgeometry.htm). When you have a mesh of triangles, the Voronoi diagram is its dual.

2d integration over non-uniform grid

I'm writing a data analysis program, and part of it requires finding the volume of a shape. The shape information comes in the form of a list of points, giving the radius and the angular coordinates of each point.
If the data points were uniformly distributed in coordinate space I would be able to perform the integral, but unfortunately the data points are basically randomly distributed.
My inefficient approach would be to find the nearest neighbours to each point and stitch the shape together like that, finding the volume of the stitched together parts.
Does anyone have a better approach to take?
Thanks.
If those are surface points, one good way to do it would be to discretize the surface as triangles and convert the volume integral to a surface integral using Green's theorem. Then you can use simple Gauss quadrature over the triangles.
OK, here it is, along duffymo's lines I think.
First, triangulate the surface, and make sure you have a consistent orientation of the triangles, meaning that the orientations of neighbouring triangles are such that their common edge is traversed in opposite directions.
Second, for each triangle ABC compute the expression H·cross2D(B−A, C−A)/2, where cross2D computes the cross product using the X and Y coordinates only, ignoring Z, the /2 turns the cross product into the triangle's signed projected area, and H is the Z-coordinate of any convenient point in the triangle (the barycentre makes each term exact for a flat triangle).
Third, sum up all the above expressions. The result will be the signed volume inside the surface (plus or minus depending on the choice of orientation).
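A minimal sketch of this summation in Python, assuming triangles is a list of (A, B, C) triples of 3D points with consistent orientation:

def signed_volume(triangles):
    total = 0.0
    for a, b, c in triangles:
        # cross2D(B-A, C-A); divided by 2 it is the signed area of the
        # triangle's projection onto the XY plane
        area2 = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
        h = (a[2] + b[2] + c[2]) / 3.0     # barycentre Z
        total += h * area2 / 2.0
    return total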
Sounds like you want the convex hull of a point cloud. Fortunately, there are efficient ways of getting you there. Check out scipy.spatial.ConvexHull.
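For example, a minimal usage sketch (scipy assumed; note the hull volume equals the shape's volume only if the shape really is convex):

import numpy as np
from scipy.spatial import ConvexHull

points = np.random.rand(100, 3)   # stand-in for your (x, y, z) samples
hull = ConvexHull(points)
print(hull.volume)                # volume enclosed by the convex hull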

Fast algorithm to uncross any crossing edges in a set of polygons

I have a number of polygons each represented as a list of points. I'm looking for a fast algorithm to go through the list of polygons and uncross all of the crossed edges until no crossed edges remain.
Pseudocode for the current version:

while True:
    for each pair of polygons:
        for edge1 in first_polygon:
            for edge2 in second_polygon:
                if edges_cross(edge1, edge2):  # uses a line-segment intersection test
                    uncross_edges(first_polygon, second_polygon, edge1, edge2)
    if no edges have been uncrossed:
        break
This can be improved a fair bit by replacing the while loop with recursion. However, it's still rather poor in terms of performance.
Below is a simple example of the untangling*. There'll actually be a large number of polygons and a fair number of points per polygon (around 10–500). The red line shows which two edges are being uncrossed. The result should always be a set of planar graphs, though I'm not sure whether there are multiple valid outcomes or just one.
Edit: this time I added the lines first, then added the points, and used a somewhat more complex shape. Pretend the points are fixed.
First, let us illustrate what you want (if I got it right). Suppose you have two polygons, one of which has an edge (a, b) that intersects an edge (s, r) of the other. These polygons also have a clockwise orientation, so you know the next vertex after b, and the next vertex after r. Since the edges cross, you remove them both and add four new ones: (a, r), (r, next(b)); (s, b), (b, next(r)). So you again have two polygons. This is illustrated in the following figure. Note that by initially removing only two edges (one from each polygon), all the crossings were resolved.
Speeding up the trivial implementation of O(n²) per iteration is not entirely easy, and 500 points per polygon is a very small amount to be worried about. If you decide that you need to improve this time, my initial suggestion would be to use the Bentley–Ottmann algorithm in some smart way. The smart way involves running the algorithm, then, when you find an intersection, doing the procedure above to eliminate it, and then updating the events that guide the algorithm. Hopefully, the events can be updated without rendering the algorithm useless for this situation, but I don't have a proof of that.
It seems that you want to end up with an embedded planar polygon whose vertices are exactly a given collection of points. The desired "order" on the points is what you get by going around the boundary of the polygon and enumerating the vertices in the order they appear.
For a given collection of points in general there will be more than one embedded polygon with this property; for an example, consider the following list of points:
(-1,-1), (0,0), (1,0), (1,1), (0,1)
This list defines a polygon meeting your criteria (if I understand it correctly). But so does the following ordering of this list:
(-1,-1), (1,0), (0,0), (1,1), (0,1)
Here is one algorithm that will work (I don't know about fast); a code sketch follows below.
First, sort your points by x-coordinate (e.g. with quicksort) in increasing order (call this list L).
Second, find the convex hull (e.g. with quickhull); the boundary of the convex hull will contain the leftmost and rightmost points of the sorted list L (call these L[1] and L[n]); let S be the subset of points on the boundary between L[1] and L[n].
The list you want is S in the order it appears in L (which is also the order it appears on the boundary of the convex hull), followed by the other elements L−S in the reverse of the order they appear in L.
The first two operations usually take time O(n log n) (worst case O(n²)); the last takes time O(n). The polygon you get is the lower boundary of the convex hull (from left to right, say), together with the rest of the points in a "zigzag" above it going from right to left.
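A sketch of this construction in Python (points as (x, y) tuples; an Andrew monotone-chain scan stands in for quickhull to get the lower boundary, and duplicate or degenerate collinear points are not handled carefully):

def polygonize(points):
    L = sorted(points)                      # step 1: sort by x (then y)

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower = []                              # step 2: lower hull boundary, S
    for p in L:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)

    S = set(lower)                          # step 3: S left-to-right,
    return lower + [p for p in reversed(L) if p not in S]  # rest right-to-left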

Sphere that surely encompass given list of points [points are with x, y and z co-ordinate]

I am trying to find a sphere that surely encompasses a given list of points.
The points have x, y and z coordinates (they are in 3D).
Actually, I am trying to derive three new points from the given list by some calculations: find MinX, MaxX, MinY, MaxY, MinZ and MaxZ, do some operations, and find three new points.
Then I will draw the sphere from these three points.
I will also take all three points on the diameter of the sphere, so I have a unique sphere.
Is there any standard way of finding an encompassing sphere for a given list of points?
Yes, the standard algorithm is Welzl's algorithm (assuming you want the minimal sphere around your points). In particular, the improved version by Gärtner is very useful, robust and numerically stable, and it handles all the degenerate cases well too.
At its core, the algorithm permutes the points (randomly) to find the 1–4 points that lie on the boundary of the sphere; it's basically a clever trial-and-error algorithm. From these points, you can find the center as the point that has the same distance to all of them. Gärtner's version uses an improved numerical device to find the center; it also employs an extra pivoting step that presumably makes the algorithm work better for a large number of input points.
If all you want is a sphere around three points, I suggest you still use Gärtner's "device" to compute the circumsphere of the triangle. Otherwise, the method will probably degenerate easily (e.g. when the triangle is very flat).
Do you need 3 points, or any number of points?
If you only need the answer for 3 points, each pair of points defines a line segment. Take the longest line segment, and take the sphere centered at its midpoint, whose radius is half the length of the segment. There are two cases.
The third point is inside that initial sphere. If so, you have the smallest sphere.
The third point is outside that initial sphere. Then the solution at Find Circum Center of Three point of Triangle [Not using Compass] will give you the center of the smallest sphere containing those 3 points.
If you need an arbitrary number of points, I'd do some sort of iterative approximation algorithm. Since it doesn't seem like you need that, I won't work out the details.
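If the minimal sphere is not required, here is a minimal sketch in the spirit of the asker's min/max idea (Python): center the sphere at the middle of the axis-aligned bounding box and take the radius as the distance to the farthest point. It always encloses the points, though it is generally larger than Welzl's result:

def enclosing_sphere(points):
    xs, ys, zs = zip(*points)
    c = ((min(xs) + max(xs)) / 2,
         (min(ys) + max(ys)) / 2,
         (min(zs) + max(zs)) / 2)
    r = max(((x - c[0]) ** 2 + (y - c[1]) ** 2 + (z - c[2]) ** 2) ** 0.5
            for x, y, z in points)
    return c, r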

checking intersection of many (small) polygons against one (large) polygon by filling with rectangles?

I have a 2D computational geometry / GIS problem that I think should be common and I'm hoping to find some existing code/library to use.
The problem is to check which subset of a big set (thousands) of small polygons intersect with a single large polygon. (By "small" and "large" I'm referring to the amount of space the polygons cover, not the number of points that define them, although in general suppose that the number of points defining a polygon is roughly proportional to its geometric size. And to give a sense of proportion, think of "large" as the polygon for a state in the United States, and "small" as the polygon for a town.)
Suppose the naive solution using a standard CheckIfPolygonsIntersect( P, p ) function, called for each small polygon p against the one large polygon P, is too slow. It seems that there are ways to pre-process the large polygon to make the intersection checks for the majority of the small polygons trivial. In particular, it seems like you could create a small set of rectangles that partially/almost fill the large polygon. And similarly you could create a small set of rectangles that partially/almost fill the area of the bounding box of the large polygon that is not actually within the large polygon. Then the vast majority of your small polygons could be trivially included or excluded: if they are fully outside the bounding rect of the large polygon, they are excluded. If they are fully inside the boundary of one of the inside-bounding-rect-but-outside-polygon rects, they are excluded. If any of their points are within any of the internal rects, they are included. And only if none of the above apply do you have to call the CheckIfPolygonsIntersect( P, p ) function.
Is that a well-known algorithm? Do you know of existing code to compute a reasonable set of interior/exterior rectangles for arbitrary (convex or concave) polygons? The rectangles don't have to be perfect in all cases; they just have to fill much of the polygon and much of the inside-bounding-rect-but-outside-polygon area. (A sketch of the resulting filter is given below.)
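A sketch of that filtering logic in Python (bbox_overlap, poly_bbox, rect_contains and check_if_polygons_intersect are hypothetical helpers standing in for your own primitives):

def intersects_large_polygon(small_poly, large_bbox, interior_rects,
                             exterior_rects, large_poly):
    # trivially excluded: entirely outside the large polygon's bounding box
    if not bbox_overlap(poly_bbox(small_poly), large_bbox):
        return False
    # trivially included: any vertex inside an interior rectangle
    if any(rect_contains(r, p) for r in interior_rects for p in small_poly):
        return True
    # trivially excluded: entirely inside a single exterior rectangle
    if any(all(rect_contains(r, p) for p in small_poly) for r in exterior_rects):
        return False
    # otherwise fall back to the full check
    return check_if_polygons_intersect(large_poly, small_poly)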
Here's a simple plan for how I might compute these rectangles:
- take the bounding box of the large polygon and construct a, say, 10x10 grid of points over it
- for each point, determine if it's inside or outside the polygon
- "grow" each point into a rectangle by iteratively expanding it in each of the four directions until one of the rect edges crosses one of the polygon edges, at which point you've gone too far (this would actually be done with a "binary search" style iteration, so with just a few iterations you could find the correct amount to expand in each direction; and of course there is some question of whether to maximize the edges one at a time or in concert with one another)
- any not-yet-expanded grid point that gets covered by another point's expansion just disappears
- when all points have been expanded (or have disappeared), you have your set of interior and exterior rectangles
Of course, certain crazy concave shapes for the large polygon could lead to some poor/small rectangles. But assuming the polygons are mostly reasonable (e.g., say they were the shapes of the states of the United States), it seems like you'd get a good set of rectangles and could greatly optimize those thousands of intersection checks you'd subsequently do.
Is there a name (and code) for that algorithm?
Edit: I am already using a quad-tree to determine the small polygons that are likely to intersect with the bounding rect of the large polygon. So the problem is about checking which of those polygons actually do intersect with the large polygon.
Thanks for any help.
In your plan you described something very similar to the signed distance map method. Google 'distance map algorithm' for details. I hope it is what you're looking for.
