segmenting lat/long data graph into lines/vectors - r

I have lat/lng data from multirotor UAV flights. There are a lot of data points (~13k per flight) and I wish to find line segments in the data; they give me flight speed and direction. I know that most of the flights are guided missions, meaning the UAV is given a point to fly to, but the exact points are unknown to me.
Here is a graph of a single flight lat/lng shifted to near (0,0) so they are visible on the same time-series graph.
I attempted to generate similar data, but there are several constraints, and it may take more time to solve that than to work on the segmenting itself.
The graphs nearly always start and end at the same point.
Horizontal lines mean the UAV is stationary. These segments are expected.
The beginning and end are always stationary, for takeoff and landing.
There is some noise in the lines from GPS accuracy, though seemingly not that much.
A lot of data points.
The number of segments is unknown.
The noise I could estimate by fitting a least-squares line to each segment once I have the segments. Currently I'm thinking of sampling the data (to decimate it a little) and constructing lines, merging lines whose angle differs by less than x (dependent on the noise), and finding the intersection points of the lines that remain.
Another thought is to look at this problem in the frequency domain. The corners should be quite high frequency. Maybe I could make a custom filter kernel that would let me use a window function and gain efficiency.
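A minimal sketch of the decimate-and-merge idea (in Python for illustration, though I'm working in R; the decimation factor and angle tolerance are placeholders, and stationary hover stretches would still need separate handling since near-zero steps have noise-dominated headings):

```python
import numpy as np

def heading_breakpoints(lat, lon, decimate=10, angle_tol_deg=5.0):
    """Decimate the track, then start a new segment wherever the heading
    turns by more than angle_tol_deg from the current segment's heading."""
    x = np.asarray(lon, dtype=float)[::decimate]
    y = np.asarray(lat, dtype=float)[::decimate]

    dx, dy = np.diff(x), np.diff(y)
    heading = np.degrees(np.arctan2(dy, dx))   # heading of each small step

    breakpoints = [0]
    ref = heading[0]
    for i in range(1, len(heading)):
        # Smallest signed difference between this step and the reference heading.
        delta = (heading[i] - ref + 180.0) % 360.0 - 180.0
        if abs(delta) > angle_tol_deg:
            breakpoints.append(i)   # corner found: a new line segment starts here
            ref = heading[i]
    breakpoints.append(len(x) - 1)
    return x, y, breakpoints
```

Fitting least-squares lines to each run between breakpoints and intersecting neighbouring lines would then give cleaner corner estimates, as described above.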
EDIT: Rewrote the question for more clarity and less rambling.

Related

Registration of slightly overlapping point cloud

Problem image
I am trying to align/register (>4) 2-D point cloud segments from several laser scanners with high accuracy, producing a perimeter contour of the scanned product. The segments between lasers may look like the image above. The issue is that the calibration process may be both incorrect and not quite accurate enough (and the segments possibly contain individual elevation-tilt errors, so they are not shape-wise similar: close but not exact), which is how I got here, trying to make the best of the situation.
Visually, the segments have a slight bias in both directions as well as a rotational error compared to each other.
The difficulty is that the segments only partially overlap, contain low but noticeable noise which may be coherent, and the sampled point distribution is both sparse and uneven in the overlapping region, since the camera placements are far apart (approximately 90 degrees).
My solution so far is to ignore the rotational bias, estimate the mean bias of selected correspondence points within the point cloud segments in the overlapping region, and translate each segment by that estimate, working around until I get to the opposite corner. It works somewhat OK, but it is a problem for the last set of sensors, since all the errors appear to add up there. Additionally, it fails when there is little or no overlapping region.
I am not a specialist, so while complicated solutions may be useful for others, a relatively robust, iterative approach that can be simply coded would be best. I am grateful for any advice on this simple-sounding but quite challenging problem.
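A minimal sketch of the mean-bias translation I currently use, assuming NumPy, one N×2 array per segment, and a hand-picked list of correspondence index pairs (all names are illustrative):

```python
import numpy as np

def mean_bias(ref_seg, mov_seg, pairs):
    """Mean (dx, dy) that moves mov_seg onto ref_seg, given index pairs
    (i, j) of corresponding points in the overlap region."""
    ref = np.array([ref_seg[i] for i, _ in pairs], dtype=float)
    mov = np.array([mov_seg[j] for _, j in pairs], dtype=float)
    return (ref - mov).mean(axis=0)

def translate(seg, t):
    """Apply the estimated translation to a whole segment."""
    return np.asarray(seg, dtype=float) + t
```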

A large amount of points to create separate polygons (ArcGIS/QGIS)

Visual example of the data
I used a drone to create a DOF of a small area. During the flight, it takes a photo roughly every 20 seconds (about 40 meters of flight). I have created a CSV file, which I converted to a point shapefile. In total, I flew 10 so-called "missions" with the drone, each with 100-200 points which are "shaped" as squares on the map. What I want now is to create a polygon shapefile from the point shapefile.
Because those points sometimes overlap, I cannot use the "Aggregate Points" tool, as it is only distance-based. I want to make the polygons automatically, using some kind of script. What could help is the fact that the maximum time between two points (i.e. photos taken) within a mission is 10-20 seconds, so if the time gap is over 3 minutes, it's another "mission". Can you help with such a script, which would quickly and automatically create as many polygons as there are missions?
Okay, I think I understand what you are trying to accomplish. Since no one has replied, I am going to give it a quick shot so you have something to try.
I think the best strategy would be to:
Clustering algorithm: Try running a clustering algorithm such as DBSCAN on the timestamp dimension to classify the points into time-based groups instead of distance-based ones (since, as you said, distance-based separation is not enough to properly identify and separate the points). Afterwards, all the points should be classified into groups via a group-id column. The maximum-distance parameter of the algorithm should be around 20 seconds, or even a minute (since you said the missions are separated by at least about 3 minutes).
Feature-based points to polygon: Then run a generic Polygon_from_points(...)-style function that turns the clustered points into polygon shapes based on a specific discriminant feature (which in your case is the group id).
How does this work?: This separates the groups properly first (time-based), and then you should be able to find a generic point-to-polygon tool driven by a feature (ArcGIS should have one).
I don't have an example dataset or polished code, but based on what you described I think it would work; a rough sketch of the idea follows. Hope it helps.
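A rough sketch of that pipeline, assuming scikit-learn and shapely are available, timestamps are in seconds, and a convex hull stands in for whatever point-to-polygon tool you end up using in ArcGIS/QGIS:

```python
import numpy as np
from sklearn.cluster import DBSCAN
from shapely.geometry import MultiPoint

def missions_to_polygons(x, y, t, gap_seconds=60):
    """Cluster points by timestamp (seconds) and return one convex-hull
    polygon per mission, keyed by the cluster (group) id."""
    t = np.asarray(t, dtype=float).reshape(-1, 1)
    # Points closer than gap_seconds in time end up in the same mission.
    labels = DBSCAN(eps=gap_seconds, min_samples=3).fit_predict(t)

    polygons = {}
    for label in set(labels):
        if label == -1:            # DBSCAN noise points, if any
            continue
        mask = labels == label
        pts = MultiPoint(list(zip(np.asarray(x)[mask], np.asarray(y)[mask])))
        polygons[label] = pts.convex_hull
    return polygons
```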

Finding a quantity of anything between two points in space

I'm currently working towards a 3D model of this, but I thought I would start with 2D. Basically, I have a grid of longitude and latitude with NO2 concentrations across it. What I want to produce, at least for now, is a total amount of Nitrogen Dioxide between two points. Like so:
2DGrid
Basically, these two points are at different lats and lons, and as I stated, I want to find the amount of something between them. The tricky part is that the model data I'm working with is gridded, so I need to account for the amount of something along a line at the lats and lons where that line cuts through the grid.
Another approach, and maybe a better one for my purposes, could be visualized like this: 3DGrid
Ultimately, I'd like to be able to create a program (in any language, honestly) that could find the amount of "something" between two points in a 3D grid. If you would like specifics: the bottom altitude is the surface, and the top of the grid is the top of the atmosphere. The bottom point is a measurement device looking at the sun at a certain time of day (and therefore having a certain zenith and azimuth angle). I want to find the NO2 between that measurement device and the "top of the atmosphere", which in my grid is just the top altitude level (of which there are 25).
I'm rather new to coding, Stack Exchange, and even the subject matter I'm working with, so the sparse code I've made might create more clutter than simply asking the question and seeing what methods/code you might suggest.
Hopefully my question is beneficial!
Best,
Taylor
To traverse all touched cells, you can use the Amanatides-Woo algorithm. It is suitable for both the 2D and the 3D case.
Implementation clues
To account for the quantity contributed by every cell, you can apply some model. For example, calculate the path length inside the cell (as the difference of the entry and exit coordinates) and divide by a normalizing factor to get the cell weight (for example, CellSize*Sqrt(3) for the 3D case, i.e. the cell diagonal length).
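A sketch of the 2-D case in Python (the 3-D version just adds a z axis; the grid origin is assumed at (0, 0) with square cells of size cell_size, and the names are illustrative):

```python
import math

def traverse_grid_2d(x0, y0, x1, y1, cell_size):
    """Amanatides-Woo style traversal of a uniform 2-D grid. Yields
    (ix, iy, length_in_cell) for every cell the segment passes through."""
    dx, dy = x1 - x0, y1 - y0
    total = math.hypot(dx, dy)
    if total == 0.0:
        return

    ix, iy = int(math.floor(x0 / cell_size)), int(math.floor(y0 / cell_size))
    step_x = 1 if dx > 0 else -1
    step_y = 1 if dy > 0 else -1

    def boundary_distances(p, i, d, step):
        # Arc length to the first cell boundary along this axis, and the
        # arc length between successive boundaries.
        if d == 0:
            return math.inf, math.inf
        next_boundary = (i + (1 if step > 0 else 0)) * cell_size
        return (next_boundary - p) / d * total, cell_size / abs(d) * total

    t_max_x, t_delta_x = boundary_distances(x0, ix, dx, step_x)
    t_max_y, t_delta_y = boundary_distances(y0, iy, dy, step_y)

    t = 0.0
    while True:
        t_next = min(t_max_x, t_max_y, total)
        yield ix, iy, t_next - t        # path length inside the current cell
        if t_next >= total:
            break
        t = t_next
        if t_max_x <= t_max_y:          # exact corner hits yield a zero-length cell
            ix += step_x
            t_max_x += t_delta_x
        else:
            iy += step_y
            t_max_y += t_delta_y
```

Summing no2[ix, iy] * length / (cell_size * sqrt(2)) over the yielded cells gives the diagonal-normalized weighting described above (sqrt(3) in the 3-D case).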

Maximum number of points that lie on the same straight line in a 2D plane

I am trying to solve a programming interview question that asks for the maximum number of points that lie on the same straight line in a 2D plane. I have looked up solutions on the web. All of them discuss an O(N^2) solution using hashing, such as the one at this link: Here
I understand the part where a common gradient is used to check for collinear points, since that is a common mathematical technique. However, the solution points out that one must beware of vertical lines and overlapping points. I am not sure how these cases can cause problems. Can't I just store the gradient of vertical lines as infinity (a large number)?
Hint:
Three distinct points are collinear if
x_1*(y_2-y_3)+x_2*(y_3-y_1)+x_3*(y_1-y_2) = 0
No need to check for slopes or anything else. You do need to eliminate duplicate points from the set before the search begins, though.
So pick a pair of points, find all other points that are collinear with them, and store them as a line in a list of lines. Do the same for the remaining points, and then compare which lines have the most points.
The first time around you have n-2 tests. The second time around you have n-4 tests, because there is no point in revisiting the first two points. Next time n-6, etc., for a total of n/2 rounds. In the worst case this results in about (n/2)*(n/2-1) operations, which is O(n^2) complexity.
PS. Whoever decided the canonical answer uses slopes knows very little about planar geometry. People invented homogeneous coordinates for points and lines in the plane for the exact reason of having to represent vertical lines and other degenerate situations.
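A deliberately un-optimized illustration of the determinant test (O(n^3), integer coordinates; with floats you would compare against a small tolerance instead of zero), just to show that vertical lines need no special case once duplicates are removed:

```python
def collinear(p1, p2, p3):
    """True if three points lie on one line: the determinant test above,
    so vertical lines need no special handling."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2) == 0

def max_points_on_line(points):
    pts = list(set(points))              # eliminate duplicate points first
    if len(pts) <= 2:
        return len(pts)
    best = 2
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            on_line = 2 + sum(collinear(pts[i], pts[j], pts[k])
                              for k in range(len(pts)) if k != i and k != j)
            best = max(best, on_line)
    return best

print(max_points_on_line([(0, 0), (1, 1), (2, 2), (2, 0), (2, 2)]))  # prints 3
```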

How to smooth hand drawn lines?

So I am using a Kinect with Unity.
With the Kinect, we detect a hand gesture, and while it is active we draw a line on the screen that follows wherever the hand goes. Every update, the location is stored as the newest (and last) point in the line. However, the lines can often look very choppy.
Here is a general picture that shows what I want to achieve:
The red is the original line, and the purple is the new, smoothed line. If the user suddenly stops and changes direction, we don't want the line to do exactly that, but instead to make a rapid turn or a loop.
My current solution uses cubic Bezier curves and only keeps points that are X distance away from each other (with Y points being placed between the two kept points using the cubic Bezier). However, there are two problems with this, among others:
1) It often doesn't preserve the curves out to the distance the user drew them; for example, if the user suddenly stops a line and reverses direction, there is a pretty good chance the line won't extend to the point where the user reversed direction.
2) There is also a chance that the selected "good" point is actually a "bad" random jump point.
So I've thought about other solutions, including limiting the maximum angle between points (with 0 degrees being a straight line). However, if a point exceeds the angle limit, the math behind reducing the angle while still following the drawn line as closely as possible seems complicated. But maybe it's not. Either way, I'm not sure what to do and I'm looking for help.
Keep in mind this needs to be done in real time as the user is drawing the line.
You can try the Ramer-Douglas-Peucker algorithm to simplify your curve:
https://en.wikipedia.org/wiki/Ramer%E2%80%93Douglas%E2%80%93Peucker_algorithm
It's a simple algorithm, and parameterization is reasonably intuitive. You may use it as a preprocessing step or maybe after one or more other algorithms. In any case it's a good algorithm to have in your toolbox.
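A compact recursive version in Python, in case a library implementation isn't handy (epsilon is in the same units as the points):

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker: keep only points that deviate from the
    current chord by more than epsilon."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    chord = math.hypot(x2 - x1, y2 - y1)

    def dist(p):
        # Perpendicular distance of p from the chord (end-point distance
        # if the chord degenerates to a point).
        x0, y0 = p
        if chord == 0:
            return math.hypot(x0 - x1, y0 - y1)
        return abs((x2 - x1) * (y1 - y0) - (x1 - x0) * (y2 - y1)) / chord

    idx, dmax = max(((i, dist(p)) for i, p in enumerate(points[1:-1], 1)),
                    key=lambda t: t[1])
    if dmax <= epsilon:
        return [points[0], points[-1]]
    # Split at the farthest point and recurse on both halves.
    left = rdp(points[:idx + 1], epsilon)
    right = rdp(points[idx:], epsilon)
    return left[:-1] + right
```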
Using angles to reject "jump" points may be tricky, as you've seen. One option is to compare the total length of N line segments to the straight-line distance between the extreme end points of that chain of N line segments. You can threshold the ratio of (totalLength/straightLineLength) to identify line segments to be rejected. This would be a quick calculation, and it's easy to understand.
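That check is only a few lines; the 2.0 threshold below is just a placeholder to tune:

```python
import math

def is_jumpy(chain, ratio_threshold=2.0):
    """True if the chain's total polyline length is much longer than the
    straight-line distance between its end points."""
    total = sum(math.hypot(bx - ax, by - ay)
                for (ax, ay), (bx, by) in zip(chain, chain[1:]))
    straight = math.hypot(chain[-1][0] - chain[0][0],
                          chain[-1][1] - chain[0][1])
    return straight > 0 and total / straight > ratio_threshold
```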
If you want to take line segment lengths and segment-to-segment angles into consideration, you could treat the line segments as vectors and compute their cross product. If you imagine the two vectors as defining a parallelogram, and if knowing the area of that parallelogram would be a way to accept/reject a point, then the cross product is another simple and quick calculation.
https://www.math.ucdavis.edu/~daddel/linear_algebra_appl/Applications/Determinant/Determinant/node4.html
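As a sketch, normalizing that cross product by the two segment lengths gives the sine of the turn angle, which is easy to threshold:

```python
import math

def turn_sharpness(p_prev, p_curr, p_next):
    """Sine of the turn angle at p_curr; |cross| is the area of the
    parallelogram spanned by the two segment vectors."""
    v1 = (p_curr[0] - p_prev[0], p_curr[1] - p_prev[1])
    v2 = (p_next[0] - p_curr[0], p_next[1] - p_curr[1])
    cross = v1[0] * v2[1] - v1[1] * v2[0]   # signed parallelogram area
    norm = math.hypot(*v1) * math.hypot(*v2)
    return abs(cross) / norm if norm > 0 else 0.0

# e.g. flag p_curr as a jump point when turn_sharpness(...) exceeds
# roughly 0.7 (the sine of a 45-degree turn).
```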
If you only have a few dozen points, you could eliminate one point at a time, generate your spline fits, and then calculate the point-to-spline distances for all the original points. Given all those point-to-spline distances, you can compute a metric (e.g. mean distance) that you'd like to minimize: the best fit results from eliminating whichever points (Pn, Pn+k, ...) give the best spline-fit quality. This technique wouldn't scale well with more points, but it might be worth a try if you break each chain of line segments into groups of maybe half a dozen segments each.
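A sketch of that leave-one-out idea with SciPy's parametric splines, assuming short chains (roughly half a dozen segments, as suggested, so the spline always has enough points) and measuring fit quality as the mean point-to-spline distance:

```python
import numpy as np
from scipy.interpolate import splprep, splev

def spline_fit_error(points, drop_index=None):
    """Fit a parametric spline to the chain (optionally with one interior
    point removed) and return the mean distance from all original points
    to a dense sampling of the spline."""
    pts = np.asarray(points, dtype=float)
    keep = pts if drop_index is None else np.delete(pts, drop_index, axis=0)
    tck, _ = splprep([keep[:, 0], keep[:, 1]], s=0)
    xs, ys = splev(np.linspace(0.0, 1.0, 200), tck)
    curve = np.column_stack([xs, ys])
    # Distance from each original point to its nearest sampled spline point.
    d = np.linalg.norm(pts[:, None, :] - curve[None, :, :], axis=2).min(axis=1)
    return d.mean()

def best_point_to_drop(points):
    """Interior point whose removal gives the lowest fit error."""
    errors = [spline_fit_error(points, i) for i in range(1, len(points) - 1)]
    return 1 + int(np.argmin(errors))
```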
Although it's overkill for this problem, I'll mention that Euler curves can be good fits to "natural" curves. What's nice about Euler curves is that you can generate an Euler curve fit from two points in space and the tangents at those two points. The code gets hairy, but Euler curves (a.k.a. aesthetic curves, if I remember correctly) can generate better and/or more useful fits to natural curves than nth-degree Bezier splines.
https://en.wikipedia.org/wiki/Euler_spiral
