I compute the centroid of a point cloud using PCL, and I want to know how to visualize this centroid on top of the point cloud.
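For reference, a minimal sketch of one way to do this with PCL, assuming the cloud is a pcl::PointCloud<pcl::PointXYZ> already loaded into `cloud`: compute the centroid with pcl::compute3DCentroid, then mark it in PCLVisualizer with a small sphere (the radius and colour below are arbitrary choices).

```cpp
#include <pcl/common/centroid.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/visualization/pcl_visualizer.h>

void showCentroid(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& cloud)
{
    // Compute the 3D centroid of the cloud (homogeneous x, y, z, 1).
    Eigen::Vector4f centroid;
    pcl::compute3DCentroid(*cloud, centroid);

    // Show the cloud and mark the centroid with a small red sphere.
    pcl::visualization::PCLVisualizer viewer("cloud + centroid");
    viewer.addPointCloud<pcl::PointXYZ>(cloud, "cloud");

    pcl::PointXYZ center(centroid[0], centroid[1], centroid[2]);
    viewer.addSphere(center, 0.02, 1.0, 0.0, 0.0, "centroid");  // radius and colour are arbitrary

    viewer.spin();
}
```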
I am trying to make a 3D scanner based on a turntable. This 3D scanner should create a point cloud model to measure an object's length, height and depth. I am working with a RealSense D435 RGBD camera. While the turntable rotates, I capture point clouds of the object being scanned. My question is: how can I merge these point clouds, which were captured at different rotation angles?
My steps are:
Get the point cloud of the object. (I already have this.)
Transform the point cloud into the world coordinate system, so that when I import it into MeshLab it lies on the xy plane. (I have done this too.)
Use a rotation matrix to rotate each point cloud into its real position. For example, I take the point cloud captured at a 90-degree turntable angle and rotate it about the z axis; the rotation itself works, but the 0-degree and 90-degree clouds do not match up.
How can I get past this problem?
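For what it's worth, here is a minimal sketch of the rotation step as described, assuming the cloud is already expressed in the turntable frame (z axis = rotation axis, origin on the table centre) and that `angle_deg` is the turntable angle recorded at capture time. The transform utilities are PCL's own; the sign convention and the frame assumptions are the parts most likely to explain why the 0-degree and 90-degree clouds fail to line up.

```cpp
#include <cmath>
#include <pcl/common/transforms.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

// Rotate a cloud captured at `angle_deg` back into the 0-degree frame.
// Assumes the cloud is already in the turntable frame (z axis = rotation axis,
// origin on the table centre); if the origin is offset, the clouds will not line up.
pcl::PointCloud<pcl::PointXYZ>::Ptr
alignToZeroDegrees(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& cloud, float angle_deg)
{
    // Negative sign: undo the turntable rotation. Flip it if your table turns the other way.
    const float angle_rad = -angle_deg * static_cast<float>(M_PI) / 180.0f;

    Eigen::Affine3f transform = Eigen::Affine3f::Identity();
    transform.rotate(Eigen::AngleAxisf(angle_rad, Eigen::Vector3f::UnitZ()));

    pcl::PointCloud<pcl::PointXYZ>::Ptr rotated(new pcl::PointCloud<pcl::PointXYZ>());
    pcl::transformPointCloud(*cloud, *rotated, transform);
    return rotated;
}
```

If the cloud's origin is not actually on the rotation axis, every capture rotates about the wrong centre, which produces exactly this kind of mismatch between angles.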
I'm working on a project where I have a cloud of points in space as input data; my goal is to create a surface.
I started by computing a regression plane for the cloud, then I projected my points onto the plane using dot products:
My plane is represented by a point and a normal; I construct the axes of the plane's space using cross products, then project each point onto these axes.
Then I triangulate in 2D (that's the point of the whole operation).
My problem is that my points are now in the plane's space, and I want to get them back to their initial positions (invert the transformation) so that my surface sits ON my points.
Thank you :)
The best way is to keep the original positions and make the triangulation give you indices rather than positions. I hope this helps!
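A rough sketch of that idea, using Eigen and a hypothetical triangulate2D() that returns triangles as triples of indices into its input (substitute whatever 2D triangulation you already use): project each point onto the plane's axes while keeping its index, triangulate the 2D coordinates, and read the triangles straight back against the original 3D points, so no inverse transform is needed.

```cpp
#include <Eigen/Dense>
#include <array>
#include <vector>

// Hypothetical 2D Delaunay triangulation returning index triples into `uv`
// (plug in the triangulation library you already have).
std::vector<std::array<int, 3>> triangulate2D(const std::vector<Eigen::Vector2d>& uv);

std::vector<std::array<int, 3>>
triangulateOnPlane(const std::vector<Eigen::Vector3d>& points,
                   const Eigen::Vector3d& plane_point,
                   const Eigen::Vector3d& normal)
{
    // Two orthonormal axes spanning the plane (as in the question, via cross products).
    const Eigen::Vector3d e1 = normal.unitOrthogonal();
    const Eigen::Vector3d e2 = normal.normalized().cross(e1);

    // Project every point into plane coordinates; its index is simply its position i.
    std::vector<Eigen::Vector2d> uv;
    uv.reserve(points.size());
    for (const Eigen::Vector3d& p : points)
        uv.emplace_back((p - plane_point).dot(e1), (p - plane_point).dot(e2));

    // The returned triples index into `points` directly, so the triangles already
    // sit on the original 3D positions; no inverse transformation is required.
    return triangulate2D(uv);
}
```

If the inverse mapping is still needed, a point with plane coordinates (u, v) maps back to plane_point + u*e1 + v*e2, which recovers its projection (the component along the normal was discarded during projection).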
I have a PCL Point Cloud. Basically, I need to write some code that does the following:
Example
Basically, I need to build a graph/edge map of the point cloud, where each node represents a point and each point has pointers/edges to its neighbouring points. Preferably, it should not be able to form a corner edge as seen in the picture. (This could be enforced by requiring that a point cannot have a large change in the l1 norm either (taxicab distance, summing all axes), not just the l2 norm.)
I need this because it's useful for all my other algorithms: normal computation, etc.
I'm currently at a loss as to how to implement this. My point cloud is unorganized. I could sort it into a k-d tree, but I'm not sure whether that is relevant here or how I might use it.
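For what it's worth, a minimal sketch of how a k-d tree could be used here, assuming the points lie approximately on a grid with a known spacing `step` (as the picture seems to imply): a radius search slightly larger than `step` returns each point's axis-aligned neighbours, which fills in the adjacency list (the edge map), while diagonal "corner" connections fall outside the radius and are excluded.

```cpp
#include <pcl/kdtree/kdtree_flann.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <vector>

// Build an adjacency list: neighbours[i] holds the indices of points connected to point i.
// `step` is the assumed spacing between neighbouring points.
std::vector<std::vector<int>>
buildNeighbourGraph(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& cloud, float step)
{
    pcl::KdTreeFLANN<pcl::PointXYZ> tree;
    tree.setInputCloud(cloud);

    // Radius is > step but < sqrt(2)*step, so direct neighbours are found
    // while diagonal "corner" edges are excluded.
    const float radius = 1.1f * step;
    std::vector<std::vector<int>> neighbours(cloud->size());

    std::vector<int> indices;
    std::vector<float> sq_distances;
    for (std::size_t i = 0; i < cloud->size(); ++i)
    {
        tree.radiusSearch(cloud->points[i], radius, indices, sq_distances);
        for (int j : indices)
            if (static_cast<std::size_t>(j) != i)   // skip the point itself
                neighbours[i].push_back(j);
    }
    return neighbours;
}
```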
The graph/edge map is the same as a triangulation between the vertices.
In your case, as you only want to connect vertices which are close together, Delaunay Triangulation will work.
The edges are the connections between vertices in your graph.
PCL has ConcaveHull, which will triangulate the surface of your vertices, given an alpha value. This alpha value is the maximum radius for each triangle, in your case, half the known distance between diagonal vertices.
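A minimal sketch of that, assuming the input is a pcl::PointCloud<pcl::PointXYZ> and `alpha` is chosen as described above; each entry of the output polygons is a list of indices into the hull cloud, which gives you the edges of the graph.

```cpp
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/surface/concave_hull.h>
#include <vector>

// Triangulate the cloud with an alpha shape; each pcl::Vertices in `polygons`
// lists the vertex indices of one output triangle, i.e. the edges of the graph.
void triangulateWithConcaveHull(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& cloud,
                                double alpha,
                                pcl::PointCloud<pcl::PointXYZ>& hull_points,
                                std::vector<pcl::Vertices>& polygons)
{
    pcl::ConcaveHull<pcl::PointXYZ> hull;
    hull.setInputCloud(cloud);
    hull.setAlpha(alpha);   // e.g. half the known distance between diagonal vertices
    hull.reconstruct(hull_points, polygons);
}
```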
I have a CSV file with longitude and latitude coordinates at various times, and another column that assigns a value from 1-10 to each of the location points. I want to create a contour map of the locations and their values as a visualization. I was thinking of using Mathematica, but the very little programming experience I have is with Python only.
You can try my implementation for geographic maps in PHP at https://contourplot.codeplex.com. It uses a Delaunay triangulation and linear interpolation along the edges of the triangles. It also uses two colors to show the difference between local and statewide z-values of the triangles, and the triangles and isolines map nicely onto the border (from the shapefile). Some shapes with extreme concavities, holes and islands can be a problem. Another algorithm is CONREC by Paul Bourke. There is also the algorithm from the indiemaps blog, but only for OpenLayers.
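To illustrate the core of that approach (this is a sketch of the general technique, not the linked code): once the points are triangulated, an isoline at level c is traced by linearly interpolating along each triangle edge whose endpoint values straddle c. Assuming a triangle is given by its three vertex positions and z-values:

```cpp
#include <array>
#include <utility>
#include <vector>

struct Vertex { double x, y, z; };   // x/y = position (e.g. lon/lat), z = the 1-10 value

// For one triangle and one contour level c, return the 0 or 2 points where the
// isoline crosses the triangle's edges (degenerate cases ignored for brevity).
std::vector<std::pair<double, double>>
isolineCrossings(const std::array<Vertex, 3>& tri, double c)
{
    std::vector<std::pair<double, double>> crossings;
    for (int i = 0; i < 3; ++i)
    {
        const Vertex& a = tri[i];
        const Vertex& b = tri[(i + 1) % 3];
        // The isoline crosses this edge if c lies strictly between the endpoint values.
        if ((a.z - c) * (b.z - c) < 0.0)
        {
            const double t = (c - a.z) / (b.z - a.z);   // linear interpolation factor
            crossings.emplace_back(a.x + t * (b.x - a.x),
                                   a.y + t * (b.y - a.y));
        }
    }
    return crossings;   // joining these segments across all triangles yields the contour line
}
```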
I want to create a Voronoi diagram on several pairs of latitudes/longitudes, but want to use the great circle distance between them, not the (inaccurate) Pythagorean distance.
Can I make qhull/qvoronoi or some other Linux program do this?
I considered mapping the points to 3D, having qvoronoi create a 3D Voronoi diagram[1], and intersecting the result with the unit sphere, but I'm not sure that's easy.
[1] I realize the 3D distance between two latitudes/longitudes (the "through the Earth" path) isn't the same as the great circle distance, but it's easy to prove that this transformation preserves relative distances, which is all that matters for a Voronoi diagram.
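For reference, the 3D embedding mentioned above is just the standard latitude/longitude to unit-sphere conversion; a small sketch, assuming the coordinates are in degrees:

```cpp
#include <array>
#include <cmath>

// Convert latitude/longitude (degrees) to a point on the unit sphere.
// These x/y/z triples are the 3D input sites that would be handed to qvoronoi.
std::array<double, 3> latLonToUnitSphere(double lat_deg, double lon_deg)
{
    const double deg = M_PI / 180.0;
    const double lat = lat_deg * deg;
    const double lon = lon_deg * deg;
    return { std::cos(lat) * std::cos(lon),
             std::cos(lat) * std::sin(lon),
             std::sin(lat) };
}
```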
I assume you've found this article. From that, it seems like you have the right idea by using a 3D embedding. Your question is then how to intersect the result with the sphere.
First of all, you need to consider how you're going to represent the Voronoi diagram. If you want to work in lat/long coordinates in a 2D plane, then your Voronoi diagram will contain curved edges, so maybe it is best to just use a 3D representation.
If you use a program like qvoronoi, you should in theory only need the infinite hyperplane data (generated by Fo). This gives you the equation of each plane and the two points it corresponds to. Usually you only need the Voronoi diagram to test for inclusion within regions, and the hyperplanes should be enough for that.
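If all you need is that inclusion test, there is an even simpler route than parsing the hyperplane output (a deliberate alternative, not what qvoronoi gives you directly): by definition a query point lies in the Voronoi region of its nearest site, and on the unit sphere the nearest site by great-circle distance is the one with the largest dot product. A sketch, reusing the latLonToUnitSphere conversion above:

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Return the index of the Voronoi region (site) containing the query point.
// On the unit sphere, maximising the dot product with the query point is
// equivalent to minimising great-circle distance to it.
std::size_t containingRegion(const std::array<double, 3>& query,
                             const std::vector<std::array<double, 3>>& sites)
{
    std::size_t best = 0;
    double best_dot = -2.0;   // dot products on the unit sphere lie in [-1, 1]
    for (std::size_t i = 0; i < sites.size(); ++i)
    {
        const double d = query[0] * sites[i][0]
                       + query[1] * sites[i][1]
                       + query[2] * sites[i][2];
        if (d > best_dot) { best_dot = d; best = i; }
    }
    return best;
}
```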
See also this question: Algorithm to compute a Voronoi diagram on a sphere?