Determining closest neighbor in 3 dimensional space [closed]

I'm trying to find out the closest neighbor of a point A in a 3 dimensional space.
There are several points present in point A's neighborhood.
I would now like to know which point is closest to point A.
I can calculate the differences in X, Y and Z between point A and each other point, but I don't know how to combine them into a single distance.
Let's say point A's X coordinate differs from point B's X coordinate by 0 units, its Y coordinate differs by 1 unit, and its Z coordinate differs by 1 unit.
At first I thought I could simply add these three differences into a single distance variable, but if point C's X difference is 0, its Y difference is 0 and its Z difference is 2, I can't tell whether point B or point C is closer to point A.
Could anybody share his ideas about this problem?
Thank you.

The distance from point A to B in 3 dimensional space is calculated as follows:
distance = sqrt((B.x-A.x)^2 + (B.y-A.y)^2 + (B.z-A.z)^2)
To find the minimum you have to iterate over your points. Let's say candidates is the set of points among which you want to find the closest neighbor, and the point neighbor will end up holding the closest point to point a.
Point a = new Point(0, 0, 0);
Point neighbor = null;
double min = Double.MAX_VALUE;                   // smallest distance found so far
for (Point p : candidates) {
    // Euclidean distance from a to the candidate point p
    double distance = Math.sqrt((p.x - a.x) * (p.x - a.x)
                              + (p.y - a.y) * (p.y - a.y)
                              + (p.z - a.z) * (p.z - a.z));
    if (distance < min) {
        min = distance;
        neighbor = p;
    }
}
If you don't need to return the actual distance value, you can skip the sqrt and compare squared distances instead, saving this expensive operation.

This is answered in math.stackexchange already:
https://math.stackexchange.com/questions/42640/calculate-distance-in-3d-space
Summary: The distance formula is Math.sqrt(dx*dx + dy*dy + dz*dz) where dx, dy, dz are the component coordinate differences.
To find the minimum, you don't need to take the square root, but you do still need to square the coordinate differences before summing them (this works because sqrt is monotonic: for a, b > 0, sqrt(a) > sqrt(b) exactly when a > b).
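For instance, a minimal sketch of that squared-distance comparison (assuming the same hypothetical Point class and point a as in the loop above):

Point neighbor = null;
double minSq = Double.MAX_VALUE;                 // smallest squared distance seen so far
for (Point p : candidates) {
    double dx = p.x - a.x, dy = p.y - a.y, dz = p.z - a.z;
    double distSq = dx * dx + dy * dy + dz * dz; // no sqrt needed just to compare
    if (distSq < minSq) {
        minSq = distSq;
        neighbor = p;
    }
}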

Related

Circle curve angle [closed]

So I'm trying to make an animation of 2 objects orbiting around one object that we will call the sun. The distance from object 1 to the sun is 2 units, and it turns by a constant angle as it moves along.
I assume that the farther away from the sun an object is, the smaller the angle (and so the bigger the circle), but how would you calculate this angle depending on the distance? Here is a picture:
So let's just talk about how to calculate the X, Y coordinates of an object moving around the origin at a constant distance D and with angular velocity W (angular velocity is the number of degrees per second).
The angle Q that our object will make with the ray beginning at the origin and pointed at the positive X-axis is given by Q(t) = Q0 + Wt, where Q0 is the angle the object makes at time t = 0 (the initial condition). If we assume that the object begins immediately to the right of the origin, Q0 = 0, for instance.
The X, Y coordinates of the object at time t can be found using trigonometry on Q(t):
X(t) = D * cos(Q(t)) = D * cos(Q0 + Wt)
Y(t) = D * sin(Q(t)) = D * sin(Q0 + Wt)
If you have two objects at different distances from the origin/sun, then for the same angular velocity W, the object closer to the origin/sun will move with a slower speed than the one farther away. This is because to move the same number of degrees around a larger circle, the object farther away has longer actual distance to go in the same time. Assuming that the angular velocity is being measured in degrees per second, the object's speed in D's distance units per second can be found as follows:
V = (angular velocity / 360 degrees) * (circumference of circle)
= (W/360) * (2*PI*D)
= PI*D*W/180
So, if you wanted V to be constant rather than W, you could solve this for W in terms of your desired V and D.
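A rough sketch of these formulas (assuming Java, angles in degrees, and a hypothetical helper name orbitPosition):

// Position of an object orbiting the origin at distance d (units), with
// angular velocity w (degrees per second) and initial angle q0 (degrees).
static double[] orbitPosition(double d, double w, double q0, double t) {
    double q = Math.toRadians(q0 + w * t);                      // Q(t) = Q0 + W*t, in radians
    return new double[] { d * Math.cos(q), d * Math.sin(q) };   // X(t), Y(t)
}
// e.g. orbitPosition(2.0, 45.0, 0.0, t) for the object 2 units from the sun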

How to calculate the signed angle between 2 vectors with a given axis normal in 3D? [closed]

Suppose I have two vectors, A and B, and an axis (a normalized vector). How do I find the angle between A and B such that the angle difference between A after rotation(axis, angle) and B with respect to the given axis is 0? A doesn't have to be equal to B after the rotation. Basically I want to find the angle difference between A and B in a specified plane.
Note: this is different than finding the shortest angle between 2 vectors since the axis is not the cross product between A and B. Thus, technique here (and many SO answers) does not apply. This needs to work in 3D.
I don't think the problem has a solution unless both A and B are the same length and A and B both make the same angle (in the usual sense of shortest angle between vectors) with the axis. I will assume that these are given.
In that case, one solution would be to compute the orthogonal projection of both A and B into a plane that is orthogonal to the axis. This could be done by subtracting the component that is in the direction of the axis. So if I have a unit vector in the direction of the axis and call it X, the computation would be something like
Aproj = A - dot(A, X)X
Bproj = B - dot(B, X)X
Then the angle between Aproj and Bproj (in the usual sense of shortest angle) is the angle of rotation around the axis that you are asking about.
I'm not sure if this is the simplest way to compute it, but it should work pretty generally.
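A rough sketch of that projection approach (assuming Java, plain double[3] vectors, and hypothetical helper names; the final sign step uses a cross product against the axis, which goes slightly beyond the projection idea described above):

static double dot(double[] u, double[] v) { return u[0]*v[0] + u[1]*v[1] + u[2]*v[2]; }
static double[] sub(double[] u, double[] v) { return new double[]{ u[0]-v[0], u[1]-v[1], u[2]-v[2] }; }
static double[] scale(double[] u, double s) { return new double[]{ u[0]*s, u[1]*s, u[2]*s }; }
static double norm(double[] u) { return Math.sqrt(dot(u, u)); }
static double[] cross(double[] u, double[] v) {
    return new double[]{ u[1]*v[2]-u[2]*v[1], u[2]*v[0]-u[0]*v[2], u[0]*v[1]-u[1]*v[0] };
}

// Angle from A to B around the unit axis X, measured in the plane orthogonal to X.
static double angleAroundAxis(double[] a, double[] b, double[] x) {
    double[] aProj = sub(a, scale(x, dot(a, x)));    // Aproj = A - dot(A, X) X
    double[] bProj = sub(b, scale(x, dot(b, x)));    // Bproj = B - dot(B, X) X
    double unsigned = Math.acos(dot(aProj, bProj) / (norm(aProj) * norm(bProj)));
    // Optional sign: positive if rotating from Aproj to Bproj is counter-clockwise about X
    double sign = Math.signum(dot(cross(aProj, bProj), x));
    return sign == 0 ? unsigned : sign * unsigned;
}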
The dot product gives the cosine of the angle between A and B.
In Fortran, something like: angleAB = DACOS(DOT(A/|A|, B/|B|)).
A cross product gives a vector orthogonal to A and B.
The projection of the cross-product vector onto the plane (or axis) should get you there when combined with the angle angleAB. There will probably be a sine or cosine in there.

Circumcenter coordinates for a isosceles triangle [closed]

I need to calculate the circumcenter coordinates (or at least I hope they're called that) at point C for an isosceles triangle (the circle must be such that the created triangle is isosceles). I know the point O (origin) and two vectors p and q (their lengths may differ) originating at that point (leading to points P and Q). I also know the radius r of this to-be-circumscribed circle. When the circle's center is known, it should create the green highlighted isosceles triangle. Here is a drawing for better understanding:
Update (solution):
Calculate the lengths of the p and q vectors
Normalize them both, and add them together
Normalize this sum again to get the direction of the OC vector
Finally, extend the OC vector from the point of origin O to a length equal to the radius r
Thinking geometrically:
normalise vectors p and q, i.e. p = p / |p|, q = q / |q|
add them together
normalise the result
multiply that by r - this is the vector OC
add to O
Steps 1 - 3 simply produce the bisection of the vectors p and q
EDIT this is simplified somewhat compared to my original answer.
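A small sketch of those steps (assuming Java, 2D double arrays for the points and vectors, and a hypothetical helper name circleCenter):

// Center C of the circumscribed circle: normalize p and q, add them,
// normalize the sum, scale it by r and offset it from O.
static double[] circleCenter(double[] o, double[] p, double[] q, double r) {
    double lp = Math.hypot(p[0], p[1]);          // |p|
    double lq = Math.hypot(q[0], q[1]);          // |q|
    double bx = p[0] / lp + q[0] / lq;           // bisector direction (not yet unit length)
    double by = p[1] / lp + q[1] / lq;
    double lb = Math.hypot(bx, by);
    return new double[] { o[0] + r * bx / lb,    // O + r * unit bisector = C
                          o[1] + r * by / lb };
}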
The first equation of your system is:
(x_c-x_o)^2 + (y_c-y_o)^2 = r^2
The second one is more convoluted. You must intersect the circumference
(x-x_c)^2+(y-y_c)^2 = r^2
with your two vectors, whose equations are respectively
y = (Q_y/Q_x)*x and y = (P_y/P_x)*x
This gives you the two points of intersection P and Q as functions of x_c and y_c. Now force the distances OP and OQ to be equal (you want an isosceles triangle), and you have your second equation.
Solve the two-equation system and you have the formulas for x_c and y_c.
Assuming I did the math right, the solution is:
x_c = ((a+b)^2 * r^2) / ((a+b)^2+4)
y_c = (-2*(a+b) * r^2) / ((a+b)^2+4)
where
a = p_y / p_x
b = q_y / q_x

How to find the 3rd coordinate of a triangle [closed]

This is not homework. We are trying to build double connection lines between circles for a project.
Given a triangle of any type (because it will be rotated)
AB is known
AC is known
BC is known
Where
AB is equal to BC (they are both the radius of the circle)
Point A is (x1,y1) and is known. It is the center point of the circle.
Point B is (x2,y2) and is known. It is the point on the edge of the circle that connects to the center of a remote circle.
Point C is unknown (x3,y3) and is what we are trying to figure out. I THINK we need to use the law of cosines, but it's not working out so far.
Thanks to anyone who can help!
You have much more info than you need to get the answer, and it has nothing to do with the law of cosines.
Basically you only need A, B, AC, and BC.
You draw a circle with A as the center and AC as the radius.
You draw another circle with B as the center and BC as the radius.
These two circles will have two intersecting points, and they are the two possible location of C
put it in math:
you have a system of two quadratic equations:
(x-x1)^2 + (y-y1)^2 = AC^2
(x-x2)^2 + (y-y2)^2 = BC^2
and you need to get (x, y) from these two equations
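A sketch of solving that system directly as a circle-circle intersection (assuming Java and a hypothetical helper name thirdVertex; it returns both candidate locations for C and assumes the circles do intersect):

// Intersect the circle centered at A=(x1,y1) with radius AC and the circle
// centered at B=(x2,y2) with radius BC.
static double[][] thirdVertex(double x1, double y1, double x2, double y2,
                              double ac, double bc) {
    double dx = x2 - x1, dy = y2 - y1;
    double d = Math.hypot(dx, dy);                        // distance |AB|
    double a = (d * d + ac * ac - bc * bc) / (2 * d);     // distance from A to the chord midpoint
    double h = Math.sqrt(ac * ac - a * a);                // half the chord length
    double mx = x1 + a * dx / d, my = y1 + a * dy / d;    // chord midpoint
    return new double[][] {
        { mx + h * dy / d, my - h * dx / d },             // first possible C
        { mx - h * dy / d, my + h * dx / d }              // second possible C
    };
}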
You can use the law of cosines, since you know the lengths of the three sides of the triangle (AB), (BC) and (AC). The law of cosines states that
(BC)^2 = (AC)^2 + (AB)^2 - 2 (AC)(AB) cos theta
where theta is the internal angle of the triangle at vertex A. Rearranging this gives
theta = acos(((BC)^2 - (AC)^2 - (AB)^2)/(-2 (AC)(AB)))
then your answer is (in vector notation):
(x,y) = (x1,y1) + (AC)*(v1,v2)
where (v1,v2) is the unit vector in the direction from A to C. (i.e., in scalar notation, x=x1+(AC)*v1 and y=y1+(AC)*v2). We can obtain v1 and v2 by rotating the unit vector from A to B by the angle theta:
v1 = (cos(theta)*(x2-x1) + sin(theta)*(y2-y1))/(AB)
v2 = (cos(theta)*(y2-y1) - sin(theta)*(x2-x1))/(AB)
Flip the sign of theta to get the other of the two solutions.
Note that one can avoid ever calculating theta by observing that:
cos(theta) = ((BC)^2 - (AC)^2 - (AB)^2)/(-2 (AC)(AB))
sin(theta) = sqrt(1 - (((BC)^2 - (AC)^2 - (AB)^2)/(-2 (AC)(AB)))^2)
which may be faster to evaluate than the trigonometric functions.
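A compact sketch of the rotation approach above (assuming Java and a hypothetical helper name thirdVertexByRotation; negate sinTheta to get the second solution):

// Third vertex C: rotate the unit vector from A to B by theta and scale by AC.
static double[] thirdVertexByRotation(double x1, double y1, double x2, double y2,
                                      double ab, double ac, double bc) {
    double cosTheta = (bc * bc - ac * ac - ab * ab) / (-2 * ac * ab);
    double sinTheta = Math.sqrt(1 - cosTheta * cosTheta);            // use -sinTheta for the other solution
    double v1 = (cosTheta * (x2 - x1) + sinTheta * (y2 - y1)) / ab;  // unit vector from A towards C
    double v2 = (cosTheta * (y2 - y1) - sinTheta * (x2 - x1)) / ab;
    return new double[] { x1 + ac * v1, y1 + ac * v2 };
}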

Calculating distance between two points using pythagorean theorem [closed]

I'd like to create a function that calculates the distance between two pairs of lat/longs using the pythag theorem instead of the haversine great-circle formula. Since this will be over relative short distances (3km), I think this version that assumes a flat earth should be OK. How can I do this? I asked the internet and didn't come up with anything useful. :)
Thanks.
EDIT:
Here's what I came up with (seems to be working):
def get_dist(lat0, lng0, lat1, lng1)
  begin
    # east-west difference, scaled by cos(latitude); latitude converted to radians for Math.cos
    d_ew = (lng1.to_f - lng0.to_f) * Math.cos(lat0.to_f * Math::PI / 180)
    # north-south difference
    d_ns = (lat1.to_f - lat0.to_f)
    # distance in degrees ("latitude units")
    d_lu = Math.sqrt(d_ew * d_ew + d_ns * d_ns)
    # convert degrees to miles using the Earth's mean radius in miles
    d_mi = ((2 * Math::PI * 3961.3) / 360) * d_lu
    return d_mi
  rescue Exception => ex
    logger.debug "[get_dist] An exception occurred: #{ex.message}"
    return -1
  end
end
You can use a simple pythagoras triangle if you expect the distances involved to be small compared with the size of the Earth.
Suppose you are at (lat0, long0) and you want to know the distance to a point (lat1, long1) in "latitude units".
Horizontal (EW) distance is roughly
d_ew = (long1 - long0) * cos(lat0)
This is multiplied by cos(lat0) to account for longitude lines getting closer together at high latitude.
Vertical (NS) distance is easier
d_ns = (lat1 - lat0)
So the distance between the two points is
d = sqrt(d_ew * d_ew + d_ns * d_ns)
You can refine this method for more exacting tasks, but this should be good enough for comparing distances.
In fact, for comparing distances, it will be fine to compare d squared, which means you can omit the sqrt operation.
Well, since your points are near each other, the surface of the sphere is almost flat, so just find the coordinates of the points in 3D space, so find (x,y,z) for each of the points, where
x = r*sin(lat)*cos(long)
y = r*sin(lat)*sin(long)
z = r*cos(lat)
where r is the radius of the sphere.
or something like that, depending on how you define lat/long. Once you have the two xyz coords, just use sqrt((x1-x2)^2+(y1-y2)^2+(z1-z2)^2). You really can't just use a 2D Pythagorean theorem since you would need to get reasonable 2D coordinates, which is hard.
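A sketch of that 3D approach (assuming Java, angles in radians, the convention above where lat is measured from the pole, and a hypothetical helper name chordDistance):

// Straight-line (chord) distance between two points on a sphere of radius r,
// using the x/y/z formulas above (lat measured from the pole, angles in radians).
static double chordDistance(double lat0, double lng0, double lat1, double lng1, double r) {
    double x0 = r * Math.sin(lat0) * Math.cos(lng0), x1 = r * Math.sin(lat1) * Math.cos(lng1);
    double y0 = r * Math.sin(lat0) * Math.sin(lng0), y1 = r * Math.sin(lat1) * Math.sin(lng1);
    double z0 = r * Math.cos(lat0),                  z1 = r * Math.cos(lat1);
    double dx = x1 - x0, dy = y1 - y0, dz = z1 - z0;
    return Math.sqrt(dx * dx + dy * dy + dz * dz);
}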
You will commonly see the notation 'dy, dx', which stands for difference in y and difference in x. You simply work out the differences on both axes, then take the square root of the sum of both differences squared, as per the theorem (the square of the hypotenuse is equal to the sum of the squares of the other two sides).
var dx:Number = x1-x2;
var dy:Number = y1-y2;
var distance:Number = Math.sqrt(dx*dx + dy*dy);
Hope this is clear enough
