Calculate bearing from lateral and longitudinal speeds - math

How can I calculate the bearing from a relative "origin" using lateral and longitudinal speeds?
For example, if the lateral speed is 0 meters a second and the longitudinal speed is positive, that would mean the bearing is 0 degrees from the "origin", but if the longitudinal speed is negative, that would indicate the bearing is 180 degrees from the "origin". This scenario is simple. (I think, laughs at self).
Now let's make things interesting. The longitudinal speed is still positive, say 30.0 meters a second, and my lateral speed is -0.05 meters a second. That would indicate my bearing is angled ever so slightly "left of origin". But specifically, what degree?
Is there a formula to calculate the bearing from two speeds?
Thanks!

After digging into the trigonometry trenches, I found a solution.
Given lon/lat speeds, create a 90 degree triangle. In this scenario the hypotenuse doesn't matter.
It boils down to this (in Python, for folks)...
import math

fraction = a / b  # a and b are the sides of the projection that form the 90 degree angle
if b < 0:
    fraction = b / a
bearing = 360 - (90 + math.degrees(math.atan(fraction)))  # atan returns radians, so convert
Using that bearing and a distance, you can project a point.
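For what it's worth, a more direct route (my own sketch, not part of the solution above) is atan2, which handles all four quadrants and a zero longitudinal speed without special-casing; this assumes positive lateral speed means "to the right" and bearing is measured clockwise from straight ahead:
import math

def bearing_from_speeds(longitudinal, lateral):
    # atan2 takes care of the signs/quadrants, including longitudinal == 0
    return math.degrees(math.atan2(lateral, longitudinal)) % 360

# e.g. bearing_from_speeds(30.0, -0.05) -> ~359.9 degrees, i.e. ever so slightly left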

Related

Getting limits on latitude and longitude

I have a service that looks for nearby locations (within 300 m) of a user-specified point.
I'm using the haversine formula to check if a location is near the point
https://en.wikipedia.org/wiki/Haversine_formula
My problem is that it's slow since it's checking against all of the points in my DB.
What I want to do is limit the initial query and apply the haversine formula to a list of points in a smaller bounded area
e.g.
results = (SELECT * FROM location
           WHERE location.latitude BETWEEN 14.223 AND 14.5
           AND location.longitude BETWEEN 121.5 AND 122)
haversine(results, user_point)
Is there a loose way of getting the bounds from a given point?
Or basically a dirty conversion of lat/long to meters?
If you can modify your database structure, there's one super-easy way to do it: instead of (or in addition to) storing latitude and longitude, convert your location coordinates into 3D space, with columns for x, y, and z in meters. Then you can just do
SELECT * FROM location
WHERE location.x BETWEEN center.x - 300 AND center.x + 300
AND location.y BETWEEN center.y - 300 AND center.y + 300
AND location.z BETWEEN center.z - 300 AND center.z + 300
That will trim down your list pretty well, and you can do the haversine calculation on the resulting set.
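As a rough sketch, the conversion could look like this (spherical-earth approximation; the radius constant and function name are my own):
import math

EARTH_RADIUS_M = 6371000.0   # mean earth radius, spherical approximation

def to_xyz(lat_deg, lon_deg):
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    x = EARTH_RADIUS_M * math.cos(lat) * math.cos(lon)
    y = EARTH_RADIUS_M * math.cos(lat) * math.sin(lon)
    z = EARTH_RADIUS_M * math.sin(lat)
    return x, y, z
Since the straight-line (chord) distance between two points is never longer than the distance along the surface, the 300 m box on x/y/z won't throw away any valid candidates.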
If you're stuck with using a database that has only longitude and latitude in it, you can still narrow down the search. It's easy for latitude: one degree of latitude due north or south always corresponds to 111 km of distance, as long as you ignore the complications that arise when you get close to the poles. That means a distance of 300 m is 0.0027... degrees of latitude, although you might as well be a bit conservative and use 0.003 or 0.004.
Longitude is a bit trickier because the conversion factor changes depending on how far north or south you are, but it's still not too complicated: you just multiply by the cosine of the latitude.
distance = cos(latitude) * 111.19... km/degree * delta_angle
At the equator, it's the same as with latitude: one degree change in longitude at the equator is 111 km. At 80 degrees north (or south), you multiply by a factor of cos(80 degrees) = 0.17..., with the result that 1 degree change in longitude is only 19.3 km. For your purposes, you could invert this and find the range of longitudes to select as 300 m / cos(latitude) / (111.19... km/degree) = (0.0027... degrees) / cos(latitude). That coefficient is the same quantity from the first paragraph; it's not a coincidence.
The tricky problems come up near the discontinuities of the coordinate system, for example when you get near the poles. You can see why when you start plugging in latitudes like 89.9996 degrees:
0.0027... degrees / cos(89.9996 degrees) = 386... degrees
Well, how can that be when there are only 360 degrees in a whole circle? This is an indicator that you've gotten to the point where your 300 m radius extends all the way around the pole and comes back to include your starting location, in a manner of speaking. At that point, you might as well just search all points in your database close enough to the pole. Of course you should really start doing this at 89.999 degrees or so, because that's where the 600 m diameter of the region you're searching just about encircles the pole completely.
There's another issue at (well, near) the International Date Line, or more precisely the "antimeridian", having to do with the jump from -180 to +180 degrees of longitude. A point at +179.9999 degrees and one at -179.9999 degrees, both on the equator, will have very different coordinates even though they are geographically just a few meters apart. Since you're just doing this as a preliminary filter for a more detailed search, it's probably easiest to just pass through every point within 0.006 degrees (that's roughly the diameter of a 300 m-radius circle) of the antimeridian, and then the haversine calculation will determine whether the points are actually close.
To sum up, you can use the bounds on latitude and longitude I mentioned above and just add special cases for the poles and the antimeridian. In some kind of pseudo-SQL/code hybrid:
IF abs(center.latitude) > 89.999
    SELECT * FROM location
    WHERE abs(location.latitude - center.latitude) < 0.003
ELSE
    IF abs(center.longitude) > 179.997
        SELECT * FROM location
        WHERE abs(location.latitude - center.latitude) < 0.003
        AND 180 - abs(location.longitude) < (0.006 / cos(center.latitude))
    ELSE
        SELECT * FROM location
        WHERE abs(location.latitude - center.latitude) < 0.003
        AND abs(location.longitude - center.longitude) < (0.003 / cos(center.latitude))
    ENDIF
ENDIF
If you want a pithy statement at the expense of having to test potentially twice as many points, you can compare only the absolute values of longitude:
SELECT * FROM location
WHERE abs(location.latitude - center.latitude) < 0.003
AND abs(abs(location.longitude) - abs(center.longitude)) <= min(0.003 / cos(center.latitude), 180)
Approximating the earth with a sphere, the distance covered by one degree of latitude can be calculated by
dPerLat = pi * r / 180°,
where r is the radius of the earth. This will be about 111 km.
So, if your reference point is (lat, long) and your search radius is d then you want to search for latitudes in the range
lat* ∈ [lat - d / dPerLat, lat + d / dPerLat]
Then, for a given latitude, the distance of consecutive longitudes is:
dPerLong = pi * r * cos(lat) / 180°
Again, the range of longitudes to search is +- d / dPerLong. You should use the lat value that gives you a conservative (maximal) range, i.e. the lat value with the highest absolute value.
Be careful at the poles.
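A minimal Python sketch of those formulas (the spherical radius and the example point are assumptions on my part):
import math

r = 6371000.0              # earth radius in meters
d = 300.0                  # search radius in meters
lat, lon = 14.35, 121.75   # example reference point in degrees

d_per_lat = math.pi * r / 180.0                                # ~111 km per degree
d_per_lon = math.pi * r * math.cos(math.radians(lat)) / 180.0  # shrinks toward the poles

lat_range = (lat - d / d_per_lat, lat + d / d_per_lat)
lon_range = (lon - d / d_per_lon, lon + d / d_per_lon)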

Trying to find lat lon given original lat lon + 3 meters

I have this problem I have to solve.
I am given a coordinate lat/lon, and I need to find a random point within 3 meters of this original point. Approximations are good, but all I could find was this https://gis.stackexchange.com/questions/2951/algorithm-for-offsetting-a-latitude-longitude-by-some-amount-of-meters that has a 10 meter error. Thank you.
Not sure what "find" and "random" mean in this question.
The earth is about 10 million meters from the equator to either pole (that's actually how they defined the size of the meter at first; it's been modified slightly since). The width of latitude lines doesn't vary, so one meter north or south is always one ten-millionth of 90 degrees, or 9e-6 degrees. Just multiply that by the north-south displacement in meters of your desired point from the initial point and you'll get the number of degrees to add to the initial latitude: delta_lat = y_meters * 9e-6.
The width of longitude lines does vary, but it works out as simply east-west displacement in meters * 9e-6 = delta_lon * cos(lat), which means you can use the distance from your initial point to figure the east-west difference in degrees: delta_lon = x_meters * 9e-6/cos(lat).
You'll have to be careful with that last part around the poles, because cos(lat) will approach zero. Navigational systems use quaternions to do these things because they don't have singularities in spherical coordinates.
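Putting that together in Python (a sketch; the uniform-disk sampling is my own reading of "random point within 3 meters"):
import math
import random

def random_point_within(lat, lon, radius_m=3.0):
    # sample a uniformly distributed offset inside a disk of radius_m meters
    r = radius_m * math.sqrt(random.random())
    theta = random.uniform(0.0, 2.0 * math.pi)
    east_m, north_m = r * math.cos(theta), r * math.sin(theta)
    delta_lat = north_m * 9e-6                               # 1 m north/south ~ 9e-6 degrees
    delta_lon = east_m * 9e-6 / math.cos(math.radians(lat))  # east/west shrinks with latitude
    return lat + delta_lat, lon + delta_lon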

Finding if an angle lies between two points

This is basically just a math question.
Here's what I am having trouble with... I am having a difficult time coming up with how to phrase the question, so bear with me. Basically I think I need to use some advanced math to accomplish this, but I do not know what I need.
I will use some illustrations to make this clear. Spam prevention doesn't let me post pictures... Here's a simple concept image though: http://radleygh.com/images/gimp-2_2011-057-00-57-26-40.bmp
Objective: Determine if several objects lie within a cone on a 2D plane
Cone Properties:
Position (x, y)
Angle (0-359)
Spread (0-359, aka Width)
Distance (0++)
I can determine the brownish lines using a simple bit of math:
Angle_A = Angle + (Spread / 2)
Angle_B = Angle - (Spread / 2)
Angle_Target = Point_Direction(origin, object_position)
Now I thought of comparing these with the position of each object with a simple if/then statement:
If (Angle_A > Angle_Target) && (Angle_B < Angle_Target) Then Angle_Target is between A and B
This works... until Angle_A or Angle_B pass the 0-360 threshold. 0° is between 45° and 315°... but the above if statement wouldn't work. We can then determine which direction to check based on the size of the cone...
And what if the cone is larger than a 180° cone?
I'm not sure of the answer. I'm pretty sure I should be using radians... but I do not understand the concept of radians. If someone can point me in the right direction, perhaps show me an example somewhere, that would be wonderful!
I will continue to do my own research in the mean time.
You may consider a simple transformation which sets your coordinate system such that Angle_B is zero. In other words, instead of testing
Angle_B < Angle_Target < Angle_A
you may also use
0 < Angle_Target - Angle_B < Angle_A - Angle_B
If you apply a modulo 360° to all terms, your logic should work:
0 < (Angle_Target - Angle_B) % 360 < (Angle_A - Angle_B) % 360
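In Python that might look like the following (a sketch; the function name is mine, and the distance check from the cone's properties is left out):
def angle_in_cone(angle_target, angle, spread):
    # all angles in degrees; the modulo keeps the test valid across the 0/360 wrap
    # (a full 360 degree spread would need its own special case)
    angle_a = angle + spread / 2.0
    angle_b = angle - spread / 2.0
    return (angle_target - angle_b) % 360 < (angle_a - angle_b) % 360

# e.g. angle_in_cone(350, 0, 90) -> True, angle_in_cone(100, 0, 90) -> False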
One radian is the angle made by tracing along a circle's circumference a length equal to that circle's radius. Hence there are exactly 2*PI radians in a full circle.
So 2*PI radians = 360 degrees
So to convert degrees to radians, multiply by 2 * PI, then divide by 360. (Or of course, multiply by PI, divide by 180).
However, whether you work in radians or degrees should only be dictated by the library you are using. Even then, you could write wrappers which do the above calculations.
But on to the main part of your question. Consider that:
sin (theta) = sin (360 + theta).
cos (theta) = cos (360 + theta).
etc.
So if you come across your cone that goes through 0 degrees, simply add 360 to both angles of the cone.
e.g. if your cone goes from -10 to +20, simply use 350 to 380 instead.
And of course, when you test an angle, make sure you also add 360 to that and test both the original and added angles.
e.g. testing +5 (which is in your cone), you would test 5 (which fails) then 365 (which passes).
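A small sketch of that approach (the helper name is mine; low and high are the two cone edges in degrees, and I also test target - 360 in case the edges are given as negative angles like -10 to 20):
def in_cone(target, low, high):
    # if the edges are given across the 0/360 wrap (e.g. 350 and 20), shift the upper edge
    if high < low:
        high += 360
    # test the angle itself and the angle shifted by +/- 360, as described above
    return any(low <= t <= high for t in (target - 360, target, target + 360))

# e.g. in_cone(5, 350, 20) -> True: 5 fails directly, but 365 lies in 350..380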
Good luck!

Translating radians to degrees

I noticed that translating radians to degrees and vice versa is like translating a percentage to a whole number and vice versa. For example, to get 60 percent of 345 you do the following
60 * 345/100
to convert 60 degrees to radians you do
60 * 3.14/180
There is a pattern there BUT... we use 100 to compare percentages to a number. So, why do we use 180 degrees instead of 360 degrees to compare degrees to radians?
100 percent = a whole number
360 degrees represents a whole circle
using 180 degrees is like using 50% instead of 100%
I hope I am making some sense. Can anyone answer? Thanks
The reason you use 180 degrees instead of 360 is that there are 2*pi radians in a circle, not pi. Thus you divide both 360 and 2*pi by 2 and get pi and 180.
In Mathematica, I use the handy predefined Degree constant for conversions, which is defined as Pi/180 or 2 * Pi/360.
The reason there are 2 * Pi radians in a circle is that the size of an angle in radians is the length of the arc of a circle with radius 1 that subtends it. The circumference of a circle with radius 1 is 2 * Pi. In addition to providing a clear geometrical interpretation, using radians also makes a number of other relations much more convenient; cosine is the derivative of sine, and as a result the Maclaurin series for sines and cosines are much simpler than they would be for angles expressed in degrees.
360 degrees = 2 * Pi radians
1 degree = Pi / 180 radians
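In code, the whole conversion boils down to this (Python shown; the standard library's math.radians and math.degrees do exactly the same thing):
import math

def deg_to_rad(degrees):
    return degrees * math.pi / 180.0

def rad_to_deg(radians):
    return radians * 180.0 / math.pi

# math.radians(60) and deg_to_rad(60) both give ~1.0472
# math.degrees(math.pi) and rad_to_deg(math.pi) both give 180.0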
I guess your question is why there are 360 degrees in a circle (or 180 in a semicircle), and why not some other more tenable number like 100.
The answer to that is the origin of degree. If you'd like to use a round figure, check out the gradian unit of angles.
PS: SO is for programming questions only. This is not programming related.
I ask this question because of my lack of paying attention in school. Programming actually is the reason I ask this question, because it is now that I am actually paying attention. Every programming formula uses 180 and PI to translate back and forth instead of 360. Since I hadn't come across any other examples, I assumed that there was only one way. Of course, if I had been reading a regular math book, I would have known differently.
But I understand now. ActionScript uses 180 degrees for clockwise rotation; once 180 is reached, it counts from -180 back down to 0 for a full rotation, which makes a lot more sense if you want your answer to fall in the ±180 degree range. Whether it is negative or positive then determines whether it is traveling up or down along the axes as well. As much as I appreciate the responses, I believe this is absolutely a suitable programming question. For programmers, calculating in degrees is different than it is for your average surveyor.
In a real-life scenario, a measured distance is always considered an absolute value, whereas in programming this is false, which also rationalizes why we use -180 degrees.

Maximum length of a decimal latitude/longitude Degree?

What is the maximum length (in kilometers or miles - but please specify) that one degree of latitude and longitude can have on the Earth's surface?
I'm not sure if I'm being clear enough, so let me rephrase that. The Earth is not a perfect sphere, as we all know, and a change of 1.0 in latitude/longitude at the equator (or in Ecuador) can mean one distance, while the same change at the poles can mean a completely different distance.
I'm trying to shrink down the number of results returned by the database (in this case MySQL) so that I can calculate the distances between several points using the Great Circle formula. Instead of selecting all points and then calculating them individually I wish to select coordinates that are inside the latitude / longitude boundaries, e.g.:
SELECT * FROM poi
WHERE latitude >= 75 AND latitude <= 80
AND longitude >= 75 AND longitude <= 80;
PS: It's late and I feel that my English didn't come out as I expected; if there is something that you are unable to understand please say so and I'll fix/improve it as needed, thanks.
The length of a degree of latitude varies little, from about 110.6 km at the equator to about 111.7 km near the poles. If the earth were a perfect sphere, it would be constant. For purposes like getting a list of points within say 10 km of a known (lat, lon), assuming a constant 111 km should be OK.
However it's quite a different story with longitude. It ranges from about 111.3 km at the equator, 55.8 km at 60 degrees latitude, 1.9 km at 89 degrees latitude to zero at the pole.
You asked the wrong question; you need to know the MINIMUM length to ensure that your query doesn't reject valid candidates, and unfortunately the minimum length for longitude is ZERO!
Let's say you take other folk's advice to use a constant of about 111 km for both latitude and longitude. For a 10 km query, you would use a margin of 10 / 111 = 0.09009 degrees of latitude or longitude. This is OK at the equator. However at latitude 60 (about where Stockholm is, for example) travelling east by 0.09 degrees of longitude gets you only about 5 km. In this case you are incorrectly rejecting about half of the valid answers!
Fortunately the calculation to get a better longitude bound (one that depends on the latitude of the known point) is very simple -- see this SO answer, and the article by Jan Matuschek that it references.
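A rough sketch of that latitude-dependent bound (my own constants and names, using a spherical approximation rather than anything from the linked article):
import math

KM_PER_DEGREE = 111.32   # approximate length of one degree of latitude

def search_bounds(lat, lon, radius_km):
    delta_lat = radius_km / KM_PER_DEGREE
    delta_lon = radius_km / (KM_PER_DEGREE * math.cos(math.radians(lat)))
    return (lat - delta_lat, lat + delta_lat), (lon - delta_lon, lon + delta_lon)

# e.g. at latitude 60 a 10 km radius needs about 0.18 degrees of longitude, not 0.09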
Originally, the definition of a nautical mile was the length of one minute of longitude on the equator. So, there were 360 * 60 = 21,600 nautical miles around the equator. Similarly, the original definition of a kilometer was that 10,000 km = length from pole to equator. Consequently, assuming a spherical earth, there would be:
40,000 ÷ 21,600 = 1.852 km per minute
1.852 × 60 = 111.11 km per degree
Allowing for a spheroidal earth instead of a spherical one will slightly adjust the factor, but not by all that much. You could be pretty confident the factor is less than 1.9 km per minute or 114 km per degree.
If you can use MySQL spatial extensions: http://dev.mysql.com/doc/refman/5.0/en/spatial-extensions.html, you can use its operators and functions both to filter the points and calculate distances. See http://dev.mysql.com/doc/refman/5.0/en/functions-that-test-spatial-relationships-between-geometries.html, specifically the functions contains() and distance().
The reference ellipsoid used for the earth is the WGS84 system, meaning that the earth's radius at the equator is
6 378 137 m or 3963.19 miles
The maximum length of one degree of longitude is reached at the equator and is approximately (upper bound)
111.3195 km or
69.1708 miles
The maximum length of one degree of latitude is reached between the equator and 1°. It is almost exactly equal to the maximum length of one degree of longitude; a first approximation shows that the difference is less than 4.2 meters, yielding
111.3153 km or
69.1682 miles
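For what it's worth, the equatorial figure follows directly from the WGS84 radius quoted above:
import math

equatorial_radius_km = 6378.137                       # WGS84 equatorial radius
km_per_degree = 2 * math.pi * equatorial_radius_km / 360
# km_per_degree is about 111.3195 km, or about 69.17 miles (divide by 1.609344)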
