I'm trying to calculate a point's longitude from another point's longitude. They have the same latitude and the distance between them is known. I'm trying to use the spherical law of cosines formula.
# 'lat' short for 'latitude', 'lng' short for 'longitude'.
# EARTH_RADIUS = 6371000.0, unit is meter.
#
distance = Math.acos( Math.sin(lat1)*Math.sin(lat2) +
Math.cos(lat1)*Math.cos(lat2) *
Math.cos(lng2-lng1)) * EARTH_RADIUS
If the two points' latitudes are equal (lat1 == lat2), I can calculate lng2 from lng1 and the distance. So I derived the following formula from the spherical law of cosines formula:
# lat1 == lat2 == lat
# 'distance' and 'lng' are known
lng2 = Math.acos((Math.cos(distance/EARTH_RADIUS) - Math.sin(lat)*Math.sin(lat))/(Math.cos(lat)*Math.cos(lat))) + lng
This formula works very well, except in some situations. For example:
lat_degrees = -89.8345981836319
lng_degrees = 96.42309331893921
lat = lat1 = lat2 = (lat_degrees * Math::PI)/180 # -1.567909520510494
lng = (lng_degrees * Math::PI)/180 # 1.682900453373236
distance = 67544.06725769254
This raises the error
Math::DomainError: Numerical argument is out of domain - "acos"
because the argument passed to Math.acos is -2.5100189069914602, which is smaller than -1. I have no idea why. Is the derived formula wrong?
There is nothing wrong with your formula. I didn't do the calculations, but the point is that you are (very!!) near the South Pole, and there simply are no two points on that parallel that lie that far apart.
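For illustration, here is a minimal sketch of the same rearranged formula in Python (the question uses Ruby, but the translation is direct), with a feasibility check added; the function name and the clamping step are my own additions, not part of the original code:
from math import acos, cos, sin, pi, radians

EARTH_RADIUS = 6371000.0  # meters

def lng2_from_lng1(lat, lng1, distance):
    # Rearranged spherical law of cosines, valid when lat1 == lat2 == lat.
    # The farthest a second point on the same parallel can be is over the
    # pole, a central angle of (pi - 2*|lat|).
    max_distance = EARTH_RADIUS * (pi - 2 * abs(lat))
    if distance > max_distance:
        raise ValueError("no point on this parallel is that far away "
                         "(maximum is about %.0f m)" % max_distance)
    arg = (cos(distance / EARTH_RADIUS) - sin(lat) ** 2) / cos(lat) ** 2
    # Clamp to guard against tiny floating-point overshoot outside [-1, 1].
    arg = max(-1.0, min(1.0, arg))
    return lng1 + acos(arg)

# With the question's inputs, the maximum is roughly 36.8 km, so the
# requested 67544 m is rejected instead of feeding acos an invalid value.
try:
    lng2_from_lng1(radians(-89.8345981836319),
                   radians(96.42309331893921),
                   67544.06725769254)
except ValueError as err:
    print(err)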
Just add a comparison before it:
if lat1 == lat2 and lng1 == lng2
return 0
end
I'm attempting to implement a position-determination function for an aircraft, to get its latitude, longitude and azimuth.
I attached 3 images summarizing the formulas; as you can see, this is a 5-step trigonometric procedure (Steps 0 to 4), which is what I aim to program: image1; image2; image3
To find the aircraft coordinates, 9 input parameters are defined (image1): the latitude, longitude and altitude of stations U and S; the aircraft altitude; and the 2 slant ranges.
At the end of the problem (image3) we get 3 outputs: aircraft latitude, longitude and azimuth.
This code implements the solution explained in "How can I triangulate a position using two DMEs?" on aviation.se. The code is in Python, which I happen to use instead of C; it's easy to convert into another language as required. I've broken the calculation down into smaller units to make the code more legible and easier to understand.
The problem involves 3 points in 3D space: U and S are the DMEs, A is the aircraft.
Since we just need the coordinates of U and S to determine the coordinates of A, I'm using the coordinates of three well-known DME stations. This allows checking whether the result is correct. View based on the Low Altitude Enroute Chart:
When the program is run, the output is:
CAN: lat 49.17319, lon -0.4552778, alt 82
EVX: lat 49.03169, lon 1.220861, alt 152
P north: lat 49.386910325692874, lon 0.646650777948733, alt 296
P south: lat 48.78949175956114, lon 0.5265322105880027, alt 296
First are the coordinates of points U (CAN DME) and S (EVX DME) we entered, and
then two lines for the two possible locations of the aircraft.
I made another test with DMEs at a longer distance (1241 km for ARE and 557.1 km for GLA), which worked pretty well too:
ARE: lat 48.33264, lon -3.602472, alt 50
GLA: lat 46.40861, lon 6.244222, alt 1000
P north: lat 48.082101174246304, lon 13.210754399535269, alt 10
P south: lat 41.958725412109445, lon 9.470999690780628, alt 10
The actual location of the aircraft is supposed to be SZA navaid, in south of France: Lat 41.937°, lon 9.399°.
from math import asin, sqrt, cos, sin, atan2, acos, pi, radians, degrees
# Earth radius in meters (https://rechneronline.de/earth-radius/)
E_RADIUS = 6367 * 1000 # at 45°N - Adjust as required
def step_0(r_e, h_u, h_s, h_a, d_ua, d_sa):
# Return angular distance between each station U/S and aircraft
# Triangle UCA and SCA: The three sides are known,
a = (d_ua - h_a + h_u) * (d_ua + h_a - h_u)
b = (r_e + h_u) * (r_e + h_a)
theta_ua = 2 * asin(.5 * sqrt(a / b))
a = (d_sa - h_a + h_s) * (d_sa + h_a - h_s)
b = (r_e + h_s) * (r_e + h_a)
theta_sa = 2 * asin(.5 * sqrt(a / b))
# Return angular distances between stations and aircraft
return theta_ua, theta_sa
def step_1(lat_u, lon_u, lat_s, lon_s):
# Determine angular distance between the two stations
# and the relative azimuth of one to the other.
a = sin(.5 * (lat_s - lat_u))
b = sin(.5 * (lon_s - lon_u))
c = sqrt(a * a + cos(lat_s) * cos(lat_u) * b * b)
theta_us = 2 * asin(c)
a = lon_s - lon_u
b = cos(lat_s) * sin(a)
c = sin(lat_s) * cos(lat_u)
d = cos(lat_s) * sin(lat_u) * cos(a)
psi_su = atan2(b, c - d)
return theta_us, psi_su
def step_2(theta_us, theta_ua, theta_sa):
# Determine whether DME spheres intersect
if (theta_ua + theta_sa) < theta_us:
# Spheres are too remote to intersect
return False
if abs(theta_ua - theta_sa) > theta_us:
        # One range sphere is contained inside the other; they don't intersect
return False
# Spheres intersect
return True
def step_3(theta_us, theta_ua, theta_sa):
# Determine one angle of the USA triangle
a = cos(theta_sa) - cos(theta_us) * cos(theta_ua)
b = sin(theta_us) * sin(theta_ua)
beta_u = acos(a / b)
return beta_u
def step_4(ac_south, lat_u, lon_u, beta_u, psi_su, theta_ua):
# Determine aircraft coordinates
psi_au = psi_su
if ac_south:
psi_au += beta_u
else:
psi_au -= beta_u
# Determine aircraft latitude
a = sin(lat_u) * cos(theta_ua)
b = cos(lat_u) * sin(theta_ua) * cos(psi_au)
lat_a = asin(a + b)
# Determine aircraft longitude
a = sin(psi_au) * sin(theta_ua)
b = cos(lat_u) * cos(theta_ua)
c = sin(lat_u) * sin(theta_ua) * cos(psi_au)
lon_a = atan2(a, b - c) + lon_u
return lat_a, lon_a
def main():
# Program entry point
# -------------------
# For this test, I'm using three locations in France:
# VOR Caen, VOR Evreux and VOR L'Aigle.
# The angles and horizontal distances between them are known
# by looking at the low-altitude enroute chart which I've posted
# here: https://i.stack.imgur.com/m8Wmw.png
    # We know their coordinates and altitudes by looking at the AIP France too.
# For DMS <--> Decimal degrees, this tool is handy:
# https://www.rapidtables.com/convert/number/degrees-minutes-seconds-to-degrees.html
# Let's pretend the aircraft is at LGL
# lat = 48.79061, lon = 0.5302778
# Stations U and S are:
u = {'label': 'CAN', 'lat': 49.17319, 'lon': -0.4552778, 'alt': 82}
s = {'label': 'EVX', 'lat': 49.03169, 'lon': 1.220861, 'alt': 152}
# We know the aircraft altitude
a_alt = 296 # meters
# We know the approximate slant ranges to stations U and S
au_range = 45 * 1852 # 1 NM = 1,852 m
as_range = 31 * 1852
# Compute angles station - earth center - aircraft for U and S
# Expected values UA: 0.0130890288 rad
# SA: 0.0090168045 rad
theta_ua, theta_sa = step_0(
r_e=E_RADIUS, # Earth
h_u=u['alt'], # Station U altitude
h_s=s['alt'], # Station S altitude
h_a=a_alt, d_ua=au_range, d_sa=as_range # aircraft data
)
# Compute angle between station, and their relative azimuth
# We need to convert angles into radians
theta_us, psi_su = step_1(
lat_u=radians(u['lat']), lon_u=radians(u['lon']), # Station U coordinates
lat_s=radians(s['lat']), lon_s=radians(s['lon'])) # Station S coordinates
# Check validity of ranges
if not step_2(
theta_us=theta_us,
theta_ua=theta_ua,
theta_sa=theta_sa):
# Cannot compute, spheres don't intersect
        print('Cannot compute, ranges are not consistent')
return 1
# Solve one angle of the USA triangle
beta_u = step_3(
theta_us=theta_us,
theta_ua=theta_ua,
theta_sa=theta_sa)
# Compute aircraft coordinates.
# The first parameter is whether the aircraft is south of the line
# between U and S. If you don't know, then you need to compute
# both, once with ac_south = False, once with ac_south = True.
# You will get the two possible positions, one must be eliminated.
# North position
lat_n, lon_n = step_4(
ac_south=False, # See comment above
lat_u=radians(u['lat']), lon_u=radians(u['lon']), # Station U
beta_u=beta_u, psi_su=psi_su, theta_ua=theta_ua # previously computed
)
pn = {'label': 'P north', 'lat': degrees(lat_n), 'lon': degrees(lon_n), 'alt': a_alt}
# South position
lat_s, lon_s = step_4(
ac_south=True,
lat_u=radians(u['lat']), lon_u=radians(u['lon']),
beta_u=beta_u, psi_su=psi_su, theta_ua=theta_ua)
ps = {'label': 'P south', 'lat': degrees(lat_s), 'lon': degrees(lon_s), 'alt': a_alt}
# Print results
fmt = '{}: lat {}, lon {}, alt {}'
for p in u, s, pn, ps:
print(fmt.format(p['label'], p['lat'], p['lon'], p['alt']))
# The expected result is about:
# CAN: lat 49.17319, lon -0.4552778, alt 82
# EVX: lat 49.03169, lon 1.220861, alt 152
# P north: lat ??????, lon ??????, alt 296
# P south: lat 48.79061, lon 0.5302778, alt 296
if __name__ == '__main__':
main()
I'm trying to find the angle to north (bearing/azimuth) and the distance between 2 GPS coordinates. But obviously I have a mistake somewhere: it gives me wrong bearing and distance values. Please correct me where I'm wrong. I'm trying it in Unity 5 (C#).
Here is the code:
public float pointX;
public float pointY;
public float lat1=55.500817f;
public float lat2=55.380680f;
public float lon1=37.568342f;
public float lon2=37.822586f;
public float azimuth;
void Update () {
float dlon = lon2 - lon1;
float dlat = lat2 - lat1;
pointX = Mathf.Sin(dlon* 0.01745329f)*Mathf.Cos(lat2* 0.01745329f);
pointY = Mathf.Cos (lat1* 0.01745329f) * Mathf.Sin (lat2* 0.01745329f) - Mathf.Sin (lat1* 0.01745329f) * Mathf.Cos (lat2* 0.01745329f) * Mathf.Cos (dlon*0.01745329f);
azimuth=Mathf.Atan2(pointX, pointY)*57.29578f;
double distance = Math.Pow(Math.Sin(dlat/2*0.01745329),2.0)+(Math.Cos(lat1* 0.01745329)*Math.Cos(lat2* 0.01745329)* Math.Pow(Math.Sin(dlon/2* 0.01745329),2.0));
distance = 2.0*6376500.0*Math.Atan2(Math.Sqrt(distance),Math.Sqrt(1.0-distance));
Here, * 0.01745329f is the conversion factor from degrees to radians and * 57.29578f from radians to degrees.
Let's assume all the angles are already converted to radians, and use Re as the Earth's mean radius, and we'll assume a spherical Earth model. There are corrections for the ellipsoidal shape of the Earth but this will get you close. I'll use python-style coding since I know nothing about C#.
#
# North Distance of point 2 from point 1
#
dN = Re * dlat
#
# East Distance of point 2 from point 1
#
dE = Re * dlon * cos(0.5 * (lat1 + lat2))
#
# Distance between points
#
distance=math.sqrt(dN**2 + dE**2)
#
# Azimuth to point 2 from point 1 in radians
#
azimuth=math.atan2(dE,dN)
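As a self-contained sketch (my own addition, not part of the original answer), the same approximation can be wrapped into a runnable Python function and applied to the coordinates from the question; the function name is assumed:
import math

def flat_earth_distance_azimuth(lat1, lon1, lat2, lon2):
    # Inputs in degrees; returns (distance in meters, azimuth in degrees
    # clockwise from north). Only valid for points close to each other.
    Re = 6371000.0  # mean Earth radius, meters
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    dN = Re * (lat2 - lat1)                                  # north offset
    dE = Re * (lon2 - lon1) * math.cos(0.5 * (lat1 + lat2))  # east offset
    distance = math.sqrt(dN * dN + dE * dE)
    azimuth = math.degrees(math.atan2(dE, dN)) % 360.0
    return distance, azimuth

# The question's coordinates: roughly 20.9 km at an azimuth near 130 degrees,
# consistent with the 129.68 reported in the next answer.
print(flat_earth_distance_azimuth(55.500817, 37.568342, 55.380680, 37.822586))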
I copied your code (to Java), 1:1.
There is no bug in your azimuth code.
Either the cause is the use of Mathf (float) instead of the double variant,
or you are just looking at the wrong data or the wrong output.
As intermediate values I get:
pointx= 0.0025209875920285405,
pointy = -0.0020921620920549278,
azimuth = 129.68
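To illustrate the float-versus-double point, here is a quick double-precision reproduction of the same bearing formula in Python (my own check, not the original Unity code); it gives essentially the same intermediate values:
from math import sin, cos, atan2, radians, degrees

lat1, lon1 = radians(55.500817), radians(37.568342)
lat2, lon2 = radians(55.380680), radians(37.822586)
dlon = lon2 - lon1

# Same formula as the C# code, but in double precision throughout.
point_x = sin(dlon) * cos(lat2)
point_y = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dlon)
azimuth = degrees(atan2(point_x, point_y))

print(point_x, point_y, azimuth)  # approximately 0.00252, -0.00209, 129.7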
Given a location(lat,lng), I want to get its coordinate in an Azimuthal Equidistant Projection. The formulas are explained here.
Below is the screenshot of the web page.
At the end of that page, it states:
It looks like given any location (lat [-Pi/2, +Pi/2], lng [0, +2Pi)), and the center of the projection (latCenter, lngCenter), I can calculate its coordinate (x, y) in the map, and since no map Radius is provided, the value of x and y will fall in the range of [-1, +1] or [-Pi, +Pi].
My question is, what is the c in the formulas? If it is a value calculated from (x, y), how can it be used to calculate (x, y)?
Can somebody help me understand these formulas?
Use equation 4 to compute c when projecting from lat/long to x,y. Equation 7 is for computing the inverse, i.e. going from x,y to lat/long. For your purposes, making a map, ignore equation 7.
c is the angle subtended at the centre of the Earth by the arc of the great circle from the centre of the projection (phi0, lambda0) to another point (phi, lambda).
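As a minimal illustration (the function name is my own, and this assumes a unit-radius sphere as in the question), here is the forward projection in Python, with c computed from equation 4 and reused in the x and y formulas:
from math import sin, cos, acos, radians

def azimuthal_equidistant(lat_deg, lon_deg, lat0_deg, lon0_deg):
    # Forward projection: (lat, lon) -> (x, y), centered on (lat0, lon0).
    phi, lam = radians(lat_deg), radians(lon_deg)
    phi0, lam0 = radians(lat0_deg), radians(lon0_deg)
    # Equation 4: angular distance c from the projection center to the point.
    cos_c = sin(phi0) * sin(phi) + cos(phi0) * cos(phi) * cos(lam - lam0)
    c = acos(max(-1.0, min(1.0, cos_c)))
    if c == 0.0:
        return 0.0, 0.0      # the center itself maps to the origin
    k = c / sin(c)           # the scale factor k'
    x = k * cos(phi) * sin(lam - lam0)
    y = k * (cos(phi0) * sin(phi) - sin(phi0) * cos(phi) * cos(lam - lam0))
    return x, y
Since x^2 + y^2 works out to c^2, each projected point lies at distance c from the origin, so on a unit sphere the whole map fits inside a disk of radius Pi, consistent with the range you guessed.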
As you don't state the programming language you're working with, here is an implementation in F# from a recent blog post.
open System
module AzimuthalEquidistantProjection =
let inline degToRad d = 0.0174532925199433 * d; // (1.0/180.0 * Math.PI) * d
let project centerlon centerlat lon lat =
// http://mathworld.wolfram.com/AzimuthalEquidistantProjection.html
// http://www.radicalcartography.net/?projectionref
let t:float = degToRad lat
let l:float = degToRad lon
let t1 = degToRad centerlat // latitude center of projection
let l0 = degToRad centerlon // longitude center of projection
let c = Math.Acos ((sin t1) * (sin t) + (cos t1) * (cos t) * (cos (l-l0)))
let k = c / (sin c)
let x = k * (cos t) * (sin (l-l0))
        let y = k * ((cos t1) * (sin t) - (sin t1) * (cos t) * (cos (l-l0))) // k scales the whole bracket
(x, y)
Other versions (F# with units of measure, Python and Julia)
What is the formula for calculating the distance between 2 geocodes? I have seen some of the answers on this site but they basically say to rely on SQL Server 08 functions, I'm not on 08 yet. Any help would be appreciated.
Use an approximation of the earth and the Haversine formula. You can get a JavaScript version at the following URL, which you can translate to your language of choice:
http://www.movable-type.co.uk/scripts/latlong.html
Here is another way: http://escience.anu.edu.au/project/04S2/SE/3DVOT/3DVOT/pHatTrack_Application/Source_code/pHatTrack/Converter.java
Take a look here for a SQL server 2000 version SQL Server Zipcode Latitude/Longitude proximity distance search
This will do it for you in c#.
Within the namespace put these:
public enum DistanceType { Miles, Kilometers };
public struct Position
{
public double Latitude;
public double Longitude;
}
class Haversine
{
public double Distance(Position pos1, Position pos2, DistanceType type)
{
double preDlat = pos2.Latitude - pos1.Latitude;
double preDlon = pos2.Longitude - pos1.Longitude;
double R = (type == DistanceType.Miles) ? 3960 : 6371;
double dLat = this.toRadian(preDlat);
double dLon = this.toRadian(preDlon);
double a = Math.Sin(dLat / 2) * Math.Sin(dLat / 2) +
Math.Cos(this.toRadian(pos1.Latitude)) * Math.Cos(this.toRadian(pos2.Latitude)) *
Math.Sin(dLon / 2) * Math.Sin(dLon / 2);
double c = 2 * Math.Asin(Math.Min(1, Math.Sqrt(a)));
double d = R * c;
return d;
}
private double toRadian(double val)
{
return (Math.PI / 180) * val;
    }
}
Then to utilise these in the main code:
Position pos1 = new Position();
pos1.Latitude = Convert.ToDouble(hotelx.latitude);
pos1.Longitude = Convert.ToDouble(hotelx.longitude);
Position pos2 = new Position();
pos2.Latitude = Convert.ToDouble(lat);
pos2.Longitude = Convert.ToDouble(lng);
Haversine calc = new Haversine();
double result = calc.Distance(pos1, pos2, DistanceType.Miles);
If you know that the 2 points are "not too far from each other", and you tolerate a "reasonably small" error, then consider that the earth is flat between the 2 points:
The distance in the latitude direction is EarthRadius * latitude difference.
The distance in the longitude direction is EarthRadius * longitude difference * cos(latitude).
We multiply by cos(lat) because degrees of longitude don't cover the same distance in km when the latitude changes. As P1 and P2 are close, cos(latP1) is close to cos(latP2).
Then apply Pythagoras.
In JavaScript:
function ClosePointsDistance(latP1, lngP1, latP2, lngP2) {
var d2r = Math.PI / 180,
R=6371; // Earth Radius in km
latP1 *= d2r; lngP1 *= d2r; latP2 *= d2r; lngP2 *= d2r; // convert to radians
dlat = latP2 - latP1,
dlng = (lngP2 - lngP1) * Math.cos(latP1);
return R * Math.sqrt( dlat*dlat + dlng*dlng );
}
I tested it between Paris and Orleans (France): the formula finds 110.9 km, whereas the (exact) Haversine formula finds 111.0 km.
!!! Beware of situations around meridian 0 (you may shift it): if P1 is at lng 359 and P2 is at lng 0, the function will consider them abnormally far apart !!!
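One way to handle that case (a small addition of mine, not part of the original answer) is to normalize the longitude difference into [-180, 180) degrees before converting to radians; a Python sketch of the idea:
def wrap_dlng_deg(lng1, lng2):
    # Normalize so points straddling the antimeridian (e.g. 359 and 0)
    # come out 1 degree apart instead of 359.
    return (lng2 - lng1 + 180.0) % 360.0 - 180.0

print(wrap_dlng_deg(359.0, 0.0))  # 1.0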
The Pythagorean theorem as offered up by others here doesn't work so well.
The best, simple answer is to approximate the earth as a sphere (it's actually a slightly flattened sphere, but this is very close). In Haskell, for instance, you might use the following, but the math can be transcribed into pretty much anything:
distRadians (lat1,lon1) (lat2,lon2) =
radius_of_earth *
acos (cos lat1 * cos lon1 * cos lat2 * cos lon2 +
cos lat1 * sin lon1 * cos lat2 * sin lon2 +
sin lat1 * sin lat2) where
radius_of_earth = 6378 -- kilometers
distDegrees a b = distRadians (coord2rad a) (coord2rad b) where
deg2rad d = d * pi / 180
coord2rad (lat,lon) = (deg2rad lat, deg2rad lon)
distRadians requires your angles to be given in radians.
distDegrees is a helper function that can take latitudes and longitudes in degrees.
See this series of posts for more information on the derivation of this formula.
If you really need the extra precision granted by assuming the earth is ellipsoidal, see this FAQ: http://www.movable-type.co.uk/scripts/gis-faq-5.1.html
Here is a way to do it if you are using SQL Server.
CREATE function dist (@Lat1 varchar(50), @Lng1 varchar(50), @Lat2 varchar(50), @Lng2 varchar(50))
returns float
as
begin
declare @p1 geography
declare @p2 geography
set @p1 = geography::STGeomFromText('POINT(' + @Lng1 + ' ' + @Lat1 + ')', 4326)
set @p2 = geography::STGeomFromText('POINT(' + @Lng2 + ' ' + @Lat2 + ')', 4326)
return @p1.STDistance(@p2)
end
You're looking for the length of the Great Circle Path between two points on a sphere. Try looking up "Great Circle Path" or "Great Circle Distance" on Google.
Sorry, I don't know what country you are even in. Are we talking about Eastings and Northings (UK, Ordnance Survey system), lat/long, or some other system?
If we are talking Eastings and Northings then you can use
sqrt((x1-x2)^2 + (y1-y2)^2)
This does not allow for the fact that the earth is a sphere, but for short distances you won't notice. We use it at work for distances between points within the county.
Be careful about how long a grid reference you use. I think an 8-figure reference will give you a distance in metres. I'll be able to get a definite answer at work next week if no one else has supplied it.
The Pythagorean theorem?
Using a latitude and longitude value (point A), I am trying to calculate another point B, X meters away from point A at a bearing of 0 radians. Then I want to display point B's latitude and longitude values.
Example (Pseudo code):
PointA_Lat = x.xxxx;
PointA_Lng = x.xxxx;
Distance = 3; //Meters
bearing = 0; //radians
new_PointB = PointA-Distance;
I was able to calculate the distance between two Points but what I want to find is the second point knowing the distance and bearing.
Preferably in PHP or Javascript.
Thank you
It seems you are measuring distance (R) in meters, and bearing (theta) counterclockwise from due east. And for your purposes (hundreds of meters), plane geometry should be accurate enough. In that case,
dx = R*cos(theta) ; theta measured counterclockwise from due east
dy = R*sin(theta) ; dx, dy same units as R
If theta is measured clockwise from due north (for example, compass bearings),
the calculation for dx and dy is slightly different:
dx = R*sin(theta) ; theta measured clockwise from due north
dy = R*cos(theta) ; dx, dy same units as R
In either case, the change in degrees longitude and latitude is:
delta_longitude = dx/(111320*cos(latitude)) ; dx, dy in meters
delta_latitude = dy/110540 ; result in degrees long/lat
The difference between the constants 110540 and 111320 is due to the earth's oblateness
(polar and equatorial circumferences are different).
Here's a worked example, using the parameters from a later question of yours:
Given a start location at longitude -87.62788 degrees, latitude 41.88592 degrees,
find the coordinates of the point 500 meters northwest from the start location.
If we're measuring angles counterclockwise from due east, "northwest" corresponds
to theta=135 degrees. R is 500 meters.
dx = R*cos(theta)
= 500 * cos(135 deg)
= -353.55 meters
dy = R*sin(theta)
= 500 * sin(135 deg)
= +353.55 meters
delta_longitude = dx/(111320*cos(latitude))
= -353.55/(111320*cos(41.88592 deg))
= -.004266 deg (approx -15.36 arcsec)
delta_latitude = dy/110540
= 353.55/110540
= .003198 deg (approx 11.51 arcsec)
Final longitude = start_longitude + delta_longitude
= -87.62788 - .004266
= -87.632146
Final latitude = start_latitude + delta_latitude
= 41.88592 + .003198
= 41.889118
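The same steps collected into a small Python helper (my own sketch, using the counterclockwise-from-east convention above); it reproduces the worked example:
from math import cos, sin, radians

def offset_position(lat_deg, lon_deg, distance_m, theta_deg):
    # theta measured counterclockwise from due east, as in the text above.
    dx = distance_m * cos(radians(theta_deg))      # east displacement, meters
    dy = distance_m * sin(radians(theta_deg))      # north displacement, meters
    dlon = dx / (111320.0 * cos(radians(lat_deg)))
    dlat = dy / 110540.0
    return lat_deg + dlat, lon_deg + dlon

# 500 m "northwest" (theta = 135 degrees) of lat 41.88592, lon -87.62788
# gives roughly (41.8891, -87.6321), matching the worked example.
print(offset_position(41.88592, -87.62788, 500, 135))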
It might help if you knew that 3600 seconds of arc is 1 degree (lat. or long.), that there are 1852 meters in a nautical mile, and that a nautical mile is 1 minute of arc. Of course, you're depending on the distances being relatively short; otherwise you'd have to use spherical trigonometry.
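For example, 1 degree of latitude is 60 minutes of arc, i.e. 60 nautical miles, or about 60 × 1852 = 111,120 m, which is in line with the 110,540 and 111,320 meters-per-degree constants used in the answer above.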
Here is an updated version using Swift:
let location = CLLocation(latitude: 41.88592 as CLLocationDegrees, longitude: -87.62788 as CLLocationDegrees)
let distanceInMeter : Int = 500
let directionInDegrees : Int = 135
let lat = location.coordinate.latitude
let long = location.coordinate.longitude
let radDirection : CGFloat = Double(directionInDegrees).degreesToRadians
let dx = Double(distanceInMeter) * cos(Double(radDirection))
let dy = Double(distanceInMeter) * sin(Double(radDirection))
let radLat : CGFloat = Double(lat).degreesToRadians
let deltaLongitude = dx/(111320 * Double(cos(radLat)))
let deltaLatitude = dy/110540
let endLat = lat + deltaLatitude
let endLong = long + deltaLongitude
Using this extension:
extension Double {
var degreesToRadians : CGFloat {
return CGFloat(self) * CGFloat(M_PI) / 180.0
}
}
dx = sin(bearing)
dy = cos(bearing)
x = center.x + dist * dx;
y = center.y + dist * dy;