I have a set of coordinates that I want to turn into angles so I can run some ANOVA analysis. I used the maptools trackAzimuth function, but I keep losing one or two points. I assume it calculates the angle between two consecutive coordinates, but I want one angle per latitude/longitude point, for example [-80.222, 30.555 to 45]. Since my data are in North America, the longitudes are negative; do I have to convert them to positive values before converting them to angles? Please enlighten me on this. Following is my data:
Longitude Latitude
-104.952 39.71478
-104.952 39.7149
-104.54 39.7148
-104.955 39.70441
-104.966 39.7175
My code:
setwd("C:/Users/data")
# install.packages("maptools")   # run once if not already installed
library(maptools)
data <- read.table("data.csv", header = TRUE, sep = ",")
dfr <- data.frame(data[3:4])                       # longitude/latitude columns
angle <- data.frame(trackAzimuth(as.matrix(dfr)))  # bearings between consecutive points
My results are:
Angle
-81.001
-95.57075
-175.254
-32.628
Here, I lost one latitude/longitude pair, and my angles are negative; I want positive angles. How do I do that? Please help.
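For reference, a minimal sketch addressing both issues, assuming the angle data frame computed above: trackAzimuth returns one bearing per consecutive pair of rows, so n points always yield n - 1 angles (that is the "lost" point), and negative bearings can be wrapped into [0, 360) with modular arithmetic.
## Wrap bearings from (-180, 180] into [0, 360); `angle` is the data frame above
angle_pos <- (angle + 360) %% 360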
Thanks
I have lat and lon coordinates. Because I needed to rotate them, I transformed the WGS84 lat/lon coordinates into distances from a given point and rotated around that point using a rotation matrix. For plotting, I now need to transform the rotated distance values (x and y) back into WGS84 lat/lon coordinates, but I can't find a way to do it.
I transformed the initial lat/lon values to distances from a chosen point like this:
## Eastward offsets: distances to points at the measured longitudes, at the reference latitude
g_mat_x <- cbind(lon, rep(sp_lat, length(lon)))
dist_x <- distGeo(c(sp_lon, sp_lat), g_mat_x)
## Northward offsets, negated by hand (the measured points lie south of the reference)
g_mat_y <- cbind(rep(sp_lon, length(lat)), lat)
dist_y <- distGeo(c(sp_lon, sp_lat), g_mat_y) * (-1)
Where sp_lon and sp_lat are the coordinates of the freely chosen point, and lon and lat are vectors of the measured coordinates I needed the distances to.
This works great, but I can't get my head around how to transform the distances back into the corresponding lat/lon values using the same point (sp_lon/sp_lat).
Are there functions around capable of doing that?
Sample data:
sp_lon <- 6.5
sp_lat <- 54.1
The lat and lon data can be found here:
https://pastebin.com/8ZMGkG2P for lat values
lat <- c(53.8599307418054, 53.8599299782294, 53.8599292147252, 53.8599284513955,
53.8599276909077, 53.8599269276675, 53.8599261646716, 53.8599254016427,
53.8599246387294, 53.8599238760691, 53.8599231135803)
https://pastebin.com/bz4zRDb4 for lon values
lon <- c(7.03922544225856, 7.03921652830416, 7.03920761347249, 7.03919870033677,
7.03918980775434, 7.039180893677, 7.03917198029713, 7.03916306606371,
7.03915415091448, 7.03914523638381, 7.03913632276797)
results for distances from distGeo:
https://pastebin.com/b1rK2i0H for dist_x
dist_x <- c(35275.2396149456, 35274.6564815661, 35274.0732907975, 35273.4902109736,
35272.9084757064, 35272.3253342833, 35271.7422384877, 35271.1590868543,
35270.5758753102, 35269.9927042311, 35269.4095929985)
dist_x_rot <- c(27157.4079196703, 27156.82265961, 27156.2373461817, 27155.6521449452,
27155.0683243199, 27154.4830661607, 27153.897859114, 27153.3125971812,
27152.7272807104, 27152.1420106127, 27151.556803262)
https://pastebin.com/tL7qhXwk for dist_y
dist_y <- c(-26720.819753436, -26720.9047412656, -26720.9897211144, -26721.0746815378,
-26721.1593256438, -26721.2442761088, -26721.3291993745, -26721.4141263143,
-26721.4990403831, -26721.5839262974, -26721.6687931261)
dist_y_rot <- c(-34940.2337323618, -34940.1648982768, -34940.0960416297, -34940.0271949336,
-34939.9583906952, -34939.8895184371, -34939.8206317157, -34939.7517340913,
-34939.6828085284, -34939.6138662435, -34939.5449210127)
I hope it is okay this way. I'd rather give you a small part of the real data instead of making up data.
EDIT: Okay, I got it using destPoint and simple trigonometry to get the distance vector and the angle.
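For anyone needing the details, here is a minimal sketch of that approach, assuming dist_x holds the eastward offsets and dist_y the northward offsets in metres relative to (sp_lon, sp_lat), as above. The planar distance/bearing approximation matches the question's own construction and is fine at these scales.
library(geosphere)
## Straight-line distance and bearing from the reference point to each offset point
dst <- sqrt(dist_x^2 + dist_y^2)         # distance in metres
brg <- atan2(dist_x, dist_y) * 180 / pi  # bearing clockwise from north, in degrees
## destPoint() walks from the reference point along each bearing and distance
ll <- destPoint(c(sp_lon, sp_lat), brg, dst)
lon_back <- ll[, "lon"]
lat_back <- ll[, "lat"]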
I have a list of latitudes and longitudes that represent the mean centers of accidents and hospital locations, calculated using the gravity method, i.e. for each year I have one mean latitude/longitude for accidents and one mean latitude/longitude for hospitals. Now I want to create a rose diagram for both accidents and hospitals, and calculate their correlation, significance, direction of movement by year, mean direction, and other summary statistics using circular statistics. How do I proceed? Some ideas and R code would be helpful. I went through the circular package, but I did not understand how to convert the x, y coordinates into a single column to create a rose diagram and run the other analyses. Any help would be much appreciated.
My data are as follows: LA is longitude of Accidents, LTA is latitude of Accidents, LH is longitude of Hospitals, and LTH is latitude of Hospitals.
df <- read.table(text='Year LA LTA LH LTH
Year1 -84.3213 33.8488 -84.3281 33.8779
Year2 -84.322 33.8470 -84.3284 33.8782
year3 -84.323 33.84461 -84.3293 33.8791
year4 -84.3165 33.8359 -84.3452 33.8404
year5 -84.3257 33.8330 -84.3413 33.8340', header=TRUE)
These are the only data I have. So far, I converted the latitude and longitude into an azimuth, which I believe gives an angle from origin, and then used that angle to draw a rose diagram. I am not sure whether I have to convert those negative coordinates to a positive angle. Any input from a statistics person would be appreciated. Since I have two sets of angles, for accidents and hospitals, I believe I can correlate them and do a Watson test. Following is my code so far (still working on it). The rose diagram is only for accidents, not hospitals; is it good to draw both together? What other things can be done? I need more suggestions, please.
library(maptools)   # for trackAzimuth()
library(geosphere)
library(circular)
library(CircStats)
data <- read.table("trial_circ.csv", header = TRUE, sep = ",")
dfr <- data.frame(data[3:4])          # longitude/latitude columns
az <- trackAzimuth(as.matrix(dfr))    # bearings between consecutive points, in degrees
d <- rose.diag(circular(az, units = "degrees", template = "geographics"))
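A minimal sketch of one way to continue, assuming the df data frame above: compute one set of bearings for accidents and one for hospitals, wrap them into [0, 360), draw a rose diagram for each, and compare the two samples with a Watson two-sample test.
library(maptools)   # for trackAzimuth()
library(circular)
az_acc  <- trackAzimuth(as.matrix(df[, c("LA", "LTA")]))   # accident bearings, degrees
az_hosp <- trackAzimuth(as.matrix(df[, c("LH", "LTH")]))   # hospital bearings, degrees
## Wrap negative bearings into [0, 360)
az_acc  <- (az_acc + 360) %% 360
az_hosp <- (az_hosp + 360) %% 360
acc  <- circular(az_acc,  units = "degrees", template = "geographics")
hosp <- circular(az_hosp, units = "degrees", template = "geographics")
rose.diag(acc)               # accidents
rose.diag(hosp)              # hospitals (separate panels are easier to read)
watson.two.test(acc, hosp)   # do the two sets of directions differ?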
Thanks in Advance
Very simple situation: a polygon defines a geographical area, and I want to know whether a point, given by its GPS coordinates, lies within that polygon.
I went through many SO questions and have tried various functions and packages like sp, but cannot make out why it fails.
I tried with this very simple function:
https://www.rdocumentation.org/packages/SDMTools/versions/1.1-221/topics/pnt.in.poly
install.packages("SDMTools v1.1-221")
library(SDMTools v1.1-221)
## Coordinates of the polygon corners
lat <- c(48.43119, 48.43119, 48.42647, 48.400031, 48.39775, 48.40624, 48.42060, 48.42544, 48.42943 )
lon <- c(-71.06970, -71.04180, -71.03889, -71.04944, -71.05991, -71.06764, -71.06223, -71.06987, -71.07004)
pol = cbind(lat=lat,lng=lon)
## Point to be tested
x <- data.frame(lng=-71.05609, lat=48.40909)
## Visualization, this point clearly stands in the middle of the polygon
plot(rbind(pol, x))
polygon(pol,col='#99999990')
## Is that point in the polygon?
out = pnt.in.poly(x, pol)
## Well, no (pip=0)
print(out)
The example given for this function works for me, but this simple case doesn't... why is that?
I have not used the method that you are using, but I have one from within sp which works flawlessly on your point and polygon.
I cherry-picked your code and left the lat and lon as vectors and the point coordinates as single values to suit the function's requirements.
But you could just as easily have made a data frame and used the columns explicitly as lat/lon values.
Here is the gist of it:
require(sp)
## Your polygon
lat <- c(48.43119, 48.43119, 48.42647, 48.400031, 48.39775, 48.40624, 48.42060, 48.42544, 48.42943 )
lon <- c(-71.06970, -71.04180, -71.03889, -71.04944, -71.05991, -71.06764, -71.06223, -71.06987, -71.07004)
## Your point
lng <- -71.05609
lt <- 48.40909
## sp function which tests for points in polygons; the arguments are
## (point.x, point.y, pol.x, pol.y) -- here lat plays the role of x
point.in.polygon(lt, lng, lat, lon, mode.checked = FALSE)
And here is the output:
[1] 1
The interpretation of this from the documentation:
The integer values are:
0: point is strictly exterior to polygon
1: point is strictly interior to polygon
2: point lies on the relative interior of an edge of polygon
3: point is a vertex of polygon
As your point scores a 1, it is wholly within the polygon, as your map shows! The key to getting good output with these types of data is serving the variables in the right formats.
You could just as easily have had a data frame df with df$lat and df$lon as the two polygon variables, as well as a test frame test with test$lat and test$lon holding a series of points. You would just substitute each of those into the call, test points first, polygon second:
point.in.polygon(test$lat, test$lon, df$lat, df$lon, mode.checked=FALSE)
And it would return a vector of 0s, 1s, 2s, and 3s, one per test point.
Just be sure you get it in the right format first!
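For instance, a small sketch using the lat/lon polygon vectors above and two made-up test points, one inside and one well outside the polygon:
test <- data.frame(lat = c(48.40909, 48.50000), lon = c(-71.05609, -71.00000))
point.in.polygon(test$lat, test$lon, lat, lon, mode.checked = FALSE)
## [1] 1 0   (first point inside, second outside)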
I can't see it explicitly stated in the documentation for ?pnt.in.poly, but it appears the ordering of the lng and lat columns matters. You need to swap the column ordering in your pol and it works.
pol = cbind(lat=lat, lng=lon)
pnt.in.poly(x, pol)
# lng lat pip
# 1 -71.05609 48.40909 0
pol = cbind(lng=lon, lat=lat)
pnt.in.poly(x, pol)
# lng lat pip
# 1 -71.05609 48.40909 1
In spatial geometry, lng is often thought of as the x-axis and lat as the y-axis, which you'll see is reversed in your plot().
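To illustrate, re-plotting the question's data in the conventional orientation, with lon on the x-axis and lat on the y-axis (lon, lat, and x as defined in the question):
plot(lon, lat, asp = 1)          # lon as x, lat as y
polygon(lon, lat, col = '#99999990')
points(x$lng, x$lat, pch = 19)   # the test point, visibly inside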
Let's assume we have a point described by latitude and longitude (WGS84), from which we form a SpatialPointsDataFrame (gData.init). I would like to change the projection (transform) and then use the planar coordinates to estimate distances and intersection points using simple line-point methods. I use the following code to perform the transformation.
library(rgeos)
library(sp)
longitude <- 22.954638
latitude <- 40.617048
gData.init <- data.frame(longitude, latitude)
gData.init$id <- as.numeric(rownames(gData.init))
## Promote to a SpatialPointsDataFrame and declare the source CRS
coordinates(gData.init) <- gData.init[c("longitude", "latitude")]
proj4string(gData.init) <- "+proj=longlat +datum=WGS84"
## Transform to EPSG:2100 (GGRS87 / Greek Grid), a planar CRS in metres
gDataIn2100 <- spTransform(gData.init, CRS("+init=epsg:2100"))
Now I want to save the coordinates in some data object; when I do this using the following code
gDataIn2100@coords
I get at most one decimal place:
longitude latitude
[1,] 411425.8 4496486
However, when I print the coordinates (I would like my coordinates to be more precise):
print(coordinates(gDataIn2100), digits = 12)
Then the resulting coordinates are somewhat different:
longitude latitude
[1,] 411425.810118 4496486.37561
This, I think, causes different estimates of the minimum distance between a line and my point when using gDistance versus estimating the distance using LinkPointMinDistance.
What do I do wrong?
gDataIn2100@coords is equivalent to print(gDataIn2100@coords, digits = getOption("digits"))
The decimals are only dropped when rendered to the screen. The values are stored as numeric and retain full double precision.
Note that coordinates(gDataIn2100) is the recommended way to get the coordinates.
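A quick sketch of the point, using the object from the question:
x <- coordinates(gDataIn2100)
print(x)                   # display rounded to getOption("digits") (default 7)
print(x, digits = 12)      # same stored value, more digits shown
sprintf("%.6f", x[1, 1])   # explicit formatting for a fixed precision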
I have what may be a very simplistic question on the Kest function in spatstat. I'm using Kest to assess spatial randomness in a dataset. I have lat and long values spread over London, which I converted to a ppp object, using the ripras function to specify the spatial domain. When I run my Kest analysis on the ppp and plot the graph, I end up with an r value on the x axis; I know this is a distance measurement, but I don't know what units it's using. I get this summary output:
Planar point pattern: 113 points
Average intensity 407.9378 points per square unit
Coordinates are given to 9 decimal places
Window: polygonal boundary
single connected closed polygon with 14 vertices
enclosing rectangle: [-0.5532963, 0.3519148] x [51.2901, 51.7022] units
Window area = 0.277003 square units
with the max r on the x axis being 0.1 units, and the K(r) on the y axis being 0.04. How do I figure out what unit of distance these equate to?
Your lat,lon coordinates correspond to points on a sphere (or ellipsoid or whatever) used as a model for planet Earth. Essentially, spatstat assumes you are working with coordinates projected onto a flat map. This projection can be done with e.g. the sp package (using Buckingham Palace as an example):
library(sp)
lat = c(51.501476)
lon = c(-0.140634)
xy = data.frame(lon, lat)
coordinates(xy) <- c("lon", "lat")
proj4string(xy) <- CRS("+proj=longlat +datum=WGS84")
NE <- spTransform(xy, CRS("+proj=utm +zone=30 +ellps=WGS84"))
NE <- as.data.frame(NE)
The result is a data.frame with projected coordinates (Easting, Northing) in metres. Then you can continue your analysis from there. To assign a unit label like "m" for prettier labels in figures, use the function unitname on your ppp object (assuming the object is called X): unitname(X) <- "m"
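Continuing this example, a minimal sketch that rebuilds the ppp in metres, assuming NE now holds all of your projected points rather than the single example point above (NE keeps the column names lon/lat, but the values are now Easting/Northing in metres):
library(spatstat)
W <- ripras(NE$lon, NE$lat)            # Ripley-Rasson window estimate, as in the question
X <- ppp(NE$lon, NE$lat, window = W)
unitname(X) <- "m"
plot(Kest(X))                          # the r axis is now in metres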
If the function were able to accept geographic coordinates, it would be using a great-circle equation to calculate distance, which normally results in units of kilometers.
It is not very good practice to perform point pattern analysis (PPA) on non-projected data. If possible, you should project your data into a coordinate system whose units are distances. I believe that most of the functions in spatstat use Euclidean distance, which is quite inappropriate for projection units in decimal degrees. Since there is no latlong argument in the Kest function, I do not believe that your results are valid.
The K function itself (i.e. the theoretical K-function, not just the computer code) assumes that the space is flat rather than curved.
This would probably be a reasonable approximation in your case (points scattered over a few dozen kilometres) but not for a point pattern scattered over a continent. That is, in general the planar K-function should not be used for point patterns on a sphere.
The other posts are correct. The Kest function expects the coordinates to be given in an isometric coordinate system. You just need to express the spatial locations in a coordinate system in which the x and y coordinates are measured in the same distance units. Longitude and latitude are not measured in the same distance units because one degree (say) of longitude does not represent the same distance as one degree of latitude. Ege Rubak's example using spTransform is probably the best way to go.