Does this point stand within a polygon? - r

Very simple situation: a polygon defines a geographical area, and I want to know whether a point, given by its GPS coordinates, lies within that polygon.
I went through many SO questions and tried various functions and packages like sp, but cannot make out why this fails.
I tried with this very simple function:
https://www.rdocumentation.org/packages/SDMTools/versions/1.1-221/topics/pnt.in.poly
install.packages("SDMTools") # version 1.1-221
library(SDMTools)
## Coordinates of the polygon corners
lat <- c(48.43119, 48.43119, 48.42647, 48.400031, 48.39775, 48.40624, 48.42060, 48.42544, 48.42943 )
lon <- c(-71.06970, -71.04180, -71.03889, -71.04944, -71.05991, -71.06764, -71.06223, -71.06987, -71.07004)
pol = cbind(lat=lat,lng=lon)
## Point to be tested
x <- data.frame(lng=-71.05609, lat=48.40909)
## Visualization, this point clearly stands in the middle of the polygon
plot(rbind(pol, x))
polygon(pol,col='#99999990')
## Is that point in the polygon?
out = pnt.in.poly(x, pol)
## Well, no (pip=0)
print(out)
The example given in the documentation for this function works for me, but this simple case does not... why is that?

I have not used the method you are using, but I have one from within sp which works flawlessly on your point and polygon.
I cherry-picked your code and left the lat and lon as vectors and the point coordinates as values to suit the function's requirements.
But you could just as easily have made a data frame and used the columns explicitly as lat/lon values.
Here is the gist of it:
require(sp)
## Your polygon
lat <- c(48.43119, 48.43119, 48.42647, 48.400031, 48.39775, 48.40624, 48.42060, 48.42544, 48.42943 )
lon <- c(-71.06970, -71.04180, -71.03889, -71.04944, -71.05991, -71.06764, -71.06223, -71.06987, -71.07004)
## Your Point
lng=-71.05609
lt=48.40909
# sp function which tests for points in polygons
point.in.polygon(lt, lng, lat, lon, mode.checked=FALSE)
And here is the output:
[1] 1
The interpretation of this from the documentation:
integer array values are:
0 point is strictly exterior to polygon
1 point is strictly interior to polygon
2 point lies on the relative interior of an edge of polygon
3 point is a vertex of polygon
As your point is a 1 based on this, it should be wholly within the polygon, as your map shows! The key to getting good output with these types of data is serving the variables in the right formats.
You could just as easily have had a data frame df with df$lat and df$lon as the two polygon variables, as well as a test frame test with test$lat and test$lon as a series of points. You would just substitute each of those in the call, keeping the test points first and the polygon vertices second:
point.in.polygon(test$lat, test$lon, df$lat, df$lon, mode.checked=FALSE)
And it would return a vector of 0s, 1s, 2s and 3s.
Just be sure you get it in the right format first!
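For instance, here is a minimal sketch of that data-frame form using the question's own coordinates (the df and test names are just illustrative):
library(sp)
## polygon corners as a data frame
df <- data.frame(lat = c(48.43119, 48.43119, 48.42647, 48.400031, 48.39775, 48.40624, 48.42060, 48.42544, 48.42943),
                 lon = c(-71.06970, -71.04180, -71.03889, -71.04944, -71.05991, -71.06764, -71.06223, -71.06987, -71.07004))
## point(s) to test
test <- data.frame(lat = 48.40909, lon = -71.05609)
## point coordinates first, polygon coordinates second
point.in.polygon(test$lat, test$lon, df$lat, df$lon, mode.checked = FALSE)
# [1] 1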

I can't see it explicitly stated in the documentation for ?pnt.in.poly, but it appears the ordering of the lng and lat columns matters. You need to swap the column ordering in your pol and it works.
pol = cbind(lat=lat, lng=lon)
pnt.in.poly(x, pol)
# lng lat pip
# 1 -71.05609 48.40909 0
pol = cbind(lng=lon, lat=lat)
pnt.in.poly(x, pol)
# lng lat pip
# 1 -71.05609 48.40909 1
In spatial geometry, lng is usually treated as the x-axis and lat as the y-axis, which you'll see is reversed in your plot().
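As a quick visual check, a minimal sketch that redraws the question's map with lng on the x-axis (using the corrected pol from above):
plot(pol[, "lng"], pol[, "lat"], xlab = "lng", ylab = "lat")
polygon(pol[, "lng"], pol[, "lat"], col = '#99999990')
points(x$lng, x$lat, pch = 19)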

Related

R find distance between line and point at fixed bearing angles [duplicate]

This question already has answers here:
Find nearest distance from spatial point with direction specified
(3 answers)
Closed 2 years ago.
This is my reproducible example:
########################################
library(sf)
# matrix of lon lat for the definition of the linestring
m<-rbind(
c(12.09136, 45.86471),
c(12.09120, 45.86495),
c(12.09136, 45.86531),
c(12.09137, 45.86540),
c(12.09188, 45.86585),
c(12.09200, 45.86592),
c(12.09264, 45.86622),
c(12.09329, 45.86624),
c(12.09393, 45.86597),
c(12.09410, 45.86585),
c(12.09423, 45.86540),
c(12.09411, 45.86495),
c(12.09393, 45.86471),
c(12.09383, 45.86451),
c(12.09329, 45.86414),
c(12.09264, 45.86413),
c(12.09200, 45.86425),
c(12.09151, 45.86451),
c(12.09136, 45.86471)
)
# define a linestring
ls<-st_linestring(m)
# create a simple feature with appropriate crs
ls<-st_sfc(ls, crs=4326)
# and now again going through the very same
# definition process for a point
# define a point
pt <- st_point(c(12.09286,45.86557))
# create simple feature with appropriate crs
pt<-st_sfc(pt, crs = 4326)
plot(ls)
plot(pt, add=TRUE)
# this is computing the minimum distance from the point to the line
st_distance(ls, pt)
###############
Given the above toy dataset, I need to find a proper method to calculate:
1 - the distance from each vertex of the line to the given point: this is probably easily accomplished by calculating the distance between each pair of points (line vertex vs. point) through a simple application of the Pythagorean theorem, even if I'm quite dubious about that because of the CRS in use (i.e. EPSG:4326, in degree units), so I probably first need to convert the whole dataset to another reference system (with metric units)... (see the sketch just after this list)
2 - the distance between the point and the line at fixed bearing angles (10°, 20°, 30°, ..., 360° from North): and this is where I'm really lost...
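For item 1, a minimal sketch using the ls and pt objects defined above; st_distance already returns geodesic distances in metres for long/lat coordinates, so casting the linestring to its vertices may be enough without any reprojection:
# one POINT per vertex of the linestring
vertices <- st_cast(ls, "POINT")
# distance (in metres) from each vertex to the point
st_distance(vertices, pt)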
Please give me some help in order to properly proceed with the calculation, possibly by using the 'sf' standard that I'm now trying to familiarize myself with.
thanks
Thank you for pointing me in the right direction.
I worked out my final solution, which I'm posting here for the sake of completeness.
# my reproducible example
library(sf)
# matrix of lon lat for the definition of the linestring
m<-rbind(
c(12.09136, 45.86471),
c(12.09120, 45.86495),
c(12.09136, 45.86531),
c(12.09137, 45.86540),
c(12.09188, 45.86585),
c(12.09200, 45.86592),
c(12.09264, 45.86622),
c(12.09329, 45.86624),
c(12.09393, 45.86597),
c(12.09410, 45.86585),
c(12.09423, 45.86540),
c(12.09411, 45.86495),
c(12.09393, 45.86471),
c(12.09383, 45.86451),
c(12.09329, 45.86414),
c(12.09264, 45.86413),
c(12.09200, 45.86425),
c(12.09151, 45.86451),
c(12.09136, 45.86471)
)
# define the linestring
ls<-st_linestring(m)
# create a simple feature linestring with appropriate crs
ls<-st_sfc(ls, crs=4326)
# and now again going through the very same
# definition process for a point
# define the origin point
pt <- st_point(c(12.09286,45.86557))
# create simple feature point with appropriate crs
pt<-st_sfc(pt, crs = 4326)
plot(ls)
plot(pt, add=TRUE)
# get minimum distance from the origin point to the line
dist_min<-st_distance(ls, pt)
# get coordinates of the origin point
pt_orig<-st_coordinates(pt)
# load library for later use of the function destPoint()
library(geosphere)
# create vector of bearing angles in 10 degree steps
b_angles<-seq(0, 350, 10)
# create empty container for final result as data frame
result<-data.frame(bearing=NULL, distance=NULL)
for(i in 1:length(b_angles)){
result[i,"bearing"]<-b_angles[i]
# calculate destination point coordinates with bearing angle i
# at fixed safe distance (i.e. 100 times the minimum distance)
# so as to avoid a null intersection in the next step
pt_dest<-destPoint(p=pt_orig, b=b_angles[i],d=dist_min*100)
# define linestring from origin to destination
b_ls<-st_sfc(st_linestring(rbind(pt_orig, pt_dest)), crs=4326)
# get the intersection point between two features
pt_int<-st_intersection(ls, b_ls)
# get the distance
d<-st_distance(pt, pt_int)
result[i,"distance"]<-d
}
I stuck as much as possible to the "sf" approach, which gives the following warning inside the for loop when st_intersection() is executed: "although coordinates are longitude/latitude, st_intersection assumes that they are planar".
But considering the short distances I'm working with, it seems to me an acceptable approximation.
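If the warning is a concern, one option is to do the intersection in a projected CRS; here is a sketch of the modified loop body, under the assumption that UTM zone 33N (EPSG:32633) is an appropriate choice for this area:
# transform both features to a metric CRS before intersecting
ls_utm <- st_transform(ls, 32633)
b_ls_utm <- st_transform(b_ls, 32633)
pt_int <- st_intersection(ls_utm, b_ls_utm)
d <- st_distance(st_transform(pt, 32633), pt_int)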
By the way, as far as I understand, there is no function corresponding to geosphere::destPoint within the package "sf".
thanks

I would like to work out the distance of data points (lat/long) from the edges of a shape file in R and then apply a criterion to the data points?

I have data points of a species observed using camera traps and would like to measure the distance of each camera trap site (CameraStation) to the edge of a national park using R. I have a shapefile of the park (shp) and want to apply a criterion to CameraStations that are <5 km from the edge. My data frame (df) consists of multiple events/observations (EventID) per CameraStation. The aim is to analyse when events near the park edge are most frequent, given other environmental factors such as Season, Moon Phase and DayNight (also columns in df).
I found a package called distance in R, but this is for distance sampling and not what I want to do. Which package is relevant in this situation?
I expect the following outcome:
EventID CameraStation Distance(km) Within 5km
0001 Station 1 4.3 Yes
0002 Station 1 4.3 Yes
0003 Station 2 16.2 No
0004 Station 3 0.5 Yes
...
Here's a general solution, adapted from Spacedman's answer to this question at gis.stackexchange. Note: this solution requires working in a projected coordinate system. You can transform to a projected CRS if needed using spTransform.
The gDistance function of the rgeos package calculates the distance between geometries, but for the case of points inside a polygon the distance is zero. The trick is to create a new "mask" polygon where the original polygon is a hole cut out from the mask. Then we can measure the distance between points in the hole and the mask, which is the distance to the edge of the original polygon that we really care about.
We'll use the shape file of the Yellowstone National Park Boundary found on this page.
library(sp) # for SpatialPoints and proj4string
library(rgdal) # to read shapefile with readOGR
library(rgeos) # for gDistance, gDifference, and gBuffer
# ab67 was the name of the shape file I downloaded.
yellowstone.shp <- readOGR("ab67")
# gBuffer enlarges the boundary of the polygon by the amount specified by `width`.
# The units of `width` (meters in this case) can be found in the proj4string
# for the polygon.
yellowstone_buffer <- gBuffer(yellowstone.shp, width = 5000)
# gDifference calculates the difference between the polygons, i.e. what's
# in one and not in the other. That's our mask.
mask <- gDifference(yellowstone_buffer, yellowstone.shp)
# Some points inside the park
pts <- list(x = c(536587.281264245, 507432.037861251, 542517.161278414,
477782.637790409, 517315.171218198),
y = c(85158.0056377799, 77251.498952222, 15976.0721391485,
40683.9055315169, -3790.19457474617))
# Sanity checking the mask and our points.
plot(mask)
points(pts)
# Put the points in a SpatialPointsDataFrame with camera id in a data field.
spts.df <- SpatialPointsDataFrame(pts, data = data.frame(Camera = ordered(1:length(pts$x))))
# Give our SpatialPointsDataFrame the same spatial reference as the polygon.
proj4string(spts.df) <- proj4string(yellowstone.shp)
# Calculate distances (km) from points to edge and put in a new column.
spts.df$km_to_edge <- apply(gDistance(spts.df, mask, byid=TRUE), 2, min)/1000
# Determine which records are within 5 km of an edge and note in new column.
spts.df$edge <- ifelse(spts.df$km_to_edge < 5, TRUE, FALSE)
# Results
spts.df
# coordinates Camera km_to_edge edge
# 1 (536587.3, 85158.01) 1 1.855010 TRUE
# 2 (507432, 77251.5) 2 9.762755 FALSE
# 3 (542517.2, 15976.07) 3 11.668700 FALSE
# 4 (477782.6, 40683.91) 4 4.579638 TRUE
# 5 (517315.2, -3790.195) 5 8.211961 FALSE
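If your camera stations start out as lat/long coordinates rather than projected ones, here is a sketch of the spTransform step mentioned above (the lon and lat vectors are hypothetical placeholders for your station coordinates):
library(sp)
library(rgdal)
stations_ll <- SpatialPoints(cbind(lon, lat),
                             proj4string = CRS("+proj=longlat +datum=WGS84"))
# reproject to the same CRS as the park polygon
stations_proj <- spTransform(stations_ll, CRS(proj4string(yellowstone.shp)))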
Here's a quick solution.
Simplify the outline of your shapefile into N points. Then calculate the minimum distance for each camera trap to every point in the outline of the national park.
library(sp)        # for spsample and coordinates
library(geosphere) # for distm and distHaversine
n <- 500 ## The number of points summarizing the shapefile
NPs <- ## Your shapefile goes here
NP.pts <- spsample(NPs, n = n, type = "regular")
CP.pts <- ## Coordinates for a single trap
distances <- distm(coordinates(CP.pts), coordinates(NP.pts), fun = distHaversine)/1000
##Distance in Km between the trap to each point in the perimeter of the shapefile:
distances
Use distances to find the minimum distance between the shapefile and that given trap. This approach can easily be generalized using for loops or apply functions.
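For example, a sketch assuming a single SpatialPoints object traps that holds all camera stations (a hypothetical name):
# distance matrix: one row per trap, one column per sampled park point (NP.pts)
d_mat <- distm(coordinates(traps), coordinates(NP.pts), fun = distHaversine)/1000
# minimum distance (km) from each trap, and the <5 km criterion
min_d <- apply(d_mat, 1, min)
within5 <- min_d < 5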
I had a problem with my points data frame and shapefile not being projected, so instead I used the example in this link to answer my question:
https://gis.stackexchange.com/questions/225102/calculate-distance-between-points-and-nearest-polygon-in-r
Basically, I used this code:
df2 # my data frame with points
shp # my shapefile (non-projected)
coordinates(df2) <- ~Longitude+Latitude # Longitude and Latitude are columns in my df
dist.mat <- geosphere::dist2Line(p = df2, line = shp)
dmat <- data.frame(dist.mat) # turned it into a data frame
dmat$km5 <- ifelse(dmat$distance < 5000, TRUE, FALSE) # in meters (5000)
coordinates(dmat) <- ~lon+lat
df2$distance <- dmat$distance # added the new distance column to my df

How to calculate road network distances between a reference line (or point) and a dataframe of long/lat points with R?

I want to calculate road network distances between a reference line (or a reference point if a single point facilitates the possible solution) and a dataframe of long/lat points. I have the following data frame:
Latitude Longitude
1 40.66858 22.88713
2 40.66858 22.88713
3 40.66858 22.88713
4 40.66858 22.88713
5 40.66858 22.88714
6 40.66857 22.88715
7 40.66858 22.88716
8 40.66858 22.88717
9 40.66859 22.88718
10 40.66861 22.88719
and the following reference line with start/end coordinates:
22.88600 40.66885
22.88609 40.66880
(If we want a single reference point in the middle of the line (instead of the whole line) its coordinates are: 22.88602844465866,40.66883357487465)
Here is a screenshot from google earth after plotting the points and the line:
I have tried to compute the distance of each point to the reference line in the following way:
dist2Line(points, line, distfun=distHaversine) #from geosphere package
The distance which is computed (e.g. for the first point) is the one shown with the yellow line in the following screenshot. The desired one is the one shown with the red line (the road network distance). How can I solve this? I want to compute the road network distances for all points!
Thank you in advance!
library(sp)
library(rgeos)
library(geosphere)
Let's join the midpoint of your line to the other line:
pt1 <- matrix(c(22.88600, 40.66885), ncol=2)
pt2 <- matrix(c(22.88609, 40.66880), ncol=2)
midpt <- as.data.frame(midPoint(pt1, pt2))
NOTE: The first 4 line points are the same in your supplied data
read.csv(text="lat,lon
40.66858,22.88713
40.66858,22.88713
40.66858,22.88713
40.66858,22.88713
40.66858,22.88714
40.66857,22.88715
40.66858,22.88716
40.66858,22.88717
40.66859,22.88718
40.66861,22.88719", stringsAsFactors = FALSE) -> l
l <- rbind.data.frame(midpt, l)
Using the midpoint on the line isn't perfect so you could use the spatial intersection operations as well to find the correct intersecting point.
Now, make it a spatial object and give it the boring longlat "projection".
l <- SpatialLines(list(Lines(Line(l[,2:1]), "1")), proj4string = CRS("+proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0"))
Convert said "projection" to something meaningful (I picked EPSG:3265, but choose whatever you want so you can get real distance):
l <- spTransform(l, CRS("+init=epsg:3265"))
Get the points from the line:
pts <- as(l, "SpatialPoints")
Follow How to calculate geographic distance between two points along a line in R? to get the distance between points which you can do the rest from there:
diff(sort(gProject(l, pts, normalized = FALSE)))
## [1] 372.553928 0.000000 0.000000 0.000000 3.360954 4.581859
## [7] 4.581860 3.360956 4.581862 7.077129
It'd be 👍🏼 if someone who knows how to do this with sf could do that as well since I couldn't find a gProject equivalent.
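One possible sf analogue, offered only as a sketch: newer sf versions provide st_line_project(), which (like gProject) projects points onto a line and returns the distance along it in projected coordinates. Assuming that function is available in your sf build, and reusing the projected sp line l built above:
library(sf)
l_sf <- st_as_sfc(l)                 # convert the projected SpatialLines to sfc
pts_sf <- st_cast(l_sf, "POINT")     # the vertices of the line
d_along <- sapply(seq_along(pts_sf), function(i) st_line_project(l_sf, pts_sf[i]))
diff(sort(d_along))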

Create Grid in R for kriging in gstat

lat long
7.16 124.21
8.6 123.35
8.43 124.28
8.15 125.08
Consider these coordinates; they correspond to weather stations that measure rainfall data.
The intro to the gstat package in R uses the meuse dataset. At some point in this tutorial, https://rpubs.com/nabilabd/118172, the author makes use of a "meuse.grid" in this line of code:
data("meuse.grid")
I do not have such a file and I do not know how to create it. Can I create one using these coordinates? Or can you at least point me to material that discusses how to create a custom grid for a custom area (i.e. not using administrative boundaries from GADM)?
Probably wording this wrong, don't even know if this question makes sense to R savvy people. Still, would love to hear some direction, or at least tips. Thanks a lot!
Total noob at R and statistics.
EDIT: See the sample grid in the tutorial I posted; that's the thing I want to make.
EDIT 2: Would this method be viable? https://rstudio-pubs-static.s3.amazonaws.com/46259_d328295794034414944deea60552a942.html
I am going to share my approach to create a grid for kriging. There are probably more efficient or elegant ways to achieve the same task, but I hope this will be a start to facilitate some discussions.
The original poster was thinking about 1 km for every 10 pixels, but that is probably too much. I am going to create a grid with a cell size of 1 km * 1 km. In addition, the original poster did not specify an origin for the grid, so I will spend some time determining a good starting point. I also assume that the Spherical Mercator projected coordinate system is the appropriate choice. This is a common projection for Google Maps or OpenStreetMap.
1. Load Packages
I am going to use the following packages. sp, rgdal, and raster are packages that provide many useful functions for spatial analysis. leaflet and mapview are packages for quick exploratory visualization of spatial data.
# Load packages
library(sp)
library(rgdal)
library(raster)
library(leaflet)
library(mapview)
2. Exploratory Visualization of the station locations
I created an interactive map to inspect the location of the four stations. Because the original poster provided the latitude and longitude of these four stations, I can create a SpatialPointsDataFrame with Latitude/Longitude projection. Notice the EPSG code for Latitude/Longitude projection is 4326. To learn more about EPSG code, please see this tutorial (https://www.nceas.ucsb.edu/~frazier/RSpatialGuides/OverviewCoordinateReferenceSystems.pdf).
# Create a data frame showing the Latitude/Longitude
station <- data.frame(lat = c(7.16, 8.6, 8.43, 8.15),
long = c(124.21, 123.35, 124.28, 125.08),
station = 1:4)
# Convert to SpatialPointsDataFrame
coordinates(station) <- ~long + lat
# Set the projection. They were latitude and longitude, so use WGS84 long-lat projection
proj4string(station) <- CRS("+init=epsg:4326")
# View the station location using the mapview function
mapview(station)
The mapview function will create an interactive map. We can use this map to determine what could be a suitable origin for the grid.
3. Determine the origin
After inspecting the map, I decided that the origin could be around longitude 123 and latitude 7. This origin will be on the lower left of the grid. Now I need to find the coordinate representing the same point under Spherical Mercator projection.
# Set the origin
ori <- SpatialPoints(cbind(123, 7), proj4string = CRS("+init=epsg:4326"))
# Convert the projection of ori
# Use EPSG: 3857 (Spherical Mercator)
ori_t <- spTransform(ori, CRSobj = CRS("+init=epsg:3857"))
I first created a SpatialPoints object based on the latitude and longitude of the origin. After that I used the spTransform to perform project transformation. The object ori_t now is the origin with Spherical Mercator projection. Notice that the EPSG code for Spherical Mercator is 3857.
To see the value of coordinates, we can use the coordinates function as follows.
coordinates(ori_t)
coords.x1 coords.x2
[1,] 13692297 781182.2
4. Determine the extent of the grid
Now I need to decide the extent of the grid that can cover all four points and the desired area for kriging, which depends on the cell size and the number of cells. The following code sets up the extent based on this information. I have decided that the cell size is 1 km * 1 km, but I need to experiment with what would be a good cell count in both the x- and y-directions.
# The origin has been rounded to the nearest 100
x_ori <- round(coordinates(ori_t)[1, 1]/100) * 100
y_ori <- round(coordinates(ori_t)[1, 2]/100) * 100
# Define how many cells for x and y axis
x_cell <- 250
y_cell <- 200
# Define the resolution to be 1000 meters
cell_size <- 1000
# Create the extent
ext <- extent(x_ori, x_ori + (x_cell * cell_size), y_ori, y_ori + (y_cell * cell_size))
Based on the extent I created, I can create a raster layer with all values equal to 0. Then I can use the mapview function again to see if the raster and the four stations match up well.
# Initialize a raster layer
ras <- raster(ext)
# Set the resolution to 1000 m * 1000 m
res(ras) <- c(cell_size, cell_size)
ras[] <- 0
# Project the raster
projection(ras) <- CRS("+init=epsg:3857")
# Create interactive map
mapview(station) + mapview(ras)
I repeated this process several times. Finally I decided that the number of cells is 250 and 200 for the x- and y-directions, respectively.
5. Create spatial grid
Now I have created a raster layer with proper extent. I can first save this raster as a GeoTiff for future use.
# Save the raster layer
writeRaster(ras, filename = "ras.tif", format="GTiff")
Finally, to use the kriging functions from the package gstat, I need to convert the raster to SpatialPixels.
# Convert to spatial pixel
st_grid <- rasterToPoints(ras, spatial = TRUE)
gridded(st_grid) <- TRUE
st_grid <- as(st_grid, "SpatialPixels")
st_grid is a SpatialPixels object that can be used in kriging.
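As a rough illustration of that last step, a sketch only: rain and its rainfall column are hypothetical stand-ins for the station measurements, reprojected to the same EPSG:3857 CRS as the grid.
library(gstat)
# rain: hypothetical SpatialPointsDataFrame of station rainfall in EPSG:3857
v  <- variogram(rainfall ~ 1, rain)
vm <- fit.variogram(v, vgm("Sph"))
kr <- krige(rainfall ~ 1, rain, newdata = st_grid, model = vm)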
This is an iterative process for determining a suitable grid. Throughout the process, users can change the projection, origin, cell size, or cell number, depending on the needs of their analysis.
@yzw and @Edzer bring up good points for creating a regular rectangular grid, but sometimes there is the need to create an irregular grid over a defined polygon, usually for kriging.
This is a sparsely documented topic. One good answer can be found here. I expand on it with code below:
Consider the built-in meuse dataset. meuse.grid is an irregularly shaped grid. How do we make a grid like meuse.grid for our own study area?
library(sp)
library(ggplot2) # for plotting the grid points
data(meuse.grid)
ggplot(data = meuse.grid) + geom_point(aes(x, y))
Imagine an irregularly shaped SpatialPolygon or SpatialPolygonsDataFrame, called spdf. You first build a regular rectangular grid over it, then subset the points in that regular grid by the irregularly-shaped polygon.
# First, make a rectangular grid over your `SpatialPolygonsDataFrame`
grd <- makegrid(spdf, n = 100)
colnames(grd) <- c("x", "y")
# Next, convert the grid to `SpatialPoints` and subset these points by the polygon.
grd_pts <- SpatialPoints(
coords = grd,
proj4string = CRS(proj4string(spdf))
)
# subset all points in `grd_pts` that fall within `spdf`
grd_pts_in <- grd_pts[spdf, ]
# Then, visualize your clipped grid which can be used for kriging
ggplot(as.data.frame(coordinates(grd_pts_in))) +
geom_point(aes(x, y))
If you have your study area as a polygon, imported as a SpatialPolygons, you could either use package raster to rasterize it, or use sp::spsample to sample it using sampling type regular.
If you don't have such a polygon, you can create points regularly spread over a rectangular long/lat area using expand.grid, using seq to generate a sequence of long and lat values.
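For the second case, a minimal sketch (the bounding box and the 0.01-degree spacing are arbitrary choices meant to cover the four stations above):
library(sp)
# regular long/lat grid covering the four stations
grd <- expand.grid(long = seq(123.0, 125.5, by = 0.01),
                   lat  = seq(7.0, 9.0, by = 0.01))
coordinates(grd) <- ~long + lat
gridded(grd) <- TRUE
proj4string(grd) <- CRS("+init=epsg:4326")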

location data format for adehabitat package

I have a file in this format:
ASCII format
The first rows look like this:
ncols 1440
nrows 720
xllcorner -180.0
yllcorner -90
cellsize 0.25
NODATA_value -9999
Basically I have the world with 1440 'tiles' in the x direction (longitude) and 720 'tiles' in the y direction (latitude). Each 'tile' is a square with a side length of 0.25 degrees. I think I have xllcorner and yllcorner correct. I can draw this map like this in R:
library("adehabitat")
bio1 <- import.asc("D:/ENFA/data.asc")
maps <- as.kasc(list(data = bio1))
image(maps, col = cm.colors(256))
The map looks fine.
I would like to perform some ecological niche factor analysis (ENFA) using the adehabitat package and am not too sure about the location data. Basically I have them as longitudes and latitudes at the moment, but I could also generate them as a 'tile index' (e.g. the lower left corner has latitude -90 and longitude -180, so the 'tile index' would be 0, 0 - right?). Which is the correct location data format? I would use ENFA code like this:
locs <- read.table("D:/ENFA/Locs.txt", header = TRUE, sep="\t")
dataenfa1 <- data2enfa(maps, locs)
pc <- dudi.pca(dataenfa1$tab, scannf = FALSE)
enfa1 <- enfa(pc, dataenfa1$pr,scannf = FALSE)
hist(enfa1)
I would appreciate any comments please. Thanks in advance.
The problem with leaving your coordinates in lat-long form is that, at most places on earth, a degree of longitude has a different length than a degree of latitude. This might distort your ENFA by exaggerating distances in some directions relative to those in others.
Especially if your data are from a relatively small area, I'd suggest re-expressing the coordinates in meters along a W/E x-axis and S/N y-axis. If all of your points fall inside a single UTM zone, then you could do the conversion within R, using project() in the rgdal package.
Here's one example, taken from here:
library(rgdal)
# Make a two-column matrix, col1 = long, col2 = lat
xy <- cbind(c(118, 119), c(10, 50))
# Convert it to UTM coordinates (in units of meters)
project(xy, "+proj=utm +zone=51 +ellps=WGS84")
[,1] [,2]
[1,] -48636.65 1109577
[2,] 213372.05 5546301
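Applied to your own locations, a sketch under a couple of assumptions: that Locs.txt has columns named long and lat, and that all of your points fall in one UTM zone (zone 33 here is just a placeholder).
library(rgdal)
locs <- read.table("D:/ENFA/Locs.txt", header = TRUE, sep = "\t")
# hypothetical column names; pick the UTM zone that actually covers your data
locs_utm <- project(cbind(locs$long, locs$lat), "+proj=utm +zone=33 +ellps=WGS84")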
Much more info about how to manipulate spatial data is available in "Applied Spatial Data Analysis with R" by Bivand, Pebesma, and Gomez-Rubio. If you need more specific assistance, try the R-sig-Geo mailing list.
Hope this helps.
Maybe you want to convert the coordinates into GHAM (Global, Hierarchical, Alphanumeric, and Morton-encoded), which represents the globe by cells of arbitrary precision (as fine or coarse as you wish), so any lat/lon has a single alphanumeric address that remains sortable.
Here's the abstract from GHAM: A compact global geocode suitable for sorting, by Duncan Agnew:
The GHAM code is a technique for labeling geographic locations based
on their positions. It defines addresses for equal-area cells bounded
by constant latitude and longitude, with arbitrarily fine precision.
The cell codes are defined by applying Morton ordering to a recursive
division into a 16 by 16 grid, with the resulting numbers encoded into
letter–number pairs. A lexical sort of lists of points so labeled will
bring near neighbors (usually) close together; tests on a variety of
global datasets show that in most cases the actual closest point is
adjacent in the list 50% of the time, and within 5 entries 80% of the
time.
Source code is in the IAMG repository, but if you can't access it, I'm sure he would provide it.
