I am working with a dataset that features chemical analyses from different locations within a cave, with each analysis ordered by a site number and that site's latitude and longitude. This first image is what I had done originally using just ggplot.
Map of site data, colored by N concentration
But what I want to do is use the shapefile of the cave system from which the data is sourced and do something similar: plot the points over the system and then color them by concentration. Below is the shapefile that I uploaded:
Cave system shapefile
So basically I want to map the chemical data from the dataset used for the first figure, but on the map from the shapefile. Initially R kept telling me that it could not plot the points on top of it, so I figured I had to convert the latitude and longitude into spatial coordinates that could then be mapped onto the shapefile.
Master_Cave_data <- Master_cave_data %>%
  st_as_sf(MastMaster_cave_data, agr = "identity", coord = Lat_DD)
This was what I had thought to use to convert the numeric latitude coordinates into spatial data.
I assume your coordinates are in the WGS84 system (CRS code 4326). You can create your sf object the following way:
Master_Cave_data <- st_as_sf(Master_cave_data, coords = c('lon', 'lat'), crs = 4326)
Change lon and lat to the relevant column names. To plot your points with your shapefile, you need them both in the same projection system, so reproject if needed:
Master_Cave_data <- Master_Cave_data %>% st_transform(st_crs(shapefile))
Example
Borrowed from there
df <- data.frame(place = "London",
lat = 51.5074, lon = 0.1278,
population = 8500000) # just to add some value that is plotable
crs <- 4326
df <- st_as_sf(x = df,
               coords = c("lon", "lat"),
               crs = crs)
And you can have a look at the map:
library(tmap)
data("World")
tm_shape(World[World$iso_a3 == "GBR", ]) + tm_polygons("pop_est") +
tm_shape(df) + tm_bubbles("population")
Related
I have the following polygon, defined using degrees latitude/longitude:
## Define latitude/longitude
lats <- c(64.25086, 64.24937, 63.24105, 63.22868)
lons <- c(-140.9985, -136.9171, -137.0050, -141.0260)
df <- data.frame(lon = lons, lat = lats)
polygon <- df %>%
  ## EPSG 3578; Yukon Albers projection
  st_as_sf(coords = c('lon', 'lat'), crs = 3578) %>%
  summarise(geometry = st_combine(geometry)) %>%
  st_cast('POLYGON')
When I plot it on a map using tmap, it appears in the Pacific Ocean off the coast of British Columbia, rather than in the middle of the Yukon:
library(sf)
library(sp)
library(tmap)
library(dplyr)
library(magrittr)
library(leaflet)
m <- tm_shape(data$study_boundary) + tm_borders(col = 'black',
                                                lwd = 5,
                                                zindex = 1000)
m
I am guessing that the problem is in using lat/long rather than UTMs because I have other polygons defined using UTMs that do appear where they (and the polygon defined above) are supposed to be. I found several other posts going the other way (UTM to lat/long) using spTransform, but I haven't been able to go lat/long to UTM with spTransform. I tried the code below:
poly_utm <- st_transform(polygon, crs = "+proj=utm+7")
But that didn't work either.
Thanks!
This (which I've improved by removing the pipe):
st_as_sf(df, coords = c('lon', 'lat'), crs = 3578)
creates a spatial points data frame using the numbers in the data frame for the coordinates, and the crs code of 3578 as the label for what those numbers represent. It does not change the numbers.
It looks like those numbers are actually lat-long coordinates, which means they are probably CRS code 4326, the lat-long system used by GPS, also known as WGS 84. They might not be, but they probably are; do check. So anyway, you should do:
df_unprojected = st_as_sf(df, coords = c('lon', 'lat'), crs = 4326)
df_projected = st_transform(df_unprojected, 3578)
The st_transform function does the actual change of the coordinate numbers and assigns the new CRS code to the spatial data metadata. That should give you a set of points you can then plot and check they are in the right place before you throw it into summarise and st_cast.
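If you actually want the polygon in UTM zone 7 (what the "+proj=utm+7" attempt was aiming for), a minimal sketch, assuming the points are in the northern hemisphere so that EPSG 32607 (WGS 84 / UTM zone 7N) is the right code:
# reproject the corrected lat-long points to UTM zone 7N (EPSG 32607)
df_utm <- st_transform(df_unprojected, crs = 32607)
# then rebuild the polygon as before
poly_utm <- df_utm %>%
  summarise(geometry = st_combine(geometry)) %>%
  st_cast('POLYGON')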
I'm working with a dataframe containing longitude and latitude for each point, and a shapefile containing mutually exclusive polygons. I would like to find the index of the polygon in which each point is contained. Is there a specific function that helps me achieve this? I've been trying with the sf package, but I'm open to doing it with another one. Any help is greatly appreciated.
I believe you may be looking for the function sf::st_intersects() - in combination with the sparse = TRUE setting it returns a list, which in this use case (points and a set of non-overlapping polygons) can easily be converted to a vector.
Consider this example, built on the North Carolina shapefile shipped with {sf}
library(sf)
# North Carolina shapefile included with the sf package
shape <- st_read(system.file("shape/nc.shp", package = "sf")) %>%
  st_transform(4326) # WGS84 is a good default
# three semi-random cities
cities <- data.frame(name = c("Raleigh", "Greensboro", "Wilmington"),
                     x = c(-78.633333, -79.819444, -77.912222),
                     y = c(35.766667, 36.08, 34.223333)) %>%
  st_as_sf(coords = c("x", "y"), crs = 4326) # again, WGS84
# plot cities on full map
plot(st_geometry(shape))
plot(cities, add = T, pch = 4)
# this is your index
index_of_intersection <- st_intersects(cities, shape, sparse = T) %>%
as.numeric()
# plot on subsetted map to doublecheck
plot(st_geometry(shape[index_of_intersection, ]))
plot(cities, add = T, pch = 4)
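One caveat: converting the sparse list with as.numeric() only works cleanly when every point matches exactly one polygon. A small defensive sketch, in case a point falls outside all polygons (zero matches) or on a shared boundary (two matches):
# take the first matching polygon per point, or NA when there is none
index_of_intersection <- st_intersects(cities, shape, sparse = TRUE) %>%
  sapply(function(ix) if (length(ix) >= 1) ix[1] else NA_integer_)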
I've read through many forums regarding this topic but I can't seem to adapt anything I've read to my particular question. Basically, I have a data frame of lat/lon values and all I want to do is test whether these coordinates exist within California.
Here is some example data:
library(tidyverse)
library(sf)
coords <- tribble(
  ~city, ~lon,      ~lat,
  "LA",  -118.2437, 34.0522,
  "SF",  -122.4194, 37.7749,
  "SAC", -121.4944, 38.5816,
  "CHI", -87.6298,  41.8781,
  "NY",  -74.0060,  40.7128
)
And here is a link to the shape files from the state website: CA Shape Files.
I think I'm close...
# read in shape data
cali <- read_sf("CA_State_TIGER2016.shp")
# convert coordinates to spatial point compatible data
coords_sf <- st_as_sf(coords, coords = c("lon", "lat"), crs = st_crs(cali))
From there, I assume I use st_contains to test whether my cali object contains the coordinates found in coords_sf but I can't get it to work.
Any advice?
Thanks for your help!
In your code, there is a confusion between the original coordinate reference system of your point dataset coords and the crs you want to apply to it.
Note that your dataset named coords is not a spatial dataset; you need to make it one with st_as_sf(). The CRS of the coordinates you entered in this dataframe is "geographical coordinates" (EPSG 4326).
Once it is a spatial dataset, you can then transform it to the target CRS.
In your code, you tried to do both at the same time.
Hence the answer you are looking for is:
library(tidyverse)
library(sf)
coords <- tribble(
  ~city, ~lon,      ~lat,
  "LA",  -118.2437, 34.0522,
  "SF",  -122.4194, 37.7749,
  "SAC", -121.4944, 38.5816,
  "CHI", -87.6298,  41.8781,
  "NY",  -74.0060,  40.7128
)
file <- tempfile(fileext = ".zip")
download.file("https://data.ca.gov/dataset/e212e397-1277-4df3-8c22-40721b095f33/resource/3db1e426-fb51-44f5-82d5-a54d7c6e188b/download/ca-state-boundary.zip", destfile = file)
unzip(zipfile = file)
# read in shape data
cali <- read_sf("CA_State_TIGER2016.shp")
# Your data are originally geographical coordinates, which is EPSG 4326
coords_sf <- st_as_sf(coords, coords = c("lon", "lat"), crs = 4326)
# Then you can transform them to the system you want
coords_cali <- coords_sf %>% st_transform(crs = st_crs(cali))
cali %>% st_contains(coords_cali)
If you want to add the information from the cali shapefile to your point dataset, you can either:
Keep the entire point dataset, with NA for points outside the polygon:
coords_cali %>%
  st_join(cali)
Or keep only the points that are inside the cali polygon:
coords_cali %>%
  st_intersection(cali)
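And if all you need is the original yes/no test of whether each coordinate falls within California, a minimal sketch building on the objects above:
# TRUE when the point intersects the California polygon, FALSE otherwise
coords$in_california <- lengths(st_intersects(coords_cali, cali)) > 0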
The sf package provides a great approach to working with geographic features, but I can't figure out a simple equivalent to the poly.counts function from the GISTools package, which requires sp objects.
poly.counts computes the number of points from a SpatialPointsDataFrame that fall within the polygons of a SpatialPolygonsDataFrame, and can be used as follows:
Data
## Libraries
library("GISTools")
library("tidyverse")
library("sf")
library("sp")
library("rgdal")
## Obtain shapefiles
dir.create("data-raw", showWarnings = FALSE)
download.file(url = "https://www2.census.gov/geo/tiger/TIGER2016/STATE/tl_2016_us_state.zip", destfile = "data-raw/states.zip")
unzip(zipfile = "data-raw/states.zip", exdir = "data-raw/states")
sf_us_states <- read_sf("data-raw/states")
## Our observations:
observations_tibble <- tribble(
  ~lat,      ~long,
  31.968599, -99.901813,
  35.263266, -80.854385,
  35.149534, -90.04898,
  41.897547, -84.037166,
  34.596759, -86.965563,
  42.652579, -73.756232,
  43.670406, -93.575858
)
Calculate points per polygon
I generate both my sp objects:
sp_us_states <- as(sf_us_states, "Spatial")
observations_spdf <- observations_tibble %>%
  select(long, lat) %>% # SPDF wants long, lat pairs
  SpatialPointsDataFrame(coords = .,
                         data = .,
                         proj4string = sp_us_states@proj4string)
Now I can use poly.counts
points_in_states <- poly.counts(pts = observations_spdf, polys = sp_us_states)
Add this into the sp object:
sp_us_states$points.in.state <- points_in_states
Once finished, I'd convert back to an sf object and could visualise as follows:
library("leaflet")
updated_sf <- st_as_sf(sp_us_states)
updated_sf %>%
  filter(points.in.state > 0) %>%
  leaflet() %>%
  addPolygons() %>%
  addCircleMarkers(data = observations_tibble)
Question
Can I perform this operation without tedious conversion between sf and sp objects?
Try the following:
sf_obs = st_as_sf(observations_tibble, coords = c("long", "lat"),
                  crs = st_crs(sf_us_states))
lengths(st_covers(sf_us_states, sf_obs))
# check:
summary(points_in_states - lengths(st_covers(sf_us_states, sf_obs)))
st_covers returns a list of the indexes of the points covered by each state; lengths returns the vector of the lengths of these vectors, i.e. the point count. The warnings you'll see indicate that although you have geographic coordinates, the underlying software assumes they are Cartesian (which, for this case, will most likely not be problematic; move to projected coordinates if you want to get rid of the warning the proper way).
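A minimal sketch of that "proper way", assuming a projected CRS such as EPSG 5070 (NAD83 / Conus Albers) is acceptable for the contiguous US:
# reproject both layers so the covers test is genuinely planar
sf_us_states_proj <- st_transform(sf_us_states, 5070)
sf_obs_proj <- st_transform(sf_obs, 5070)
lengths(st_covers(sf_us_states_proj, sf_obs_proj))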
I'm a new R user, so I'm not quite comfortable with the language yet.
I'm attempting to plot the locations of bird records on a map of Manchester, England.
I have managed to create a map with the following code:
mymap<-get_map(c(lon=53.46388,lat=-2.294037),zoom=3,col="bw")
I have read my spreadsheet in as an .xlsx file from Excel via gdata, with the columns containing longitude and latitude assigned to Lon and Lat.
I seem to be able to qplot Lon and Lat, but not as a layer on the map; when I attempt this I get the following error:
Error: ggplot2 doesn't know how to deal with data of class list
I've now tried so many combinations of code that it would be impossible for me to offer a single demonstrative line of how I'm attempting to attach the data to my map, and I have followed tutorials online to no avail. Is it a problem in my xlsx file?
Edited: sample code:
#Here is what Jamie Dunning tried:
require(ggmap)
origin<-c("Worsley,Salford","Elton reservoir","Etherow country park","Blackleach country park","Low Hall,LNR, Wigan","Cheadle royal","Rhodes lodges,Middleton","Persons flash,Wigan","Sale water park","Plattfields","Higher Boarshaw","Clifton country park","Horrocks flash")
ringing.origins<-geocode(origin)
map <- c(get_map("Greater Manchester"))
swans.coor <- cbind(ringing.origins$lon, ringing.origins$lat)
I'm yet to have an example where they are plotted successfully.
Another alternative using plotGoogleMaps
1- Get coordinates
require(ggmap)
#List places to find GPS coordinates for:
origin<-c("Worsley,Salford","Elton reservoir","Etherow country park","Elton reservoir","Blackleach country park","Low Hall,LNR, Wigan","Cheadle royal","Rhodes lodges,Middleton","Persons flash,Wigan","Sale water park","Plattfields","Higher Boarshaw","Clifton country park","Horrocks flash")
#Get coordinates via geocode function
ringing.origins<-geocode(origin)
#Put the place names in a data frame to serve as the attribute data of the SP object later
df <- as.data.frame(origin)
row.names(df) <- 1:nrow(df)
2- Create Spatial Object
require(sp)
#Coordinates must be numeric, bound together as one matrix, and the rows numbered:
ringing.origins$lon <- as.numeric(ringing.origins$lon)
ringing.origins$lat <- as.numeric(ringing.origins$lat)
coord <- cbind(ringing.origins$lon,ringing.origins$lat)
row.names(coord) <- 1:nrow(coord)
#Define a mapping projection
AACRS <- CRS("+proj=longlat +ellps=WGS84")
#Creating a spatial object from "df" using the bound coordinates "coord":
Map2 <- SpatialPointsDataFrame(coord, df, proj4string = AACRS, match.ID = TRUE)
3- Create an interactive HTML Google map:
require(plotGoogleMaps)
#Simple Map
plotGoogleMaps(Map2)
#Map with some options, filename creates a file in the working directory.
plotGoogleMaps(Map2, mapTypeId="ROADMAP", colPalette="red", legend=FALSE, filename="Swan_Map.htm")
Plotting using ggmap
require(ggmap)
#Get your coordinates
origin<-c("Worsley,Salford","Elton reservoir","Etherow country park","Elton reservoir","Blackleach country park","Low Hall,LNR, Wigan","Cheadle royal","Rhodes lodges,Middleton","Persons flash,Wigan","Sale water park","Plattfields","Higher Boarshaw","Clifton country park","Horrocks flash")
ringing.origins<-geocode(origin)
#Map of Greater Manchester
map<-get_map("Greater Manchester")
ggmap(map, extent = 'normal') +
  geom_point(aes(x = lon, y = lat), data = ringing.origins)
#Box is too small...
#Bounding box with all points
mymap <- get_map(c(lon = -2.294037, lat = 53.46388), zoom = 10)
ggmap(mymap, extent = 'device') +
  geom_point(aes(x = lon, y = lat), data = ringing.origins, alpha = 1)
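If you also want the place names on the final map, a small sketch, assuming geocode() returned one row per element of origin, in the same order (the name column is added here purely for illustration):
#attach the place names so they can be used as labels
ringing.origins$name <- origin
ggmap(mymap, extent = 'device') +
  geom_point(aes(x = lon, y = lat), data = ringing.origins) +
  geom_text(aes(x = lon, y = lat, label = name), data = ringing.origins,
            vjust = -0.8, size = 3)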