rayshader: Plot coordinates of interest using ggplot2 doesn't work

I'd like to make a 3D plot using the rayshader package and add some coordinates of interest, but it doesn't work.
If I run the DEM elevation example from rayshader, everything is OK:
library(rayshader)
library(ggplot2)
library(raster)
#Here, I load a map with the raster package.
loadzip = tempfile()
download.file("https://tylermw.com/data/dem_01.tif.zip", loadzip)
localtif = raster::raster(unzip(loadzip, "dem_01.tif"))
unlink(loadzip)
#And convert it to a matrix:
elmat = raster_to_matrix(localtif)
# Plot elevation
ggelmat = elmat %>%
reshape2::melt() %>%
ggplot() +
geom_tile(aes(x=Var1,y=Var2,fill=value)) +
scale_x_continuous("X",expand = c(0,0)) +
scale_y_continuous("Y",expand = c(0,0)) +
coord_fixed()
ggelmat
plot_gg(ggelmat, multicore = TRUE, raytrace = TRUE, width = 7, height = 4,
scale = 300, windowsize = c(1400, 866), zoom = 0.6, phi = 30, theta = 30, show.legend = FALSE)
render_snapshot(clear = TRUE)
But if I create some random points and try to put them inside the plot:
#Create some 100 random coordinates
# grab 100 cell index numbers at random
samp <- sample(localtif, 100, replace = FALSE)
# and their location coordinates
samplocs <- xyFromCell(localtif, samp)
samplocs.df<-as.data.frame(samplocs)
pts<-spsample(localtif, 100, type = 'random')
pts.df<-as.data.frame(pts)
# Plot elevation with some coordinates
ggelmat2 = elmat %>%
reshape2::melt() %>%
ggplot() +
geom_point(data = samplocs.df, mapping = aes(x = x, y = y)) +
geom_tile(aes(x=Var1,y=Var2,fill=value)) +
scale_x_continuous("X",expand = c(0,0)) +
scale_y_continuous("Y",expand = c(0,0)) +
coord_fixed()
ggelmat2
plot_gg(ggelmat2, multicore = TRUE, raytrace = TRUE, width = 7, height = 4,
scale = 300, windowsize = c(1400, 866), zoom = 0.6, phi = 30, theta = 30, show.legend = FALSE)
render_snapshot(clear = TRUE)
It doesn't work! Another question, which I believe is part of the solution: is it possible to show the original image's coordinate system on the X and Y axes?

Your elmat %>% reshape2::melt() %>% ggplot() pattern causes the Var1 and Var2 columns of the plot input to be row and column numbers instead of coordinates. Also, your samplocs were derived from sampled cell values instead of cell indices of the localtif object.
I addressed these two points in the code below:
library(rayshader)
#> Warning: package 'rayshader' was built under R version 4.0.2
library(ggplot2)
#> Warning: package 'ggplot2' was built under R version 4.0.2
library(raster)
#> Loading required package: sp
#Here, I load a map with the raster package.
loadzip = tempfile()
download.file("https://tylermw.com/data/dem_01.tif.zip", loadzip)
localtif = raster::raster(unzip(loadzip, "dem_01.tif"))
#> Warning in showSRID(uprojargs, format = "PROJ", multiline = "NO"): Discarded
#> datum Unknown based on GRS80 ellipsoid in CRS definition
unlink(loadzip)
#And convert it to a matrix:
elmat = raster_to_matrix(localtif)
samp <- sample(seq_along(localtif), 100, replace = FALSE)
# and their location coordinates
samplocs <- xyFromCell(localtif, samp)
samplocs.df<-as.data.frame(samplocs)
# Melt like this
df <- xyFromCell(localtif, seq_along(localtif))
df <- as.data.frame(df)
df$value <- as.vector(elmat)
# Plot elevation with some coordinates
ggelmat2 = df %>%
ggplot(aes(x = x, y = y)) +
geom_tile(aes(fill=value)) +
geom_point(data = samplocs.df) +
scale_x_continuous("X",expand = c(0,0)) +
scale_y_continuous("Y",expand = c(0,0)) +
coord_fixed()
ggelmat2
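From here, the plot_gg() step from the question should work unchanged, since it only needs the finished ggplot object (a sketch reusing the question's own parameters):
plot_gg(ggelmat2, multicore = TRUE, raytrace = TRUE, width = 7, height = 4,
scale = 300, windowsize = c(1400, 866), zoom = 0.6, phi = 30, theta = 30, show.legend = FALSE)
render_snapshot(clear = TRUE)
Because df carries the real x/y values from xyFromCell(), the axes now show the raster's original coordinates, which also answers your second question.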

Related

Dealing with raster data in ggplot

I have vector spatial data for the boundary of a county and topographic data for the same county in raster format.
I need to create a map with ggplot that only shows the raster data lying within the county boundaries (which are in a vector spatial file).
In other words, I need to remove raster data that is outside the county outline. Is it possible to do this with ggplot?
Reproducible example:
# load packages
library(elevatr)
library(terra)
library(geobr)
library(dplyr)   # for %>% and rename()
library(ggplot2)
# get the municipality shapefile (vectorized spatial data)
municipality_shape <- read_municipality(code_muni = 3305802)
plot(municipality_shape$geom)
# get the raster topographical data
prj_dd <- "EPSG:4674"
t <- elevatr::get_elev_raster(locations = municipality_shape,
z = 10,
prj = prj_dd)
obj_raster <- rast(t)
plot(obj_raster)
# create the ggplot map
df_tere_topo <- obj_raster %>%
as.data.frame(xy = TRUE) %>%
rename(altitude = file40ac737835de)
ggplot()+
geom_raster(data = df_tere_topo, aes(x = x, y = y, fill = `altitude`))+
geom_sf(municipality_shape, mapping = aes(), color = 'red', fill = NA)
Edited
See the comments, use terra::crop() and terra::mask() instead:
# load packages
library(elevatr)
library(terra)
library(geobr)
library(dplyr)
library(ggplot2)
# Use tidyterra
library(tidyterra)
# get the municipality shapefile (vectorized spatial data)
municipality_shape <- read_municipality(code_muni = 3305802)
# get the raster topographical data
prj_dd <- "EPSG:4674"
t <- elevatr::get_elev_raster(
locations = municipality_shape,
z = 10,
prj = prj_dd
)
obj_raster <- rast(t)
# Crop + Mask
obj_raster_mask <- crop(obj_raster, vect(municipality_shape)) %>%
mask(vect(municipality_shape))
# create the ggplot map
# using tidyterra
ggplot() +
geom_spatraster(data = obj_raster_mask) +
geom_sf(municipality_shape, mapping = aes(), color = "white", fill = NA) +
# Not plotting NAs of the raster
scale_fill_continuous(na.value = NA) +
labs(fill="altitude")
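If you prefer the as.data.frame() route from your question over tidyterra, the masked SpatRaster can be plotted the same way (a sketch; the elevation column name comes from a temporary file, so it is renamed by position here):
df_mask <- as.data.frame(obj_raster_mask, xy = TRUE, na.rm = TRUE)
names(df_mask)[3] <- "altitude" # third column holds the elevation values
ggplot() +
geom_raster(data = df_mask, aes(x = x, y = y, fill = altitude)) +
geom_sf(municipality_shape, mapping = aes(), color = "white", fill = NA)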
Original answer
I think the most efficient way is to crop your SpatRaster to the extent of your vector data. With this approach the plotting is more efficient since you are not using data that you don't want to plot.
Another option is to set limits in coord_sf().
In this reprex I am also using the tidyterra package, which has some functions for working with ggplot2 + terra (I am the developer of tidyterra):
# load packages
library(elevatr)
library(terra)
library(geobr)
library(dplyr)
library(ggplot2)
# Use tidyterra
library(tidyterra)
# get the municipality shapefile (vectorized spatial data)
municipality_shape <- read_municipality(code_muni = 3305802)
# get the raster topographical data
prj_dd <- "EPSG:4674"
t <- elevatr::get_elev_raster(
locations = municipality_shape,
z = 10,
prj = prj_dd
)
obj_raster <- rast(t)
# Option1: Crop first
obj_raster_crop <- crop(obj_raster, vect(municipality_shape))
# create the ggplot map
# using tidyterra
ggplot() +
geom_spatraster(data = obj_raster_crop) +
geom_sf(municipality_shape, mapping = aes(), color = "white", fill = NA) +
coord_sf(expand = FALSE) +
labs(fill="altitude")
# Option 2: Use limits and no crop
lims <- sf::st_bbox(municipality_shape)
ggplot() +
geom_spatraster(
data = obj_raster,
# Avoid resampling
maxcell = ncell(obj_raster)
) +
geom_sf(municipality_shape, mapping = aes(), color = "white", fill = NA) +
coord_sf(
expand = FALSE,
xlim = lims[c(1, 3)],
ylim = lims[c(2, 4)]
) +
labs(fill="altitude")

Make grid map from spatial data

I have spatial coordinates in a data frame where each row (Longitude, Latitude) corresponds to an occurrence of an event I am following. I tried to map these data, but instead of using points I want to create a grid with cells of a resolution of 5 nautical miles (~0.083333 degrees), count the number of occurrences of the event in each cell, and plot it.
This is the code I came up with, written with the help of some resources. But it doesn't look the way I expected. Can you figure out what I'm doing wrong? I attached the raw positions and the resulting map I get.
Here is the link to the data.
library(tidyverse) # for the pipe, dplyr and ggplot2
re_pi = read.csv(file = "~/Desktop/Events.csv")
gridx <- seq(from=-19,to=-10,by=0.083333)
gridy <- seq(from=20,to=29,by=0.083333)
xcell <- unlist(lapply(re_pi$LON,function(x) min(which(gridx>x))))
ycell <- unlist(lapply(re_pi$LAT,function(y) min(which(gridy>y))))
re_pi$cell <- (length(gridx) - 1) * ycell + xcell
rr = re_pi %>%
group_by(cell)%>%
summarise(Lat = mean(LAT),Lon = mean(LON),Freq = length(cell))
my_theme <- theme_bw() + theme(panel.ontop=TRUE, panel.background=element_blank())
my_cols <- scale_color_distiller(palette='Spectral')
my_fill <- scale_fill_distiller(palette='Spectral')
ggplot(rr, aes(y=Lat, x=Lon, fill=Freq)) + geom_tile(width=1.2, height=1.2) +
borders('world', xlim=range(rr$Lon), ylim=range(rr$Lat), colour='black') + my_theme + my_fill +
coord_quickmap(xlim=range(rr$Lon), ylim=range(rr$Lat))
Nice dataset; I assume these are fishing-vessel VMS data. Here is one way to achieve your objective, heavily reliant on the tidyverse and bypassing raster and shape objects.
library(tidyverse)
library(mapdata) # higher resolution maps
# poor man's gridding function
grade <- function (x, dx) {
if (dx > 1)
warning("Not tested for grids larger than one")
brks <- seq(floor(min(x)), ceiling(max(x)), dx)
ints <- findInterval(x, brks, all.inside = TRUE)
x <- (brks[ints] + brks[ints + 1])/2
return(x)
}
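For instance, grade() snaps each coordinate to the midpoint of its bin (a quick check with made-up values):
grade(c(21.3456, 21.3449), dx = 0.01)
#> [1] 21.345 21.345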
d <-
read_csv("https://raw.githubusercontent.com/abenmhamed/data/main/Events.csv") %>%
janitor::clean_names() %>%
# make a grid 0.01 x 0.01 longitude / latitude
mutate(lon = grade(lon, 0.01),
lat = grade(lat, 0.01)) %>%
group_by(lon, lat) %>%
count() %>%
# not much happening south of 21 and north of 26
filter(between(lat, 21, 26.25))
d %>%
ggplot() +
theme_bw() +
geom_tile(aes(lon, lat, fill = n)) +
scale_fill_viridis_c(option = "B", direction = -1) +
# only data within the data-bounds
borders(database = "worldHires",
xlim = range(d$lon), ylim = range(d$lat),
fill = "grey") +
labs(x = NULL, y = NULL, fill = "Effort") +
# limit plot
coord_quickmap(xlim = range(d$lon), ylim = range(d$lat)) +
# legends within plot
theme(legend.position = c(0.77, 0.26))
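If you want the 5-nautical-mile cells from the question rather than the 0.01-degree grid used above, just swap the bin width in the mutate() step (a sketch):
mutate(lon = grade(lon, 0.083333),
lat = grade(lat, 0.083333)) %>%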
Here is my attempt using the sf package. First I imported your data and converted it to an sf object. Then I created another sf object containing the grid, using the raster package together with sf. Once I had the two sf objects, I counted how many data points fall in each grid cell and added the result as a new column in foo. Finally, I drew the graphic.
library(tidyverse)
library(sf)
library(raster)
library(viridis)
# Import the data and convert it to an sf object
mydata <- read_csv("https://raw.githubusercontent.com/abenmhamed/data/main/Events.csv") %>%
st_as_sf(coords = c("LON", "LAT"),
crs = 4326, agr = "constant")
# Create an sf object for the grid
gridx <- seq(from = -19,to = -10, by = 0.083333)
gridy <- seq(from = 20,to = 29, by = 0.083333)
foo <- raster(xmn = -19, xmx = -10,
ymn = 20, ymx = 29,
nrows = length(gridx),
ncols = length(gridy)) %>%
rasterToPolygons() %>%
st_as_sf(crs = 4326) %>%
mutate(group = 1:(length(gridx)*length(gridy))) %>%
st_cast("MULTIPOLYGON")
# Now count how many data points exist in each grid
mutate(foo,
count = lengths(st_intersects(x = foo, y = mydata))) -> foo
# Draw a graphic
ggplot() +
geom_sf(data = foo, aes(fill = count)) +
scale_fill_viridis(option = "D") -> g
g

Create shaded polygons around points with ggplot2

Yesterday I saw this beautiful map of McDonald's restaurants in the USA. I wanted to replicate it for France (I found some data that can be downloaded here).
I have no problem plotting the dots:
library(readxl)
library(ggplot2)
library(raster)
#open data
mac_do_FR <- read_excel("./mcdo_france.xlsx")
mac_do_FR_df <- as.data.frame(mac_do_FR)
#get a map of France
mapaFR <- getData("GADM", country="France", level=0)
#plot dots on the map
ggplot() +
geom_polygon(data = mapaFR, aes(x = long, y = lat, group = group),
fill = "transparent", size = 0.1, color="black") +
geom_point(data = mac_do_FR_df, aes(x = lon, y = lat),
colour = "orange", size = 1)
I tried several methods (Thiessen polygons, heat maps, buffers), but the results I get are very poor. I can't figure out how the shaded polygons were plotted on the American map. Any pointers?
Here's my result, but it did take some manual data wrangling.
Step 1: Get geospatial data.
library(sp)
# generate a map of France, along with a fortified dataframe version for ease of
# referencing lat / long ranges
mapaFR <- raster::getData("GADM", country="France", level=0)
map.FR <- fortify(mapaFR)
# generate a spatial point version of the same map, defining your own grid size
# (a smaller size yields a higher resolution heatmap in the final product, but will
# take longer to calculate)
grid.size = 0.01
points.FR <- expand.grid(
x = seq(min(map.FR$long), max(map.FR$long), by = grid.size),
y = seq(min(map.FR$lat), max(map.FR$lat), by = grid.size)
)
points.FR <- SpatialPoints(coords = points.FR, proj4string = mapaFR@proj4string)
Step 2: Generate a voronoi diagram based on store locations, & obtain the corresponding polygons as a SpatialPolygonsDataFrame object.
library(deldir)
library(dplyr)
voronoi.tiles <- deldir(mac_do_FR_df$lon, mac_do_FR_df$lat,
rw = c(min(map.FR$long), max(map.FR$long),
min(map.FR$lat), max(map.FR$lat)))
voronoi.tiles <- tile.list(voronoi.tiles)
voronoi.center <- lapply(voronoi.tiles,
function(l) data.frame(x.center = l$pt[1],
y.center = l$pt[2],
ptNum = l$ptNum)) %>%
data.table::rbindlist()
voronoi.polygons <- lapply(voronoi.tiles,
function(l) Polygon(coords = matrix(c(l$x, l$y),
ncol = 2),
hole = FALSE) %>%
list() %>%
Polygons(ID = l$ptNum)) %>%
SpatialPolygons(proj4string = mapaFR@proj4string) %>%
SpatialPolygonsDataFrame(data = voronoi.center,
match.ID = "ptNum")
rm(voronoi.tiles, voronoi.center)
Step 3. Check which voronoi polygon each point on the map overlaps with, & calculate its distance to the corresponding nearest store.
which.voronoi <- over(points.FR, voronoi.polygons)
points.FR <- cbind(as.data.frame(points.FR), which.voronoi)
rm(which.voronoi)
points.FR <- points.FR %>%
rowwise() %>%
mutate(dist = geosphere::distm(x = c(x, y), y = c(x.center, y.center))) %>%
ungroup() %>%
mutate(dist = ifelse(is.na(dist), max(dist, na.rm = TRUE), dist)) %>%
mutate(dist = dist / 1000) # convert from m to km for easier reading
Step 4. Plot, adjusting the fill gradient parameters as needed. I felt the result of a square root transformation looks quite good for emphasizing distances close to a store, while a log transformation is rather too exaggerated, but your mileage may vary.
ggplot() +
geom_raster(data = points.FR %>%
mutate(dist = pmin(dist, 100)),
aes(x = x, y = y, fill = dist)) +
# optional. shows outline of France for reference
geom_polygon(data = map.FR,
aes(x = long, y = lat, group = group),
fill = NA, colour = "white") +
# define colour range, mid point, & transformation (if desired) for fill
scale_fill_gradient2(low = "yellow", mid = "red", high = "black",
midpoint = 4, trans = "sqrt") +
labs(x = "longitude",
y = "latitude",
fill = "Distance in km") +
coord_quickmap()

How can I add an annotation in a ggplotly animation?

I am creating an animated plotly graph for my assignment in R, where I am comparing several models with various numbers of observations. I would like to add an annotation showing the RMSE of the current model, i.e. text that changes together with the slider. Is there an easy way to do that?
Here is my dataset, stored on GitHub; a variable with the RMSE is already included: data
The base ggplot graphic is as follows:
library(tidyverse)
library(plotly)
p <- ggplot(values_predictions, aes(x = x)) +
geom_line(aes(y = preds_BLR, frame = n, colour = "BLR")) +
geom_line(aes(y = preds_RLS, frame = n, colour = "RLS")) +
geom_point(aes(x = x, y = target, frame = n, colour = "target"), alpha = 0.3) +
geom_line(aes(x = x, y = sin(2 * pi * x), colour = "sin(2*pi*x)"), alpha = 0.3) +
ggtitle("Comparison of performance") +
labs(y = "predictions and targets", colour = "colours")
This is converted to plotly, and I have added an animation to the Plotly graph:
plot <- ggplotly(p) %>%
animation_opts(easing = "linear",redraw = FALSE)
plot
Thanks!
You can add annotations to a ggplot graph using the annotate function: http://ggplot2.tidyverse.org/reference/annotate.html
df <- data.frame(x = rnorm(100, mean = 10), y = rnorm(100, mean = 10))
# Build model
fit <- lm(x ~ y, data = df)
# function finds RMSE
RMSE <- function(error) { sqrt(mean(error^2)) }
library(ggplot2)
plot <- ggplot(df, aes(x, y)) +
geom_point() +
annotate("text", x = Inf, y = Inf, hjust = 1.1, vjust = 2,
label = paste("RMSE", RMSE(fit$residuals)))
plot
There seems to be a bit of a problem converting these annotations between ggplot and plotly; however, the following workaround can be used:
ggplotly(plot) %>%
layout(annotations = list(x = 12, y = 13, text = paste("RMSE",
RMSE(fit$residuals)), showarrow = F))
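For a label that changes together with the slider, one option may be a frame-aware text trace added on the plotly side. This is an untested sketch: it assumes values_predictions has the frame column n plus an RMSE column (called rmse here; the real name depends on the linked dataset), and the x/y positions are placeholders:
rmse_labels <- dplyr::distinct(values_predictions, n, rmse)
ggplotly(p) %>%
add_text(data = rmse_labels, x = 0.5, y = 1.5, frame = ~n,
text = ~paste("RMSE:", round(rmse, 3)), showlegend = FALSE) %>%
animation_opts(easing = "linear", redraw = FALSE)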
Here's an example of adding data-dependent text to a ggplotly graph, using the built-in iris dataset with the correlation as the annotation text.
library(plotly)
library(ggplot2)
library(dplyr)
mydata = iris %>% rename(variable1=Sepal.Length, variable2= Sepal.Width)
shift_right = 0.1 # number from 0-1 where higher = more right
shift_down = 0.02 # number from 0-1 where higher = more down
p = ggplot(mydata, aes(variable1,variable2))+
annotate(geom = "text",
label = paste0("Cor = ",as.character(round(cor.test(mydata$variable1,mydata$variable2)$estimate,2))),
x = min(mydata$variable1)+abs(shift_right*(min(mydata$variable1)-max(mydata$variable1))),
y = max(mydata$variable2)-abs(shift_down*(min(mydata$variable2)-max(mydata$variable2))), size=4)+
geom_point()
ggplotly(p) %>% style(hoverinfo = "none", traces = 1) # remove hover on text

Generating spatial heat map via ggmap in R based on a value

I'd like to generate a choropleth map using the following data points:
Longitude
Latitude
Price
Here is the dataset - https://www.dropbox.com/s/0s05cl34bko7ggm/sample_data.csv?dl=0.
I would like the map to show the areas where the price is higher and the where price is lower. It should most probably look like this (sample image):
Here is my code:
library(ggmap)
map <- get_map(location = "austin", zoom = 9)
data <- read.csv(file.choose(), stringsAsFactors = FALSE)
data$average_rate_per_night <- as.numeric(gsub("[\\$,]", "",
data$average_rate_per_night))
ggmap(map, extent = "device") +
stat_contour( data = data, geom="polygon",
aes( x = longitude, y = latitude, z = average_rate_per_night,
fill = ..level.. ) ) +
scale_fill_continuous( name = "Price", low = "yellow", high = "red" )
I'm getting the following error message:
2: Computation failed in `stat_contour()`:
Contour requires single `z` at each combination of `x` and `y`.
I'd really appreciate any help on how this can be fixed or any other method to generate this type of heatmap. Please note that I'm interested in the weight of the price, not density of the records.
If you insist on using the contour approach then you need to provide a value for every possible x,y coordinate combination in your data. To achieve this I would highly recommend gridding the space and generating some summary statistics per bin.
I attach a working example below based on the data you provided:
library(ggmap)
library(data.table)
map <- get_map(location = "austin", zoom = 12)
data <- setDT(read.csv(file.choose(), stringsAsFactors = FALSE))
# convert the rate from string into numbers
data[, average_rate_per_night := as.numeric(gsub(",", "",
substr(average_rate_per_night, 2, nchar(average_rate_per_night))))]
# generate bins for the x, y coordinates
xbreaks <- seq(floor(min(data$latitude)), ceiling(max(data$latitude)), by = 0.01)
ybreaks <- seq(floor(min(data$longitude)), ceiling(max(data$longitude)), by = 0.01)
# allocate the data points into the bins
data$latbin <- xbreaks[cut(data$latitude, breaks = xbreaks, labels=F)]
data$longbin <- ybreaks[cut(data$longitude, breaks = ybreaks, labels=F)]
# Summarise the data for each bin
datamat <- data[, list(average_rate_per_night = mean(average_rate_per_night)),
by = c("latbin", "longbin")]
# Merge the summarised data with all possible x, y coordinate combinations to get
# a value for every bin
datamat <- merge(setDT(expand.grid(latbin = xbreaks, longbin = ybreaks)), datamat,
by = c("latbin", "longbin"), all.x = TRUE, all.y = FALSE)
# Fill up the empty bins with 0 to smooth the contour plot
datamat[is.na(average_rate_per_night), ]$average_rate_per_night <- 0
# Plot the contours
ggmap(map, extent = "device") +
stat_contour(data = datamat, aes(x = longbin, y = latbin, z = average_rate_per_night,
fill = ..level.., alpha = ..level..), geom = 'polygon', binwidth = 100) +
scale_fill_gradient(name = "Price", low = "green", high = "red") +
guides(alpha = FALSE)
You can then play around with the bin size and the contour binwidth to get the desired result but you could additionally apply a smoothing function on the grid to get an even smoother contour plot.
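For instance, a loess surface fitted over the binned grid is one simple smoothing option (a sketch building on the datamat object above; span is a tuning choice that controls how aggressive the smoothing is):
fit <- loess(average_rate_per_night ~ longbin * latbin, data = datamat, span = 0.05)
datamat$smoothed <- pmax(predict(fit, datamat), 0) # clip negative fitted values
ggmap(map, extent = "device") +
stat_contour(data = datamat, aes(x = longbin, y = latbin, z = smoothed,
fill = ..level.., alpha = ..level..), geom = 'polygon', binwidth = 100) +
scale_fill_gradient(name = "Price", low = "green", high = "red") +
guides(alpha = FALSE)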
You could use the stat_summary_2d() or stat_summary_hex() function to achieve a similar result. These functions divide the data into bins (defined by x and y), and then the z values for each bin are summarised based on a given function. In the example below I have selected mean as an aggregation function and the map basically shows the average price in each bin.
Note: I needed to treat your average_rate_per_night variable appropriately in order to convert it into numbers (removed the $ sign and the comma).
library(ggmap)
library(data.table)
map <- get_map(location = "austin", zoom = 12)
data <- setDT(read.csv(file.choose(), stringsAsFactors = FALSE))
data[, average_rate_per_night := as.numeric(gsub(",", "",
substr(average_rate_per_night, 2, nchar(average_rate_per_night))))]
ggmap(map, extent = "device") +
stat_summary_2d(data = data, aes(x = longitude, y = latitude,
z = average_rate_per_night), fun = mean, alpha = 0.6, bins = 30) +
scale_fill_gradient(name = "Price", low = "green", high = "red")
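The hexagonal variant is essentially a drop-in replacement (a sketch; it requires the hexbin package to be installed):
ggmap(map, extent = "device") +
stat_summary_hex(data = data, aes(x = longitude, y = latitude,
z = average_rate_per_night), fun = mean, alpha = 0.6, bins = 30) +
scale_fill_gradient(name = "Price", low = "green", high = "red")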
