I am having issues plotting SST data after rasterizing the original data file. I eventually want to extract SST at a series of points for each day over a period of time, and I noticed a shift in the raster relative to the coastline after receiving a bunch of NAs from the extraction. I suspect I messed up my projection/transformation, but every step seems to run without error.
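For reference, the extraction step I am ultimately aiming for looks roughly like the sketch below (the points are placeholders for my real daily positions; SST_data_tas_stack is the raster stack built further down):
# placeholder points, just to illustrate the extraction I want to do
pts <- data.frame(lon = c(147.3, 148.1), lat = c(-43.1, -42.6))
# one SST value per point per layer (day); the NAs mentioned above came from a call like this
raster::extract(SST_data_tas_stack, as.matrix(pts))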
Query to get one month of SST data as a tibble, adapted from the heatwaveR documentation.
The CRS for this data is EPSG:4326 according to ERDDAP; the resolution I need later also comes from this step.
library(rerddap)
library(ggplot2)
library(dplyr)
library(raster)
library(rasterVis)
library(rgeos)    # for gIntersection() used in the clip function below
library(viridis)
library(purrr)
library(sf)

SST_sub_dl <- function(time_df){
  SST_data_tas <- griddap(x = "NOAA_DHW_Lon0360",
                          url = "https://coastwatch.pfeg.noaa.gov/erddap/",
                          time = c(time_df$start, time_df$end),
                          latitude = c(-44.5, -38.5),
                          longitude = c(142, 150),
                          store = disk(),
                          fields = "CRW_SST")$data %>%
    mutate(time = as.Date(stringr::str_remove(time, "T00:00:00Z"))) %>%
    dplyr::rename(t = time, temp = CRW_SST) %>%
    dplyr::select(lon, lat, t, temp) %>%  # raster also exports select(), so be explicit
    na.omit()
}
#Assign the date range desired
tas_time <- data.frame(date_index = 1:2,
                       start = as.Date(c("2011-01-01")),
                       end = as.Date(c("2011-01-31")))
# The time this takes will vary greatly based on connection speed
# (takes me about 26 seconds)
system.time(
  SST_data_tas <- tas_time %>%
    group_by(date_index) %>%
    group_modify(~SST_sub_dl(.x)) %>%
    ungroup() %>%
    dplyr::select(lon, lat, t, temp))
Now get the Australia shapefile.
aus.gadm <- getData("GADM", country = "AUS", level = 1, path = tempdir())
tas.gadm <- aus.gadm[aus.gadm$NAME_1 == "Tasmania", ]  # subset to Tasmania
#clip function
gClip <- function(shp, bb){
  if (inherits(bb, "matrix")) b_poly <- as(extent(as.vector(t(bb))), "SpatialPolygons")
  else b_poly <- as(extent(bb), "SpatialPolygons")
  gIntersection(shp, b_poly, byid = TRUE)
}
#clip to remove outer islands
b <- st_bbox(c(xmin = 142, xmax = 150, ymin = -44.5, ymax = -38.5), crs = 4326)
tasss.gadm <- gClip(tas.gadm, b)
tasss.gadm.fort <- sf::st_as_sf(tasss.gadm) %>%
  sf::st_set_crs("+proj=longlat +datum=WGS84 +no_defs")
Here I plot the SST data straight from the data frame, and the SST data and the Tasmania shapefile line up well.
figure1<-SST_data_tas %>%
filter(t == "2011-01-01") %>%
ggplot(aes(x = lon, y = lat)) +
geom_tile(aes(fill = temp)) +
scale_fill_viridis_c(na.value="transparent") +
coord_quickmap(expand = F) +
labs(x = NULL, y = NULL, fill = "SST (°C)") +
theme(legend.position = "bottom")+
geom_sf(data=tasss.gadm.fort,inherit.aes=F,fill=NA)
Figure 1 output
Of course, since I eventually want to extract SST values underneath points, I need to convert this to a raster stack, one layer for each day.
#create blank raster
r_tas_obj <- raster(xmn = min(SST_data_tas$lon), xmx = max(SST_data_tas$lon),
                    ymn = min(SST_data_tas$lat), ymx = max(SST_data_tas$lat),
                    res = c(0.04999, 0.05), crs = sf::st_crs(4326)[[2]])
#rasterize SST data into a RasterStack, one layer per day
system.time(SST_data_tas_stack <- SST_data_tas %>%
  group_split(t) %>%
  purrr::map(~rasterize(x = .x[, c("lon", "lat")], y = r_tas_obj, field = .x[[4]], fun = mean)) %>%
  stack())
#plot
figure2<-SST_data_tas_stack$layer.1%>%
gplot()+
geom_raster(aes(x = x, y = y,fill=value),stat='identity') +
scale_fill_viridis_c(na.value="transparent") +
coord_quickmap(expand = F) +
labs(x = NULL, y = NULL, fill = "SST (°C)") +
theme(legend.position = "bottom")+
geom_sf(data=tasss.gadm.fort,inherit.aes=F,fill=NA)
#CRS match...
compareCRS(SST_data_tas_stack,tasss.gadm.fort)
Figure 2 output
What you will notice is that my raster result is slightly shifted compared to the original plot of the data with geom_tile. I suspect this comes from the rasterization step, but the CRS of my Tasmania shapefile and of the SST raster match.
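As a cross-check (a sketch I have not tested exhaustively, and it assumes the downloaded SST grid is perfectly regular), rasterFromXYZ() builds each layer directly from the lon/lat values, inferring cell size and origin from the coordinates themselves, so comparing its resolution and origin with the hand-built template above should show whether the template is what introduces the shift:
# build a comparison stack straight from the coordinates (assumes a regular grid)
SST_check_stack <- SST_data_tas %>%
  group_split(t) %>%
  purrr::map(~rasterFromXYZ(.x[, c("lon", "lat", "temp")],
                            crs = "+proj=longlat +datum=WGS84 +no_defs")) %>%
  stack()
res(SST_check_stack)     # compare with the res = c(0.04999, 0.05) of the template
origin(SST_check_stack)  # compare with origin(r_tas_obj); a difference would explain the offset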
What I have:
- points in the Arctic and Antarctic
- raster data from various geophysical entities in the Arctic and Antarctic
What I want:
A map in stereographic (or any other polar) projection, with a background map or coastlines, cropped to the extent of the points. In other words: a map like the one above, but with a base map of my own choice.
What I did so far:
I loaded all the data (including land surface data from naturalearthdata; see the MWE), projected it to stereographic and plotted it. The result including the polygon data then looks like this:
My MWE:
library(raster)
library(sf)
library(ggplot2)
library(rgdal)
# file load ---------------------------------------------------------------
# sea ice raster data
if (!file.exists("seaiceraster.tif")) {
url = "https://seaice.uni-bremen.de/data/smos/tif/20100514_hvnorth_rfi_l1c.tif"
download.file(url, destfile = 'seaiceraster.tif')
}
si.raster = raster::raster('seaiceraster.tif')
# land surface shapefile
if (!file.exists("110m-admin-0-countries")) {
url_land = "https://www.naturalearthdata.com/http//www.naturalearthdata.com/download/10m/physical/ne_10m_land.zip"
download.file(url_land, destfile = "110m-admin-0-countries")
unzip("110m-admin-0-countries")
}
world_shp = rgdal::readOGR("ne_10m_land.shp")
# points
p.data = structure(
list(
Lat = c(
73.0114126168676,70.325555278764,77.467797903163,
58.6423827457304,66.3616310851294,59.2097857474643,
75.3135274436283,60.1983078512275,72.6614399747201,
61.1566678672946,73.0822309615673,55.7759666826898,
75.1651656433833,69.0130753414173,62.3288262448589
),
Lon = c(
-59.9175490701543,-80.1900239630732,-40.4609968914928,
-61.0914448815381,-60.0703668488408,-21.027205418284,
-100.200463810276,-74.861777073788,-55.1093773178206,
-29.4108649230234,-64.5878251008461,-36.5343322019187,
-31.647365623387,-67.466355105829,-64.1162329769077
)
),
row.names = c(
1911L, 592L,2110L,3552L,3426L,1524L,635L,4668L,
3945L,2848L,3609L,36L,4262L,3967L,2725L
),
class = "data.frame"
)
p = sf::st_as_sf(p.data, coords = c("Lon", "Lat"),
crs = "+init=epsg:4326")
# project -----------------------------------------------------------------
polar.crs = CRS("+init=epsg:3995")
si.raster.proj = projectRaster(si.raster, crs = polar.crs)
world_shp.proj = sp::spTransform(world_shp, polar.crs)
p.proj = sf::st_transform(p, polar.crs)
# preparation -------------------------------------------------------------
AG = ggplot2::fortify(world_shp.proj)
# make raster to data.frame
si.raster.df = si.raster.proj %>%
raster::crop(., p.proj) %>%
raster::rasterToPoints(., spatial = TRUE) %>%
as.data.frame(.)
colnames(si.raster.df) = c("val", "x", "y")
# plot --------------------------------------------------------------------
ggplot() +
# geom_polygon(data = AG, aes(long, lat, group = group)) + # un-comment to see
geom_raster(data = si.raster.df, aes(x = x, y = y, fill = val)) +
geom_sf(data = p.proj, color = "green", size = 3)
I've changed the workflow in your example a bit to use the stars package for the sea ice data, but I think it should get you what you're looking for. You'll need to expand the crop area a little, since the points p sit right on the edge of the plotted area; st_buffer can help with that.
I used the CRS from the seaiceraster.tif file for all of the objects.
The .tif file has a CRS that I'm not able to easily transform on my machine. It uses metres as the length unit and appears to be a polar stereographic (variant B) projection. The points and world data transform to it without problems, though, which is why I've used it throughout.
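In case it is useful, this is roughly how the projection can be inspected before deciding what to transform (a small sketch; the read is repeated in the full code below, and the units_gdal field comes from sf's st_crs accessor):
si <- stars::read_stars('seaiceraster.tif')
sf::st_crs(si)             # full WKT: a polar stereographic projection
sf::st_crs(si)$units_gdal  # length unit, which should report metres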
library(raster)
library(sf)
library(ggplot2)
library(rgdal)
library(stars)
si <- stars::read_stars('seaiceraster.tif')
world_sf = rgdal::readOGR("ne_10m_land.shp") %>%
st_as_sf() %>%
st_transform(st_crs(si))
# p <- ... same as example and then:
p <- st_transform(p, st_crs(si))
# get a bounding box for the points to crop si & world.
p_bbox <- st_bbox(p) %>%
st_as_sfc() %>%
st_as_sf() %>%
st_buffer(100000)
# crop si & world_sf to an area around the points (p)
world_cropped <- st_crop(world_sf, p_bbox)
si_cropped <- st_crop(si, p_bbox)
#Plot
ggplot() +
geom_sf(data = world_cropped,
color = 'black',
fill = NA,
size = .2) +
geom_stars(data = si_cropped) +
geom_sf(data = p, color = 'red') +
scale_fill_continuous(na.value = 0)
Ugly hack for the southern .tif that stars reads as factors:
si <- stars::read_stars('20150324_hvsouth_rfi_l1c.tif', NA_value = 0 )
si$"20150324_hvsouth_rfi_l1c.tif" <- as.numeric(si$"20150324_hvsouth_rfi_l1c.tif")
ggplot() + geom_stars(data = si)
I have spatial coordinates in a data frame where each row (Longitude, Latitude) corresponds to an occurrence of an event I am following. I tried to map these data, but instead of plotting points I want to create a grid with cells at a resolution of 5 nautical miles (~0.083333 degrees), count the number of occurrences of the event in each cell, and plot that.
This is the code I came up with, with the help of some resources, but the result doesn't look the way I expected. Can you figure out what I'm doing wrong? I attached the raw positions and the resulting map I get.
Here is the link to the data: https://raw.githubusercontent.com/abenmhamed/data/main/Events.csv
library(dplyr)
library(ggplot2)
re_pi = read.csv(file = "~/Desktop/Events.csv")
gridx <- seq(from = -19, to = -10, by = 0.083333)
gridy <- seq(from = 20, to = 29, by = 0.083333)
xcell <- unlist(lapply(re_pi$LON, function(x) min(which(gridx > x))))
ycell <- unlist(lapply(re_pi$LAT, function(y) min(which(gridy > y))))
re_pi$cell <- (length(gridx) - 1) * ycell + xcell
rr = re_pi %>%
  group_by(cell) %>%
  summarise(Lat = mean(LAT), Lon = mean(LON), Freq = length(cell))
my_theme <- theme_bw() + theme(panel.ontop = TRUE, panel.background = element_blank())
my_cols <- scale_color_distiller(palette = 'Spectral')
my_fill <- scale_fill_distiller(palette = 'Spectral')
ggplot(rr, aes(y = Lat, x = Lon, fill = Freq)) + geom_tile(width = 1.2, height = 1.2) +
  borders('world', xlim = range(rr$Lon), ylim = range(rr$Lat), colour = 'black') + my_theme + my_fill +
  coord_quickmap(xlim = range(rr$Lon), ylim = range(rr$Lat))
Nice dataset; I assume these are fishing-vessel VMS data. Here is one way to achieve your objective, relying heavily on the tidyverse and bypassing raster and shapefiles altogether.
library(tidyverse)
library(mapdata) # higher resolution maps
# poor man's gridding function
grade <- function(x, dx) {
  if (dx > 1)
    warning("Not tested for grids larger than one")
  brks <- seq(floor(min(x)), ceiling(max(x)), dx)
  ints <- findInterval(x, brks, all.inside = TRUE)
  x <- (brks[ints] + brks[ints + 1]) / 2
  return(x)
}
d <-
read_csv("https://raw.githubusercontent.com/abenmhamed/data/main/Events.csv") %>%
janitor::clean_names() %>%
# make a grid 0.01 x 0.01 longitude / latitude
mutate(lon = grade(lon, 0.01),
lat = grade(lat, 0.01)) %>%
group_by(lon, lat) %>%
count() %>%
# not much happening south of 21 and north of 26
filter(between(lat, 21, 26.25))
d %>%
ggplot() +
theme_bw() +
geom_tile(aes(lon, lat, fill = n)) +
scale_fill_viridis_c(option = "B", direction = -1) +
# only data within the data-bounds
borders(database = "worldHires",
xlim = range(d$lon), ylim = range(d$lat),
fill = "grey") +
labs(x = NULL, y = NULL, fill = "Effort") +
# limit plot
coord_quickmap(xlim = range(d$lon), ylim = range(d$lat)) +
# legends within plot
theme(legend.position = c(0.77, 0.26))
Here is my attempt using the sf package. First I imported your data and converted it to an sf object. Then I created another sf object containing the grid, using the raster and sf packages together. Once I had the two sf objects, I counted how many data points fall in each grid cell and added the result as a new column in foo. Finally, I drew the graphic.
library(tidyverse)
library(sf)
library(raster)
library(viridis)
# Import the data and convert it to an sf object
mydata <- read_csv("https://raw.githubusercontent.com/abenmhamed/data/main/Events.csv") %>%
st_as_sf(coords = c("LON", "LAT"),
crs = 4326, agr = "constant")
# Create an sf object for the grid
gridx <- seq(from = -19,to = -10, by = 0.083333)
gridy <- seq(from = 20,to = 29, by = 0.083333)
foo <- raster(xmn = -19, xmx = -10,
ymn = 20, ymx = 29,
nrows = length(gridx),
ncols = length(gridy)) %>%
rasterToPolygons() %>%
st_as_sf(crs = 4326) %>%
mutate(group = 1:(length(gridx)*length(gridy))) %>%
st_cast("MULTIPOLYGON")
# Now count how many data points exist in each grid
mutate(foo,
count = lengths(st_intersects(x = foo, y = mydata))) -> foo
# Draw the graphic
g <- ggplot() +
  geom_sf(data = foo, aes(fill = count)) +
  scale_fill_viridis(option = "D")
g
I am able to produce a cartogram using cartogram::cartogram_cont(), but I then have difficulty customising the styling.
I have used broom::tidy() and dplyr::left_join() to fortify the cartogram, but I think the tidy stage may have interfered with the plotOrder. If possible, I will include the output cartograms.
I'm attempting to replicate this type of output, but for my own locality. Please note that the dataset used for the weighting in cartogram_cont() is not particularly significant; this is just a proof of concept at this stage:
[R Graph Gallery][1]
[1]: https://www.r-graph-gallery.com/331-basic-cartogram/
Shapefile from: [Lle Shapefile Location][2]
[2]: http://lle.gov.wales/catalogue/item/LocalAuthorities/?lang=en
library(dplyr)
library(leaflet)
library(maptools)
library(cartogram)
library(devtools)
install_github("HanOostdijk/odataR" , build_vignettes = T)
library(odataR)
library(tidyr)
library(rgdal)
library(htmltools)
#Read in shapefile and transform shape
#dsn = folder name, layer = filename but drop the .shp
shapefile <- readOGR(dsn = "Wales Shapefile",
layer = "localauthoritiesPolygon") %>%
#Transform coordinate referencing system
spTransform(CRS("+init=epsg:4326"))
#Next step is to join an interesting dataset to the shapefile using dplyr, then pass this to the cartoram package to render.
#Gone for the teacher sickness dataset from Stats Wales. Noticed it's only up to 2017, wonder if they've stopped collecting.
teacher_sickness_data <- odataR_query('http://open.statswales.gov.wales/dataset/schw0001')
#Check values for join.
categories <- unique(teacher_sickness_data$Area_ItemName_ENG)
categories_shp <- shapefile@data$name_en
categories
categories_shp
#Teacher data has "All Welsh local authorities". Not contained in shapefile so remove.
UA_sickness_data <- teacher_sickness_data[-c(2, 4:6, 8, 9, 11:13, 15:17)] %>%
filter(Area_ItemName_ENG != "All Welsh local authorities")
#Perform join to shapefile
shapefile_1 <- shapefile %>%
merge(UA_sickness_data, by.x = "name_en", by.y = "Area_ItemName_ENG",
duplicateGeoms = TRUE)
#Shiny App will allow choice of inputs to achieve one row per polygon. However, for testing
#functionality with cartograph functions, perform test filtering.
data_filtered <- UA_sickness_data %>%
filter(Year_ItemName_ENG == 2017) %>%
filter(Type_ItemName_ENG == "Full-time") %>%
filter(Variable_ItemName_ENG == "Total days of sick leave")
test_merge <- shapefile %>%
merge(data_filtered, by.x = "name_en", by.y = "Area_ItemName_ENG")
nc_pal <- colorNumeric(palette = "Reds",
                       domain = log(test_merge@data$Data))
m <-test_merge %>%
leaflet() %>%
addProviderTiles(providers$CartoDB.Positron) %>%
addPolygons(weight = 1,
color = ~nc_pal(log(Data)),
label = ~name_en,
highlight = highlightOptions(weight = 3,
color = "crimson",
bringToFront = TRUE),
popup = ~ paste0(Variable_ItemName_ENG, "<br/>",
"<b/>",
Data))
m
wales_cart <- cartogram_cont(test_merge, "Data", itermax=5)
plot(wales_cart)
[![Wales_Cartogram][3]][3]
[3]: https://i.stack.imgur.com/2tsMC.png
library(tidyverse)
library(ggmap)
library(broom)
library(rgeos) #used for gBuffer
#Buffer allows to tidy cartogram based on factor of choice.
wales_cart_buffered <- gBuffer(wales_cart, byid=TRUE, width=0)
#tidy cartogram in order to pass to ggplot
spdf_fortified_wales <- tidy(wales_cart_buffered, region = "name_en")
#Now perform a join based on english UA names
spdf_fortified_wales_joined <- spdf_fortified_wales %>%
left_join(. , wales_cart@data, by = c("id" = "name_en"))
ggplot() +
geom_polygon(data = spdf_fortified_wales_joined, aes(fill = Data, x = long, y = lat, group = "name_en") , size=0, alpha=0.9) +
coord_map() +
theme_void()
[![incorrect_ggplot][4]][4]
[4]: https://i.stack.imgur.com/as0Z4.png
Success criteria: the polygons are rendered with the correct distortion, and the colour scale reflects the weighting variable.
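(Side note: a minimal sketch of an alternative I have not pursued fully — converting the cartogram output straight to sf and letting geom_sf() manage polygon order and grouping, which sidesteps broom::tidy() and the join entirely:)
library(sf)
library(ggplot2)
# wales_cart is the cartogram_cont() output from above
wales_cart_sf <- st_as_sf(wales_cart)
ggplot(wales_cart_sf) +
  geom_sf(aes(fill = Data), size = 0, alpha = 0.9) +
  theme_void()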
Yesterday I saw this beautiful map of McDonald's restaurants in the USA. I wanted to replicate it for France (I found some data that can be downloaded here).
I have no problem plotting the dots:
library(readxl)
library(ggplot2)
library(raster)
#open data
mac_do_FR <- read_excel("./mcdo_france.xlsx")
mac_do_FR_df <- as.data.frame(mac_do_FR)
#get a map of France
mapaFR <- getData("GADM", country="France", level=0)
#plot dots on the map
ggplot() +
geom_polygon(data = mapaFR, aes(x = long, y = lat, group = group),
fill = "transparent", size = 0.1, color="black") +
geom_point(data = mac_do_FR_df, aes(x = lon, y = lat),
colour = "orange", size = 1)
I tried several methods (Thiessen polygons, heat maps, buffers), but the results I get are very poor. I can't figure out how the shaded polygons were plotted on the American map. Any pointers?
Here's my result, but it did take some manual data wrangling.
Step 1: Get geospatial data.
library(sp)
# generate a map of France, along with a fortified dataframe version for ease of
# referencing lat / long ranges
mapaFR <- raster::getData("GADM", country="France", level=0)
map.FR <- fortify(mapaFR)
# generate a spatial point version of the same map, defining your own grid size
# (a smaller size yields a higher resolution heatmap in the final product, but will
# take longer to calculate)
grid.size = 0.01
points.FR <- expand.grid(
x = seq(min(map.FR$long), max(map.FR$long), by = grid.size),
y = seq(min(map.FR$lat), max(map.FR$lat), by = grid.size)
)
points.FR <- SpatialPoints(coords = points.FR, proj4string = mapaFR@proj4string)
Step 2: Generate a voronoi diagram based on store locations, & obtain the corresponding polygons as a SpatialPolygonsDataFrame object.
library(deldir)
library(dplyr)
voronoi.tiles <- deldir(mac_do_FR_df$lon, mac_do_FR_df$lat,
rw = c(min(map.FR$long), max(map.FR$long),
min(map.FR$lat), max(map.FR$lat)))
voronoi.tiles <- tile.list(voronoi.tiles)
voronoi.center <- lapply(voronoi.tiles,
function(l) data.frame(x.center = l$pt[1],
y.center = l$pt[2],
ptNum = l$ptNum)) %>%
data.table::rbindlist()
voronoi.polygons <- lapply(voronoi.tiles,
function(l) Polygon(coords = matrix(c(l$x, l$y),
ncol = 2),
hole = FALSE) %>%
list() %>%
Polygons(ID = l$ptNum)) %>%
SpatialPolygons(proj4string = mapaFR@proj4string) %>%
SpatialPolygonsDataFrame(data = voronoi.center,
match.ID = "ptNum")
rm(voronoi.tiles, voronoi.center)
Step 3. Check which voronoi polygon each point on the map overlaps with, & calculate its distance to the corresponding nearest store.
which.voronoi <- over(points.FR, voronoi.polygons)
points.FR <- cbind(as.data.frame(points.FR), which.voronoi)
rm(which.voronoi)
points.FR <- points.FR %>%
rowwise() %>%
mutate(dist = geosphere::distm(x = c(x, y), y = c(x.center, y.center))) %>%
ungroup() %>%
mutate(dist = ifelse(is.na(dist), max(dist, na.rm = TRUE), dist)) %>%
mutate(dist = dist / 1000) # convert from m to km for easier reading
Step 4. Plot, adjusting the fill gradient parameters as needed. I felt the result of a square root transformation looks quite good for emphasizing distances close to a store, while a log transformation is rather too exaggerated, but your mileage may vary.
ggplot() +
geom_raster(data = points.FR %>%
mutate(dist = pmin(dist, 100)),
aes(x = x, y = y, fill = dist)) +
# optional. shows outline of France for reference
geom_polygon(data = map.FR,
aes(x = long, y = lat, group = group),
fill = NA, colour = "white") +
# define colour range, mid point, & transformation (if desired) for fill
scale_fill_gradient2(low = "yellow", mid = "red", high = "black",
midpoint = 4, trans = "sqrt") +
labs(x = "longitude",
y = "latitude",
fill = "Distance in km") +
coord_quickmap()
I've managed to produce a map; however, I need to add a single label for a state (level 2) that includes subdivisions (level 3), instead of labelling each subdivision (for this state only). In the data "newpak", rows 641-664 correspond to this state. Is there a way to place only one name above this state?
library(dplyr)
library(raster)
library(sf)
library(tidyverse)
library(ggrepel)
devtools::install_github("tidyverse/ggplot2", force = TRUE)
library(ggplot2)
pak <- getData("GADM",country="PAK",level=3)
pak <- st_as_sf(pak) %>%
mutate(
lon = map_dbl(geometry, ~st_centroid(.x)[[1]]),
lat = map_dbl(geometry, ~st_centroid(.x)[[2]]))
ggplot(pak) + geom_sf() + geom_text(aes(label = NAME_3, x = lon, y = lat), size = 2)
ind <- getData("GADM",country="IND",level=3)
ind <- st_as_sf(ind) %>%
mutate(
lon = map_dbl(geometry, ~st_centroid(.x)[[1]]),
lat = map_dbl(geometry, ~st_centroid(.x)[[2]]))
jnk <- subset(ind, OBJECTID >= 641 & OBJECTID <= 664 )
newpak <- rbind(pak, jnk)
regionalValues <- runif(165) # Simulate a value for each region between 0 and 1
ggplot(newpak) + geom_sf(aes(fill = regionalValues)) + geom_text(aes(label = NAME_3, x = lon, y = lat), size = 2)
Here's a complete solution using the sf package.
library(raster)
library(sf)
library(tidyverse)
# download PAK data and convert to sf
pak <- getData("GADM",country="PAK",level=3) %>%
st_as_sf()
# download IND data, convert to sf, filter out
# desired area, and add NAME_3 label
jnk <- getData("GADM",country="IND",level=3) %>%
st_as_sf() %>%
filter(OBJECTID %>% between(641, 664)) %>%
group_by(NAME_0) %>%
summarize() %>%
mutate(NAME_3 = "Put desired region name here")
regionalValues <- runif(142) # Simulate a value for each region between 0 and 1
# combine the two dataframes, find the center for each
# region, and the plot with ggplot
pak %>%
select(NAME_0, NAME_3, geometry) %>%
rbind(jnk) %>%
mutate(
lon = map_dbl(geometry, ~st_centroid(.x)[[1]]),
lat = map_dbl(geometry, ~st_centroid(.x)[[2]])
) %>%
ggplot() +
geom_sf(aes(fill = regionalValues)) +
geom_text(aes(label = NAME_3, x = lon, y = lat), size = 2) +
scale_fill_distiller(palette = "Spectral")
Some notes:
I used dplyr::filter (which works on sf objects) instead of raster::subset to get the desired subset of the IND data, because I feel it's more idiomatic tidyverse code.
To combine areas with sf you can group the different regions by a common group with group_by and then simply call summarize. This is the method I used in my solution above. There are other functions in the sf package that accomplish similar results and are worth looking at: st_combine and st_union (see the sketch after these notes).
Using st_centroid for the purpose of plotting the region labels is not necessarily the best method for finding a good location for region labels. I used it because it's the most convenient. You might try other methods, including manual placement of labels.
I changed the fill palette to a diverging color palette because I think it more clearly shows the difference between one region and the next. You can see some of the color palettes available with RColorBrewer::display.brewer.all()
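To make the st_combine / st_union note above concrete, here is a minimal sketch (using the same OBJECTID range and placeholder region name as in the solution above) that dissolves the Indian subdivisions with st_union instead of group_by + summarize:
library(raster)
library(sf)
library(tidyverse)
jnk_union <- getData("GADM", country = "IND", level = 3) %>%
  st_as_sf() %>%
  filter(between(OBJECTID, 641, 664)) %>%
  st_union() %>%                  # dissolve internal boundaries into one geometry
  st_sf(NAME_0 = "India",
        NAME_3 = "Put desired region name here",
        geometry = .)
# jnk_union can then be rbind()-ed with pak %>% select(NAME_0, NAME_3, geometry)
# exactly as in the solution above.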