Creating a SpatialPolygonsDataFrame from a list of polygons - R

I am currently trying to create a polygon shapefile from a list of polygons (study areas for biodiversity research).
Currently these polygons are stored in a list in this format:
$SEW22
[,1] [,2]
[1,] 427260.4 5879458
[2,] 427161.4 5879472
[3,] 427175.0 5879571
[4,] 427273.9 5879557
[5,] 427260.4 5879458
$SEW23
[,1] [,2]
[1,] 418011.0 5867216
[2,] 417912.0 5867230
[3,] 417925.5 5867329
[4,] 418024.5 5867315
[5,] 418011.0 5867216
I tried to simply write them as a shapefile with writeOGR, but the following error occurs:
> #write polygons to shp
> filenameshp <- paste('Forestplots')
> layername <- paste('Forestplots')
> writeOGR(obj=forest, dsn = filenameshp,
+ layer=layername, driver="ESRI Shapefile", overwrite_layer = TRUE)
Error in writeOGR(obj = forest, dsn = filenameshp, layer = layername, :
inherits(obj, "Spatial") is not TRUE
I read this tutorial by Barry Rowlingson on creating SpatialPolygons and thought I should probably first create a data frame, so I did this:
forestm<-do.call(rbind,forest)
but this returned nothing useful, as you can imagine, and it also lost the names of the plots.
As I am still new to R, I also tried lots of other approaches whose soundness I could not fully judge, but none returned what I hoped for, so I will spare you these random attempts.
I am looking forward to your suggestions.
Many thanks
P.S. I also tried the following, as described for SpatialPolygons in the sp package:
> Polygons(forest, ID)
Error in Polygons(forest, ID) : srl not a list of Polygon objects

You can follow the approach described in this answer: https://gis.stackexchange.com/questions/18311/instantiating-spatial-polygon-without-using-a-shapefile-in-r.
Here's how to apply the approach to your case. First, I create a list of matrices as in your sample data:
forest <- list(
  "SEW22" = matrix(c(427260.4, 5879458, 427161.4, 5879472, 427175.0, 5879571, 427273.9, 5879557, 427260.4, 5879458),
                   nc = 2, byrow = TRUE),
  "SEW23" = matrix(c(418011.0, 5867216, 417912.0, 5867230, 417925.5, 5867329, 418024.5, 5867315, 418011.0, 5867216),
                   nc = 2, byrow = TRUE)
)
Now
library(sp)
p <- lapply(forest, Polygon)
ps <- lapply(seq_along(p), function(i) Polygons(list(p[[i]]), ID = names(p)[i]))
sps <- SpatialPolygons(ps)
sps_df <- SpatialPolygonsDataFrame(sps, data.frame(x = rep(NA, length(p)), row.names = names(p)))
In the first step, we iterate over the list of matrices, applying the Polygon function to each matrix to obtain a list of Polygon objects. In the second step, we wrap each Polygon in a Polygons object, setting its ID to the corresponding name in the original list (e.g. "SEW22", "SEW23"). The third step creates a SpatialPolygons object from that list. Finally, we create a SpatialPolygonsDataFrame; here I attach a dummy data frame populated with NAs (note that the row names must match the polygon IDs).
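A quick sanity check before writing (optional; both calls are sp's base methods):
summary(sps_df)
plot(sps_df, axes = TRUE)  # the two study plots should show up as small quadrilaterals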
Now write the data:
rgdal::writeOGR(obj = sps_df,
                dsn = "Forestplots",
                layer = "Forestplots",
                driver = "ESRI Shapefile",
                overwrite_layer = TRUE)
This creates a new folder in your working directory:
list.files()
# [1] "Forestplots"
list.files("Forestplots")
# [1] "Forestplots.dbf" "Forestplots.shp" "Forestplots.shx"
Consult the linked answer for more details.
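As an aside, the same result can be obtained with the sf package. Here is a minimal sketch (untested), assuming the forest list from above; note that no CRS is set, since the question does not state one:
library(sf)
# each matrix is already a closed ring, so it can be passed straight to st_polygon
polys <- st_sfc(lapply(forest, function(m) st_polygon(list(m))))
forest_sf <- st_sf(plot_id = names(forest), geometry = polys)
st_write(forest_sf, "Forestplots.shp", delete_layer = TRUE)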

Related

Error in do.ply(i) : task 1 failed - "could not find function "%>%"" in R parallel programming

Every time I run the script, it gives me an error: Error in { : task 1 failed - "could not find function "%>%""
I have already checked every post on this forum and tried to apply the suggested fixes, but none of them works.
Please advise any solution.
Please note: I have only 2 cores on my PC.
My code is as follows:
library(dplyr) # For basic data manipulation
library(ncdf4) # For creating NetCDF files
library(tidync) # For easily dealing with NetCDF data
library(ggplot2) # For visualising data
library(doParallel) # For parallel processing
MHW_res_grid <- readRDS("C:/Users/SUDHANSHU KUMAR/Desktop/MTech Project/R/MHW_result.Rds")
# Function for creating arrays from data.frames
df_acast <- function(df, lon_lat){
  # Force grid
  res <- df %>%
    right_join(lon_lat, by = c("lon", "lat")) %>%
    arrange(lon, lat)
  # Convert date values to integers if they are present
  if(lubridate::is.Date(res[1,4])) res[,4] <- as.integer(res[,4])
  # Create array
  res_array <- base::array(res[,4], dim = c(length(unique(lon_lat$lon)), length(unique(lon_lat$lat))))
  dimnames(res_array) <- list(lon = unique(lon_lat$lon),
                              lat = unique(lon_lat$lat))
  return(res_array)
}
# Wrapper function for last step before data are entered into NetCDF files
df_proc <- function(df, col_choice){
  # Determine the correct array dimensions
  lon_step <- mean(diff(sort(unique(df$lon))))
  lat_step <- mean(diff(sort(unique(df$lat))))
  lon <- seq(min(df$lon), max(df$lon), by = lon_step)
  lat <- seq(min(df$lat), max(df$lat), by = lat_step)
  # Create full lon/lat grid
  lon_lat <- expand.grid(lon = lon, lat = lat) %>%
    data.frame()
  # Acast only the desired column
  dfa <- plyr::daply(df[c("lon", "lat", "event_no", col_choice)],
                     c("event_no"), df_acast, .parallel = T, lon_lat = lon_lat)
  return(dfa)
}
# We must now run this function on each column of data we want to add to the NetCDF file
doParallel::registerDoParallel(cores = 2)
prep_dur <- df_proc(MHW_res_grid, "duration")
prep_max_int <- df_proc(MHW_res_grid, "intensity_max")
prep_cum_int <- df_proc(MHW_res_grid, "intensity_cumulative")
prep_peak <- df_proc(MHW_res_grid, "date_peak")
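One commonly suggested fix for this error (untested here): the workers registered by doParallel do not load dplyr, so %>% is undefined on them. plyr functions accept a .paropts argument that is passed through to foreach; supplying the required packages there should make the pipe available on each worker, e.g. inside df_proc:
dfa <- plyr::daply(df[c("lon", "lat", "event_no", col_choice)],
                   c("event_no"), df_acast, lon_lat = lon_lat,
                   .parallel = TRUE,
                   .paropts = list(.packages = c("dplyr", "lubridate")))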

Extraction of data from HDF at different pressure levels

I am trying to extract a variable named Data Fields/OzoneTropColumn at a point location (lon=40, lat=34) at different pressure levels (825.40198, 681.29102, 464.16000, 316.22699 hPa) from multiple HDF files.
library(raster)
library(ncdf4)
library(RNetCDF)
# read file
nc <- nc_open("E:/Ozone/test1.nc")
list_col1 <- as.list(list.files("E:/Ozone/", pattern = "*.hdf",
                                full.names = TRUE))
> attributes(nc$var) #using a single hdf file to check its variables
$names
[1] "Data Fields/Latitude" "Data Fields/Longitude"
[3] "Data Fields/O3" "Data Fields/O3DataCount"
[5] "Data Fields/O3Maximum" "Data Fields/O3Minimum"
[7] "Data Fields/O3StdDeviation" "Data Fields/OzoneTropColumn"
[9] "Data Fields/Pressure" "Data Fields/TotColDensDataCount"
[11] "Data Fields/TotColDensMaximum" "Data Fields/TotColDensMinimum"
[13] "Data Fields/TotColDensStdDeviation" "Data Fields/TotalColumnDensity"
[15] "HDFEOS INFORMATION/StructMetadata.0" "HDFEOS INFORMATION/coremetadata"
> pres <- ncvar_get(nc, "Data Fields/Pressure") #looking at pressure level from single file of hdf
> pres
[1] 825.40198 681.29102 464.16000 316.22699 215.44400 146.77901 100.00000 68.12950 46.41580 31.62290
[11] 21.54430 14.67800 10.00000 6.81291 4.64160
ncin <- raster::stack(list_col1,
                      varname = "Data Fields/OzoneTropColumn",
                      ncdf = TRUE)
#cannot extract using the following code
o3 <- ncvar_get(list_col1,attributes(list_col1$var)$names[9])
"Error in ncvar_get(list_col1, attributes(list_col1$var)$names[9]) :
first argument (nc) is not of class ncdf4!"
#tried to extract pressure levels
> prsr <- raster::stack(list_col1,varname = "Data Fields/Pressure",ncdf=TRUE)
"Error in h(simpleError(msg, call)) :
error in evaluating the argument 'x' in selecting a method for function 'stack': varname: Data Fields/Pressure does not exist in the file. Select one from:
Data Fields/O3, Data Fields/O3DataCount, Data Fields/O3Maximum, Data Fields/O3Minimum, Data Fields/O3StdDeviation, Data Fields/OzoneTropColumn, Data Fields/TotColDensDataCount, Data Fields/TotColDensMaximum, Data Fields/TotColDensMinimum, Data Fields/TotColDensStdDeviation, Data Fields/TotalColumnDensity"
#tried using index
# The point location can also be written as below, at 1 deg by 1 deg resolution
lonIdx <- which(lon >32 & lon <36)
latIdx <- which(lat >38 & lat <42)
presIdx <- which(pres >= 400 & pres <= 900)
#also tried
# Option 2 -- subset using array indexing
o3 <- ncvar_get(list_col1,'Data Fields/OzoneTropColumn')
"Error in ncvar_get(list_col1, "Data Fields/OzoneTropColumn") :
first argument (nc) is not of class ncdf4!"
extract2 <- o3[lonIdx, latIdx, presIdx, ]
How do I extract these values vertically at each pressure level? (SM = some value)
I would like the output in following way at location (lon=40, lat=34):
Pressure 1 2 3 4 5 .... 10
825.40198 SM1 SM2 SM3 SM4 SM5... SM10
681.29102 SM11 SM12
464.16000
316.22699 SM.. SM.. SM.. SM.. SM.. SM..
Appreciate any help.
Thank you
This might be an issue with how ncdf4 and raster name each of the layers in the file, and perhaps some confusion from trying to create a multilayer object from multiple netCDF files at once.
I would do the following, using only raster: load a single netCDF file using stack() or brick(). This will load the file as a multilayer object in R. Then use names() to identify the name of the ozone layer according to the raster package.
firstraster <- stack("E:/Ozone/test1.nc")
names(firstraster)
Once you find the name, you can read all files as stacks and extract the information at the points of interest, without even assembling all layers into a single stack.
Ozonelayername <- "put name here"
files <- list.files("E:/Ozone/", pattern = "*.hdf", full.names = TRUE)
stacklist <- lapply(files, stack)
Ozonelayerlist <- lapply(stacklist, "[[", Ozonelayername)
The line above outputs a list of RasterLayer objects (not stacks or bricks, just plain rasters) containing only the layer you want.
Now we just need to run extract() on each of these layers; sapply() will format the result neatly into a matrix for us.
pointsofinterest <- expand.grid(32:36,38:42)
values <- sapply(Ozonelayerlist, extract, pointsofinterest)
I can't test it, since I do not have the data, but I assume this would work.
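As a possible follow-up (untested; note that per-pressure-level values would presumably come from the "Data Fields/O3" variable rather than from the 2-D tropospheric column), the extraction at the single point could look like this:
pt <- cbind(40, 34)  # lon = 40, lat = 34
o3_at_pt <- sapply(Ozonelayerlist, raster::extract, y = pt)
names(o3_at_pt) <- basename(files)  # one value per input file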

How to convert Sentinel-3 .nc-file into .tiff-file?

Regarding the conversion of .nc-files into .tiff-files, I encounter the problem of losing the geoinformation of my pixels. I know that other users have experienced the same problem and tried to solve it via Kotlin but failed; I would prefer a solution using R. See here for the Kotlin approach: https://gis.stackexchange.com/questions/259700/converting-sentinel-3-data-netcdf-to-geotiff
I downloaded freely available Sentinel-3 data from ESA (https://scihub.copernicus.eu/dhus/#/home). Unfortunately, this data comes in the .nc-format, so I want to convert it into the .tiff-format. I have already tried various approaches but failed. What I have tried so far:
data_source <- 'D:/user_1/01_test_data/S3A_SL_1_RBT____20180708T093240_20180708T093540_20180709T141944_0179_033_150_2880_LN2_O_NT_003.SEN3/F1_BT_in.nc'
# define path to .nc-file
data_output <- 'D:/user_1/01_test_data/S3A_SL_1_RBT____20180708T093240_20180708T093540_20180709T141944_0179_033_150_2880_LN2_O_NT_003.SEN3/test.tif'
# define path of output .tiff-file
###################################################
# 1.) use gdal_translate via Windows cmd-line in R
# see here URL:https://stackoverflow.com/questions/52046282/convert-netcdf-nc-to-geotiff
system(command = paste('gdal_translate -of GTiff -sds -a_srs epsg:4326', data_source, data_output))
# hand over character string to Windows cmd-line to use gdal_translate
###################################################
# 2.) use the raster-package
# see here URL:https://www.researchgate.net/post/How_to_convert_a_NetCDF4_file_to_GeoTIFF_using_R2
epsg4326 <- "+proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs"
# proj4-code
# URL:https://spatialreference.org/ref/epsg/wgs-84/proj4/
specific_band <- raster(data_source)
crs(specific_band) <- epsg4326
writeRaster(specific_band, filename = data_output)
# Both approaches work: I can convert the files from .nc-format into .tiff-format, but I **lose the geoinformation for the pixels** and just get pixel coordinates instead of long/lat values.
I really appreciate any solutions that keep the geoinformation for the pixels!
Thanks a lot in advance, ExploreR
As @j08lue points out,
"The product format for Sentinel 3 products is horrible. Yes, the data values are stored in netCDF, but the coordinate axes are in separate files and it is all just a bunch of files and metadata."
I did not find any documentation (I assume it must exist), but it seems you can get the data like this:
library(ncdf4)
# coordinates
nc <- nc_open("geodetic_in.nc")
lon <- ncvar_get(nc, "longitude_in")
lat <- ncvar_get(nc, "latitude_in")
# including elevation for sanity check only
elv <- ncvar_get(nc, "elevation_in")
nc_close(nc)
# the values of interest
nc <- nc_open("F1_BT_in.nc")
F1_BT <- ncvar_get(nc, "F1_BT_in")
nc_close(nc)
# combine
d <- cbind(as.vector(lon), as.vector(lat), as.vector(elv), as.vector(F1_BT))
Plot a sample of the locations; note that the raster is rotated:
plot(d[sample(nrow(d), 25000),1:2], cex=.1)
I would need to investigate a bit more to see how to write a rotated raster.
For now, a not recommended shortcut could be to rasterize to a non-rotated raster
e <- extent(as.vector(apply(d[,1:2],2, range))) + 1/120
r <- raster(ext=e, res=1/30)
#elev <- rasterize(d[,1:2], r, d[,3], mean)
F1_BT <- rasterize(d[,1:2], r, d[,4], mean, filename="")
plot(F1_BT)
So that's what I have done so far. Unfortunately, the raster is not simply rotated by 180 degrees, but distorted in some other way...
# (1.) first part of the code adapted to Robert Hijmans approach (see code of answer provided above)
nc_geodetic <- nc_open(paste0(wd, "/01_test_data/sentinel_3/geodetic_in.nc"))
nc_geodetic_lon <- ncvar_get(nc_geodetic, "longitude_in")
nc_geodetic_lat <- ncvar_get(nc_geodetic, "latitude_in")
nc_geodetic_elv <- ncvar_get(nc_geodetic, "elevation_in")
nc_close(nc_geodetic)
# to get the longitude, latitude and elevation information
F1_BT_in_vars <- nc_open(paste0(wd, "/01_test_data/sentinel_3/F1_BT_in.nc"))
F1_BT_in <- ncvar_get(F1_BT_in_vars, "F1_BT_in")
nc_close(F1_BT_in_vars)
# extract the band information
###############################################################################
# (2.) the following part of the code is adapted from @Matthew Lundberg's rotation code, see https://stackoverflow.com/questions/16496210/rotate-a-matrix-in-r
rotate_fkt <- function(x) t(apply(x, 2, rev))
# create rotation-function
F1_BT_in_rot180 <- rotate_fkt(rotate_fkt(F1_BT_in))
# rotate raster by 180degree
test_F1_BT_in <- raster(F1_BT_in_rot180)
# convert matrix to raster
###############################################################################
# (3.) extract corner coordinates and transform with gdal
writeRaster(test_F1_BT_in, filename = paste0(wd, "/01_test_data/sentinel_3/test_flip.tif"), overwrite = TRUE)
# write the raster layer
data_source_flip <- '"D:/unknown_user/X_processing/01_test_data/sentinel_3/test_flip.tif"'
data_tmp_flip <- '"D:/unknown_user/X_processing/01_test_data/temp/test_flip.tif"'
data_out_flip <- '"D:/unknown_user/X_processing/01_test_data/sentinel_3/test_flip_ref.tif"'
# define input, temporary output and output for gdal-transformation
nrow_nc_mtx <- nrow(nc_geodetic_lon)
ncol_nc_mtx <- ncol(nc_geodetic_lon)
# investigate on matrix size of the image
xy_coord_char1 <- as.character(paste("1", "1", nc_geodetic_lon[1, 1], nc_geodetic_lat[1, 1]))
xy_coord_char2 <- as.character(paste(nrow_nc_mtx, "1", nc_geodetic_lon[nrow_nc_mtx, 1], nc_geodetic_lat[nrow_nc_mtx, 1]))
xy_coord_char3 <- as.character(paste(nrow_nc_mtx, ncol_nc_mtx, nc_geodetic_lon[nrow_nc_mtx, ncol_nc_mtx], nc_geodetic_lat[nrow_nc_mtx, ncol_nc_mtx]))
xy_coord_char4 <- as.character(paste("1", ncol_nc_mtx, nc_geodetic_lon[1, ncol_nc_mtx], nc_geodetic_lat[1, ncol_nc_mtx]))
# extract the corner coordinates from the image
system(command = paste('gdal_translate -of GTiff -gcp ', xy_coord_char1, ' -gcp ', xy_coord_char2, ' -gcp ', xy_coord_char3, ' -gcp ', xy_coord_char4, data_source_flip, data_tmp_flip))
system(command = paste('gdalwarp -r near -order 1 -co COMPRESS=NONE ', data_tmp_flip, data_out_flip))
# run gdal-transformation
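A possible refinement (untested): the remaining distortion may come from using only the four corner GCPs on a curved satellite swath. Below is a sketch that samples a 5 x 5 grid of GCPs from the geodetic arrays and warps with thin plate splines (-tps) instead; it assumes, as in the code above, that the first matrix dimension maps to the pixel/x index:
px <- round(seq(1, nrow_nc_mtx, length.out = 5))
ln <- round(seq(1, ncol_nc_mtx, length.out = 5))
gcp_args <- character(0)
for(i in px){
  for(j in ln){
    gcp_args <- c(gcp_args, paste('-gcp', i, j, nc_geodetic_lon[i, j], nc_geodetic_lat[i, j]))
  }
}
system(command = paste('gdal_translate -of GTiff', paste(gcp_args, collapse = ' '), data_source_flip, data_tmp_flip))
system(command = paste('gdalwarp -r near -tps -t_srs EPSG:4326 -co COMPRESS=NONE', data_tmp_flip, data_out_flip))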

Merging Polygons in Shape Files with Common Tag IDs: unionSpatialPolygons

I am trying to read from a shape file and merge the polygons with a common tag ID.
library(rgdal)
library(maptools)
if (!require(gpclib)) install.packages("gpclib", type="source")
gpclibPermit()
usa <- readOGR(dsn = "./path_to_data/", layer="the_name_of_shape_file")
usaIDs <- usa$segment_ID
isTRUE(gpclibPermitStatus())
usaUnion <- unionSpatialPolygons(usa, usaIDs)
When I try to plot the merged polygons:
for(i in c(1:length(names(usaUnion)))){
  print(i)
  myPol <- usaUnion@polygons[[i]]@Polygons[[1]]@coords
  polygon(myPol, pch = 2, cex = 0.3, col = i)
}
all the merged segments look fine except those around Michigan, where the merge happens in a very weird way: the resulting area for this particular segment is only a small polygon, shown below.
i = 10
usaUnion@polygons[[i]]@Polygons[[1]]@coords
output:
[,1] [,2]
[1,] -88.62533 48.03317
[2,] -88.90155 47.96025
[3,] -89.02862 47.85066
[4,] -89.13988 47.82408
[5,] -89.19292 47.84461
[6,] -89.20179 47.88386
[7,] -89.15610 47.93923
[8,] -88.49753 48.17380
[9,] -88.62533 48.03317
which turned out to be a small northern island:
I suspect the problem is that, for some reason, the unionSpatialPolygons function does not like geographically separated polygons [the left and right parts of Michigan], but I have not found a solution yet.
Here is the link to the input data so you can reproduce the issue.
I think the problem is not with unionSpatialPolygons but with your plot. Specifically, you are plotting only the first 'sub-polygon' for each ID. Run the following to verify what went wrong:
for(i in 1:length(names(usaUnion))){
  print(length(usaUnion@polygons[[i]]@Polygons))
}
For each of these, you took only the first one.
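To see everything, here is a small sketch that draws every sub-polygon of every ID (assuming a base plot is already open, as in your loop):
for(i in seq_along(usaUnion@polygons)){
  for(p in usaUnion@polygons[[i]]@Polygons){
    polygon(p@coords, col = i)
  }
}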
I got a correct polygon join/plot with the following code:
library(rgdal)
library(maptools)
library(plyr)
usa <- readOGR(dsn = "INSERT_YOUR_PATH", layer="light_shape")
# remove NAs
usa <- usa[!is.na(usa$segment_ID), ]
usaIDs <- usa$segment_ID
#get unique colors
set.seed(666)
unique_colors <- sample(grDevices::colors()[grep('gr(a|e)y|white', grDevices::colors(), invert = T)], 15)
colors <- plyr::mapvalues(
  usaIDs,
  from = as.numeric(sort(as.character(unique(usaIDs)))), # workaround to get correct color order
  to = unique_colors
)
plot(usa, col = colors, main = "Original Map")
usaUnion <- unionSpatialPolygons(usa, usaIDs)
plot(usaUnion, col = unique_colors, main = "Joined Polygons")
Here is an example using sf to make this plot, which highlights how the package's ability to work with dplyr, and summarise in particular, makes this operation extremely expressive and succinct: filter out the missing IDs, group_by the ID, summarise (which unions by default), and plot with geom_sf.
library(tidyverse)
library(sf)
# Substitute wherever you are reading the file from
light_shape <- read_sf(here::here("data", "light_shape.shp"))
light_shape %>%
  filter(!is.na(segment_ID)) %>%
  group_by(segment_ID) %>%
  summarise() %>%
  ggplot() +
  geom_sf(aes(fill = factor(segment_ID)))

How to create an interactive plot of GTFS data in R using Leaflet?

I would like to create an interactive map showing the public transport lines of a city. I am trying to do this using Leaflet in R (but I'm open to alternatives, suggestions?)
Data: The data of the transport system is in GTFS format, organized in text files (.txt), which I read into R as a data frame.*
The problem: I cannot find how to indicate the ID of each polyline (variable shape_id) so that the plot actually follows the route of each transit line. Instead, it connects the dots in a random sequence.
Here is what I've tried, so far without success:
# Download GTFS data of the Victoria Regional Transit System
tf <- tempfile()
td <- tempdir()
ftp.path <- "http://www.gtfs-data-exchange.com/agency/bc-transit-victoria-regional-transit-system/latest.zip"
download.file(ftp.path, tf)
# Read text file to a data frame
zipfile <- unzip( tf , exdir = td )
shape <- read.csv(zipfile[9])
# Create base map
basemap <- leaflet() %>% addTiles()
# Add transit layer
basemap %>% addPolylines(lng = shape$shape_pt_lon, lat = shape$shape_pt_lat,
                         fill = FALSE,
                         layerId = shape$shape_id)
I would be glad to have your comments on this.
*I know it is possible to import this data into a GIS software (e.g. QGIS) to create a shapefile and then read the shapefile into R with readOGR. Robin Lovelace has shown how to do this. BUT, I am looking for a pure R solution. ;)
ps. Kyle Walker has written a great intro to interactive maps in R using Leaflet. Unfortunately, he doesn't cover polylines in his tutorial.
Your problem is not one of method but of data: note that you download 8 MB and that the line file you try to load into Leaflet via shiny is 5 MB. As a general principle, you should always try new methods with tiny datasets first, before scaling them up. This is what I do below to diagnose the problem and solve it.
Stage 1: Explore and subset the data
pkgs <- c("leaflet", "shiny" # packages we'll use
, "maps" # to test antiquated 'maps' data type
, "maptools" # to convert 'maps' data type to Spatial* data
)
lapply(pkgs, "library", character.only = TRUE)
class(shape)
## [1] "data.frame"
head(shape)
## shape_id shape_pt_lon shape_pt_lat shape_pt_sequence
## 1 1-39-220 -123.4194 48.49065 0
## 2 1-39-220 -123.4195 48.49083 1
## 3 1-39-220 -123.4195 48.49088 2
## 4 1-39-220 -123.4196 48.49123 3
## 5 1-39-220 -123.4197 48.49160 4
## 6 1-39-220 -123.4196 48.49209 5
object.size(shape) / 1000000 # 5 MB!!!
## 5.538232 bytes
summary(shape$shape_id)
shape$shape_id <- as.character(shape$shape_id)
ids <- unique(shape$shape_id)
shape_orig <- shape
shape <- shape[shape$shape_id == ids[1],] # subset the data
Stage 2: Convert to a Spatial* object
Is this like the 'map' objects from the maps package?
state.map <- map("state", plot = FALSE, fill = TRUE)
str(state.map)
## List of 4
## $ x : num [1:15599] -87.5 -87.5 -87.5 -87.5 -87.6 ...
## $ y : num [1:15599] 30.4 30.4 30.4 30.3 30.3 ...
## $ range: num [1:4] -124.7 -67 25.1 49.4
## $ names: chr [1:63] "alabama" "arizona" "arkansas" "california" ...
## - attr(*, "class")= chr "map"
Yes, it's similar, so we can use map2Spatial* to convert it:
shape_map <- list(x = shape$shape_pt_lon, y = shape$shape_pt_lat)
shape_lines <- map2SpatialLines(shape_map, IDs = ids[1])
plot(shape_lines) # success - this plots a single line!
Stage 3: Join all the lines together
A for loop will do this nicely. Note we only use the first 10 lines. Use 2:length(ids) for all lines:
for(i in 2:10){
  shape <- shape_orig[shape_orig$shape_id == ids[i],]
  shape_map <- list(x = shape$shape_pt_lon, y = shape$shape_pt_lat)
  shape_temp <- map2SpatialLines(shape_map, IDs = ids[i])
  shape_lines <- spRbind(shape_lines, shape_temp)
}
Stage 4: Plot
Using the SpatialLines object makes the code a little shorter - this will plot the first 10 lines in this case:
leaflet() %>%
  addTiles() %>%
  addPolylines(data = shape_lines)
Conclusion
You needed to play around with the data and manipulate it before converting it into a Spatial* data type for plotting, with the correct IDs. maptools::map2Spatial*, unique() and a clever for loop can solve the problem.
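For completeness, a more recent alternative is to build the lines with sf and hand them straight to Leaflet. A sketch (untested), assuming the shape_orig data frame from Stage 1:
library(sf)
library(dplyr)
lines_sf <- shape_orig %>%
  arrange(shape_id, shape_pt_sequence) %>%
  st_as_sf(coords = c("shape_pt_lon", "shape_pt_lat"), crs = 4326) %>%
  group_by(shape_id) %>%
  summarise(do_union = FALSE) %>%  # keep point order when combining
  st_cast("LINESTRING")
leaflet(lines_sf) %>% addTiles() %>% addPolylines()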
