I'm pretty new to using rgdal so I'm hoping this is something simple that I'm missing, but I've been googling around about it for a few hours and I can't figure out the issue.
Basically I'm trying to make a leaflet map in a shiny app, but I'm getting snarled right at the beginning, trying to load country data like so:
library(rgdal)
countries <- readOGR("https://raw.githubusercontent.com/datasets/geo-boundaries-world-110m/master/countries.geojson", "OGRGeoJSON")
but every time I get the following error:
Error in ogrInfo(dsn = dsn, layer = layer, encoding = encoding, use_iconv = use_iconv, :
Cannot open file
I've gone to the address and I see the raw geojson file there, so it's not a missing file. I've also downloaded the file manually into a data folder and then tried to access it with
countries <- readOGR("data/countries.geojson", "OGRGeoJSON")
and I get the same error. Any ideas would be much appreciated.
I'm running R on Windows 7.
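For what it's worth, here is how I've been sanity-checking the local copy (the data/ path is from my project, and I'm not certain the driver check is the right one):

```r
library(rgdal)

# Check that R can see the file at all from the app's working directory
file.exists("data/countries.geojson")

# Check that the GDAL build behind rgdal includes the GeoJSON driver
"GeoJSON" %in% ogrDrivers()$name
```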
Related
I downloaded CHELSA bio files to my local directory.
When I tried to create a raster for my R project, I kept getting an error with the following message:
Error in .rasterObjectFromFile(x, band = band, objecttype = "RasterLayer", :
Cannot create a RasterLayer object from this file.
My code is simple, and the same code worked to create a WorldClim climate raster, so I cannot figure out the difference.
download.file(url = "https://os.zhdk.cloud.switch.ch/envicloud/chelsa/chelsa_V2/GLOBAL/climatologies/1981-2010/bio/CHELSA_bio1_1981-2010_V.2.1.tif",
destfile = "Chelsa/bio1.tif")
bio1 <- raster("Chelsa/bio1.tif")
Could anyone advise on this?
I also tried
bio1 <- stack("Chelsa/bio1.tif")
But a similar error message popped up.
I also changed the location: instead of putting the file under a subdirectory (named Chelsa), I put it directly into my home directory. Neither worked.
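One thing I've since wondered about (I'm not sure it applies here) is the download mode: on Windows, download.file() can corrupt binary files such as GeoTIFFs unless mode = "wb" is set, so I re-downloaded in binary mode and checked the size on disk:

```r
# Re-download explicitly in binary mode; a text-mode download of a
# GeoTIFF is typically truncated or corrupted on Windows
download.file(
  url = "https://os.zhdk.cloud.switch.ch/envicloud/chelsa/chelsa_V2/GLOBAL/climatologies/1981-2010/bio/CHELSA_bio1_1981-2010_V.2.1.tif",
  destfile = "Chelsa/bio1.tif",
  mode = "wb"
)
file.size("Chelsa/bio1.tif")  # compare against the size the server reports
```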
I've been using mapshot a lot to send people interactive maps of data, but recently, although I can still make the maps I want with mapview, I can't save them.
Example:
map <- mapview(mapdata, zcol = "columnofinterest", burst = TRUE)
mapshot(map, url = paste0(getwd(), "/whatIwanttocallmymap.html"))
File whatIwanttocallmymap_files/PopupTable-0.0.1/popup.css not found in resource path
Error: pandoc document conversion failed with error 99
I'm afraid I've messed something up in how I install packages: folders with package names are turning up in the directory I've set as my working directory instead of in my R library.
Thank you for any help/suggestions you might have
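In case it narrows things down, I also tried saving the underlying leaflet widget directly with htmlwidgets (mapview keeps the widget in the @map slot; selfcontained = FALSE avoids the pandoc step that is failing for me):

```r
library(htmlwidgets)

# Write the leaflet widget inside the mapview object straight to HTML;
# selfcontained = FALSE skips the pandoc conversion entirely
saveWidget(map@map, file = "whatIwanttocallmymap.html", selfcontained = FALSE)
```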
Hi, I am trying to read and plot a custom shapefile in R which is not a map.
This is the code I use and the error I get in return:
library(rgdal)
mySHP <- choose.files()
myFile <- readOGR(mySHP)
Error in ogrListLayers(dsn = dsn) : Cannot open data source
If your file is a shapefile, you need to specify dsn, which is the directory where the shapefile is saved, and layer, which is the name of the shapefile without the extension. You cannot really do that with choose.files(), at least not that simply.
myFile <- readOGR(dsn='path.to.folder', layer='nameOfShapefile')
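If you do want to start from an interactive picker, one sketch (assuming you select the .shp file itself) is to split the chosen path into the two pieces readOGR() expects:

```r
library(rgdal)

# Let the user pick the .shp file, then derive dsn (the containing
# folder) and layer (the file name without its extension) from the path
shp.path <- file.choose()
myFile <- readOGR(dsn = dirname(shp.path),
                  layer = tools::file_path_sans_ext(basename(shp.path)))
```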
I have a Shiny app that accesses data from a Dropbox account. I used the instructions at https://github.com/karthik/rdrop2/blob/master/README.md and have been able to read in csv data with no problem, i.e. using the drop_read_csv command from the rdrop2 package after doing the authentication step.
e.g.
my_data<-drop_read_csv("ProjectFolder/DataSI.csv")
My next problem however is that there are going to be a lot of gpx track files uploaded to the dropbox that I want the app to be able to read in. I have tried using:
gpx.files <- drop_search('gpx', path = "ProjectFolder/gpx_files")
trk.tmp <- vector("list", dim(gpx.files)[1])
for (i in 1:dim(gpx.files)[1]) {
  trk.tmp[[i]] <- readOGR(gpx.files$path[i], layer = "tracks")
}
But no luck. At the readOGR step, I get:
Error in ogrInfo(dsn = dsn, layer = layer, encoding = encoding, use_iconv = use_iconv, :
Cannot open data source
Hopefully someone can help.
My problem was that I hadn't specified the Dropbox path properly. I used the drop_read_csv code as a model and made a drop_readOGR version:
drop_readOGR <- function(my.file, dest = tempdir()) {
  localfile <- paste0(dest, "/", basename(my.file))
  drop_get(my.file, local_file = localfile, overwrite = TRUE)
  readOGR(localfile, layer = "tracks")
}
So now I can just use what I was doing before except I have changed the line in the loop to call the new function.
gpx.files <- drop_search('gpx', path = "ProjectFolder/gpx_files")
trk.tmp <- vector("list", dim(gpx.files)[1])
for (i in 1:dim(gpx.files)[1]) {
  trk.tmp[[i]] <- drop_readOGR(gpx.files$path[i])
}
wmap <- readOGR(dsn="~/R/funwithR/data/ne_110m_land", layer="ne_110m_land")
This code is not loading the shapefile, and the following error is generated:
Error in ogrInfo(dsn = dsn, layer = layer, encoding = encoding, use_iconv = use_iconv, :
Cannot open file
I am sure that the directory is the correct one. There is no trailing / at the end, and the layer name is also correct.
Inside the ne_110m_land directory, the files I have are:
ne_110m_land.dbf
ne_110m_land.prj
ne_110m_land.shp
ne_110m_land.shx
ne_110m_land.VERSION.txt
ne_110m_land.README.html
You could have shown that you have the right path with:
list.files('~/R/funwithR/data/ne_110m_land', pattern='\\.shp$')
file.exists('~/R/funwithR/data/ne_110m_land/ne_110m_land.shp')
perhaps try:
readOGR(dsn=path.expand("~/R/funwithR/data/ne_110m_land"), layer="ne_110m_land")
or a simpler alternative that is wrapped around that:
library(raster)
s <- shapefile("~/R/funwithR/data/ne_110m_land/ne_110m_land.shp")
Update:
rgdal has changed a bit and you do not need to separate the path and layer anymore (at least for some formats). So you can do
x <- readOGR("~/R/funwithR/data/ne_110m_land/ne_110m_land.shp")
(perhaps still using path.expand)
Also, if you are still using readOGR you are a bit behind the times. It is better to use terra::vect or sf::st_read.
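A minimal sketch of those modern equivalents, using the same path as above:

```r
# sf replacement: st_read() takes the full path to the .shp file
library(sf)
x_sf <- st_read("~/R/funwithR/data/ne_110m_land/ne_110m_land.shp")

# terra replacement: vect() likewise takes the full path
library(terra)
x_terra <- vect("~/R/funwithR/data/ne_110m_land/ne_110m_land.shp")
```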
I had the same error. To read in a shapefile, you need to have three files in your folder: the .shp, .dbf and .shx files.
For me, the command returned the Cannot open layer error when I included the dsn and layer arguments.
So when I included it all just as
readOGR('~/R/funwithR/data/ne_110m_land/ne_110m_land.shp')
it worked.
Note that my file was a gjson, so I've only tested this with
readOGR('~/R/funwithR/data/ne_110m_land/ne_110m_land.gjson')
The mandatory files should all be in the same directory:
.shp — shape format
.shx — shape index format
.dbf — attribute format
Then we can just give the path as a parameter to the function and it will work.
global_24h <- readOGR('/Users/m-store/Desktop/R_Programing/global_24h.shp')
Here's what worked for me (with a real example)
require(rgdal)
shape <- readOGR(dsn = "1259030001_ste11aaust_shape/STE11aAust.shp", layer = "STE11aAust")
The exact data is available here (download the .zip file called 'State and Territory ASGC Ed 2011 Digital Boundaries in MapInfo Interchange Format')
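For completeness, a sketch of the full workflow; the zip file name below is a guess based on the extracted folder name, so adjust it to whatever your download is actually called:

```r
library(rgdal)

# Hypothetical zip name -- replace with the actual downloaded file name
unzip("1259030001_ste11aaust_shape.zip", exdir = "1259030001_ste11aaust_shape")
shape <- readOGR(dsn = "1259030001_ste11aaust_shape/STE11aAust.shp",
                 layer = "STE11aAust")
```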
The syntax
library(raster)
s <- shapefile("~/R/funwithR/data/ne_110m_land/ne_110m_land.shp")
worked perfectly! Thank you very much!
As I commented in another post (Error when opening shapefile), using file.choose() and selecting the file manually will help in the case where a single file selection is needed. Apparently this is related to Natural Earth shapefiles.
It seems to me that this is the solution, at least before uploading it to the cloud:
######################################
# Server
######################################
# I tell R where to pull the data from
dirmapas <- "E:/Tu-carpeta/Tu-sub-carpeta/ESRI" # Depends on where you keep
# your cartography files
setwd(dirmapas)
# The raw map: the blank polygon map
departamentos <- readOGR(dsn = "BAS_LIM_DEPARTAMENTO.shp", layer = "BAS_LIM_DEPARTAMENTO")
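A quick follow-up check (my addition, not part of the original snippet) to confirm the layer actually loaded:

```r
# Print a summary of the SpatialPolygonsDataFrame and draw the outlines
summary(departamentos)
plot(departamentos)
```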