Is there a way to read a shape file with a specific character encoding? I'm trying to read in a Canadian shapefile that has special (French) characters in some of the names. I can convert them manually, but I'd prefer not to do this if there's a setting somewhere that I'm so far blind to.
# manual conversion works
library(maptools)
shp <- file.path("path/to/file.shp")
map <- readShapePoly(shp, proj4string = CRS("+init=epsg:25832"))
map$ERNAME <- iconv(map$ERNAME, "Windows-1252", "UTF-8")
Instead of using maptools and readShapePoly, the rgdal library's readOGR function allows for more options. For example, the syntax with rgdal could be:
pasl <- readOGR(".", "filename", use_iconv = TRUE, encoding = "UTF-8")
Be aware that this is not a universal solution; it depends on the encoding of the file, which in the OP's case was UTF-8. Another common encoding is latin1. For some shapefiles the encoding is named in the .cpg file that accompanies the .shp file (open it with a text editor). QGIS automatically generates a .cpg file when a new shapefile is created, but many other GIS packages do not.
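If a .cpg file is present, you can read the declared encoding and hand it straight to readOGR. A minimal sketch, assuming the sidecar file sits next to the .shp and holds a single encoding name (the paths below are placeholders):
library(rgdal)
# hypothetical paths; point these at your own shapefile
shp_path <- "path/to/file.shp"
cpg_path <- sub("\\.shp$", ".cpg", shp_path)
# read the encoding declared in the .cpg sidecar, falling back to UTF-8
enc <- if (file.exists(cpg_path)) readLines(cpg_path, warn = FALSE)[1] else "UTF-8"
map <- readOGR(dirname(shp_path), tools::file_path_sans_ext(basename(shp_path)),
               use_iconv = TRUE, encoding = enc)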
Related
I'm using Leaflet in R to make an interactive map.
The data I'm using was created in QGIS with Windows-1250 encoding (it's a non-English language). The layers were then exported as GeoJSON, which only allows UTF-8 encoding, so when displaying the features on the map some characters are, of course, displayed wrong.
Is there a way of displaying the characters properly? Trying to manually change the data results in:
Error: unexpected symbol in 'x' (x is the wrongly encoded data point I tried to rewrite).
You haven't specified the vector format of the data before it was exported - this matters more than the tool used to create it.
Assuming that the data is in shapefiles, you can use the approach from https://gis.stackexchange.com/q/44057 and rely on ogr2ogr to both convert to GeoJSON and handle the character encoding.
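A minimal sketch of that conversion, run from R with system(); the file names are placeholders, and SHAPE_ENCODING is the GDAL config option that tells the shapefile driver which code page to read from:
# hedged sketch: re-encode a Windows-1250 shapefile to UTF-8 GeoJSON via ogr2ogr
# (input.shp / output.geojson are placeholder names; requires GDAL on the PATH)
system('ogr2ogr -f GeoJSON output.geojson input.shp --config SHAPE_ENCODING "WINDOWS-1250"')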
I have just downloaded some climate data in grib format. I want to use "R" to convert it to NetCDF format.
Furthermore, as the file consists of different variables, I would like to extract one variable at a time into individual files.
It's hard to answer this without your specific file. You should look into producing reproducible examples, especially if you're posting to the R board.
For R, check out library(raster) and library(ncdf4). I just grabbed the first grib1 file I saw, and put together a quick example.
library(raster)
library(ncdf4)
# grab an example GRIB1 file and read it as a raster
download.file(url = 'ftp://ftp.hpc.ncep.noaa.gov/grib/20130815/p06m_2013081500f030.grb', destfile = 'test.grb')
(r <- raster('test.grb'))
# write it back out in NetCDF format
n <- writeRaster(r, filename = 'netcdf_in_youR_comp.nc', overwrite = TRUE)
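To address the second part of the question (one variable per file), a hedged sketch: read the GRIB file as a multi-layer stack and write each layer to its own NetCDF file. The output names are placeholders, and whether layers map one-to-one onto your variables depends on the file:
# hedged: write each band of the GRIB file to a separate NetCDF file
s <- stack('test.grb')
for (i in seq_len(nlayers(s))) {
  writeRaster(s[[i]], filename = paste0('variable_', i, '.nc'),
              format = 'CDF', overwrite = TRUE)
}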
1. rNOMADS
The rNOMADS package has a function ReadGrib that provides wrappers to external libraries, allowing one to read GRIB files.
2. Converting to NetCDF
If the GRIB data is on a regular lat-lon grid, then an easier way is probably to convert it to NetCDF, as support for reading that format is more developed (and you are probably already used to working with it).
You can convert GRIB in several ways; two of the easiest are:
CDO:
cdo -f nc copy test.grb test.nc
Use "-f nc4" if you want netcdf4 conventions.
ecCodes (on a Mac, install with brew install eccodes):
grib_to_netcdf -o test.nc test.grb
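Neither one-liner splits the file by variable, which the question also asks for. A hedged sketch using CDO's splitname operator, called from R (test.nc is the file produced by the cdo command above; the var_ prefix is a placeholder):
# hedged: write every variable in test.nc to its own file, named var_<VARNAME>.nc
system("cdo splitname test.nc var_")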
You can also use NCL's ncl_convert2nc utility, installed on your computer:
library(ncdf4)  # ncdf4 replaces the retired ncdf package
# call NCL's converter; it writes a NetCDF file named after the input GRIB file
system("ncl_convert2nc xxxx.grb", intern = TRUE)
my.nc <- nc_open("xxxx.nc")
print(my.nc)
I would like to create a RasterLayer from GeoTIFF data provided by a server. I'll query the server for this data using an httr::GET call (the data is provided on-demand, so in the application there won't be a URL ending in .tif but a query URL).
After writing the result of this call to disk as a GeoTIFF file it's easy enough to create the RasterLayer from the resulting GeoTIFF file on disk:
library(httr)
library(raster)
url <- 'http://download.osgeo.org/geotiff/samples/gdal_eg/cea.tif'
geotiff_file <- tempfile(fileext='.tif')
httr::GET(url, httr::write_disk(path = geotiff_file))
my_raster <- raster(geotiff_file)
my_raster
However, I would like to skip the write to disk part and create the raster straight from the in-memory server response.
response <- httr::GET(url, httr::write_memory())
response
The content of the response is a raw vector which I would need to interpret as GeoTIFF data.
str(httr::content(response))
However, I can only find raster or rgdal functions to read from a file. Any suggestions on translating this raw string to a raster?
Thanks!
GDAL has some cool virtual file system drivers, one of which is /vsicurl, which
allows on-the-fly random reading of files available through HTTP/FTP web protocols, without prior download of the entire file. It requires GDAL to be built against libcurl.
Since the raster package builds on rgdal you can simply do this:
library(raster)
r <- raster('/vsicurl/http://download.osgeo.org/geotiff/samples/gdal_eg/cea.tif')
plot(r)
I am trying to use multiple sources of German/Swiss data with umlauts in it. When trying to merge, I realized that the umlauts do not display correctly in R and the same names were rendered differently in different files.
map <- readOGR("/path/to/data.gdb", layer = "layer")
map@data$name
# [1] L\303\266rrach
# [2] Karlsruhe
# [3] ...
Along with several other posts, I read Encoding of German umlauts when using readOGR because one of my data sources is a shp file I read in with readOGR.
Adding use_iconv = TRUE, encoding = "UTF-8" to the readOGR call did not help, and the problem exists outside of readOGR. I saw that using Sys.setlocale() and a locale which supports UTF-8 worked for that poster, but I don't know what that means, even after looking at the ?Sys.setlocale help page.
How do I correctly read in German data in R on a Mac using English? Sys.getlocale() reports "C".
Could you somehow include an example .gdb file?
What happens if you try encoding = "latin1"?
Maybe the gdb data was saved with the wrong encoding? Are you creating it yourself, or did you download it from somewhere?
You could also check the information of the gdb-file with this command:
ogrinfo -al "/path/to/data.gdb"
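Regarding the Sys.setlocale() step the poster was unsure about, here is a hedged sketch: switch R from the "C" locale to a UTF-8 locale before reading, and try the latin1 encoding suggested above ("en_US.UTF-8" is an assumption; use any UTF-8 locale available on your Mac):
library(rgdal)
# hedged: switch from the "C" locale to a UTF-8 locale, then re-read the layer
Sys.setlocale("LC_ALL", "en_US.UTF-8")
Sys.getlocale()   # should no longer report "C"
map <- readOGR("/path/to/data.gdb", layer = "layer", use_iconv = TRUE, encoding = "latin1")
map@data$name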