plot VIIRS raster data (.h5) in R

I'm looking into how to handle .h5 (HDF5) VIIRS DNB raster data in R. I would like to plot it or export it as a GeoTIFF.
So far I am able to read in the file with
library(h5)
hdf <- h5file("mypath/GDNBO-SVDNB_npp_d20171101_t0110370_e0116174_b31151_c20180208224859630066_nobc_ops.h5", mode = "r")
But I cannot figure out how to access the "radiance" band, let alone plot it. I have read the "h5" package documentation without making any progress.
As another option, I have tried to export the h5 file to GeoTIFF with:
library(gdalUtils)  # provides get_subdatasets() and gdal_translate()
sds <- get_subdatasets("subset_1_d20171101_t0110370.h5")
gdal_translate(sds[17], dst_dataset = "radiance.tif", overwrite = TRUE) # subset 17 is the radiance band I need
However, the file loses its geolocation during the conversion: when the GeoTIFF is opened in e.g. QGIS, it doesn't have any location or projection.
Does anyone know how to deal with this type of file in R?
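One possible route (my own sketch, not from the original post) is to read the radiance and geolocation arrays directly with the rhdf5 package and wrap them in a raster. The dataset paths below follow the usual NPP SDR layout but are assumptions; check them with h5ls() on the actual file first. The georeferencing is only approximate because it uses the lat/lon range as a plain extent rather than the true swath geometry.
library(rhdf5)   # Bioconductor package for reading HDF5
library(raster)
f <- "mypath/GDNBO-SVDNB_npp_d20171101_t0110370_e0116174_b31151_c20180208224859630066_nobc_ops.h5"
h5ls(f)  # list every group/dataset to locate the radiance and lat/lon arrays
rad <- h5read(f, "All_Data/VIIRS-DNB-SDR_All/Radiance")   # assumed path
lat <- h5read(f, "All_Data/VIIRS-DNB-GEO_All/Latitude")   # assumed path
lon <- h5read(f, "All_Data/VIIRS-DNB-GEO_All/Longitude")  # assumed path
# Wrap the array in a raster, using the lat/lon range as the extent (approximate)
r <- raster(t(rad),
            xmn = min(lon), xmx = max(lon),
            ymn = min(lat), ymx = max(lat),
            crs = "+proj=longlat +datum=WGS84")
plot(r)
writeRaster(r, "radiance.tif", overwrite = TRUE)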

Related

Map XML data from ArcGIS

Is it possible to plot/map XML data from ArcGIS in R? I have an XML file with a basemap that I would like to map in R.
I don't know how to attach my XML file to my question, but if someone could show me how to plot using this type of file, that would be awesome. I usually make my maps in R using ggplot and geom_sf.
# How I loaded my XML file into R
# Load the package required to read XML files.
library("XML")
# Also load the other required package.
library("methods")
# Give the input file name to the function.
result <- xmlParse(file = "Grey background.xml")
# Print the result.
print(result)
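A first step (my own suggestion, not from the thread) is to inspect what the XML actually contains before trying to map it; an ArcGIS basemap XML often just references a map service rather than holding any geometry. A minimal sketch using the same XML package as above:
library(XML)
doc <- xmlParse(file = "Grey background.xml")
root <- xmlRoot(doc)
xmlName(root)              # name of the top-level element
xmlSApply(root, xmlName)   # names of its child elements
str(xmlToList(doc))        # full structure as a nested list (can be large)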

.dat to raster conversion

I have downloaded a dataset that is supposed to be a global map of freezing and thawing indices. Unfortunately I cannot load or view it as a raster in R. Does anyone know how to convert this type of file to a .bil file?
I can load the file using: dat <- read.delim('freez.dat', header = F)
But I cannot get this to plot as a traditional raster.
Any input is appreciated.
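If the .dat file is simply a regular lat/lon grid of values, one option is to wrap the matrix in a raster and write it out as .bil. This is only a sketch under that assumption; the global extent and CRS below are guesses and should be taken from the dataset's documentation.
library(raster)
dat <- read.delim("freez.dat", header = FALSE)
# Assumption: rows run north to south and columns west to east over the whole globe
r <- raster(as.matrix(dat),
            xmn = -180, xmx = 180, ymn = -90, ymx = 90,
            crs = "+proj=longlat +datum=WGS84")
plot(r)
# "EHdr" is the ESRI .hdr-labelled format, i.e. a .bil file
writeRaster(r, "freez.bil", format = "EHdr", overwrite = TRUE)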

How to load a geospatial pdf in R?

I am new to handling spatial data, and even newer to doing it in R.
My latest attempt was trying to read data in geospatial PDF format. The file contains information about Mexican political boundaries, so polygons.
I tried to use the rgdal package to read the data. After typing ogrDrivers()[40:45,], which shows the available drivers, I got:
         name write
40     PCIDSK  TRUE
41        PDF  TRUE
42        PDS FALSE
43     PGDump  TRUE
44       PGeo FALSE
45 PostgreSQL  TRUE
The result shows that there is a driver for PDFs, but trying the usual way to read files, readOGR(dsn = "data source name", layer = "LAYER"), produces:
Error in ogrInfo(dsn = dsn, layer = layer, encoding = encoding, use_iconv = use_iconv, :
Cannot open file
The help page for the function does not say what values are expected for dsn or layer when the file is in geospatial PDF format.
Does anybody know a way to import data from a PDF in geospatial format? I would appreciate any answer.
By the way, I am on Ubuntu 14.04.3 with QGIS installed, and the latest versions of R and rgeos.
The dsn is the file path, and the layer name is internal to the PDF. You can get a list of layers with ogrListLayers on the file name:
> ogrListLayers("foo.pdf")
[1] "polys"
attr(,"driver")
[1] "PDF"
attr(,"nlayers")
[1] 1
Ugly output, but that's one layer called polys. So I can read it like this:
> polys = readOGR("./foo.pdf","polys")
OGR data source with driver: PDF
Source: "./foo.pdf", layer: "polys"
with 9 features
It has 1 fields
Note that this only applies to a special class of PDF files with the map data stored in a particular way. Just because your PDF has a map in it doesn't make it a Geospatial PDF. Here's the command-line test on my Geospatial PDF:
$ ogrinfo Monaco/foo.pdf
Had to open data source read-only.
INFO: Open of `Monaco/foo.pdf'
using driver `PDF' successful.
1: polys (Polygon)
and here's the test on yours:
$ ogrinfo CED06_CARTA_110614.pdf
FAILURE:
Unable to open datasource `CED06_CARTA_110614.pdf' with the following drivers.
-> ESRI Shapefile
-> MapInfo File
[etc etc]
-> PDF
[etc etc]
So you don't have a Geospatial PDF.
Your options are, in rough order of simplicity, something like:
Get the boundary data in a GIS data format (e.g. shapefile, GeoPDF)
Save as an image, load into a GIS, georeference and trace it (QGIS can do this)
Get the raw PDF vectors out of the PDF, assuming they are vectors (first glance shows me the map isn't an image), then find the right transformation to whatever coordinate system, then possibly rebuild the topology if all you have is line segments...
I've had a little bit of success using pstoedit to convert the PDF to a DXF file, which can be loaded into QGIS, but then you have to clean it up and reconstruct the polygons, and it's still not in the right geographical location. It would be much simpler if you can get a shapefile of the regions you are interested in.
If what you want is a raster version of the PDF, you can use either raster::stack("file.pdf") or readGDAL("file.pdf"). But you'll get an image with no georeferencing (its bounding box will just be the pixel dimensions), since there is no coordinate system in the PDF.
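For completeness, a small sketch of that raster route (my addition, assuming GDAL's PDF driver can rasterise the file; the band count depends on the PDF):
library(raster)
img <- stack("CED06_CARTA_110614.pdf")  # pixel-indexed only, no CRS
nlayers(img)                            # typically 3 (RGB) for a rasterised PDF
plotRGB(img)                            # or plot(img[[1]]) if there is a single band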

Using shapefile as an input to the user defined function using R

I have a script that creates randomly distributed square polygons in KML format. It takes a shapefile with a single polygon as input and works absolutely fine. The problem arises when I try to turn it into a user-defined function. I used the readShapePoly() function to read the shapefile, and it works well when used outside the function. But when the function is created with the shapefile as an input, it simply won't take it. It shows this error message:
Error in getinfo.shape(filen) : Error opening SHP file
I leave out the file extension when calling it, and I do have all the component files that make up the shapefile.
The part of the script that reads the shapefile, taking it as the input file:
library(maptools)
library(sp)
library(ggplot2)
Polytokml <- function(shapefile_name){
  ### Input Files
  file1 <- "shapefile_name"
  Shape_file <- readShapePoly(file1) # requires maptools
  return(Shape_file)
}
The function is created, but it doesn't work when it is called:
>Polytokml(HKH.shp)
Error in getinfo.shape(filen) : Error opening SHP file
This works well outside the function:
###Input Files
file1 <- "shapefile.shp"
Shape_file <- readShapePoly(file1) #requires maptools
This is just an excerpt from the whole script, which takes several different arguments as input. To keep things simple, I have only included the part that reads the shapefile, which is where the problem is. Do let me know if anything is unclear.
Thank you so much in advance :)
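A likely explanation and fix (my reading of the code above, not an answer from the thread): inside the function the argument is quoted, so file1 <- "shapefile_name" assigns the literal string "shapefile_name" rather than the value passed in, and readShapePoly() then looks for a file of that name. The call at the prompt also needs the file name as a character string. A minimal corrected sketch:
library(maptools)
Polytokml <- function(shapefile_name){
  # Use the argument itself, not the quoted string "shapefile_name"
  Shape_file <- readShapePoly(shapefile_name) # requires maptools
  return(Shape_file)
}
# Pass the file name as a character string
Shape_file <- Polytokml("HKH.shp")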

Creating Shapefiles in R

I'm trying to create a shapefile in R that I will later import to either Fusion Table or some other GIS application.
To start, I imported a blank shapefile containing all the census tracts in Canada. I have attached other data (in tabular format) to the shapefile based on the unique ID of the CTs, and I have mapped my results. At the moment, I only need the ones in Vancouver, and I would like to export a shapefile that contains only the Vancouver CTs as well as my newly attached attribute data.
Here is my code (some parts omitted due to privacy reasons):
shape <- readShapePoly('C:/TEST/blank_ct.shp') #Load blank shapefile
shape@data = data.frame(shape@data, data2[match(shape@data$CTUID, data2$CTUID),]) # data2 is my created attributes that I'm attaching to blank file
shape1 <- shape[shape$CMAUID == 933,] # selecting the Vancouver CTs
I've seen other examples using writePolyShape to create the shapefile. I tried it, and it worked to an extent: it created the .shp, .dbf, and .shx files, but I'm missing the .prj file and I'm not sure how to go about creating it. Are there better methods out there for creating shapefiles?
Any help on this matter would be greatly appreciated.
Use rgdal and writeOGR. rgdal will preserve the projection information.
Something like:
library(rgdal)
shape <- readOGR(dsn = 'C:/TEST', layer = 'blank_ct')
# do your processing
shape@data = data.frame(shape@data, data2[match(shape@data$CTUID, data2$CTUID),]) # data2 is my created attributes that I'm attaching to blank file
shape1 <- shape[shape$CMAUID == 933,]
writeOGR(shape1, dsn = 'C:/TEST', layer ='newstuff', driver = 'ESRI Shapefile')
Note that the dsn is the folder containing the .shp file, and the layer is the name of the shapefile without the .shp extension. It will read (readOGR) and write (writeOGR) all the component files (.dbf, .shp, .prj, etc.).
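A quick way to confirm the round trip worked (my addition, assuming the same paths as above): read the result back and check that a CRS is attached and that the .prj file exists.
check <- readOGR(dsn = 'C:/TEST', layer = 'newstuff')
proj4string(check)                    # should print the CRS rather than NA
file.exists('C:/TEST/newstuff.prj')   # TRUE if the .prj was written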
Problem solved! Thank you again to those who helped!
Here is what I ended up doing:
As Mnel wrote, this line will create the shapefile.
writeOGR(shape1, dsn = 'C:/TEST', layer ='newstuff', driver = 'ESRI Shapefile')
However, when I ran this line, it came back with this error:
Can't convert columns of class: AsIs; column names: ct2,mprop,mlot,mliv
This is because my attribute data was not numeric but character. Luckily, my attribute data is all numbers, so I ran transform() to fix the problem.
shape2 <- shape1
shape2@data <- transform(shape1@data, ct2 = as.numeric(ct2),
                         mprop = as.numeric(mprop),
                         mlot = as.numeric(mlot),
                         mliv = as.numeric(mliv))
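A more generic alternative (my own sketch, not from the thread) converts every character or AsIs column to numeric in one step:
shape2 <- shape1
shape2@data[] <- lapply(shape2@data, function(x)
  if (is.character(x) || inherits(x, "AsIs")) as.numeric(as.character(x)) else x)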
I tried the writeOGR() command again, but I still didn't get the .prj file I was looking for. The problem was that I hadn't specified the coordinate system for the shapefile when importing it. Since I already know what the coordinate system is, all I had to do was define it when importing:
shape <- readShapePoly('C:/TEST/blank_ct.shp', proj4string = CRS("+proj=longlat +datum=WGS84"))
After that, I re-ran all the things I wanted to do with the shapefile, and the writeOGR line for exporting. And that's it!
