R: Raster mosaic from list of rasters?

I am working from the post "How can I create raster mosaic using list of rasters?" to create a raster mosaic from a list of rasters. The example in the answer given by fmark works perfectly, but I get an error when I follow the same steps with my own data. I'm not sure where I am going wrong; any help would be much appreciated!
R version 2.15.3 (2013-03-01)
Platform: x86_64-unknown-linux-gnu (64-bit)
locale:
[1] C
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] raster_2.2-12 rgdal_0.8-10 sp_1.0-14
loaded via a namespace (and not attached):
[1] grid_2.15.3 lattice_0.20-15 tools_2.15.3
I used the function from How to iterate over a list preserving the format of the results? to generate my raster list.
ListRasters <- function(list_names) {
  raster_list <- list()  # initialise the list of rasters
  for (i in 1:(length(list_names))) {
    grd_name <- list_names[i]  # list_names contains all the names of the images in .grd format
    raster_file <- raster(grd_name)
    raster_list <- append(raster_list, raster_file)  # update raster_list at each iteration
  }
  raster_list
}
Then I generate my list names and create my raster list from them.
wgs84.tif.list <- list.files(path=mod.dir, pattern=glob2rx("*_wgs84.tif"), full.names=T,recursive=F)
list_names <- NULL
for (i in 1:length(wgs84.tif.list)) {
  list_names <- c(list_names, wgs84.tif.list[i])
}
raster.list <- sapply(list_names, FUN = ListRasters)
raster.list$fun <- mean
mos <- do.call(mosaic, raster.list)
This is the error I get:
Error in function (classes, fdef, mtable) : unable to find an inherited method
  for function 'mosaic' for signature '"missing", "missing"'
My raster.list starts off like this (it contains 11 rasters):
$`/import/c/w/kbennett/MODSCAG/snow-dav.jpl.nasa.gov/modscag-historic/2002/091/MOD09GA.A2002091.h08v03.005.2007124035032snow_fraction_wgs84.tif`
class : RasterLayer
dimensions : 2400, 2400, 5760000 (nrow, ncol, ncell)
resolution : 463.3127, 463.3127 (x, y)
extent : -11119737, -10007786, 5559984, 6671935 (xmin, xmax, ymin, ymax)
coord. ref. : +proj=sinu +lon_0=0 +x_0=0 +y_0=0 +datum=WGS84 +units=m +no_defs +ellps=WGS84 +towgs84=0,0,0
data source : /import/c/w/kbennett/MODSCAG/snow-dav.jpl.nasa.gov/modscag-historic/2002/091/MOD09GA.A2002091.h08v03.005.2007124035032snow_fraction_wgs84.tif
names : MOD09GA.A2002091.h08v03.005.2007124035032snow_fraction_wgs84
values : 0, 255 (min, max)

My rasters were not named correctly. To rectify this, I ran the following before calling mosaic on the list:
names(raster.list) <- NULL
Then:
raster.list$fun <- mean
mos <- do.call(mosaic, raster.list)
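A minimal end-to-end sketch of that fix (my own consolidation, assuming the *_wgs84.tif tiles share the same resolution and origin; lapply is used here instead of the ListRasters helper):
library(raster)
files <- list.files(mod.dir, pattern = glob2rx("*_wgs84.tif"), full.names = TRUE)
raster.list <- lapply(files, raster)  # an unnamed list of RasterLayer objects
names(raster.list) <- NULL            # mosaic() fails if all elements are named
raster.list$fun <- mean
mos <- do.call(mosaic, raster.list)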

To expand a bit on foo's answer: you can use sapply to create a list of RasterLayer objects.
rlist <- sapply(list_names, raster)
Then add the names of the other arguments. The first ones are 'x' and 'y' (see ?mosaic). However, it will also work if the names are left NULL, as their positions will then be used.
names(rlist)[1:2] <- c('x', 'y')
rlist$fun <- mean
rlist$na.rm <- TRUE
And now call do.call
x <- do.call(mosaic, rlist)

How about this? I'm a noob in R.
# lista: a list of rasters
mosaicar = function(lista){
  raster = lista[[1]]
  for (i in 2:length(lista)){
    raster1 = mosaic(raster, lista[[i]], fun = max)
    raster = raster1
  }
  return(raster)
}
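A hypothetical call, assuming raster.list from the question is a plain list of RasterLayer objects (without the $fun element added):
mos <- mosaicar(raster.list)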

As mentioned by @Bappa Das above, the provided solution does not work with terra. @moho wu did not mention the na.rm issue. It remains unclear how to pass na.rm to terra::mosaic. If anyone has a working answer...
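For what it's worth, a minimal sketch of the terra workflow via a SpatRasterCollection (my own addition, assuming the tiles align; whether na.rm can be passed through remains the open question):
library(terra)
files <- list.files(pattern = "_wgs84\\.tif$", full.names = TRUE)  # hypothetical file pattern
s <- sprc(lapply(files, rast))  # combine the tiles into a SpatRasterCollection
mos <- mosaic(s, fun = "mean")  # mosaic() has a method for SpatRasterCollection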

Related

How can I create an R loop with the code provided below?

Please, I need help creating a loop that would run the computation shown in the code below on an hdf list containing 483 files in R. I have added a link to two .hdf files and the shapefiles for trial. The code seems to work just fine for a single .hdf file, but I'm still struggling with the looping. Thank you.
download files from here
https://beardatashare.bham.ac.uk/getlink/fi2gNzWbuv5H8Gp7Qg2aemdM/
# import .hdf file into R using get_subdatasets to access the subsets in the file
sub <- get_subdatasets("MOD13Q1.A2020353.h18v08.006.2021003223721.hdf")
# convert red and NIR subsets and save them as rasters
gdalwarp(sub[4], 'red_c.tif')
gdalwarp(sub[5], 'NIR_c.tif')
# import red and NIR rasters back into R
# scale the rasters while at it
r_r=raster('red_c.tif') * 0.0001
r_N=raster('NIR_c.tif') * 0.0001
# calculate sigma using (0.5*(NIR+red))
sigma <- (0.5*(r_N+r_r))
# calculate knr using exp((-(NIR-red)^2)/(2*sigma^2))
knr <- exp((-(r_N-r_r)^2)/(2*sigma^2))
# calculate kndvi using (1 - knr) / (1 + knr)
kndvi <- (1 - knr) / (1 + knr)
# import shapefile into R
shp=readOGR(".", "National_Parks")
options(stringsAsFactors = FALSE)
# change crs of shapefile to crs of one of the rasters
shp2 <- spTransform(shp, crs(kndvi))
# use extent to crop/clip raster
## set extent
e <- extent(910000,980000, 530000, 650000)
## clip using crop function
crop_kndvi <- crop(kndvi, e)
# mask raster using the shapefile
kndvi_mask <- mask(crop_kndvi, shp2)
And then save kndvi_mask as a raster, for each of the 483 files.
Here is how you can do that with terra. terra is the replacement for raster; it is much faster and more versatile. For example, with terra you can skip the gdalwarp step.
You can write one big for-loop, but I prefer to use functions and then call these in a loop or lapply.
Also, instead of your raster-algebra approach, it could be more efficient to wrap the kndvi computation into its own function and use it with lapp. I think that is a better approach as the code is clearer and it allows you to re-use the kndvi function.
library(terra)
parks <- vect("National_Parks.shp")
parks <- project(parks, "+proj=sinu +lon_0=0 +x_0=0 +y_0=0 +R=6371007.181 +units=m")
e <- ext(910000,980000, 530000, 650000)
The kndvi function to be used by lapp:
kndvi <- function(red, NIR) {
  red <- red * 0.0001
  NIR <- NIR * 0.0001
  sigma <- (0.5 * (NIR + red))
  knr <- exp((-(NIR-red)^2)/(2*sigma^2))
  (1 - knr) / (1 + knr)
}
Main function. Note that I use crop before the other functions; that saves a lot of unnecessary processing.
fun <- function(f) {
  outf <- gsub(".hdf$", "_processed.tif", f)
  # if file.exists(outf) return(rast(outf))
  r <- rast(f)[[4:5]]
  # or r <- sds(f)[4:5]
  r <- crop(r, e)
  kn <- lapp(r, kndvi)
  name <- substr(basename(f), 9, 16)
  mask(kn, parks, filename=outf, overwrite=TRUE, names=name)
}
Get the filenames and use the function with a loop or with lapply as shown by Elia.
ff <- list.files(pattern="hdf$", full=TRUE)
x <- list()
for (i in 1:length(ff)) {
  print(ff[i]); flush.console()
  x[[i]] <- fun(ff[i])
}
z <- rast(x)
z
#class : SpatRaster
#dimensions : 518, 302, 2 (nrow, ncol, nlyr)
#resolution : 231.6564, 231.6564 (x, y)
#extent : 909946.2, 979906.4, 530029.7, 650027.7 (xmin, xmax, ymin, ymax)
#coord. ref. : +proj=sinu +lon_0=0 +x_0=0 +y_0=0 +R=6371007.181 +units=m +no_defs
#sources : MOD13Q1.A2020337.h18v08.006.2020358165204_processed.tif
# MOD13Q1.A2020353.h18v08.006.2021003223721_processed.tif
#names : A2020337, A2020353
#min values : 0.0007564131, 0.0028829363
#max values : 0.7608207, 0.7303495
This takes about 1 second per file on my computer.
Or, written as the for-loop you asked for:
ff <- list.files(pattern="hdf$", full=TRUE)
for (f in ff) {
  print(f); flush.console()
  outf <- gsub(".hdf$", "_processed.tif", f)
  r <- rast(f)[[4:5]]
  r <- crop(r, e)
  kn <- lapp(r, kndvi)
  name <- substr(basename(f), 9, 16)
  mask(kn, parks, filename=outf, overwrite=TRUE, names=name)
}
outf <- list.files(pattern="_processed.tif$", full=TRUE)
x <- rast(outf)
You can wrap your code in a function and then lapply over the hdf paths. That way, if your loop is too slow, it will be easy to parallelize.
You could try this:
library(gdalUtils)
library(raster)
library(rgdal)
#set the directory where you have .hdf files. In my case I downloaded your data in "D:/download"
setwd("D:/download")
#function to save the masked index in your current working directory
#the final file names will depend on the names of the input hdf files
myfun <- function(path){
  name <- basename(tools::file_path_sans_ext(path))
  sub <- get_subdatasets(path)
  gdalwarp(sub[4], paste0(name,'_red_c.tif'))
  gdalwarp(sub[5], paste0(name,'NIR_c.tif'))
  r_r=raster(paste0(name,'_red_c.tif')) * 0.0001
  r_N=raster(paste0(name,'NIR_c.tif')) * 0.0001
  sigma <- (0.5*(r_N+r_r))
  knr <- exp((-(r_N-r_r)^2)/(2*sigma^2))
  kndvi <- (1 - knr) / (1 + knr)
  crop_kndvi <- crop(kndvi, e)
  kndvi_mask <- mask(crop_kndvi,
                     shp2, filename=paste0(name,"_kndvi_mask.tif"))
}
#list the hdf files in your current working directory. Thanks to setwd("D:/download") there is no need to specify the path argument of list.files().
#however, for peace of mind:
hdf <- list.files(path=getwd(),pattern = "hdf",full.names = T)
#since your shapefile is always the same you could keep this part out of the function
shp=readOGR(".", "National_Parks")
options(stringsAsFactors = FALSE)
shp2 <- spTransform(shp, "+proj=sinu +lon_0=0 +x_0=0 +y_0=0 +a=6371007.181 +b=6371007.181 +units=m +no_defs")
e <- extent(910000,980000, 530000, 650000)
#now run your function across the hdf files path
lapply(hdf, myfun)
In your working directory you will now find all the saved tif files:
list.files(pattern = "tif")
[1] "MOD13Q1.A2020337.h18v08.006.2020358165204_kndvi_mask.tif"
[2] "MOD13Q1.A2020337.h18v08.006.2020358165204_red_c.tif"
[3] "MOD13Q1.A2020337.h18v08.006.2020358165204NIR_c.tif"
[4] "MOD13Q1.A2020353.h18v08.006.2021003223721_kndvi_mask.tif"
[5] "MOD13Q1.A2020353.h18v08.006.2021003223721_red_c.tif"
[6] "MOD13Q1.A2020353.h18v08.006.2021003223721NIR_c.tif"
With lapply, the function ran in 45 seconds on my PC.
You could easily parallelize lapply by replacing it with sfLapply from the snowfall package, for example. For just 2 files it's not worth it, but if you have hundreds of files you can speed up the process a lot:
library(snowfall)
#open a cluster with as many nodes as hdf files
sfInit(parallel=TRUE, cpus=length(hdf))
# Load the required packages inside the cluster
sfLibrary(raster)
sfLibrary(rgdal)
sfLibrary(gdalUtils)
sfExportAll()
system.time(sfLapply(hdf, myfun))
sfStop()
With sfLapply this function took 20 seconds to run, which is a good improvement.
hdf_files <- list.files("foldername", pattern = ".hdf")
for(f in hdf_files) { ... }
For the saving step, you can use the string f to build each output file name, so the files don't save on top of each other.
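A minimal sketch of that idea (hypothetical; it assumes kndvi_mask is created inside the loop as in the question):
for (f in hdf_files) {
  # ... processing that produces kndvi_mask for file f ...
  out_name <- paste0(tools::file_path_sans_ext(basename(f)), "_kndvi_mask.tif")
  writeRaster(kndvi_mask, filename = out_name, overwrite = TRUE)
}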

Prefixing the c function from terra

In R, I can prefix functions with the name of the package they belong to (e.g., dplyr::select). Nevertheless, I am having problems when doing this with c from the terra package. I can do it fine with base::c (should I want to):
base::c(1, 2, 3)
# [1] 1 2 3
However, I run into problems when running similar code for terra:
# Dummy SpatRaster
foo <- terra::rast(matrix(1:9, ncol=3))
# Works fine
c(foo, foo)
# Not so much
terra::c(foo, foo)
# Error: 'c' is not an exported object from 'namespace:terra'
I am confused how c is not an exported function of terra and yet I can access and use it just fine... so long as I don't use a prefix.
Q: Can someone explain why this is the case and how I can explicitly refer to c from terra?
PS: ?terra::c gives a help page explaining how c combines SpatRaster objects into a new SpatRaster object, which suggests to me that this function must have been implemented in the terra package.
This is because c is a "primitive" function; these follow their own rules. A package cannot (and need not) import and export them:
c
#function (...) .Primitive("c")
This is not the case for, for example, nrow:
nrow
#function (x)
#dim(x)[1L]
#<bytecode: 0x000000001662d228>
#<environment: namespace:base>
And terra creates a generic function from it, and exports it so that you can do terra::nrow().
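A small sketch to illustrate the distinction, plus one way to get hold of terra's method explicitly (the last part is my assumption, based on ?terra::c existing, that terra registers an S4 method for c with signature "SpatRaster"):
library(terra)
is.primitive(c)      # TRUE: a primitive, with no namespace environment to qualify
is.primitive(nrow)   # FALSE: a regular base closure
environment(nrow)    # <environment: namespace:base>
environment(c)       # NULL for primitives
# retrieve and call terra's c method directly (assumed signature)
m <- getMethod("c", "SpatRaster")
foo <- rast(matrix(1:9, ncol=3))
x <- m(foo, foo)     # equivalent to c(foo, foo) with terra loaded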
Your question triggered me to look a bit more at this, and I noted that in the current version of terra many primitive functions do not work if you do not load the package with library(terra). For example, you get
foo <- terra::rast(matrix(1:9, ncol=3))
x <- c(foo, foo)
max(x)
#Error in x@ptr$summary(fun, na.rm, .terra_environment$options@ptr) :
# trying to get slot "ptr" from an object of a basic class ("NULL") with no slots
I just fixed that in the development version; and with that version the above returns
#class : SpatRaster
#dimensions : 3, 3, 1 (nrow, ncol, nlyr)
#resolution : 1, 1 (x, y)
#extent : 0, 3, 0, 3 (xmin, xmax, ymin, ymax)
#coord. ref. : +proj=longlat +datum=WGS84 +no_defs
#source : memory
#names : max
#min values : 1
#max values : 9

R - gdalUtils - gdal_grid example data giving zero values...?

I have been trying to use gdal_grid in R, and when running the example data set I receive a raster that has only zero values. I have experimented with my own data and searched the forums with no luck. Can others get the example to work?
I have tried explicitly setting the path to my GDAL library, and have updated the version of GDAL. I am running RStudio with R version 3.3.1.
library(raster)
library(rgeos)
library(gdalUtils)
# We'll pre-check to make sure there is a valid GDAL install
# and that raster and rgdal are also installed.
# Note this isn't strictly necessary, as executing the function will
# force a search for a valid GDAL install.
gdal_setInstallation()
valid_install <- !is.null(getOption("gdalUtils_gdalPath"))
if(require(raster) && valid_install)
{
  # Create a properly formatted CSV:
  temporary_dir <- tempdir()
  tempfname_base <- file.path(temporary_dir,"dem")
  tempfname_csv <- paste(tempfname_base,".csv",sep="")
  pts <- data.frame(
    Easting=c(86943.4,87124.3,86962.4,87077.6),
    Northing=c(891957,892075,892321,891995),
    Elevation=c(139.13,135.01,182.04,135.01)
  )
  write.csv(pts,file=tempfname_csv,row.names=FALSE)
  # Now make a matching VRT file
  tempfname_vrt <- paste(tempfname_base,".vrt",sep="")
  vrt_header <- c(
    '<OGRVRTDataSource>',
    '\t<OGRVRTLayer name="dem">',
    '\t<SrcDataSource>dem.csv</SrcDataSource>',
    '\t<GeometryType>wkbPoint</GeometryType>',
    '\t<GeometryField encoding="PointFromColumns" x="Easting" y="Northing" z="Elevation"/>',
    '\t</OGRVRTLayer>',
    '\t</OGRVRTDataSource>'
  )
  vrt_filecon <- file(tempfname_vrt,"w")
  writeLines(vrt_header,con=vrt_filecon)
  close(vrt_filecon)
  tempfname_tif <- paste(tempfname_base,".tiff",sep="")
  # Now run gdal_grid:
  setMinMax(gdal_grid(src_datasource=tempfname_vrt,
                      dst_filename=tempfname_tif, a="invdist:power=2.0:smoothing=1.0",
                      txe=c(85000,89000), tye=c(894000,890000), outsize=c(400,400),
                      of="GTiff", ot="Float64", l="dem", output_Raster=TRUE))
}
r <- raster(tempfname_tif)
r
#class : RasterLayer
#dimensions : 400, 400, 160000 (nrow, ncol, ncell)
#resolution : 10, 10 (x, y)
#extent : 85000, 89000, 890000, 894000 (xmin, xmax, ymin, ymax)
#coord. ref. : NA
#data source : C:\Users\m.modeler\AppData\Local\Temp\RtmpW6HvOc\dem.tiff
#names : dem
#min values : 0
#max values : 0
plot(r)
Raster results plot with zero values:
Thanks much,
I got the code to run by changing the path from the temp directory to a folder on my hard drive (most likely because the VRT's <SrcDataSource> points to dem.csv with a relative path, so gdal_grid has to be run from the directory that contains the CSV). Example below.
# change to a path on your computer
setwd("C:\\Users\\m.modeler\\Documents\\R\\gdal_Examples")
#######################################################
#create XYZ csv
pts <- data.frame(
  Easting=c(86943.4,87124.3,86962.4,87077.6),
  Northing=c(891957,892075,892321,891995),
  Elevation=c(139.13,135.01,182.04,135.01))
write.csv(pts,file="dem.csv",row.names=FALSE)
#######################################################
#create VRT
fn_vrt<-"dem.vrt"
# Now make a matching VRT file
vrt_header <- c(
  '<OGRVRTDataSource>',
  '\t<OGRVRTLayer name="dem">',
  '\t<SrcDataSource>dem.csv</SrcDataSource>',
  '\t<GeometryType>wkbPoint</GeometryType>',
  '\t<GeometryField encoding="PointFromColumns" x="Easting" y="Northing" z="Elevation"/>',
  '\t</OGRVRTLayer>',
  '\t</OGRVRTDataSource>')
vrt_filecon <- file(fn_vrt,"w")
writeLines(vrt_header,con=vrt_filecon)
close(vrt_filecon)
#######################################################
#create interpolated DEM
fn_tif <- "dem.tif"
# Now run gdal_grid:
r.dem <- setMinMax(gdal_grid(src_datasource=fn_vrt,
                             dst_filename=fn_tif, a="invdist:power=2.0:smoothing=1.0",
                             txe=c(85000,89000), tye=c(894000,890000), outsize=c(400,400),
                             of="GTiff", ot="Float64", l="dem", output_Raster=TRUE, verbose=TRUE))
plot(r.dem)
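To confirm that the interpolation now produces non-zero values, a quick check (my own hypothetical verification step, not part of the original answer):
r.check <- setMinMax(raster(fn_tif))
minValue(r.check)  # should no longer be 0
maxValue(r.check)  # should no longer be 0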

Very likely bug in the R raster package intersect function when intersecting 1 polygon with 1 point

While attempting to get an intersect result for a single point and a single polygon, I have found what I believe can only be a bug in the intersect function of the R raster package.
I have 1 polygon and 1 point, and use intersect as follows:
intersect(a_point, a_polygon)
Where a_point contains an id attribute. This fails with the error:
Error in j[, 2] : incorrect number of dimensions
However, if I reverse the arguments and do:
intersect(a_polygon, a_point)
It works fine, but it doesn't return the id from the point shapefile as part of the result, which I require. This is expected behaviour, so that is fine, but I need it to work the other way around.
To rule out there being some peculiarity with my polygon or point data, I created a single polygon and single point spatial object and tested the same hypothesis, and the same result occurred as above with these 'raw' objects.
The following is the code for generating these two 'fake' objects for completeness and so that it can be reproduced:
test_list_x = list(530124, 530125) #For when I use 2 points
test_list_y = list(176949, 176950) #For when I use 2 points
data_frame_object = data.frame(530124, 176950)
names(data_frame_object) = c("Longitude", "Latitude")
coordinates(data_frame_object)=~Longitude+Latitude
proj4string(data_frame_object)=CRS("+proj=tmerc +lat_0=49 +lon_0=-2 +k=0.9996012717 +x_0=400000 +y_0=-100000 +datum=OSGB36 +units=m +no_defs +ellps=airy +towgs84=446.448,-125.157,542.060,0.1502,0.2470,0.8421,-20.4894")
fake_point_shape_object=SpatialPointsDataFrame(data_frame_object, data.frame(id=1:length(data_frame_object)))
coords = matrix( nrow=5, ncol=2)
coords[1,1] = 530106.8
coords[1,2] = 176953.3
coords[2,1] = 530127.5
coords[2,2] = 176953.3
coords[3,1] = 530127.5
coords[3,2] = 176933.3
coords[4,1] = 530106.8
coords[4,2] = 176933.3
coords[5,1] = 530106.8
coords[5,2] = 176953.3
my_fake_polygon = Polygon(coords)
polygon_list = list(my_fake_polygon)
polygon_set <- lapply(seq_along(polygon_list), function(i) Polygons(list(polygon_list[[i]]), i ))
new_polygons <- SpatialPolygons(polygon_set)
new_polygons@proj4string = CRS("+proj=tmerc +lat_0=49 +lon_0=-2 +k=0.9996012717 +x_0=400000 +y_0=-100000 +datum=OSGB36 +units=m +no_defs +ellps=airy +towgs84=446.448,-125.157,542.060,0.1502,0.2470,0.8421,-20.4894")
df <- data.frame("1")
names(df) = "id"
my_fake_polygon <- SpatialPolygonsDataFrame(new_polygons,df)
Now here's the thing: if I create 2 points next to each other (so they are both within the polygon) instead of just one, it works fine with no error. This suggests there is a bug in the intersection of 1 point and 1 polygon, specifically when the point carries an attribute to be returned by the intersection.
You might ask why I actually need the attribute returned if there is just one point; this is because it is part of an iterative process in which there may not be just one point, there could be none or many.
I would appreciate somebody explaining this error or confirming my findings.
Here are your example data in a more concise way.
library(raster)
pnt <- SpatialPoints(cbind(530124, 176950))
pol <- spPolygons(matrix(c(530106.8, 530127.5, 530127.5, 530106.8, 530106.8, 176953.3, 176953.3, 176933.3, 176933.3, 176953.3), ncol=2))
Now illustrate the problem.
intersect(pol, pnt)
#class : SpatialPolygons
#features : 1
#extent : 530106.8, 530127.5, 176933.3, 176953.3 (xmin, xmax, ymin, ymax)
#coord. ref. : NA
# this fails
intersect(pnt, pol)
#Loading required namespace: rgeos
#Error in j[, 2] : incorrect number of dimensions
# but it works with two points!
intersect(bind(pnt, pnt), pol)
#class : SpatialPoints
#features : 2
#extent : 530124, 530124, 176950, 176950 (xmin, xmax, ymin, ymax)
#coord. ref. : NA
This was another drop=TRUE bug caused by the R default of "dropping" matrices to vectors when a single row is selected. This was fixed in raster version 2.6-11 (not on CRAN yet).
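A small base-R sketch of the drop behaviour behind the bug:
m <- matrix(1:6, ncol = 2)
j <- m[1, ]                 # a single row is dropped to a plain vector...
# j[, 2]                    # ...so this fails: incorrect number of dimensions
j <- m[1, , drop = FALSE]   # keeps a 1-row matrix
j[, 2]                      # works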
Sorry, I can't answer your intersect bug question, but it might be simpler for now to use sp::over to return polygon attributes to points.
# dummy polygon
xym <- as.matrix(data.frame(x=c(16.48438,17.49512,24.74609,22.59277,16.48438),
                            y=c(59.73633,55.12207,55.03418,61.14258,59.73633)))
# make into SpatialPolygon
p = Polygon(xym)
ps = Polygons(list(p),1)
sps = SpatialPolygons(list(ps))
# Promote to SPDF and give an attribute
SPDF = SpatialPolygonsDataFrame(sps, data.frame(N = "hello", row.names = 1))
# make 2 points, one inside the polygon and one outside
p <- data.frame(x=c(16,18),y=c(58,58))
coordinates(p) <- ~x + y
# plot to check
plot(sps)
plot(p,add=T)
# perform the over, returns a named vector for every point in the SpatialPoints
res <- unname(over(p,SPDF))
# promote points to SpatialPointsDataFrame and put in new polygon attribute
data <- data.frame(ID=row.names(p),pol=res)
sp <- SpatialPointsDataFrame(p, data)

R - plotting NetCDF data. How to get the right grid?

I'm absolutely new to R (it's my second day), and it's a little bit difficult for me. So, sorry for this question, but I really need help.
I've already read
R - Plotting netcdf climate data
Plot NetCDF variable-grid data file using ggplot2: "Vector is too large" error
but still don't understand.
I need to plot NMF2 from this data set in geographic coordinates: https://drive.google.com/file/d/0B6IqnlmRMSpcNFBXWWlha1JUUzQ/view?usp=sharing
The easiest way to do it was:
library(raster)
varRaster <- raster("F18-SSUSI_EDR-NIGHT-DISK_DD.20150107_SN.26920-00_DF.NC", varname="NMF2_DISK")
cols <- rev(rainbow(255))
plot(varRaster, col=cols)
Now I've got a plot, but the grid is not a geographic one. So there are two questions:
What do I have to do to get the correct grid?
How can I add a world map layer?
Thank you in advance for your help.
UPDATE
Taking a better look at my data, I found out that I have to change missval to NA and that I also need to transpose. The best variant I've found is this tutorial, and everything is almost all right, but the data is now smeared all over the map... and it should be just a wide track of the satellite's scan.
I can't post the image here because my reputation is very low :(
library(raster)
library(ncdf)
data = open.ncdf("F18-SSUSI_EDR-NIGHT-DISK_DD.20150107_SN.26920-00_DF.NC")
lon=get.var.ncdf(data,"PIERCEPOINT_NIGHT_LONGITUDE")
lat=get.var.ncdf(data,"PIERCEPOINT_NIGHT_LATITUDE")
lon<-lon-180
dim(lat)
nmf2 = get.var.ncdf(data, "NMF2_DISK")
nmf2[nmf2 == -1e+30] <- NA
dim(nmf2)
nmf2_un = get.var.ncdf(data, "NMF2_DISK_UNCERTAINTY")
nmf2_un[nmf2_un == -1e+30] <- NA
nmf2_1 <- raster(t(nmf2)[ncol(nmf2):1, ])
nmf2_un_1 <- raster(t(nmf2_un)[ncol(nmf2_un):1, ])
w <- brick(nmf2_1, nmf2_un_1)
projection(w) <- CRS("+init=epsg:4326")
extent(w) <- c(min(lon), max(lon), min(lat), max(lat))
plot(w[[1]])
library(maptools)
data(wrld_simpl)
plot(wrld_simpl, add = TRUE)
I would be very glad if somebody can tell me what's wrong!
UPDATE 2
I tried to use raster only. I still get an error with the projection:
library(raster)
inputfile <- "F18-SSUSI_EDR-NIGHT-DISK_DD.20150107_SN.26920-00_DF.NC"
lat <- raster(inputfile, varname="PIERCEPOINT_NIGHT_LATITUDE")
lon <- raster(inputfile, varname="PIERCEPOINT_NIGHT_LONGITUDE")
plat <- rasterToPoints(lat)
plon <- rasterToPoints(lon)
lonlat <- cbind(plon[,3], plat[,3])
lonlat <- SpatialPoints(lonlat, proj4string = CRS("+proj=longlat +datum=WGS84"))
extent(lonlat)
#class : Extent
#xmin : 0.008961686
#xmax : 359.983
#ymin : -84.95161
#ymax : 89.68419
pr <- raster(inputfile, varname="NMF2_DISK")
extent(pr) <- extent(lonlat)
pr
#class : RasterLayer
#dimensions : 408, 13, 5304 (nrow, ncol, ncell)
#resolution : 27.69031, 0.4280289 (x, y)
#extent : 0.008961686, 359.983, -84.95161, 89.68419 (xmin, xmax, ymin, ymax)
#coord. ref. : NA
#data source : C:\Users\Svetlana\Science\GUVI\R\SSUSI\F18-SSUSI_EDR-NIGHT-DISK_DD.20150107_SN.26920-00_DF.NC
#names : NMF2_DISK
#zvar : NMF2_DISK
r <- projectRaster(pr, crs=CRS("+proj=longlat +datum=WGS84"))
#Error in projectRaster(pr, crs = CRS("+proj=longlat +datum=WGS84")) :
# input projection is NA
What's wrong?
And the other question is how to handle missval while using raster? I mean that instead of NA there is 8,000,000 for the missing data values. What do I have to do about this?
