I have a rasterstack
class : RasterStack
dimensions : 47, 89, 4183, 6 (nrow, ncol, ncell, nlayers)
resolution : 0.5, 0.5 (x, y)
extent : 60.75, 105.25, 15.75, 39.25 (xmin, xmax, ymin, ymax)
crs : NA
names : VegC.1, layer.1, VegC.2, layer.2, VegC.3, layer.3
min values : 0.00000, -11.52596, 0.00000, -11.51896, 0.00000, -11.49996
max values : 21.14100, 16.52118, 18.85200, 16.69225, 23.08900, 20.25300
I plot the stack with the same color scheme for all layers. However, I want the 0 value to be white. I use the color scheme
cols <- inlmisc::GetColors(scheme = "BuRd",n = 256)
But the plot shows blue at the zero value. Is there an easy way to fix the color scale for raster stacks?
I answer based on the following assumptions:
You are using the raster package
You are using the plot() function of the raster package to generate the plot
(next time, please indicate both the packages and the functions you are using, and even better, provide code with a reproducible example --it doesn't have to be based on your data, just a small dataset that you can use to show your problem)
Most of the time, your (very natural!) wish of mapping the zero value to the middle of the palette --even if the data are not symmetric around zero-- is fulfilled by using a parameter in the plotting function. See for instance this post on the pheatmap() function, where the breaks parameter is used:
Set 0-point for pheatmap in R
That said, the documentation of the plot() function of the raster package shows that it accepts a ... argument, which can pass along any parameter accepted by the image.plot() function of the fields package, as indicated in the Details section of plot():
Details
Most of the code for the plot function for a single Raster* object was taken from image.plot (fields package).
In the documentation of the image.plot() function of the fields package we read:
breaks: Break points in sorted order to indicate the intervals for assigning the colors. Note that if there are nlevel colors there should be (nlevel+1) breakpoints. If breaks is not specified, (nlevel+1) equally spaced breaks are created where the first and last bin have their midpoints at the minimum and maximum values in z or at zlim.
You can read the answer for Set 0-point for pheatmap in R on how to set the breaks so that the 0 value is mapped to the middle of the color palette you chose (i.e. white).
Note: I cannot give you a working example because I don't have the raster package installed.
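For what it's worth, here is a minimal sketch of the idea, untested for the reason above, using made-up data in place of yours: build a breaks vector that is symmetric around zero and one element longer than the colour vector, then pass it to plot().
library(raster)
library(inlmisc)
# Toy stack standing in for your data (values span negative and positive)
r1 <- raster(nrow = 47, ncol = 89, xmn = 60.75, xmx = 105.25, ymn = 15.75, ymx = 39.25)
values(r1) <- runif(ncell(r1), -11.5, 21.1)
s <- stack(r1, r1 * 0.8)
cols <- inlmisc::GetColors(scheme = "BuRd", n = 256)
# Symmetric breaks around zero, so the middle colour of "BuRd" (white) maps to 0
lim <- max(abs(c(cellStats(s, "min"), cellStats(s, "max"))))
brks <- seq(-lim, lim, length.out = length(cols) + 1)
plot(s, col = cols, breaks = brks)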
After some extensive googling, I wasn't able to find my answer (the first time I couldn't surmount the issue by looking at others' questions and answers). I am new to asking questions, so forgive any missteps.
I am attempting to perform what ArcGIS or QGIS performs with the slope tool, just within R. To do so I have been importing a raster that I exported from ArcGIS in GRID format with the following characteristics:
class : RasterLayer
dimensions : 821, 581, 477001 (nrow, ncol, ncell)
resolution : 4.996121, 4.996121 (x, y)
extent : 2832147, 2835049, 14234048, 14238150 (xmin, xmax, ymin, ymax)
crs : +proj=tmerc +lat_0=34.75 +lon_0=-118.583333333333 +k=0.9999 +x_0=800000.000000001 +y_0=3999999.99999999 +datum=NAD83 +units=us-ft +no_defs
source : rr_2020_shell
names : rr_2020_shell
values : 5623.253, 6401.356 (min, max)
It is already projected in the correct coordinate system (EPSG: 3423) but when I go to find the slope using the following code:
RR_2020_Slope = terrain(RR_2020_St1_Raster,'slope', units = 'degrees', neighbors = 8, filename = 'RR_2020_Slope.grd', overwrite = T)
The result is a slope raster that ranges from 0 to 1.28°, which is very different from what I have calculated in ArcGIS using the slope tool. Using the same DEM raster in the same projection in ArcGIS I used the slope tool with an input of 'Degree' for the output measurement, 'Planar' for the method, and 1 for Z factor and my resulting slope raster ranges from 0.001 to 73.396°.
Overall, I am wondering where my mistake in R originates. Is it an elevation resolution problem? Are there issues with my projection? Forgive me, I can't include the data as they are sensitive materials, but perhaps there is a clear and obvious mistake in my approach or in my assumptions about the functions I have used?
The only red flag I see is that you say "it is already projected in the correct coordinate system". Projecting raster data degrades the quality. As cell values get smoothed, the slopes will get smaller. This may be particularly pronounced if the relief is at the scale of the cell size (e.g. sand dunes vs mountain chains). Have you compared with what you get with the original data?
Another source of error could be that the units of the values are different from the units of the coordinate reference system. But it would appear that in your case both are in feet.
Can you also try this with terra::terrain()?
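A minimal sketch of what that could look like (untested; the source name is taken from your printout and may need the full path):
library(terra)
dem <- rast("rr_2020_shell")   # hypothetical path to the exported DEM
slope_deg <- terrain(dem, v = "slope", unit = "degrees", neighbors = 8)
slope_deg                      # compare this range with the ArcGIS result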
I have a large raster file (5GB) containing only 1's and NA's. I would like to convert this into a multipolygon of the areas with 1's, with adjacent cells dissolved into one polygon.
I have imported the file to R using
r = raster::raster(my_filename)
r
class : RasterLayer
dimensions : 17452, 45000, 785340000 (nrow, ncol, ncell)
resolution : 0.008, 0.008 (x, y)
extent : -180, 180, -55.9875, 83.6285 (xmin, xmax, ymin, ymax)
crs : NA
source : C://...binary_X01_januarysnow.asc
names : binary_X01_januarysnow
and
I have tried several methods to create the polygon:
rasterToPolygons from raster with the dissolve=TRUE option (R crashes)
isoband from the isoband package (R crashes)
Both of the approaches have worked as expected when I've tried them on a subset of my raster covering approximately the area of Spain, so I assume the problem is only with the size of the data and not my code.
Then I have tried to read my raster with read_stars, and use stars::st_as_sf(st, as_points = FALSE, merge = TRUE, connect8 = TRUE). This returned an empty polygon, possibly because the file was read as a stars proxy object, but I'm not sure, I couldn't find any information about that online.
Then I have force-read the raster as stars and not as stars proxy by using read_stars(my_filename, proxy=FALSE) and have tried to use the st_as_sf command as above but got the message "Error: cannot allocate vector of size 2.9 Gb"
I know that in the worst case I can probably just decrease the raster resolution (and therefore the file size) and will be able to create the polygons I want (with less precise resolution), but I was wondering if anyone has another suggestion I could try? Both the 1's and NA's are located in large continuous areas, so it would be enough to have high resolution at the edges, if that helps.
PS This is my very first question on StackOverflow so I apologize if my problem is not clearly described. I don't know how to provide a reproducible example of a large dataset.
What you are looking for is as.polygons() from the terra package, the raster package's successor. terra handles large data sets better than raster does.
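A minimal sketch, assuming the .asc file from your printout (untested at that data volume):
library(terra)
r <- rast("binary_X01_januarysnow.asc")
p <- as.polygons(r)   # adjacent cells with the same value are merged by default
p                     # a SpatVector; sf::st_as_sf(p) converts it to sf if needed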
I'm trying to get RGB components from a geoTIFF file in R. The colours on the image correspond to different land classification types and I have a legend for each classification type in RGB components.
I'm using the raster library. My code so far is
library(raster)
my.map = raster("mygeoTIFFfile.tif")
Here is the information on the file once it has been read in:
> my.map[[1]]
class : RasterLayer
dimensions : 55800, 129600, 7231680000 (nrow, ncol, ncell)
resolution : 0.002777778, 0.002777778 (x, y)
extent : -180.0014, 179.9986, -64.99861, 90.00139 (xmin, xmax, ymin, ymax)
coord. ref. : +proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0
data source : filepath/filename.tif
names : filename.tif
values : 11, 230 (min, max)
The specific geoTIFF file I'm working on can be found here:
http://due.esrin.esa.int/page_globcover.php
(just click on "Globcover2009_V2.3_Global_.zip")
Can someone please help me get the value from a single pixel location in this file?
The rasterToPoints() function will convert your raster data to a matrix containing x, y, and value for each point. This will be very large, but may be what you're looking for if you want to do a broad analysis of the data.
library(raster)
map <- raster("GLOBCOVER_L4_200901_200912_V2.3.tif")
data <- rasterToPoints(map, progress="text")
head(data)
Another option is to use the extract() function to return a single point by passing a SpatialPoints object with latitude/longitude. If you only want a few individual data points, this will be a lot faster than loading the entire thing into a matrix.
library(raster)
map <- raster("GLOBCOVER_L4_200901_200912_V2.3.tif")
extract(map, SpatialPoints(cbind(-123.3680884, 48.4252848)))
It seems that you are asking the wrong question.
To get a value for a single pixel (grid cell), you can use indexing. For example, for cell numbers 10,000 and 10,001 you can do r[10000:10001].
You could get all values by doing values(r). But that will fail for a very large raster like this (unless you have lots of RAM).
However, the question you need answered, it seems, is how to make a map by matching integer cell values with RGB colors.
Let's set up an example raster
library(raster)
r <- raster(nrow=4, ncol=4)
values(r) <- rep(c(11, 14, 20, 30), each=4)
And some matching RGB values
legend <- read.csv(text="Value,Label,Red,Green,Blue
11,Post-flooding or irrigated croplands (or aquatic),170,240,240
14,Rainfed croplands,255,255,100
20,Mosaic cropland (50-70%) / vegetation (grassland/shrubland/forest) (20-50%),220,240,100
30,Mosaic vegetation (grassland/shrubland/forest) (50-70%) / cropland (20-50%) ,205,205,102")
Compute the color code
legend$col <- rgb(legend$Red, legend$Green, legend$Blue, maxColorValue=255)
set up a "color table"
# start with white for all values (1 to 255)
ct <- rep(rgb(1,1,1), 255)
# fill in where necessary
ct[legend$Value+1] <- legend$col
colortable(r) <- ct
Plot the result
plot(r)
You can also try:
tb <- legend[, c('Value', 'Label')]
colnames(tb)[1] = "ID"
tb$Label <- substr(tb$Label, 1,10)
levels(r) <- tb
library(rasterVis)
levelplot(r, col.regions=legend$col, at=0:length(legend$col))
I'm not sure if this is possible in R; if someone knows of a way to do this with some other program, please let me know.
Currently I have a raster, and I need to turn a group of pixels into an NA group if there isn't a large enough cluster. My current thought process was to convert the raster to polygons, calculate each polygon's area, and remove the polygons that aren't large enough. The only problem with this is that rasterToPolygons creates a single layer of polygons, and I have no way of individually indexing each one. Any ideas? Here is an example:
library(raster)
area <- raster(matrix(c(1:4,1),5,5))
shape <- rasterToPolygons(area,fun=function(x){x == 1},dissolve=TRUE)
After dissolving, you can disaggregate the multipart polygon into single part polygons again. The disaggregate method for SpatialPolygons* is in the sp package (which should already be loaded if you have raster loaded).
library(sp)
shape2 <- disaggregate(shape)
shape2
## class : SpatialPolygonsDataFrame
## features : 2
## extent : 0, 1, 0, 1 (xmin, xmax, ymin, ymax)
## coord. ref. : NA
## variables : 1
## names : abc
## min values : 1
## max values : 1
The polygons will have an attribute that indicates their original raster value. You could then, for example, add an attribute giving each polygon a unique ID.
shape2$id <- factor(seq_len(length(shape2)))
spplot(shape2, 'id')
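Building on that, a sketch of how the per-polygon areas could be used to blank out the small clusters; the threshold is hypothetical and raster::area() is assumed to be appropriate for your CRS:
min_area <- 0.05                             # hypothetical threshold, in squared map units
shape2$area <- raster::area(shape2)          # area of each single-part polygon
small <- shape2[shape2$area < min_area, ]    # the clusters considered too small
result <- mask(area, small, inverse = TRUE)  # cells under the small polygons become NA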
I have a large Rasterlayer with integers ranging from 0 to 44.
class : RasterLayer
dimensions : 29800, 34470, 1027206000 (nrow, ncol, ncell)
resolution : 10, 10 (x, y)
extent : 331300, 676000, 5681995, 5979995 (xmin, xmax, ymin, ymax)
coord. ref. : +proj=utm +zone=32 +ellps=GRS80 +units=m +no_defs
data source : /home/mkoehler/stk_rast_whz
names : stk_rast_whz
values : 0, 44 (min, max)
I want to do a stratified sampling of 5000 points per stratum.
I get the following error:
POINTS<-sampleStratified(b, size=5000, na.rm=T, xy=F)
(Error in ys[[i]] <- y : attempt to select less than one element)
Here is code that reproduces the problem (even when selecting only 1 item per stratum):
set.seed(10)
r <- raster(ncol=5000, nrow=5000)
names(r) <- 'stratum'
r[] <- round((runif(ncell(r)))*44)
sampleStratified(r, size=1,xy=T)
Error in ys[[i]] <- y : attempt to select less than one element
Trying that with fewer strata or changing the settings of "size" or "exp" has no effect.
R version: [64-bit] C:\Program Files\R\R-3.1.1
Any ideas?
thanks in advance!
This appears to be a bug (as at raster 2.3-12), and occurs when (1) your raster contains cells with value 0, and (2) the raster can't be processed in memory (i.e. canProcessInMemory(r) is FALSE).
The function loops over the unique cell values produced by freq(r), and then indexes a list by each of these values in turn. If one of those values is zero, the error will be triggered since the 0th element does not exist. For example:
list()[[0]]
# Error in list()[[0]] : attempt to select less than one element
You'll notice that the error doesn't occur if you fill r with, e.g., r[] <- sample(44, ncell(r), replace=TRUE), since it won't have any zeroes.
When the raster can be processed in memory, the function loops over the row numbers of freq(r), and so the subsequent list indexing is sensible.
I've contacted the maintainer to report this bug.
Meanwhile, as a temporary fix, you could use something like the following to make a corrected copy of the function (which will remain available in the current R session).
sampleStratified2 <-
  eval(parse(text = sub('sr\\[, 2\\] == i', 'sr[, 2] == f[i, 1]',
    sub('i in f\\[, 1\\]', 'i in seq_len(nrow(f))',
      deparse(getMethod(sampleStratified, signature = 'RasterLayer')@.Data))
  )))
sampleStratified2(r, size=1, xy=TRUE)