How do I get mean intensity of TIFF files in a tibble? - r

I am using the following code to get TIFF files into R for analysis:
library(magick)
tiffiles<-list.files("C:/Users/folder_with_multiple_tifs/", pattern = "*.tif", full.names=TRUE)
importedtifs<-c()
for(file in tiffiles) {importedtifs<-append(importedtifs, image_read(file))}
importedtifs
This gives me an object that prints as a tibble, with each row corresponding to a TIFF file. I can then use mean(as.integer(importedtifs[[1]])) to get the average pixel intensity of the first TIFF. It is a small positive number for the images I am working with.
I would like to have a single command that returns the mean pixel intensity of each individual TIFF in the tibble. When I try lapply(importedtifs, function(x) mean(as.integer(x))), I get a large negative number, which is not the pixel intensity.
Is there a way to do this? I don't understand exactly how the tibble is storing the data for each TIFF.

DaveArmstrong's solution works. The variation below collects the means in a vector that can be manipulated downstream:
means <- c()
for (i in seq_along(importedtifs)) {
  means <- c(means, mean(as.integer(importedtifs[[i]])))
}
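The same result can be collected in one call by iterating over the indices, so the per-image [[ extraction that already works above is reused; a minimal sketch:
# sapply() over the indices returns a plain numeric vector of per-image mean intensities
means <- sapply(seq_along(importedtifs), function(i) mean(as.integer(importedtifs[[i]])))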

Related

Retaining original file names when processing multiple raster files using R

I have the following problem: I need to process multiple raster files using the same function in the R package landscapemetrics. Basically my raster files are parts of a country map, all of the same shape and size (i.e. quadrants). I figured out code for 1 file, but I have to do the same with more than 600 rasters, so doing it manually is impractical. The steps in my code are the following:
# 1. I load "raster" and "landscapemetrics" packages:
library(raster)
library(landscapemetrics)
# 2. I read in my quadrant:
Quadrant <- raster("C:\\Users\\customer\\Documents\\ ... \\2434-44.tif")
# 3. I process the raster to get landscape metrics tibble:
LS_metrics <- calculate_lsm(landscape = Quadrant)
# 4. Finally, I write it into a csv:
write.csv(LS_metrics, file = "2434-44.csv")
I need to keep the same file name for my csv files as for the tif (e.g. results from processing quadrant "2434-44.tif" need to be stored in "2434-44.csv", possibly in a folder in the wd).
I am new to R. I tried to use list.files() and then apply a for loop, but my code did not work.
I need your advice.
Yours faithfully,
Denis
Your question is really about iteration and character (filename) manipulation, not about landscapemetrics as such. There are many similar questions on this site and resources elsewhere that you can consult. The basic approach can be like this:
# get input filenames
inf <- list.files("/my/path", pattern="\\.tif$", full=TRUE)
# create output filenames
outf <- gsub("\\.tif$", ".csv", basename(inf))
# perhaps put output files in particular folder
dir.create("out", FALSE, FALSE)
outf <- file.path("out", outf)
# iterate
for (i in 1:length(inf)) {
  # read input
  input <- raster(inf[i])
  # do something
  output <- data.frame(id=1)
  # write output
  write.csv(output, outf[i])
}
It's very hard to help without further information. What was the issue with your approach of looping through all files from list.files()? In general, that should work.
Furthermore, you most likely don't want to calculate all available landscape metrics, but rather specify a subset of metrics in the calculate_lsm() function call.
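Putting the two together, a sketch that drops the question's own steps into the loop skeleton above (the input folder is a placeholder, and calculate_lsm() is called exactly as in the question):
library(raster)
library(landscapemetrics)
# input quadrants (placeholder folder) and matching output .csv names
inf <- list.files("C:/Users/customer/Documents/quadrants", pattern = "\\.tif$", full.names = TRUE)
dir.create("out", showWarnings = FALSE)
outf <- file.path("out", gsub("\\.tif$", ".csv", basename(inf)))
# iterate: read one quadrant, compute the metrics tibble, write "2434-44.csv" etc.
for (i in seq_along(inf)) {
  Quadrant <- raster(inf[i])
  LS_metrics <- calculate_lsm(landscape = Quadrant)
  write.csv(LS_metrics, file = outf[i], row.names = FALSE)
}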

Normalize RasterLayer as Matrix to use as Clip Frame

I was assigned the task to clip a raster from .nc file from a .tif file.
edit (from comment):
I want to extract temperature info from the .nc because I need to check the yearly mean temperature of a specific region. To be comparable, the comparison has to occur on exactly the same area. The .nc file is larger than the previously checked area, so I need to "clip" it to the extent of a .tif I have. The .tif data is in the form 0|1; where it is 0 (or where the .tif is smaller than the .nc) the .nc data should be clipped away. In the end I want to keep the .nc data at the extent of the .tif while still retaining its resolution and projection. (The .tif and .nc have different projections and pixel sizes.)
Now ordinarily that wouldn't be a problem, as I could use raster::crop. That doesn't deal with different projections and different pixel size/resolution, though. (I still used it to generate an approximation, but it is not precise enough for the final information, as can be seen in the code snippet below.) The obvious method to generate a more reliable dataset would be to first homogenize the datasets with something like raster::projectRaster or sp::spTransform (spTransform was added in an edit to the original question), but this approach takes too much time, as I have to do it for quite a few .nc files.
I was told the best method would be to generate a normalized matrix from the smaller raster "clip_frame" and then just multiply it with the "nc_to_clip" raster. Doing so should prevent any errors through map projections or other factors. This makes a lot of sense to me in theory, but I have no idea how to do this in practice. I would be very grateful for any hint, code snippet, or other help.
I have looked at similar problems on StackOverflow (and other sites) like:
convert matrix to raster in R
Convert raster into matrix with R
https://www.researchgate.net/post/Hi_Is_there_a_way_to_multiply_Raster_value_by_Raster_Latitude
As I am not even sure how to frame the question correctly, I might have overlooked an answer to this problem, if so please point me there!
My (working) code so far, just to give you an idea of how I want to approach the topic (here using the crop-function).
#library(ncdf4)
library(raster)
library(rgdal)
library(tidyverse)
nc_list<-list.files(pattern = ".*0.nc$") # list of .nc files containing raster and temperature information
#nc_to_clip <- lapply(nc_list, raster, varname="GST") # read in as raster
nc_to_clip <- raster("ABC.nc", varname = "GST")
clip_frame <- raster("XYZ.tif") # read in .tif for further use as frame
mean_temp_from_raster <- function(input_clip_raster, input_clip_frame){ # input_clip_raster = raster to clip, input_clip_frame = raster that supplies the clip extent
  r2_coord <- rasterToPoints(input_clip_raster, spatial = TRUE)  # step 1 to extract coordinates
  map_clip <- crop(input_clip_raster, extent(input_clip_frame))  # use crop to cut the input_clip_raster (this being the function I have to extend on)
  temp <- raster::extract(map_clip, r2_coord@coords)  # step 2 to extract coordinates
  temp_C <- temp*0.01 - 273.15  # convert kelvin*100 to celsius
  temp_C <- na.omit(temp_C)
  return_list <- list(map_clip, mean(temp_C))
  return(return_list)
}
mean_tempC <- lapply(nc_to_clip, mean_temp_from_raster, clip_frame)
Thanks!
PS:
I don't have much experience working with .nc files and/or RasterLayers in R as I used to work with ArcGIS/Python (arcpy) for problems like this, which is not an option right now.
Perhaps something like this?
library(raster)
nc <- raster("ABC.nc", varname = "GST")
clip <- raster("XYZ.tif")
x <- as(extent(clip), "SpatialPolygons")
crs(x) <- crs(clip)
y <- sp::spTransform(x, crs(nc))
clipped <- crop(nc, y)
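From there, the regional mean the question is after can be taken straight from the cropped raster; a sketch that assumes the same Kelvin*100 scaling used in the question's function:
# cellStats() averages over all non-NA cells of the cropped layer
mean_temp_C <- cellStats(clipped, stat = "mean", na.rm = TRUE) * 0.01 - 273.15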

How do I open and store a series of images in a folder in R?

So I have a folder with some n images which I want to open and save with the readImage function. Right now a colleague had written something similar for opening and storing the name only of the images. I'd like to do the following:
setwd("~/ABC/One_Folder_Up")
img_src <- "FolderOfInterest"
image_list <- list.files(path=img_src, pattern = "^closed")
But the actual .tif images are named, for example: closed100, closed101, ..., closed201
The above code works great for getting the names. But how can I use this pattern to actually open and store the images themselves? The output is a large matrix for each image.
So for each image (i = 1 to n), I want to perform the following:
closed175 <- readImage("closed175.tif")
ave175 <- mean(closed175)
SD175 <- sd(closed175)
I'm assuming the image list shown in the first part could be used in the desired loop?
Then, after the images are saved as their own matrices, and all averages and SDs are calculated, I want to put the averages and SDs in a vector like this:
imavelist <- c(ave175, ave176, ..., ave200)
Sorry, not an expert coder. Thank you!
edit: maybe lapply?
edit2: if I use this suggestion,
require(imager)
closed_images <- lapply(closed_im_list, readImage)
closed_im_matrix = do.call('cbind', lapply(closed_images, as.numeric))
Then I need a loop to save each element of the image stack matrix as its own individual image.
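One way to do that last step (a sketch, assuming closed_images and closed_im_list are the list and file vector from the edit above):
# name each list element after its source file, then promote the elements
# to individual objects in the workspace (closed175, closed176, ...)
names(closed_images) <- tools::file_path_sans_ext(basename(closed_im_list))
list2env(closed_images, envir = .GlobalEnv)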
setwd("~/ABC/One_Folder_Up/FolderOfInterest/")
#for .tif format
image_list=list.files(path=getwd(), pattern = "*.tif")
# for other formats replace tif with appropriate format.
f=function(x){
y=readImage(x)
mve=mean(y)
sd=sd(y)
c(mve,sd)
}
results=data.frame(t(sapply(image_list,f)))
colnames(results)=c("average","sd")
The result for 3 images:
> results
average sd
Untitled.tif 0.9761128 0.1451167
Untitled2.tif 0.9604224 0.1861798
Untitled3.tif 0.9782997 0.1457034
>
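The imavelist from the question then falls out of results directly:
# per-image means, named by file
imavelist <- setNames(results$average, rownames(results))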

organising a .csv file in R

I'm programming a script for the calculation of cover around points in R.
I have two inputs: an IMG raster file, and a .csv with all the points.
I've used this script:
library(raster)
library(rgdal)
#load in raster and locality data
map <- raster('map.IMG')
sites <- read.csv('points.csv', header=TRUE)
#convert lat/lon to appropriate projection
coordinates(sites) <- c("X", "Y")
proj4string(sites) <- CRS("+init=epsg:27700")
#extract values to points
Landcover<-extract (map, sites, buffer=2000)
extraction <- lapply(Landcover, function(serial) prop.table(table(serial)))
# Write .csv file
lapply(extraction, function(x) write.table( data.frame(x), 'test2.csv' , append= T, sep=',' ))
I get a .csv file in my folder, but the data isn't organised in the way I would like it to be.
There are three columns in the csv file: one with 'x', one with 'Freq' (which I think is the code for every class in my image), and one with the cover part, somewhere between 0 and 1 (see the image included).
I want the rows to hold the serial and the classes, and under that the correct serial with its coverage.
Also, the points aren't named, so I can't see which is which. In points.csv I have, for example, a 'serial' code for every point, which I would like to use for that.
Can somebody steer me in the right direction?
I hope I have been clear with my questions, thanks in advance!
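A sketch of one way to get the layout you describe, assuming points.csv really has the 'serial' column you mention and reusing the extraction list already built (the output file name is arbitrary):
# label each element of extraction with its point's serial, then flatten
# everything into one long table: serial | class | cover
names(extraction) <- sites$serial
cover_df <- do.call(rbind, lapply(names(extraction), function(s)
  data.frame(serial = s,
             class = names(extraction[[s]]),
             cover = as.numeric(extraction[[s]]))))
write.csv(cover_df, "cover_by_point.csv", row.names = FALSE)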

writeRaster output file size

I have a function that reads a multi-band image in as a raster brick object, iterates through the bands doing various calculations, and then writes the raster out as a new .tif. All of this works fine, but the file size of the new image file is roughly four times greater (I assume because the original image has 4 bands). I'm wondering if there's a parameter in the writeRaster() function that I'm unaware of, or if there's some other way I can ensure that the output image is basically the same file size as the input.
Original file size is 134 MB; output ranges from 471 to 530 MB or so, depending on format.
Simplified code:
library(rgdal)
library(raster)
path = "/Volumes/ENVI Standard Files/"
img = "qb_tile.img"
imageCorrection = function(path, img){
  raster = brick(paste0(path, img))
  raster = reclassify(raster, cbind(0, NA))
  for(i in 1:nlayers(raster)){
    raster[[i]] = raster[[i]] - minValue(raster[[i]])
  }
  writeRaster(raster, paste0(path, img, "_process.tif"), format = "GTiff", overwrite=TRUE)
}
You can set the default datatype for writing rasters with rasterOptions() as follows:
rasterOptions(datatype="INT2U")
Or directly in the writeRaster call:
writeRaster(yourRas, "path/to/raster/", datatype="INT2U", options="COMPRESS=LZW")
Also notice the options argument where you can specify compression.
Usually when I export integer rasters from R, I make sure that I really have integers and not floats, since writing floats with an integer datatype can result in an empty raster. Try the following before exporting:
ras <- as.integer(ras)
Please note:
Also check for negative values in your raster. Try INT2S if you have values below zero.
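Applied to the question's function, the write step might then look like this (a sketch; INT2U assumes the corrected values stay non-negative and within the unsigned 16-bit range, per the notes above):
# inside imageCorrection(): unsigned 16-bit integers plus LZW compression
writeRaster(raster, paste0(path, img, "_process.tif"), format = "GTiff",
            datatype = "INT2U", options = "COMPRESS=LZW", overwrite = TRUE)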
