Extracting specific row from multiple files using loop - r

I have 18 years of hourly rainfall data. Each CSV file contains a point (latitude and longitude) with the corresponding rain rate and gauge-calibrated values. I want to extract the rain rate and gauge-calibrated rain values for a specific coordinate.
How can I extract the row of a specific coordinate in R using a loop?
How can I merge all the extracted values into one file?
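A minimal sketch of one way to do this with an lapply loop, assuming each file has columns named lat, lon, rain_rate and gauge_calibrated (hypothetical names; adjust them to your actual headers), and a hypothetical folder and target coordinate:

library(utils)

target_lat <- 14.25   # hypothetical coordinate of interest
target_lon <- 121.00

files <- list.files("rainfall_data", pattern = "\\.csv$", full.names = TRUE)

rows <- lapply(files, function(f) {
  dat <- read.csv(f)
  # keep only the row(s) for the coordinate of interest
  dat[dat$lat == target_lat & dat$lon == target_lon, ]
})

# merge the extracted rows from all files into one data frame and write it out
combined <- do.call(rbind, rows)
write.csv(combined, "extracted_point.csv", row.names = FALSE)

Note that exact equality on lat/lon only works if the coordinate is stored identically in every file; otherwise match on rounded values or on nearest distance instead.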

Related

How to change the number of columns of raster data in R?

I have two rasters with the same resolution and origin, and both have 3600 columns.
I use r <- rotate(r) on one of them to change the longitude from 0,360 to -180,180, but after this step the number of columns increases from 3600 to 3601. However, I need to do a calculation with the two rasters, so both must have the same number of columns, which is 3600.
I expect to get a raster with 3600 columns.
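A minimal sketch of one common workaround, assuming the extra column comes from a small extent mismatch after rotate() and that the other raster already spans -180,180: crop the rotated raster to the other raster's extent, or resample it onto the other raster's grid, so the geometries match again.

library(raster)

# r1: the rotated raster (3601 columns), r2: the reference raster (3600 columns)
r1 <- rotate(r1)

# option 1: crop the rotated raster to the reference extent
r1 <- crop(r1, extent(r2))

# option 2: resample onto the reference grid so extent and column count match exactly
r1 <- resample(r1, r2, method = "bilinear")

ncol(r1)  # should now be 3600, matching r2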

Perform join using latitude and longitude in R

I have an Excel spreadsheet with the latitude and longitude of bike docking stations.
I have a shapefile in R (cb_2018_17_bg_500k.shp) that has a GEOID (12-digit FIPS code) column and a column labelled geometry. The values in this column are POLYGON((longitude,latitude)).
I am trying to add a column to the Excel spreadsheet titled FIPS, so I need to somehow join the latitude and longitude to the GEOID column in the shapefile.
I am a novice when it comes to R.
Any advice will be much appreciated.
Rich
So far, I have only managed to load the shapefile into R.
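A minimal sketch of a point-in-polygon join with the sf package, assuming the spreadsheet (here a hypothetical stations.xlsx) has columns named lon and lat in WGS84; adjust the file and column names to your data:

library(sf)
library(readxl)

# read the block-group polygons and the station coordinates
blocks   <- st_read("cb_2018_17_bg_500k.shp")
stations <- read_excel("stations.xlsx")   # hypothetical file name

# turn the stations into spatial points and match the polygons' CRS
pts <- st_as_sf(stations, coords = c("lon", "lat"), crs = 4326)
pts <- st_transform(pts, st_crs(blocks))

# each station gets the GEOID of the polygon it falls inside
joined <- st_join(pts, blocks["GEOID"], join = st_within)
stations$FIPS <- joined$GEOID

Stations that fall outside every polygon get NA in the new FIPS column.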

Calculate mean value of a date-dependent variable for certain start and end dates, taking groups into consideration, in R

I have two CSV data sets. The first one is monthly data of a certain variable (PM_2.5) given by spatial constraints (latitude and longitude), which can be seen as the place variables. The second data frame contains different start and end dates for the observation; these are also given under the same spatial constraints and for each individual v1. You can see the data structure in the pictures.
I want to sum all observations of PM_2.5 for one individual (ID) over the observation period (start date to end date), given the constraint that the geospatial identification (latitude, longitude) is the same.
Thanks a lot for your help.
Best,
Luise
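A minimal sketch of one dplyr approach, assuming the monthly data frame (monthly) has columns latitude, longitude, date and PM_2.5, the periods data frame (periods) has ID, latitude, longitude, start_date and end_date (all hypothetical names; adjust to your files), and the date columns are parsed as Date:

library(dplyr)

result <- periods %>%
  # attach every monthly observation taken at the same coordinate
  inner_join(monthly, by = c("latitude", "longitude")) %>%
  # keep only the months inside each individual's observation window
  filter(date >= start_date, date <= end_date) %>%
  group_by(ID) %>%
  summarise(PM_2.5_sum = sum(PM_2.5, na.rm = TRUE))

Swap sum() for mean() if the per-individual average is what you need.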

Any way to exclude certain periods of weather data from multi-layers raster grid in R?

I have 16 years of daily mean temperature gridded data in netCDF format; the file is quite big (about 3 GB). Initially, I used the raster package to load the original gridded data into a RasterStack object.
I am wondering how I can exclude weather data whose time range does not fall within my period of interest. More specifically, I only want to use 5 years of the weather data rather than the full record. How can I perform this filtering on multi-layer raster data in R? For example, the time span of my original gridded data runs from 1980.1.1 to 1995.12.31, and I only want to keep temperature data from 1980.1.1 to 1984.12.31. How can I filter out the temperature grids I want from the multi-layer raster in R? Any possible idea to make this happen?
reproducible example:
library(raster)
library(lubridate)

r <- raster(xmn = 5.75, xmx = 15, ymn = 47.25, ymx = 55, res = c(0.25, 0.25))
tempDat <- do.call(stack, lapply(1:5844, function(i) setValues(r, round(runif(n = ncell(r), min = -4, max = 23)))))
names(tempDat) <- paste0('X', gsub('-', '.', ymd('1980.01.01') + days(1:5844)))
Update:
If there are other handy tools that can chunk a netCDF file easily, I would like to know how to do it. Any fast way to filter out the wanted daily mean temperature data from the multi-layer raster grid will work for me. Thanks.
desired output:
I only want to keep the daily mean temperature data from 1980.1.1 to 1984.12.31. How can I make this happen? How can I perform this filtering on a multi-layer raster grid in R? Any more thoughts? Thanks.
I figured out my own way to answer this question:
I used raster::getZ() to list all the dates and grep to subset to the time period I am interested in. Here is the solution:
library(raster)
library(ncdf4)

(tg <- brick("C:\\tn_0.25deg_reg_2018.nc"))
tg_date <- getZ(tg)

# find the layer indices of the start and end dates
grep("2018-01-01", tg_date)
grep("2018-05-31", tg_date)

# subset those layers and carry the matching dates over to the new object
tg_5months <- subset(tg, 1:150)
tg_5months@z$Date <- tg@z$Date[1:150]
And it is done nicely.
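For the reproducible example above, which has no z slot set, a minimal sketch of the same filtering done through the layer dates built with lubridate, assuming the 1980.1.1 to 1984.12.31 target period:

library(raster)
library(lubridate)

# dates corresponding to each layer, as constructed in the reproducible example
layer_dates <- ymd("1980-01-01") + days(1:5844)

# indices of the layers inside the wanted period
keep <- which(layer_dates >= ymd("1980-01-01") & layer_dates <= ymd("1984-12-31"))

tempDat_sub <- subset(tempDat, keep)
nlayers(tempDat_sub)  # number of daily layers kept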

extracting pixel values above a value per polygon in R

I have a shapefile containing 38 polygons, i.e. 38 states of a country. This shapefile is overlaid on a raster. I need to extract/reclassify pixels above a certain value, specific to each polygon.
For example, I need to extract the raster pixels > 120 for state/polygon 1, pixels > 189 for polygon 2, etc., with the resulting raster having the extracted pixels set to 1 and everything else as NoData. Hence, it seems I need to extract first and then reclassify.
I have the values for extraction saved as a data frame, with a column of names matching the names of the states, which are stored in the "Name" attribute of the shapefile.
Any suggestion on how I could go about this?
Should I extract the raster for each state into 38 separate rasters, then reclassify() each, and then mosaic them to make one raster, i.e. the country?
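A minimal sketch of that per-polygon workflow, assuming the thresholds live in a data frame thr_df with columns Name and threshold (hypothetical names), r is the raster, and states.shp is the shapefile whose "Name" field matches thr_df$Name:

library(raster)

states <- shapefile("states.shp")   # hypothetical file name; 38 state polygons with a "Name" field

pieces <- lapply(seq_len(nrow(states)), function(i) {
  poly_i <- states[i, ]
  thr    <- thr_df$threshold[thr_df$Name == poly_i$Name]
  r_i    <- mask(crop(r, poly_i), poly_i)               # clip the raster to this state
  calc(r_i, fun = function(x) ifelse(x > thr, 1, NA))   # 1 above the threshold, NoData elsewhere
})

# stitch the 38 per-state results back into one country-wide raster
country <- do.call(merge, pieces)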
