I'm a new student of R and my current goal is to build monthly precipitation 3-D data (lon, lat, time) for South Korea for 2018-2021 from the NetCDF files at https://downloads.psl.noaa.gov/Datasets/cpc_global_precip/ (which are daily data).
I was able to extract the data from one .nc file by following a YouTube video, but after that I am totally lost. I suspect that I have to use loops to make a data frame of the data I need.
Can anyone point me in the right direction?
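A minimal sketch of the looping and aggregation step, using the raster package. The file names (precip.2018.nc ...), the variable name "precip" and the South Korea bounding box are assumptions; check them against the files you actually downloaded.
library(raster)

years <- 2018:2021
korea <- extent(124, 132, 33, 39)   # lon min/max, lat min/max (approximate)

monthly_list <- lapply(years, function(yr) {
  daily <- brick(sprintf("precip.%d.nc", yr), varname = "precip")
  daily <- crop(daily, korea)
  # getZ() should hold the dates raster read from the NetCDF time axis; check it first
  ym <- format(as.Date(getZ(daily)), "%Y-%m")
  monthly <- stackApply(daily, indices = as.integer(factor(ym)), fun = sum)
  names(monthly) <- unique(ym)
  monthly
})

# Combine the four years into one (lon, lat, time) stack of 48 monthly layers
monthly_precip <- stack(monthly_list)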
Related
I am currently working with daily precipitation data in netCDF format. The data is at a 4 km resolution and covers the United States. However, I want to mask/clip it with a much higher-resolution shapefile for a particular geographical region (about the size of a county). Ultimately, I want the output to be daily precipitation data, either at that higher resolution or the original 4 km resolution, for the much smaller area.
I've tried a couple different methods, with the most success using the following code:
library(raster)   # shapefile() and spTransform() also need rgdal installed

prcp_2000 <- raster::brick('pr_2000.nc')                # daily precipitation brick
shapefile <- shapefile("polygon_combined.shp")          # polygon for the study area
shapefile <- spTransform(shapefile, crs(prcp_2000))     # match the raster's CRS
prcp_2000 <- mask(prcp_2000, shapefile)                 # set cells outside the polygon to NA
prcp_2000 <- crop(prcp_2000, shapefile)                 # trim to the polygon's extent
outfile <- paste("prcp_", "2000_", "CS", ".nc", sep = "")
writeRaster(prcp_2000, outfile, overwrite=TRUE, format="CDF", varname="prcp", varunit="mm/day", longname="mm of precipitation per day", xname="lon", yname="lat", zname="day", zunit="days since 1900-01-01")
However, I keep getting nothing but infinities/negative infinities for prcp output, even though I'm still getting appropriate variable lengths otherwise (day=366, lat=24, lon=23). Am I missing something?
The problem is the starting shapefile, shapefile <- shapefile("polygon_combined.shp"). I imported "polygon_combined.shp" into QGIS to plot a base layer under it easily, and the attached image shows that it starts off in the Gulf of Mexico. With a bad location to begin with, transforming it isn't going to save it later.
I don't know the origin of the data, so I have no idea how it was produced this way. One possible fix is to get the data from a different source. The EPA makes an ecoregions product with several levels of regions; I think you want the Level IV data for your region. You may still need to transform it, but it will be located in the right place to begin with. https://www.epa.gov/eco-research/ecoregion-download-files-state-region-5#pane-47
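You can also confirm this from within R rather than QGIS. A quick sketch: compare the extents of the raster and the reprojected shapefile; if they don't overlap, mask() returns nothing but NA cells, which is consistent with the all-missing output you are seeing.
library(raster)
prcp_2000 <- brick("pr_2000.nc")
shp <- shapefile("polygon_combined.shp")
shp <- spTransform(shp, crs(prcp_2000))

extent(prcp_2000)   # where the precipitation grid sits
extent(shp)         # where the polygon sits after reprojection
plot(extent(prcp_2000)); plot(shp, add = TRUE, border = "red")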
I am using the WaveTide Earth Engine app developed by Jaap Nienhuis (2019) to obtain a time series of wave energy for specific locations. Once my location is selected on the map (e.g. lon: 72.50, lat: -5.50), two graphs appear on the left console, one of which displays Wave Energy in J/m2. I then select "Link to WaveWatch time series for this location" above the graphs, which downloads a netCDF (.nc) file for my chosen location.
I am using R to read the file with the ncdf4 package. The file contains three variables, dp: mean wave direction (centi-degrees), hs: significant wave height (millimetres) and tp: peak wave period (milliseconds), and one dimension, time (datenum). I would like to:
convert time into Year-Date-Time
create a column to calculate Wave Energy in J/m2 using the variables for each Year-Date-Time
The file structure seems to be different from other netCDF files. I cannot find appropriate help online, so I don't know how to write the code to obtain a time series of Wave Energy for my location (if that is possible at all) in a data frame. Can anyone help?
# load library
library(ncdf4)
# Open file
nc <- nc_open("D:/Downloads/WaveWatch_timeseries_10491.nc")
print(nc)
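From there, one possible sketch (not taken from the app's documentation): pull the variables into a data frame, convert the MATLAB-style datenum time axis to POSIXct, and compute deep-water wave energy density as E = (1/16) * rho * g * Hs^2. The datenum origin and the unit conversions (mm to m, ms to s, centi-degrees to degrees) are assumptions; check them against print(nc).
time_num <- ncvar_get(nc, "time")
hs_mm    <- ncvar_get(nc, "hs")
tp_ms    <- ncvar_get(nc, "tp")
dp_cdeg  <- ncvar_get(nc, "dp")
nc_close(nc)

waves <- data.frame(
  # MATLAB datenum counts days from year 0; 719529 is the datenum of 1970-01-01
  datetime = as.POSIXct((time_num - 719529) * 86400,
                        origin = "1970-01-01", tz = "UTC"),
  hs_m   = hs_mm / 1000,    # significant wave height in metres
  tp_s   = tp_ms / 1000,    # peak period in seconds
  dp_deg = dp_cdeg / 100    # mean direction in degrees
)

rho <- 1025   # sea-water density, kg/m^3
g   <- 9.81   # gravity, m/s^2
waves$energy_Jm2 <- (1 / 16) * rho * g * waves$hs_m^2
head(waves)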
I have a relatively large-ish (>4,000 records) dataset of species records that needs to be mapped in R; however, the only spatial information with them is 6-figure Ordnance Survey National Grid (NGR) grid references (e.g. SD311124, see also A beginner's guide to finding grid references). The file format is CSV.
How can I get R to plot the points with this information?
Is there a better way to do it within R rather than bulk converting and adding lat/long coords to the spreadsheet before loading it into R?
I know how to do this in QGIS, but my supervisors would like it in R instead!
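One way to do it entirely in R, as a sketch: parse each 6-figure reference into British National Grid eastings/northings, then build an sf object in EPSG:27700 and transform to WGS84 if you need lat/long. The CSV file name and the gridref column name are assumptions here, and packages such as rnrfa also provide ready-made grid-reference parsers if you prefer not to roll your own.
library(sf)

osgb_to_en <- function(ref) {
  ref <- toupper(gsub(" ", "", ref))
  letters_no_i <- LETTERS[-9]                       # the OS grid letters skip "I"
  l1 <- match(substr(ref, 1, 1), letters_no_i) - 1
  l2 <- match(substr(ref, 2, 2), letters_no_i) - 1
  e100 <- ((l1 - 2) %% 5) * 5 + (l2 %% 5)           # 100 km square easting index
  n100 <- (19 - (l1 %/% 5) * 5) - (l2 %/% 5)        # 100 km square northing index
  digits <- substr(ref, 3, nchar(ref))
  half  <- nchar(digits) / 2                        # 3 + 3 digits for a 6-figure ref
  scale <- 10^(5 - half)                            # 6-figure refs resolve to 100 m
  c(easting  = e100 * 1e5 + as.numeric(substr(digits, 1, half)) * scale,
    northing = n100 * 1e5 + as.numeric(substr(digits, half + 1, 2 * half)) * scale)
}

records <- read.csv("species_records.csv")          # assumed file name and layout
en <- t(sapply(records$gridref, osgb_to_en))        # "gridref" column is assumed
records$easting  <- en[, "easting"]
records$northing <- en[, "northing"]

# EPSG:27700 is the British National Grid; transform to WGS84 for lat/long
pts <- st_as_sf(records, coords = c("easting", "northing"), crs = 27700)
pts_wgs84 <- st_transform(pts, 4326)
plot(st_geometry(pts_wgs84))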
I am trying to create background data points in R as part of a species distribution modelling project. I cannot figure out how to convert my environmental predictor raster layers (annual precipitation, mean summer temperatures, etc.) into .grd files so that I can read them into R. Are there some basic steps to follow to convert Raster data into .grd files?
There is really no need to save files in that format; almost any format can be used, except the ESRI geodatabase. It may be that your files are already in a usable format (e.g. tif, img, ESRI grid). See what happens when you use raster(filename); otherwise, export to GTiff.
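For example (file names here are placeholders):
library(raster)

prec <- raster("annual_precipitation.tif")   # reads tif, img, ESRI grid, ... directly
prec

# Only if you really want another format on disk:
writeRaster(prec, "annual_precipitation.grd", format = "raster")   # native .grd
writeRaster(prec, "annual_precipitation_copy.tif", format = "GTiff", overwrite = TRUE)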
I'm working on rainfall data. I have a grid of rainfall points at 1 square kilometre spacing with a time step of 10 minutes. I want to interpolate those points at every time step, and I also want to check the temporal correlation of my rainfall points. The program I'm using is R, but I'm stuck and can't find my way forward! Here is a link to one of my rainfall data files:
https://drive.google.com/file/d/0B430nIYqp_1OLXZia2htbGRGWTQ/edit?usp=sharing
The CRAN task view for spatio-temporal analysis suggests packages to use.
http://cran.r-project.org/web/views/SpatioTemporal.html
I see that you have tagged gstat, so the gstat way might be of interest to you.
http://cran.r-project.org/web/packages/gstat/vignettes/st.pdf
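As a concrete starting point, a minimal sketch with gstat: interpolate each 10-minute step separately with inverse-distance weighting, and look at the autocorrelation of a single gauge's series for the temporal side. The column names (x, y, time, value), the CSV file name and the 100 m target grid are assumptions about your data, not read from your file.
library(sp)
library(gstat)

rain <- read.csv("rainfall_points.csv")   # assumed layout: x, y, time, value
rain$time <- as.POSIXct(rain$time)

# Target grid covering the gauge locations (adjust the cell size as needed)
grd <- expand.grid(
  x = seq(min(rain$x), max(rain$x), by = 100),
  y = seq(min(rain$y), max(rain$y), by = 100)
)
coordinates(grd) <- ~ x + y
gridded(grd) <- TRUE

# Inverse-distance interpolation, repeated for every 10-minute time step
interp_by_step <- lapply(split(rain, rain$time), function(step) {
  coordinates(step) <- ~ x + y
  idw(value ~ 1, locations = step, newdata = grd, idp = 2)
})

# Temporal correlation at a single gauge: autocorrelation of its series
one_gauge <- rain[rain$x == rain$x[1] & rain$y == rain$y[1], ]
acf(one_gauge$value[order(one_gauge$time)], main = "Temporal autocorrelation")
For a full spatio-temporal treatment (joint space-time variograms and kriging), the gstat vignette linked above covers variogramST() and krigeST().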
Does this help?