I would like to use the rasterVis package to plot contours of spatial data (e.g. using levelplot as in this example). However, the dataset comes from a NetCDF file with an irregular grid, as below:
lon(y,x) lat(y,x) var(y,x)
and no projection is specified beyond what the 2-D lon/lat arrays themselves imply.
Is there a way to plot the dataset directly as raster data, as in those figures, without interpolation?
Raster headers require a grid extent and a projection specification, which does not fit my case: two-dimensional lon/lat arrays are not recognized by raster as a coordinate system.
The code and plots are as in that example, but with a NetCDF file like this:
float lat(y, x) ;
lat:standard_name = "latitude" ;
lat:long_name = "Latitude" ;
lat:units = "degrees_north" ;
lat:nav_model = "grid_T" ;
float lon(y, x) ;
lon:standard_name = "longitude" ;
lon:long_name = "Longitude" ;
lon:units = "degrees_east" ;
lon:nav_model = "grid_T" ;
float icethic(time_centered, y, x) ;
icethic:standard_name = "sea_ice_thickness" ;
icethic:long_name = "Ice thickness (cell average)" ;
icethic:units = "m" ;
icethic:online_operation = "average" ;
icethic:_FillValue = 1.e+20f ;
icethic:missing_value = 1.e+20f ;
icethic:coordinates = "time_centered nav_lon nav_lat" ;
Thanks for the quick feedback. I would like to plot, with rasterVis, a NetCDF file with an irregular grid (lon and lat are 2-D arrays):
netcdf temp {
dimensions:
y = 292 ;
x = 362 ;
time_counter = UNLIMITED ; // (1 currently)
variables:
float lat(y, x) ;
lat:standard_name = "latitude" ;
lat:long_name = "Latitude" ;
lat:units = "degrees_north" ;
lat:_CoordinateAxisType = "Lat" ;
float lon(y, x) ;
lon:standard_name = "longitude" ;
lon:long_name = "Longitude" ;
lon:units = "degrees_east" ;
lon:_CoordinateAxisType = "Lon" ;
double time_counter(time_counter) ;
time_counter:standard_name = "time" ;
time_counter:units = "days since 0-00-00 00:00:00" ;
time_counter:calendar = "proleptic_gregorian" ;
float votemper(time_counter, y, x) ;
votemper:standard_name = "Temperature" ;
votemper:long_name = "Temperature" ;
votemper:units = "C" ;
votemper:coordinates = "lon lat time_counter" ;
votemper:_FillValue = 9.96921e+36f ;
votemper:missing_value = 9.96921e+36f ;
votemper:online_operation = "ave(x)" ;
votemper:interval_operation = 3600.f ;
votemper:interval_write = 2678400.f ;
votemper:offline_operation = "ave(x)" ;
}
The code inspired by the rasterVis guide looks like:
library(raster)
library(rasterVis)
stackSIS <- stack("temp.nc")
idx <- c(as.Date('2008-01-15'))
SISmm <- setZ(stackSIS, idx)
names(SISmm) <- month.abb[1]
SISmm
levelplot(SISmm)
but the plot uses the x/y indices of the array as axes rather than the lon/lat geographical coordinates. Indeed, when I print the summary of the raster object I get:
class : RasterStack
dimensions : 292, 362, 105704, 1 (nrow, ncol, ncell, nlayers)
resolution : 1, 1 (x, y)
extent : 0.5, 362.5, 0.5, 292.5 (xmin, xmax, ymin, ymax)
coord. ref. : NA
names : Jan
time : 2008-01-15
i.e. the "extent" considers the indexes and not the coordinates.
Thanks
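One way to visualise the values without any interpolation is to bypass the raster pathway entirely and plot each cell at its true coordinates with lattice/latticeExtra (the packages rasterVis builds on). This is only a minimal sketch, assuming the file and variable names from the header above ("temp.nc", lon, lat, votemper) and that latticeExtra is installed:
library(ncdf4)
library(latticeExtra)
nc  <- nc_open("temp.nc")
lon <- ncvar_get(nc, "lon")        # 2-D longitude array
lat <- ncvar_get(nc, "lat")        # 2-D latitude array
tmp <- ncvar_get(nc, "votemper")   # single time step, same 2-D shape
nc_close(nc)
df <- data.frame(lon = as.vector(lon),
                 lat = as.vector(lat),
                 z   = as.vector(tmp))
# colour each grid point at its real lon/lat instead of its array index
levelplot(z ~ lon * lat, data = df,
          panel = panel.levelplot.points, cex = 0.3,
          xlab = "Longitude", ylab = "Latitude")
The result is not a RasterLayer, so the rasterVis-specific methods are lost, but the axes show the geographical coordinates and no regridding is involved.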
Related
I have an array with dimensions [128, 64, 1140], and I want to convert it into a functional data object.
This is the code:
for (i in 1:dim(sst2)[1]) {
for (j in 1:dim(sst2)[2]) {
tmpfdd <- Data2fd(argvals = 1140,
y = sst2[i, j,],
basisobj = create.fourier.basis(nbasis = 15))
}
}
When I run this program, I get this error:
Error in argvalsySwap(argvals, y, basisobj) :
dimensions of 'argvals' and 'y' must be compatible;
dim(argvals) = 1; dim(y) = 1140
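The error occurs because Data2fd() expects argvals to be the full vector of evaluation points (length 1140 here), not the scalar 1140; the loop above also overwrites tmpfdd on every iteration. A hedged sketch of a possible fix, which reshapes the array so that all 128 x 64 series are fit in a single call (the range 1:1140 and the object name fd_all are assumptions):
library(fda)
argvals <- 1:1140                                  # one evaluation point per time step
basis   <- create.fourier.basis(rangeval = c(1, 1140), nbasis = 15)
# put time in the rows: a 1140 x (128*64) matrix, one column per grid cell
y <- matrix(aperm(sst2, c(3, 1, 2)), nrow = 1140, ncol = 128 * 64)
fd_all <- Data2fd(argvals = argvals, y = y, basisobj = basis)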
I have large NetCDF files in 64-bit floating-point precision. I would like to pack them using specific values for the add_offset and scale_factor parameters (so that I can then convert to short I16 precision). I have found information on unpacking with CDO operators, but not on packing.
Any help? Thank you in advance!
Edit:
diego#LAcompu:~/new$ ncks -m in.nc
netcdf in {
dimensions:
bnds = 2 ;
lat = 202 ;
lon = 62 ;
time = UNLIMITED ; // (15777 currently)
variables:
float lat(lat) ;
lat:standard_name = "latitude" ;
lat:long_name = "latitude" ;
lat:units = "degrees_north" ;
lat:axis = "Y" ;
float lon(lon) ;
lon:standard_name = "longitude" ;
lon:long_name = "longitude" ;
lon:units = "degrees_east" ;
lon:axis = "X" ;
double t2m(time,lat,lon) ;
t2m:long_name = "2 metre temperature" ;
t2m:units = "Celsius" ;
t2m:_FillValue = -32767. ;
t2m:missing_value = -32767. ;
double time(time) ;
time:standard_name = "time" ;
time:long_name = "time" ;
time:bounds = "time_bnds" ;
time:units = "hours since 1900-01-01 00:00:00.0" ;
time:calendar = "gregorian" ;
time:axis = "T" ;
double time_bnds(time,bnds) ;
} // group /
diego#LAcompu:~/new$ ncap2 -v -O -s 't2m=pack_short(t2m,0.00166667,0.0);' in.nc out.nc
ncap2: WARNING pack_short(): Function has been called with more than one argument
diego#LAcompu:~/new$ ncks -m out.nc
netcdf out {
dimensions:
lat = 202 ;
lon = 62 ;
time = UNLIMITED ; // (15777 currently)
variables:
float lat(lat) ;
lat:standard_name = "latitude" ;
lat:long_name = "latitude" ;
lat:units = "degrees_north" ;
lat:axis = "Y" ;
float lon(lon) ;
lon:standard_name = "longitude" ;
lon:long_name = "longitude" ;
lon:units = "degrees_east" ;
lon:axis = "X" ;
short t2m(time,lat,lon) ;
t2m:scale_factor = -0.000784701646794361 ;
t2m:add_offset = -1.01787074416207 ;
t2m:_FillValue = -32767s ;
t2m:long_name = "2 metre temperature" ;
t2m:missing_value = -32767. ;
t2m:units = "Celsius" ;
double time(time) ;
time:standard_name = "time" ;
time:long_name = "time" ;
time:bounds = "time_bnds" ;
time:units = "hours since 1900-01-01 00:00:00.0" ;
time:calendar = "gregorian" ;
time:axis = "T" ;
} // group /
NCO will automatically pack with optimal values for scale_factor and add_offset with, e.g.,
ncpdq -P in.nc out.nc
and you can add lossless compression as well with
ncpdq -P -L 1 -7 in.nc out.nc
Documentation at http://nco.sf.net/nco.html#ncpdq
and ncap2 accepts specific values of scale_factor and add_offset for per-variable packing with pack() documented at http://nco.sf.net/nco.html#ncap_mth
Demonstration:
zender#spectral:~$ ncap2 -v -O -s 'rec_pck=pack(three_dmn_rec_var,-0.001,40.0);' ~/nco/data/in.nc ~/foo.nc
zender#spectral:~$ ncks -m ~/foo.nc
netcdf foo {
dimensions:
lat = 2 ;
lon = 4 ;
time = UNLIMITED ; // (10 currently)
variables:
float lat(lat) ;
lat:long_name = "Latitude (typically midpoints)" ;
lat:units = "degrees_north" ;
lat:bounds = "lat_bnd" ;
float lon(lon) ;
lon:long_name = "Longitude (typically midpoints)" ;
lon:units = "degrees_east" ;
short rec_pck(time,lat,lon) ;
rec_pck:scale_factor = -0.001f ;
rec_pck:add_offset = 40.f ;
rec_pck:_FillValue = -99s ;
rec_pck:long_name = "three dimensional record variable" ;
rec_pck:units = "watt meter-2" ;
double time(time) ;
time:long_name = "time" ;
time:units = "days since 1964-03-12 12:09:00 -9:00" ;
time:calendar = "gregorian" ;
time:bounds = "time_bnds" ;
time:climatology = "climatology_bounds" ;
} // group /
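Applying the same three-argument pack() call to the file from the question, with the scale_factor and add_offset values attempted earlier, would presumably look like:
ncap2 -O -v -s 't2m=pack(t2m,0.00166667,0.0);' in.nc out.nc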
So this was simpler than I thought in CDO:
cdo pack in.nc out.nc
This packs with optimal add_offset and scale_factor, converting the field to I16.
I am using gridMET (http://www.climatologylab.org/gridmet.html) and MACA (http://thredds.northwestknowledge.net:8080/thredds/reacch_climate_CMIP5_macav2_catalog2.html) NetCDF files for a project and am facing a performance issue. Applying a simple function to the gridMET NetCDF files (time span: 1979-2015) takes around 0.01 s per grid cell, whereas processing the MACA NetCDF files (time span: 2016-2050) with the same function takes around 0.3 s per grid cell. Over large areas the processing times for the two datasets therefore differ drastically.
Header information of gridMET file is:
netcdf pr_1980 {
dimensions:
lon = 1386 ;
lat = 585 ;
day = 366 ;
crs = 1 ;
variables:
double lon(lon) ;
lon:units = "degrees_east" ;
lon:description = "longitude" ;
lon:axis = "X" ;
lon:standard_name = "longitude" ;
lon:long_name = "latitude" ;
double lat(lat) ;
lat:units = "degrees_north" ;
lat:description = "latitude" ;
lat:axis = "Y" ;
lat:standard_name = "latitude" ;
lat:long_name = "latitude" ;
float day(day) ;
day:units = "days since 1900-01-01 00:00:00" ;
day:calendar = "gregorian" ;
day:description = "days since 1900-01-01" ;
day:standard_name = "time" ;
day:long_name = "time" ;
float precipitation_amount(day, lat, lon) ;
precipitation_amount:units = "mm" ;
precipitation_amount:description = "Daily Accumulated Precipitation" ;
precipitation_amount:_FillValue = -32767.f ;
precipitation_amount:coordinates = "lon lat" ;
precipitation_amount:cell_methods = "time: sum(interval: 24 hours)" ;
precipitation_amount:missing_value = -32767. ;
precipitation_amount:grid_mapping = "crs" ;
int crs(crs) ;
crs:grid_mapping_name = "latitude_longitude" ;
crs:longitude_of_prime_meridian = 0. ;
crs:semi_major_axis = 6378137. ;
crs:inverse_flattening = 298.257223563 ;
crs:spatial_ref = "GEOGCS[\"WGS 84\",DATUM[\"WGS_1984\",SPHEROID[\"WGS 84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.0174532925199433,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]]" ;
crs:long_name = "WGS 84" ;
// global attributes:
:author = "John Abatzoglou - University of Idaho, jabatzoglou#uidaho.edu" ;
:datee = "02 December 2017" ;
:note1 = "The projection information for this file is: GCS WGS 1984." ;
:note2 = "Citation: Abatzoglou, J.T., 2013, Development of gridded surface meteorological data for ecological applications and modeling, International Journal of Climatology, DOI: 10.1002/joc.3413" ;
:last_permanent_slice = "306" ;
:last_provisional_slice = "360" ;
:note3 = "Data in slices after last_permanent_slice (1-based) are considered provisional and subject to change with subsequent updates" ;
:note4 = "Data in slices after last_provisional_slice (1-based) are considered early and subject to change with subsequent updates" ;
:note5 = "Days correspond approximately to calendar days ending at midnight, Mountain Standard Time (7 UTC the next calendar day)" ;
:geospatial_bounds_crs = "EPSG:4326" ;
:Conventions = "CF-1.6" ;
:geospatial_bounds = "POLYGON((-124.7666666333333 49.400000000000000, -124.7666666333333 25.066666666666666, -67.058333300000015 25.066666666666666, -67.058333300000015 49.400000000000000, -124.7666666333333 49.400000000000000))" ;
:geospatial_lat_min = "25.066666666666666" ;
:geospatial_lat_max = "49.40000000000000" ;
:geospatial_lon_min = "-124.7666666333333" ;
:geospatial_lon_max = "-67.058333300000015" ;
:geospatial_lon_resolution = "0.041666666666666" ;
:geospatial_lat_resolution = "0.041666666666666" ;
:geospatial_lat_units = "decimal_degrees north" ;
:geospatial_lon_units = "decimal_degrees east" ;
:coordinate_system = "EPSG:4326" ;
:_Format = "classic" ;
}
Header information of MACA file is:
netcdf pr_CanESM2_macav2_2016 {
dimensions:
crs = 1 ;
lat = 585 ;
lon = 1386 ;
time = 366 ;
variables:
int crs(crs) ;
crs:grid_mapping_name = "latitude_longitude" ;
crs:longitude_of_prime_meridian = 0. ;
crs:semi_major_axis = 6378137. ;
crs:inverse_flattening = 298.257223563 ;
double lat(lat) ;
lat:long_name = "latitude" ;
lat:standard_name = "latitude" ;
lat:units = "degrees_north" ;
lat:axis = "Y" ;
lat:description = "Latitude of the center of the grid cell" ;
double lon(lon) ;
lon:long_name = "longitude" ;
lon:standard_name = "longitude" ;
lon:units = "degrees_east" ;
lon:axis = "X" ;
lon:description = "Longitude of the center of the grid cell" ;
float precipitation(time, lat, lon) ;
precipitation:_FillValue = -9999.f ;
precipitation:long_name = "Precipitation" ;
precipitation:units = "mm" ;
precipitation:grid_mapping = "crs" ;
precipitation:standard_name = "precipitation" ;
precipitation:cell_methods = "time: sum(interval: 24 hours)" ;
precipitation:comments = "Total daily precipitation at surface; includes both liquid and solid phases from all types of clouds (both large-scale and convective)" ;
precipitation:coordinates = "time lon lat" ;
float time(time) ;
time:units = "days since 1900-01-01 00:00:00" ;
time:calendar = "gregorian" ;
time:description = "days since 1900-01-01" ;
// global attributes:
:description = "Multivariate Adaptive Constructed Analogs (MACA) method, version 2.3,Dec 2013." ;
:id = "MACAv2-METDATA" ;
:naming_authority = "edu.uidaho.reacch" ;
:Metadata_Conventions = "Unidata Dataset Discovery v1.0" ;
:Metadata_Link = "" ;
:cdm_data_type = "GRID" ;
:title = "Downscaled daily meteorological data of Precipitation from Canadian Centre for Climate Modelling and Analysis (CanESM2) using the run r1i1p1 of the rcp85 scenario." ;
:summary = "This archive contains daily downscaled meteorological and hydrological projections for the Conterminous United States at 1/24-deg resolution utilizing the Multivariate Adaptive Constructed Analogs (MACA, Abatzoglou, 2012) statistical downscaling method with the METDATA (Abatzoglou,2013) training dataset. The downscaled meteorological variables are maximum/minimum temperature(tasmax/tasmin), maximum/minimum relative humidity (rhsmax/rhsmin)precipitation amount(pr), downward shortwave solar radiation(rsds), eastward wind(uas), northward wind(vas), and specific humidity(huss). The downscaling is based on the 365-day model outputs from different global climate models (GCMs) from Phase 5 of the Coupled Model Inter-comparison Project (CMIP3) utlizing the historical (1950-2005) and future RCP4.5/8.5(2006-2099) scenarios. Leap days have been added to the dataset from the average values between Feb 28 and Mar 1 in order to aid modellers." ;
:keywords = "daily precipitation, daily maximum temperature, daily minimum temperature, daily downward shortwave solar radiation, daily specific humidity, daily wind velocity, CMIP5, Gridded Meteorological Data" ;
:keywords_vocabulary = "" ;
:standard_name_vocabulary = "CF-1.0" ;
:history = "Sat Jun 15 16:07:12 2019: C:\\nco\\ncks.exe -3 -d time,0,365,1 macav2metdata_pr_CanESM2_r1i1p1_rcp85_2016_2020_CONUS_daily.nc pr_CanESM2_macav2_2016.nc\n",
"No revisions." ;
:comment = "Total daily precipitation at surface; includes both liquid and solid phases from all types of clouds (both large-scale and convective)" ;
:geospatial_bounds = "POLYGON((-124.7722 25.0631,-124.7722 49.3960, -67.0648 49.3960,-67.0648, 25.0631, -124.7722,25.0631))" ;
:geospatial_lat_min = "25.0631" ;
:geospatial_lat_max = "49.3960" ;
:geospatial_lon_min = "-124.7722" ;
:geospatial_lon_max = "-67.0648" ;
:geospatial_lat_units = "decimal degrees north" ;
:geospatial_lon_units = "decimal degrees east" ;
:geospatial_lat_resolution = "0.0417" ;
:geospatial_lon_resolution = "0.0417" ;
:geospatial_vertical_min = 0. ;
:geospatial_vertical_max = 0. ;
:geospatial_vertical_resolution = 0. ;
:geospatial_vertical_positive = "up" ;
:time_coverage_start = "2016-01-01T00:0" ;
:time_coverage_end = "2020-12-31T00:00" ;
:time_coverage_duration = "P5Y" ;
:time_coverage_resolution = "P1D" ;
:date_created = "2014-05-15" ;
:date_modified = "2014-05-15" ;
:date_issued = "2014-05-15" ;
:creator_name = "John Abatzoglou" ;
:creator_url = "http://maca.northwestknowledge.net" ;
:creator_email = "jabatzoglou#uidaho.edu" ;
:institution = "University of Idaho" ;
:processing_level = "GRID" ;
:project = "" ;
:contributor_name = "Katherine C. Hegewisch" ;
:contributor_role = "Postdoctoral Fellow" ;
:publisher_name = "" ;
:publisher_email = "" ;
:publisher_url = "" ;
:license = "Creative Commons CC0 1.0 Universal Dedication(http://creativecommons.org/publicdomain/zero/1.0/legalcode)" ;
:coordinate_system = "WGS84,EPSG:4326" ;
:NCO = "netCDF Operators version 4.8.1-alpha03 (Homepage = http://nco.sf.net, Code = http://github.com/nco/nco)" ;
:_Format = "classic" ;
}
The gridMET files are in 'classic' format and the MACA files are in NetCDF4 format. Converting the MACA files to 'classic' format with:
ncks -3 in.nc out.nc
still gives a processing time of 0.3 s per grid cell for the years 2016-2050.
Here is the code I am using to read and process NetCDF files:
import xarray as xr

ds = xr.open_mfdataset('D:/proj1/*.nc', concat_dim='time')
# variable name from the MACA header; the duplicated lat= keyword is assumed to have meant lon=
da = ds['precipitation'].sel(lon=273.15, lat=49.4, method='nearest')
da_con = da[(da > 35.5)]
Kindly suggest any modification(s) to NetCDF files to reduce processing overhead.
Interestingly, reordering the dimensions reduced the processing time to 0.05 s per grid cell. I used the following command-line operation to reorder the dimensions:
ncpdq -a lon,lat,time in.nc out.nc
There may be other solutions, but this worked for the time being.
I have a NetCDF file with the following structure:
dimensions:
lsmlat = 1400 ;
lsmlon = 2800 ;
time = UNLIMITED ; // (1 currently)
lsmpft = 1 ;
variables:
double LATIXY(lsmlat, lsmlon) ;
LATIXY:long_name = "latitude" ;
LATIXY:units = "degrees north" ;
double LONGXY(lsmlat, lsmlon) ;
LONGXY:long_name = "longitude" ;
LONGXY:units = "degrees east" ;
I want the coordinate variables to be one-dimensional:
double LATIXY(lsmlat) ;
LATIXY:long_name = "latitude" ;
LATIXY:units = "degrees north" ;
double LONGXY(lsmlon) ;
LONGXY:long_name = "longitude" ;
LONGXY:units = "degrees east" ;
so that the file can be read in GIS software.
Any NCO command would be appreciated.
Thanks.
Your original file has the structure of a curvilinear grid, where both lat and lon arrays are 2-D. Your desired grid is rectangular, where both lat and lon are 1-D. The only way to do this, in general, is to regrid. The NCO operator ncremap does regridding...
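A hedged sketch of such a call (the template file name is hypothetical; it must already contain the desired rectangular 1-D lat/lon grid, which ncremap then uses as the destination grid):
ncremap -d rectangular_template.nc in.nc out.nc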
I'm attempting to concatenate two NetCDF files along the time (record) axis.
The lon dimension has the same number of values in both files, but the first file's lon coordinates range from 0.5 to 359.5 while the second file's range from -179.5 to 179.5. When I view the final time step of the concatenated file in Panoply, the values look reasonable spatially but are positioned in the wrong location on the map. I tried to remedy this by adding 180 to all lon values in the second file before concatenating, but that produces similar results.
The command I'm using for this:
$ ncrcat precip.mon.total.1x1.v7.nc first_guess_monthly_2014_01.nc combined.nc
Below is the ncdump -h output for the two files I'm concatenating:
netcdf precip.mon.total.1x1.v7 {
dimensions:
lat = 180 ;
lon = 360 ;
time = UNLIMITED ; // (1356 currently)
variables:
float lat(lat) ;
lat:units = "degrees_north" ;
lat:actual_range = -89.5f, 89.5f ;
lat:long_name = "Latitude" ;
lat:standard_name = "latitude" ;
lat:axis = "Y" ;
lat:coordinate_defines = "point" ;
float lon(lon) ;
lon:long_name = "Longitude" ;
lon:units = "degrees_east" ;
lon:standard_name = "longitude" ;
lon:actual_range = 0.5f, 359.5f ;
lon:axis = "X" ;
lon:coordinate_defines = "point" ;
double time(time) ;
time:long_name = "Time" ;
time:units = "days since 1800-1-1 00:00:00" ;
time:delta_t = "0000-01-00 00:00:00" ;
time:avg_period = "0000-01-00 00:00:00" ;
time:standard_name = "time" ;
time:axis = "T" ;
time:coordinate_defines = "start" ;
time:actual_range = 36889., 78131. ;
float p(time, lat, lon) ;
p:long_name = "GPCC Monthly total of precipitation" ;
p:missing_value = -9.96921e+36f ;
p:statistic = "Total" ;
p:valid_range = 0.f, 8000.f ;
p:parent_stat = "Observations" ;
p:var_desc = "Precipitation" ;
p:actual_range = 0.f, 3153.04f ;
p:dataset = "GPCC Precipitation 1.0degree V7 Full Reanalysis" ;
p:units = "mm" ;
p:level = "Surface" ;
// global attributes:
:Original_Source = "http://www.dwd.de/en/FundE/Klima/KLIS/int/GPCC/GPCC.htm\n is the webpage and the data is at ftp://ftp.dwd.de/pub/data/gpcc/download.html" ;
:Reference = "Users of the data sets are kindly requested to give feed back and to refer to GPCC publications on this webpage: http://www.dwd.de/bvbw/appmanager/bvbw/dwdwwwDesktop/?_nfpb=true&_pageLabel=_dwdwww_klima_umwelt_datenzentren_wzn&T12404518261141645246564gsbDocumentPath=Content%2FOeffentlichkeit%2FKU%2FKU4%2FKU42%2Fteaser__product__access.html&_state=maximized&_windowLabel=T12404518261141645246564&lastPageLabel=_dwdwww_klima_umwelt_datenzentren_wzn" ;
:original_source = "ftp://ftp-anon.dwd.de/pub/data/gpcc/html/download_gate.html" ;
:References = "http://www.esrl.noaa.gov/psd/data/gridded/data.gpcc.html" ;
:Conventions = "CF 1.0" ;
:title = "GPCC Full Data Reanalysis Version 7 1.0x1.0 Monthly Totals" ;
:dataset_title = "Global Precipitation Climatology Centre (GPCC)" ;
:history = "Tue Jun 14 10:59:21 2016: ncrename -v precip,p precip.mon.total.1x1.v7.nc\nCreated 01/2016 based on V7 data obtained via ftp" ;
netcdf first_guess_monthly_2014_01 {
dimensions:
lon = 360 ;
lat = 180 ;
time = UNLIMITED ; // (1 currently)
variables:
double lon(lon) ;
lon:standard_name = "longitude" ;
lon:long_name = "longitude" ;
lon:units = "degrees_east" ;
lon:axis = "X" ;
double lat(lat) ;
lat:standard_name = "latitude" ;
lat:long_name = "latitude" ;
lat:units = "degrees_south" ;
lat:axis = "Y" ;
double time(time) ;
time:standard_name = "time" ;
time:units = "months since 2014-01-01 00:00:00" ;
time:calendar = "proleptic_gregorian" ;
float p(time, lat, lon) ;
p:long_name = "first guess monthly product, precipitation per grid" ;
p:units = "mm/month" ;
p:code = 20 ;
p:_FillValue = -99999.99f ;
float s(time, lat, lon) ;
s:long_name = "first guess monthly product, number of gauges per grid" ;
s:units = "gauges per gridcell" ;
s:code = 21 ;
s:_FillValue = -99999.99f ;
// global attributes:
:CDI = "Climate Data Interface version 1.5.9 (http://code.zmaw.de/projects/cdi)" ;
:Conventions = "CF-1.4" ;
:history = "Mon Apr 07 10:41:42 2014: cdo -setgatts,gattfile gpcc10.nc first_guess_monthly_2014_01.nc\n",
"Mon Apr 07 10:41:41 2014: cdo -merge first_guess_monthly_precip_2014_01.nc first_guess_monthly_numgauge_2014_01.nc gpcc10.nc\n",
"Mon Apr 07 10:41:41 2014: cdo cat gpcc10.nc first_guess_monthly_numgauge_2014_01.nc\n",
"Mon Apr 07 10:41:41 2014: cdo -b F32 -r -f nc -setgrid,firstguess10desc.asc -settunits,month -setmissval,-99999.99 -setpartab,codetable.txt -setcode,21 -setdate,2014-01 -input,r360x180 gpcc10.nc" ;
:DOI = "10.5676/DWD_GPCC/FG_M_100" ;
:title = "GPCC First Guess Product at 1.0°: Near Real-Time First Guess Monthly Land-Surface Precipitation from Rain-Gauges based on SYNOP Data" ;
:summary = "This is the GPCC First Guess Product of monthly global land-surface precipitation based on the station database (SYNOP) available via the Global Telecommunication System (GTS) of the World Meteorological Organization (WMO) at the time of analysis (3 - 5 days after end of the analysis month). This product contains the monthly totals on a regular latitude/longitude grid with a spatial resolution of 1.0° x 1.0° latitude by longitude. Interpolation is made on the anomalies from the GPCC Climatology V2011 in the corresponding resolution (DOI: 10.5676/DWD_GPCC/Clim_M_V2011_100). The temporal coverage of the dataset ranges from January 2005 until the most recent month for which GTS based SYNOP data is available, i.e. the previous month, 3-5 days after its completion." ;
:usage = "This GPCC product is recommended to be used when the timeliness of the precipitation information is of highest importance, e.g. for drought monitoring or watch purposes." ;
:keywords = "precipitation climatology,gpcc,global,gpcp,daily" ;
:id = "first_guess_monthly_precipitation_10" ;
:creator_url = "http://gpcc.dwd.de" ;
:creator_name = "GPCC/DWD" ;
:creator_email = "gpcc#dwd.de" ;
:institution = "Deutscher Wetterdienst" ;
:date_created = "Mo 7. Apr 10:41:41 UTC 2014" ;
:time_coverage_start = "2014-01" ;
:time_coverage_end = "2014-01" ;
:time_coverage_resolution = "month" ;
:geospatial_lat_min = "-90." ;
:geospatial_lat_max = "90." ;
:geospatial_lon_min = "-180." ;
:geospatial_lon_max = "180." ;
:CDO = "Climate Data Operators version 1.5.9 (http://code.zmaw.de/projects/cdo)" ;
}
What might I be doing wrong?
I know this is 4 years too late, but did you try to use CDO?
cdo mergetime file1.nc file2.nc merged_file.nc
I think it can automatically handle the shifted longitude issue in cases like this.
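If the grids still end up misaligned, a common approach (described in the CDO FAQ) is to rotate the 0..360 file to the -180..180 convention before merging; the intermediate file name below is arbitrary:
cdo sellonlatbox,-180,180,-90,90 precip.mon.total.1x1.v7.nc precip_rotated.nc
cdo mergetime precip_rotated.nc first_guess_monthly_2014_01.nc combined.nc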