I am trying to read some variables from a NetCDF file, but I am getting an error:
f=open.ncdf("mrgHYDRO_Az_arfswp_Stom_PRUNI_20000101_20001231_1M_sechiba_history.nc")
A = get.var.ncdf(nc=f,varid="Evaporation",verbose=TRUE)
Error in vobjtovarid(nc, varid, verbose = verbose) : Variable not found
Any help please? Best regards.
"file has 9 dimensions:"
[1] "lon Size: 34"
[1] "lat Size: 30"
[1] "veget Size: 13"
"file has 113 variables:"
[1] "double time_counter_bnds[tbnds,time_counter] Longname:time_counter_bnds Missval:1e+30"
[1] "float evap[lon,lat,time_counter] Longname:Evaporation Missval:1e+30"
I believe the correct name to use in the call to get.var.ncdf is evap, not Evaporation. The longname is just a more descriptive label; the actual variable name is evap.
A = get.var.ncdf(nc=f,varid="evap",verbose=TRUE)
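To find the right identifier up front, you can list the short names and long names of everything in the file. A minimal sketch, assuming the handle returned by open.ncdf stores its variables in f$var (as the ncdf/ncdf4 packages do):

f <- open.ncdf("mrgHYDRO_Az_arfswp_Stom_PRUNI_20000101_20001231_1M_sechiba_history.nc")
# short name (what get.var.ncdf expects) next to the descriptive longname
vars <- data.frame(name     = sapply(f$var, function(v) v$name),
                   longname = sapply(f$var, function(v) v$longname))
vars[vars$longname == "Evaporation", ]   # -> evap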
I am plotting a graph with barplot(), and any attempt to use the beside=TRUE argument returns this error: Error in -0.01 * height : non-numeric argument to binary operator
The following is the code for the graph:
combi <- as.matrix(combine)
barplot(combi, main="Top 5 hospitals in California",
ylab="Mortality/Admission Rates", col = heat.colors(5), las=1)
In the resulting graph the bars are stacked on top of each other instead of being placed beside each other.
The issue is not reproducible when combine is a data.frame:
combine <- data.frame(
HeartAttack = c(13.4,12.3,16,13,15.2),
HeartFailure = c(11.1,7.3,10.7,8.9,10.8),
Pneumonia = c(11.8,6.8,10,9.9,9.5),
HeartAttack2 = c(18.3,19.3,21.8,21.6,17.3),
HeartFailure2 = c(24,23.3,24.2,23.8,24.6),
Pneumonia2 = c(17.4,19,17,18.4,18.2)
)
combi <- as.matrix(combine)
barplot(combi, main="Top 5 hospitals in California",
ylab="Mortality/Admission Rates", col = heat.colors(5), las=1, beside = TRUE)
I had the same issue earlier (with a different dataset, though) and resolved it by calling as.numeric() on my data frame after converting it to a matrix with as.matrix(). Leaving as.numeric() out leads to the same Error in -0.01 * height : non-numeric argument to binary operator.
¯\(ツ)/¯
My df called tmp:
> tmp
125 1245 1252 1254 1525 1545 12125 12425 12525 12545 125245 125425
Freq.x.2d "14" " 1" " 1" " 1" " 3" " 2" " 1" " 1" " 9" " 4" " 1" " 5"
Freq.x.3d "13" " 0" " 1" " 0" " 4" " 0" " 0" " 0" "14" " 4" " 1" " 2"
> dim(tmp)
[1] 2 28
> is(tmp)
[1] "matrix" "array" "structure" "vector"
> tmp <- as.matrix(tmp)
> dim(tmp)
[1] 2 28
> is(tmp)
[1] "matrix" "array" "structure" "vector"
> tmp <- as.numeric(tmp)
> dim(tmp)
NULL
> is(tmp)
[1] "numeric" "vector"
barplot(tmp, las=2, beside=TRUE, col=c("grey40","grey80"))
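If you want to keep the two-row structure (so that beside = TRUE groups Freq.x.2d and Freq.x.3d for each column), a sketch of converting the original character matrix to numeric without dropping its dimensions; mode<- is applied here to tmp before the as.numeric() step above:

num <- tmp                # the character matrix, before as.numeric()
mode(num) <- "numeric"    # converts the values but keeps dim and dimnames
barplot(num, las = 2, beside = TRUE, col = c("grey40", "grey80"))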
I'm attempting to use the system function in R to run a program, which I expect to yield an error message in some cases. For this I want to write a tryCatch function.
system(command, intern = TRUE) only returns the values that were echoed by the program I'm running; it does not return the error.
In R, how can I capture the error message produced by the system call?
My code:
test <- tryCatch({
cmd <- paste0("../Scripts/Plink2/plink --file ../InputData/",prefix," --bmerge ",
"../InputData/fs --missing --out ../InputData/",prefix)
print(cmd)
system(cmd)
} , error = function(e) {
# error handler picks up where error was generated
print("EZEL")
print(paste("MY_ERROR: ",e))
}, finally = {
print("something")
})
[1] "../Scripts/Plink2/plink --file ../InputData/GS80Kdata --bmerge ../InputData/fs --missing --out ../InputData/GS80Kdata"
PLINK v1.90b3.37 64-bit (16 May 2016) https://www.cog-genomics.org/plink2
#....
#skipping some lines here to reduce size
#....
Of these, 1414410 are new, while 2462 are present in the base dataset.
Error: 1 variant with 3+ alleles present.
* If you believe this is due to strand inconsistency, try --flip with
# Skipping some more lines here
[1] "something"
However, using intern=TRUE and assigning the result of system() to a variable does not capture the error in the returned vector; the error is still printed to the R console.
Edit: here is the content of that vector (using gsub to trim its considerable size):
> gsub(pattern="\b\\d.*", replacement = "", x = tst)
[1] "PLINK v1.90b3.37 64-bit (16 May 2016) https://www.cog-genomics.org/plink2"
[2] "(C) 2005-2016 Shaun Purcell, Christopher Chang GNU General Public License v3"
[3] "Logging to ../InputData/GS80Kdata.log."
[4] "Options in effect:"
[5] " --bmerge ../InputData/fs"
[6] " --file ../InputData/GS80Kdata"
[7] " --missing"
[8] " --out ../InputData/GS80Kdata"
[9] ""
[10] "64381 MB RAM detected; reserving 32190 MB for main workspace."
[11] "Scanning .ped file... 0%\b"
[12] "2%\b\b"
[13] "%\b\b"
[14] "\b\b"
[15] "\b"
[16] ""
[17] "58%\b\b"
[18] "7%\b\b"
[19] "%\b\b"
[20] "\b\b"
[21] "\b"
[22] "Performing single-pass .bed write (42884 variants, 14978 people)."
[23] "0%\b"
[24] "../InputData/GS80Kdata-temporary.bim + ../InputData/GS80Kdata-temporary.fam"
[25] "written."
[26] "14978 people loaded from ../InputData/GS80Kdata-temporary.fam."
[27] "144 people to be merged from ../InputData/fs.fam."
[28] "Of these, 140 are new, while 4 are present in the base dataset."
[29] "42884 markers loaded from ../InputData/GS80Kdata-temporary.bim."
[30] "1416872 markers to be merged from ../InputData/fs.bim."
[31] "Of these, 1414410 are new, while 2462 are present in the base dataset."
attr(,"status")
[1] 3
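One way to get at that error text (a sketch, not from the original thread; the run_plink() helper is hypothetical): PLINK writes its diagnostics to stderr, which system(..., intern = TRUE) does not capture, whereas system2() can redirect stderr into the returned character vector; the exit code ends up in the "status" attribute, matching the attr(,"status") 3 shown above.

run_plink <- function(args) {
  # capture stdout and stderr together; suppress the warning R raises on a
  # non-zero exit status so we can turn it into a proper error instead
  out <- suppressWarnings(
    system2("../Scripts/Plink2/plink", args, stdout = TRUE, stderr = TRUE)
  )
  status <- attr(out, "status")             # NULL when the command succeeded
  if (!is.null(status) && status != 0) {
    err <- grep("^Error:", out, value = TRUE)   # PLINK prefixes errors with "Error:"
    stop(paste0("PLINK failed (status ", status, "): ",
                paste(err, collapse = " ")))
  }
  out
}

res <- tryCatch(
  run_plink(c("--file", paste0("../InputData/", prefix),
              "--bmerge", "../InputData/fs",
              "--missing", "--out", paste0("../InputData/", prefix))),
  error = function(e) print(paste("MY_ERROR:", conditionMessage(e)))
)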
I need to read a NetCDF file with R and export each time step as a smoothed polygon shapefile.
I have two problems: smoothing the raster and exporting to shapefile with proper projection from the NC file.
The output is a regular grid and is not projected.
Here is a sample code:
NCFileName = 'MyncFile.nc'
NCFile = open.ncdf(NCFileName)
NCFile
[1] "file CF_OUTPUT.nc has 6 dimensions:"
[1] "time Size: 61"
[1] "height Size: 8"
[1] "lat Size: 185"
[1] "lon Size: 64"
[1] "Time Size: 61"
[1] "DateStrLen Size: 19"
[1] "------------------------"
[1] "file CF_OUTPUT.nc has 20 variables:"
[1] "float temp[lon,lat,height,time] Longname:Temperature Missval:1e+30"
[1] "float relh[lon,lat,height,time] Longname:Relative Humidity Missval:1e+30"
[1] "float airm[lon,lat,height,time] Longname:Air density Missval:1e+30"
[1] "float z[lon,lat,height,time] Longname:Layer top altitude Missval:1e+30"
[1] "float ZH[lon,lat,height,time] Longname:Layer top altitude Missval:1e+30"
[1] "float hlay[lon,lat,height,time] Longname:Layer top altitude Missval:1e+30"
[1] "float PM10ant[lon,lat,height,time] Longname:PM10ant Concentration Missval:1e+30"
[1] "float PM10bio[lon,lat,height,time] Longname:PM10bio Concentration Missval:1e+30"
[1] "float PM10[lon,lat,height,time] Longname:PM10 Concentration Missval:1e+30"
[1] "float PM25ant[lon,lat,height,time] Longname:PM25ant Concentration Missval:1e+30"
[1] "float PM25bio[lon,lat,height,time] Longname:PM25bio Concentration Missval:1e+30"
[1] "float PM25[lon,lat,height,time] Longname:PM25 Concentration Missval:1e+30"
[1] "float C2H4[lon,lat,height,time] Longname:C2H4 Concentration Missval:1e+30"
[1] "float CO[lon,lat,height,time] Longname:CO Concentration Missval:1e+30"
[1] "float SO2[lon,lat,height,time] Longname:SO2 Concentration Missval:1e+30"
[1] "float NO[lon,lat,height,time] Longname:NO Concentration Missval:1e+30"
[1] "float NO2[lon,lat,height,time] Longname:NO2 Concentration Missval:1e+30"
[1] "float O3[lon,lat,height,time] Longname:O3 Concentration Missval:1e+30"
[1] "char Times[DateStrLen,Time] Longname:Times Missval:NA"
[1] "float HGT[lon,lat,time] Longname:Topography Missval:1e+30"
nc.a=get.var.ncdf(NCFile , varid = 'NO2', start=c(1,1,1,1), count=c(-1,-1,1,1))
Pol <- rasterToPolygons(raster(nc.a),dissolve = TRUE)
Pol
class : SpatialPolygonsDataFrame
features : 11829
extent : 0, 1, 0, 1 (xmin, xmax, ymin, ymax)
coord. ref. : NA
variables : 1
names : layer
min values : 0.219758316874504
max values : 0.84041428565979
writeOGR(Pol, dsn = getwd(), layer = 'testPol', driver = 'ESRI Shapefile', overwrite_layer = TRUE)
What I get, however, are gridded polygons that are not projected.
UPDATE:
Following @kakk11's and @RobertH's answers, I was able to solve part of the problem, but I still get grid-like polygons rather than smoothed ones. Here is what I have done so far:
I couldn't read the variable directly into a raster as @RobertH suggested, so I used get.var.ncdf() and then raster():
NCFileName = 'MyncFile.nc'
NCFile = open.ncdf(NCFileName)
nc.a = get.var.ncdf(NCFile, varid = 'NO2', start=c(1,1,1,13), count=c(-1,-1,1,1))
nc.a = raster(nc.a)
# put in correct extent:
lat = NCFile$dim$lat$vals
lon = NCFile$dim$lon$vals
ExtentLat = range(lat)
ExtentLon = range(lon)
rm(lat,lon)
nc.a = flip(t(nc.a), direction='y')
# Give it lat/lon coords
extent(nc.a) = c(ExtentLon,ExtentLat)
The cut() command then returned a vector, so I used raster::reclassify() instead:
cuts = c(0,5,15,30,50)
classes <- cbind(cuts[1:(length(cuts) - 1)], cuts[2:length(cuts)], cuts[2:length(cuts)])
nc.class <- reclassify(nc.a, classes)
I then used the 'rasterToPolygons' with 'dissolve=TRUE' to create the polygons:
pol <- rasterToPolygons(nc.class, dissolve = TRUE)
# assign the WGS84 lon/lat projection:
WGS84_Projection = "+proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0"
proj4string(pol) <- CRS(WGS84_Projection)
writeOGR(pol, dsn = getwd(), layer = 'file' , driver = 'ESRI Shapefile', overwrite_layer = TRUE)
Still, all this creates polygon shapefile with the polygons that are not smooth, which is the main challenge.
Could use some help with this.
Ilik
You first need to correctly create a RasterLayer, like this:
r <- raster('MyncFile.nc', var='NO2')
# or, to get all time steps at once
# brick('MyncFile.nc', var='NO2')
You could then generalize (classify) the values using reclassify or cut. For example
cuts <- seq(0.2, 0.9, 0.1)
rc <- cut(r, cuts)
Make polygons and save to shapefile
pol <- rasterToPolygons(rc, dissolve = TRUE)
shapefile(pol, 'file.shp')
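To export every time step, a minimal sketch building on the brick() idea above; the cut points, the level = 1 choice (the question also reads only the first height level) and the NO2_step_<i>.shp output names are assumptions:

library(raster)

b    <- brick('MyncFile.nc', varname = 'NO2', level = 1)   # one layer per time step
cuts <- seq(0.2, 0.9, 0.1)

for (i in 1:nlayers(b)) {
  r <- b[[i]]
  # optional: resample to a finer grid first for visually smoother boundaries
  # r <- disaggregate(r, fact = 4, method = 'bilinear')
  rc  <- cut(r, cuts)
  pol <- rasterToPolygons(rc, dissolve = TRUE)
  shapefile(pol, paste0('NO2_step_', i, '.shp'), overwrite = TRUE)
}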
I have a NetCDF file with rotated coordinates. I need to convert it to normal lat/lon coordinates (-180 to 180 for lon and -90 to 90 for lat).
library(ncdf4)
nc_open('dat.nf')
Printing the file shows:
[1] " 5 variables (excluding dimension variables):"
[1] " double time_bnds[bnds,time] "
[1] " double lon[rlon,rlat] "
[1] " long_name: longitude"
[1] " units: degrees_east"
[1] " double lat[rlon,rlat] "
[1] " long_name: latitude"
[1] " units: degrees_north"
[1] " char rotated_pole[] "
[1] " grid_mapping_name: rotated_latitude_longitude"
[1] " grid_north_pole_longitude: 83"
[1] " grid_north_pole_latitude: 42.5"
[1] " float tasmax[rlon,rlat,time] "
[1] " long_name: Daily Maximum Near-Surface Air Temperature"
[1] " standard_name: air_temperature"
[1] " units: K"
[1] " cell_methods: time:maximum within days time:mean over days"
[1] " coordinates: lon lat"
[1] " grid_mapping: rotated_pole"
[1] " _FillValue: 1.00000002004088e+20"
[1] " 4 dimensions:"
[1] " rlon Size:310"
[1] " long_name: longitude in rotated pole grid"
[1] " units: degrees"
[1] " axis: X"
[1] " standard_name: grid_longitude"
[1] " rlat Size:260"
[1] " long_name: latitude in rotated pole grid"
[1] " units: degrees"
[1] " axis: Y"
[1] " standard_name: grid_latitude"
[1] " bnds Size:2"
Could anyone show me how to convert the rotated coordinates back to normal lat/lon? Thanks.
NCO's ncks can probably do this in two commands using MSA
ncks -O -H --msa -d Lon,0.,180. -d Lon,-180.,-1.0 in.nc out.nc
ncap2 -O -s 'where(Lon < 0) Lon=Lon+360' out.nc out.nc
I would use cdo for this purpose https://code.zmaw.de/boards/2/topics/102
Another option is just create a mapping between rotated and geographic coordinates and use the original data without interpolation. I can find the equations if necessary.
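In R, that mapping is already stored in the file: the 2D lon[rlon,rlat] and lat[rlon,rlat] variables give the true geographic coordinates of every grid cell. A minimal ncdf4 sketch using the names from the dump above, reading only the first time step:

library(ncdf4)

nc   <- nc_open('dat.nf')
lon  <- ncvar_get(nc, 'lon')      # 2D array [rlon, rlat] of true longitudes
lat  <- ncvar_get(nc, 'lat')      # 2D array [rlon, rlat] of true latitudes
tmax <- ncvar_get(nc, 'tasmax', start = c(1, 1, 1), count = c(-1, -1, 1))
nc_close(nc)

# wrap longitudes into -180..180 if the file stores them as 0..360
lon <- ifelse(lon > 180, lon - 360, lon)

# every cell with its geographic coordinates, first time step
df <- data.frame(lon = as.vector(lon), lat = as.vector(lat),
                 tasmax = as.vector(tmax))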
I went through the CDO link suggested by @kakk11, but somehow it did not work for me. After much research, I found a way.
First, convert the rotated grid to a curvilinear grid:
cdo setgridtype,curvilinear in.nc out.nc
Next, remap to your desired grid, e.g. for a global 1x1 degree grid:
cdo remapbil,global_1 out.nc out2.nc
or for a grid like below
gridtype = lonlat
xsize = 320 # replace by your value
ysize = 180 # replace by your value
xfirst = 1 # replace by your value
xinc = 0.0625 # replace by your value
yfirst = 43 # replace by your value
yinc = 0.0625 # replace by your value
save this info as target_grid.txt and then run
cdo remapbil,target_grid.txt out.nc out2.nc
In my case there was an additional issue: my variables did not carry the grid information, so CDO assumed a regular lat/lon grid. Before all the steps above I therefore had to add the grid attributes to all the variables (in my case all variable names ended with _ave) using NCO:
ncatted -a coordinates,'_ave$',c,c,'lon lat' in.nc
ncatted -a grid_mapping,'_ave$',c,c,'rotated_pole' in.nc
Please note that you should have a variable called rotated_pole in your nc file that carries the lat/lon information of the rotated pole.
It is also possible to do this in R (which the question refers to), although NCO and CDO are more efficient (much faster).
Please also look at this answer.
library(ncdf4)
library(raster)
nsat <- stack('air_temperature.nc')
##check the extent
extent(nsat)
## this will be in the form 0-360 degrees
#change the coordinates
nsat1<-rotate(nsat)
#check result:
extent(nsat1)
##this should be in the format you are looking for: -180/180
Hope this helps.
I want to write the output of my R script to a text file. I am doing it with the help of
sink("filename");
However, my output from:
print(sprintf("%d%10f",key,value));
is in the format:
[1] "0 0.014806"
[1] "1 0.053434"
[1] "10 0.014806"
[1] "100 0.053434"
[1] "1000 0.014806"
[1] "10000 0.053434"
[1] "1000000 0.014806"
[1] "10000000 0.014806"
[1] "100000000 0.053434"
I want the output in the text file in the format:
0 0.014806
1 0.053434
10 0.014806
How can this formatting be done? I need to remove the [1] as well as the quotes.
In place of:
print(sprintf("%d%10f",key,value));
I have now used:
cat(key,value)
The output is already redirected to a file with sink(), which now produces the desired format.
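A compact variant (a sketch, assuming key and value are scalars inside a loop or equal-length vectors) that keeps the sprintf() formatting, drops the [1] and the quotes, and adds the newline that cat(key, value) does not append on its own:

sink("filename")
cat(sprintf("%d %f\n", key, value), sep = "")
sink()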