I am working on an SDM (species distribution modelling) project using MaxEnt and Random Forest. When I prepare the TIFF/raster files of environmental data, I have difficulties, because I'm a beginner.
For example: we have a CSV file (1981_1991_ta_totalAverage.csv) which contains the average minimum temperature from 1981 to 1991.
Question 1: How can I convert this CSV file to raster format?
I tried to convert it. When I needed a raster map of Korea, I used bclim6.asc (BIO6 = Minimum Temperature of the Coldest Month) for Korea, but something seems wrong. Question 2: How can I get a raster file of Korea?
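If each row of the CSV holds a longitude, a latitude, and a value, rasterFromXYZ() in the raster package can turn it into a raster directly. This is only a sketch: the column names lon, lat, and tmin are assumptions about the file's layout.

library(raster)

pts <- read.csv("1981_1991_ta_totalAverage.csv")
r <- rasterFromXYZ(pts[, c("lon", "lat", "tmin")],
                   crs = "+proj=longlat +datum=WGS84")
writeRaster(r, filename = "tmin_1981_1991.tif", format = "GTiff")

Note that rasterFromXYZ() only works when the points lie on a regular grid; irregular station data would first have to be interpolated onto one.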
I am working with hourly air temperature raster maps for several months. I also have a .txt file with hourly air temperature values for each month. For each hour I want to calculate UHI:
UHI_1/1/2011_0am = raster - air temperature value (line 1 in .txt file)
UHI_1/1/2011_1am = raster - air temperature value (line 2 in .txt file)
And so on...
How can I do this in R, instead of loading each individual raster and calculating the UHI separately for each hour?
This is my script, but it is very time-consuming:
install.packages("raster")
install.packages("rgdal")
install.packages("sp")
install.packages("xlsx")
library(raster)
library(rgdal)
library(sp)
library(xlsx)
setwd("C:/Users/Claudia/Desktop/apr_2014")
apr_1 = raster("apr__1.tif")
apr_2 = raster("apr__2.tif")
dif_apr_1 = apr_1 - 287.04
writeRaster(dif_apr_1, filename = "dif_apr_1", format = "GTiff")
Is there an easier way to do this?
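One way to avoid repeating the script by hand is to list all the GeoTIFFs, read the hourly values from the .txt file, and subtract in a loop. This is a sketch; the .txt file name and its one-value-per-line layout are assumptions:

library(raster)

setwd("C:/Users/Claudia/Desktop/apr_2014")
files <- list.files(pattern = "\\.tif$")   # apr__1.tif, apr__2.tif, ...
temps <- scan("air_temperature.txt")       # one air temperature per line, same order

for (i in seq_along(files)) {
  uhi <- raster(files[i]) - temps[i]       # UHI = raster - air temperature
  writeRaster(uhi,
              filename = paste0("dif_", tools::file_path_sans_ext(files[i])),
              format = "GTiff")
}

Beware that list.files() sorts names alphabetically (apr__10.tif comes before apr__2.tif), so zero-pad the file names or reorder files so they line up with the lines of the .txt file.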
I have a NetCDF file with the daily precipitation (for a whole decade) at every latitude and longitude, stored as (lon, lat, time). I want to get the monthly average for longitude -118.25 to -84.75 and latitude 13.25 to 33.25, and then write another NetCDF file whose variable is the monthly precipitation, again as (lon, lat, time). I don't know how to extract the ranges or how to obtain the monthly average, since the months repeat each year.
Just use the tool called CDO (Climate Data Operators) and the operator sellonlatbox; its parameters are attached to the operator name with commas:
cdo sellonlatbox,-118.25,-84.75,13.25,33.25 filein fileout
filein is the name of your input file and fileout is the name of the output.
Afterwards you can use operator monmean to calculate monthly means:
cdo monmean fileout final_file
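The two steps can also be chained into a single call, with sellonlatbox applied first and monmean computed on its result, which avoids writing the intermediate file:

cdo monmean -sellonlatbox,-118.25,-84.75,13.25,33.25 filein final_file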
I have an Excel file with weather-station data. For each station and each of 70 years, I have temperature data as well as the stations' latitude and longitude. I want to create an interpolated raster map of temperature (using IDW) for each year.
My Excel files are set up like this, but with 70 years of data:
I would therefore like 70 interpolated maps, one per year of temperature. It may also be important to note that the stations are not all the same from year to year.
I am willing to try this as a batch process in ArcGIS, but find that tedious. Is there a faster way, through arcpy or even through R?
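In R, one option is idw() from the gstat package in a loop over years. This is a sketch, not a drop-in solution: the column names (year, lon, lat, temp), the 0.1-degree grid resolution, and reading the data from a CSV export of the spreadsheet are all assumptions:

library(sp)
library(gstat)
library(raster)

stations <- read.csv("stations.csv")   # columns: year, lon, lat, temp (assumed)

# one prediction grid covering all stations
grd <- expand.grid(lon = seq(min(stations$lon), max(stations$lon), by = 0.1),
                   lat = seq(min(stations$lat), max(stations$lat), by = 0.1))
coordinates(grd) <- ~lon + lat
gridded(grd) <- TRUE

for (yr in unique(stations$year)) {
  s <- stations[stations$year == yr, ]   # only that year's stations
  coordinates(s) <- ~lon + lat
  est <- idw(temp ~ 1, s, newdata = grd) # inverse-distance weighted surface
  writeRaster(raster(est), filename = paste0("idw_", yr, ".tif"),
              format = "GTiff", overwrite = TRUE)
}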
We have the data below in a CSV file and would like to create a polar plot from it, using the openair R package.
timestamp humandate NOppb
1 1412877113 09/10/2014 13:51 19
2 1412876508 09/10/2014 13:41
3 1412876508 09/10/2014 13:41
4 1412877118 09/10/2014 13:51 17
....
However, we are missing some data that polarPlot() needs:
# Load package.
library("openair")
polarPlot(dat, pollutant = "NOppb", na.rm = TRUE)
result:
Error in checkPrep(mydata, vars, type, remove.calm = FALSE) :
  Can't find the variable(s) wd ws
So it requires columns wd and ws, for wind direction and wind speed, which we don't have.
I am told that we can pull these missing data from Wunderground's API, but the problems are:
1. How can we pull data from Wunderground's API so that it matches each row of our data above?
2. The weather data seems to be measured and recorded hourly, but our data is not, as you can see above. How are the two going to match?
Any ideas what we can do?
The openair package provides easy access to UK air quality monitoring station data, including several stations in London, via its importAURN and importKCL functions. These data automatically include wind speed and direction (ws and wd).
Use one of these functions to download an hourly dataset from a monitoring station near your site, for the period you are interested in, and merge it with your data by date (timestamp). The date column in openair is a POSIXct date, usually on the whole hour, so convert your timestamp or humandate to POSIXct with as.POSIXct, name the resulting column date, round it to the nearest whole hour, and then merge with the AURN dataset on date.
Then you can make your polar plots based on the wind data, and even compare pollutant measurements to the city's own monitoring station data.
I don't know anything about specific stations in London, but read about the importAURN and importKCL functions in the openair manual, or in the R help once openair is loaded. See openair on CRAN, or the latest updates on GitHub (https://github.com/davidcarslaw/openair). PS: the openair author is an expert on London air quality.
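A sketch of that merge, assuming the data frame is called dat as above and using Marylebone Road ("my1") purely as an example site code:

library(openair)

# convert the Unix timestamp to POSIXct and round to the whole hour
dat$date <- as.POSIXct(dat$timestamp, origin = "1970-01-01", tz = "UTC")
dat$date <- as.POSIXct(round(dat$date, "hours"))

# hourly data from a nearby monitoring site, which includes ws and wd
aurn <- importAURN(site = "my1", year = 2014)

merged <- merge(dat, aurn[, c("date", "ws", "wd")], by = "date")
polarPlot(merged, pollutant = "NOppb")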
I am doing a project about wind turbines, testing whether it is possible to heat a house using only energy from the wind turbine. So I wrote this R code, which predicts the inside temperature. I have a dataset in Excel that contains the outside temperature and wind speed for every hour in January 2009.
R <- 5            # insulation (thermal resistance) of the wall
C <- 4            # heat capacity of the wall
Ta <- rep(3, 24)  # constant outside temperature over 24 hours
Ph <- rep(2, 24)  # constant wind-turbine output (kW) over 24 hours
Ti0 <- 20         # starting temperature in the house
a <- -1/(R * C)
Ti <- numeric(24)
Ti[1] <- Ti0
for (k in 2:24) {
  Ti[k] <- Ta[k] + exp(a) * (Ti[k-1] - Ta[k]) + R * (1 - exp(a)) * Ph[k]
}
My question is: how can I load my Excel dataset into R so that the model predicts the inside temperature with the changing hourly temperature and turbine output, instead of holding them constant?
Use read.xls() from the gdata package to load the Excel data (load gdata first, since read.xls() comes from it):

library(gdata)
data <- read.xls("excelfile.xls")

This gets the data from the first worksheet in your Excel file and saves it in a data frame. (Just make sure the file is in the working directory.)
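To answer the second half of the question: once the data frame is loaded, its columns can drive the model in place of the constant vectors. This is only a sketch; the file name and the column names Ta and Ph (hourly outside temperature and turbine output, one row per hour) are assumptions about your worksheet:

library(gdata)

weather <- read.xls("excelfile.xls")   # one row per hour of January 2009
n  <- nrow(weather)
Ta <- weather$Ta                       # hourly outside temperature
Ph <- weather$Ph                       # hourly turbine output in kW

R <- 5
C <- 4
a <- -1/(R * C)

Ti    <- numeric(n)
Ti[1] <- 20                            # starting temperature in the house
for (k in 2:n) {
  Ti[k] <- Ta[k] + exp(a) * (Ti[k-1] - Ta[k]) + R * (1 - exp(a)) * Ph[k]
}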