alternative for raster::extract to extract values from a shapefile - r

I have a map of ecoregions in shapefile format and a set of data point coordinates. What I would like to do is, for each point, extract the ecoregion it falls in.
This is what raster::extract() does. However, it seems that for this to work the rgeos package is necessary, and this is not available for OS X Mavericks or later, which is my case.
Any idea how I could extract this info without using extract()? Or could you please tell me if I'm missing something on how to get raster to work properly on my system (OS X Yosemite, R 3.1.2)?
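One possible workaround is sp::over(), which does point-in-polygon overlay without needing rgeos. Below is only a minimal sketch, assuming the shapefile can be read with rgdal::readOGR() and the points sit in a data.frame with hypothetical lon/lat columns already in the same CRS as the polygons:
library(sp)
library(rgdal)
ecoregions <- readOGR("path/to/ecoregions", "ecoregions")  # hypothetical path and layer name
pts <- data.frame(lon = c(-70.1, -69.8), lat = c(44.2, 44.5))  # hypothetical points
coordinates(pts) <- c("lon", "lat")
proj4string(pts) <- CRS(proj4string(ecoregions))  # assumes the points use the shapefile's CRS
pts_eco <- over(pts, ecoregions)  # one row of ecoregion attributes per point
head(pts_eco)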

Related

How to get raster file from a nested raster list produced by landscapemetrics package in R?

The landscapemetrics package can calculate the area of each patch in a given raster file, the shape of that patch, and so on. I want not only the tibble with the calculated patch metrics, but also a new raster where each pixel within a specific patch has the value of that patch's area, shape indicator, and so on. We can do this with the function spatialize_lsm() (it produces a large nested list, probably with RasterLayer objects inside):
library(landscapemetrics)
plot(podlasie_ccilc) # this raster data is provided with package
podlasie.metrics.area <- spatialize_lsm(podlasie_ccilc, what = 'lsm_p_area') # creates a list
plot(podlasie.metrics.area) # produces an error...
How do I get the desired raster file with patch metrics from that list? I guess it is a question about the raster package or something else, since the landscapemetrics documentation says nothing about this step.
Note that this data and the new raster do not have a pixel resolution in meters (e.g. 30, 30 for a Landsat satellite image). So we cannot plot the new raster produced:
podlasie.metrics.area[[1]]
plot(podlasie.metrics.area[[1]])
So I guess landscapemetrics cannot deal with such rasters; we can even use its function to check the suitability of the input raster for patch detection:
check_landscape(podlasie_ccilc)
Update: I did it for a Landsat dataset with 30 m resolution and it produced a patch area raster, but again I cannot open/show/save it as a raster because of the same error.
The package maintainer helped to solve the problem (yes, it is just related to the structure of the list):
plot(podlasie.metrics.area[[1]]$lsm_p_area)
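As a hedged follow-up sketch: once the RasterLayer is pulled out of the nested list, it can be handled like any other raster, e.g. written to disk (the output file name here is hypothetical):
library(raster)
area_raster <- podlasie.metrics.area[[1]]$lsm_p_area  # extract the RasterLayer from the nested list
plot(area_raster)
writeRaster(area_raster, filename = "podlasie_patch_area.tif", overwrite = TRUE)  # hypothetical file name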

writeRaster function in R is automatically setting (unwanted) maximum value, can I set the max value to null?

I am running into a problem with the "writeRaster" function in the raster package in R. I am importing a raster (TIF) that I made in ArcGIS (a distance to feature raster).
My goal was to resample the distance raster to the correct resolution and extent, then "mask" it with the appropriate raster to crop it to the shape I require. When I check the results of the mask with the basic plot function, everything looks great and I can see that each pixel in the new masked raster has a distance value.
However, when I write this raster to a file using the writeRaster function, the resulting raster looks like "swiss cheese" and has missing values for any distance over 35 km. After much reading, I cannot find any documentation suggesting there is a way to modify the maximum value set by writeRaster, or that it should even be setting a max value. I have included my code and the basic plots below. A big thank you to anyone who attempts to help me with this!
#Read in distance to fresh water raster
distFW <- raster("D:/Academia/Arc Data/Grackle/NicaCR_90mlayers/dist_FW.tif")
[plot(distFW)][1]
#Resample this layer to the desired resolution and template
NiCR_DistFW<-as.integer(resample(distFW,NiCRrast.tmpl,method="ngb"))
#essentially the same as the first plot
[plot(NiCR_DistFW)][2]
#Mask the resampled raster to the desired shape
NiCR.DistFW.mask.utm <- mask(NiCR_DistFW,NiCR_Mask) #with CA countries cut out.
[plot(NiCR.DistFW.mask.utm)][3]
#write raster to file (this is where things get weird)
writeRaster(x=NiCR.DistFW.mask.utm, filename='DistFWmask2.tif', format='GTiff', datatype='INT2S') #a way to ensure INT2S
#read the newly written raster file in to R so we can review it
dFW <- raster("DistFWmask2.tif")
[plot(dFW)_writeRaster_result][4]
[1]: https://i.stack.imgur.com/v9RkK.jpg
[2]: https://i.stack.imgur.com/v2DG3.jpg
[3]: https://i.stack.imgur.com/cCwJe.jpg
[4]: https://i.stack.imgur.com/MjWj7.jpg
As you can see from plot 4, an undesirable max value has been set. I want the raster I write to file to look like the one in plot 3, not plot 4.
Thanks in advance for any advice.
Well friends, after taking an hour to detail my question I managed to figure out the answer myself. It had to do with setting the datatype.
INT2S has a maximum value of 32,767.
By switching it to INT4S, I capture the full range of values in my raster.
Problem solved!
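A minimal sketch of the fix described above, reusing the same call from the question with only the datatype changed (INT4S stores 32-bit signed integers, so distances above 32,767 are no longer clipped):
writeRaster(x = NiCR.DistFW.mask.utm, filename = 'DistFWmask2.tif', format = 'GTiff', datatype = 'INT4S', overwrite = TRUE)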

Issue with coordinate projection for detecting spatial autocorrelation in R

We have a dataset that contains latitude and longitude coordinates, as well as attribute information, each in its own separate column, stored as numeric. These coordinates have been geocoded based on the geographic coordinate system WGS 1984.
We know that we have significant spatial autocorrelation in our data, which we are hoping to visualize in a bubble plot using the “sp” package. We are modeling our example off of others online, such as here: https://beckmw.wordpress.com/2013/01/07/breaking-the-rules-with-spatial-correlation/ . However, when we try to use the coordinates command within "sp", we keep getting an error message:
Code example:
coords <- data.frame(lead$X, lead$Y)
coordinates(coords) <- c("lead6.X","lead6.Y")
Error in if (nchar(projargs) == 0) projargs <- as.character(NA) : missing value where TRUE/FALSE needed
We can't load our direct code because it's sensitive and hosted on a virtual environment without access to the internet. Does anyone have ideas for why this might be happening? We've looked into the proj4 package but can't figure out how to specify a projection system (or is that even the error that we are getting?). If anyone knows of any other packages in R or ways to visualize spatial autocorrelation, those would be much appreciated too.
Your code is a bit "strange": it seems you are trying to build a dataset containing only coordinates. AFAIU, you may need something along these lines:
data <- data.frame(X = lead$X, Y = lead$Y, Z = lead$Z)
with lead$Z corresponding to a generic "variable" you want to inspect, then
coordinates(data) <- c('X', 'Y')
proj4string(data) <- CRS("+init=epsg:4326")
which should give you a proper SpatialPointsDataFrame with lat-lon WGS84 geographic coordinates (the first line could also be dropped, and you would keep all variables in the data slot of the SpatialPointsDataFrame).
HTH
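Since the goal is to visualize spatial autocorrelation with a bubble plot, a hedged sketch following on from the object built above (assuming the hypothetical column Z holds the variable or model residuals of interest) could be:
library(sp)
bubble(data, zcol = "Z", main = "Spatial distribution of Z")  # bubble plot from the sp package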

Autokriging spatial data

I'm trying to use a kriging function to create vertical maps of chemical parameters in an ocean transect, and I'm having a hard time getting started.
My data look like this:
horiz=rep(1:5, 5)
depth=runif(25)
value = horiz+runif(25)/5
df <- data.frame(horiz, depth, value)
The autoKrige function in the automap package looks like it should do the job for me but it takes an object of class SpatialPointsDataFrame. As far as I can tell, the function spTransform in package rgdal creates SpatialPointsDataFrame objects, but there are two problems:
OS X binaries of this aren't available from CRAN, and my copy of RStudio running on OS X 10.7 doesn't seem to be able to install it, and
This function seems to work on lat/long data and correct distance values for the curvature of the Earth. Since I'm dealing with a vertical plane (and short distances, scale of hundreds of meters) I don't want to correct my distances.
There's an excellent discussion of kriging in R here, but due to the issues listed above I don't quite understand how to apply it to my specific problem.
I want a matrix or dataframe describing a grid of points with interpolated values for my chemical parameters, which I can then plot (ideally using ggplot2). I suspect that the solution to my problem is considerably easier than I'm making it out to be.
So there are a few questions you want answered:
1. The spTransform function does not create SPDFs, but transforms between projections. To create a SPDF you can start from a simple data.frame. To transform df into a SPDF:
coordinates(df) <- c("horiz", "depth")
2. OS X binaries of rgdal can be found at http://www.kyngchaos.com, but I doubt you need rgdal.
3. spTransform can operate on lat/long data as well as projected data, but I do not think you need rgdal or spTransform; see also point 1.
4. After you create the SPDF as in point 1, you can use the info in the post you mentioned to carry on.
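To make this concrete, here is a hedged end-to-end sketch under the assumptions above: convert df to a SpatialPointsDataFrame, build a regular prediction grid over the data extent, run autoKrige(), and pull the predictions back into a data.frame for ggplot2. The grid spacing of 0.05 is an arbitrary choice for illustration.
library(sp)
library(automap)
library(ggplot2)
coordinates(df) <- c("horiz", "depth")  # df from the question, now a SpatialPointsDataFrame
# regular prediction grid covering the data extent (spacing is arbitrary)
grd <- expand.grid(horiz = seq(1, 5, by = 0.05), depth = seq(0, 1, by = 0.05))
coordinates(grd) <- c("horiz", "depth")
gridded(grd) <- TRUE
kr <- autoKrige(value ~ 1, input_data = df, new_data = grd)  # ordinary kriging with an automatically fitted variogram
kr_df <- as.data.frame(kr$krige_output)  # predictions back as a plain data.frame
ggplot(kr_df, aes(horiz, depth, fill = var1.pred)) + geom_raster()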

Creating x and y distance coordinates for R from a .kml file

I would like to use a .kml track file to make a set of x, y coordinates for use in R.
What I have right now is a Google Earth track, which I believe is a LineString. I have heard that the rgdal package is usually what people use, but it doesn't work on Mac versions of R. If possible, I'd like to do this on a Mac, where I do the rest of my analyses. If necessary, I can do the conversion on R64 under Windows and then bring the coordinates to my Mac, but that seems...clunky.
The beginning of the .kml code looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"
xmlns:gx="http://www.google.com/kml/ext/2.2" xmlns:kml="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom">
<Document>
<name>Perimeter_Track.kml</name>
<Placemark>
<name>ACTIVE LOG</name>
<LineString>
<coordinates>
-157.80736808,21.4323436,20.324951171875
I want to convert it into x, y coordinates in kilometers from a point on my map. The finished product will be a line outline of a body of water, with species abundance data overlaid on it.
I have tried a couple of methods already:
1. Converting the .kml file into a .csv and importing it to r using read.csv;
2. Importing coordinates using getKMLcoordinates in the maptools package.
The problem with (1) is that when I try to convert the .kml coords into csv, I get an error in the converter program (kmlcsv) saying it can't read the file (I'm not sure why; the error logs aren't available).
When I try (2), I get coordinates that are arranged weirdly.
spa<-getKMLcoordinates("Perimeter_Track.kml", ignoreAltitude=TRUE)
summary(spa) returns:
Length Class Mode
[1,] 128 -none- numeric
[2,] 242 -none- numeric
[3,] 34 -none- numeric
[4,] 126 -none- numeric
I believe this is because the .kml file is actually four separate tracks, separated by small gaps (i.e., where they turned the GPS off for a short time, then started again). Do I need to import these all separately in order to get the whole shape? If so, how do I do this?
I would like, eventually, to get this shape on a grid that is x by y km, where the coordinates are in km instead of GPS coords. If anyone has any insight into how to do this, I would love to hear from you!
Thanks very much in advance.
Even though a precompiled package isn't available, you can still install rgdal from source on a Mac as follows:
Install the "GDAL complete" framework from http://www.kyngchaos.com/software/frameworks.
Add the locations of the programs you just installed to your Unix path. In the Mac terminal, do:
PATH=/Library/Frameworks/GDAL.framework/unix/bin:/Library/Frameworks/PROJ.framework/unix/bin:$PATH
Download the source for the rgdal package from CRAN at http://cran.r-project.org/web/packages/rgdal/index.html.
Open R and build/install the rgdal package. Note that we have to specify the locations of some of the stuff we just installed.
install.packages('~/Downloads/rgdal_0.7-1.tar.gz', repos=NULL, type='source', configure.args=c('--with-proj-include=/Library/Frameworks/PROJ.framework/unix/include', '--with-proj-lib=/Library/Frameworks/PROJ.framework/unix/lib'))
This installs fine on my Mac OS X 10.6. Good luck!
So the basic idea with your data might be:
library(rgdal)
library(maptools)
# Load KML coordinates
coords = getKMLcoordinates('data.kml', ignoreAltitude=TRUE)
coords = SpatialPoints(do.call(rbind, coords), CRS('+proj=longlat'))  # combine the list of track segments into one coordinate matrix
# Load US Maps (get from www.gadm.org)
load('USA_adm1.RData')
hawaii = gadm[gadm$NAME_1 == 'Hawaii', ]
# Transform coordinates
hawaii.proj = spTransform(hawaii, CRS('+init=epsg:2784 +units=km'))
coords.proj = spTransform(coords, CRS('+init=epsg:2784 +units=km'))
# Plot
dev.new(width=4, height=4)
plot(hawaii.proj, axes=T, xlim=c(450,550), ylim=c(0,60))
points(coords.proj, pch=16, col='red')
Great place to live!
Once you've read something into an "sp" class object (probably a SpatialLinesDataFrame here) using readOGR from rgdal, you can transform it to a cartesian system from lat-long with the spTransform function.
Which system to transform it into depends on where on the earth it is. There's a bunch of standard transforms that depend on longitude called 'UTM' zones (Universal Transverse Mercator). Simply look up the zone for your longitude, find the EPSG code, and fire up spTransform.
For the UK, there's a standard grid system called the Ordnance Survey Grid, which has an EPSG code of 27700. So to transform something in lat-long (EPSG:4326) to OSGB metres, I do:
mapOS = spTransform(mapLL, CRS("+init=epsg:27700"))
There's lots of examples in the help for spTransform.
Note this is all great only if your data are on a small part of the earth...
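Putting the pieces together for this particular track, here is a hedged sketch under a couple of assumptions: the track's longitude (about -157.8) falls in UTM zone 4N, and the local origin for the km grid is arbitrarily taken as the first vertex of the track.
library(sp)
library(rgdal)
library(maptools)
spa <- getKMLcoordinates("Perimeter_Track.kml", ignoreAltitude = TRUE)
pts <- SpatialPoints(do.call(rbind, spa), CRS("+proj=longlat +datum=WGS84"))  # combine the four track segments
pts_utm <- spTransform(pts, CRS("+proj=utm +zone=4 +datum=WGS84 +units=km"))  # project to UTM zone 4N, in km
xy <- coordinates(pts_utm)
origin <- xy[1, ]                  # hypothetical choice of origin: the first vertex of the track
xy_local <- sweep(xy, 2, origin)   # x/y distances in km from that origin
head(xy_local)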
