I am trying to crop a shapefile containing rivers and streams (sf.streams) to the extent of an AOI shapefile (shp.AOI) that I read in earlier. I haven't found a tutorial on the web explaining how to do this, so any help would be appreciated. I attached some code below that did not work; I'm new to R and to the sf package, and I don't have any formal R training and am learning as I go, so sorry if this is a simple question or if I am way off. I'm also not sure whether I should be cropping or masking; I'm confused about the difference between the two. What I want to do is remove any data outside the AOI to save computing power and time, because the datasets I am using are very large. Thanks!
shp.AOI <- readOGR(dsn="InputData/GIS/AOI", layer="AOI") %>%
spTransform(., crs.NAD83.UTM.Z10) %>%
tidy(.)
sf.streams <-
sf::st_read(file.path("InputData", "GIS", "Streams","Preprocessed","Rivers.shp"),
stringsAsFactors=F, crs=crs.NAD83.UTM.Z10) %>%
st_transform(.,aoi=shp.AOI)
I also tried:
sf.streams <-
sf::st_read(file.path("InputData", "GIS", "Streams","Preprocessed","Rivers.shp"),
stringsAsFactors=F, crs=crs.NAD83.UTM.Z10) %>%
st_crop(.,aoi=shp.AOI)
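For what it's worth, here is a minimal sf-only sketch of one way to do this. It assumes crs.NAD83.UTM.Z10 holds a valid CRS (e.g. an EPSG code or proj4 string) and that the AOI shapefile sits at the path implied by the dsn/layer above; st_crop() keeps everything inside the AOI's rectangular extent, while st_intersection() clips to the AOI boundary itself (the "mask"-like version).
library(sf)
library(dplyr)

# read the AOI with sf and put it in the working CRS
sf.AOI <- st_read(file.path("InputData", "GIS", "AOI", "AOI.shp")) %>%
  st_transform(crs.NAD83.UTM.Z10)

# read the streams and transform them to the same CRS
sf.streams <- st_read(file.path("InputData", "GIS", "Streams", "Preprocessed", "Rivers.shp"),
                      stringsAsFactors = FALSE) %>%
  st_transform(crs.NAD83.UTM.Z10)

# crop: keep features inside the AOI's bounding box (fast, good for trimming big data)
streams.crop <- st_crop(sf.streams, sf.AOI)

# clip/"mask": keep only the parts of the streams that fall inside the AOI polygon
streams.clip <- st_intersection(sf.streams, sf.AOI)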
I have a problem getting raster data into the right orientation. The original raster data, when imported into R, looks like this [image].
I tried using the transpose function in raster, but it didn't work. The transposed data looks like this [image].
I used the code below. Any help or advice would be greatly appreciated. Also, is there a way to apply the solution to the entire stack (all rasters have the same extent)? Thank you.
f_PM <- list.files(path=".",
pattern='tif$', full.names=TRUE)
s_PM <- stack(f_PM) ## create a stack of the rasters
plot(s_PM[[1]]) ##check the orientation
PM <- s_PM[[1]] ## pick one raster and try to change the orientation
PM2 <- t(PM)
plot(PM2)
You need to flip then transpose:
plot(m)          # m is the mis-oriented raster, e.g. s_PM[[1]]
r <- t(flip(m))  # flip, then transpose
plot(r)
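To apply the same fix to every layer of the stack from the question, one untested sketch (assuming the same flip direction works for all layers; swap direction = 'x' if it does not) is to loop over the layers and re-stack:
# flip + transpose each layer of s_PM, then rebuild the stack
s_PM_fixed <- stack(lapply(1:nlayers(s_PM),
                           function(i) t(flip(s_PM[[i]], direction = 'y'))))
plot(s_PM_fixed[[1]])   ## check the first layer again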
Note there's https://gis.stackexchange.com where spatial questions like this get asked. (Too much noise here to help with most spatial stuff).
I was assigned the task of clipping a raster from a .nc file to the extent of a .tif file.
edit (from comment):
I want to extract temperature information from the .nc file because I need to check the yearly mean temperature of a specific region. To be comparable, the comparison has to occur over exactly the same area. The .nc file is larger than the previously checked area, so I need to "clip" it to the extent of a .tif I have. The .tif data is 0/1 valued; where it is 0 (or where the .tif is smaller than the .nc), the .nc data should be clipped away. In the end I want to keep the .nc data, but at the extent of the .tif, while still retaining its resolution and projection. (The .tif and .nc have different projections and pixel sizes.)
Now, ordinarily that wouldn't be a problem, as I could use raster::crop. However, crop doesn't deal with different projections and different pixel sizes/resolutions. (I still used it to generate an approximation, but it is not precise enough for the final information, as can be seen in the code snippet below.) The obvious way to generate a more reliable raster set would be to first homogenize the datasets with something like raster::projectRaster or sp::spTransform (the spTransform part was added in an edit to the original question), but this approach takes too much time, as I have to do it for quite a few .nc files.
I was told the best method would be to generate a normalized matrix from the smaller raster "clip_frame" and then just multiply it with the "nc_to_clip" raster. Doing so should prevent any errors through map projections or other factors. This makes a lot of sense to me in theory, but I have no idea how to do it in practice. I would be very grateful for any kind of hint, code snippet, or other help.
I have looked at similar problems on StackOverflow (and other sites) like:
convert matrix to raster in R
Convert raster into matrix with R
https://www.researchgate.net/post/Hi_Is_there_a_way_to_multiply_Raster_value_by_Raster_Latitude
As I am not even sure how to frame the question correctly, I might have overlooked an answer to this problem, if so please point me there!
My (working) code so far, just to give you an idea of how I want to approach the topic (here using the crop function):
# library(ncdf4)
library(raster)
library(rgdal)
library(tidyverse)

nc_list <- list.files(pattern = ".*0.nc$")              # list of .nc files containing raster and temperature information
# nc_to_clip <- lapply(nc_list, raster, varname = "GST")  # read them all in as rasters
nc_to_clip <- raster("ABC.nc", varname = "GST")         # read a single .nc in as a raster
clip_frame <- raster("XYZ.tif")                         # read in .tif for further use as frame
mean_temp_from_raster <- function(input_clip_raster, input_clip_frame){ # input_clip_raster = raster to clip, input_clip_frame = clipping frame
  r2_coord <- rasterToPoints(input_clip_raster, spatial = TRUE)  # step 1: get the points/coordinates of the raster to clip
  map_clip <- crop(input_clip_raster, extent(input_clip_frame))  # use crop to cut the input_clip_raster (this being the function I have to extend on)
  temp <- raster::extract(map_clip, r2_coord@coords)             # step 2: extract the values at those coordinates
  temp_C <- temp * 0.01 - 273.15                                 # convert kelvin*100 to celsius
  temp_C <- na.omit(temp_C)
  return_list <- list(map_clip, mean(temp_C))
  return(return_list)
}
mean_tempC <- lapply(nc_to_clip, mean_temp_from_raster, clip_frame)
Thanks!
PS:
I don't have much experience working with .nc files and/or RasterLayers in R as I used to work with ArcGIS/Python (arcpy) for problems like this, which is not an option right now.
Perhaps something like this?
library(raster)
nc <- raster(ABC.nc, vername="GST)
clip <- raster("XYZ.tif")
x <- as(extent(clip), "SpatialPolygons")
crs(x) <- crs(clip)
y <- sp::spTransform(x, crs(nc))
clipped <- crop(nc, y)
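If the "multiply by a 0/1 mask" idea from the question is still wanted after the crop, here is a rough, untested sketch of one reading of it: resample the 0/1 .tif onto the cropped .nc grid and multiply, so the .nc keeps its own resolution and projection.
# project/resample the 0/1 frame onto the grid of the cropped .nc raster;
# "ngb" (nearest neighbour) keeps the values as 0/1 rather than interpolating
mask01 <- projectRaster(clip, clipped, method = "ngb")
# turn 0 into NA so that multiplying removes those cells
mask01[mask01 == 0] <- NA
# keep .nc values only where the .tif is 1
clipped_masked <- clipped * mask01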
I'm trying to apply a coordinate reference system to a dataset I'm using, but am unable to do so. I'm not very experienced with this, having followed only a few tutorials, so it's possible it's a really simple fix. My error is:
Error in UseMethod("st_crs<-") :
  no applicable method for 'st_crs<-' applied to an object of class "c('tbl_df', 'tbl', 'data.frame')"
The full dataset is below and should run in any R as I'm pulling the dataset from online, thanks :) I'm hoping to use plot(wildlife) to see the dataset after applying the crs.
library(tidyverse) #lots
library(dplyr) #pipes
library(ggplot2)#plots
library(sf) #maps
wildlife <- st_read("https://opendata.arcgis.com/datasets/07c7d3a8031b401d80feb16512a659d5_13.geojson") #pulling geojson data from online.
wildlife <- tibble(wildlife)
glimpse(wildlife)
wildlife <- wildlife %>%
select(site_no, geometry)
st_crs(wildlife) <- 4326 #this line results in an error
st_crs(wildlife)
Sorry, but it's somewhat hard to interpret exactly what you are asking when you say: "the full dataset is below and should run in any R as I'm pulling the dataset from online, thanks :). "
If you are asking whether this dataset is somehow damaged and cannot be plotted, then I can say no: this dataset reads and plots without issue. It's hard to guess what is wrong with your code or what your intended goals are, but the code for simply reading the file and plotting the data follows:
Get Data:
wildlife <- st_read("https://opendata.arcgis.com/datasets/07c7d3a8031b401d80feb16512a659d5_13.geojson")
Explore Data:
glimpse(wildlife) # dim 254 x 8
head(wildlife, 3)
Plot Data:
plot(wildlife$geometry)
If you wanted to apply this feature to this data, the following code works without error:
st_crs(wildlife) <- 4326
The plotted data can be viewed from this link: [plot].
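For what it's worth, the error quoted in the question most likely comes from the wildlife <- tibble(wildlife) line: wrapping an sf object in tibble() drops the sf class, so st_crs<- has no method for the resulting plain tibble. A minimal sketch that keeps the sf class while still using dplyr verbs:
library(sf)
library(dplyr)

wildlife <- st_read("https://opendata.arcgis.com/datasets/07c7d3a8031b401d80feb16512a659d5_13.geojson")

# select() on an sf object keeps the geometry column ("sticky" geometry),
# and the object stays class "sf"
wildlife <- wildlife %>%
  select(site_no)

# now dispatches to the sf method; note that if the file already has a CRS,
# this only relabels it rather than reprojecting
st_crs(wildlife) <- 4326
plot(wildlife)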
I'm trying to plot trips between zipcodes in R. Specifically, I'm trying to create an interactive where you can click on each zipcode, and see the other zipcodes colored according to how many people traveled from the zip you clicked on to the other zipcodes. Sort of like this: https://www.forbes.com/special-report/2011/migration.html
But less fancy; just showing "out-migration" would be super.
I've been messing with this in R using the leaflet package, but I haven't managed to figure it out. Could someone with better R skills help me out? Any insight would be much appreciated.
I've downloaded a shapefile of zipcodes in LA county from here:
https://data.lacounty.gov/Geospatial/ZIP-Codes/65v5-jw9f
Then I used the code below to create some toy data.
You can find the zipcode shapefiles here:
https://drive.google.com/file/d/0B2a3BZ6nzEGJNk55dmdrdVI2MTQ/view?usp=sharing
And you can find the toy data here:
https://drive.google.com/open?id=0B2a3BZ6nzEGJR29EOFdjR1NPR3c
Here's the code I've got so far:
require(rgdal)
setwd("~/Downloads/ZIP Codes")
# Read SHAPEFILE.shp from the current working directory (".")
shape <- readOGR(dsn = ".", layer = "geo_export_89ff0f09-a580-4844-988a-c4808d510398")
plot(shape) #Should look like zip codes in LA county
#get a unique list of zipcodes
zips <- as.numeric(as.character(unique(shape@data$zipcode)))
#create a dataframe with all the possible combination of origin and destination zipcodes
zips.df <- data.frame(expand.grid(as.character(zips),as.character(zips)), rpois(96721,10))
#give the dataframe some helpful variable names
names(zips.df) <- c("origin_zip", "destination_zip","number_of_trips")
Like I said, any help would be much appreciated. Thanks!
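As a starting point, here is a hedged, untested sketch (using the shapefile and toy data above) of colouring destination zips by the number of trips out of a single, hard-coded origin zip; a true click-to-update version would need something like Shiny on top of this. The zip "90001" is just an assumed example value.
library(leaflet)
library(sp)

# pick one origin zip to visualise out-migration from
origin <- "90001"
trips  <- zips.df[zips.df$origin_zip == origin, ]

# leaflet expects longitude/latitude, so reproject the shapefile first
shape_ll <- spTransform(shape, CRS("+proj=longlat +datum=WGS84"))
shape_ll@data$trips <- trips$number_of_trips[
  match(as.character(shape_ll@data$zipcode), as.character(trips$destination_zip))]

pal <- colorNumeric("YlOrRd", domain = shape_ll@data$trips)

leaflet(shape_ll) %>%
  addTiles() %>%
  addPolygons(fillColor = ~pal(trips), fillOpacity = 0.7,
              weight = 1, color = "grey",
              label = ~paste0(zipcode, ": ", trips, " trips"))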
I have read so many threads and articles and I keep getting errors. I am trying to make a choropleth map of the world using data I have from the Global Terrorism Database. I want to color countries by nkill (number killed) or just by the number of attacks in that country; I don't care which at this point. There are so many countries with data that ordinary plots are unreasonable for showing it.
Help is strongly appreciated, and if I did not ask this correctly I sincerely apologize; I am learning the rules of this website as I go.
My code (so far):
library(maps)
library(ggplot2)
map("world")
world<- map_data("world")
gtd<- data.frame(gtd)
names(gtd)<- tolower(names(gtd))
gtd$country_txt<- tolower(rownames(gtd))
demo<- merge(world, gts, sort=FALSE, by="country_txt")
In the gtd data frame, the column with country names is "country_txt", so I thought I would use that, but I get: error in fix.by(by.x, x) : 'by' must specify a uniquely valid column
If that were to work, I would then plot it as I have seen on a few websites.
I have honestly been working on this for so long, and I have read so much code, so many similar questions, websites, R handbooks, etc. I will gladly accept that I am incompetent when it comes to R in exchange for some help.
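A quick hedged aside on the error itself: map_data("world") has no country_txt column (its country-name column is called region), which is why merge() complains that 'by' must specify a valid column. A minimal sketch of getting that particular merge to run, assuming the country spellings in the two sources actually line up:
library(ggplot2)

world <- map_data("world")

# give both data frames a shared key; map_data("world") stores country names in "region"
gtd$region   <- tolower(gtd$country_txt)
world$region <- tolower(world$region)

# in practice you would aggregate gtd to one row per country first,
# as the answers below do with aggregate()
demo <- merge(world, gtd, by = "region", sort = FALSE)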
Something like this? This is a solution using rgdal and ggplot. I long ago gave up on using base R for this type of thing.
library(rgdal) # for readOGR(...)
library(RColorBrewer) # for brewer.pal(...)
library(ggplot2)
setwd(" < directory with all files >")
gtd <- read.csv("globalterrorismdb_1213dist.csv")
gtd.recent <- gtd[gtd$iyear>2009,]
gtd.recent <- aggregate(nkill~country_txt,gtd.recent,sum)
world <- readOGR(dsn=".",
layer="world_country_admin_boundary_shapefile_with_fips_codes")
countries <- world@data
countries <- cbind(id=rownames(countries),countries)
countries <- merge(countries,gtd.recent,
by.x="CNTRY_NAME", by.y="country_txt", all.x=T)
map.df <- fortify(world)
map.df <- merge(map.df,countries, by="id")
ggplot(map.df, aes(x=long,y=lat,group=group)) +
geom_polygon(aes(fill=nkill))+
geom_path(colour="grey50")+
scale_fill_gradientn(name="Deaths",
colours=rev(brewer.pal(9,"Spectral")),
na.value="white")+
coord_fixed()+labs(x="",y="")
There are several versions of the Global Terrorism Database. I used the full dataset available here, and then subsetted for year > 2009. So this map shows total deaths due to terrorism, by country, from 2010-01-01 to 2013-01-01 (the last data available from this source). The files are available as MS Excel download, which I converted to csv for import into R.
The world map is available as a shapefile from the GeoCommons website.
The tricky part of making choropleth maps is associating your data with the correct polygons (countries). This is generally a four-step process:
1. Find a field in the shapefile attribute table that maps (no pun intended) to a corresponding field in your data. In this case, the field "CNTRY_NAME" in the shapefile maps to the field "country_txt" in the gtd database.
2. Create an association between polygon IDs (stored in the row names of the attribute table) and the CNTRY_NAME field.
3. Merge the result with your data using CNTRY_NAME and country_txt.
4. Merge the result of that with the data frame created using fortify(map); this associates polygons with deaths (nkill).
Building on the nice work by @jlhoward, you could instead use rworldmap, which already has a world map in R and has functions to aid joining data to the map. The default map is deliberately low resolution to create a 'cleaner' look. The map can be customised (see the rworldmap documentation), but here is a start:
library(rworldmap)
# 3 lines from @jlhoward
gtd <- read.csv("globalterrorismdb_1213dist.csv")
gtd.recent <- gtd[gtd$iyear>2009,]
gtd.recent <- aggregate(nkill~country_txt,gtd.recent,sum)
#join data to a map
gtdMap <- joinCountryData2Map( gtd.recent,
nameJoinColumn="country_txt",
joinCode="NAME" )
mapDevice('x11') #create a world shaped window
#plot the map
mapCountryData( gtdMap,
nameColumnToPlot='nkill',
catMethod='fixedWidth',
numCats=100 )
Following a comment from @hk47, you can also add points to the map, sized by the number of casualties.
deaths <- subset(x=gtd, nkill >0)
mapBubbles(deaths,
nameX='longitude',
nameY='latitude',
nameZSize='nkill',
nameZColour='black',
fill=FALSE,
addLegend=FALSE,
add=TRUE)