I've been struggling with ggplot for days. I want to build a map with the different areas in different colors, and add the names of the cities on it. I managed to plot the map with the areas colored differently using the following code:
#require
library(plyr)
library(dplyr)
library(rgdal)
library(ggplot2)
library(ggmap)
#open data
data = read.table("region.txt", header=T, sep="\t", quote="", dec=".")
#open shapefile
mapa <- readOGR(dsn=".",layer="DEPARTEMENT")
#merge dataframe/shapefile
mapa@data$id <- rownames(mapa@data)
mapa@data <- join(mapa@data, data, by="ID_GEOFLA")
mapa.df <- fortify(mapa)
mapa.df <- join(mapa.df, mapa@data, by="id")
plotData <- join(mapa.df, data)
#plot
mapfr <- ggplot(plotData) +
aes(long,lat,group=group,fill=area) +
geom_polygon() +
geom_path(color="NA") +
coord_fixed() +
theme_nothing(legend = TRUE)
I then open the dataset containing the names and the long/lat coordinates of the cities I want to plot on the map I created:
#opendata
points = read.table("cities.txt", header=T, sep="\t", quote="", dec=".")
#add points on the map
mapfr +
geom_point(data = points, aes(x = long, y = lat), color = "black", size = 1)
But my points end up completely outside the map, even though the coordinates are correct. Any idea what I should change to get my points plotted correctly? I know there is a way to do it with the "maps" package, but I'd like to use ggplot.
My datasets are available here
Just in case somebody struggles with the same issue: instead of looking for another shapefile with the coordinates in another format, it is possible to convert the existing coordinates with the spTransform command from the rgdal package. Add this line to the code right after importing the shapefile:
mapa <- spTransform(mapa, CRS("+proj=longlat +datum=WGS84"))
and the coordinates will be given in long/lat.
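For context, a minimal sketch (not part of the original answer) of where that line slots into the workflow from the question, using the same file names; bbox() is just a quick check that the extent is now in degrees and therefore comparable to the city coordinates:
library(rgdal)
#open shapefile and reproject it to long/lat right away
mapa <- readOGR(dsn=".", layer="DEPARTEMENT")
mapa <- spTransform(mapa, CRS("+proj=longlat +datum=WGS84"))
#quick check: the bounding box should now be in degrees, matching the cities file
bbox(mapa)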
I am using an excel sheet for data. One column has FIPS numbers for GA counties and the other is labeled Count with numbers 1 - 5. I have made a map with these values using the following code:
library(usmap)
library(ggplot2)
library(rio)
carrierdata <- import("GA Info.xlsx")
plot_usmap( data = carrierdata, values = "Count", "counties", include = c("GA"), color="black") +
labs(title="Georgia")+
scale_fill_continuous(low = "#56B1F7", high = "#132B43", name="Count", label=scales::comma)+
theme(plot.background=element_rect(), legend.position="right")
I've included the picture of the map I get and a sample of the data I am using. Can anyone help me put the actual Count numbers on each county?
Thanks!
Data
The usmap package is a good source for county maps, but the data it contains is in the format of data frames of x, y co-ordinates of county outlines, whereas you need the numbers plotted in the center of the counties. The package doesn't seem to contain the center co-ordinates for each county.
Although it's a bit of a pain, it is worth converting the map into a formal sf data frame format to give better plotting options, including the calculation of the centroid for each county. First, we'll load the necessary packages, get the Georgia data and convert it to sf format:
library(usmap)
library(sf)
library(ggplot2)
d <- us_map("counties")
d <- d[d$abbr == "GA",]
GAc <- lapply(split(d, d$county), function(x) st_polygon(list(cbind(x$x, x$y))))
GA <- st_sfc(GAc, crs = usmap_crs()@projargs)
GA <- st_sf(data.frame(fips = unique(d$fips), county = names(GAc), geometry = GA))
Now, obviously I don't have your numeric data, so I'll have to make some up, equivalent to the data you are importing from Excel. I'll assume your own carrierdata has a column named "fips" and another called "values":
set.seed(69)
carrierdata <- data.frame(fips = GA$fips, values = sample(5, nrow(GA), TRUE))
So now we left_join our imported data to the GA county data:
GA <- dplyr::left_join(GA, carrierdata, by = "fips")
And we can calculate the center point for each county:
GA$centroids <- st_centroid(GA$geometry)
All that's left now is to plot the result:
ggplot(GA) +
geom_sf(aes(fill = values)) +
geom_sf_text(aes(label = values, geometry = centroids), colour = "white")
I want to create a map of Germany where each state is shaded according to its gross domestic product. I know how to do this in R (and put the code below). Is there a possibility to do this in Julia in an equally simple way?
library(tidyverse)
library(ggplot2)
library(sf)
shpData = st_read("./geofile.shp")
GDPData <- read.delim("./stateGDP.csv", header=FALSE)
GDPData <- rename(GDPData,StateName=V1,GDP=V2)
GDPData %>%
left_join(shpData) ->mergedData
ggplot(mergedData) + geom_sf(data = mergedData, aes(fill = BIP,geometry=geometry)) + coord_sf(crs = st_crs(mergedData))-> pBIP1
You'd load the Shapefile and use Plots to plot it.
The idiomatic code is something like
using Plots, Shapefile, CSV
shp = Shapefile.shapes(Shapefile.Table("geofile.shp"))
GDPData = CSV.read("stateGDP.csv")
plot(shp, fill_z = GDPData.V2')
Note the ' which transposes the values into a row vector - this tells Plots to apply the colors to the individual polygons rather than to data points.
I have two netcdf files:
- air quality values (the z or fill value), recorded for each grid cell (COL, ROW, LAY, TSTEP)
https://drive.google.com/open?id=1SHnmzV4L1Lqjj9XMdIFQ-nQHOGRpnPhB
- longitude and latitude for each grid cell
https://drive.google.com/open?id=1dvsh2Ct2--3Bvcux4EXoRm0ntzAVx9OW
I want to make a tile plot on a map based on that information.
The longitudes and latitudes are not spaced at consistent intervals.
I have tried extracting part of the data and plotting it with ggplot + geom_raster,
but the result is not what I expected:
https://imgur.com/a/2fg33Sp
I also tried using the whole dataset with geom_tile and geom_polygon, but no tiles were plotted:
https://imgur.com/a/Zl1N2jV
library(ncdf4)
#path and file
path <- "C:/Users/jhuang/Documents"
file <- "emis_mole_all_20060801_12US1_cmaq_cb05_tx_C25_2006am.ncf"
GRID <- "GRIDCRO2D_Benchmark"
#pollutant interested
poll <- "SO2"
file1 <- sprintf("%s/%s",path, file)
file2 <- sprintf("%s/%s",path, GRID)
ncin <- nc_open(file1)
gridin <- nc_open(file2)
#extract LAT and LON and also SO2 concentrations for each grid
LAT <- ncvar_get(gridin,"LAT")
LON <- ncvar_get(gridin,"LON")
data <- ncvar_get(ncin,poll)
library(ggplot2)
library(maps)
library(ggmap)
#extract first time step
data_1 <- data[,,1]
#organize all data into one data frame
data_2 <- data.frame(cbind(as.vector(LON),as.vector(LAT),as.vector(data_1)))
colnames(data_2) <- c("LON","LAT","POLL")
us_states <- map_data("state")
ggplot(data = data_2, aes(x=LON,y=LAT,fill=POLL)) +
geom_tile()+
geom_polygon(data=us_states,aes(x=long, y=lat, group=group), colour="black", fill="red", alpha=0)
I expect to see https://imgur.com/a/q9uD0gh. I can use NCL to make this plot; I just wonder whether it is possible to make a similar plot in R.
After searching around a lot, asking, and doing some code, I kinda got the bare minimum for doing kriging in R's gstat.
Using 4 points (I know, totally bad), I kriged the unsampled points located between them. But in actuality, I don't need all of those points. Inside that area, there is a smaller subarea... this area is the one I actually need.
Long story short.. I have measurements taken from 4 weather stations that report rainfall data. The lat and long coordinates for these points are:
lat long
7.16 124.21
8.6 123.35
8.43 124.28
8.15 125.08
My road to kriging can be seen through my previous questions on StackOverflow.
This: Create variogram in R's gstat package
And this: Create Grid in R for kriging in gstat
I know that the image has the following coordinates (at least according to my estimates):
Leftmost: 124 13ish 0 E(DMS)
Rightmost : 124 20ish 0 E
Topmost coordinates: 124 17ish 0 E
Bottommost coordinates: 124 16ish 0 E
Conversion will have to take place for that, but I think that doesn't matter, or it will be easier to deal with later.
The image is also irregular (but aren't they all though).
Think of it like a doughnut: you krige the whole circular shape of the doughnut, but you only need the area covered by the hole, so you remove or at least disregard the values you got from the doughnut itself.
I have an image (.jpg) of the area in question, I will have to convert the image into a shapefile or some other vector format using QGIS or similar software. After that, I will have to insert that vector image inside the 4 point kriged area, so I know which coordinates to actually consider and which ones to remove.
Finally, I take the values of the area covered by the image and store them into a csv or database.
Anybody know how I can start with this? Total noob at R and statistics. Thanks to anyone who responds.
I just want to know if it's possible and, if it is, get some tips. Thanks again.
Might as well also post my script:
suppressPackageStartupMessages({
library(sp)
library(gstat)
library(RPostgreSQL)
library(dplyr) # for "glimpse"
library(ggplot2)
library(scales) # for "comma"
library(magrittr)
library(gridExtra)
library(rgdal)
library(raster)
library(leaflet)
library(mapview)
})
drv <- dbDriver("PostgreSQL")
con <- dbConnect(drv, dbname="Rainfall Data", host="localhost", port=5432,
user="postgres", password="postgres")
day_1 <- dbGetQuery(con, "SELECT lat, long, rainfall FROM cotabato.sample")
coordinates(day_1) <- ~ lat + long
plot(day_1)
x.range <- as.integer(c(7.0,9.0))
y.range <- as.integer(c(123.0,126.0))
grid <- expand.grid(x=seq(from=x.range[1], to=x.range[2], by=0.05),
y=seq(from=y.range[1], to=y.range[2], by=0.05))
coordinates(grid) <- ~x+y
plot(grid, cex=1.5)
points(day_1, col='red')
title("Interpolation Grid and Sample Points")
day_1.vgm <- variogram(rainfall~1, day_1, width = 0.02, cutoff = 1.8)
day_1.fit <- fit.variogram(day_1.vgm, model=vgm("Sph", psill = 8000, range = 1))
plot(day_1.vgm, day_1.fit)
plot1 <- day_1 %>% as.data.frame %>%
ggplot(aes(lat, long)) + geom_point(size=1) + coord_equal() +
ggtitle("Points with measurements")
plot(plot1)
############################
plot2 <- grid %>% as.data.frame %>%
ggplot(aes(x, y)) + geom_point(size=1) + coord_equal() +
ggtitle("Points at which to estimate")
plot(plot2)
grid.arrange(plot1, plot2, ncol = 2)
coordinates(grid) <- ~ x + y
############################
day_1.kriged <- krige(rainfall~1, day_1, grid, model=day_1.fit)
day_1.kriged %>% as.data.frame %>%
ggplot(aes(x=x, y=y)) + geom_tile(aes(fill=var1.pred)) + coord_equal() +
scale_fill_gradient(low = "yellow", high="red") +
scale_x_continuous(labels=comma) + scale_y_continuous(labels=comma) +
theme_bw()
write.csv(day_1.kriged, file = "Day_1.csv")
EDIT: The code has changed since last time, but I guess that doesn't matter. I just want to know if it's possible, and whether anybody can provide the simplest example of it being possible. I can derive the solution to my own problem from there.
Let me know if you find this useful:
"Think of it like a doughnut, you krige the the whole circular shape of the doughnut but you only need the area covered by the hole so you remove or at least disregard the values you got from the doughnut itself."
For this, you load your vector data:
donut <- rgdal::readOGR('/variogram', 'donut')
day_1@proj4string@projargs <- "+proj=longlat +datum=WGS84 +no_defs +ellps=WGS84 +towgs84=0,0,0" # because the donut shape has this CRS
plot(donut, axes = TRUE, col = 3)
plot(day_1, col = 2, pch = 20, add = TRUE)
Then you delete the 'external ring' and keep the inner one. You also indicate that the latter isn't a hole anymore:
hole <- donut # to keep the original shape
hole@polygons[1][[1]]@Polygons[1] <- NULL
hole@polygons[1][[1]]@Polygons[1][[1]]@hole <- FALSE
plot(hole, axes = TRUE, col = 4, add = TRUE)
After that you check which points are inside the new blue 'hole' vector layer:
over.pts <- over(day_1, hole)
day_1_subset <- day_1[!is.na(over.pts$Id), ]
plot(donut, axes = TRUE, col = 3)
plot(hole, col = 4, add = TRUE)
plot(day_1, col = 2, pch = 20, add = TRUE)
plot(day_1_subset, col = 'white', pch = 1, cex = 2, add = TRUE)
write.csv(day_1_subset@data, 'myfile.csv') # write intersected points table
write.csv(as.data.frame(coordinates(day_1_subset)), 'myfile.csv') # write intersected points coords
writeOGR(day_1_subset, 'path', 'mysubsetlayer', driver = 'ESRI Shapefile') # write intersected points shape
With this code you can solve the 'ring' or doughnut 'hole' case if you already have the shapefile.
If you have an image and want to clip it, try the following.
In case you load a raster (get a basemap image from the web):
coordDf <- as.data.frame(coordinates(day_1)) # get basemap from points
# coordDf <- data.frame(hole@polygons[1][[1]]@Polygons[1][[1]]@coords) # get basemap from hole
colnames(coordDf) <- c('x', 'y')
imag <- dismo::gmap(coordDf, lonlat = TRUE)
myimag <- raster::crop(day_1.kriged, hole)
plot(myimag)
plot(day_1, add = TRUE, col = 2)
In case you use day_1.kriged:
myCropKrig<- raster::crop(day_1.kriged, hole)
myCropKrig %>% as.data.frame %>%
ggplot(aes(x=x, y=y)) + geom_tile(aes(fill=var1.pred)) + coord_equal() +
scale_fill_gradient(low = "yellow", high="red") +
scale_x_continuous(labels=comma) + scale_y_continuous(labels=comma) +
geom_point(data=coordDf[!is.na(over.pts$Id), ], aes(x=x, y=y), color="blue", size=3, shape=20) +
theme_bw()
And "Finally, I take the values of the area covered by the image and store them into a csv or database."
write.csv(as.data.frame(myCropKrig), 'myCropKrig.csv')
Hope you find this useful and that it answers what you meant.
To simplify your question:
You want to delineate an area based on an image that is not georeferenced.
You want to extract the results of an interpolation only for this area.
A few steps are required:
You need to use QGIS to georeference your image (Raster > Georeferencer). You need a georeferenced map in the background to help. This creates a raster object with spatial information.
There are two possibilities.
2.a. The central part of your image has a color that can be used directly as a mask in R (e.g. all green pixels in the middle of red pixels).
2.b. If not, you need to use QGIS to manually delineate a polygon of the area (Layer > Create Layer > New Shapefile > Polygon).
Import your raster or polygon shapefile into R.
Use the function raster::mask to extract the values of your interpolation using the raster image or the SpatialPolygon (a sketch follows below).
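A minimal sketch of that last step, assuming day_1.kriged is the kriging output from the question and "my_area" is a hypothetical polygon layer digitised in QGIS (adjust dsn/layer to your own files):
library(raster)
library(rgdal)
#convert the kriging predictions (regular 0.05-degree grid) into a RasterLayer
krig_raster <- rasterFromXYZ(as.data.frame(day_1.kriged)[, c("x", "y", "var1.pred")])
#read the polygon drawn in QGIS (hypothetical layer name)
area_poly <- readOGR(dsn = ".", layer = "my_area")
#keep only the cells that fall inside the polygon
krig_masked <- mask(krig_raster, area_poly)
plot(krig_masked)
#store the remaining cell centres and predicted values in a csv
write.csv(as.data.frame(krig_masked, xy = TRUE, na.rm = TRUE),
          "masked_kriging.csv", row.names = FALSE)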
I've got a shapefile (SpatialLinesDataFrame) containing all streets in Cologne, which can be downloaded from here. I merged its @data with data from an external source. How can I plot these streets (if possible on a Google map using ggmap), so that every street has a different colour (or thickness), depending on its individual value?
So far, I have done this:
shapefile <- readOGR(shapfile, "Strasse", stringsAsFactors=FALSE,
encoding="latin-9")
shp <- spTransform(shapefile, CRS("+proj=longlat +datum=WGS84"))
At this point I add another column to the shp@data data frame, which contains a certain value for each street. Then I fortify the shapefile so it can be plotted using ggplot:
shp$id <- rownames(shp@data)
shp.df <- as.data.frame(shp)
data_fort <- fortify(shp, region = "id")
data_merged <- join(data_fort, shp.df, by="id")
When I use geom_line, the lines do not look good and are not easy to identify:
ggplot(data_merged, aes(x=long, y=lat,
group=group,
colour=values)) +
geom_line()
Here I saw that one could transform the shapefile so that geom_segment (or in this case a modified function "geom_segment2") can be used, but then I would lose my street-specific values.
So this code grabs the 100 longest roads from your shapefile, randomly assigns "values" on (1,10), and plots them with color based on value, on top of a Google raster image of Cologne.
library(ggplot2)
library(ggmap) # for ggmap(...) and get_map(...)
library(rgdal) # for readOGR(...)
library(plyr) # for join(...)
set.seed(1) # for reproducible example
setwd(" <directory with your shapefiles> ")
spl <- readOGR(dsn=".", "Strasse", encoding="latin-9")
spl <- spl[spl$SHAPE_LEN %in% tail(sort(spl$SHAPE_LEN),100),]
shp <- spTransform(spl, CRS("+proj=longlat +datum=WGS84"))
shp.df <- data.frame(id=rownames(shp@data),
values=sample(1:10,length(shp),replace=T),
shp@data, stringsAsFactors=F)
data_fort <- fortify(shp)
data_merged <- join(data_fort, shp.df, by="id")
ggmap(get_map(unlist(geocode("Cologne")),zoom=11))+
geom_path(data=data_merged,size=1,
aes(x=long,y=lat,group=group,color=factor(values)))+
labs(x="",y="")+
theme(axis.text=element_blank(),axis.ticks=element_blank())
It is possible to make the ggmap(...) call simpler using, e.g.,
ggmap(get_map("Cologne"))
but there's a problem: the zoom=... argument is interpreted differently and I wasn't able to zoom the map sufficiently.
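As an untested alternative (not part of the original answer), you can derive the map extent from the data itself with ggmap::make_bbox instead of geocoding a place name and guessing a zoom level; the f argument adds a small margin around the streets:
bbox <- make_bbox(lon = long, lat = lat, data = data_merged, f = 0.05)
ggmap(get_map(location = bbox, maptype = "roadmap")) +
  geom_path(data = data_merged, size = 1,
            aes(x = long, y = lat, group = group, color = factor(values))) +
  labs(x = "", y = "") +
  theme(axis.text = element_blank(), axis.ticks = element_blank())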