Cannot run map_data("state") - r

I am trying to work on a US map in R. I have done this many times, but this time I get the following error when I try to load the data:
us <- map_data("state")
Error in .C(C_map_type, as.character(mapbase), integer(1)) :
Incorrect number of arguments (2), expecting 0 for ''
I have the ggmap and ggplot2 libraries loaded. Where am I going wrong?

You need the 'maps' package loaded along with ggmap:
library(maps)
library(ggmap)
us <- map_data("state")
This should work.

It looks like there is a bug in the tidyverse that interferes with ggplot2's maps functionality. See this related question.
This works in a clean, freshly-restarted R session:
us <- ggplot2::map_data("state")
However, this does not:
library(tidyverse)
us2 <- ggplot2::map_data("state")
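A likely culprit (an assumption, not verified here) is that another attached package masks the map() function that map_data() relies on internally. Attaching maps explicitly and qualifying the call sidesteps the masking:

```r
# Sketch: attach 'maps' so its map() is available, and call map_data()
# through the ggplot2 namespace so no later package can shadow it.
library(maps)
library(ggplot2)

us <- ggplot2::map_data("state")
head(us)  # columns: long, lat, group, order, region, subregion
```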

Unable to use has_goog_key() in R

I run the simple lines below before starting to use 'ggmap':
install.packages("devtools")
devtools::install_github("dkahle/ggmap")
library(devtools)
library(dplyr)
library(purrr)
library(ggmap)
library(gmapsdistance)
api_key = Sys.getenv("A_GOOGLE_API_KEY")
register_google(key = api_key)
set.api.key(api_key)
has_goog_key()
It returns:
Error in has_goog_key() : could not find function "has_goog_key"
What went wrong, and how can I check whether the given Google API key is valid?
has_goog_key() is a function from a different fork of the ggmap package, which you can install via devtools::install_github("fresques/ggmap"). In the version you're using, the equivalent function is has_google_key().
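For reference, with the dkahle/ggmap version (3.0 or later, an assumption about the installed version), the key helpers look like this:

```r
# Sketch of the key-management helpers in ggmap >= 3.0 (dkahle/ggmap).
library(ggmap)

register_google(key = Sys.getenv("A_GOOGLE_API_KEY"))
has_google_key()  # TRUE once a key has been registered
google_key()      # returns the registered key itself
```

Note that registering a key does not validate it; to test validity you still have to make a request, e.g. geocode("Houston", output = "all"), and inspect the response for an error status.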

Random, default, functions not working in R (freq_terms() yesterday, tokens() today)

I think I'm going crazy here. I'm working on some code for an assignment in R. It's machine learning, so I have a fair amount of data, but I've limited myself to 500 records for now to get it working.
Yet this weekend I ran into something really strange: R suddenly couldn't find its freq_terms() function and told me the function didn't exist.
Since it was only used to make an overview of the most frequent terms, it wasn't a major deal.
Yet today that part of the code (which I didn't change) works again. Hooray! Only now RStudio tells me that tokens() doesn't exist... and that's a big problem, since I need that function a lot!
This is the code and error:
reviewtokens <- tokens(Merged_Kaggle_Data$Merged_Review, what="word", remove_numbers=TRUE, remove_punct=TRUE, remove_symbols=TRUE, remove_hyphens=TRUE)
Error in tokens(Merged_Kaggle_Data$Merged_Review, what = "word", remove_numbers = TRUE, :
could not find function "tokens"
It worked fine on my desktop yesterday evening; today it doesn't work at all, and I didn't change anything in that code.
Library list:
library(biclust)
library(caret)
library(DBI)
library(dendextend)
library(dplyr)
library(e1071)
library(fpc)
library(ggplot2)
library(ggthemes)
library(igraph)
library(irlba)
library(plotrix)
library(qdap)
library(quanteda)
library(randomForest)
library(Rcampdf)
library(RColorBrewer)
library(reshape2)
library(RMySQL)
library(rpart)
library(rpart.plot)
library(RWeka)
library(SnowballC)
library(tm)
library(wordcloud)
library(glmnet)
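With this many packages attached, load order matters: a later library() call can silently mask a function loaded earlier, and which definition wins changes whenever the load order does. A sketch for diagnosing and sidestepping this (assuming quanteda is installed):

```r
library(quanteda)

# Where would a bare call to tokens() resolve? The first hit on the
# search path wins.
find("tokens")

# List every name defined by more than one attached package.
conflicts(detail = TRUE)

# Namespace-qualified calls are immune to masking:
toks <- quanteda::tokens("An example sentence.", what = "word")
```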

conflicting filter command in R

I am using the usual filter command in R. However, when I run it on a data.frame, with something as basic as filter(data, data$entry == some_data), the output is a time series. This is obviously related to the time-series libraries I imported. How can I fix it?
I imported the following libraries:
library(ggplot2)
library(dplyr)
library(zoo)
library(stringi)
library(gridExtra)
library(rCharts)
library(xts)
library(tseries)
library(forecast)
library(curl)
library(vars)
library(astsa)
library(urca)
library(fGarch)
The default filter when you start R is stats::filter, which operates on time series. dplyr should mask it when loaded, so maybe you didn't load dplyr, or another package loaded afterwards masked the dplyr version.
You can always specify the version you need with package::function notation, e.g. dplyr::filter(data, ...). You can also check for conflicts (multiple definitions of the same name) with conflicts().
As a side note, you should not use $ inside dplyr::filter() on the data you pass in; it is built to work with unquoted column names:
filter(data, data$entry == some_data)  # bad
filter(data, entry == some_data)       # good
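A small sketch pulling these points together (the column name entry is taken from the question; the data values are made up for illustration):

```r
library(dplyr)  # masks stats::filter (and stats::lag) on load

df <- data.frame(entry = c("a", "b", "c"))

# Unambiguous, regardless of package load order:
dplyr::filter(df, entry == "a")    # the data.frame filter
stats::filter(1:10, rep(1/3, 3))   # the time-series filter

# List all masked/duplicated names across attached packages:
conflicts(detail = TRUE)
```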

R Leaflet: lines missing when plotting polylines

I have a pretty simple spatial object composed of a bunch of lines. I can plot it in different ways with no problems: QGIS, mapshaper.org, even the standard R plot() function.
But when I plot it with leaflet(), some segments mysteriously disappear, leaving disconnected lines behind.
A reproducible example follows. NOTE: I use a GeoJSON source file here for simplicity. I have also tried saving the lines as an ESRI shapefile, with the same effect: the data plots fine with QGIS, plot(), etc., but not with leaflet().
library(leaflet)
library(rgdal)
download.file("https://www.dropbox.com/s/nij2oa2rp7ijaaj/commuter_rail.geojson?dl=1",
              method = "auto", mode = "wb", destfile = "commuter_rail.json")
commuterLines <- readOGR("commuter_rail.json", "OGRGeoJSON")
# Straight R Plot - Looks good
plot(commuterLines)
# Plot using leaflet - Some lines are missing!
leaflet() %>% addPolylines(data = commuterLines)
UPDATE:
Here's the reproducible example running as a shiny app, hosted at shinyapps.io, and showing the weird leaflet behavior: https://havb.shinyapps.io/leaflet_example/
UPDATE: the problem seems to be a bug in an older version of the leaflet package available from CRAN. Installing the latest development version from Github resolves the issue.
I don't have enough rep to comment, but I tried your code and it worked for me.
Perhaps it has something to do with your local configuration? Have you tried reinstalling the leaflet package?
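Per the asker's update, the fix was to install the development version of leaflet from GitHub (repository name assumed to be rstudio/leaflet) and re-plot:

```r
# Sketch: install the development version of leaflet, then re-run the
# plotting code from the question.
# install.packages("devtools")
devtools::install_github("rstudio/leaflet")

library(leaflet)
leaflet() %>% addPolylines(data = commuterLines)
```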

Automatically load map in R from script on startup error

I've been trying to get this working without any luck (I'm new to R, so I don't really know how to do much and have been following examples). Basically, I have an R script, saved as a .R file, that loads a map when run:
library(maps)
library(maptools)
library(mapdata)
map('worldHires',
c('UK', 'Ireland', 'Isle of Man','Isle of Wight'),
xlim=c(-11,3), ylim=c(49,60.9))
I want this to happen automatically when I open the RGui, without having to load the script and then select "run all". I've read about editing the Rprofile.site file, which I've done, adding this entry:
.First <- function(){
library(maps)
library(maptools)
library(mapdata)
map('worldHires',
c('UK', 'Ireland', 'Isle of Man','Isle of Wight'),
xlim=c(-11,3), ylim=c(49,60.9))
}
However, when I start R, I think it loads the libraries, but then it says:
Error in maptype(database) : could not find function "data"
and no map is produced. It works perfectly fine when I load the script manually and press "run all".
Am I doing something wrong here? Does the .First function only load packages? What would make it work? I've also tried just using source(script location) in the .First function, and that gives the same error.
The problem is that code in .First runs before the default packages (other than base) are attached, so it can only use base functions unless you load the other packages explicitly. In your case, you need to load:
utils, for the data() function;
graphics, for the par() function.
Putting this together gives:
.First <- function() {
library(utils); library(graphics)
library(maps)
library(maptools)
library(mapdata)
map('worldHires',
c('UK', 'Ireland', 'Isle of Man','Isle of Wight'),
xlim=c(-11,3), ylim=c(49,60.9))
}
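The same fix applies to the source() variant mentioned in the question: attach utils and graphics first, then source the script (the path below is a placeholder, not from the original post):

```r
.First <- function() {
  library(utils)     # for data() and friends
  library(graphics)  # for par() and base plotting
  source("C:/scripts/uk_map.R")  # replace with your script's actual path
}
```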
