"abline" doesn't work after "plot" when inside "with" - r

I want to create a scatterplot and draw the regression line for a subset of a dataset. To give a reproducible example I'll use the CO2 dataset.
I tried this but the regression line doesn't appear for some reason
with(subset(CO2,Type=="Quebec"),plot(conc,uptake),abline(lm(uptake~conc)))
What is the correct way to give a command like this? Can I do it with a one-liner?

You need to provide both your lines of code as a single R expression. The abline() call is being taken as a subsequent argument to with(), i.e. the ... argument. This is documented as a means to pass arguments on to future methods, but the end result is that it is effectively a black hole for this part of your code.
There are two options: i) keep it on one line, but wrap the two expressions in { and } and separate them with ;:
with(subset(CO2,Type=="Quebec"), {plot(conc,uptake); abline(lm(uptake~conc))})
Or ii) spread the expression out over several lines, still wrapped in { and }:
with(subset(CO2, Type == "Quebec"), {
  plot(conc, uptake)
  abline(lm(uptake ~ conc))
})
Edit: To be honest, if you are doing things like this you are missing out on the advantages of doing the subsetting via R's model formulae. I would have done this as follows:
plot(uptake ~ conc, data = CO2, subset = Type == "Quebec")
abline(lm(uptake ~ conc, data = CO2, subset = Type == "Quebec"), col = "red")
The with() is just causing you to obfuscate your code with braces and ;.
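If repeating the data and subset arguments bothers you, a small variation on the same idea (just a sketch) is to take the subset once and reuse it:
quebec <- subset(CO2, Type == "Quebec")
plot(uptake ~ conc, data = quebec)
abline(lm(uptake ~ conc, data = quebec), col = "red")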

From ?with: with evaluates expr in a local environment created using data. You're passing abline() via ..., so it never gets evaluated as part of that expression. You need to do something like this:
with(subset(CO2,Type=="Quebec"),{plot(conc,uptake);abline(lm(uptake~conc))})

Gavin and Joshua offer good solutions to your immediate problem; here's the equivalent plot using ggplot:
library(ggplot2)
qplot(conc, uptake, data = CO2[CO2$Type == "Quebec" , ]) + stat_smooth(method = "lm", se = FALSE)
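In more recent ggplot2 releases qplot() is deprecated, so here is a sketch of the same plot in the ggplot() style (assuming a current ggplot2):
library(ggplot2)
ggplot(CO2[CO2$Type == "Quebec", ], aes(conc, uptake)) +
  geom_point() +
  geom_smooth(method = "lm", se = FALSE)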

How do I use the group argument for the plot_summs() function from the jtools package?

I am plotting my coefficient estimates using the function plot_summs() and would like to divide my coefficients into two separate groups.
The function plot_summs() has a groups argument; however, when I try to use it as explained in the documentation, I get neither the expected result nor an error. Can someone give me an example of how to use this argument, please?
This is the code I currently have:
plot_summs(model.c, scale = TRUE,
           groups = list(pane_1 = c("AQI_average", "temp_yearly"),
                         pane_2 = c("rain_1h_yearly", "snow_1h_yearly")),
           coefs = c("AQI Average" = "AQI_average",
                     "Temperature (in Farenheit)" = "temp_yearly",
                     "Rain volume in mm" = "rain_1h_yearly",
                     "Snow volume in mm" = "snow_1h_yearly"))
The image below is what I get as a result. What I would like is two separate panes: one that includes "AQI_average" and "temp_yearly", and another with "rain_1h_yearly" and "snow_1h_yearly". Even though I use the groups argument, I do not get this.
Output of my code
By minimal reproducible example, markus is referring to a piece of code that enables others to reproduce the issue you describe on their own computers, as explained in the link they provided.
To me, it seems the problem is that the groups argument does not work in plot_summs - it seems someone here has also pointed this out.
If plot_summs is replaced by plot_coefs, the groups argument works for me. However, the scale argument is then not available. A workaround might be:
r <- lm(Sepal.Length ~ Sepal.Width + Petal.Length + Petal.Width, data = iris)
y <- plot_summs(r, scale = TRUE) #Plot for scaled version
t <- plot_coefs(r, # Plot for unscaled version, but with faceting
                groups = list(
                  pane_1 = c("Sepal.Width", "Petal.Length"),
                  pane_2 = c("Petal.Width"))) + theme_linedraw()
y$data$group <- t$data$group #Add faceting column to data for the plot
t$data <- y$data #Replace the data with the scaled version
t
I hope this is what you meant!

How to control plot layout for lmerTest output results?

I am using lme4 and lmerTest to run a mixed model and then use backward variable elimination (step) for my model. This seems to work well. After running the 'step' function in lmerTest, I plot the final model. The 'plot' results appear similar to ggplot2 output.
I would like to change the layout of the plot. The obvious answer is to do it manually myself, creating the plot(s) from scratch with ggplot2. If possible, though, I would like to simply change the layout of the output, so that each plot (i.e. each plotted dependent variable in the final model) is in its own row.
See the code and plot below for my results. Note the plot has three columns, and I would like three rows. Also, I have not provided sample data (let me know if I need to!).
library(lme4)
library(lmerTest)
# Full model
Female.Survival.model.1 <- lmer(Survival.Female ~ Location + Substrate + Location:Substrate + (1|Replicate), data = Transplant.Survival, REML = TRUE)
# lmerTest - backward stepwise elimination of dependent variables
Female.Survival.model.ST <- step(Female.Survival.model.1, reduce.fixed = TRUE, reduce.random = FALSE, ddf = "Kenward-Roger" )
Female.Survival.model.ST
plot(Female.Survival.model.ST)
The function that creates these plots is called plotLSMEANS. You can look at the code for the function via lmerTest:::plotLSMEANS. The reason to look at the code is 1) to verify that, indeed, the plots are based on ggplot2 code and 2) to see if you can figure out what needs to be changed to get what you want.
In this case, it sounds like you'd want facet_wrap to have one column instead of three. I tested with the example from the lmerTest step() help page, and it looks like you can simply add a new facet_wrap layer to the plot.
library(ggplot2)
plot(Female.Survival.model.ST) +
facet_wrap(~namesforplots, scales = "free", ncol = 1)
Try this: plot(difflsmeans(Female.Survival.model.ST$model, test.effs = "Location "))

How can you use ggplot to superimpose many plots of related functions in an automatic way?

I have a family of functions that are all the same except for one adjustable parameter, and I want to plot all of these functions on one set of axes, superimposed on one another. For instance, this could be sin(n*x) with various values of n, say 1:30, and I don't want to have to type out each command individually -- I figure there should be some way to do it programmatically.
library(ggplot2)
Define trig functions as a function of frequency: sin(x), sin(2x), sin(3x), etc.
trigf <- function(i)(function(x)(sin(i*x)))
Superimpose two function plots -- this works manually of course
ggplot(data.frame(x=c(0,pi)), aes(x)) + stat_function(fun=trigf(1)) + stat_function(fun=trigf(2))
Now try to generalize -- my idea was to make a list of the stat_functions using lapply:
plotTrigf <- lapply(1:5, function(i)(stat_function(fun=function(x)(sin(i*x))) ))
Try using the elements of the list manually -- but it doesn't really work: only the i=5 plot is shown, and I'm not sure why, since that's not what I referenced:
ggplot(data.frame(x=c(0,pi)), aes(x)) +plotTrigf[[1]] + plotTrigf[[2]]
I thought this Reduce might handle the 'generalized sum' to add to a ggplot(), but it doesn't work -- it complains of a non-numeric argument to binary operator:
Reduce("+", plotTrigf)
So I'm kind of stuck both in executing this strategy, or perhaps there's some other way to do this.
Are you using an R version < 3.2? The problem is that you actually need to evaluate your i parameter in your lapply call. Right now it's left as a promise and doesn't get evaluated until you try to plot, and at that point i has the last value it had in the lapply loop, which is 5. Use:
plotTrigf <- lapply(1:5, function(i) {force(i);stat_function(fun=function(x)(sin(i*x))) })
You can't just add stat_function calls together; even without Reduce() you get the error
stat_function(fun=sin) + stat_function(fun=cos)
# Error in stat_function(fun = sin) + stat_function(fun = cos) :
# non-numeric argument to binary operator
You need to add them to a ggplot object. You can do this with Reduce() if you just specify the init= parameter:
Reduce("+", plotTrigf, ggplot(data.frame(x=c(0,pi)), aes(x)))
And actually the special + operator for ggplot objects allows you to add a list of objects so you don't even need the Reduce at all (see code for ggplot2:::add_ggplot)
ggplot(data.frame(x=c(0,pi)), aes(x)) + plotTrigf
The final result is a single plot with all five sine curves superimposed.
You need to use force in order to make sure the parameter is evaluated at the right time. It's a very useful technique and a common source of confusion in loops; you should read about it in Hadley's book: http://adv-r.had.co.nz/Functions.html
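As a minimal sketch of the capture problem force() addresses (plain closures, nothing ggplot-specific; as the other answer notes, R >= 3.2 already forces arguments inside lapply(), so the explicit force() matters mostly for older versions):
# Closures created in a for loop all share one environment,
# so they all see the final value of i:
fs <- list()
for (i in 1:3) fs[[i]] <- function(x) i * x
fs[[1]](10)  # 30, not 10

# force(i) inside lapply() pins each closure to its own copy of i:
fs <- lapply(1:3, function(i) { force(i); function(x) i * x })
fs[[1]](10)  # 10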
To solve your question: you just need to add force(i) when defining all the plots, inside the lapply function, before making the call to stat_function. Then you can use Reduce or any other method to combine them. Here's a way to combine the plots using lapply (note that I'm using the <<- operator which is discouraged)
p <- ggplot(data.frame(x=c(0,pi)), aes(x))
lapply(plotTrigf, function(x) {
p <<- p + x
return()
})

How can I handle R CMD check "no visible binding for global variable" notes when my ggplot2 syntax is sensible?

EDIT: Hadley Wickham points out that I misspoke. R CMD check is throwing NOTES, not Warnings. I'm terribly sorry for the confusion. It was my oversight.
The short version
R CMD check throws this note every time I use sensible plot-creation syntax in ggplot2:
no visible binding for global variable [variable name]
I understand why R CMD check does that, but it seems to be criminalizing an entire vein of otherwise sensible syntax. I'm not sure what steps to take to get my package to pass R CMD check and get admitted to CRAN.
The background
Sascha Epskamp previously posted on essentially the same issue. The difference, I think, is that subset()'s manpage says it's designed for interactive use.
In my case, the issue is not over subset() but over a core feature of ggplot2: the data = argument.
An example of code I write that generates these notes
Here's a sub-function in my package that adds points to a plot:
JitteredResponsesByContrast <- function (data) {
return(
geom_point(
aes(
x = x.values,
y = y.values
),
data = data,
position = position_jitter(height = 0, width = GetDegreeOfJitter(jj))
)
)
}
R CMD check, on parsing this code, will say
granovagg.contr : JitteredResponsesByContrast: no visible binding for
global variable 'x.values'
granovagg.contr : JitteredResponsesByContrast: no visible binding for
global variable 'y.values'
Why R CMD check is right
The check is technically correct. x.values and y.values
Aren't defined locally in the function JitteredResponsesByContrast()
Aren't pre-defined in the form x.values <- [something] either globally or in the caller.
Instead, they're variables within a dataframe that gets defined earlier and passed into the function JitteredResponsesByContrast().
Why ggplot2 makes it difficult to appease R CMD check
ggplot2 seems to encourage the use of a data argument. The data argument, presumably, is why this code will execute
library(ggplot2)
p <- ggplot(aes(x = hwy, y = cty), data = mpg)
p + geom_point()
but this code will produce an object-not-found error:
library(ggplot2)
hwy # a variable in the mpg dataset
Two work-arounds, and why I'm happy with neither
The NULLing out strategy
Matthew Dowle recommends setting the problematic variables to NULL first, which in my case would look like this:
JitteredResponsesByContrast <- function (data) {
x.values <- y.values <- NULL # Setting the variables to NULL first
return(
geom_point(
aes(
x = x.values,
y = y.values
),
data = data,
position = position_jitter(height = 0, width = GetDegreeOfJitter(jj))
)
)
}
I appreciate this solution, but I dislike it for three reasons.
1. It serves no additional purpose beyond appeasing R CMD check.
2. It doesn't reflect intent. It raises the expectation that the aes() call will see our now-NULL variables (it won't), while obscuring the real purpose (making R CMD check aware of variables it apparently wouldn't otherwise know were bound).
3. The problems of 1 and 2 multiply because every time you write a function that returns a plot element, you have to add a confusing NULLing statement.
The with() strategy
You can use with() to explicitly signal that the variables in question can be found inside some larger environment. In my case, using with() looks like this:
JitteredResponsesByContrast <- function (data) {
with(data, {
geom_point(
aes(
x = x.values,
y = y.values
),
data = data,
position = position_jitter(height = 0, width = GetDegreeOfJitter(jj))
)
}
)
}
This solution works. But, I don't like this solution because it doesn't even work the way I would expect it to. If with() were really solving the problem of pointing the interpreter to where the variables are, then I shouldn't even need the data = argument. But, with() doesn't work that way:
library(ggplot2)
p <- ggplot()
p <- p + with(mpg, geom_point(aes(x = hwy, y = cty)))
p # will generate an error saying `hwy` is not found
So, again, I think this solution has similar flaws to the NULLing strategy:
I still have to go through every plot element function and wrap the logic in a with() call
The with() call is misleading. I still need to supply a data = argument; all with() is doing is appeasing R CMD check.
Conclusion
The way I see it, there are three options I could take:
Lobby CRAN to ignore the notes by arguing that they're "spurious" (pursuant to CRAN policy), and do that every time I submit a package
Fix my code with one of two undesirable strategies (NULLing or with() blocks)
Hum really loudly and hope the problem goes away
None of the three make me happy, and I'm wondering what people suggest I (and other package developers wanting to tap into ggplot2) should do.
You have two solutions:
Rewrite your code to avoid non-standard evaluation. For ggplot2, this means using aes_string() instead of aes() (as described by Harlan)
Add a call to globalVariables(c("x.values", "y.values")) somewhere in the top-level of your package.
You should strive for 0 NOTES in your package when submitting to CRAN, even if you have to do something slightly hacky. This makes life easier for CRAN, and easier for you.
(Updated 2014-12-31 to reflect my latest thoughts on this)
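For the second option, a minimal sketch of what that call could look like (the R/globals.R file name is just a convention I'm assuming; the version guard mirrors the one quoted in a later answer, since globalVariables() was added in R 2.15.1):
# R/globals.R
# Declare the column names used inside aes() so R CMD check
# does not report them as undefined global variables.
if (getRversion() >= "2.15.1") utils::globalVariables(c("x.values", "y.values"))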
Have you tried with aes_string instead of aes? This should work, although I haven't tried it:
aes_string(x = 'x.values', y = 'y.values')
This question was asked and answered a while ago, but just for your information: since ggplot2 version 2.1.0 there is another way to get around the notes: aes_(x=~x.values,y=~y.values).
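Applied to the function from the question, that might look roughly like this (a sketch; GetDegreeOfJitter() and jj are taken from the original code):
JitteredResponsesByContrast <- function (data) {
  geom_point(
    # formula-based aes_() keeps R CMD check from seeing bare symbols
    aes_(x = ~x.values, y = ~y.values),
    data = data,
    position = position_jitter(height = 0, width = GetDegreeOfJitter(jj))
  )
}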
In 2019, the best way to get around this is to use the .data prefix from the rlang package, which is also re-exported by ggplot2. This tells R to treat x.values and y.values as columns in a data.frame (so it won't complain about undefined variables).
Note: this works best if you have predefined column names that you know will exist in your data input
#' @importFrom ggplot2 .data
my_func <- function(data) {
ggplot(data, aes(x = .data$x, y = .data$y))
}
EDIT: Updated to import .data from ggplot2 instead of rlang, based on @Noah's comment
If getRversion() >= "3.1.0", you can add a call at the top level of the package:
utils::suppressForeignCheck(c("x.values", "y.values"))
See help("suppressForeignCheck").
Add this line of code to the file in which you provide package-level documentation:
if(getRversion() >= "2.15.1") utils::globalVariables(c("."))
How about using get()?
geom_point(
aes(
x = get('x.values'),
y = get('y.values')
),
data = data,
position = position_jitter(height = 0, width = GetDegreeOfJitter(jj))
)
Because the manual for ?aes_string says
All these functions are soft-deprecated. Please use tidy evaluation
idioms instead (see the quasiquotation section in aes()
documentation).
So I read that page, and came up with this pattern:
ggplot2::aes(x = !!quote(x.values),
y = !!quote(y.values))
It is about as fugly as an IIFE, and it mixes base expressions with tidy bang-bangs. But it does not require the global-variables workaround either, and it doesn't use anything that is deprecated, as far as I can tell. It also seems to work with calculated aesthetics and derived variables like ..count..
