problem with anova refit argument: objects must inherit from classes "gls" - r

I am testing whether I should include a certain variance structure in my model, using the likelihood ratio test (LRT) from the anova function. The problem is that when I set the argument refit to FALSE (in fact, it doesn't work with TRUE either), anova returns an error that I don't know how to handle.
> m1 <- lme(Compscore ~ Cond * Time, random = ~1|Participant, data = lyster.long4)
> m4 <- update(m1, weights = varIdent(form = ~ Time | Cond))
> an.res <- anova.lme(m1, m4, refit=FALSE)
Error in anova.lme(m1, m4, refit = FALSE) :
objects must inherit from classes "gls", "gnls", "lm", "lmList", "lme", "nlme", "nlsList", or
"nls"
Without that argument, the function works fine. I'm fitting the models with lme() from the nlme package.
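For reference, here is a minimal sketch of the same comparison using the built-in Orthodont data from nlme (the lyster.long4 data are not shown in the question); Sex and age merely stand in for Cond and Time:
library(nlme)
fm1 <- lme(distance ~ Sex * age, random = ~ 1 | Subject, data = Orthodont)
fm2 <- update(fm1, weights = varIdent(form = ~ 1 | Sex))
anova(fm1, fm2)  # LRT comparing the fits with and without the variance structure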

Related

MuMIn dredge gam error using default na.omit

I have a global model I'm trying to dredge, but I keep getting this error:
Error in dredge(myglobalmod, evaluate = TRUE, trace = 2) :
'global.model' uses 'na.action' = "na.omit"
I tried running the global model with na.action="na.omit" within the gam() call and leaving it out (since it's the default).
library(mgcv)
library(MuMIn)
myglobalmod <- gam(response ~ s(x1) + s(x2) + s(x3) + offset(x4), data = mydata,
                   family = "tw", na.action = "na.omit")
options(na.action = na.omit)
mydredge <- dredge(myglobalmod, evaluate = TRUE, trace = 2)
When I didn't include na.action="na.omit" within the gam, I got a similar error.
I then tried with a subset of the data that has all the NA rows removed, but same error.
I've gotten dredge to work before so I'm not sure why it doesn't like the na.omit now, I'm using the same code.
MuMIn insists that you use na.action = na.fail in order to ensure that the same data set is used for every model: if NA values were left in the data set, different subsets could be used for different models depending on which variables are included. You can use na.omit(mydata) or mydata[complete.cases(mydata), ] to remove rows with NA values before you start (assuming that the NA values in your data set occur only in variables you will be using for the full model).
> library(MuMIn)
> m1 <- lm(mpg ~ ., data = mtcars)
> d0 <- dredge(m1)
Error in dredge(m1) :
'global.model''s 'na.action' argument is not set and options('na.action') is "na.omit"
> m1 <- lm(mpg ~ ., data = mtcars, na.action = na.fail)
> d1 <- dredge(m1)
Fixed term is "(Intercept)"
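Applied to the gam above, a minimal sketch of that fix might look like the following (mydata_cc is just an illustrative name, and it assumes the NA values occur only in the modelled variables):
library(mgcv)
library(MuMIn)
# drop incomplete rows first, then fit with na.fail so dredge() accepts the model
mydata_cc <- mydata[complete.cases(mydata[, c("response", "x1", "x2", "x3", "x4")]), ]
myglobalmod <- gam(response ~ s(x1) + s(x2) + s(x3) + offset(x4),
                   data = mydata_cc, family = "tw", na.action = na.fail)
mydredge <- dredge(myglobalmod, evaluate = TRUE, trace = 2)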

Probing interactions in nlme using the "interactions" package in R

I am running a linear mixed effects model using the "nlme" package, looking at stress and lifestyle as predictors of change in cognition over 4 years in a longitudinal dataset. All variables in the model are continuous.
I am able to create the model and get the summary statistics using this code:
mod1 <- lme(MS ~ age + sex + edu + GDST1*Time + HLI*Time + GDST1*HLI*Time, random= ~ 1|ID, data=NuAge_long, na.action=na.omit)
summary(mod1)
I am trying to use the "interactions" package to probe the 3-way interaction:
sim_slopes(model = mod1, pred = Time, modx = GDST1, mod2 = HLI, data = NuAge_long)
but am receiving this error:
Error in if (tcol == "df") tcol <- "t val." : argument is of length zero
I am also trying to plot the interaction using the same "interactions" package:
interact_plot(model = mod1, pred = Time, modx = GDST1, mod2 = HLI, data = NuAge_long)
and am receiving this error:
Error in UseMethod("family") : no applicable method for 'family' applied to an object of class "lme"
I can't seem to find what these errors mean and why I'm getting them. Any help would be appreciated!
From ?interactions::sim_slopes:
The function is tested with ‘lm’, ‘glm’, ‘svyglm’, ‘merMod’, ‘rq’, ‘brmsfit’, ‘stanreg’ models. Models from other classes may work as well but are not officially supported. The model should include the interaction of interest.
Note this does not include lme models. On the other hand, merMod models are those generated by lme4::[g]lmer(), and as far as I can tell you should be able to fit this model equally well with lmer():
library(lme4)
mod1 <- lmer(MS ~ age + sex + edu + GDST1*Time + HLI*Time + GDST1*HLI*Time
             + (1|ID), data = NuAge_long)
(things will get harder if you want to specify correlation structures, e.g. correlation = corAR1(), which works for lme() but not lmer() ...)
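With that lmer() refit, the original sim_slopes() and interact_plot() calls should then run unchanged on the merMod object (a sketch, assuming the refit converges on your data):
library(interactions)
sim_slopes(model = mod1, pred = Time, modx = GDST1, mod2 = HLI, data = NuAge_long)
interact_plot(model = mod1, pred = Time, modx = GDST1, mod2 = HLI, data = NuAge_long)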

Error: $ operator not defined for this S4 class while running hoslem.test

I'm working on an optimized version of a logistic regression model fitted with glm; the optimization is a lasso regression fitted with glmnet. I want to compare both models using the output of a Hosmer-Lemeshow test, and I get the following.
For the glm I get
> hl <- hoslem.test(trainingDatos$Exited, fitted(logit.Mod))
> hl
Hosmer and Lemeshow goodness of fit (GOF) test
data: trainingDatos$Exited, fitted(logit.Mod)
X-squared = 2.9161, df = 8, p-value = 0.9395
And when I try to run the test for the lasso regression I get
> hll <- hoslem.test(trainingDatos$Exited, fitted(lasso.model), g=10)
Error in cut.default(yhat, breaks = qq, include.lowest = TRUE) :
'x' must be numeric
I also tried to use the coefficients of the lasso regression to make it numeric and I get
> hll <- hoslem.test(trainingDatos$Exited, fitted(lasso.model$beta), g=10)
Error: $ operator not defined for this S4 class
But when I treat it as an S4 object
> hll <- hoslem.test(trainingDatos$Exited, fitted(lasso.model@beta), g=10)
Error in fitted(lasso.model@beta) :
trying to get slot "beta" from an object (class "lognet") that is not an S4 object
Any way to run the test for my lasso regression?
Here is my full code for the lasso regression; I can't share the data set right now, sorry.
#Creation of Training Data Set
input_ones <- Datos[which(Datos$Exited == 1), ] #All 1s
input_zeros <- Datos[which(Datos$Exited == 0), ] #All 0s
set.seed(100)
#Training 1s
input_ones_training_rows <- sample(1:nrow(input_ones), 0.7*nrow(input_ones))
#Training 0s
input_zeros_training_rows <- sample(1:nrow(input_zeros), 0.7*nrow(input_ones))
training_ones <- input_ones[input_ones_training_rows, ]
training_zeros <- input_zeros[input_zeros_training_rows, ]
trainingDatos <- rbind(training_ones, training_zeros)
library(glmnet)
#Conversion of training data into matrix form
x <- model.matrix(Exited ~ CreditScore + Geography + Gender
                  + Age + Tenure + Balance + IsActiveMember
                  + EstimatedSalary, trainingDatos)[, -1]
#Defining numeric response variable
y <- trainingDatos$Exited
set.seed(100)
#Grid search to find best lambda
cv.lasso<-cv.glmnet(x, y, alpha = 1, family = "binomial")
#Creation of the model
lasso.model <- glmnet(x, y, alpha = 1, family = "binomial",
lambda = cv.lasso$lambda.1se)
coef(cv.lasso, cv.lasso$lambda.1se)
#Now trying to run the test
library(ResourceSelection)
set.seed(12657)
hll <- hoslem.test(trainingDatos$Exited, fitted(lasso.model), g=10)#numeric value error
hll <- hoslem.test(trainingDatos$Exited, fitted(lasso.model$beta), g=10)#$ not defined for S4
hll <- hoslem.test(trainingDatos$Exited, fitted(lasso.model@beta), g=10)#says "beta" is not an S4 slot
glmnet provides its own predict() method for obtaining fitted values, and as noted above, the errors come from calling fitted() on the glmnet object. Alternatively, running such tests can be easier with the gofcat package, whose functions take supported model objects directly; your glm model, for instance, would be passed as hosmerlem(logit.Mod).
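A minimal sketch of the predict()-based route, assuming the x, lasso.model, and trainingDatos objects defined above:
library(ResourceSelection)
# fitted probabilities from the glmnet fit; predict() returns one column per lambda
phat <- predict(lasso.model, newx = x, type = "response")[, 1]
hll <- hoslem.test(trainingDatos$Exited, phat, g = 10)
hll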

How to get a model matrix from clmm objects?

I want to estimate a multilevel ordered logistic model and afterwards access the model matrix. When running a simplified example from ?clmm:
library("ordinal")
mod1 <- clmm(SURENESS ~ PROD + (1|RESP), data = soup)
model.matrix(mod1)
I get the error message Error in eval(predvars, data, env) : object 'SURENESS' not found. From other packages I would have expected that, with settings like model = TRUE, the data used for fitting are also stored in the fitted model object, but here all relevant parameters seem to be set accordingly by default. Did I miss some parameter, or some element of mod1? (I went through attributes(mod1) but did not find a model matrix.)
Strangely if I set a random data.frame, it works:
set.seed(123)
df <- data.frame(y = factor(sample(c("A", "B", "C"), size = 1000, replace = TRUE), ordered = TRUE),
                 x = rnorm(1000),
                 id = factor(rep(1:10, each = 100)))
mod2 <- clmm(y ~ 1 + x + (1|id), data = df)
model.matrix(mod2)
So what's the difference between mod1 and mod2 and how do I get a model.matrix from mod1?
I do not think model.matrix() reliably works for clmm objects. However, you can build a parallel model for the fixed-effects part using a function like polr and apply model.matrix() to that object; the random-effects part can then be handled separately using the clmm output.
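A minimal sketch of that workaround for the ?clmm example; rather than going through polr, it rebuilds the design matrix directly from the fixed-effects part of the clmm formula, which gives the same fixed-effects columns:
library(ordinal)
mod1 <- clmm(SURENESS ~ PROD + (1 | RESP), data = soup)
X <- model.matrix(~ PROD, data = soup)  # fixed-effects design matrix (with intercept)
head(X)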

Error using model averaging and predicting with MuMIn in R

I'm using the MuMIn package in R to get an averaged model (http://www.inside-r.org/packages/cran/MuMIn/docs/model.avg) and predict from that. The package also includes a predict method specifically for the object returned by model.avg (http://www.inside-r.org/node/123636). I've tried the examples listed; code as follows:
library(MuMIn)
options(na.action = na.fail)  # dredge() requires na.action = na.fail
# Example from Burnham and Anderson (2002), page 100:
fm1 <- lm(y ~ X1 + X2 + X3 + X4, data = Cement)
ms1 <- dredge(fm1)
# obtain model average for AIC delta < 2
avgm <- model.avg(ms1, subset = delta < 2)
# predict from the averaged model
averaged.full <- predict(avgm, full = TRUE)
But I keep getting
Error in predict.averaging(avgm, full = TRUE): can predict only from 'averaging' object containing model list
which I don't understand, because I did follow the examples and used an object returned by model.avg. Am I missing something?
When you create an "averaging" object directly from a "model.selection" object, it does not contain the component models, which are required for predict() to work. You can use model.avg(..., fit = TRUE), which will fit the component models again.
To avoid fitting the models twice, you can instead first create a list of all the models with lapply(dredge(..., evaluate = FALSE), eval) and afterwards use model.avg(..., subset = ...) on that list.
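Applied to the Cement example above, a minimal sketch of the fit = TRUE fix (the lapply(dredge(..., evaluate = FALSE), eval) route avoids re-fitting but is used the same way):
# same ms1 as above; fit = TRUE re-fits and stores the component models,
# so predict() can use them
avgm <- model.avg(ms1, subset = delta < 2, fit = TRUE)
averaged.full <- predict(avgm, full = TRUE)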
