Cannot get adjusted means for glmer using lsmeans - r

I have a glmer model that I would like to get adjusted means for using lsmeans. The following code fits the model (and seems to do so correctly):
library(lmerTest)
data$group <- as.factor(data$grp)
data$site <- as.factor(data$site)
data$stimulus <- as.factor(data$stimulus)
data.acc1 = glmer(accuracy ~ site + grp*stimulus + (1|ID), data=data, family=binomial)
However, when I try to use any of the code below to get adjusted means from the model, I get the error:
Error in lsmeansLT(model, test.effs = test.effs, ddf = ddf) :
The model is not linear mixed effects model.
lsmeans(data.acc1, "stimulus")
or
data.lsm <- lsmeans(data.acc1, accuracy ~ stimulus ~ grp)
pairs(data.lsm)
Any suggestions?

The problem is that you have created a generalised linear mixed model using glmer() (in this case a mixed-effects logistic regression), not a linear mixed model using lmer(). The error comes from lmerTest's lsmeans()/lsmeansLT(), which only handles linear mixed models fitted with lmer(), so it rejects the object created by glmer().
Answers in this post might help: I can't get lsmeans output in glmer
And this post might be useful if you want to understand/compute marginal effects for mixed GLMs: Is there a way of getting "marginal effects" from a `glmer` object
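For reference, here is a minimal sketch of the kind of approach those posts describe, using the emmeans package (the successor to lsmeans), which does accept glmer() fits. The model and factor names are the ones from the question, and grp is assumed to be coded as a factor:
library(emmeans)
# Adjusted (estimated marginal) means for stimulus, back-transformed
# to the probability scale of the binomial model
emm <- emmeans(data.acc1, ~ stimulus, type = "response")
emm
# Pairwise comparisons between stimulus levels
pairs(emm)
# Means of stimulus within each level of grp
emmeans(data.acc1, ~ stimulus | grp, type = "response")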

Related

What is the survest equivalent when running a mixed effects cox regression using coxme in R?

I ran this mixed-effects model:
survival <- coxme(Surv(tr_dur_a, ftr_2) ~ mod_muac + drev_oedema_now_day0 + danger_adm +
                    tr_diet2 + tr_daysec + hiv_stat + breast_feeding_adm + sex_adm +
                    mod_diar + diag_sepsis_adm + diag_lrti_pneumonia_adm + other_dx +
                    (1|site), data = survival1b)
I would like the coxme equivalent of survest; my aim is to get survival probabilities.
So I am aiming for something like:
Csur <- survest(survival,newdata = Cnew, times=time)

How can I incorporate a categorical variable with ~200 levels in a nonlinear mixed effects model in R?

I am trying to fit a nonlinear mixed effects model with a categorical variable genotype that has around 200 levels.
So this is the linear version of the model.
mlinear <- lmer(WUE ~ moisture * genotype + (1|pot), data = d8)
Now I'm trying to fit the same model, but with a logistic function instead of a linear one:
mlogistic <- nlme(WUE ~ SSlogis(moisture, Asym, xmid, scal), data = d8,
                  fixed = Asym + xmid + scal ~ 1, random = Asym + xmid + scal ~ 1 | pot)
The problem is that I don't know how to incorporate genotype into this nonlinear model. The Asym, xmid, and scal parameters should be able to vary between genotypes. Does anyone know how to do this?
It looks like you’re using lme4::lmer for your linear model and nlme::nlme for the logistic? If you use lme4 for both, you should be able to keep the same model specification, and use lme4::glmer with family = binomial for the logistic model. The whole idea behind a GLM with a link function is that you shouldn’t have to do anything different with your predictor vs a linear model, as the link function takes care of that.
library(lme4)
mlinear <- lmer(WUE ~ moisture * genotype + (1|pot), data = d8)
mlogistic <- glmer(WUE ~ moisture * genotype + (1|pot), family = binomial, data = d8)
All that being said, how is WUE measured? You probably want to use either a logistic model (if binary) or linear (if continuous), not both.
Just to add to the answer by @zephryl, who has explained how you can do this, I would like to focus on:
Since my WUE responses are almost always curved, I'm trying to fit a logistic function instead of a linear one.
Fitting a logistic model does not really make sense here. The distribution of WUE is not relevant. It is the conditional distribution that matters, and we typically assess this by inspecting the residuals. If the residuals show a nonlinear pattern, then there are several ways to model this (a brief sketch follows the list below), including
transformations
nonlinear terms
splines
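As a rough sketch of the spline and polynomial routes, staying with lmer() and reusing the variable names from the question (the choices of a quadratic and of 3 degrees of freedom for ns() are arbitrary illustrations, not recommendations):
library(lme4)
library(splines)
# Quadratic term for moisture; the response stays linear (Gaussian)
m_poly <- lmer(WUE ~ poly(moisture, 2) * genotype + (1 | pot), data = d8)
# Natural cubic spline for moisture
m_spline <- lmer(WUE ~ ns(moisture, df = 3) * genotype + (1 | pot), data = d8)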

Using merTools::predictInterval for Poisson family mixed models

I am using the predictInterval() function from the merTools package. My model is fitted with a Poisson family specification like the one below:
glmer(y ~ (1|key) + x, data = dat, family = poisson())
When I use predictInterval() to calculate the prediction interval associated with my model, I get the following warning message:
Warning message:
Prediction for NLMMs or GLMMs that are not mixed binomial regressions is not tested. Sigma set at 1.
I am taking this to mean that predictInterval() doesn't have an implementation for models fit with a Poisson distribution. I therefore do not trust the resulting interval.
Is my interpretation correct? I have searched around for similar issues but haven't found anything.
Any help would be greatly appreciated.
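For what it's worth, a minimal sketch of the kind of call being described (the model, data, and variable names are the hypothetical ones from the question); with a Poisson fit it should trigger the warning quoted above:
library(lme4)
library(merTools)
mod <- glmer(y ~ (1 | key) + x, data = dat, family = poisson())
# Simulation-based prediction intervals from merTools
pred_int <- predictInterval(mod, newdata = dat, n.sims = 1000)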

How to test significant improvement of LRM model

Using Frank Harrell's rms package, I constructed a predictive model with the lrm function.
I want to test whether this model has significantly better predictive value for a binomial event than another lrm model.
I tried various functions, such as anova(model1, model2) and the pR2 function from the pscl package, to compare the pseudo R^2, but none of them work with the lrm-based models.
What is the best way to see whether my new model is significantly better than the earlier model?
Update: Here is an example (where I want to predict the chance of bone metastasis) to check whether size or stage (in addition to other variables) gives the best model:
library(rms)
getHdata(prostate)
ddd <- datadist(prostate)
options(datadist = "ddd")
mod1 <- lrm(as.factor(bm) ~ age + sz + rx, data = prostate, x = TRUE, y = TRUE)
mod2 <- lrm(as.factor(bm) ~ age + stage + rx, data = prostate, x = TRUE, y = TRUE)
It seems that, fundamentally, the question is about comparing two non-nested models.
If you fit your models using the glm() function, you can use the vuong() function from the pscl package (a short sketch is shown after the lrtest example below).
To test the fit of two nested models, you can use the lrtest() function from the rms package:
lrtest(mod1, mod2)
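And a hedged sketch of the non-nested route: refit the same formulas with glm() and pass the fits to pscl::vuong(). This assumes the prostate data loaded in the question:
library(pscl)
# Refit with glm() so that vuong() can be applied
gmod1 <- glm(as.factor(bm) ~ age + sz + rx, data = prostate, family = binomial)
gmod2 <- glm(as.factor(bm) ~ age + stage + rx, data = prostate, family = binomial)
# Vuong test for two non-nested models of the same response
vuong(gmod1, gmod2)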

Logistic regression & bootstrap

First, I'm trying to run a logistic regression using lrm from the rms package.
My model works fine with glm but not with lrm.
model1 <- lrm(Outcome30Days ~ ISS1 + ISS2 + as.factor(GCSgr) +
                as.factor(Gender) * as.factor(agegr),
              data = sub2, x = T, y = T, se.fit = T)
If ISS1 and ISS2 are removed, the model runs, but with these two variables it won't.
The error message:
Unable to fit model using “lrm.fit”
I need to run it with lrm, because validate(), which does the bootstrap validation, (apparently) only works with lrm fits.
Any help would be appreciated.
lrm has a lower tolerance for correlation among independent variables than glm. If your model runs with glm, and runs with lrm only once you remove some variables, this is probably the issue. Luckily, you can adjust the tolerance with the tol argument. By default tol = 1e-7; try changing it to tol = 1e-9. The code would look like this:
model1 <- lrm(Outcome30Days ~ ISS1 + ISS2 + as.factor(GCSgr) +
                as.factor(Gender) * as.factor(agegr),
              data = sub2, x = T, y = T, se.fit = T, tol = 1e-9)
This is better than messing with the penalty because changing the penalty will change your log-likelihood and could affect your results.
