Obtaining individual slopes from an lme4 object in R

I'm new to the lme4 package in R. In the example below, is it possible to obtain the gender slopes (i.e., the gender differences) for each dep after fitting my glmer model?
dat <- data.frame(dep = rep(LETTERS[1:6],each=2), gender = rep(c("Ma","Fe"),6),
admit=c(512,89,353,17,120,202,138,131,53,94,22,24),
reject=c(313,19,207,8,205,391,279,244,138,299,351,317))
lme4::glmer(cbind(admit,reject) ~ gender+dep + (gender|dep), data=dat, family=binomial)

In lme4 you can get the unit-specific deviations from ranef(), but in your model the slope for each dep is the global (fixed) gender effect plus the dep-specific deviation, so you need to sum the two, as in the example below.
library(lme4)
dat <- data.frame(dep = rep(LETTERS[1:6],each=2), gender = rep(c("Ma","Fe"),6),
admit=c(512,89,353,17,120,202,138,131,53,94,22,24),
reject=c(313,19,207,8,205,391,279,244,138,299,351,317))
mod1 <- glmer(cbind(admit,reject) ~ gender+dep + (gender|dep), data=dat, family=binomial)
summary(mod1)
ran_gender <- ranef(mod1)$dep            # dep-specific deviations (intercept and gender slope)
fe_mod1 <- fixef(mod1)                   # fixed effects; element 2 is the global gender effect
slopes <- fe_mod1[[2]] + ran_gender[,2]  # per-dep gender slope = global effect + deviation
slopes
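Equivalently, coef() returns the per-group coefficients with the fixed and random parts already summed, so (with the default treatment coding, where "Fe" is the reference level) its genderMa column should match the slopes computed above:
coef(mod1)$dep                  # per-dep coefficients, fixed + random combined
coef(mod1)$dep[, "genderMa"]    # should equal `slopes`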

Related

Inter- or extrapolate from a cubic spline cox-proportional hazard model using RMS package

I am using the cph function from the rms package to assess the association of WEIGHT with an EVENT outcome, modelling WEIGHT with a restricted cubic spline with 3 knots.
How can I use the model to predict the hazard ratio for weights that were not included in the original data, i.e. how can I use the model to inter- or extrapolate?
Here is an example:
set.seed(123)
library(rms)
WT <- rnorm(10, 30, 10)
EVENT <- sample(c(0,1), replace=TRUE, size=10)
TIME <- c(seq(1,10,1))
df <- as.data.frame(cbind(TIME,EVENT,WT))
fit <- cph(Surv(TIME, EVENT==1) ~ rcs(WT, 3), data = df)
fit
d = datadist(df)
options(datadist = 'd')
Predict(fit, WT) # this predicts hazard for WT included in the data
How can I use the model fit to predict the hazard at WT == 70, for example?
I am using RStudio.
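Not a complete answer, but Predict() accepts explicit predictor values, and rcs() is linear beyond its outer knots, so a weight outside the observed range is a linear extrapolation. A minimal sketch continuing the example above (assuming Predict() accepts values outside the datadist limits):
Predict(fit, WT = c(40, 70))   # explicit weights, including one far outside the data
The value at WT == 70 should be interpreted cautiously, since it lies well beyond the fitted range.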

Getting estimated means after multiple imputation using the mitml, nlme & geepack R packages

I'm running multilevel multiple imputation through the mitml package (using the panImpute() function) and am fitting linear mixed models and marginal models through the nlme and geepack packages and mitml's with() function.
I can get the estimates, p-values etc. for those through the testEstimates() function, but I'm also looking to get estimated means across my model predictors. I've tried the emmeans package, which I normally use for getting estimated means when running nlme & geepack without multiple imputation, but here emmeans tells me "Can't handle an object of class “mitml.result”".
I'm wondering if there is a way to get pooled estimated means from the multiple imputation analyses I've run.
The data frames I'm analyzing are longitudinal/repeated measures and in long format. In the linear mixed model I want the estimated means for a 2x2 interaction effect, and in the marginal model I want estimated means for the 6 levels of a 'time' variable. The outcome in all models is continuous.
Here's my code:
# mixed model
fml <- Dep + time ~ 1 + (1|id)
imp <- panImpute(data=Data, formula=fml, n.burn=50000, n.iter=5000, m=100, group = "treatment")
summary(imp)
plot(imp, trace="all")
implist <- mitmlComplete(imp, "all", force.list = TRUE)
fit <- with(implist, lme(Dep ~ time*treatment, random = ~ 1|id, method = "ML", na.action = na.exclude, control = list(opt = "optim")))
testEstimates(fit, var.comp = TRUE)
confint.mitml.testEstimates(testEstimates(fit, var.comp = TRUE))
# marginal model
fml <- Dep + time ~ 1 + (1|id)
imp <- panImpute(data=Data, formula=fml, n.burn=50000, n.iter=5000, m=100)
summary(imp)
plot(imp, trace="all")
implist <- mitmlComplete(imp, "all", force.list = TRUE)
fit <- with(implist, geeglm(Dep ~ time, id = id, corstr ="unstructured"))
testEstimates(fit, var.comp = TRUE)
confint.mitml.testEstimates(testEstimates(fit, var.comp = TRUE))
Is there a way to get pooled estimated means from the multiple imputation analyses I've run?
Without Data this is not a reprex, so I can't verify it works for you. But emmeans supports mira-class (lists of) models from the mice package. So if you fit your model inside with() on a mids object rather than an mitml.list object, you can use that to obtain pooled marginal means of your outcome (and any contrasts or pairwise comparisons afterwards).
Using the example data below, which (somewhat awkwardly) loads an external workspace:
con <- url("https://www.gerkovink.com/mimp/popular.RData")
load(con)
## imputation
library(mice)
ini <- mice(popNCR, maxit = 0)   # dry run to get the default settings
meth <- ini$meth
meth[c(3, 5, 6, 7)] <- "norm"    # impute these columns with the normal (Bayesian linear regression) method
pred <- ini$pred
pred[, "pupil"] <- 0             # don't use the pupil ID as a predictor
imp <- mice(popNCR, meth = meth, pred = pred, print = FALSE)
## analysis
library(lme4) # fit multilevel model
mod <- with(imp, lmer(popular ~ sex + (1|class)))
library(emmeans) # obtain pooled estimates of means
(em <- emmeans(mod, specs = ~ sex))
pairs(em) # test comparison
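Translated to the question's variable names, the same pattern should give the pooled means for the 2x2 interaction; this is only a sketch (it is not runnable without Data and assumes the imputation is redone with mice so that with() returns a mira object):
# imp2 <- mice(Data, print = FALSE)                       # mids object instead of mitml
# mod2 <- with(imp2, lmer(Dep ~ time*treatment + (1|id)))
# em2  <- emmeans(mod2, specs = ~ time*treatment)         # pooled cell means
# pairs(em2, by = "time")                                 # treatment contrasts at each time point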

Find R-square value of Weibull fit (Survival model) in R

I have a survival object S to which I am fitting a Weibull model using the survreg function in R:
S = Surv(data$ValueX, data$ValueY)
W = survreg(S ~ 1, data=data, dist="weibull")
How do I extract an R-squared value for the Weibull fit, which is essentially a straight line on the appropriate (log-log) scale? Or is there a function to calculate a correlation coefficient rho?
Basically, I want to measure the goodness of fit.
Look at pam.censor in the PAmeasures package, which produces an R^2-like statistic. Using the ovarian dataset from the survival package:
library(PAmeasures)
library(survival)
fit.s <- survreg(Surv(futime, fustat) ~ age, data = ovarian, dist="weibull" )
p <- predict(fit.s, type = "response")
with(ovarian, pam.censor(futime, p, fustat))
For the ovarian data with an age regressor we get a value of only 0.0915.
Another idea: for a Weibull model with no covariates we have S(t) = exp(-(lambda*t)^p), so log(-log(S(t))) = p*log(lambda) + p*log(t) is linear in log(t). We can therefore regress log(-log(S(t))) on log(t), using the Kaplan-Meier estimate of S(t), and take the R-squared of that regression as a measure of how well a Weibull model fits.
library(survival)
fit1 <- survfit(Surv(futime, fustat) ~ 1, data = ovarian)  # Kaplan-Meier estimate of S(t)
sum1 <- summary(fit1, times = ovarian$futime)
fo <- log(-log(surv)) ~ log(time)                          # linear if the data are Weibull
d <- as.data.frame(sum1[c("time", "surv")])
fit.lm <- lm(fo, d)
summary(fit.lm)$r.sq
plot(fo, d)
abline(fit.lm)
For the ovarian data without covariates the R^2 of 93% is high, but the plot does suggest systematic departures from linearity, so the data may not really be Weibull.
Other
Not sure if this is of interest, but the eha package has the check.dist function, which can be used for a visual comparison of a parametric baseline hazard model to a Cox proportional hazards model. See the documentation as well as:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5233524/
Using the ovarian dataset from survival:
library(eha)
library(survival)
fit.c <- coxreg(Surv(futime, fustat) ~ age, data = ovarian)
fit.p <- phreg(Surv(futime, fustat) ~ age, data = ovarian, dist = "weibull")
check.dist(fit.c, fit.p)
The survAUC package has three functions that provide R-squared-type statistics for Cox proportional hazards models (OXS, Nagelk and XO).
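A hedged sketch of how those might be used on the ovarian data, assuming each takes the survival response, the linear predictor of the fitted model and the linear predictor of a null model (check ?OXS for the exact arguments):
library(survAUC)
library(survival)
fit.cox <- coxph(Surv(futime, fustat) ~ age, data = ovarian)
lp  <- predict(fit.cox)          # linear predictor of the fitted model
lp0 <- rep(0, nrow(ovarian))     # linear predictor of the null (intercept-only) model
y   <- Surv(ovarian$futime, ovarian$fustat)
OXS(y, lp, lp0)
Nagelk(y, lp, lp0)
XO(y, lp, lp0)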

Variance-covariance matrix for random effect from GAM with mgcv package

The random effects and the variance-covariance matrix of the random effects are extracted from an lme4 model as follows:
library(lme4)
fm1 <- lmer(Reaction ~ Days + (1|Subject), sleepstudy)
fm1.rr <- ranef(fm1, condVar = TRUE)
fm1.pv <- attr(fm1.rr[[1]], "postVar")
I wonder how I can do this with mgcv?
The gam.vcomp function does extract the estimated variance components, but not the conditional variance for each level of the random effect.
library(mgcv)
fm2 <- gam(Reaction ~ Days + s(Subject, bs = "re"), data = sleepstudy, method = "REML")
gam.vcomp(fm2)
library(lme4)
data(sleepstudy)
fm1 <- lmer(Reaction ~ Days + (1|Subject), sleepstudy)
fm1.rr <- ranef(fm1,condVar=TRUE)$Subject[,1]
fm1.pv <- sqrt(attr(ranef(fm1,condVar=TRUE) [['Subject']],"postVar")[1,1,])
library(mgcv)
fm2 <- gam(Reaction ~ Days + s(Subject, bs = "re"),
           data = sleepstudy, method = "REML")
To extract the random effect for each Subject:
idx <- grep("Subject", names(coef(fm2)))
fm2.rr <- coef(fm2)[idx]
attributes(fm2.rr) <- NULL
We can see that the random effects from the two models are essentially identical, as expected.
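A quick side-by-side check of the two sets of predicted random effects (continuing the code above):
head(cbind(lme4 = fm1.rr, mgcv = fm2.rr))
all.equal(fm1.rr, fm2.rr, tolerance = 1e-4)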
To extract the variance-covariance matrix for the random effect and compute standard errors, we can use the Vp component, which is the Bayesian posterior covariance matrix of the coefficients:
fm2.pv <-sqrt(diag(fm2$Vp))[idx]
Or the frequentist estimated covariance matrix Ve:
fm2.pv <-sqrt(diag(fm2$Ve))[idx]
We can see that the random-effect standard errors estimated with mgcv differ slightly from those estimated with the lme4 model: errors based on the Bayesian posterior covariance matrix are larger, whereas those based on the frequentist matrix are smaller.
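For a side-by-side comparison of the standard errors (a small sketch using the objects created above):
cbind(lme4 = fm1.pv,
      mgcv.Vp = sqrt(diag(fm2$Vp))[idx],
      mgcv.Ve = sqrt(diag(fm2$Ve))[idx])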
You can also use the gamm4 package, which is modelled on mgcv's gamm but uses lme4 underneath. The model would be fitted as:
library(gamm4)
fm3 <- gamm4(Reaction ~ Days, random = ~ (1|Subject), data = sleepstudy)
The random effects and their variance-covariance matrix can then be obtained following the usual lme4 procedure:
fm3.rr <- ranef(fm3$mer,condVar=TRUE)
fm3.pv <- attr(fm3.rr[[1]],"postVar")[1,1,]
However, gamm4 can be much slower than gam, so read the help file to see when it best suits your needs.

Plot Effects of Variables in Interaction Terms

I would like to plot the effects of variables in interaction terms, using panel data and an FE (fixed effects) model.
I have various interaction effects in my equation, for example this one:
FIXED1 <- plm(GDPPCgrowth ~ FDI * PRIVCR, data = dfp)
I can only find solutions for lm, but not for plm.
So on the x-axis there should be PRIVCR and on the y-axis the effect of FDI on growth.
I am not aware of a package that supports plm objects directly. Since you are asking about FE models, you can take the least squares dummy variable (LSDV) approach, i.e. do the estimation with lm, to get an lm object that works with the effects package. Here is an example for the Grunfeld data:
library(plm)
library(effects)
data("Grunfeld", package = "plm")
mod_fe <- plm(inv ~ value + capital + value:capital, data = Grunfeld, model = "within")
Grunfeld[ , "firm"] <- factor(Grunfeld[ , "firm"]) # needs to be factor in the data NOT in the formula [required by package effects]
mod_lsdv <- lm(inv ~ value + capital + value:capital + firm, data = Grunfeld)
coefficients(mod_fe)   # estimates of value, capital and the interaction are the same ...
coefficients(mod_lsdv) # ... as in the LSDV fit, which additionally reports the firm dummies
eff_obj <- effects::Effect(c("value", "capital"), mod_lsdv)
plot(eff_obj)
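The question asks for the effect of FDI as a function of PRIVCR rather than for predicted values. With a linear interaction that marginal effect is just b_FDI + b_interaction * PRIVCR, so it can be computed directly from the LSDV coefficients; a minimal sketch using the Grunfeld analogue (effect of value as a function of capital):
b <- coefficients(mod_lsdv)
cap_grid <- seq(min(Grunfeld$capital), max(Grunfeld$capital), length.out = 100)
me_value <- b["value"] + b["value:capital"] * cap_grid   # marginal effect of value at each capital level
plot(cap_grid, me_value, type = "l",
     xlab = "capital", ylab = "Marginal effect of value on inv")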
