Handling a ceiling effect of the dependent variable in mixed models (R lmer)

I would like to ask whether anyone has experience with handling ceiling effects of the dependent variable in linear mixed-effects models. I found a paper suggesting approaches for models outside the mixed-modeling framework ( https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2778494/ ), but it would be very convenient to find options within the lme4 (lmer) package in R. Any ideas? Can this problem be solved with generalized mixed models by adjusting the link function? If so, how?
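One route I have seen discussed (a hedged sketch under assumptions, not something lme4 itself offers): treat observations at the ceiling as right-censored and fit a Tobit-style mixed model, e.g. with brms. The data frame d, the response score, the predictor x, the grouping factor subject, and the ceiling of 100 are all placeholders.

library(brms)

# Assumed data: d has a response `score` with a known ceiling, a predictor `x`,
# and a grouping factor `subject`.
max_score <- 100                                         # hypothetical ceiling of the scale
d$censored <- ifelse(d$score >= max_score, "right", "none")

# Gaussian mixed model with right-censoring at the ceiling (Tobit-like).
fit <- brm(score | cens(censored) ~ x + (1 | subject),
           data = d, family = gaussian())
summary(fit)

If the response can instead be rescaled into (0, 1), a beta-family GLMM (e.g. glmmTMB with family = beta_family()) is one link-function route along the lines the question asks about, at the cost of a different coefficient interpretation.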

Related

Are there packages in Julia that estimate marginal means or marginal effects?

I am new to Julia and I have estimated some multilevel regressions using MixedModels.jl. Everything worked perfectly fine, but I would like to estimate the marginal means or marginal effects. In R there are two packages that I am aware of for this purpose: emmeans and ggeffects. Are there similar packages in Julia?
In Julia, there is now Effects.jl, which uses the same technique as the effects package in R (which is what ggeffects uses for its computation).
Since your question is tagged mixed-models, you might also consider JellyMe4 which adds support for lme4/MixedModels to RCall.
I don't believe there is a good package for this at the moment, though you could use RCall.jl and process your data there. Or, if you don't mind doing it manually, you could calculate it from the predict() method of GLM.jl.

Is it possible to let the precision/variance parameter in a beta regression via GAM vary with the predictor as well?

I want to fit a spatiotemporal model where my dependent variable lies in the open interval (0, 1).
A beta regression seems suitable for this case.
I tried the betareg package, which works like a charm, but to my knowledge it cannot include the complex interaction terms that arise in spatiotemporal datasets, e.g. to account for autocorrelation.
I know that GAMs, e.g. in the mgcv package, support beta regression via the betar() family. To my knowledge, however, the precision/variance parameter is held constant and only the mean (mu) varies as a function of the predictors.
My model looks like this (it is conceptual, so no example data is needed):
mgcv::gam(Y ~ te(latitude, longitude, day) + s(X1) + s(X2) + s(X3), family = betar())
The problem is that only mu is modelled, but not phi (the precision).
In betareg I can let phi vary with my predictors:
betareg::betareg(Y ~ X1 + X2 + X3 + latitude + longitude | X1 + X2 + X3 + latitude + longitude)
but this doesn't let me model the spatiotemporal term as needed, because simple additive effects are not suitable for that; I need something like the te() functionality from mgcv or some other kind of interaction term.
Is there any workaround or a way to model phi while still accounting for my spatiotemporal term, either via mgcv, betareg, or any other R package?
Thanks a lot!
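For what it's worth, one hedged workaround (an assumption, not confirmed by the question): brms fits distributional beta regressions in which both mu and phi get their own formula, and it accepts mgcv-style s() and t2() smooths (te() is not available in brms, so t2() stands in for the tensor product). Variable names follow the question; dat is a placeholder data frame.

library(brms)

# Mean (mu) gets the spatiotemporal tensor-product smooth plus covariate smooths;
# the precision (phi) gets its own covariate smooths as well.
fit <- brm(
  bf(Y ~ t2(latitude, longitude, day) + s(X1) + s(X2) + s(X3),
     phi ~ s(X1) + s(X2) + s(X3)),
  data = dat, family = Beta()
)
summary(fit)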

How to calculate marginal effects for plm models

I wish to obtain marginal effects for covariates in my plm models, which are estimated in first differences with interacted variables. For my lm and glm models I am using the margins package and its functions. However, this approach does not seem to work with panel models.
What alternatives do I have?
Thank you.
EDIT:
In the absence of a viable solution, I would calculate the marginal effects by hand at specific values of the relevant covariates. Ideally I would like to do this in a faster way.
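A minimal sketch of that by-hand route, assuming a first-difference model with a single interaction x1:x2 (all names here are placeholders): the marginal effect of x1 at a chosen value of x2 is the coefficient on x1 plus the interaction coefficient times that value, with a delta-method standard error.

library(plm)

fd <- plm(y ~ x1 * x2, data = dat, index = c("id", "year"), model = "fd")
b <- coef(fd)
V <- vcov(fd)

# Marginal effect of x1 evaluated at the mean of x2, plus a delta-method SE.
x2_val <- mean(dat$x2, na.rm = TRUE)
me <- b["x1"] + b["x1:x2"] * x2_val
se <- sqrt(V["x1", "x1"] + x2_val^2 * V["x1:x2", "x1:x2"] +
           2 * x2_val * V["x1", "x1:x2"])
c(marginal_effect = unname(me), std_error = unname(se))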

Boosted trees and Variable Interactions in R

How can one see, in a boosted-trees classification model (e.g. AdaBoost), which variables interact with each other and how strongly? I would like to do this with the R gbm package if possible.
To extract interactions between input variables, you can also fit a model with explicit interaction terms, e.g. with lm: http://www.r-bloggers.com/r-tutorial-series-regression-with-interaction-variables/
You can use ?interact.gbm. See also this Cross Validated question, which points to a vignette on a related technique from the dismo package.
In general, these interactions may not necessarily agree with the interaction terms estimated in a linear model.
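A small sketch of the interact.gbm route from the answer above (object and variable names are placeholders); it computes Friedman's H-statistic for a pair of predictors:

library(gbm)

# Fit a boosted classification model; y is coded 0/1, and interaction.depth > 1
# is what allows the trees to encode interactions at all.
fit <- gbm(y ~ ., data = train, distribution = "bernoulli",
           n.trees = 500, interaction.depth = 3)

# Friedman's H-statistic for the pair (x1, x2); values near 0 suggest little
# interaction, values near 1 a strong one.
interact.gbm(fit, data = train, i.var = c("x1", "x2"), n.trees = 500)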

Panel data with binary dependent variable in R

Is it possible to do regressions in R using a panel data set with a binary dependent variable? I am familiar with using glm for logit and probit and plm for panel data, but am not sure how to combine the two. Are there any existing code examples?
EDIT
It would also be helpful if I could figure out how to extract the matrix that plm() is using when it does a regression. For instance, you could use plm to do fixed effects, or you could create a matrix with the appropriate dummy variables and then run that through glm(). In a case like this, however, it is annoying to generate the dummies yourself and it would be easier to have plm do it for you.
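For the dummy-variable route described in the EDIT, a minimal sketch (names are placeholders) is to let glm() build the unit dummies itself via factor(), rather than constructing the matrix by hand:

# Pooled logit with unit dummies (an ad-hoc "fixed effects" logit; note the
# usual incidental-parameters caveat with short panels).
fit <- glm(y ~ x + factor(id), data = paneldata, family = binomial(link = "logit"))
summary(fit)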
The package "pglm" might be what you need.
http://cran.r-project.org/web/packages/pglm/pglm.pdf
This package offers some functions of glm-like models for panel data.
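A hedged sketch of what that might look like (variable names and the random-effects probit choice are assumptions; check ?pglm for the exact arguments):

library(pglm)

# Random-effects probit for a binary outcome on panel data.
fit <- pglm(y ~ x1 + x2, data = paneldata,
            index = c("id", "year"),
            family = binomial("probit"),
            model = "random")
summary(fit)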
Maybe the package lme4 is what you are looking for.
It seems to be possible to run generalized regressions with random group effects using the command glmer.
But you should be aware that panel data with a binary dependent variable behaves differently from the usual linear models.
This site may be helpful.
Best regards,
Manoel
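For reference, a minimal glmer sketch of that lme4 suggestion (a random-intercept logit; names are placeholders):

library(lme4)

# Logit with a random intercept per panel unit.
fit <- glmer(y ~ x + (1 | id), data = paneldata,
             family = binomial(link = "logit"))
summary(fit)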
model.frame(plmmodel)
will give you the data frame that is actually used by plm for fitting the model (i.e. after list-wise deletion if you have NAs, etc.)
I don't think that plm has implemented functions to estimate models with binary outcomes, but I may be wrong. Check out the reference manual at: http://cran.r-project.org/web/packages/plm/index.html
If I'm right, this would suggest that you can't "combine the two" without considerable work in extending the functions provided by plm.
