Stepwise for ridge regression in R

I am doing modeling based on ridge regression in R.
I want to do stepwise selection for ridge regression;
however, I only get an error that says
"Error in terms.default(object) : no terms component nor attribute"
My R code:
TempReg = step(lm.ridge(DepVar ~ ., data = RandomVars, lambda = 0), direction = "both", trace = 0)
My R code when I use ordinary regression (which works):
TempReg = step(lm(DepVar ~ ., data = RandomVars), direction = "both", trace = 0)
What can I do if I want stepwise selection for ridge regression?

The error occurs because step() needs a fit with a terms component, as lm and glm objects have; lm.ridge() from MASS returns a ridgelm object without one, so step() cannot be applied to it directly. You could try running stepwise selection first and then fitting ridge regression on the selected variables. But, as mentioned, combining variable selection with an L2 penalty does not make much sense: that combination is essentially what lasso regression does, since its L1 penalty performs the selection automatically. You could also try elastic net regression, which uses both the L1 and L2 penalties.
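A minimal sketch of the lasso / elastic net route with the glmnet package, assuming DepVar is a column of the RandomVars data frame (adjust the x/y construction if it is a separate vector):
library(glmnet)
x <- model.matrix(DepVar ~ ., data = RandomVars)[, -1] # predictor matrix, intercept column dropped
y <- RandomVars$DepVar
cvfit <- cv.glmnet(x, y, alpha = 0.5) # alpha = 1 is the lasso; 0 < alpha < 1 mixes L1 and L2 (elastic net)
coef(cvfit, s = "lambda.min") # coefficients at the cross-validated lambda; zeroed ones were selected out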

Related

How to obtain analysis of variance table for a nonlinear regression model in R

Previously I used SAS to fit data into nonlinear regression model. SAS was able to produce an analysis of variance table for the model. The table displays the degrees of freedom, sums of squares, and mean squares along with the model F test.
Please refer to Table 69.4 in this pdf file.
Source: https://support.sas.com/documentation/onlinedoc/stat/132/nlin.pdf
How can I re-create something similar in R? Thanks in advance.
I'm not sure what type of nonlinear regression you're interested in, but the general approach would be to fit the model and call summary() on it. A typical linear model would be:
linearmodel = lm(`outcomevar` ~ `predictorvar`, data = dataset)
linearmodel # gives coefficients
summary(linearmodel) # gives model fit
For nonlinear regression you would add a polynomial term. A quadratic fit would be
y = b0 + b1*Var + b2*Var^2, or:
nonlinmodel = lm(`outcomevar` ~ `predictorvar` + I(`predictorvar`^2), data = dataset)
nonlinmodel
summary(nonlinmodel)
other methods here: https://data-flair.training/blogs/r-nonlinear-regression/
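Since the original question asked for an analysis of variance table, note that calling anova() on a fitted lm object produces a sequential table with degrees of freedom, sums of squares, mean squares, and F tests, much like the SAS table referenced above:
anova(nonlinmodel) # Df, Sum Sq, Mean Sq, F value, Pr(>F) for each term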

Run logit regression in R

I am trying to run a probit regression in R.
I am wondering which function suits it best.
I had a look at the glm function. Am I right with this?
For example, does the output of the following code give the result of a probit regression with fixed effects at the "spieler" level?
probTc = glm(Esieg ~ TTRverf + factor(spieler), family = "binomial", data = datregT)
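Note that family = "binomial" uses the logit link by default, so the call above fits a logit model with "spieler" dummies rather than a probit model. A minimal sketch of the probit version, keeping the same variable names:
probTc = glm(Esieg ~ TTRverf + factor(spieler), family = binomial(link = "probit"), data = datregT) # probit link stated explicitly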

Non-nested model comparison using R

To explain my problem, I simulated this data in R.
require(splines)
x = rnorm(20, 0, 1)
y = rep(c(0, 1), times = 10)
First I fitted a regular (linear-effects) logistic regression model.
fit1 = glm(y ~ x, family = "binomial")
Then, to check for nonlinear effects, I fitted this natural spline model.
fit2 = glm(y ~ ns(x, df = 2), family = "binomial")
Based on my thinking, I believe these two models are non-nested.
Next I wanted to check whether the nonlinear model (fit2) has any significant effect compared to the regular logistic model (fit1).
Is there any way to compare these two models? I believe I cannot use the lrtest function in the lmtest package, because these two models are not nested.
Any suggestion will be highly appreciated.
Thank you.
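One common route for comparing non-nested candidates is an information criterion; Vuong's test is another option, implemented for example in the nonnest2 package (its exact interface is an assumption to check against the package documentation). A minimal sketch:
AIC(fit1, fit2) # lower AIC indicates the better fit/complexity trade-off
# install.packages("nonnest2") # assumed package for Vuong's non-nested test
library(nonnest2)
vuongtest(fit1, fit2) # Vuong test for non-nested or overlapping models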

Cannot get adjusted means for glmer using lsmeans

I have a glm that I would like to get adjusted means for using lsmeans. The following code makes the model (and seems to be doing it correctly):
library(lmerTest)
data$group <- as.factor(data$grp)
data$site <- as.factor(data$site)
data$stimulus <- as.factor(data$stimulus)
data.acc1 = glmer(accuracy ~ site + grp*stimulus + (1|ID), data=data, family=binomial)
However, when I try to use any of the code below to get adjusted means from the model, I get the error
Error in lsmeansLT(model, test.effs = test.effs, ddf = ddf) :
The model is not linear mixed effects model.
lsmeans(data.acc1, "stimulus")
or
data.lsm <- lsmeans(data.acc1, accuracy ~ stimulus ~ grp)
pairs(data.lsm)
Any suggestions?
The problem is that you have created a generalised linear mixed model using glmer() (in this case a mixed logistic regression model), not a linear mixed model using lmer(). The lsmeans() function does not accept objects created by glmer() because they are not linear mixed models.
Answers in this post might help: I can't get lsmeans output in glmer
And this post might be useful if you want to understand/compute marginal effects for mixed GLMs: Is there a way of getting "marginal effects" from a `glmer` object
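As a further pointer, the emmeans package (the successor to lsmeans) does handle glmer fits; a minimal sketch, assuming the data.acc1 model above:
library(emmeans)
emmeans(data.acc1, ~ stimulus, type = "response") # adjusted means back-transformed to the probability scale
pairs(emmeans(data.acc1, ~ stimulus)) # pairwise comparisons on the logit scale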

Classification table and ROC curve - logistic regression in R using lrm

I want to get a classification table for a logistic regression fitted with the lrm function in the rms package and then plot the ROC curve. I have performed this using the glm function. Example code:
train <- sample(dim(data)[1], .8 * dim(data)[1]) # 80/20 training/test split
datatrain <- data[train, ]
datatest <- data[-train, ]
fit <- glm(Target ~ ., data = datatrain, family = binomial()) # Target is a 0/1 variable
prob <- predict(fit, newdata = datatest, type = "response")
datatest$prob=prob
library(pROC)
ROC <- roc(Target==1 ~ prob, data = datatest)
plot(ROC)
confusion <- table(prob > 0.5, datatest$Target)
errorrate <- 1 - sum(diag(confusion)) / sum(confusion) # the diagonal holds the correct classifications, so subtract from 1 for the error rate
errorrate
How to get the confusion matrix using lrm function?
The lrm function returns a fit object that inherits from the glm class. That is not explicitly stated on the lrm help page, but it's easy enough to verify. After running the setup code in the first example on the ?lrm page:
> f <- lrm(ch ~ age)
> class(f)
[1] "lrm" "rms" "glm"
So you should be able to use the ordinary predict method you were using above. Note that Prof. Harrell advises against split-sample validation and against using ROC curves for model comparison; he provides machinery for better approaches (such as bootstrap validation) in his rms package.
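For the classification table itself, a minimal sketch with an lrm fit, assuming the same datatrain/datatest split as above (predict on an lrm object with type = "fitted" returns predicted probabilities; see ?predict.lrm):
library(rms)
fitlrm <- lrm(Target ~ ., data = datatrain)
problrm <- predict(fitlrm, newdata = datatest, type = "fitted") # predicted probabilities
table(problrm > 0.5, datatest$Target) # classification table at a 0.5 cutoff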
