How are loess and locfit different? [closed] - r

'loess' is implemented in R's 'stats' package and 'locfit' in the 'locfit' package. Both are nonparametric regression methods that use local regression. What are the differences between the two methods?

Based on this introduction, it appears that locfit is a generalization of loess: you can obtain a loess fit using locfit, but locfit also offers options to fit more general models, including logistic-regression-style fits and general density estimation. It can also fit loess-style models using a different weighting formula or even a varying bandwidth.
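As a minimal sketch of how the two interfaces line up (assuming the locfit package is installed; treating locfit's nn and deg arguments as rough counterparts of loess's span and degree is an assumption here, and the two fits won't match exactly because other defaults differ):
library(locfit)
# Simulated data for illustration
set.seed(1)
x <- runif(200)
y <- sin(2 * pi * x) + rnorm(200, sd = 0.3)
# A loess fit from the stats package
fit_loess <- loess(y ~ x, span = 0.75, degree = 2)
# A roughly comparable locfit fit: nn is the nearest-neighbour fraction
# (analogous to span), deg the local polynomial degree
fit_locfit <- locfit(y ~ lp(x, nn = 0.75, deg = 2))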

Related

How to constrain the slope to be positive in regression? [closed]

How can I constrain my regression coefficient (only the slope, not the intercept) to be positive? It's a general statistical question, but specifically I would like an R solution, and more specifically one for Model II regression (major axis regression).
You could do linear regression with nls and limit the parameter range there.
Example: using the nl2sol algorithm from the PORT library, we fit a data set with x and y values to a line with a negative y-intercept and a slope constrained between 1.4 and 1.6:
nls(y ~ a + b * x, algorithm = "port",
    start = c(a = 0, b = 1.5),
    lower = c(a = -Inf, b = 1.4),
    upper = c(a = Inf, b = 1.6))
This solution and others are explained in the more general question at https://stats.stackexchange.com/questions/61733/linear-regression-with-slope-constraint
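For a self-contained illustration, here is a minimal runnable sketch with simulated data (the data frame dat and its true slope of 1.55 are made up for the example):
# Simulate data whose true slope lies inside the allowed range
set.seed(42)
dat <- data.frame(x = 1:50)
dat$y <- -2 + 1.55 * dat$x + rnorm(50)
# Constrained fit: the slope b is bounded to [1.4, 1.6]
fit <- nls(y ~ a + b * x, data = dat, algorithm = "port",
           start = c(a = 0, b = 1.5),
           lower = c(a = -Inf, b = 1.4),
           upper = c(a = Inf, b = 1.6))
coef(fit)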

What is the purpose of machine learning? Can we use something else instead of machine learning methods to predict future data? [closed]

I started studying machine learning in R and I was just curious.
We try to find a model with good accuracy by using training and test data for prediction, but instead of using a machine learning process, can't we predict the future with a regression model?
I just want to know how machine learning changes the results: will the plot of a machine learning model differ from the plot of a regression model?
Regression models are part of machine learning. You can implement a regression model, train it (meaning the algorithm computes the coefficients, as in a basic regression), and then evaluate it on your test set.
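As a minimal base-R sketch of that train/test workflow (using the built-in mtcars data; the mpg ~ wt + hp formula and the 70/30 split are arbitrary choices for illustration):
set.seed(123)
idx   <- sample(nrow(mtcars), 0.7 * nrow(mtcars))  # 70% training split
train <- mtcars[idx, ]
test  <- mtcars[-idx, ]
fit  <- lm(mpg ~ wt + hp, data = train)    # "train" the regression
pred <- predict(fit, newdata = test)       # predict on unseen data
sqrt(mean((test$mpg - pred)^2))            # test-set RMSE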

Stepwise regression using caret in R [closed]

I have used the leaps package in R to perform forward and backward feature elimination. However, I want to automate the cross-validation and prediction operations. How can I use forward/backward selection in caret?
In the leaps package you could do it this way:
library(leaps)
# Forward selection over models with up to 20 predictors
forward <- regsubsets(x ~ ., data, nvmax = 20, method = "forward")
You should be able to run a stepwise regression in caret::train() with method = "glmStepAIC", which wraps stepAIC from the MASS package. For details, see the list of models supported by caret on the caret documentation website.
The caret test cases for this model are accessible on the caret GitHub repository.
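A minimal sketch of what that might look like, assuming the caret and MASS packages are installed (the built-in mtcars data and the 10-fold cross-validation settings are arbitrary choices for illustration):
library(caret)
library(MASS)   # supplies stepAIC, which "glmStepAIC" wraps
ctrl <- trainControl(method = "cv", number = 10)  # 10-fold CV
fit  <- train(mpg ~ ., data = mtcars,
              method = "glmStepAIC",
              trControl = ctrl,
              trace = FALSE)   # silence stepAIC's step-by-step output
fit
caret's model list also includes "leapForward" and "leapBackward" methods that wrap leaps-style subset selection, which may map more directly onto the forward/backward workflow the asker already uses.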

Sampling weights for subpopulations in R [closed]

I'm working with a large national survey that was collected using complex survey methods, so I need to account for sample weights and other survey design features (e.g., sampling strata). I'm new to this methodology, so apologies if the answers here are obvious.
I've had success running path analysis models using the 'lavaan' package paired with the 'lavaan.survey' package. However, some of my models involve only a subset of the data (e.g., only female participants).
How can I adjust the sample weights to reflect the fact that I am only analyzing a subsample (e.g., females)?
The subset() function in the survey package handles subpopulations correctly, and since lavaan.survey uses the survey package to get the basic standard errors for the population covariance matrix, it should all flow through properly.
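A minimal sketch of that workflow, assuming hypothetical design variables (psu, strata, wt), a hypothetical data frame dat, and an already-fitted lavaan model fit:
library(survey)
library(lavaan.survey)
# Declare the complex design (variable names are placeholders)
des <- svydesign(ids = ~psu, strata = ~strata, weights = ~wt,
                 data = dat, nest = TRUE)
# subset() keeps the full design information, so variance estimation
# for the subpopulation remains correct
des_female <- subset(des, sex == "female")
# Re-estimate the path model with the subpopulation design
fit_svy <- lavaan.survey(lavaan.fit = fit, survey.design = des_female)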

All-vs-all classification with kernlab in R [closed]

I could not find any documentation on how to perform all-vs-all multi-class classification with the kernlab package in R. Any help would be appreciated.
Well, apparently the ksvm function of the package does it automatically, as it says here.
This is how to use it (quoting from the link above):
svp <- ksvm(xtrain, ytrain, type = "C-svc", kernel = "vanilladot", C = 100, scaled = c())
And this is the comment below:
"Question 12
Test the ability of an SVM to predict the class of the disease from gene expression. Check the influence of the parameters.
Finally, we may want to predict the type and stage of the diseases. We are then confronted with a multi-class problem, since the variable to predict can take more than two values:
y <- ALL$BT
print(y)
Fortunately, kernlab automatically implements multi-class SVM by an all-versus-all strategy to combine several binary SVMs."
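As a minimal runnable sketch of that multi-class behaviour, using the built-in iris data (the three-level Species outcome is what triggers the all-versus-all strategy automatically; the split size is arbitrary):
library(kernlab)
set.seed(1)
idx <- sample(nrow(iris), 100)   # 100 rows for training
# With a 3-class factor response, ksvm builds the pairwise
# (all-vs-all) binary SVMs behind the scenes
model <- ksvm(Species ~ ., data = iris[idx, ],
              type = "C-svc", kernel = "vanilladot", C = 100)
pred <- predict(model, iris[-idx, ])
table(pred, iris[-idx, "Species"])   # confusion matrix on held-out rows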
