How can I do a multiple non-linear regression in R?

I used Curve Expert to get a non-linear equation for a single y.
Now I have two X variables, but I don't know how to merge them into one equation.
In other words, I want to know how to do a multiple non-linear regression with two non-linear equations in R.
Is there a package for this, or can I use the nls function?
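Since nls() accepts an arbitrary model formula, the usual approach is to write a single formula that contains both predictors and fit all the parameters jointly. A minimal sketch with simulated data follows; the exponential and power forms (and all parameter values) are stand-ins for whatever equations Curve Expert suggested:

```r
# Simulated data: y depends non-linearly on two predictors.
# The functional forms and parameter values are illustrative only.
set.seed(1)
dat <- data.frame(x1 = runif(100, 1, 10), x2 = runif(100, 1, 10))
dat$y <- 2 * exp(0.3 * dat$x1) + 5 * dat$x2^0.7 + rnorm(100, sd = 1)

# One combined formula; nls() estimates a, b, c, d jointly.
# Reasonable starting values are important for convergence.
fit <- nls(y ~ a * exp(b * x1) + c * x2^d,
           data  = dat,
           start = list(a = 1, b = 0.2, c = 3, d = 0.5))
summary(fit)
```

No extra package is needed for this; the main practical difficulty is choosing starting values, which you can take from the single-predictor fits Curve Expert produced.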

Related

Weighted mixture model of two distributions where weight depends on the value of the distribution?

I'm trying to replicate the precipitation mixture model from this paper: http://dx.doi.org/10.1029/2006WR005308
f(r) is the gamma PDF, g(r) is the generalized Pareto PDF, and w(r) is the weighting function, which depends on the value r being considered. I've looked at R packages like distr and mixtools that handle mixture models, but I only see examples where w is a constant, and I haven't found any implementation where the mixture weight is a function of the value. I'm struggling to write a valid custom function to represent h(r), so if someone could point me to a package, that would be super helpful.
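A value-dependent weight does not actually require a mixture package: h(r) can be written directly as an ordinary R function. The sketch below assumes, following the paper's setup, a gamma f, a generalized Pareto g, and a weight w(r) that rises smoothly from 0 to 1 around a threshold; every parameter value here is made up for illustration, and note that with an arbitrary w(r) the combination need not integrate to one (the paper's construction handles normalization):

```r
# Generalized Pareto density (xi != 0), zero outside its support,
# implemented by hand to avoid extra package dependencies.
dgpareto <- function(r, sigma, xi) {
  ifelse(r >= 0 & (1 + xi * r / sigma) > 0,
         (1 / sigma) * (1 + xi * r / sigma)^(-1 / xi - 1),
         0)
}

# Smooth weight going from 0 to 1 around r = mu (illustrative form)
w <- function(r, mu = 10, tau = 2) plogis((r - mu) / tau)

# Value-dependent mixture: gamma body, generalized Pareto tail
h <- function(r, shape = 2, rate = 0.5, sigma = 5, xi = 0.2) {
  (1 - w(r)) * dgamma(r, shape = shape, rate = rate) +
    w(r) * dgpareto(r, sigma = sigma, xi = xi)
}

curve(h(x), from = 0, to = 40)
```

A function like h() can then be plugged straight into optim() for maximum-likelihood estimation of its parameters, with no need for distr or mixtools.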

Inferring/expressing the polynomial equation of a fitted smoothing spline?

If I smooth a data vector with a smoothing cubic spline, my understanding is that each ‘segment’ between knots should be representable as a cubic polynomial.
Is it possible to infer the equation of each segment from the spline coefficients after, e.g., fitting with the smooth.spline function in R?
This is straightforward for an interpolating spline, as the array of polynomial coefficients is generated explicitly. However, I’ve not been able to find an answer as to whether this is possible for a smoothing spline or regression spline.
The reason for wanting this is to in turn obtain an analytical expression for the derivative of a segment of a spline.
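One workable sketch: since the fit is piecewise cubic between knots, the segment starting at knot k equals its Taylor expansion a0 + a1(x−k) + a2(x−k)² + a3(x−k)³, where a0, a1 and a2 come from predict()'s deriv argument and a3 from the fact that s″ is linear on each segment. This assumes the internal layout of the smooth.spline.fit object (its knot, min and range components), which is an implementation detail:

```r
set.seed(1)
x <- seq(0, 10, length.out = 200)
y <- sin(x) + rnorm(200, sd = 0.1)
fit <- smooth.spline(x, y)

# Knots, mapped back from smooth.spline's internal [0, 1] scale
k <- unique(fit$fit$min + fit$fit$knot * fit$fit$range)
n <- length(k)

# a0 = s(k), a1 = s'(k), a2 = s''(k)/2; s'' is linear on a cubic
# segment, so a3 = (s''(k[i+1]) - s''(k[i])) / (6 * (k[i+1] - k[i]))
s0 <- predict(fit, k, deriv = 0)$y
s1 <- predict(fit, k, deriv = 1)$y
s2 <- predict(fit, k, deriv = 2)$y
seg <- data.frame(
  left = k[-n],
  a0 = s0[-n],
  a1 = s1[-n],
  a2 = s2[-n] / 2,
  a3 = diff(s2) / (6 * diff(k))
)
head(seg)
```

The analytic derivative of segment i is then a1 + 2·a2·(x − left) + 3·a3·(x − left)², which is exactly the expression wanted for differentiating a segment.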

compare differences between coefficients in different regression equations

I am trying to compare differences between coefficients in different regression equations.
Specifically, I have 2 regressions looking at the effects of Guilt, Perceived Responsibility, and Feeling on Importance to Donate
aov_I <- aov(newdata_I$AV_importance_to_donate~newdata_I$AV_guilty+newdata_I$AV_percieved_resp+feeling_I)
summary(aov_I)
aov_S <- aov(newdata_S$AV_importance_to_donate~newdata_S$AV_guilty+newdata_S$AV_percieved_resp+feeling_S)
summary(aov_S)
I would like to compare the differences between the coefficients in these two different regression equations.
How can I do this??
Thank you so much in advance!
You can view just the coefficients with aov_I$coefficients[2] and aov_S$coefficients[2], combine them into a data frame using cbind, and then just view them with a bar graph if you don't need to do a real statistical comparison.
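If a formal comparison is needed, a common large-sample approach for coefficients from two independent samples is a z-test on their difference, z = (b1 − b2) / sqrt(se1² + se2²). A sketch with simulated stand-in data (the variable names are placeholders for the guilt/responsibility/feeling predictors):

```r
# Two independent groups; lm() is used because aov() is lm()
# underneath, and the coefficient table is easier to read this way.
set.seed(1)
d1 <- data.frame(y = rnorm(50), g = rnorm(50), r = rnorm(50))
d2 <- data.frame(y = rnorm(50), g = rnorm(50), r = rnorm(50))
m1 <- lm(y ~ g + r, data = d1)
m2 <- lm(y ~ g + r, data = d2)

# Pull estimate and standard error for the same coefficient ("g")
b1 <- coef(summary(m1))["g", ]
b2 <- coef(summary(m2))["g", ]

# Large-sample z-test for the difference between the coefficients
z <- (b1[["Estimate"]] - b2[["Estimate"]]) /
  sqrt(b1[["Std. Error"]]^2 + b2[["Std. Error"]]^2)
p <- 2 * pnorm(-abs(z))
c(z = z, p = p)
```

To apply this to the models above, refit them with lm() (same formulas) so that coef(summary(...)) returns the estimate/standard-error table.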

Automatic model creation, for model selection, in polynomial regression in R

Let's imagine that for a target value 'price', I have predictive variables of x, y, z, m, and n.
I have been able to analyse different models that I could fit through the following methods:
Forward, backward, and stepwise selection
Grid and Lasso
KNN (IBk)
For each I got RMSE and MSE for prediction and I can choose the best model.
All these are helpful with linear models.
I'm just wondering whether there is any way to do the same for polynomial regressions (squared, cubic, ...) so I can fit and analyse them on the same dataset as well.
Have you seen the caret package? It's very powerful and wraps a lot of machine-learning models. It can compare different models and also find the best hyperparameters.
http://topepo.github.io/caret/index.html
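In base R, the same selection machinery also works for polynomial terms: expand each predictor with poly() and let step() (or caret's train() with cross-validated RMSE) choose among the resulting models. A sketch assuming predictors x, y, z, m, n and target price, with simulated data:

```r
# Simulated stand-in for the real dataset; the true relationship
# mixes linear, squared and cubic terms for illustration.
set.seed(1)
d <- data.frame(x = rnorm(200), y = rnorm(200), z = rnorm(200),
                m = rnorm(200), n = rnorm(200))
d$price <- 2 * d$x - d$y^2 + 0.5 * d$z^3 + rnorm(200)

# Full model with up-to-cubic terms for every predictor; step()
# then does stepwise (forward + backward) selection by AIC.
full <- lm(price ~ poly(x, 3) + poly(y, 3) + poly(z, 3) +
             poly(m, 3) + poly(n, 3), data = d)
sel <- step(full, direction = "both", trace = 0)
summary(sel)
```

The same formula can be passed to caret::train(..., method = "lm") to get cross-validated RMSE instead of AIC, so polynomial models are compared on exactly the same footing as the linear ones.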

clustering by subjects in Nonlinear Least Squares in R

I would like to cluster standard errors at subject level in a nonlinear least square regression in R. Is it possible to do this with the nls command?
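nls() itself has no clustering option, so one workable sketch is a cluster (subject-level) bootstrap: resample whole subjects with replacement, refit the model on each resample, and take the standard deviation of the estimates as the clustered standard error. The model and data below are illustrative only:

```r
# Simulated panel: 20 subjects, 10 observations each, with a
# subject-level random effect to create within-subject correlation.
set.seed(1)
d <- data.frame(subj = rep(1:20, each = 10), x = runif(200, 0, 5))
d$y <- (3 + rep(rnorm(20, 0, 0.2), each = 10)) * exp(0.4 * d$x) +
  rnorm(200, sd = 0.5)

fit_one <- function(dat) {
  coef(nls(y ~ a * exp(b * x), data = dat, start = list(a = 2, b = 0.3)))
}

# Cluster bootstrap: resample whole subjects, not individual rows
subjects <- unique(d$subj)
boot <- t(replicate(200, {
  ids <- sample(subjects, replace = TRUE)
  fit_one(do.call(rbind, lapply(ids, function(s) d[d$subj == s, ])))
}))
clustered_se <- apply(boot, 2, sd)
clustered_se
```

Resampling at the subject level keeps each subject's observations together, which is what makes the resulting standard errors robust to within-subject correlation.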
