How to specify an objective function in the nls() function in R

I have created an objective function f(x0) that returns the sum of squares between actual and theoretical values for an initial guess x0. How do I specify this minimization problem using the nls() function?
Thanks in advance.
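For what it's worth, a minimal sketch with made-up data and an illustrative model: nls() does not take an objective function the way optim() does; it takes a model formula plus starting values and builds the residual sum of squares internally:
set.seed(1)
x <- 1:20
y <- 2 * exp(0.1 * x) + rnorm(20, sd = 0.2)
# nls() minimizes sum((y - a*exp(b*x))^2) over a and b, starting from `start`
fit <- nls(y ~ a * exp(b * x), start = list(a = 1, b = 0.05))
coef(fit)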

Related

How to calculate Godambe information matrix in R?

I have a likelihood function in R that I am optimizing using 'optim' and calculating the hessian matrix using hessian=T in the optim function. I want to calculate the Godambe Information matrix in R, which is defined as:
G(theta)= H(theta) J(theta)^-1 H(theta)
where J(theta) is the variability matrix and H(theta) is the sensitivity matrix.
I am not sure how to calculate these matrices in R for my likelihood function and the estimates obtained from optim. Please help.
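A rough sketch of one common way to estimate these numerically. This is an assumption about the setup: it supposes optim() minimized a negative log-likelihood with hessian=TRUE and returned `est`, that a hypothetical helper loglik_i(theta, data, i) returns observation i's log-likelihood contribution, and it uses the numDeriv package:
library(numDeriv)

n <- nrow(data)                      # number of observations (assumed)

# Sensitivity matrix H(theta): estimated here by the observed information,
# i.e. the Hessian of the negative log-likelihood at the estimates
H <- est$hessian                     # est is the object returned by optim()

# Variability matrix J(theta): variance of the score, estimated by summing
# the outer products of the per-observation score vectors
scores <- t(sapply(seq_len(n), function(i)
  grad(function(th) loglik_i(th, data, i), est$par)))
J <- crossprod(scores)               # sum over i of s_i %*% t(s_i)

# Godambe information: G(theta) = H(theta) J(theta)^-1 H(theta)
G <- H %*% solve(J) %*% H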

Is there an R function / package that can perform inverse normal distribution? [duplicate]

To plot a normal distribution curve in R we can use:
(x = seq(-4,4, length=100))
y = dnorm(x)
plot(x, y)
If dnorm calculates y as a function of x, does R have a function that calculates x as a function of y? If not, what is the best way to approach this?
What dnorm() is doing is giving you a probability density function. If you integrate over that, you would have a cumulative distribution function (which is given by pnorm() in R). The inverse of the CDF is given by qnorm(); that is the standard way these things are conceptualized in statistics.
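For example, qnorm() undoes pnorm():
p <- pnorm(1.96)   # CDF: P(X <= 1.96) for a standard normal, about 0.975
qnorm(p)           # inverse CDF recovers 1.96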
I'm not sure if the inverse of the density function is built in -- it's not used nearly as often as the inverse of the cumulative distribution function. I can't think offhand of too many situations where the inverse density function is useful. Of course, that doesn't mean there aren't any, so if you are sure this is the function you need, you could just do:
dnorminv <- function(y) sqrt(-2*log(sqrt(2*pi)*y))
plot(x, y)
points(dnorminv(y), y, pch = 3)
The derivation of the inverse of the standard normal pdf: start from y = exp(-x^2/2)/sqrt(2*pi), so sqrt(2*pi)*y = exp(-x^2/2); take logs to get log(sqrt(2*pi)*y) = -x^2/2, and solve for x to get x = sqrt(-2*log(sqrt(2*pi)*y)) (taking the positive branch).

Defining gam function type

I would like to fit a gam model to a dataset while specifying the types of functions to use.
It would be something like:
y ~ cst1 * (s(var1)-s(var2)) * (1 - exp(var3*cst2))
s has to be the same function for both var1 and var2. I don't have a prior idea of the s function family. To summarize, the model would find the constants (cst1 and cst2) plus the function s.
Is it possible? If not, is there another type of model I can use to do what I'm looking for?
Thanks in advance for replies.
This model could be fit with nls, R's non-linear least squares function. This will allow you to model the formula you want directly. The splines will need to be constructed manually, though. This question gets at what you would be trying to do.
As far as getting the splines to be the same for var1 and var2, you can do this by subtracting the basis matrices. Basically you want to compute a coefficient vector A where the term is A * s(var1) - A * s(var2) = A * (s(var1) - s(var2)). You wouldn't want to just do s(var1 - var2); in general, f(x) - f(y) != f(x - y). To do this in R, you would:
1. Compute the spline basis matrices with ns() for var1 and var2, giving them the same knots. You need to specify both the knots and the Boundary.knots parameters so that the two splines will share the same basis.
2. Subtract the two spline basis matrices (the output from the ns() function).
3. Adapt the resulting subtracted spline basis matrix for the nls formula, as they do in the question I linked earlier.
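A minimal sketch of those three steps. The data frame dat with columns y, var1, var2, var3, the knot choices, and the starting values are all assumptions; note also that cst1 is absorbed into the spline coefficients, so it is not estimated as a separate parameter:
library(splines)

# 1. Same knots and boundary knots so the two bases are directly comparable
kn <- quantile(c(dat$var1, dat$var2), probs = c(0.25, 0.5, 0.75))
bk <- range(c(dat$var1, dat$var2))
B1 <- ns(dat$var1, knots = kn, Boundary.knots = bk)
B2 <- ns(dat$var2, knots = kn, Boundary.knots = bk)

# 2. Subtract the two basis matrices (3 interior knots -> 4 columns each)
Bdiff <- B1 - B2

# 3. Use the subtracted basis inside the nls formula; a1..a4 are the shared
#    spline coefficients and cst2 is the rate constant in 1 - exp(var3*cst2)
fit <- nls(y ~ drop(Bdiff %*% c(a1, a2, a3, a4)) * (1 - exp(var3 * cst2)),
           data = dat,
           start = list(a1 = 1, a2 = 1, a3 = 1, a4 = 1, cst2 = -0.1))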

Prediction at a new value using lowess function in R

I am using the lowess function to fit a regression between two variables x and y. Now I want to know the fitted value at a new value of x; for example, how do I find the fitted value at x = 2.5 in the following example? I know loess can do that, but I want to reproduce someone's plot and he used lowess.
set.seed(1)
x <- 1:10
y <- x + rnorm(x)
fit <- lowess(x, y)
plot(x, y)
lines(fit)
Local regression (lowess) is a non-parametric statistical method; it's not like linear regression, where you can use the fitted model directly to estimate new values.
You'll need to take the values from the function (that's why it only returns a list to you), and choose your own interpolation scheme. Use the scheme to predict your new points.
A common technique is spline interpolation (but there are others):
https://www.r-bloggers.com/interpolation-and-smoothing-functions-in-base-r/
EDIT: I'm pretty sure the predict function does the interpolation for you. I also can't find any information about what exactly predict uses, so I've tried to trace the source code.
https://github.com/wch/r-source/blob/af7f52f70101960861e5d995d3a4bec010bc89e6/src/library/stats/R/loess.R
else { ## interpolate
## need to eliminate points outside original range - not in pred_
I'm sure the R code calls the underlying C implementation, but it's not well documented so I don't know what algorithm it uses.
My suggestion is: either trust the predict function or roll your own interpolation algorithm.
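For the concrete example in the question, and under the assumption that interpolating the smoothed points is acceptable, base R can interpolate fit$x and fit$y directly:
approx(fit$x, fit$y, xout = 2.5)$y   # linear interpolation at x = 2.5
spline(fit$x, fit$y, xout = 2.5)$y   # spline interpolation, usually very close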

Numerical integration of numerical function in R

I'm trying to apply this solution to find the p-value in an arbitrary distribution defined from experimental data. I have estimated this distribution using the density function in R. Now, I would like to integrate this function to apply the solution proposed by @mpiktas. However, the integrate function requires a function as input, not the two vectors x and y of values that define the function, which is what density provides.
Any idea on how to deal with this numerical integration based on x-y values in R?
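One common workaround, sketched here under the assumption that the data are in a vector xs and using a hypothetical threshold q, is to turn the density grid into a function with approxfun() and pass that to integrate():
d <- density(xs)
f <- approxfun(d$x, d$y, yleft = 0, yright = 0)  # density as a function, 0 outside the grid
integrate(f, lower = min(d$x), upper = max(d$x)) # sanity check: should be close to 1
q <- 2                                           # example threshold
integrate(f, lower = q, upper = max(d$x))        # upper-tail area beyond q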
