Automatic Differentiation in R

I need to perform optimization using a custom function in R. For the sake of argument, my complicated function is
ComplicatedFunction <- function(X) { X * exp(-X / cummax(X)) }
How would I, in a fast and automated fashion, extract the gradient and Hessian?
A simple example is mean squared error: MSE <- function(X) { mean(X^2) }. The gradient is 2*X/length(X) and the Hessian is a diagonal matrix with 2/length(X) on the diagonal.
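For a scalar-valued function, the numDeriv package can produce both. A minimal sketch, assuming numDeriv is installed (note this is numerical rather than true automatic differentiation):

library(numDeriv)

MSE <- function(X) mean(X^2)

x <- c(1, 2, 3)
grad(MSE, x)     # numerical gradient: 2 * x / length(x)
hessian(MSE, x)  # numerical Hessian: (2 / length(x)) * diag(3)

# ComplicatedFunction is vector-valued, so it has a Jacobian rather than
# a gradient:
ComplicatedFunction <- function(X) X * exp(-X / cummax(X))
jacobian(ComplicatedFunction, x)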

Related

What is the difference between linear programming optimization and gradient descent optimization?

In a linear programming problem we formulate a linear objective function together with linear constraints, find the points where the constraint boundaries intersect, and substitute these values into the objective function to get the max or min.
How is this different from gradient descent optimization? Can anybody elaborate on this mathematically? Do both methods reach the global maximum or minimum? Which is better?
Linear programming finds the weights that optimize a linear combination. It is guaranteed to work, but only for objectives that are linear combinations of the variables.
Gradient descent can work on any function, as long as you know its derivative. However, it is only guaranteed to find the global optimum if the function is convex; otherwise it may get stuck at a local optimum.
So there's really no choice. If you have a linear objective with linear constraints, linear programming is better. In every other case, gradient descent is your only option of the two.
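Gradient descent itself is only a few lines. A minimal sketch on the convex function f(x) = sum(x^2), whose gradient is 2*x (the step size and iteration count here are illustrative, not tuned):

grad_descent <- function(grad, x0, step = 0.1, n_iter = 100) {
  x <- x0
  for (i in seq_len(n_iter)) {
    x <- x - step * grad(x)  # move against the gradient
  }
  x
}

grad_descent(function(x) 2 * x, x0 = c(3, -4))  # converges toward c(0, 0)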

Supplying the Hessian to optim()

I have an objective function (a log likelihood), which I want to maximize using R for a vector of inputs (parameters). I have the gradient of the function (i.e. the score vector), and I also happen to know the Hessian of the function.
In Matlab, I can maximize the function easily, and performance is drastically improved by supplying both the gradient and the Hessian via optimset('GradObj', 'on') and optimset('Hessian', 'on'). In particular, the latter makes a huge difference in this case.
However, I want to do this in R. In optim, I can supply the gradient, but as far as I can tell I can only request the Hessian.
My question: is there a straightforward way of including the Hessian in optimization problems in R, as there is in Matlab?
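For context, a minimal sketch of what base R offers: optim() accepts a gradient via its gr argument but no Hessian, while nlm() will use both if they are attached as attributes to the objective's return value. The toy quadratic below is a hypothetical stand-in for the log likelihood:

f <- function(p) sum((p - c(1, 2))^2)  # stand-in objective (to minimize)
g <- function(p) 2 * (p - c(1, 2))     # its gradient

fgh <- function(p) {
  val <- f(p)
  attr(val, "gradient") <- g(p)
  attr(val, "hessian")  <- diag(2, 2)  # constant Hessian of the quadratic
  val
}

optim(c(0, 0), f, gr = g, method = "BFGS")$par  # gradient only
nlm(fgh, c(0, 0))$estimate                      # gradient + Hessian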

Solving a system of unknowns in terms of unknowns

I am trying to solve a 5x5 Cholesky decomposition (for a variance-covariance matrix) all in terms of unknowns (no constants).
A simplified version, for the sake of giving an example, would be a 2x2 decomposition:
[[a,0],[b,c]]*[[a,b],[0,c]]=[[U1,U2],[U2,U3]]
Is there software (I'm proficient in R, so if R can do it that would be great) that could solve the above to yield an answer for the left-hand variables in terms of the right-hand variables? i.e. this would be the final answer:
a = sqrt(U1)
b = U2/sqrt(U1)
c = sqrt(U3 - U2^2/U1)
Take a look at this Wikipedia section.
The symbolic definition of the (i,j)th entry of the decomposition is given recursively in terms of the entries above and to the left. You could implement these recursions using Matlab's Symbolic Math Toolbox and then apply them (symbolically) to obtain your formulas for the 5x5 case. Be warned that you'll probably end up with extremely complicated formulas for some of the unknowns; barring unusual circumstances, it will be more practical to implement the decomposition iteratively, even for a fixed-size 5x5 matrix.
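In R, the same recurrences are easy to implement numerically. A minimal sketch of the Cholesky-Banachiewicz recursion (the formulas could in principle be applied symbolically as well, e.g. with a computer algebra package):

my_chol <- function(U) {
  n <- nrow(U)
  L <- matrix(0, n, n)
  for (i in 1:n) {
    for (j in 1:i) {
      s <- sum(L[i, seq_len(j - 1)] * L[j, seq_len(j - 1)])
      if (i == j) {
        L[i, j] <- sqrt(U[i, i] - s)        # diagonal entry
      } else {
        L[i, j] <- (U[i, j] - s) / L[j, j]  # below-diagonal entry
      }
    }
  }
  L
}

# Check against the 2x2 example: L %*% t(L) reproduces U
U <- matrix(c(4, 2, 2, 5), 2, 2)
L <- my_chol(U)
all.equal(L %*% t(L), U)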

Inverse Laplace transform in R

I am trying to do some computations using Laplace transforms in R. I used the continued-fractions approach to compute the Laplace transform of a birth-death process, as described in Abate (1999). But I cannot find a simple numerical routine to compute the inverse Laplace transform (evaluated at 0 in my case). Does anyone have ideas on how to do this in R?
Computing inverse Laplace transforms numerically is tricky. I remember seeing some relatively recent results in the ACM literature. Googling around a bit, I found some Python code implementing one of these algorithms. Maybe you can adapt it to your purposes.
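If you would rather stay in R, the classic Gaver-Stehfest algorithm is short enough to write directly. A minimal sketch, assuming F is your transform as an R function of s; note that it cannot be evaluated at t = 0 exactly (use a small positive t instead), and N must be even:

stehfest <- function(F, t, N = 12) {
  stopifnot(N %% 2 == 0, t > 0)
  V <- numeric(N)
  for (k in 1:N) {
    s <- 0
    for (j in floor((k + 1) / 2):min(k, N / 2)) {
      s <- s + j^(N / 2) * factorial(2 * j) /
        (factorial(N / 2 - j) * factorial(j) * factorial(j - 1) *
           factorial(k - j) * factorial(2 * j - k))
    }
    V[k] <- (-1)^(k + N / 2) * s  # Stehfest weights
  }
  log(2) / t * sum(V * sapply(1:N, function(k) F(k * log(2) / t)))
}

# Example: F(s) = 1/(s + 1) is the transform of exp(-t)
stehfest(function(s) 1 / (s + 1), t = 1)  # ~ exp(-1) = 0.3679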

optimization function in R that can accept objective, gradient, AND hessian?

I have a complex objective function that I am looking to optimize. The optimization takes considerable time. Fortunately, I do have the gradient and the Hessian of the function available.
Is there an optimization package in R that can take all three of these inputs? The function optim() does not accept the Hessian. I have scanned the CRAN task view for optimization and nothing stands out.
For what it's worth, I am able to perform the optimization in MATLAB using fminunc with the 'GradObj' and 'Hessian' options.
I think the package trust, which does trust-region optimization, will do the trick. From the documentation of trust:
This function carries out a minimization or maximization of a function
using a trust region algorithm... (it accepts) an R function that
computes value, gradient, and Hessian of the function to be minimized
or maximized and returns them as a list with components value,
gradient, and hessian.
In fact, I think it uses the same algorithm used by fminunc.
By default fminunc chooses the large-scale algorithm if you supply the
gradient in fun and set GradObj to 'on' using optimset. This algorithm
is a subspace trust-region method and is based on the
interior-reflective Newton method described in [2] and [3]. Each
iteration involves the approximate solution of a large linear system
using the method of preconditioned conjugate gradients (PCG). See
Large Scale fminunc Algorithm, Trust-Region Methods for Nonlinear
Minimization and Preconditioned Conjugate Gradient Method.
Both stats::nlm() and stats::nlminb() accept analytic gradients and Hessians. Note, however, that the former (nlm()) currently does not update the analytic gradient correctly, but this is fixed in the current development version of R (since R-devel, svn rev 72555).
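A minimal sketch of trust() on the Rosenbrock function, for which the analytic gradient and Hessian are known (install.packages("trust") if needed):

library(trust)

objfun <- function(x) {
  f <- (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
  g <- c(-2 * (1 - x[1]) - 400 * x[1] * (x[2] - x[1]^2),
         200 * (x[2] - x[1]^2))
  H <- matrix(c(2 - 400 * x[2] + 1200 * x[1]^2, -400 * x[1],
                -400 * x[1], 200), 2, 2)
  list(value = f, gradient = g, hessian = H)
}

res <- trust(objfun, parinit = c(-1.2, 1), rinit = 1, rmax = 5)
res$argument  # close to c(1, 1), the global minimum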
