Could the autograd or jax packages be used to generate the equivalent of analytic derivatives for OpenMDAO explicit components? That is, something more accurate than finite differences (and perhaps more accurate or more general than the complex-step method), but without the work of deriving and programming the analytic gradients by hand?
I'm not an expert on either of these packages, but they seem to be designed for just this purpose.
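For concreteness, here is the kind of thing I have in mind, sketched in Python with jax (the compute function below is made up for illustration, not an actual OpenMDAO component):

```python
import jax
import jax.numpy as jnp

# A made-up "compute" function standing in for what an OpenMDAO
# explicit component does: outputs as a function of inputs.
def compute(x):
    return jnp.sum(jnp.sin(x) * x**2)

# jax.grad builds a new function that evaluates the exact gradient
# (to machine precision) via automatic differentiation; there is no
# step size to tune, unlike finite differences.
grad_fn = jax.grad(compute)

x = jnp.array([0.5, 1.0, 2.0])
print(grad_fn(x))              # gradient of compute at x
print(jax.jacfwd(compute)(x))  # same result via a forward-mode Jacobian
```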
As you probably know, functions can be represented as an infinite series. For example, f(x) = cos x can be represented as 1 - x^2/2! + x^4/4! - x^6/6! + ... . My question is whether this is ever used practically in programming, for any type of application. I know it can be used; I was just wondering whether it actually is, for serious projects.
Aside from infinite series, there are other representations of functions which can be useful for computing approximations. Asymptotic series, identities involving other "elementary" functions, and interpolation in a table of values are all used in different contexts. Take a look at Abramowitz & Stegun, "Handbook of Mathematical Functions", to get an idea of the variety of possibilities. Also look at the source code of popular libraries or systems such as R, NumPy, SciPy, or Octave to see what approaches the authors of that software have used.
Specifically about series approximations for trigonometric functions, I think that might be a reasonable thing to do, but only if the range of the argument is reduced (via identities) so that it is as small as possible.
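For instance, here is a rough Python sketch of that approach: reduce the argument into [-pi, pi] using periodicity, then sum the Maclaurin series, which converges quickly once the argument is small. (Production math libraries do the range reduction far more carefully than this.)

```python
import math

def cos_series(x, terms=10):
    # Range reduction: bring x into [-pi, pi] via periodicity.
    # Real libraries take much more care here to avoid precision
    # loss for very large arguments.
    x = math.remainder(x, 2.0 * math.pi)

    # Maclaurin series: cos x = 1 - x^2/2! + x^4/4! - ...
    # Each term is derived from the previous one, so no powers or
    # factorials are recomputed.
    result, term = 0.0, 1.0
    for n in range(terms):
        result += term
        term *= -x * x / ((2 * n + 1) * (2 * n + 2))
    return result

print(cos_series(1000.0), math.cos(1000.0))  # should agree closely
```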
Approximation of functions is a great topic; good luck and have fun.
I need to estimate the parameters of a continuous-discrete nonlinear stochastic dynamic system using Kalman filtering techniques.
I'm going to use Julia's ode45() from ODE and implement the Extended Kalman Filter myself to compute the log-likelihood. ODE is written fully in Julia, and ForwardDiff supports differentiation of native Julia functions, including nested differentiation, which I also need because I want to use ForwardDiff inside my EKF implementation.
Will ForwardDiff handle differentiation of a function as involved as the log-likelihood I've described?
ODE.jl is in maintenance mode so I would recommend using DifferentialEquations.jl instead. In the DiffEq FAQ there is an explanation about using ForwardDiff through the ODE solvers. It works, but as in the FAQ I would recommend using sensitivity analysis since that's a better way of calculating the derivatives (it will take a lot less compilation time). But yes, DiffEqParamEstim.jl is a whole repository for parameter estimation of ODEs/SDEs/DAEs/DDEs and it uses ForwardDiff.jl through the solvers.
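To give a feel for why this works at all: forward-mode AD just pushes dual numbers through ordinary arithmetic, so any solver written in plain, generic code becomes differentiable. Here is a toy illustration of that idea (in Python for brevity; it bears no resemblance to the real ForwardDiff internals): a dual number carries the derivative with respect to a parameter p through explicit-Euler steps of dx/dt = -p*x.

```python
# Toy forward-mode AD: a (value, derivative) pair pushed through an
# explicit-Euler solve of dx/dt = -p * x. This mirrors, in spirit,
# what ForwardDiff.jl does when differentiating through an ODE solver.

class Dual:
    def __init__(self, val, der):
        self.val, self.der = val, der
    def __add__(self, other):
        return Dual(self.val + other.val, self.der + other.der)
    def __mul__(self, other):
        # Product rule for the derivative part.
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

def solve(p, x0=1.0, dt=0.001, steps=1000):
    x = Dual(x0, 0.0)               # the initial state does not depend on p
    minus_dt = Dual(-dt, 0.0)       # constants carry a zero derivative
    for _ in range(steps):
        x = x + minus_dt * (p * x)  # x += dt * (-p * x)
    return x

p = Dual(2.0, 1.0)                  # seed derivative: dp/dp = 1
x = solve(p)
# Exact answers at t = 1: x = exp(-2), dx/dp = -exp(-2)
print(x.val, x.der)
```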
(BTW, what you're looking to do sounds interesting. Feel free to get in touch with us in the JuliaDiffEq channel to talk about the development of parameter estimation tooling!)
I'm currently looking for a Lua alternative to the R programming language's optim() function. Does anyone know how to deal with this?
http://numlua.luaforge.net/ looks interesting but doesn't seem to have minimization. The most promising lead seems to be a Lua wrapper for GSL, which has a variety of multidimensional minimization algorithms included.
With derivatives
- BFGS (method="BFGS" in optim) and two conjugate gradient methods (Fletcher-Reeves and Polak-Ribière), which are two of the three options available for method="CG" in optim.
Without derivatives
- the Nelder-Mead simplex (method="Nelder-Mead", the default in optim).
More specifically, see that wrapper's documentation covering minimization.
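For comparison of interfaces: whichever wrapper you land on, you're looking for an API roughly shaped like SciPy's minimize in Python, which exposes the same algorithms named above (a function, a starting point, and a method name). A sketch for orientation only:

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function, a standard optimization test problem.
def f(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

x0 = np.array([-1.2, 1.0])

# Gradient-based, analogous to optim's method="BFGS" or method="CG":
print(minimize(f, x0, method="BFGS").x)

# Derivative-free, analogous to optim's default method="Nelder-Mead":
print(minimize(f, x0, method="Nelder-Mead").x)
```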
I agree with @Zack that you should try to use existing implementations if at all possible, and that you might need a bit more background knowledge to know which algorithms will be useful for your particular problems ...
R's implementation of optim isn't actually written in R. If you type "optim" with no parentheses at the prompt, it'll dump out the definition of the function, and you can see that after some error checking and argument shuffling it invokes an .Internal routine (coded in C and/or Fortran) to do all the real work.
So your best bet is to find a C library for mathematical optimization -- sorry, I have no recommendations -- and wrap that into Lua. I doubt anyone has written native-Lua code for this, and I would not recommend trying to code it yourself; doing mathematical optimization efficiently is still an active domain of basic research, and the best-so-far algorithms are decidedly nontrivial to implement.
My application involves some parabolic partial differential equations, which are interrelated and use some variables that the user inputs via a UI in a desktop application.
Can you guide me as to which software, library, or language would serve this purpose best?
Maybe the Python language, with:
- PyQt for the UI
- SciPy for scientific computing
Or MATLAB, or one of its free counterparts: GNU Octave, Scilab, or FreeMat.
Or just try it in the Wolfram Alpha web UI:
http://www.wolframalpha.com/input/?i=X^2%2B2x%2B1%3D0
Or Wolfram Mathematica 8.
Since you said "equations", I'll assume there's more than one and that they're coupled. It's highly unlikely that you'll find a closed-form solution for a problem that difficult.
When I hear "parabolic PDE", the prototype for me is transient diffusion. That usually means numerical integration forward in time using an explicit Euler (requires small steps; only conditionally stable), implicit Euler, or Crank-Nicolson integration scheme.
I'd discretize using finite element methods and weighted residuals. This is how you turn those PDEs into matrix equations.
Once both of those are decided upon, you'll have a set of linear algebra problems to solve repeatedly for each time step. You can use any good linear algebra library you have available in the language of your choice.
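As a concrete, if simplified, illustration, here is the 1D heat equation u_t = alpha * u_xx stepped with Crank-Nicolson in Python; the per-step linear solve is exactly the repeated linear algebra problem described above. (For brevity this uses a finite-difference spatial discretization rather than finite elements.)

```python
import numpy as np

# 1D heat equation u_t = alpha * u_xx on [0, 1], u = 0 at both ends.
# Central differences in space, Crank-Nicolson in time: every time
# step is one linear solve, A @ u_new = B @ u_old.
alpha, nx, nt = 1.0, 50, 200
dx, dt = 1.0 / (nx + 1), 0.001
r = alpha * dt / (2.0 * dx**2)

# Tridiagonal second-difference operator (boundaries eliminated).
L = (np.diag(-2.0 * np.ones(nx)) +
     np.diag(np.ones(nx - 1), 1) +
     np.diag(np.ones(nx - 1), -1))

I = np.eye(nx)
A = I - r * L   # implicit half of the step
B = I + r * L   # explicit half of the step

# Initial condition: a Gaussian bump in the middle of the domain.
x = np.linspace(dx, 1.0 - dx, nx)
u = np.exp(-200.0 * (x - 0.5)**2)

for _ in range(nt):
    u = np.linalg.solve(A, B @ u)   # one Crank-Nicolson step

print(u.max())  # the peak decays as the bump diffuses
```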
Maybe MATLAB or Octave, its open source cousin, could help you here.
I'm checking a simple moving-average crossing strategy in R. Instead of running a huge simulation over the 2-dimensional parameter space (length of the short-term moving average, length of the long-term moving average), I'd like to implement the Particle Swarm Optimization algorithm to find the optimal parameter values. I've been browsing the web and read that this algorithm is very effective. Moreover, the way the algorithm works fascinates me...
Does anybody have experience with implementing this algorithm in R? Are there useful packages that can be used?
Thanks a lot for your comments.
Martin
Well, there is a package available on CRAN called pso, and indeed it is a particle swarm optimizer (PSO).
I recommend this package.
It is under active development (last update 22 Sep 2010) and is consistent with the reference implementation of PSO. In addition, the package includes functions for diagnostics and plotting results.
It certainly appears to be a sophisticated package, yet the main function interface (the function psoptim) is straightforward: just pass in a few parameters that describe your problem domain, plus a cost function.
More precisely, the key arguments to pass in when you call psoptim are:
- the dimensions of the problem, as a vector (par);
- lower and upper bounds for each variable (lower, upper); and
- a cost function (fn).
There are other parameters in the psoptim method signature; those are generally related to convergence criteria and the like.
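Since you mentioned being curious about how the algorithm works: stripped to its core, PSO is just the velocity-and-position update in the loop below. This is a bare-bones sketch (in Python rather than R) for intuition only; psoptim is a far more careful implementation with proper convergence handling.

```python
import numpy as np

def pso(fn, lower, upper, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Bare-bones particle swarm minimizer, for intuition only."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    dim = len(lower)

    x = rng.uniform(lower, upper, size=(n_particles, dim))  # positions
    v = np.zeros((n_particles, dim))                        # velocities

    pbest = x.copy()                           # each particle's best position
    pbest_val = np.apply_along_axis(fn, 1, x)  # and the value there
    g = pbest[pbest_val.argmin()].copy()       # global best of the swarm

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Core PSO update: inertia plus random pulls toward the
        # personal and global best positions seen so far.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lower, upper)

        vals = np.apply_along_axis(fn, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()

    return g, fn(g)

# Example: minimize a shifted sphere function over [-5, 5]^2.
best, val = pso(lambda z: np.sum((z - 1.0)**2), [-5.0, -5.0], [5.0, 5.0])
print(best, val)
```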
Are there any other PSO implementations in R?
There is an R package called ppso for parallel PSO. It is available on R-Forge. I do not know anything about this package; I have downloaded it and skimmed the documentation, but that's it.
Beyond those two, none that I am aware of. About three months ago, I looked for R implementations of the more popular meta-heuristics, and this is the only PSO implementation I found. The R bindings to the GNU Scientific Library (GSL) have a simulated annealing algorithm, but none of the biologically inspired meta-heuristics.
The other place to look is of course the CRAN Task View for Optimization. I did not find another PSO implementation beyond what I've cited here, though there are quite a few packages listed there, and most of them I did not check beyond the name and one-sentence summary.