Automatic Differentiation Library?

Hey people, I'm trying to find a library that does automatic differentiation. I haven't managed to get one working yet — I've run into a few different command-line errors while trying two different libraries.

If anyone has done this successfully before, I'd love to hear which library worked, and I'd probably have some follow-up questions after that.

(I can go and dig up the errors, but the point of this thread is foremost to find out whether anyone here has used any such package, and then to try that one.)
For what purpose do you require a library that does automatic differentiation?

For many purposes (e.g. Newton-Raphson solvers etc.) you can usually get a perfectly serviceable approximation to a derivative by finite differences:
dy/dx is approximately ( y(x+dx) - y(x-dx) ) / ( 2*dx )
where dx is "small".

So, what is your end goal?
Analytical, symbolic, and numerical differentiation all take longer than automatic differentiation.

My goal is to send different mathematical functions to a program and compute, in reasonable time, their gradient, second-order derivatives, and possibly the Hessian.

That didn’t answer the question.
Non-linear optimization. Does that answer the question yet?

OK, so if you want a gradient based approach then you can get derivatives from the finite-difference approximations like
dy/dx is approximately ( y(x+dx) - y(x-dx) ) / ( 2*dx )
d2y/dx2 is approximately ( y(x+dx) - 2*y(x) + y(x-dx) ) / ( dx^2 )
etc.

However, for highly non-linear problems with lots of local minima that gradient-based methods tend to get stuck in, you can look at the modern bio-inspired optimisation techniques such as particle swarm optimisation, genetic algorithms, simulated annealing, etc. Personally, I've found the first of those to be the most useful. I didn't use a library — it's easy enough to code your own. If you need a library, MATLAB has them built in.