
Can automatic differentiation be used on the parameters of an optimization problem?

Computational Science Asked by Sal L on August 18, 2021

If I perform an optimization with a Newton-based solver, where the gradient and Hessian of the objective are known analytically, can I then use a reverse-mode automatic-differentiation package such as Adept to compute the Jacobian of the solution with respect to the optimization parameters? And since reverse mode must record a tape of every operation it differentiates through, would memory usage become excessive if the optimization takes several thousand steps to converge?
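To make the question concrete, here is a minimal sketch in plain Python (not Adept/C++) on a made-up toy problem. It propagates the sensitivity of the iterate through every Newton step, which mimics what an AD tool does when it differentiates *through* the solver, and then compares against the implicit-function theorem, which differentiates the optimality condition once at the converged point and therefore needs no per-iteration tape. All names and the test problem are invented for illustration.

```python
# Toy problem: minimize f(x; p) = x**4/4 - p*x, so the optimum x*(p)
# satisfies g(x) = f'(x) = x**3 - p = 0, and the exact sensitivity is
# dx*/dp = 1 / (3 * x*(p)**2).

def newton_solve_with_sensitivity(p, x=1.0, tol=1e-12, max_iter=100):
    """Newton iteration on g(x) = x^3 - p, propagating dx/dp alongside.

    This is the "differentiate through the solver" approach: every
    iteration contributes terms, which is why a reverse-mode tape over
    thousands of steps can grow large in memory.
    """
    dx_dp = 0.0  # sensitivity of the current iterate w.r.t. p
    for _ in range(max_iter):
        g = x**3 - p          # gradient of f
        h = 3.0 * x**2        # Hessian of f
        # Differentiate the Newton update x <- x - g/h w.r.t. p:
        dg_dp = 3.0 * x**2 * dx_dp - 1.0
        dh_dp = 6.0 * x * dx_dp
        dx_dp = dx_dp - (dg_dp * h - g * dh_dp) / h**2
        x = x - g / h
        if abs(g) < tol:
            break
    return x, dx_dp

p = 8.0
x_star, dxdp_unrolled = newton_solve_with_sensitivity(p)

# Implicit-function theorem alternative: differentiate g(x*(p), p) = 0
# once, at the solution only -- dx*/dp = -(dg/dp) / (dg/dx) = 1/(3 x*^2).
dxdp_implicit = 1.0 / (3.0 * x_star**2)

print(x_star)         # ~ 2.0 (cube root of 8)
print(dxdp_unrolled)  # ~ 1/12
print(dxdp_implicit)  # ~ 1/12
```

Both routes agree on this problem: because Newton's iteration map has zero derivative with respect to the iterate at the fixed point, the unrolled sensitivity converges to the implicit-function value. The memory question in the original post is exactly the difference between these two routes: taping all steps versus one linear solve at the converged point.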

