Computational Science Asked by Sal L on August 18, 2021
Suppose I perform an optimization with a Newton-based solver, where the Hessian and gradient of the objective are known analytically. Could I then use a package such as Adept to compute, in reverse mode, a Jacobian matrix of the result with respect to the optimization parameters? Since reverse mode records every operation, would memory usage become excessive if the optimization takes several thousand steps to converge?
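To make the memory concern concrete, here is a minimal sketch of a tape-based reverse-mode AD, written in Python rather than with Adept's actual C++ API. The `Var`/`tape`/`grad` names are hypothetical illustrations, not Adept identifiers. It records a Newton iteration for solving x² = p and then differentiates the converged result with respect to p; the point is that the tape grows with every recorded solver step, which is exactly why taping thousands of iterations can be costly.

```python
import math

tape = []  # list of (output_id, [(input_id, local_partial), ...])

class Var:
    """A taped scalar: every arithmetic op appends one record to the tape."""
    _next_id = 0
    def __init__(self, value, parents=()):
        self.value = value
        self.id = Var._next_id
        Var._next_id += 1
        tape.append((self.id, list(parents)))
    def _wrap(self, other):
        return other if isinstance(other, Var) else Var(other)
    def __add__(self, other):
        other = self._wrap(other)
        return Var(self.value + other.value, [(self.id, 1.0), (other.id, 1.0)])
    __radd__ = __add__
    def __sub__(self, other):
        other = self._wrap(other)
        return Var(self.value - other.value, [(self.id, 1.0), (other.id, -1.0)])
    def __mul__(self, other):
        other = self._wrap(other)
        return Var(self.value * other.value,
                   [(self.id, other.value), (other.id, self.value)])
    __rmul__ = __mul__
    def __truediv__(self, other):
        other = self._wrap(other)
        return Var(self.value / other.value,
                   [(self.id, 1.0 / other.value),
                    (other.id, -self.value / other.value ** 2)])

def grad(output, wrt):
    """Reverse sweep: propagate adjoints from the output back along the tape."""
    adj = {output.id: 1.0}
    for out_id, parents in reversed(tape):
        a = adj.get(out_id, 0.0)
        if a == 0.0:
            continue
        for in_id, partial in parents:
            adj[in_id] = adj.get(in_id, 0.0) + a * partial
    return adj.get(wrt.id, 0.0)

# Newton iteration for f(x) = x^2 - p, i.e. x* = sqrt(p), recorded on the tape.
p = Var(2.0)        # "optimization parameter"
x = Var(1.0)        # initial guess
n_steps = 30
for _ in range(n_steps):
    x = x - (x * x - p) / (2.0 * x)   # each step adds several tape records

dxdp = grad(x, p)   # reverse-mode derivative of the result w.r.t. p
print("tape length:", len(tape))
print("x* =", x.value, " dx*/dp =", dxdp)
```

The derivative recovered through the whole iteration agrees with the analytic value d√p/dp = 1/(2√p), but the tape length scales linearly with the number of recorded steps. This is why differentiating only the converged solution via the implicit function theorem, rather than taping the full solver loop, is the usual way to bound memory.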