Computational Science Asked by Zebx on June 20, 2021
Is there a way to estimate the error propagated into a numerical ODE solution by an error in the initial conditions? I suppose this depends on the numerical method used and on the features of the problem, but is there some theorem or general treatment that could be adapted to the problem and the method at hand?
In general, the initial error will grow exponentially even if you solved the ODE exactly. The Lyapunov exponent (related to the Lipschitz constant) tells you how fast.
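As a minimal sketch of this point (the ODE $y' = \lambda y$ and all numbers here are hypothetical choices, not from the question), one can perturb the initial condition and evaluate the *exact* solution at both starting values: the gap grows by a factor $e^{\lambda t}$ with no numerical method involved at all.

```python
import math

# Exact solution of y' = lam * y is y(t) = y0 * exp(lam * t).
# Perturb the initial condition by delta0 and watch the gap grow
# exponentially -- no time-stepping scheme is involved here.
lam = 2.0      # hypothetical growth rate (positive Lyapunov exponent)
y0 = 1.0
delta0 = 1e-8  # error in the initial condition
T = 10.0

def exact(y_init, t):
    return y_init * math.exp(lam * t)

err_T = abs(exact(y0 + delta0, T) - exact(y0, T))
# The gap at time T equals delta0 * e^{lam * T}: the initial error,
# amplified exponentially by the dynamics themselves.
print(err_T, delta0 * math.exp(lam * T))
```

Here the amplification factor $e^{\lambda T} \approx 4.85 \times 10^8$ turns an $10^{-8}$ initial error into an $O(1)$ error at the final time, which is the phenomenon the answer describes.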
Separately, though, the Gronwall lemma is exactly what you are looking for. It lets you bound the error at the final time in terms of the errors introduced in each time step and of how fast the error inherited from previous steps grows across a step. You should easily be able to specialize it to your specific case.
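A small numerical illustration of the continuous Gronwall bound (the right-hand side $f$, the Lipschitz constant, and the RK4 integrator below are my own choices for the sketch, not part of the answer): if $f$ is Lipschitz in $y$ with constant $L$, then two solutions started $\delta_0$ apart satisfy $|y(t) - z(t)| \le \delta_0\, e^{L t}$, and a pair of accurate numerical trajectories should respect that bound.

```python
import math

def rk4(f, y0, t0, t1, n):
    """Classical fourth-order Runge-Kutta integration of y' = f(t, y)."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# f(t, y) = sin(y) is Lipschitz in y with constant L = 1 (|cos(y)| <= 1).
f = lambda t, y: math.sin(y)
L = 1.0
T = 5.0
delta0 = 1e-6  # perturbation of the initial condition

yA = rk4(f, 1.0, 0.0, T, 1000)
yB = rk4(f, 1.0 + delta0, 0.0, T, 1000)

err = abs(yB - yA)
bound = delta0 * math.exp(L * T)  # Gronwall bound: e^{L T} * |delta0|
print(err <= bound)
```

The bound is often very pessimistic (here the trajectories are even attracted toward each other near the stable point $y = \pi$), but it is a rigorous worst case, and the same telescoping argument also covers the per-step truncation errors of the scheme.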
Correct answer by Wolfgang Bangerth on June 20, 2021