Data Science Asked by Adam Murphy on August 18, 2021
The point of EarlyStopping is to stop training once validation loss (or some other monitored metric) stops improving.
If I have set EarlyStopping(patience=10, restore_best_weights=False), Keras returns the model after it has trained for 10 extra epochs beyond the point where val_loss reached its minimum. Why would I ever want this? Has this model not just trained for 10 unnecessary epochs? Wouldn't it make more sense to give me back the model that had the lowest validation loss, i.e. with restore_best_weights=True?
Would love to hear situations where doing those extra 10 epochs of training is better than not doing them.
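To make the difference concrete, here is a minimal plain-Python sketch of the patience logic (an illustration of the idea, not the actual Keras implementation; "weights" are stood in for by the epoch index):

```python
def early_stop_train(val_losses, patience, restore_best_weights):
    """Simulate early stopping over a precomputed sequence of per-epoch
    validation losses. Returns the epoch whose weights the model ends
    up with, using the epoch index as a stand-in for the weights."""
    best_loss = float("inf")
    best_epoch = None
    last_epoch = None
    wait = 0  # epochs since the last improvement
    for epoch, loss in enumerate(val_losses):
        last_epoch = epoch
        if loss < best_loss:
            best_loss, best_epoch, wait = loss, epoch, 0
        else:
            wait += 1
            if wait >= patience:
                break  # patience exhausted: stop training here
    # With restore_best_weights=False the model keeps the weights from
    # the final (stopped) epoch; with True it rolls back to the best one.
    return best_epoch if restore_best_weights else last_epoch

# val_loss bottoms out at epoch 2, then worsens for 3 epochs
losses = [0.9, 0.7, 0.5, 0.6, 0.7, 0.8]
print(early_stop_train(losses, patience=3, restore_best_weights=True))   # epoch 2
print(early_stop_train(losses, patience=3, restore_best_weights=False))  # epoch 5
```

With restore_best_weights=False you end up with the epoch-5 weights, whose validation loss (0.8) is worse than the epoch-2 minimum (0.5), which is exactly the behaviour the question is asking about.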