TransWikia.com

Training loss is stuck in the starting epochs but then starts decreasing. What could be the reason for it?

Data Science Asked on March 13, 2021

I am training a model and ran into an unusual problem: for the first 4 epochs my loss did not change at all, but after that it started decreasing. Could it be because of a high learning rate, a local minimum, or something else, such as a regularisation parameter being set too high?

One Answer

It could be any of those factors. The root cause can be found through controlled experimentation: hold everything else constant and change a single factor, then systematically repeat this for each factor in turn.
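The one-factor-at-a-time sweep described above can be sketched as follows. This is a minimal illustration using a plain NumPy linear-regression model; the dataset, model, and learning-rate values are illustrative assumptions, not details from the question:

```python
# One-factor-at-a-time sweep: hold the data, model, and initialisation
# fixed, vary only the learning rate, and watch the early-epoch loss.
# (Illustrative toy model, not the asker's actual setup.)
import numpy as np

def train(lr, epochs=10, seed=0):
    rng = np.random.default_rng(seed)        # fixed seed: same data every run
    X = rng.normal(size=(100, 3))
    true_w = np.array([1.0, -2.0, 0.5])
    y = X @ true_w + 0.1 * rng.normal(size=100)
    w = np.zeros(3)                          # same initialisation for every run
    losses = []
    for _ in range(epochs):
        pred = X @ w
        losses.append(float(np.mean((pred - y) ** 2)))
        grad = 2 * X.T @ (pred - y) / len(y)
        w -= lr * grad                       # plain gradient descent step
    return losses

for lr in [1e-4, 1e-2, 1e-1]:                # vary one factor only
    losses = train(lr)
    print(f"lr={lr:g}  first-epoch loss drop={losses[0] - losses[1]:.4f}")
```

If the loss is nearly flat for the smallest learning rate but moves immediately for a larger one, that points at the learning rate; the same protocol can then be repeated for the regularisation strength or the initialisation while everything else stays fixed.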

Answered by Brian Spiering on March 13, 2021
