Does training of neural networks follow the same order in each epoch?

Data Science, asked on February 8, 2021

Each epoch starts from the weights left at the end of the previous epoch (correct me if I am wrong). Is the updating of parameters after each batch always done in the same order? To rephrase: are the batches always presented in the same order? Could this bias the learning, and are there any adaptations that deal with this?
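For context, a minimal sketch of the training loop being asked about, written in PyTorch (the model, data, and hyperparameters are illustrative toys, not taken from the question):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Toy regression data; shapes and sizes are arbitrary.
X, y = torch.randn(256, 10), torch.randn(256, 1)
loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=False)

model = nn.Linear(10, 1)  # the weights persist across epochs
opt = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for epoch in range(5):      # each epoch resumes from the weights
    for xb, yb in loader:   # left by the previous epoch
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()     # gradients from this batch only
        opt.step()          # all parameters updated together
```

With shuffle=False as above, the batches arrive in the same order every epoch, which is exactly the situation the question is concerned about.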

One Answer

All of the parameters are updated together after each batch, so there is no notion of an update order among the parameters themselves; the question is really about the order of the batches.

The batches may or may not arrive in the same order, depending on how the data is fed. A fixed order can bias learning, in the sense that the network can overfit to the ordering of the dataset. An easy workaround is to shuffle the data between epochs, as sketched below.
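In PyTorch, for instance, this is a one-line change: setting shuffle=True on the DataLoader draws a fresh random permutation of the dataset at the start of every epoch (a minimal, self-contained sketch with toy data):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

X, y = torch.randn(256, 10), torch.randn(256, 1)  # toy data

# shuffle=True re-permutes the dataset each epoch, so both the
# composition and the order of the batches change between epochs.
loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)
```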

Answered by Chopin on February 8, 2021
