Data Science Asked on May 6, 2021
I have trained a deep neural network for a regression problem, and after hundreds of epochs the model predicts the same output for every sample, on both the training and the test data.
When I reduce the batch size, at least I no longer get an identical value for all samples.
My guess is that the model is not actually training because the gradients die somewhere in the network. If that is the case, I'm not sure why it happens here, since my architecture has only 3 CNN layers followed by 3 dense layers.
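To make the dying-gradient hypothesis concrete, here is a minimal sketch of how one could check it. The question does not include code or name a framework, so the model below is a hypothetical PyTorch stand-in for the "3 CNN + 3 DNN" architecture: after one backward pass, print each layer's gradient norm and see whether the early convolutional layers are near zero.

```python
# Minimal sketch (hypothetical model; the question does not include code):
# inspect per-layer gradient norms after a backward pass to see whether
# gradients are vanishing in the earlier (CNN) layers.
import torch
import torch.nn as nn

# Stand-in for the "3 CNN + 3 DNN" architecture described above.
model = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv1d(8, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv1d(8, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(8 * 16, 32), nn.ReLU(),
    nn.Linear(32, 16), nn.ReLU(),
    nn.Linear(16, 1),  # single regression output
)

x = torch.randn(4, 1, 16)  # dummy batch of 4 samples, sequence length 16
y = torch.randn(4, 1)      # dummy regression targets
loss = nn.functional.mse_loss(model(x), y)
loss.backward()

# If the conv layers show norms near 0 while the last linear layer does
# not, the early layers are effectively frozen and the network can only
# output a (nearly) constant value.
for name, p in model.named_parameters():
    print(f"{name}: grad norm = {p.grad.norm().item():.6f}")
```

If the printed norms collapse toward zero in the first layers, common remedies to try include lowering the learning rate, normalizing the inputs and targets, or swapping ReLU for LeakyReLU to avoid dead units.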
Regards.