TransWikia.com

Why is there different accuracy for two identically trained models?

Data Science Asked on January 25, 2021

I trained the same model twice on the same dataset with the same hyperparameters (epochs, batch size, learning rate, etc.), using identical code for both runs. Yet the two trained models show different train and test accuracy on the same dataset.

Why do the two models show different accuracy?

  1. Train the model

    Test accuracy = 87.98%

  2. Train the model again

    Test accuracy = 67.18%

One Answer

Somewhere in the code there is a parameter that is initialised randomly; the starting point of that randomness is usually controlled by a random seed. The different random initialisation of your neural network's weights could be affecting the results, or the k-fold splits may differ between runs if they are drawn at random. The size of the gap between the two runs suggests that you may not have enough data to reliably learn a good model, or that your model is under-specified and gets stuck in different local minima. Consider changing the learning rate and other hyperparameters and see whether the effect persists.
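As a minimal sketch of the seed issue, assuming NumPy stands in for a framework's weight initialiser (real frameworks have their own seeding calls, e.g. `tf.random.set_seed` in TensorFlow or `torch.manual_seed` in PyTorch): without a fixed seed, two "identical" runs start from different weights, so training can end at different minima and accuracies; fixing the seed makes the runs start identically.

```python
import numpy as np

def init_weights(seed=None):
    # Hypothetical stand-in for a network's weight initialiser:
    # each call draws a fresh random weight matrix.
    rng = np.random.default_rng(seed)
    return rng.normal(size=(4, 3))

# Without a fixed seed, two "identical" runs begin from different weights,
# so their training trajectories (and final accuracies) can diverge.
a = init_weights()
b = init_weights()
print(np.array_equal(a, b))  # almost surely False

# Fixing the seed makes both runs start from exactly the same weights.
c = init_weights(seed=42)
d = init_weights(seed=42)
print(np.array_equal(c, d))  # True
```

Note that even with a fixed seed, exact run-to-run reproducibility on a GPU may additionally require framework-specific deterministic settings, since some parallel operations are non-deterministic.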

Answered by LuckyLuke on January 25, 2021
