Data Science Asked by Morshed on April 26, 2021
I have seen that validation loss is used to avoid overfitting the training set, and cross-validation is used to generalize a model's results.
Are they used for similar purposes or results? If not, how can I use validation loss and cross-validation together on a CNN?
Yes, you are right that validation loss is used to show when an ML model is overfitting its training data. Overfitting shows up when the validation loss stops improving (or starts rising) while the training loss keeps getting lower. Cross-validation, on the other hand, rotates the validation split across the training data; it is useful when there is not enough data to set aside separate training, validation, and test sets. So it is possible to use both of them together to evaluate and improve your model's performance, even though they are different concepts.
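A minimal sketch of combining the two on a CNN, assuming Keras and scikit-learn: the data, architecture, and hyperparameters below are placeholders, so swap in your own dataset and model. Each fold trains a fresh CNN and monitors that fold's validation loss with early stopping, and the fold losses are then averaged for a cross-validated estimate.

```python
import numpy as np
import tensorflow as tf
from sklearn.model_selection import KFold

# Dummy data standing in for your real dataset: 500 grayscale 28x28 images, 10 classes.
X = np.random.rand(500, 28, 28, 1).astype("float32")
y = np.random.randint(0, 10, size=500)

def build_cnn():
    # A small placeholder CNN; replace with your own architecture.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(28, 28, 1)),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

fold_losses = []
kfold = KFold(n_splits=5, shuffle=True, random_state=42)

for fold, (train_idx, val_idx) in enumerate(kfold.split(X), start=1):
    model = build_cnn()  # fresh weights for every fold
    # Early stopping watches this fold's validation loss to curb overfitting.
    early_stop = tf.keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=3, restore_best_weights=True)
    history = model.fit(X[train_idx], y[train_idx],
                        validation_data=(X[val_idx], y[val_idx]),
                        epochs=30, batch_size=32,
                        callbacks=[early_stop], verbose=0)
    best_val_loss = min(history.history["val_loss"])
    fold_losses.append(best_val_loss)
    print(f"Fold {fold}: best val_loss = {best_val_loss:.4f}")

# Averaging over folds gives a more robust estimate of generalization
# than a single train/validation split.
print(f"Mean val_loss across folds: {np.mean(fold_losses):.4f}")
```

Here the validation loss guards against overfitting within each fold, while cross-validation gives you a generalization estimate that does not depend on one particular split.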
Answered by Daniel Olah on April 26, 2021