Data Science: asked by Corse on August 28, 2020
I am trying to run a RandomizedSearchCV for a simple regression task: essentially, two inputs to one output.
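For context, here is a minimal sketch of the kind of setup described, assuming the legacy `tf.keras.wrappers.scikit_learn.KerasRegressor` wrapper (current in TF 2.x as of 2020); `build_model`, the search space, and the placeholder data are all hypothetical, not from the original question:

```python
# Hypothetical setup: RandomizedSearchCV over a small Keras regressor
# (two inputs -> one output, as in the question).
import numpy as np
from sklearn.model_selection import RandomizedSearchCV
from tensorflow import keras
from tensorflow.keras.wrappers.scikit_learn import KerasRegressor  # legacy wrapper

def build_model(units=8, learning_rate=1e-3):
    # Two input features, one continuous output.
    model = keras.Sequential([
        keras.layers.Dense(units, activation="relu", input_shape=(2,)),
        keras.layers.Dense(1),
    ])
    model.compile(
        optimizer=keras.optimizers.Adam(learning_rate=learning_rate),
        loss="mse",
    )
    return model

X = np.random.rand(200, 2)   # placeholder inputs
y = X.sum(axis=1)            # placeholder target

search = RandomizedSearchCV(
    KerasRegressor(build_fn=build_model, epochs=50, verbose=0),
    param_distributions={"units": [4, 8, 16], "learning_rate": [1e-2, 1e-3]},
    n_iter=4,
    scoring="neg_mean_squared_error",  # sklearn maximizes, so MSE is negated
    cv=3,
)
search.fit(X, y)
```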
Upon inspecting the resulting models, it appears that the ‘best’ model has a mean_squared_error score of around -8.3, while other ‘non-best’ models had mean_squared_error scores much closer to zero (around 0.004).
In this case, may I know whether the model selection performed by RandomizedSearchCV is correct? Why does it select a model whose mean_squared_error is not close to zero?
Shouldn’t the best model give an MSE close to zero? I understand that the optimization seeks to minimize the loss, but how should I interpret these scores?
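One way to check what the reported numbers actually mean is to print the per-candidate scores; a sketch, assuming the `search` object from the setup above. Note that with `scoring="neg_mean_squared_error"`, scikit-learn reports the *negated* MSE so that higher is better, meaning a score of -8.3 corresponds to an MSE of 8.3:

```python
# Mean cross-validated score per candidate (negated MSE: closer to 0 is better).
for mean_score, params in zip(search.cv_results_["mean_test_score"],
                              search.cv_results_["params"]):
    print(f"{mean_score:.4f}  {params}")

# The candidate RandomizedSearchCV selects is the one maximizing this score.
print("best:", search.best_score_, search.best_params_)
```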