Data Science Asked by AylaRT on December 17, 2020
I have a highly imbalanced dataset (± 5% positive instances), for which I am training binary classifiers. I am using nested 5-fold cross-validation with grid search for hyperparameter tuning.
I want to avoid undersampling, so I have been looking into the class_weight hyperparameter. For sklearn's DecisionTreeClassifier this works really well and can simply be included in the grid as a hyperparameter. However, as far as I can tell, this option is not available for sklearn's neural network (multi-layer perceptron). I have therefore been using Keras instead, and I can apply class_weight with GridSearchCV, but not with cross_val_score.
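For illustration, the kind of decision-tree setup that already works might look like the following minimal sketch (the dataset, parameter values and class-weight dict are placeholders):

```python
# Illustrative sketch: class_weight searched as an ordinary hyperparameter of
# a DecisionTreeClassifier inside GridSearchCV on an imbalanced toy dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

# roughly 5% positive instances, as in the question
X, y = make_classification(n_samples=1000, weights=[0.95, 0.05], random_state=0)

param_grid = {
    "max_depth": [3, 5, 10],
    "class_weight": [None, "balanced", {0: 1, 1: 19}],  # ~1:19 class ratio
}

search = GridSearchCV(DecisionTreeClassifier(random_state=0),
                      param_grid, cv=5, scoring="f1")
search.fit(X, y)
print(search.best_params_)
```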
Is there a way to use class_weight in Keras with cross-validation?
The sklearn MLPClassifier does not implement any option for class weights at the moment. There are at least two paths you could follow. One is to write a custom loss function, which lets you stay within the sklearn framework without reaching for Keras. The other is to implement the cross-validation yourself, which is not difficult to do, and run your Keras model on each fold. An example can be found here.
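A minimal sketch of the do-it-yourself cross-validation route might look like this (the architecture, number of epochs, metric and class-weight dict are placeholders to adapt to your data):

```python
# Manual stratified cross-validation with a small Keras model, passing
# class_weight directly to model.fit() on each fold.
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import f1_score
from tensorflow import keras

def build_model(n_features):
    # fresh, untrained model for every fold
    model = keras.Sequential([
        keras.layers.Dense(32, activation="relu", input_shape=(n_features,)),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model

def cv_scores(X, y, class_weight, n_splits=5):
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0)
    scores = []
    for train_idx, test_idx in skf.split(X, y):
        model = build_model(X.shape[1])
        model.fit(X[train_idx], y[train_idx],
                  epochs=20, batch_size=32, verbose=0,
                  class_weight=class_weight)   # up-weights the minority class
        pred = (model.predict(X[test_idx]) > 0.5).astype(int).ravel()
        scores.append(f1_score(y[test_idx], pred))
    return np.array(scores)

# usage, e.g. for ~5% positives:
# scores = cv_scores(X, y, class_weight={0: 1.0, 1: 19.0})
```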
Answered by Simon on December 17, 2020
You should be able to pass class_weight through in the fit_params argument of cross_val_score.
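A rough sketch of that route, assuming the sklearn-style KerasClassifier wrapper (found at keras.wrappers.scikit_learn in older Keras/TensorFlow releases; newer setups use the separate scikeras package instead) and placeholder data, architecture and weights:

```python
# Wrap the Keras model so sklearn can clone and score it, then forward
# class_weight to each fold's fit() call via cross_val_score's fit_params.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from tensorflow import keras
from tensorflow.keras.wrappers.scikit_learn import KerasClassifier

X, y = make_classification(n_samples=1000, n_features=20,
                           weights=[0.95, 0.05], random_state=0)

def build_model():
    model = keras.Sequential([
        keras.layers.Dense(32, activation="relu", input_shape=(20,)),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model

clf = KerasClassifier(build_fn=build_model, epochs=20, batch_size=32, verbose=0)

scores = cross_val_score(
    clf, X, y, cv=5, scoring="f1",
    fit_params={"class_weight": {0: 1.0, 1: 19.0}},  # forwarded to model.fit
)
print(scores)
```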
Answered by Ben Reiniger on December 17, 2020