Data Science
Asked by yatu on February 1, 2021
I see it is possible to add a weight for unbalanced problems in XGBoost's Scikit-Learn API through scale_pos_weight. Does it have an equivalent in the Learning API? If not, is there a workaround?
Yes, you can use scale_pos_weight in the native Python API as well; it goes in the params dictionary. E.g.,
import xgboost

# Assumes X (features) and y (binary 0/1 labels) already exist
dmat = xgboost.DMatrix(X, label=y)

params = {'objective': 'binary:logistic',
          'scale_pos_weight': 2.5}
model = xgboost.train(params, dmat)
https://xgboost.readthedocs.io/en/latest/parameter.html#parameters-for-tree-booster
https://github.com/dmlc/xgboost/blob/master/demo/kaggle-higgs/speedtest.py
Answered by Ben Reiniger on February 1, 2021