Data Science Asked on December 26, 2021
I understand building a ROC curve when the output is a probability, say, from a logistic regression model. You can build a ROC curve by varying the cutoff threshold on that probability.
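For concreteness, here is a minimal sketch of that threshold sweep (assuming numpy, 0/1 labels y_true, and predicted probabilities p; the function name roc_points is just illustrative):

    # Illustrative sketch: sweep a cutoff over predicted probabilities
    # and collect one (FPR, TPR) point per threshold.
    import numpy as np

    def roc_points(y_true, p, thresholds=np.linspace(0, 1, 101)):
        points = []
        for t in thresholds:
            pred = (p >= t).astype(int)                   # apply the cutoff
            tp = np.sum((pred == 1) & (y_true == 1))
            fp = np.sum((pred == 1) & (y_true == 0))
            fn = np.sum((pred == 0) & (y_true == 1))
            tn = np.sum((pred == 0) & (y_true == 0))
            tpr = tp / (tp + fn) if (tp + fn) else 0.0    # sensitivity
            fpr = fp / (fp + tn) if (fp + tn) else 0.0    # 1 - specificity
            points.append((fpr, tpr))
        return points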
But what about decision trees of the form:
if attribute_1 > x:
    decision = positive
else:
    if attribute_2 < y:
        decision = positive
    else:
        decision = negative
You can adjust the cutoffs for both attributes, and each one affects the confusion matrix. Does it make sense to build a ROC curve when there are multiple thresholds?
Thanks
The ROC curve has no relation to how your model works internally, only to its outputs. If the target is binary and your model outputs scores between 0 and 1 (e.g. [0, 0.2, 0.4, ..., 1] or continuous probabilities), it makes sense to build a ROC curve. If instead the only outputs of your model are 0 or 1, the ROC curve is essentially useless, since the hard predictions give you only a single operating point, and computing simpler metrics like precision and recall makes more sense.
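To illustrate the point with a decision tree, here is a minimal sketch (assuming scikit-learn, a binary target, and synthetic data from make_classification): using the tree's leaf class fractions via predict_proba as scores yields a full ROC curve, while hard 0/1 predictions give only a single point, for which precision and recall are the natural summary.

    # Minimal sketch: ROC from a tree's probability outputs vs. hard predictions.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import roc_curve, roc_auc_score, precision_score, recall_score

    X, y = make_classification(n_samples=1000, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

    # Continuous scores: class-1 fraction in each leaf -> a full ROC curve.
    scores = tree.predict_proba(X_test)[:, 1]
    fpr, tpr, thresholds = roc_curve(y_test, scores)
    print("AUC from leaf probabilities:", roc_auc_score(y_test, scores))

    # Hard 0/1 predictions: one operating point, so precision/recall are more informative.
    hard = tree.predict(X_test)
    print("Precision:", precision_score(y_test, hard), "Recall:", recall_score(y_test, hard))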
Answered by Henrique Nader on December 26, 2021