Data Science Asked on January 20, 2021
The only good answer I found was this:
https://stackoverflow.com/questions/33118361/determine-most-important-feature-per-class
However, the approach in that answer doesn't work for binary classification, because coef_ has only a single row (there is only one index i to loop over). Instead, I used the most positive weights to identify the most important features for class 1, and the most negative weights for class 0:
from sklearn.linear_model import SGDClassifier
import numpy as np

myclassifier = SGDClassifier(loss='log', penalty='l1', l1_ratio=0.9, learning_rate='optimal',
                             shuffle=False, fit_intercept=True, verbose=True)
myclassifier.fit(X_train, Y_train)

# Most positive weights -> strongest indicators of class 1
top_indices_class1 = np.argsort(myclassifier.coef_[0])[-100:]
for x in top_indices_class1:
    print(feature_list[int(x)], myclassifier.coef_[0][int(x)])

# Most negative weights -> strongest indicators of class 0
top_indices_class0 = np.argsort(myclassifier.coef_[0])[:100]
for x in top_indices_class0:
    print(feature_list[int(x)], myclassifier.coef_[0][int(x)])
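For reference, here is a minimal, self-contained sketch of the same idea on synthetic data; the dataset, feature names, and classifier settings below are illustrative only, not from my actual problem:

# Illustrative sketch: "top positive / top negative coefficients" on toy binary data
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=500, n_features=20, n_informative=5, random_state=0)
feature_names = [f"feat_{i}" for i in range(X.shape[1])]

clf = SGDClassifier(loss='log', penalty='l1', l1_ratio=0.9, random_state=0)  # loss is named 'log_loss' in newer scikit-learn versions
clf.fit(X, y)

coefs = clf.coef_[0]          # binary problem -> coef_ has a single row
order = np.argsort(coefs)     # ascending: most negative weights first

print("Strongest indicators of class 0 (most negative weights):")
for i in order[:5]:
    print(f"  {feature_names[i]}: {coefs[i]:.3f}")

print("Strongest indicators of class 1 (most positive weights):")
for i in order[-5:][::-1]:
    print(f"  {feature_names[i]}: {coefs[i]:.3f}")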
Is this a good method for finding the most important features for each class? Is there a better way to do it?