
How to interpret the confusion matrix and compare the results of feature extraction with LBP and Haralick

Data Science Asked by Abdelmalek Mallek on May 28, 2021

I’m a beginner in deep learning, so I tried to run a liveness face detection project from GitHub (https://github.com/imironica/liveness). When I ran feature extraction with LBP and Haralick, I got several results as confusion matrices, like this:
1. Test features with LBP:

Train Nearest neighbors (3)
Accuracy: 0.7621621621621621
Confusion matrix:
 [[1625   45]
 [ 703  772]]

Train SGD
Accuracy: 0.8158982511923688
Confusion matrix:
 [[1588   82]
 [ 497  978]]

Train Naive Bayes
Accuracy: 0.8756756756756757
Confusion matrix:
 [[1544  126]
 [ 265 1210]]

Train Decision Tree Classifier 
Accuracy: 0.6791732909379968
Confusion matrix:
 [[1569  101]
 [ 908  567]]

Train Adaboost Classifier 
Accuracy: 0.6515103338632751
Confusion matrix:
 [[1630   40]
 [1056  419]]

Train Gradient Boosting Classifier
Accuracy: 0.6419713831478537
Confusion matrix:
 [[1622   48]
 [1078  397]]

Train Random Forest Classifier
Accuracy: 0.7106518282988871
Confusion matrix:
 [[1641   29]
 [ 881  594]]

Train Extremelly Trees Classifier
Accuracy: 0.724006359300477
Confusion matrix:
 [[1638   32]
 [ 836  639]]


Train Linear SVM with C=1 
Accuracy: 0.8076311605723371
Confusion matrix:
 [[1593   77]
 [ 528  947]]

Train SVM with C=10 
Accuracy: 0.8082670906200318
Confusion matrix:
 [[1591   79]
 [ 524  951]]
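For interpreting these matrices: assuming the repo uses `sklearn.metrics.confusion_matrix`, row *i* counts samples whose true class is *i* and column *j* counts samples predicted as class *j*, so for two classes the layout is `[[TN, FP], [FN, TP]]`. As a minimal sketch (the variable names are my own, not the repo's), the reported accuracy can be recomputed from the first LBP matrix:

```python
import numpy as np

# Confusion matrix for Nearest neighbors (3) on LBP features,
# in sklearn's layout: rows = true class, columns = predicted class.
cm = np.array([[1625,  45],
               [ 703, 772]])

tn, fp = cm[0]
fn, tp = cm[1]

accuracy = (tn + tp) / cm.sum()   # fraction of all samples classified correctly
precision = tp / (tp + fp)        # of samples predicted positive, how many truly are
recall = tp / (tp + fn)           # of truly positive samples, how many were found

print(accuracy)   # equals the reported 0.7621621621621621
print(precision, recall)
```

The off-diagonal cells are what accuracy alone hides: here 703 positive samples are misclassified, which is why the recall is much lower than the precision.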

2. And when I tested the features with Haralick, I got:

Train Nearest neighbors (3)
Nearest neighbors (3): 0.753577106518283
[[1580   90]
 [ 685  790]]

Train SGD
SGD: 0.7424483306836248
[[1598   72]
 [ 738  737]]

Train Naive Bayes
Naive Bayes: 0.7093799682034976
[[1521  149]
 [ 765  710]]

Train Decision Tree Classifier
Decision Tree Classifier: 0.6511923688394277
[[1527  143]
 [ 954  521]]

Train Adaboost Classifier
Adaboost Classifier: 0.763751987281399
[[1537  133]
 [ 610  865]]

Train Gradient Boosting Classifier
Gradient Boosting Classifier: 0.7939586645468999
[[1544  126]
 [ 522  953]]

Train Random Forest Classifier
Random Forest Classifier: 0.7007949125596185
[[1564  106]
 [ 835  640]]

Train Extremelly Trees Classifier
Extremelly Trees Classifier: 0.7364069952305247
[[1543  127]
 [ 702  773]]

Train Linear SVM with C=0.01
Linear SVM with C=0.01: 0.5303656597774244

Train Linear SVM with C=0.1
Linear SVM with C=0.1: 0.729093799682035

Train Linear SVM with C=1
Linear SVM with C=1: 0.7335453100158983

Train Linear SVM with C=10
Linear SVM with C=10: 0.7869634340222575

Train Linear SVM with C=100
Linear SVM with C=100: 0.7939586645468999

Train Linear SVM with C=500
Linear SVM with C=500: 0.7936406995230525

Train Linear SVM with C=1000
Linear SVM with C=1000: 0.7968203497615263

Train Linear SVM with C=2000
Linear SVM with C=2000: 0.794912559618442

Train SVM with C=0.01
SVM with C=0.01: 0.5310015898251192

Train SVM with C=0.1
SVM with C=0.1: 0.5306836248012718

Train SVM with C=1
SVM with C=1: 0.7297297297297297

Train SVM with C=10
SVM with C=10: 0.7456279809220986

Train SVM with C=100
SVM with C=100: 0.7802861685214626

Train SVM with C=500
SVM with C=500: 0.7910969793322734

Train SVM with C=1000
SVM with C=1000: 0.7926868044515103

Train SVM with C=2000
SVM with C=2000: 0.7939586645468999

Train SVM with C=4000
SVM with C=4000: 0.7926868044515103

Train SVM with C=10000
SVM with C=10000: 0.7930047694753577

Train SVM with C=2000000
SVM with C=2000000: 0.7364069952305247
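One thing that may help when comparing runs like these (a sketch of my own, not part of the repo): judging by the row sums of the matrices, the test set is imbalanced (1670 vs 1475 samples), so plain accuracy can mask a weak detection rate on the smaller class. Balanced accuracy averages the per-class recalls instead:

```python
import numpy as np

# A few confusion matrices from the runs above
# (rows = true class, columns = predicted class).
results = {
    "LBP / Naive Bayes":            np.array([[1544, 126], [ 265, 1210]]),
    "LBP / Gradient Boosting":      np.array([[1622,  48], [1078,  397]]),
    "Haralick / Gradient Boosting": np.array([[1544, 126], [ 522,  953]]),
}

for name, cm in results.items():
    accuracy = cm.trace() / cm.sum()
    per_class_recall = cm.diagonal() / cm.sum(axis=1)  # recall of each true class
    balanced = per_class_recall.mean()
    print(f"{name}: accuracy={accuracy:.3f}, balanced accuracy={balanced:.3f}")
```

For example, Gradient Boosting on LBP recovers only 397 of 1475 positives, so its balanced accuracy drops below its plain accuracy, while the same classifier on Haralick features holds up much better.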

So now I’m trying to understand: what is the difference between LBP and Haralick for feature extraction, and which one is the better choice?
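For background on the difference: LBP describes local texture by thresholding each pixel's neighbours against the pixel itself and histogramming the resulting binary codes, while Haralick features are statistics (contrast, correlation, entropy, etc.) computed from the gray-level co-occurrence matrix; which works better depends on the data, which is why runs like the above are compared on held-out scores. A minimal pure-NumPy sketch of the basic 3x3 LBP code (illustrative only; the repo's actual implementation may differ):

```python
import numpy as np

def lbp_codes(img: np.ndarray) -> np.ndarray:
    """Basic 8-neighbour LBP: each interior pixel gets one bit per
    neighbour that is >= the centre pixel, clockwise from the top-left."""
    h, w = img.shape
    center = img[1:-1, 1:-1]
    codes = np.zeros_like(center, dtype=np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               ( 1,  1), ( 1, 0), ( 1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neighbour >= center).astype(np.uint8) << bit
    return codes

# The LBP feature vector is then a histogram of these codes over the image.
img = np.arange(9, dtype=np.uint8).reshape(3, 3) + 1   # [[1,2,3],[4,5,6],[7,8,9]]
hist, _ = np.histogram(lbp_codes(img), bins=256, range=(0, 256))
```

For the tiny example image, the single interior pixel (value 5) is exceeded by its right, bottom-right, bottom, and bottom-left neighbours, giving code 8 + 16 + 32 + 64 = 120.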
