Cross Validated Asked by visionEnthusiast on January 28, 2021
I am trying to evaluate a model using the COCO dataset metrics. For some of the metrics I get results I can interpret. For example:
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.546
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.620
But for a few of them I am getting -1 as the result. For example:
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = -1
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = -1
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = -1
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = -1
I am not sure how to interpret -1 for AP or AR.
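For context, pycocotools reports -1 for a metric when there is nothing to evaluate in that bucket: the summary mean is taken only over entries greater than -1, and if the ground truth contains no objects in a given area range (here, no medium or large objects), every entry stays at -1 and the printed summary is -1. The sketch below (the function name `summarize` is a hypothetical stand-in, not the actual pycocotools API) mimics that averaging rule under this assumption:

```python
# Hypothetical sketch of the averaging rule behind COCOeval's summary table:
# entries for (IoU threshold, category, ...) cells with no ground-truth
# objects are left at -1, and the reported metric averages only valid cells.
def summarize(values):
    valid = [v for v in values if v > -1]
    # No ground-truth objects in this area range -> nothing valid to average.
    return -1.0 if not valid else sum(valid) / len(valid)

print(summarize([0.5, 0.6]))    # mean over the valid entries
print(summarize([-1.0, -1.0]))  # prints -1.0: no GT in this area range
```

So -1 is a sentinel for "undefined on this data", not a score of negative one.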