CV Evaluation Metrics MCQ
IoU for overlap, AP under PR curves for detection—pick metrics that match the task and failure costs.
- IoU: overlap
- AP / mAP: PR curve
- P & R: tradeoff
- Confusion matrix
Metrics that match the task
Classification uses accuracy, precision, recall, F1, and ROC-AUC. Object detection pairs predictions to ground truth via IoU thresholds, then averages precision across recall levels (AP) and across classes (mAP). Segmentation reports mean IoU across classes; ranking tasks need NDCG. Always align the metric with the business cost of errors (false positives vs. misses).
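The classification metrics above fall out of confusion-matrix counts. A minimal sketch, using hypothetical counts for illustration:

```python
# Precision, recall, and F1 from raw confusion-matrix counts.
def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp) if (tp + fp) else 0.0  # TP/(TP+FP)
    recall = tp / (tp + fn) if (tp + fn) else 0.0     # TP/(TP+FN)
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)           # harmonic mean
    return precision, recall, f1

# Hypothetical counts: 80 true positives, 20 false positives, 40 misses.
p, r, f1 = precision_recall_f1(tp=80, fp=20, fn=40)
print(p, r, f1)  # 0.8, ~0.667, ~0.727
```

Note how F1 penalizes imbalance: the recall of 0.667 drags F1 well below the 0.8 precision.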
Why mAP
Summarizes the quality of ranked detections across IoU cutoffs and recall—better than single-threshold accuracy.
Key ideas
- IoU: intersection over union for boxes or masks.
- Precision / Recall: TP/(TP+FP) vs TP/(TP+FN).
- AP: area under the precision-recall curve after sorting detections by score.
- mAP: mean AP over classes (and sometimes over IoU thresholds).
- Detection match: NMS → sort by score → greedy match to GT with IoU ≥ threshold.
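The IoU and greedy-matching steps above can be sketched as follows, assuming boxes are (x1, y1, x2, y2) tuples and detections arrive already NMS-filtered and sorted by descending score:

```python
# IoU between two axis-aligned boxes given as (x1, y1, x2, y2).
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Greedily match each detection (sorted by score) to the best unused
# ground-truth box with IoU >= thr; unmatched detections are FPs (None).
def greedy_match(dets, gts, thr=0.5):
    matched, used = [], set()
    for d in dets:
        best, best_iou = None, thr
        for j, g in enumerate(gts):
            ov = iou(d, g)
            if j not in used and ov >= best_iou:
                best, best_iou = j, ov
        if best is not None:
            used.add(best)
        matched.append(best)
    return matched
```

Unmatched ground-truth boxes after this loop count as false negatives, which is what feeds the recall axis of the PR curve.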