It's in relation to the mean Average Precision (mAP) quantitative measure from PASCAL VOC 2010.
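For reference, PASCAL VOC 2010 onwards computes AP as the area under the precision-recall curve using all-points interpolation (the earlier 11-point sampling was dropped). A minimal sketch of that calculation, assuming you already have per-detection precision and recall arrays sorted by descending confidence (this is an illustration of the metric, not the exact YOLOv5 implementation):

```python
import numpy as np

def average_precision(recall, precision):
    """AP as the area under the precision-recall curve,
    PASCAL VOC 2010-style (all-points interpolation).

    recall, precision: cumulative values per detection,
    sorted by descending confidence.
    """
    # Pad the curve so it starts at recall 0 and ends at recall 1.
    r = np.concatenate(([0.0], np.asarray(recall, float), [1.0]))
    p = np.concatenate(([0.0], np.asarray(precision, float), [0.0]))

    # Interpolate: make precision monotonically non-increasing
    # by sweeping from right to left.
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])

    # Sum rectangle areas wherever recall changes.
    idx = np.where(r[1:] != r[:-1])[0]
    return float(np.sum((r[idx + 1] - r[idx]) * p[idx + 1]))
```

mAP is then just this AP averaged over all classes (and, for mAP@0.5:0.95, additionally averaged over IoU thresholds from 0.5 to 0.95 in steps of 0.05).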
Hi there,
I'm trying to compare my custom-trained YOLOv5s model against other models, but they don't state mAP@0.5 or mAP@0.5:0.95, only AP. My question is: how do you obtain the AP of your model? I had a look at this Medium explanation:
https://medium.com/@timothycarlen/understanding-the-map-evaluation-metric-for-object-detection-a07fe6962cf3
However, I'm still quite confused about how to calculate it properly.
Does anyone have a clearer, simpler explanation? It would be much appreciated.
Many thanks in advance