Classification | Performance metrics |
---|---|
Accuracy | (TP+TN)/(TP+TN+FP+FN): Accuracy is the proportion of all predictions that are correct, i.e. how often the classifier is right overall. |
Precision | TP/(TP+FP): Precision is the proportion of positive class predictions that actually belong to the positive class. |
Recall | TP/(TP+FN): Recall is the proportion of actual positive examples in the dataset that the classifier correctly predicts as positive. |
F1 score | 2/(1/Precision + 1/Recall): The F1 score is the harmonic mean of precision and recall, providing a single number that balances both concerns. |
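The formulas above can be computed directly from the four confusion-matrix counts. A minimal sketch in Python (the function name and the example counts are illustrative, not from any particular library):

```python
def classification_metrics(tp, tn, fp, fn):
    """Compute accuracy, precision, recall, and F1 from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)   # fraction of all predictions that are correct
    precision = tp / (tp + fp)                    # fraction of predicted positives that are correct
    recall = tp / (tp + fn)                       # fraction of actual positives that were found
    f1 = 2 / (1 / precision + 1 / recall)         # harmonic mean of precision and recall
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# Hypothetical counts: 80 true positives, 90 true negatives,
# 10 false positives, 20 false negatives
m = classification_metrics(tp=80, tn=90, fp=10, fn=20)
```

Note that `precision` and `recall` divide by zero when the classifier predicts no positives or the dataset contains none, so real implementations usually guard those cases.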