Confusion Matrix Metrics:
A 4x4 confusion matrix is a table that visualizes the performance of a classification model with four classes. Each row typically represents an actual class and each column a predicted class: diagonal entries count correct predictions, and off-diagonal entries count misclassifications.
The calculator computes the following metrics for each class, treating that class as "positive" in a one-vs-rest fashion:

Precision = TP / (TP + FP)
Recall (Sensitivity) = TP / (TP + FN)
Specificity = TN / (TN + FP)
F1-score = 2 x (Precision x Recall) / (Precision + Recall)
Overall Accuracy = (sum of diagonal entries) / (total count)

Where, for a given class:
TP (true positives) = the diagonal entry for that class
FN (false negatives) = the rest of that class's row (actual members predicted as other classes)
FP (false positives) = the rest of that class's column (other classes predicted as this one)
TN (true negatives) = all remaining entries in the matrix
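For illustration, here is a minimal Python sketch of these per-class calculations using NumPy. The matrix counts are hypothetical, and this is only one way to compute the metrics, not the calculator's own code:

import numpy as np

# Rows = actual class, columns = predicted class (hypothetical counts).
cm = np.array([
    [50,  2,  1,  0],
    [ 3, 45,  4,  2],
    [ 0,  5, 40,  3],
    [ 1,  0,  2, 47],
])

total = cm.sum()
print(f"Overall accuracy: {np.trace(cm) / total:.3f}")

for i in range(cm.shape[0]):
    tp = cm[i, i]                 # correct predictions for class i
    fn = cm[i, :].sum() - tp      # actual i predicted as something else
    fp = cm[:, i].sum() - tp      # other classes predicted as i
    tn = total - tp - fp - fn     # everything else
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    print(f"Class {i}: precision={precision:.3f} recall={recall:.3f} "
          f"specificity={specificity:.3f} F1={f1:.3f}")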
Details: Confusion matrices help identify which classes are being confused with others, allowing for targeted model improvement. They provide more insight than simple accuracy metrics.
Tips: Enter the counts of actual vs predicted classifications in the 4x4 grid. The calculator will compute metrics for each class and overall accuracy.
Q1: What's the difference between precision and recall?
A: Precision measures how many selected items are relevant, while recall measures how many relevant items are selected.
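For example (hypothetical numbers): if a model predicts class A for 10 items and 8 of them are truly A, precision is 8/10 = 0.8; if the dataset contains 16 true A items in total, recall is 8/16 = 0.5.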
Q2: When should I use F1-score?
A: F1-score is useful when you want to balance precision and recall, especially with imbalanced datasets.
Q3: What does specificity measure?
A: Specificity measures the proportion of actual negatives that are correctly identified.
Q4: How do I interpret off-diagonal elements?
A: Off-diagonal elements show misclassifications: which classes are being confused with which others.
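For example, a large count in the row for class A and the column for class B means many actual A items are being predicted as B, suggesting the model struggles to separate those two classes.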
Q5: Can I use this for binary classification?
A: While you can, a 2x2 matrix would be simpler for binary classification cases.