Compute confusion matrix to evaluate the accuracy of a classification.
By definition a confusion matrix C is such that C_{i,j} is equal to the number of observations known to be in group i and predicted to be in group j.
Thus in binary classification, the count of true negatives is C_{0,0}, false negatives is C_{1,0}, true positives is C_{1,1} and false positives is C_{0,1}.
Read more in the User Guide.
Parameters:

y_true : array-like of shape (n_samples,)
    Ground truth (correct) target values.

y_pred : array-like of shape (n_samples,)
    Estimated targets as returned by a classifier.

labels : array-like of shape (n_classes,), default=None
    List of labels to index the matrix. This may be used to reorder or select a subset of labels. If None is given, those that appear at least once in y_true or y_pred are used in sorted order.

sample_weight : array-like of shape (n_samples,), default=None
    Sample weights.
    New in version 0.18.

normalize : {'true', 'pred', 'all'}, default=None
    Normalizes confusion matrix over the true (rows), predicted (columns) conditions or all the population. If None, confusion matrix will not be normalized.
Returns:

C : ndarray of shape (n_classes, n_classes)
    Confusion matrix whose i-th row and j-th column entry indicates the number of samples with true label being i-th class and predicted label being j-th class.
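As a quick illustration of the normalize parameter (the toy labels below are made up for the example), normalizing over the true condition divides each row by its row sum:

```python
from sklearn.metrics import confusion_matrix

# Illustrative labels; normalize='true' makes each row sum to 1
y_true = [0, 0, 1, 1, 1]
y_pred = [0, 1, 1, 1, 0]
cm = confusion_matrix(y_true, y_pred, normalize='true')
print(cm)
```

With normalize='pred' the columns sum to 1 instead, and with normalize='all' the whole matrix sums to 1.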
Examples
>>> from sklearn.metrics import confusion_matrix
>>> y_true = [2, 0, 2, 2, 0, 1]
>>> y_pred = [0, 0, 2, 2, 0, 2]
>>> confusion_matrix(y_true, y_pred)
array([[2, 0, 0],
       [0, 0, 1],
       [1, 0, 2]])
>>> y_true = ["cat", "ant", "cat", "cat", "ant", "bird"]
>>> y_pred = ["ant", "ant", "cat", "cat", "ant", "cat"]
>>> confusion_matrix(y_true, y_pred, labels=["ant", "bird", "cat"])
array([[2, 0, 0],
       [0, 0, 1],
       [1, 0, 2]])
In the binary case, we can extract true positives, etc. as follows:
>>> tn, fp, fn, tp = confusion_matrix([0, 1, 0, 1], [1, 1, 1, 0]).ravel()
>>> (tn, fp, fn, tp)
(0, 2, 1, 1)
How do you get a confusion matrix?
To calculate a confusion matrix for binary classification:
1. Construct your table.
2. Enter the predicted positive and negative values.
3. Enter the actual positive and negative values.
4. Determine the accuracy rate.
5. Calculate the misclassification rate.
6. Find the true positive rate.
7. Determine the true negative rate.
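Steps 4-7 above can be sketched directly from the four cells of a binary confusion matrix; the cell counts below are made-up example values:

```python
# Binary confusion matrix cells (rows = actual, columns = predicted);
# the counts are illustrative, not from a real model
tn, fp = 3, 1   # actual negative: predicted negative / predicted positive
fn, tp = 2, 4   # actual positive: predicted negative / predicted positive

total = tn + fp + fn + tp
accuracy = (tp + tn) / total            # fraction of correct predictions
misclassification = (fp + fn) / total   # equivalently 1 - accuracy
tpr = tp / (tp + fn)                    # true positive rate (sensitivity/recall)
tnr = tn / (tn + fp)                    # true negative rate (specificity)

print(accuracy, misclassification, tpr, tnr)
```

The same four counts are what `confusion_matrix(...).ravel()` returns in the binary case shown earlier.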
How do you get the confusion matrix in scikit-learn?
In order to get a confusion matrix in scikit-learn:
1. Run a classification algorithm: classifier.fit(X_train, y_train).
2. Import metrics from the sklearn module.
3. Run the confusion matrix function on actual and predicted values.
4. Plot the confusion matrix.
5. Inspect the classification report.
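The steps above can be sketched end to end; the dataset, classifier choice, and split parameters here are illustrative assumptions, not a prescribed setup:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.model_selection import train_test_split

# 1. Fit a classifier on a small synthetic dataset
X, y = make_classification(n_samples=200, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
classifier = LogisticRegression().fit(X_train, y_train)

# 2-3. Compute the confusion matrix on actual vs. predicted values
y_pred = classifier.predict(X_test)
cm = confusion_matrix(y_test, y_pred)
print(cm)

# 5. Inspect the classification report (precision, recall, f1 per class)
print(classification_report(y_test, y_pred))
```

Any fitted classifier with a predict method can be substituted for the logistic regression used here.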
How do you make a confusion matrix in Python without scikit-learn?
You can derive the confusion matrix by counting the number of instances in each combination of actual and predicted classes, for example:

import numpy as np

def comp_confmat(actual, predicted):
    actual = np.asarray(actual)
    predicted = np.asarray(predicted)
    # extract the different classes
    classes = np.unique(actual)
    # initialize the confusion matrix
    confmat = np.zeros((len(classes), len(classes)), dtype=int)
    # count samples falling into each (actual, predicted) cell
    for i, a in enumerate(classes):
        for j, p in enumerate(classes):
            confmat[i, j] = np.sum((actual == a) & (predicted == p))
    return confmat
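As a self-contained check of this counting approach (the input values are taken from the doctest examples above), a plain loop reproduces sklearn's result when the labels are already 0..n_classes-1:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = [2, 0, 2, 2, 0, 1]
y_pred = [0, 0, 2, 2, 0, 2]

n_classes = len(np.unique(y_true))
cm = np.zeros((n_classes, n_classes), dtype=int)
for a, p in zip(y_true, y_pred):
    cm[a, p] += 1  # row = actual class, column = predicted class

# agrees with sklearn's confusion_matrix on the same inputs
assert (cm == confusion_matrix(y_true, y_pred)).all()
print(cm)
```

For non-integer or non-contiguous labels, map each label to an index first (as the comp_confmat sketch above does via np.unique).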
How do you make a confusion matrix in a Jupyter notebook?
For example, computing the matrix with scikit-learn and drawing it with seaborn:

import seaborn as sns
from sklearn.metrics import confusion_matrix

conf_mat = confusion_matrix(y_test, y_pred)
sns.heatmap(conf_mat, square=True, annot=True, cmap='Blues', fmt='d', cbar=False)
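Alternatively, scikit-learn ships its own plotting helper, ConfusionMatrixDisplay, which needs only matplotlib; the sample labels below are made up for illustration:

```python
import matplotlib
matplotlib.use('Agg')  # headless backend for scripts; not needed inside a notebook
import matplotlib.pyplot as plt
from sklearn.metrics import ConfusionMatrixDisplay, confusion_matrix

y_test = [0, 1, 0, 1, 1, 0]  # illustrative values
y_pred = [0, 1, 1, 1, 0, 0]
cm = confusion_matrix(y_test, y_pred)

# plot the matrix with human-readable class names
disp = ConfusionMatrixDisplay(confusion_matrix=cm,
                              display_labels=['negative', 'positive'])
disp.plot(cmap='Blues')
plt.close('all')  # release the figure when not displaying interactively
```

In a notebook, the figure renders inline and plt.close('all') can be dropped.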
How do you make a confusion matrix in Keras?
Here's what you'll do:
1. Create the Keras TensorBoard callback to log basic metrics.
2. Create a Keras LambdaCallback to log the confusion matrix at the end of every epoch.
3. Train the model using Model.fit(), making sure to pass both callbacks.
What is the right syntax to plot a confusion matrix?
To plot a confusion matrix for binary classes with labels, create a list of the labels and convert it into an array using the np.asarray() method with shape (2, 2). Then pass this array of labels to the annot argument of sns.heatmap(). This will plot the confusion matrix with the labels as annotations.
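A minimal sketch of this labeled-annotation technique, assuming seaborn and matplotlib are installed (the label values are illustrative):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless backend; not needed in a notebook
import seaborn as sns
from sklearn.metrics import confusion_matrix

y_test = [0, 1, 0, 1, 1, 0]  # illustrative values
y_pred = [0, 1, 1, 1, 0, 0]
cm = confusion_matrix(y_test, y_pred)

# build a (2, 2) array of string annotations: cell name plus its count
names = np.asarray([['TN', 'FP'], ['FN', 'TP']])
labels = np.asarray([f'{n}\n{v}' for n, v in zip(names.ravel(), cm.ravel())]).reshape(2, 2)

# fmt='' is required when annot holds strings rather than numbers
ax = sns.heatmap(cm, annot=labels, fmt='', cmap='Blues')
```

Passing fmt='' tells heatmap to render the strings verbatim instead of formatting them as numbers.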