AI-generated Key Takeaways

- The `ConfusionMatrix.accuracy()` method computes the overall accuracy of a confusion matrix.
- Overall accuracy is defined as the number of correct predictions divided by the total number of predictions.
- The method returns a floating-point number representing the calculated accuracy.
- The JavaScript and Python code examples demonstrate how to calculate overall accuracy and other metrics like consumer's accuracy, producer's accuracy, and the Kappa statistic from a confusion matrix.
| Usage | Returns |
|---|---|
| `ConfusionMatrix.accuracy()` | Float |

| Argument | Type | Details |
|---|---|---|
| `this: confusionMatrix` | ConfusionMatrix | |
Examples
Code Editor (JavaScript)
```javascript
// Construct a confusion matrix from an array (rows are actual values,
// columns are predicted values). We construct a confusion matrix here for
// brevity and clear visualization; in most applications the confusion matrix
// will be generated from ee.Classifier.confusionMatrix.
var array = ee.Array([[32, 0, 0, 0, 1, 0],
                      [0, 5, 0, 0, 1, 0],
                      [0, 0, 1, 3, 0, 0],
                      [0, 1, 4, 26, 8, 0],
                      [0, 0, 0, 7, 15, 0],
                      [0, 0, 0, 1, 0, 5]]);
var confusionMatrix = ee.ConfusionMatrix(array);
print('Constructed confusion matrix', confusionMatrix);

// Calculate overall accuracy.
print('Overall accuracy', confusionMatrix.accuracy());

// Calculate consumer's accuracy, also known as user's accuracy or
// specificity and the complement of commission error (1 − commission error).
print("Consumer's accuracy", confusionMatrix.consumersAccuracy());

// Calculate producer's accuracy, also known as sensitivity and the
// complement of omission error (1 − omission error).
print("Producer's accuracy", confusionMatrix.producersAccuracy());

// Calculate kappa statistic.
print('Kappa statistic', confusionMatrix.kappa());
```
Colab (Python)

```python
import ee
import geemap.core as geemap

# Construct a confusion matrix from an array (rows are actual values,
# columns are predicted values). We construct a confusion matrix here for
# brevity and clear visualization; in most applications the confusion matrix
# will be generated from ee.Classifier.confusionMatrix.
array = ee.Array([[32, 0, 0, 0, 1, 0],
                  [0, 5, 0, 0, 1, 0],
                  [0, 0, 1, 3, 0, 0],
                  [0, 1, 4, 26, 8, 0],
                  [0, 0, 0, 7, 15, 0],
                  [0, 0, 0, 1, 0, 5]])
confusion_matrix = ee.ConfusionMatrix(array)
display('Constructed confusion matrix:', confusion_matrix)

# Calculate overall accuracy.
display('Overall accuracy:', confusion_matrix.accuracy())

# Calculate consumer's accuracy, also known as user's accuracy or
# specificity and the complement of commission error (1 − commission error).
display("Consumer's accuracy:", confusion_matrix.consumersAccuracy())

# Calculate producer's accuracy, also known as sensitivity and the
# complement of omission error (1 − omission error).
display("Producer's accuracy:", confusion_matrix.producersAccuracy())

# Calculate kappa statistic.
display('Kappa statistic:', confusion_matrix.kappa())
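The same metrics can be checked locally with plain NumPy. This sketch is not part of the Earth Engine API; it applies the standard formulas (overall accuracy = trace / total, consumer's accuracy = diagonal / column totals, producer's accuracy = diagonal / row totals, and Cohen's kappa) to the same 6×6 example matrix, with rows as actual values and columns as predicted values.

```python
import numpy as np

# Same matrix as the examples above: rows are actual, columns are predicted.
m = np.array([[32, 0, 0, 0, 1, 0],
              [0, 5, 0, 0, 1, 0],
              [0, 0, 1, 3, 0, 0],
              [0, 1, 4, 26, 8, 0],
              [0, 0, 0, 7, 15, 0],
              [0, 0, 0, 1, 0, 5]])

total = m.sum()          # total number of predictions
correct = np.trace(m)    # correct predictions lie on the diagonal

# Overall accuracy: correct predictions divided by total predictions.
overall = correct / total

# Consumer's (user's) accuracy: diagonal over column totals, per predicted class.
consumers = np.diag(m) / m.sum(axis=0)

# Producer's accuracy: diagonal over row totals, per actual class.
producers = np.diag(m) / m.sum(axis=1)

# Kappa: agreement beyond what class frequencies alone would produce by chance.
expected = (m.sum(axis=0) * m.sum(axis=1)).sum() / total**2
kappa = (overall - expected) / (1 - expected)

print(round(overall, 4))  # 0.7636
print(round(kappa, 4))    # 0.681
```

This is a quick sanity check for small matrices; for matrices generated by `ee.Classifier.confusionMatrix` on large datasets, use the server-side methods shown above rather than pulling the matrix to the client.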

