ESTIMATES OF THE ACCURACY OF LAND COVER MAPPING BY THE MAPBIOMAS VENEZUELA PROJECT

ACCESS TO THE COLLECTION STATISTICS PANEL 1.0 [LINK]

Accuracy analysis is the primary way to assess the quality of the mapping produced by MapBiomas. In addition to reporting the overall hit rate, the accuracy analysis provides estimates of the hit and miss rates for each mapped class. MapBiomas Venezuela evaluated the overall accuracy and the accuracy of each land use and land cover class for every year between 1985 and 2022.

Accuracy estimates were based on the evaluation of a sample of pixels, which we call the reference database, consisting of approximately 71,500 samples. The number of pixels in the reference database was determined in advance using statistical sampling techniques. For each year, every pixel in the reference database was evaluated by technicians trained in the visual interpretation of Landsat images. The accuracy assessment was then performed with metrics that compare the mapped class with the class assigned by the technicians in the reference database.
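The exact sample-size calculation is not reproduced here. Purely as an illustration, and not as the procedure actually used by MapBiomas Venezuela, a common starting point is a Cochran-style formula that targets a margin of error on the overall accuracy; the function name and parameter values below are hypothetical:

    import math

    # Illustrative sketch only: Cochran-style sample size for estimating an overall
    # accuracy with a given margin of error. Parameter values are hypothetical and
    # do not describe the MapBiomas Venezuela sampling design.
    def sample_size(expected_accuracy=0.85, margin_of_error=0.01, z=1.96):
        p = expected_accuracy
        return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

    print(sample_size())  # about 4,899 samples under these hypothetical assumptions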

For each year, the accuracy analysis is performed from the cross-tabulation of the sampled frequencies of the mapped and reference classes, in the format of Table 1. Each frequency $n_{ij}$ is the number of sampled pixels classified as class $i$ by the mapping and evaluated as class $j$ by the technicians. The marginal row totals $n_{i+}$ give the number of samples mapped as class $i$, while the marginal column totals $n_{+j}$ give the number of samples evaluated by the technicians as class $j$. Table 1 is called the error matrix, or confusion matrix.

Table 1: Generic sample error matrix
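The table itself is not reproduced here; a generic layout consistent with the description above, for q classes, is:

    Mapped \ Reference      1        2      ...      q     | Row total
    1                     n_11     n_12     ...    n_1q    |   n_1+
    2                     n_21     n_22     ...    n_2q    |   n_2+
    ...                    ...      ...     ...     ...    |   ...
    q                     n_q1     n_q2     ...    n_qq    |   n_q+
    Column total          n_+1     n_+2     ...    n_+q    |   n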

From the results in Table 1, the sample proportion in each cell of the table is estimated by $\hat{p}_{ij} = n_{ij}/n$, where $n$ is the total number of samples. The matrix of proportions $\hat{p}_{ij}$ is then used to generate:

  1. User's accuracy: these are estimates of the fraction of mapped pixels, in each class, that are correctly classified. User's accuracy is associated with the commission error, which is the error made when assigning a pixel to class i when it actually belongs to another class. The user's accuracy for class i is estimated by $UA_i = \hat{p}_{ii}/\hat{p}_{i+}$ and the commission error by $CE_i = 1 - UA_i$. These metrics are associated with the reliability of each mapped class.
  2. Producer's accuracy: these are the sample fractions of pixels of each class correctly assigned to their class by the classifiers. Producer's accuracy is associated with the omission error, which occurs when a pixel of class j is not assigned to class j. The producer's accuracy for class j is estimated by $PA_j = \hat{p}_{jj}/\hat{p}_{+j}$ and the omission error by $OE_j = 1 - PA_j$. These metrics are associated with the sensitivity of the classifier, that is, its ability to correctly distinguish a particular class from the others.
  3. Overall accuracy: this is the estimate of the overall proportion of correct classifications. It is given by $OA = \sum_i \hat{p}_{ii}$, the sum of the main diagonal of the proportions matrix. The complement of the overall accuracy, the total error $1 - OA$, can be further decomposed into area (quantity) disagreement and assignment (allocation) disagreement¹. Area disagreement measures the fraction of the error attributable to the amount of area misallocated among classes by the mapping, while assignment disagreement measures the fraction of the error due to pixels of different classes being swapped in space. A numerical sketch of these metrics is given after this list.
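As a purely illustrative sketch (not the MapBiomas implementation), the Python snippet below computes these metrics from a small, hypothetical error matrix with NumPy; the class counts are invented for the example:

    import numpy as np

    # Hypothetical 3-class error matrix: rows = mapped class i, columns = reference class j.
    n = np.array([[120,  10,   5],
                  [  8, 200,  12],
                  [  4,  15,  90]], dtype=float)

    p = n / n.sum()                              # proportions matrix: p_ij = n_ij / n
    user_acc     = np.diag(p) / p.sum(axis=1)    # UA_i = p_ii / p_i+
    commission   = 1.0 - user_acc                # commission error of each mapped class
    producer_acc = np.diag(p) / p.sum(axis=0)    # PA_j = p_jj / p_+j
    omission     = 1.0 - producer_acc            # omission error of each reference class
    overall_acc  = np.trace(p)                   # OA = sum of the diagonal proportions

    # Decomposition of the total error 1 - OA (Pontius & Millones, 2011):
    # area (quantity) disagreement plus assignment (allocation) disagreement.
    area_disagreement       = 0.5 * np.abs(p.sum(axis=1) - p.sum(axis=0)).sum()
    assignment_disagreement = (1.0 - overall_acc) - area_disagreement

    print(user_acc, producer_acc, overall_acc,
          area_disagreement, assignment_disagreement)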

The matrix also provides estimates of the different types of errors. For example, it makes it possible to estimate the composition of the area of each mapped class. Thus, in addition to the success rate of the class mapped as forest, for example, we also estimate what fraction of those areas may be pasture or other land use and land cover classes, for each year. We believe this level of transparency informs users and maximizes the usefulness of the mapping for different types of users.
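Continuing the hypothetical matrix from the previous sketch, the estimated composition of the areas mapped as a given class is simply that class's row of the proportions matrix normalized by its row total; the class index below is arbitrary:

    # Reuses the proportions matrix p from the sketch above; class 0 stands in for, e.g., forest.
    forest_row = p[0]
    composition = forest_row / forest_row.sum()  # fraction of the mapped-forest area evaluated as each class
    print(composition)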

1- Pontius Jr, R. G., & Millones, M. (2011). Death to Kappa: birth of quantity disagreement and allocation disagreement for accuracy assessment. International Journal of Remote Sensing, 32(15), 4407-4429.


ABOUT THE GRAPHS

GENERAL STATISTICS

Shows the mean annual overall accuracy and the error decomposed into area disagreement and assignment disagreement.

Graph 1. Annual overall accuracy

This graph shows the overall accuracy and the total error for each year. The total error is broken down into area disagreement and assignment disagreement. Accuracy is plotted at the top of the graph and the errors at the bottom.

Graph 2. Error matrix

This chart shows user's accuracy, producer's accuracy, and class confusion for each year. The first shows the confusions of each mapped class; the second shows the confusions of each reference (actual) class.

Graph 3. Class history

This chart allows you to inspect the confusions of a particular class over time. The user's and producer's accuracy for the class are shown, along with its confusions in each year.

ACCESS TO THE COLLECTION STATISTICS PANEL 1.0 [LINK]