Geographic Information Systems
Asked by doubtful_noob on September 16, 2020
I know I can get a confusion matrix and an overall accuracy, but is there a way to calculate producer's and user's accuracy for each class separately in Google Earth Engine?
This is my validation dataset: https://code.earthengine.google.com/?asset=users/RohitNandakumar/accuracy
This is my training dataset: https://code.earthengine.google.com/?asset=users/RohitNandakumar/train
var sent = ee.ImageCollection("COPERNICUS/S2_SR"),
    table = ee.FeatureCollection("users/RohitNandakumar/train"),
    geometry =
      /* color: #d63000 */
      /* displayProperties: [{"type": "rectangle"}] */
      ee.Geometry.Polygon(
        [[[79.34242573948666, 29.562958506122595],
          [79.34242573948666, 29.296831501244323],
          [79.79217854710384, 29.296831501244323],
          [79.79217854710384, 29.562958506122595]]], null, false),
    accuracy = ee.FeatureCollection("users/RohitNandakumar/accuracy");
var land = sent.filterDate('2019-04-01', '2019-11-01')
    .filterMetadata("MGRS_TILE", "equals", '44RLT')
    .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 5))
    .sort('system:time_start');
var image = ee.Image(land.toList(land.size()).get(1)).clip(geometry); //.aside(print);
//supervised classification
var label = 'MC_ID';
var bands = ['B2','B3','B4','B5','B6','B7','B8','B8A','B9','B11','B12'];
var training = image.select(bands).sampleRegions({
collection: table,
properties: [label],
scale: 10
});
// Keep a ~10% random sample of the training points to speed up training.
training = training.randomColumn().filter(ee.Filter.lt('random', 0.1));
var trained = ee.Classifier.cart().train(training, label, bands);
var classified = image.select(bands).classify(trained);
var validation = image.select(bands).sampleRegions({
  collection: accuracy,
  properties: [label],
  scale: 10
});
var validated = validation.classify(trained);
var testAccuracy = validated.errorMatrix('MC_ID', 'classification');
print('Validation error matrix: ', testAccuracy);
print('Validation overall accuracy: ', testAccuracy.accuracy());
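Earth Engine's ConfusionMatrix object already exposes per-class figures: `testAccuracy.producersAccuracy()` returns a column of producer's accuracies and `testAccuracy.consumersAccuracy()` returns a row of user's (consumer's) accuracies, so `print(testAccuracy.producersAccuracy())` in the script above should answer the question directly. As a sanity check, the sketch below (plain JavaScript, independent of Earth Engine; the function name and example matrix are made up for illustration) shows how those numbers are derived from the matrix, with rows as reference classes and columns as mapped classes, matching the layout of `errorMatrix('MC_ID', 'classification')`:

```javascript
// Per-class producer's and user's accuracy from a confusion matrix.
// Rows = reference (actual) classes, columns = mapped (predicted) classes.
function classAccuracies(matrix) {
  var n = matrix.length;
  var producers = [];
  var users = [];
  for (var i = 0; i < n; i++) {
    var rowTotal = 0, colTotal = 0;
    for (var j = 0; j < n; j++) {
      rowTotal += matrix[i][j]; // all reference pixels of class i
      colTotal += matrix[j][i]; // all pixels mapped to class i
    }
    producers.push(rowTotal ? matrix[i][i] / rowTotal : 0);
    users.push(colTotal ? matrix[i][i] / colTotal : 0);
  }
  return { producers: producers, users: users };
}

// Hypothetical 2-class matrix (e.g. forest vs. non-forest):
var cm = [
  [45, 5],   // 45 forest reference pixels correct, 5 mapped as non-forest
  [10, 40]   // 10 non-forest reference pixels mapped as forest, 40 correct
];
var acc = classAccuracies(cm);
console.log(acc.producers); // [0.9, 0.8]
console.log(acc.users);     // forest ~0.818 (45/55), non-forest ~0.889 (40/45)
```

Producer's accuracy (diagonal over row total) measures omission error for each class; user's accuracy (diagonal over column total) measures commission error.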
I think you should run reduceResolution() and/or reproject() before the supervised classification, because these bands do not all share the same native resolution. To answer your question: try interpreting the confusion matrix; from it you can see how many pixels of each reference class were assigned to the wrong class.
Answered by Bernard C on September 16, 2020