Geographic Information Systems Asked on May 30, 2021
I am trying to calculate the difference between two rasters. If I use the Raster Calculator in QGIS I get the correct result (verified manually with the information tool by comparing pixel values in the new layer against the input layers). When I use the Raster Difference tool from the SAGA tools (within QGIS) I get a very similar but not identical result once you look at the decimal places. What is the difference between these two tools? I would use the Raster Calculator, but the method I have been directed to uses the Raster Difference tool, so I am wondering why they output slightly different results from the same inputs.
If your rasters store values with very high floating-point precision, each tool may round or truncate those values differently before doing the comparison.
So say your raster value is:
2.839899999999999999999999999999999999999999999999999999999
Each tool will do the processing with a certain precision.
Perhaps QGIS will use:
2.83989999999999999999999
And SAGA will use:
2.839899999999
The results will be slightly different. Depending on your application, though, the end result may be effectively the same: if you are looking at an NDVI value of 0.10000001 vs 0.10000002, there is no meaningful difference.
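This is not necessarily how QGIS or SAGA are implemented internally, but a minimal sketch of the general effect: the same subtraction carried out at two different floating-point precisions (here NumPy's float64 vs float32) produces answers that agree to several decimal places but differ in the trailing digits. The pixel values are made up for illustration.

```python
import numpy as np

# Hypothetical pixel values from the two input rasters.
a = 2.83989999999999
b = 1.1

# The same difference computed at two precisions.
diff64 = np.float64(a) - np.float64(b)  # double precision
diff32 = np.float32(a) - np.float32(b)  # single precision

print(diff64)
print(diff32)

# The results agree closely but are not bit-identical.
print(float(diff64) == float(diff32))  # False: they differ in trailing digits
```

Two tools that keep different numbers of significant digits behave the same way: near-identical output that diverges at the last few decimal places.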
To avoid this, you could convert your data to integers (no decimal places) by multiplying by a fixed factor, say 10000, doing the processing, and then converting back to decimals by dividing by 10000 again.
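That workflow can be sketched like this (the arrays and the scale factor of 10000 are assumptions for illustration; in practice you would apply the same scaling with the raster calculator or a GDAL/NumPy script on your actual bands):

```python
import numpy as np

SCALE = 10_000  # assumed factor; choose one that keeps the precision you need

# Hypothetical pixel values from two rasters (e.g. NDVI-like values).
a = np.array([0.10000001, 0.5], dtype=np.float64)
b = np.array([0.10000002, 0.2], dtype=np.float64)

# Scale to integers, subtract exactly, then scale back to decimals.
a_int = np.round(a * SCALE).astype(np.int64)
b_int = np.round(b * SCALE).astype(np.int64)
diff = (a_int - b_int) / SCALE

print(diff)  # differences rounded to 4 decimal places
```

Because the subtraction happens on integers, any tool applying this scheme with the same factor will produce exactly the same output; the trade-off is that precision beyond the chosen factor (here, 4 decimal places) is deliberately discarded.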
But if the QGIS tool works for you, there is no reason not to use it: both tools are doing the same thing.
Answered by HeikkiVesanto on May 30, 2021