We know that there is a lag (and a large window of uncertainty) between when the algorithm detects mining in a given pixel and when that mining actually occurred. It would be helpful to quantify the size and dispersion of that error.
A preliminary thought is to look at the trajectory of NDVI within a given year for certain pixels, using NDVI from individual scenes rather than composites. However, this might be challenging to automate, given the cloudiness of most summer Landsat scenes in Appalachia.
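As a rough starting point, here's a minimal sketch of the per-scene idea using the Earth Engine Python API. The point coordinates, date range, and choice of Landsat 5 Collection 2 Level-2 are all assumptions for illustration; the QA_PIXEL cloud/shadow masking shown here only partially mitigates the cloudiness problem mentioned above.

```python
import ee
ee.Initialize()

# Hypothetical pixel of interest (southern WV) and hypothetical year
point = ee.Geometry.Point(-81.7, 37.9)

def mask_clouds(img):
    # Collection 2 QA_PIXEL: bit 3 = cloud, bit 4 = cloud shadow
    qa = img.select('QA_PIXEL')
    clear = (qa.bitwiseAnd(1 << 3).eq(0)
             .And(qa.bitwiseAnd(1 << 4).eq(0)))
    return img.updateMask(clear)

def add_ndvi(img):
    # TM surface reflectance: NIR = SR_B4, red = SR_B3
    return img.addBands(
        img.normalizedDifference(['SR_B4', 'SR_B3']).rename('NDVI'))

coll = (ee.ImageCollection('LANDSAT/LT05/C02/T1_L2')
        .filterBounds(point)
        .filterDate('1996-01-01', '1997-01-01')
        .map(mask_clouds)
        .map(add_ndvi))

def sample(img):
    # Sample NDVI at the point for each individual scene,
    # keeping the acquisition date so we can plot a trajectory
    val = img.select('NDVI').reduceRegion(
        ee.Reducer.first(), point, 30).get('NDVI')
    return ee.Feature(None, {
        'date': img.date().format('YYYY-MM-dd'), 'ndvi': val})

series = coll.map(sample).filter(ee.Filter.notNull(['ndvi']))
print(series.aggregate_array('date').getInfo())
print(series.aggregate_array('ndvi').getInfo())
```

Plotting that date/NDVI series should make the within-year drop from active mining visible, when enough cloud-free scenes exist.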
Another approach might be to just set a number of control points for this specific purpose and go through them manually, comparing the actual timing of active mining against the output of the algorithm.
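If we go the manual control-point route, quantifying the size and dispersion of the error could then be as simple as the sketch below (plain Python; the CSV layout and filename are hypothetical):

```python
import csv
import statistics

# Hypothetical CSV: one row per control point, with the manually
# assessed year mining began and the year the algorithm flagged it.
# Columns: point_id,actual_year,detected_year
with open('control_points.csv') as f:
    lags = [int(row['detected_year']) - int(row['actual_year'])
            for row in csv.DictReader(f)]

# Size and dispersion of the detection error, in years
print('n        :', len(lags))
print('mean lag :', statistics.mean(lags))
print('median   :', statistics.median(lags))
print('stdev    :', statistics.stdev(lags))
print('range    :', min(lags), 'to', max(lags))
```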
OR maybe there are far better ideas...