
Develop method to more accurately detect when mining began in a given pixel #75

Open
WassonMF opened this issue Jul 19, 2016 · 0 comments


We know that there is a lag (and a large window of uncertainty) between when the algorithm detects mining in a given pixel and when that mining actually occurred. It would be helpful to quantify the size and dispersion of that error.
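For example, once detected and reference onset years are available for a sample of validation pixels, the size and spread of the lag could be summarized along these lines (a minimal sketch with hypothetical data; `detected_year` and `actual_year` are placeholders, not outputs of the existing algorithm):

```python
import numpy as np

# Hypothetical paired observations for a sample of validation pixels:
# the year the algorithm first flags mining vs. the year mining actually began.
detected_year = np.array([1996, 2001, 2005, 2009, 2012])
actual_year   = np.array([1995, 1999, 2005, 2007, 2011])

lag = detected_year - actual_year             # per-pixel detection lag in years
print(f"mean lag: {lag.mean():.1f} yr")       # size of the error (bias)
print(f"std dev:  {lag.std(ddof=1):.1f} yr")  # dispersion of the error
```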

A preliminary thought is to look at the trajectory of NDVI within a given year for certain pixels, using NDVI from individual scenes (rather than composites); however, this might be challenging to automate, given the cloudiness of most summer Landsat scenes in Appalachia.
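As a rough illustration of what that per-scene approach might look like (a sketch only, assuming per-scene NDVI values for a single pixel have already been extracted and cloud-masked; the threshold and variable names are hypothetical):

```python
import numpy as np

# Hypothetical per-scene observations for one pixel within one year,
# after cloudy scenes have been masked out (e.g., via the QA band).
dates = np.array(["2010-04-12", "2010-05-14", "2010-06-15",
                  "2010-07-17", "2010-08-18", "2010-09-19"],
                 dtype="datetime64[D]")
ndvi = np.array([0.71, 0.68, 0.35, 0.18, 0.15, 0.17])

THRESHOLD = 0.4  # hypothetical cutoff separating vegetated from cleared ground

# Approximate mining onset as the first clear-sky scene whose NDVI
# falls below the threshold.
below = ndvi < THRESHOLD
onset = dates[np.argmax(below)] if below.any() else None
print("approximate onset of clearing:", onset)
```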

Another approach might be to designate a set of control points for this specific purpose and go through them manually to compare the actual timing of active mining against the output of the algorithm.

OR maybe there are far better ideas...
