This repository has been archived by the owner on Oct 26, 2024. It is now read-only.

Implement discretization checker. #162

Open
tbenthompson opened this issue Nov 22, 2022 · 3 comments


tbenthompson commented Nov 22, 2022

UPDATE: See the suggestion at the bottom: check lambda** minus epsilon and, if the result differs, raise lots of warnings.

Tuning for a model with discrete test statistics is kinda scary. I've been playing with a simple binomial model with n=10, so the possible test statistics are i/10 for i in 0…10. The tuning code does the correct thing and there are no errors yet, but I just wanted to bring this up…

Example: suppose that the selected lambda** is 0.2. Well, 0.2 is not precisely representable in floating point; the actual output from the code is 0.19999999999999996. If we use this threshold, the realized type I error turns out to be ~1%. But if we accidentally use 0.2 as the threshold, then suddenly our type I error is ~6%, because there are lots of ties right at 0.19999999999999996. This is kinda scary and feels like the kind of thing that is going to bite us in the ass sometime.
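A minimal sketch of the failure mode in plain Python (the exact arithmetic producing the threshold and the rejection rule `stat < lambda**` are assumptions for illustration):

```python
# 1.0 - 0.8 is one common way to end up with the float just below 0.2,
# i.e. 0.19999999999999996 -- exactly the situation described above.
stat = 1.0 - 0.8          # a tied test statistic: 0.19999999999999996
lam_computed = 1.0 - 0.8  # the threshold as tuning actually produced it
lam_typed = 0.2           # the "same" threshold typed by hand

print(stat < lam_computed)  # False: ties at the threshold do not reject (~1% TIE)
print(stat < lam_typed)     # True: every tie suddenly rejects (~6% TIE)
```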

potential ideas:

  • subtract a small value from the final tuning threshold? (see the sketch after this list)
  • ignore this?
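A minimal sketch of the first idea, assuming NumPy; the helper name is hypothetical. Stepping the threshold down a few ulps (or subtracting a fixed 1e-12, as suggested below) moves it strictly below any tied test statistics:

```python
import numpy as np

def harden_threshold(lam: float, n_ulps: int = 4) -> float:
    """Return a threshold a few ulps below lam so that float noise in the
    test statistics cannot flip ties at the boundary."""
    for _ in range(n_ulps):
        lam = np.nextafter(lam, -np.inf)
    return lam

print(harden_threshold(0.2) < 0.19999999999999996)  # True: below every "0.2" tie
```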
tbenthompson (Member, Author) commented:

There's still the question of how we detect errors like this…

One option is to do that correction, but only as a check (see the sketch after this list):

  • receive lambda**
  • run validation
  • choose the tile with the highest TIE (much cheaper than re-validating the whole grid)
  • check if lambda** - 1e-12 changes the TIE.
  • if it does, print lots of warnings.
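A sketch of what that check could look like; `tiles`, `validate_tile`, and the calling convention are hypothetical stand-ins rather than the package's real API:

```python
import warnings

def check_discretization(lam, tiles, validate_tile, eps=1e-12):
    """Warn if the tuned threshold lam (lambda**) sits exactly on a tie
    in the discrete test statistic."""
    # re-validating a single tile is much cheaper than re-validating the
    # whole grid, so only re-check the tile with the highest TIE
    worst_tile = max(tiles, key=lambda t: validate_tile(t, lam))
    tie = validate_tile(worst_tile, lam)
    tie_nudged = validate_tile(worst_tile, lam - eps)
    if tie != tie_nudged:
        warnings.warn(
            f"TIE changes from {tie} to {tie_nudged} when lambda** is "
            f"perturbed by {eps}: the threshold sits on a tied test "
            "statistic, so the reported error rate is float-noise sensitive."
        )
```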


tbenthompson commented Nov 23, 2022

I ran into this issue again when I flipped between two different implementations of the same model. The two implementations were identical up to one or two ulps, but that was enough to matter. Subtracting $\varepsilon$ from $\lambda^{**}$ would have completely solved the problem.
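A minimal numeric illustration of this (the specific values are made up): the "same" statistic from two implementations differs by one ulp, which flips the comparison against $\lambda^{**}$ but not against $\lambda^{**} - \varepsilon$:

```python
lam = 0.1 * 3                # the tuned threshold: 0.30000000000000004
eps = 1e-12
for stat in (0.3, 0.1 * 3):  # the statistic from implementations A and B
    print(stat < lam, stat < lam - eps)
# implementation A: True False  -> rejects at lam but not at lam - eps
# implementation B: False False -> subtracting eps makes A and B agree
```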

tbenthompson changed the title: Discrete tuning is scary → Document "Discrete tuning is scary" (Feb 28, 2023)
tbenthompson changed the title: Document "Discrete tuning is scary" → Implement discretization checker. (Feb 28, 2023)