cf. https://en.wikipedia.org/wiki/Grayscale#Converting_color_to_grayscale:
luminance = 0.2126 * r + 0.7152 * g + 0.0722 * b
Here we have r, g and b from 0 to 31 and no FPU...
NB: 0.2126 is 2126 / 10000, and so on.
luminance = (2126 * r + 7152 * g + 722 * b) / 1216
With r, g and b in 0..31 the weighted sum ranges from 0 to 310000, so dividing by 1216 (about 310000 / 255) keeps the result between 0 and 255 (8 bits) and comparable across 2 colors.
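A minimal integer-only sketch in C of the idea above, assuming a 32-bit intermediate type is available; the helper name luminance5 and the divisor 1216 (≈ 310000 / 255) are illustrative choices, not existing code:

```c
#include <stdint.h>

/* Integer-only luminance for 5-bit channels (r, g, b in 0..31), no FPU.
 * The weights are the Rec. 709 coefficients scaled by 10000, so the
 * weighted sum ranges from 0 to 310000; dividing by 1216 (~310000 / 255)
 * brings the result into 0..255 so it fits in a single byte. */
static uint8_t luminance5(uint8_t r, uint8_t g, uint8_t b)
{
    uint32_t sum = 2126UL * r + 7152UL * g + 722UL * b;
    return (uint8_t)(sum / 1216UL);
}
```

Since the values are only compared against each other, the final division could also be skipped entirely and the raw sums compared directly.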
Formula used for console16 palette, see #26.
Have another byte array alongside each palette with luminance values? (See the sketch after this list.)
Have another byte array alongside each palette with the indexes of colors ordered by luminance?
Compute luminance at runtime and keep it in another array? Sort it?
Handle only 16-color palettes? Both 16- and 256-color ones? (2- and 4-color palettes are less used.)
NB: with 256 colors, collisions may happen more frequently; perhaps compute luminance with 16 bits instead of 8.
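As a rough illustration of the byte-array options above, here is a sketch (reusing the luminance5 helper from the previous snippet) that precomputes one luminance byte per palette entry plus a second array of palette indexes ordered from darkest to brightest. The array names, the 16-entry size and the separate r/g/b input arrays are assumptions for illustration, not the actual layout from #26:

```c
#include <stdint.h>

#define PALETTE_SIZE 16  /* illustrative: a 16-color palette */

static uint8_t palette_luminance[PALETTE_SIZE]; /* luminance per color     */
static uint8_t palette_by_luma[PALETTE_SIZE];   /* indexes ordered by luma */

static void build_luminance_tables(const uint8_t *r, const uint8_t *g,
                                   const uint8_t *b)
{
    unsigned int i, j;
    uint8_t tmp;

    /* Compute luminance for every palette entry (5-bit channels). */
    for (i = 0; i < PALETTE_SIZE; ++i) {
        palette_luminance[i] = luminance5(r[i], g[i], b[i]);
        palette_by_luma[i] = (uint8_t)i;
    }

    /* Insertion sort of the index array: small and simple enough for
     * 16 (or 256) entries on a CPU without an FPU or much RAM. */
    for (i = 1; i < PALETTE_SIZE; ++i) {
        tmp = palette_by_luma[i];
        for (j = i;
             j > 0 &&
             palette_luminance[palette_by_luma[j - 1]] > palette_luminance[tmp];
             --j) {
            palette_by_luma[j] = palette_by_luma[j - 1];
        }
        palette_by_luma[j] = tmp;
    }
}
```

For a 256-color palette the same tables grow to 256 bytes each; if 8-bit luminance produces too many ties there, palette_luminance could be widened to 16 bits as suggested above.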