The current implementation treats the value 0 (= `TB_DEFAULT`) for fg/bg specially in `write_sgr()`. This leads to a lot of unexpected behavior; for one, the `TB_*` constants can no longer be used in this mode.
| Color | TB_OUTPUT_NORMAL | TB_OUTPUT_256 |
|-------|------------------|---------------|
| clear | 0                | 0             |
| black | 1                | -             |
| red   | 2                | 1             |
| ...   | ...              | ...           |
What seems to me to be the most sensible, but also the most radical, solution would be:
- Make the `TB_*` constants line up with the actual values used in SGR, i.e. 0 for black etc.
- Set `TB_DEFAULT` to a different magic value that isn't used anywhere and catch that; if we are breaking compatibility anyway, we could actually use -1 and switch to `int16_t`.
- In fact, if we want to support truecolor at some point, we should perhaps make it an even bigger value.
Also, I've found that the code uses the `TB_BOLD` constant on bg to encode "blinking", which should be documented.