This repository has been archived by the owner on Sep 20, 2023. It is now read-only.
If I understand correctly, Stream.Resolution() is the duration of a single bit, and Duration() should return the duration of all bits.
The current implementation uses the count of bytes instead of bits, which looks wrong to me. https://github.com/google/periph/blob/master/conn/gpio/gpiostream/gpiostream.go#L68
It should be `return b.Res * time.Duration(len(b.Bits)) * 8`
Yeah, it is confusing that len(b.Bits) returns the number of bytes.
There may be some code relying on this bug.
Assuming the number of bits to be 8*len(b.Bits) may work for most cases, but we may want to specify the number of bits explicitly in the struct. For example, gpiostream.Program{} could be a loop of a 3-bit stream.