Hi, I noticed that the activations are not binary but floats between -1 and 1, and I was wondering if there is a bug.
The floats come from the fact that, even in the binary models, the hard tanh function is used, e.g.:
self.tanh2 = nn.Hardtanh(inplace=True)
In the paper, however, it is mentioned that the activation function should behave as a sign function in the forward step. Is this correct? Thanks!
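For reference, my reading of the paper is that the forward pass should apply a sign function, with the hard tanh only shaping the gradient in the backward pass, i.e. a straight-through estimator. A minimal sketch of what I would expect (BinarizeSTE is just an illustrative name, not code from this repo):

```python
import torch

class BinarizeSTE(torch.autograd.Function):
    """Illustrative straight-through estimator: sign in the forward pass,
    hard-tanh-style clipped gradient in the backward pass."""

    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)
        return input.sign()

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        grad_input = grad_output.clone()
        # Let gradients through only inside the linear region of hard tanh,
        # i.e. where |input| <= 1; zero them elsewhere.
        grad_input[input.abs() > 1] = 0
        return grad_input

# Usage: binary_activations = BinarizeSTE.apply(pre_activations)
```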
Right, I was checking the input of the BinaryConv module and missed that you apply Binarize in the first step of that module's forward function. Thanks again!
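For anyone else hitting this: the binarization of the activations happens at the top of the module's forward, not inside the Hardtanh. Roughly, the pattern looks like the sketch below (reconstructed from how I understand binarized_modules.py; the actual code in the repo may differ in details):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def binarize(tensor):
    # Deterministic binarization to {-1, +1}.
    return tensor.sign()

class BinarizeConv2d(nn.Conv2d):
    def forward(self, input):
        # The float Hardtanh output of the previous layer is binarized
        # right here, so the convolution itself only ever sees {-1, +1}
        # activations. The raw 3-channel image fed to the first layer
        # is left untouched.
        if input.size(1) != 3:
            input = binarize(input)
        # Keep a full-precision copy of the weights for the optimizer
        # update, and convolve with their binarized version.
        if not hasattr(self.weight, 'org'):
            self.weight.org = self.weight.data.clone()
        self.weight.data = binarize(self.weight.org)
        return F.conv2d(input, self.weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)
```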
Same question here, not sure if @ocarinamat or @itayhubara can provide more insight into it.
I'm working specifically with the binary ResNet on CIFAR10. While BinarizeLinear and BinarizeConv2d are used, the activations (after Hardtanh) are floats between -1 and 1. My understanding is that BinarizeConv2d just binarizes the weights of the convolutional layers. I'm not sure what I'm missing here to get binary activations as well.
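The answer above applies here too: between layers the activations you inspect really are floats, because the Hardtanh output only gets binarized once the next BinarizeConv2d (or BinarizeLinear) applies Binarize to its input at the start of its forward. A quick sanity check of that behavior, with hypothetical shapes:

```python
import torch
import torch.nn as nn

hardtanh = nn.Hardtanh()
x = torch.randn(4, 64, 8, 8)         # hypothetical feature map
a = hardtanh(x)                      # what you see between layers: floats in [-1, 1]
b = a.sign()                         # what the next binarized layer actually convolves with
print(((b == 1) | (b == -1)).all())  # tensor(True); sign(0) would be 0, but exact
                                     # zeros are vanishingly rare with float inputs
```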