
Internal state is float between 0 and 1, not binary? #13

Open
ocarinamat opened this issue Nov 20, 2018 · 3 comments

@ocarinamat

Hi, I noticed that the activations are not binary but floats between 0 and 1, and I was wondering if there is a bug.
The floats seem to come from the fact that, even in the binary models, the hard tanh function is used, e.g.:

self.tanh2 = nn.Hardtanh(inplace=True)

In the paper, however, it is mentioned that the activation function should behave as a sign function in the forward pass. Is this correct? Thanks!

@itayhubara
Owner

Please look at resnet_binary.py line 4; this line imports the binarized conv and fully connected operators:

from .binarized_modules import BinarizeLinear,BinarizeConv2d

If you look at binarized_modules.py line 13, you'll find your sign function.
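
For reference, the deterministic binarization there boils down to a hard sign. A minimal sketch of such a helper (the exact code in the repo may differ):

import torch

def Binarize(tensor, quant_mode='det'):
    # Deterministic mode: hard sign, mapping every element to -1 or +1.
    if quant_mode == 'det':
        return tensor.sign()
    # Stochastic mode: sample +/-1 with probability proportional to the
    # activation, as described in the paper (a sketch of that variant).
    p = tensor.add(1).div(2).clamp(0, 1)      # map [-1, 1] onto [0, 1]
    return torch.bernoulli(p).mul(2).add(-1)  # sample {0, 1} -> {-1, +1}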

@ocarinamat
Author

Right, I was checking the input of the BinarizeConv2d module and missed how you use Binarize in the first step of that module's forward function. Thanks again!
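
For anyone else who misses it: the binarization sits at the very top of the forward pass, so the float Hardtanh output is mapped to ±1 before the convolution runs. A minimal sketch, assuming the Binarize helper above (the repo's exact code may differ):

import torch.nn as nn
import torch.nn.functional as F

class BinarizeConv2d(nn.Conv2d):
    def forward(self, input):
        # Binarize incoming activations, except for the raw 3-channel
        # image fed to the first layer. Writing through .data leaves the
        # autograd graph untouched, giving the straight-through gradient.
        if input.size(1) != 3:
            input.data = Binarize(input.data)
        # Keep a full-precision copy of the weights for the optimizer,
        # but convolve with their binarized version.
        if not hasattr(self.weight, 'org'):
            self.weight.org = self.weight.data.clone()
        self.weight.data = Binarize(self.weight.org)
        return F.conv2d(input, self.weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)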

@hoangle96

Same question here; not sure if @ocarinamat or @itayhubara can provide more insight.

I'm working specifically with the binary ResNet on CIFAR-10. While BinarizeLinear and BinarizeConv2d are used, the activations (after Hardtanh) are floats between -1 and 1. My understanding is that BinarizeConv2d just binarizes the weights of the convolutional layers. I'm not sure what I'm missing here to get binary activations as well.

Thanks a lot!
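
One way to see this is to hook a binarized layer: the tensor handed to it is still the float Hardtanh output, but sign() shows what the forward pass actually convolves with. A quick check, reusing the Binarize and BinarizeConv2d sketches above (the toy network here is hypothetical, not the repo's ResNet):

import torch
import torch.nn as nn

# A toy stack wiring Hardtanh between binarized convolutions, the same
# way the binary ResNet does.
net = nn.Sequential(
    BinarizeConv2d(3, 16, kernel_size=3, padding=1),
    nn.Hardtanh(inplace=True),
    BinarizeConv2d(16, 16, kernel_size=3, padding=1),
)

def inspect_input(module, inputs):
    x = inputs[0]
    # The tensor arriving here is the float Hardtanh output...
    print('distinct float values arriving:', x.unique().numel())
    # ...but this is what the forward pass actually convolves with.
    print('after binarization:', x.sign().unique().tolist())

handle = net[2].register_forward_pre_hook(inspect_input)
net(torch.randn(1, 3, 32, 32))
handle.remove()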
