
Why does alexnet_binary have Hardtanh activation when alexnet has ReLU activation? #20

Open
Ashokvardhan opened this issue Aug 8, 2019 · 1 comment

Comments


Ashokvardhan commented Aug 8, 2019

@itayhubara: I noticed that all the binarized neural network files alexnet_binary.py, resnet_binary.py, and vgg_cifar10_binary.py use the Hardtanh activation function, whereas their respective parent architectures in alexnet.py, resnet.py, and vgg_cifar10.py use the ReLU activation function. Is there a specific reason for this? However, the Theano implementation of the BinaryConnect code here uses ReLU activation when only the weights are binarized.

itayhubara (Owner) commented

I just demonstrated how to convert a model to a BNN model on ResNet and vgg_cifar10. You should replace the ReLU with Hardtanh in all the other models as well. Basically, you must ensure that binarization and Hardtanh come before every binary GEMM operation.
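A minimal sketch of the block ordering described above, in PyTorch. It assumes the `BinarizeConv2d` module from this repo's models/binarized_modules.py; the exact constructor arguments shown are an assumption, not a quote from the repo. The rationale: a BNN binarizes activations with sign(), so the activation feeding each binary GEMM must be bounded and zero-centered like Hardtanh; ReLU outputs are non-negative, so sign() would (almost) never produce -1.

```python
import torch.nn as nn
from models.binarized_modules import BinarizeConv2d  # binarized conv from this repo

# Hardtanh clips pre-binarization values to [-1, 1], matching the
# straight-through estimator used for the sign() activation.
block = nn.Sequential(
    BinarizeConv2d(128, 256, kernel_size=3, padding=1, bias=True),  # binary GEMM
    nn.BatchNorm2d(256),
    nn.Hardtanh(inplace=True),  # replaces nn.ReLU(inplace=True) from the float model
)
```

The same substitution applies layer by layer when porting alexnet.py to alexnet_binary.py: every ReLU that feeds a binarized layer becomes a Hardtanh.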
