@itayhubara: I noticed that all the binarized neural network files (alexnet_binary.py, resnet_binary.py, vgg_cifar10_binary.py) use the Hardtanh activation function, whereas their respective parent architectures (alexnet.py, resnet.py, vgg_cifar10.py) use ReLU. Is there a specific reason for this? The Theano implementation of the BinaryConnect code here uses ReLU when only the weights are binarized.
I just demonstrated how to convert a model into a BNN for ResNet and vgg_cifar; you should replace ReLU with Hardtanh in all the other models as well. Basically, you must make sure that batch normalization and Hardtanh come before every binary GEMM operation.
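For reference, here is a minimal sketch of the pattern described above. The `BinarizeLinear` below is a simplified stand-in for the repo's own binarized layers (a real implementation would also need a straight-through estimator for the gradients); the point is only the ordering: since the inputs to each binary GEMM are binarized with sign(), Hardtanh keeps pre-activations bounded in [-1, 1], whereas ReLU would discard the entire negative half.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeLinear(nn.Linear):
    """Simplified stand-in for a binarized linear layer: sign()-binarizes
    inputs and weights in the forward pass (gradient handling omitted)."""
    def forward(self, input):
        binary_input = torch.sign(input)         # values in {-1, 0, +1}
        binary_weight = torch.sign(self.weight)  # values in {-1, 0, +1}
        return F.linear(binary_input, binary_weight, self.bias)

# The pattern from the reply: BN + Hardtanh before every binary GEMM.
block = nn.Sequential(
    BinarizeLinear(784, 1024),
    nn.BatchNorm1d(1024),
    nn.Hardtanh(),              # bounded in [-1, 1]; sign() then yields {-1, +1}
    BinarizeLinear(1024, 10),   # this binary GEMM sees clipped inputs
)

x = torch.randn(32, 784)
out = block(x)  # shape (32, 10)
```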