
BatchNorm in WRN #98

Open
bkj opened this issue Sep 24, 2020 · 1 comment

Comments


bkj commented Sep 24, 2020

Hi --

I noticed that the last BatchNorm in WRN always runs in training mode (is_training is effectively hardcoded):
https://github.com/google-research/uda/blob/master/image/randaugment/wrn.py#L117

is_training varies for all of the other BNs. Is this intentional? Does it give some kind of performance advantage?

Thanks!


bkj commented Sep 25, 2020

Digging into this more deeply -- it seems that using batch statistics vs. running statistics makes a fair bit of difference in the convergence of the model. Do you have a good explanation for that? It seems interesting and surprising to me.
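To make the distinction concrete, here is a minimal sketch (not the WRN/TensorFlow code itself, just an illustration in PyTorch) of how normalizing the same input with the current batch's statistics vs. the accumulated running statistics produces different outputs:

```python
# Hypothetical illustration: one BatchNorm layer applied to the same input
# in training mode (batch statistics) and in eval mode (running statistics).
import torch
import torch.nn as nn

torch.manual_seed(0)
bn = nn.BatchNorm2d(4)

# Warm up the running mean/variance with a few random batches.
for _ in range(10):
    bn(torch.randn(8, 4, 16, 16))

x = torch.randn(8, 4, 16, 16)

bn.train()              # normalizes with this batch's mean/variance
y_batch = bn(x)

bn.eval()               # normalizes with the accumulated running mean/variance
y_running = bn(x)

# The two normalizations generally disagree, so a BN that is always run in
# training mode behaves differently at test time than one using running stats.
print(torch.allclose(y_batch, y_running))
```

In almost all cases the two outputs differ, because the statistics of any single batch deviate from the exponential moving average accumulated during training.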
