
Getting nan values while using response normalization layer #8

Open
GoogleCodeExporter opened this issue Sep 6, 2015 · 1 comment


@GoogleCodeExporter

Network training is fine without any contrast normalization layer (all types), but once I add a contrast normalization layer, the net produces NaN values after several iterations. I tried different values for the size, scale, and pow parameters, and tried placing the layer both before and after the pooling layer.


Original issue reported on code.google.com by [email protected] on 8 May 2013 at 7:44
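(Editorial note, not part of the original thread.) The size, scale, and pow parameters mentioned above suggest a cross-channel response normalization of the Krizhevsky-style form b_c = a_c / (k + alpha * sum of nearby a^2)^beta. The NumPy sketch below is illustrative only — the function name, parameter names, and defaults are assumptions, not taken from the project's source — but it demonstrates one way such a layer can emit NaNs: if the additive constant is zero and the activations in the window all vanish, the layer computes 0/0.

```python
import numpy as np

def response_norm(acts, size=5, k=1.0, alpha=1e-4, beta=0.75):
    """Illustrative cross-channel response normalization:
    b_c = a_c / (k + alpha * sum_{c' near c} a_{c'}**2) ** beta
    acts: array of shape (channels, height, width).
    """
    channels = acts.shape[0]
    half = size // 2
    out = np.empty_like(acts, dtype=np.float64)
    for c in range(channels):
        lo, hi = max(0, c - half), min(channels, c + half + 1)
        denom = (k + alpha * np.sum(acts[lo:hi] ** 2, axis=0)) ** beta
        out[c] = acts[c] / denom
    return out

# With k > 0 the denominator is bounded away from zero, so the output
# stays finite for finite inputs.
finite = response_norm(np.ones((3, 4, 4)))
assert np.isfinite(finite).all()

# With k = 0 and all-zero activations, the denominator is 0**beta == 0,
# and 0/0 yields NaN -- one plausible source of the reported NaNs.
with np.errstate(invalid="ignore", divide="ignore"):
    bad = response_norm(np.zeros((3, 4, 4)), k=0.0)
assert np.isnan(bad).all()
```

Whether the actual layer includes an additive constant internally would need checking against the project's source; a very large scale (alpha) can also overflow activations to inf, which then propagates NaN through the gradients during training.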

@GoogleCodeExporter (Author)

I have a similar issue. Does it still work when you remove the contrast 
normalization layer? 

Original comment by [email protected] on 13 Sep 2013 at 7:30
