ch3.5 p69 line 30 ReconLayer.pretrain denoise_name not including variable_scope #11
It's not in the book.
Right, I added a scope myself, because I wanted to use reuse for train and test. Then I found that recon_layer1.pretrain seems to have this problem; I haven't seen it with the other layers.
Could you paste your full code? I looked through the ReconLayer source and didn't find a problem.
It seems that your code is different from the official code here.
Yes, I added the variable scope; supposedly the layer name should then include the scope. But pretrain can't find the denoising layer if the model is built under a scope.
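For context, here is a minimal sketch of how the denoising autoencoder is usually wired up in the TensorLayer 1.x MNIST example, written from memory rather than copied from the book; layer sizes, activations, and the pretrain hyperparameters are illustrative. The DropoutLayer's name, 'denoising1', is the string that pretrain() later receives as denoise_name:

```python
import tensorflow as tf
import tensorlayer as tl

# MNIST images flattened to 784, as in the tutorial.
X_train, y_train, X_val, y_val, X_test, y_test = tl.files.load_mnist_dataset(shape=(-1, 784))
x = tf.placeholder(tf.float32, shape=[None, 784], name='x')

net = tl.layers.InputLayer(x, name='input')
# The corruption layer; its `name` is what pretrain() receives as denoise_name.
net = tl.layers.DropoutLayer(net, keep=0.5, name='denoising1')
net = tl.layers.DenseLayer(net, n_units=800, act=tf.nn.sigmoid, name='dense1')
recon_layer1 = tl.layers.ReconLayer(net, x_recon=x, n_units=784,
                                    act=tf.nn.softplus, name='recon_layer1')

sess = tf.InteractiveSession()
sess.run(tf.global_variables_initializer())

# With no enclosing variable_scope, the plain layer name is the lookup key.
recon_layer1.pretrain(sess, x=x, X_train=X_train, X_val=X_val,
                      denoise_name='denoising1', n_epoch=100,
                      batch_size=128, print_freq=10,
                      save=True, save_name='w1pre_')
```

The question in this issue is what that key should be once the whole model is wrapped in an outer variable_scope.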
I tested the code after adding the variable scope.
Which code did you test, the official example? Are you able to reproduce the error using my code?
Suppose I set a variable_scope (say 'ae'); denoise_name still has to use the layer's own name, 'denoising1', rather than 'ae/denoising1', otherwise there is a KeyError. So if the same denoise_name is used under different scopes, is there no way to tell them apart?
with tf.variable_scope("ae", reuse=reuse):
    ...

recon_layer1.pretrain(sess, x=x, X_train=X_train, X_val=X_val, denoise_name='ae/denoising1', n_epoch=n_epochs, batch_size=batch_size, print_freq=print_interval, save=True, save_name='w1pre_')

logging.info(" denoising layer keep: %f" % self.all_drop[set_keep[denoise_name]])
KeyError: 'ae/denoising1'
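If the lookup really is keyed by the bare DropoutLayer name, which is what the KeyError above and the previous comment both suggest, the immediate workaround is to keep denoise_name='denoising1' even though the model is built under the 'ae' scope. A minimal sketch under that assumption, reusing the imports, data, and session setup from the earlier snippet; reuse=False and the hyperparameter values are placeholders:

```python
with tf.variable_scope("ae", reuse=False):
    net = tl.layers.InputLayer(x, name='input')
    net = tl.layers.DropoutLayer(net, keep=0.5, name='denoising1')
    net = tl.layers.DenseLayer(net, n_units=800, act=tf.nn.sigmoid, name='dense1')
    recon_layer1 = tl.layers.ReconLayer(net, x_recon=x, n_units=784,
                                        act=tf.nn.softplus, name='recon_layer1')

# The variables now live under 'ae/...', but the dropout keep-probability
# placeholder appears to be registered under the plain layer name, so the
# key passed to pretrain() stays 'denoising1', not 'ae/denoising1'.
recon_layer1.pretrain(sess, x=x, X_train=X_train, X_val=X_val,
                      denoise_name='denoising1', n_epoch=100,
                      batch_size=128, print_freq=10,
                      save=True, save_name='w1pre_')
```

The downside is exactly the concern raised in the comment above: if two models under different scopes use the same denoise_name, that single key cannot distinguish their corruption layers.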