Hi
Q1 - I don't understand why we don't use all of y_pred[:, :, :]. Why use y_pred[:, 2:, :]? Why are time steps 0 and 1 dropped?
from keras import backend as K

def ctc_lambda_func(args):
    y_pred, labels, input_length, label_length = args
    # the 2 is critical here since the first couple outputs of the RNN
    # tend to be garbage:
    y_pred = y_pred[:, 2:, :]
    return K.ctc_batch_cost(labels, y_pred, input_length, label_length)
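For concreteness, here is a minimal sketch of what that slice does to the tensor shapes (the batch size, step count, and class count below are assumed values, not taken from the example):

import numpy as np

# y_pred coming out of the softmax has shape (batch, time_steps, num_classes).
# The slice [:, 2:, :] removes only the first two time steps; it does not
# drop whole dimensions of the tensor.
y_pred = np.random.rand(16, 32, 28)   # assumed: batch 16, 32 steps, 28 classes
sliced = y_pred[:, 2:, :]
print(sliced.shape)                    # (16, 30, 28): two time steps removed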
Q2 - What is the meaning of the downsample parameter? If we change the input size, do we need to change the value of the downsample factor as well? What is the principle behind this parameter?
I also don't understand why this line multiplies (self.img_w // self.downsample_factor - 2) by np.ones((self.batch_size, 1)). What is the advantage of doing this? If we used only np.ones((self.batch_size, 1)), would that cause a problem?
And why the "- 2" in (self.img_w // self.downsample_factor - 2)? Why 2?
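For reference, the line being asked about sits in the data generator and looks roughly like the sketch below. This is an assumed reconstruction in the spirit of the Keras image_ocr example, not the exact source; the concrete numbers are placeholders.

import numpy as np

# input_length must hold, for each sample, the number of time steps that
# y_pred will have when it reaches ctc_batch_cost. The convolutional front
# end shrinks the image width by downsample_factor (e.g. two 2x2 poolings
# give a factor of 4), and the lambda above slices off the first 2 time
# steps, hence the "- 2".
img_w = 128            # assumed input image width
downsample_factor = 4  # assumed product of the pooling strides along width
batch_size = 32        # assumed batch size

input_length = np.ones((batch_size, 1)) * (img_w // downsample_factor - 2)
# With only np.ones((batch_size, 1)) the declared sequence length would be 1
# per sample, which cannot cover the label, so the CTC loss could not align
# the labels with the predictions.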