I think it's because there's a `W2` dense layer after the normal LSTM layer. Line 45 of cell.py is `y = Activation(self.activation)(W2(h))`. Here `h` is the output of the LSTM cell and `y` is the output of `W2`, so you can try changing the activation function on this line.
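To illustrate, here is a minimal, self-contained sketch in plain Keras (not the seq2seq library itself; the `build` helper, shapes, and layer sizes are just for demonstration) of why that final `Activation` bounds the predictions:

```python
# Sketch of the pattern at cell.py line 45: an LSTM output `h` fed through a
# dense layer `W2` and an Activation. With 'tanh' the predictions are squashed
# into (-1, 1); switching to 'linear' removes the bound.
import numpy as np
from keras.models import Model
from keras.layers import Input, LSTM, Dense, Activation

def build(activation):
    x = Input(shape=(10, 1))                  # (timesteps, features)
    h = LSTM(32)(x)                           # analogous to the LSTM cell output h
    y = Activation(activation)(Dense(1)(h))   # analogous to y = Activation(...)(W2(h))
    return Model(x, y)

bounded = build('tanh')      # predictions limited to (-1, 1)
unbounded = build('linear')  # predictions can take any real value

data = np.random.rand(4, 10, 1) * 11399.0    # scaled like the data in this issue
print(bounded.predict(data).ravel())          # always inside (-1, 1)
print(unbounded.predict(data).ravel())        # unconstrained
```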
I think you are right. The default activation function is tanh. Normalizing the data is another way to solve the problem.
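A minimal sketch of that normalization idea (plain NumPy; `y_train` and `y_pred_scaled` just reuse the numbers from this issue): scale the targets into the activation's output range before training, then invert the scaling on the predictions.

```python
import numpy as np

# Original targets, e.g. the values from this issue
y_train = np.array([1.0, 200.0, 1235.0, 11300.0])

# Min-max scale into [0, 1]; map into [-1, 1] instead if the output activation is tanh
lo, hi = y_train.min(), y_train.max()
y_scaled = (y_train - lo) / (hi - lo)

# ... train the model on y_scaled instead of y_train ...

# Invert the scaling on whatever the model predicts
y_pred_scaled = np.array([0.0, 0.2, 0.3, 1.0])  # e.g. the predictions reported above
y_pred = y_pred_scaled * (hi - lo) + lo
print(y_pred)  # back in the original 1..11399 range
```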
Hi,

I am working with the seq2seq library and I have a problem. My training vector contains values from 1 to 11399, so it looks like [1, 200, 1235, 11300, ...]. But the predicted values I get back are always between 0 and 1, like [0, 0.2, 0.3, 1, ...].

I guess this is due to an activation function (softmax?). Is there a way to define the activation function for the seq2seq model layers?

Marko