Thanks for your great article and code; I've learned a lot from them! I have a question.
Why is ReLU chosen as the activation function of the last layer in the LSTM autoencoder? Part of the data is negative after standardization, and ReLU can never output negative values, so the reconstruction error for those points cannot fall close to 0. Would it be better to choose tanh instead?
Thanks a lot!
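To illustrate what I mean, here is a minimal Keras-style sketch (the layer sizes, `timesteps`, and `n_features` are placeholders I made up, not taken from your code). The comment on the output layer marks where the activation choice matters:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

timesteps, n_features = 30, 1  # placeholder input shape

model = keras.Sequential([
    # Encoder: compress the sequence into a single hidden vector.
    layers.LSTM(64, activation="tanh",
                input_shape=(timesteps, n_features)),
    layers.RepeatVector(timesteps),
    # Decoder: unroll the hidden vector back into a sequence.
    layers.LSTM(64, activation="tanh", return_sequences=True),
    # Output layer: with activation="relu", negative standardized
    # targets can never be reconstructed, putting a floor under the
    # reconstruction error. "tanh" (for data scaled to [-1, 1]) or
    # "linear" (for zero-mean standardized data) avoids that.
    layers.TimeDistributed(layers.Dense(n_features, activation="tanh")),
])
model.compile(optimizer="adam", loss="mae")
model.summary()
```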