One more issue: your implementation of MLP in model.py is just a stack of linear layers with no non-linearities between them. Composing linear layers is mathematically equivalent to a single linear layer with in_dim = dims[0] and out_dim = dims[-1].
Why not apply an activation between the layers?
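For reference, here is a minimal sketch of what the fix could look like. It assumes the MLP is a PyTorch module built from a `dims` list of layer widths; the constructor signature and the `activation` argument are assumptions, not necessarily your exact code in model.py.

```python
import torch.nn as nn


class MLP(nn.Module):
    # Hypothetical sketch: dims = [in_dim, hidden_1, ..., out_dim].
    # The actual signature in model.py may differ.
    def __init__(self, dims, activation=nn.ReLU):
        super().__init__()
        layers = []
        for i in range(len(dims) - 1):
            layers.append(nn.Linear(dims[i], dims[i + 1]))
            # Non-linearity after every layer except the last, so the
            # network is no longer a single affine map.
            if i < len(dims) - 2:
                layers.append(activation())
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)
```

Without the activations, two stacked layers compute W2(W1 x + b1) + b2 = (W2 W1) x + (W2 b1 + b2), i.e. a single affine map, so the extra layers add no representational power.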