Thanks for reporting this to us. We will consider supporting this activation function in a future version, but it will definitely take some time.
@shizhouxing I think you could consider SiLU and other similar activation functions (Softplus, ELU, etc.) in benchmarks for non-linear functions. They are very useful non-linearities in many applications: sometimes they perform better than ReLU, and sometimes there are theoretical requirements for smooth activations, in which case ReLU cannot be used (see the sketch below).
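For reference, SiLU (also known as Swish) is defined as SiLU(x) = x · sigmoid(x). Here is a minimal sketch comparing it with the other smooth activations mentioned above, using standard PyTorch APIs; the snippet is purely illustrative and not part of this project:

```python
import torch

x = torch.linspace(-4.0, 4.0, steps=9)

# SiLU (a.k.a. Swish): x * sigmoid(x); smooth and non-monotonic near zero
silu = x * torch.sigmoid(x)  # equivalent to torch.nn.functional.silu(x)

# Softplus: log(1 + exp(x)); a smooth approximation of ReLU
softplus = torch.nn.functional.softplus(x)

# ELU: x for x > 0, alpha * (exp(x) - 1) otherwise
elu = torch.nn.functional.elu(x)

# ReLU for comparison: piecewise linear, not differentiable at 0
relu = torch.relu(x)
```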
SiLU is a popular activation function. It is used in YOLOv5 networks. Can you support it?