
Support SiLU activation function #20

Open
cong-liu-2000 opened this issue Feb 18, 2023 · 1 comment
@cong-liu-2000

SiLU is a popular activation function; it is used in the YOLOv5 networks, among others. Could you support it?
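
For reference, SiLU (also called Swish) is defined as silu(x) = x * sigmoid(x). The sketch below is just this reference definition in plain PyTorch (not this repository's code), checked against PyTorch's built-in torch.nn.functional.silu, which is what YOLOv5 uses via nn.SiLU:

```python
# Reference definition of SiLU: silu(x) = x * sigmoid(x).
# This is only an illustration, not the library's implementation.
import torch

def silu(x: torch.Tensor) -> torch.Tensor:
    return x * torch.sigmoid(x)

x = torch.linspace(-4, 4, 9)
# Matches PyTorch's built-in SiLU (available since PyTorch 1.7).
assert torch.allclose(silu(x), torch.nn.functional.silu(x))
```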

@huanzhang12
Member

Thanks for reporting this to us. We will consider supporting this activation function in a future version, but it will definitely take some time.

@shizhouxing I think you can consider SiLU and other similar activation functions (Softplus, ELU, etc.) in the benchmarks for non-linear functions. They are very useful non-linearities in many applications: sometimes they perform better than ReLU, and sometimes there are theoretical requirements for smooth activations, which rules out ReLU.
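
To illustrate the smoothness point, here is a small plain-PyTorch sketch (not tied to this library's API) comparing the gradients of ReLU, SiLU, Softplus, and ELU just left and right of zero: ReLU's gradient jumps from 0 to 1 at the origin, while the smooth activations change only slightly there.

```python
# Compare activation gradients on either side of x = 0.
# ReLU has a kink at 0 (gradient 0 vs. 1); SiLU, Softplus, and ELU
# have continuous derivatives, which is what "smooth" refers to above.
import torch

acts = {
    "ReLU": torch.nn.ReLU(),
    "SiLU": torch.nn.SiLU(),
    "Softplus": torch.nn.Softplus(),
    "ELU": torch.nn.ELU(),
}

x = torch.tensor([-1e-3, 1e-3], requires_grad=True)
for name, act in acts.items():
    (grad,) = torch.autograd.grad(act(x).sum(), x)
    print(f"{name:9s} gradient just left/right of 0: {grad.tolist()}")
```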
