Operator requests #34
Comments
Shape operator
@i-amgeek Thanks for your report! Reshape is generally hard to implement because of the difference between NCHW (the ONNX layout) and NHWC (the dabnn layout). However, the cases where a reshape comes right before gemm/matmul are trivial. I'll support them soon.
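A minimal NumPy sketch (illustration only, not dabnn code) of why Reshape is layout-sensitive: the same tensor stored in NCHW and NHWC order flattens to different element sequences, so a Reshape node from an ONNX (NCHW) model cannot be translated to an NHWC framework by simply reusing the target shape.

```python
import numpy as np

# A tiny 1x2x2x2 tensor in ONNX's NCHW layout.
n, c, h, w = 1, 2, 2, 2
nchw = np.arange(n * c * h * w).reshape(n, c, h, w)

# The same data rearranged into dabnn's NHWC layout.
nhwc = nchw.transpose(0, 2, 3, 1)

# Flattening (a Reshape to (N, -1)) yields different element orders.
print(nchw.reshape(n, -1)[0])  # [0 1 2 3 4 5 6 7]
print(nhwc.reshape(n, -1)[0])  # [0 4 1 5 2 6 3 7]
```

This also suggests why the reshape-before-gemm case is easy to handle: the permutation between the two flattened orders is fixed, so it can be folded into the following fully-connected layer's weight matrix instead of being executed at runtime.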
Resize
How about 3D Conv/Pooling layers?
Currently dabnn only supports primitive operators, such as Conv, FullyConnected, Pool, Add, Concat and ReLU. While these are already enough for many networks (e.g., ResNet, SqueezeNet), more operators may be needed for more complicated networks.
Moreover, I believe that the deployment of BNNs needs effort from both BNN researchers and the engineers of inference frameworks (like dabnn), so there should be a channel of communication between BNN researchers and dabnn developers.
Therefore, this issue is opened to collect operator requests. It helps me prioritize the various operators. Please reply to this issue if you need operators that have not yet been implemented in dabnn, and I will implement them as time allows.
Requested operators list: