
docs: update operator list in torch support's documentation section #486

Merged: 1 commit into main, Feb 7, 2024

Conversation

RomanBredehoft (Contributor):

A user pointed out that this list was not up to date. It turns out we have indeed added and changed quite a few things since this section was last updated.

closes https://github.com/zama-ai/concrete-ml-internal/issues/4268

RomanBredehoft requested a review from a team as a code owner, February 6, 2024 10:51
cla-bot added the cla-signed label, Feb 6, 2024
- [`torch.nn.BatchNorm2d`](https://pytorch.org/docs/stable/generated/torch.nn.BatchNorm2d.html)
- [`torch.nn.BatchNorm3d`](https://pytorch.org/docs/stable/generated/torch.nn.BatchNorm3d.html)
- [`torch.erf, torch.special.erf`](https://pytorch.org/docs/stable/special.html#torch.special.erf)
RomanBredehoft (Contributor, Author):
I have some doubts about putting these two here (erf + pad), but I think it's OK.
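
A minimal sketch of how `erf` would be exercised, assuming Concrete ML's `compile_torch_model` entry point; the model, input set, and bit width below are illustrative, not taken from this PR:

```python
# Sketch: exercising torch.special.erf through Concrete ML's compiler.
# The model, input set, and n_bits are illustrative assumptions.
import torch
from concrete.ml.torch.compile import compile_torch_model

class ErfModel(torch.nn.Module):
    def forward(self, x):
        # erf is univariate, so it can be evaluated with a table lookup (TLU)
        return torch.special.erf(x)

inputset = torch.randn(100, 10)
quantized_module = compile_torch_model(ErfModel(), inputset, n_bits=6)
```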

- [`torch.ge, torch.greater_equal`](https://pytorch.org/docs/stable/generated/torch.ge.html)
- [`torch.lt, torch.less`](https://pytorch.org/docs/stable/generated/torch.lt.html)
- [`torch.le, torch.less_equal`](https://pytorch.org/docs/stable/generated/torch.le.html)
- [`torch.eq`](https://pytorch.org/docs/stable/generated/torch.eq.html)
RomanBredehoft (Contributor, Author), Feb 6, 2024:
Side note: `torch.eq` is not the same as `torch.equal`. I think we support `torch.eq`, but that may still need to be confirmed.
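
For reference, the difference in plain PyTorch (`torch.eq` is element-wise, `torch.equal` returns a single Python bool):

```python
import torch

a = torch.tensor([1, 2, 3])
b = torch.tensor([1, 0, 3])

torch.eq(a, b)     # tensor([ True, False,  True]) -- element-wise comparison
torch.equal(a, b)  # False -- True only if shapes and all values match
torch.equal(a, a)  # True
```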


- `brevitas.nn.QuantLinear`
- `brevitas.nn.QuantConv1d`
RomanBredehoft (Contributor, Author):
I have a doubt here: is supporting `torch.conv1d` enough to support `brevitas.nn.QuantConv1d`?

Contributor:
I think so, yeah.
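
To illustrate why that should hold: the Brevitas layer wraps the corresponding Torch operator (here `torch.conv1d`) and adds quantizers around it, so a QAT model like the following should go through `compile_brevitas_qat_model`. This is a minimal sketch; the shapes and bit widths are illustrative assumptions:

```python
import torch
import brevitas.nn as qnn
from concrete.ml.torch.compile import compile_brevitas_qat_model

class QuantConvModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Quantize the input before the quantized convolution
        self.quant_inp = qnn.QuantIdentity(bit_width=3, return_quant_tensor=True)
        self.conv = qnn.QuantConv1d(4, 8, kernel_size=3, weight_bit_width=3)

    def forward(self, x):
        return self.conv(self.quant_inp(x))

# Input set shaped (batch, channels, length); bit widths are read
# from the Brevitas layers rather than passed to the compiler
inputset = torch.randn(100, 4, 16)
quantized_module = compile_brevitas_qat_model(QuantConvModel(), inputset)
```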

Contributor:
I think we support all Brevitas operators that are equivalent to those already supported in Torch, so the first sentence seems more appropriate.

Should we add `qnn.QuantLinear`, `qnn.QuantIdentity`, and `qnn.QuantReLU` to this list?

RomanBredehoft (Contributor, Author):
`qnn.QuantLinear` is already in this list, and `qnn.QuantIdentity` is below in the "Quantizers" sub-section.

But for others, like activation functions, @andrei-stoian-zama pointed out to me that we should not suggest users use them, because:

  • it's less certain that we actually support them
  • we most likely never need them anyway: activation functions don't require quantization (they are evaluated with a TLU), so we can just use `torch.relu`

So I don't think there's much to add here! See the sketch below for the recommended pattern.
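
A minimal sketch of that pattern (names, sizes, and bit widths are illustrative assumptions): plain `torch.relu` between Brevitas-quantized layers, since the activation ends up in a table lookup at compile time anyway:

```python
import torch
import brevitas.nn as qnn

class QatBlock(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.quant_inp = qnn.QuantIdentity(bit_width=3, return_quant_tensor=True)
        self.fc = qnn.QuantLinear(10, 5, bias=True, weight_bit_width=3)

    def forward(self, x):
        # No qnn.QuantReLU needed: the univariate activation is evaluated
        # with a table lookup (TLU) once compiled, so torch.relu is enough
        return torch.relu(self.fc(self.quant_inp(x)))
```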


RomanBredehoft merged commit b617740 into main, Feb 7, 2024
12 checks passed
RomanBredehoft deleted the docs/update_torch_support_operator_list_4268 branch, February 7, 2024 12:37