
feat: implement quantized unfold #474

Merged
merged 6 commits into zama-ai:main on Feb 2, 2024

Conversation

tguerand
Contributor

Implements the QuantizedUnfold class to support the Unfold operation.
See this issue.
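For context, torch.nn.Unfold (im2col) extracts sliding local patches from an input and lays them out as columns. Below is a minimal NumPy sketch of the operation being quantized here; it mirrors torch.nn.Unfold's (C*k*k, L) output layout for a single sample, but it is an illustration, not the PR's actual implementation:

```python
import numpy as np

def unfold2d(x, kernel_size, stride=1, padding=0):
    """Extract sliding kernel_size x kernel_size patches from a (C, H, W)
    array, mirroring torch.nn.Unfold's (C*k*k, L) output layout."""
    c, h, w = x.shape
    if padding:
        x = np.pad(x, ((0, 0), (padding, padding), (padding, padding)))
        h, w = h + 2 * padding, w + 2 * padding
    k = kernel_size
    out_h = (h - k) // stride + 1
    out_w = (w - k) // stride + 1
    cols = np.empty((c * k * k, out_h * out_w), dtype=x.dtype)
    for i in range(out_h):
        for j in range(out_w):
            # Each patch becomes one column: channel-major, then row, then col.
            patch = x[:, i * stride:i * stride + k, j * stride:j * stride + k]
            cols[:, i * out_w + j] = patch.reshape(-1)
    return cols
```

For a 1x4x4 input with a 2x2 kernel and stride 2, this yields 4 columns of 4 values each, one per non-overlapping patch.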

@tguerand tguerand requested a review from a team as a code owner January 30, 2024 02:57

cla-bot bot commented Jan 30, 2024

Thank you for your pull request. We require contributors to sign our Contributor License Agreement / Terms and Conditions, and we don't seem to have the user @tguerand on file. In order for us to review and merge your code, please sign:

  • for an individual contribution: our CLA
  • for a Bounty submission, if you are an individual: our T&C
  • for a Bounty submission, if you are a company: our T&C

to get yourself added.

If you have already signed one of these documents, just wait to be added to the bot config.

@aquint-zama
Contributor

@cla-bot check

@cla-bot cla-bot bot added the cla-signed label Jan 30, 2024

cla-bot bot commented Jan 30, 2024

The cla-bot has been summoned, and re-checked this pull request!

@RomanBredehoft
Contributor

Hello @tguerand, thanks a lot for your contribution!

As you can see above, some checks failed 😞 You can see the full logs here.

We therefore invite you to fix these issues on your side and push again! To make this process much easier to handle, you can run the make conformance (first) and make pcc (second) commands locally. Once they no longer raise any errors, you should be ready to push.

Let me know if you have any questions 😉

@tguerand
Contributor Author

make conformance and make pcc now pass.
One test of my implementation still fails; it is linked to the padding operation from ONNX and Torch. I have also detailed the issue here on Discord.
Let me know how to fix it, thanks!

@tguerand
Contributor Author

It is the last test of test_quantized_ops.py; I have added a comment on its first line to highlight it.
Also, I do not know how to request a code review. How do I do it? I see in the workflow that I requested one at the beginning, but I suppose it is automatic?

@RomanBredehoft
Contributor

RomanBredehoft commented Jan 30, 2024

Hello again @tguerand,
Thanks a lot for making pcc pass 🎉

Indeed, there seem to be a few issues with the tests:

  • in tests/quantization/test_quantized_ops.py::test_all_ops_were_tested, you just need to add QuantizedUnfold: test_quantized_unfold to currently_tested_ops (basically, we make sure all quantized ops are properly tested somewhere!)
  • coverage on line 2606 in quantized_ops.py: add a  # type: ignore inline comment (the double space before the # is important!), as this line is only meant for debugging (and not tests)
  • as for test_quantized_unfold[False-params7-16] and test_quantized_unfold[True-params7-16], it indeed looks like a shape problem (probably related to the pad issue you mention). Tell me if you can solve it; otherwise I'll take a look and see if I can help you more!

@tguerand
Contributor Author

I've pushed fixes for the first two points. Regarding the padding, I do not know how to fix it yet; I'll also look more into it.
Thanks!

@tguerand
Contributor Author

tguerand commented Feb 1, 2024

Ok, I think I've found it: it comes from the ceil_mode variable that introduces TensorFlow-style padding in compute_onnx_pool_padding in onnx_impl_utils.py.
I have two options:

  1. remove the parameter and just pass ceil_mode = 0 in the compute_onnx_pool_padding function call at line 2138 in onnx_impl_utils.py, and remove the ceil_mode == 1 branch at lines 2577 to 2592 in quantized_ops.py (easy method)
  2. add a ceil_mode == 1 branch in onnx_impl_utils.py (I have to think a bit about how to do it)

Let me know which method you prefer for a global integration of QuantizedUnfold; as it is a torch function, I am not sure option 2 makes sense.

Thanks!
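To illustrate why ceil_mode matters here: with ceil_mode = 1 the output-size computation rounds up and can emit one extra, partially padded window, while torch.nn.Unfold always rounds down, hence option 1 pinning ceil_mode = 0. A small sketch (the helper name and the example sizes are hypothetical, not taken from the PR):

```python
import math

def window_count(in_size, kernel, stride, pad, ceil_mode=0):
    """Number of window positions along one spatial axis, as in
    pooling/unfold. ceil_mode=1 (TensorFlow-style) can yield one extra,
    partially padded window; torch.nn.Unfold always floors."""
    span = in_size + 2 * pad - kernel
    return (math.ceil(span / stride) if ceil_mode else span // stride) + 1
```

For example, with in_size=7, kernel=2, stride=2, pad=0: flooring gives 3 windows while ceiling gives 4, exactly the kind of off-by-one shape mismatch a test with odd input sizes would expose.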

@andrei-stoian-zama
Collaborator

Great job in tracking down the issue. We can go with option 1.

@RomanBredehoft
Contributor

RomanBredehoft commented Feb 1, 2024

Yes, good catch @tguerand, option 1 seems fine indeed! You can push your fix and we'll trigger the CI again for you.

@tguerand
Contributor Author

tguerand commented Feb 2, 2024

Ok, I've pushed the changes that remove ceil_mode; all the tests in test_quantized_ops.py now pass :)

@RomanBredehoft
Contributor

Sorry @tguerand, I gave you the wrong disable comment last time 😞 In quantized_ops.py, line 2588, replace # type: ignore with # pragma: no cover.

After this, all should be good and we'll be able to merge the PR 😉 Sorry again!
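For reference, the two comments serve different tools: # type: ignore silences mypy on that line, while # pragma: no cover tells coverage.py to exclude it from coverage measurement. A minimal sketch of the debug-only pattern such an exclusion targets (the names and the environment variable are hypothetical):

```python
import os

DEBUG = os.environ.get("QUANT_DEBUG") == "1"

def unfold_output_len(shape):
    """Toy helper: number of columns produced by an unfold-like op."""
    if DEBUG:  # pragma: no cover  (debug-only branch, excluded from coverage)
        print(f"input shape: {shape}")
    h, w = shape
    return h * w
```

Since the debug branch never runs under the test suite, excluding it keeps the coverage check green without writing a test for a pure print statement.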

@tguerand
Contributor Author

tguerand commented Feb 2, 2024

Ok, I've changed it and pushed :)

Contributor

@RomanBredehoft RomanBredehoft left a comment


Looks great, thanks a lot for this contribution! 🎉

@RomanBredehoft RomanBredehoft merged commit fa3ef88 into zama-ai:main Feb 2, 2024
15 of 18 checks passed
@fd0r
Contributor

fd0r commented Feb 2, 2024

Thanks a lot for the contribution @tguerand!

@tguerand
Contributor Author

tguerand commented Feb 2, 2024

Thanks all! Do you know if there will be a pre-release including this feature soon?

@fd0r
Contributor

fd0r commented Feb 2, 2024

It isn't planned, but we could do a release candidate so you can use it straight from PyPI if you need it.

@tguerand
Contributor Author

tguerand commented Feb 5, 2024

Oh yes, that would be great, thanks! I could install it locally, but it would be much easier as a release candidate on PyPI :)

5 participants