Support AWQ models #1049
base: main
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
@mvafin, can we have a test for this, e.g. with "hf-internal-testing/Mixtral-tiny-AWQ" or any other dummy model?
Yes, a test would be great.
@AlexKoff88 @IlyasMoutawwakil Test added.
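For reference, a minimal sketch of what such a test could look like, assuming the `OVModelForCausalLM` API from `optimum.intel` and the dummy checkpoint suggested above; the actual test added in the PR may differ.

```python
# Hypothetical test sketch; not the exact test added in this PR.
from transformers import AutoTokenizer

from optimum.intel import OVModelForCausalLM


def test_awq_model_export():
    model_id = "hf-internal-testing/Mixtral-tiny-AWQ"  # dummy AWQ checkpoint mentioned in the review
    # export=True converts the AWQ checkpoint to OpenVINO IR on the fly
    model = OVModelForCausalLM.from_pretrained(model_id, export=True)
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    inputs = tokenizer("Hello world", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=4)
    # the exported model should be able to generate new tokens
    assert outputs.shape[-1] > inputs["input_ids"].shape[-1]
```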
@mvafin, some of the tests failed. Can you please check on your side?
Fixed.
Still getting AWQ CUDA errors.
That might be because optimum is not tested with the 2024.6 version of OpenVINO.
Enable AWQ export only if OpenVINO supports it
* fix style
* disable autogptq and autoawq install for old transformers testing
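As a rough illustration of what such a gate could look like (an assumption, not the exact code from this PR), the check could be keyed on the installed OpenVINO version, since AWQ support landed in 2024.6:

```python
# Sketch of a version gate for AWQ export; assumes AWQ support requires OpenVINO >= 2024.6.
import openvino
from packaging import version


def is_awq_export_supported() -> bool:
    # openvino.get_version() returns something like "2024.6.0-17404-...", keep the numeric prefix
    ov_version = openvino.get_version().split("-")[0]
    return version.parse(ov_version) >= version.parse("2024.6")


if not is_awq_export_supported():
    raise RuntimeError("Exporting AWQ models requires OpenVINO 2024.6 or newer.")
```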
```yaml
- if: ${{ matrix.transformers-version == 'latest' && matrix.test-pattern == '*modeling*' }}
  name: Install auto-gptq, autoawq
  run: |
    pip install auto-gptq autoawq --extra-index-url https://download.pytorch.org/whl/cpu
```
These are not valid extra index URLs for auto-gptq and autoawq.
The PR in its current form seems to degrade the user experience greatly. My understanding is that it suggests we pass the responsibility of patching autoawq and autogptq to the user? Like writing their own...
What does this PR do?
Add support for AWQ models, following the support added in OpenVINO 2024.6 (openvinotoolkit/openvino#27859).
The 16-bit patching can also be used for GPTQ and AWQ models to support FP16/BF16 regions in the model.
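For illustration, the end-user flow this enables could look roughly like the following; the model id used here is only a placeholder, not one referenced in the PR.

```python
# Illustrative usage only; assumes OpenVINO >= 2024.6 and an AWQ checkpoint from the Hub.
from optimum.intel import OVModelForCausalLM

model_id = "TheBloke/Mistral-7B-Instruct-v0.2-AWQ"  # placeholder AWQ checkpoint
# export=True converts the AWQ checkpoint to OpenVINO IR on the fly
model = OVModelForCausalLM.from_pretrained(model_id, export=True)
model.save_pretrained("mistral-awq-openvino")  # saves a reusable OpenVINO IR
```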
Fixes # (issue)
Before submitting