failed to run ChatGLM GPTQ #1

Open
Aaron4Fun opened this issue Dec 8, 2023 · 1 comment

Comments

Aaron4Fun commented Dec 8, 2023

Have you encountered this problem? Any suggestions would be appreciated!
TypeError: register_forward_pre_hook() got an unexpected keyword argument 'with_kwargs'

auto-gptq == 0.5.1
torch == 1.13.0
transformers == 4.35.2
optimum == 1.15.0
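
For reference, the `with_kwargs` keyword was only added to `register_forward_pre_hook` in PyTorch 2.0, so torch 1.13.0 rejects it when optimum registers its export hooks. A minimal, hypothetical reproduction (standalone, not taken from the optimum code itself):

```python
import torch.nn as nn

# Hypothetical minimal reproduction: optimum registers forward pre-hooks with
# `with_kwargs=True` during export, but that keyword only exists in torch >= 2.0.
module = nn.Linear(4, 4)

def pre_hook(mod, args, kwargs):
    # With `with_kwargs=True`, the hook also receives the keyword arguments.
    return args, kwargs

# On torch 1.13 this line raises:
#   TypeError: register_forward_pre_hook() got an unexpected keyword argument 'with_kwargs'
module.register_forward_pre_hook(pre_hook, with_kwargs=True)
```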

sammysun0711 (Owner) commented Dec 9, 2023

@Aaron4Fun, I suspect the latest optimum 1.15.0 introduces a regression. Here are two solutions you can try:

  1. Create a new Python virtual env using the [python requirements](https://github.com/sammysun0711/ov_llm_bench/blob/main/requirements.txt)
  2. Use the fix for optimum 1.15.0: Fix compatibility causallm models export with optimum 1.15 (huggingface/optimum-intel#487); a sketch of the kind of version guard such a fix involves is shown below.
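
For context, a compatibility fix of this kind essentially has to guard the hook registration on the installed torch version. A rough sketch of such a guard (a hypothetical helper for illustration only, not the actual code in huggingface/optimum-intel#487):

```python
import torch
from packaging import version

def register_pre_hook_compat(module, hook):
    """Hypothetical helper: register a forward pre-hook on old and new torch.

    torch >= 2.0 supports `with_kwargs=True` (the hook sees args and kwargs);
    older releases only accept the legacy (module, args) hook signature.
    """
    if version.parse(torch.__version__) >= version.parse("2.0.0"):
        return module.register_forward_pre_hook(hook, with_kwargs=True)

    def legacy_hook(mod, args):
        # Call the kwargs-aware hook with empty kwargs and drop them again,
        # since the legacy API only lets the hook replace positional args.
        result = hook(mod, args, {})
        if result is not None:
            new_args, _ = result
            return new_args
        return None

    return module.register_forward_pre_hook(legacy_hook)
```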
