Supported Models in README #60
-
Can you please update the README to specify which models you support? All LLMs from Hugging Face, or specific architectures? Extended from this issue: #30. By the way, this is a great library for quantizing models.
Replies: 1 comment
-
Hi, I think it's quite clear.
If a Hugging Face model is not on the supported list (Llama, Mistral, Mixtral, Phi), use auto mode: https://github.com/mobiusml/hqq/?tab=readme-ov-file#auto-mode-1
Auto mode basically tries to figure out how to quantize the model automatically, but it's not guaranteed to work.
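For reference, here is a minimal sketch of what auto mode can look like, assuming the `AutoHQQHFModel` and `BaseQuantizeConfig` names from the README section linked above (the model id is just a placeholder); check the README for the exact, current API:

```python
# Minimal sketch of HQQ auto mode for an unlisted Hugging Face model.
# Names and arguments follow the README section linked above; verify them
# against the current README before using.
import torch
from transformers import AutoModelForCausalLM
from hqq.core.quantize import BaseQuantizeConfig
from hqq.models.hf.base import AutoHQQHFModel

model_id = "some-org/some-unlisted-model"  # hypothetical placeholder
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

# 4-bit quantization config (group_size is a common default in the examples)
quant_config = BaseQuantizeConfig(nbits=4, group_size=64)

# Auto mode scans the model for linear layers and quantizes them in place.
# As noted above, this is best-effort and not guaranteed to work for every architecture.
AutoHQQHFModel.quantize_model(
    model,
    quant_config=quant_config,
    compute_dtype=torch.float16,
    device="cuda",
)
```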