
Error in Evaluation after Applying PV-Tuning to AQLM Quantized Model #156

Open
gunho1123 opened this issue Dec 3, 2024 · 1 comment

@gunho1123

Hello,
I would like to evaluate a model that has been quantized using AQLM and subsequently fine-tuned with PV-Tuning.
After PV-Tuning, I obtained a best_model directory containing files such as model.layers.0.mlp.down_proj.weight.pth.

However, when I attempted to run lmeval.py, I encountered the following error:

FileNotFoundError: [Errno 2] No such file or directory: '/mnt/temp/AQLM/finetuned/meta-llama/Llama-3.1-8B/best_model/0.pth'

Could you help me resolve this issue? Specifically:

  1. Should the fine-tuned model files be converted to a specific format before evaluation?
  2. Is there any additional step required to ensure compatibility with lmeval.py?

Thank you in advance for your assistance.

@Vahe1994
Owner

Hello!
Sorry for the late answer. Because PV-Tuning finetuning uses FSDP, you need to convert the saved checkpoint to the standard format using https://github.com/Vahe1994/AQLM/tree/pv-tuning?tab=readme-ov-file#4-exporting-the-saved-model. Sorry for the confusion; I will try to add this to the README documentation.
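
For reference, a quick way to tell which format a `best_model` directory is in before running `lmeval.py`. This is a hypothetical helper, not part of the AQLM codebase (the actual conversion is done by the export step linked above); it only assumes that the raw FSDP save contains per-parameter files like `model.layers.0.mlp.down_proj.weight.pth`, while the exported format contains per-layer files like `0.pth`, as seen in the error message:

```python
# Sketch: detect whether a PV-Tuning output directory is still in the raw
# per-parameter FSDP format or already in the per-layer format that
# lmeval.py expects. Hypothetical helper for diagnosis only.
import re
from pathlib import Path

def checkpoint_format(best_model_dir: str) -> str:
    names = [p.name for p in Path(best_model_dir).glob("*.pth")]
    if any(re.fullmatch(r"\d+\.pth", n) for n in names):
        return "per-layer"      # e.g. 0.pth -- ready for lmeval.py
    if any(n.startswith("model.layers.") for n in names):
        return "per-parameter"  # raw FSDP save -- run the export step first
    return "unknown"
```

If this reports `per-parameter`, the export step from the README still needs to be run before evaluation.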
