Garbled answers from single_turn_mm after QPEFT fine-tuning #200
Comments
The output looks like this. Did I make a mistake when editing the code, or did fine-tuning corrupt the weights?

elligelligelligelligelligelligelligelligelligelligelligellig dispositionPubelligelligelligelligellig dispositionelligelligelligelligellig … job jobPub job jobPub jobPub jobconnectionPub job jobconnection job job jobconnectionconnection job jobconnection jobconnectionconnection

(The full output continues for several hundred more repetitions of tokens such as "ellig", "disposition", "Pub", "Integr", "zeg", "job", and "connection".)
When I use the officially provided weights without qpeft mode, the answers are normal.
Hi, I fine-tuned llama_qformerv2_peft on 122 (image, question-answer) pairs. The answers are fairly complex, which is why fine-tuning is needed; the loss in the last training epoch was 1.4051. During testing, the code raised a dtype-mismatch error, so after the image preprocessing, I cast the tensor with image.cuda().bfloat16() before feeding it to the model; nothing else was changed. The output shown in the Gradio interface is garbled. Could this be because the dataset is too small, or because the answers are complex? Or did something go wrong in my quantized fine-tuning?
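One thing worth checking is the dtype cast described above. A minimal sketch (with hypothetical names: `to_model_dtype` is not part of the repository) of a safer alternative is to query the model's own parameter dtype instead of hard-coding bfloat16, since a cast that disagrees with the dtype the quantized (QPEFT) layers expect can silently corrupt activations and yield garbage tokens:

```python
import torch

def to_model_dtype(image: torch.Tensor, model: torch.nn.Module) -> torch.Tensor:
    # Match the model's compute dtype (e.g. torch.bfloat16) rather than
    # hard-coding one; a mismatched cast is a common source of garbled output.
    target_dtype = next(model.parameters()).dtype
    if torch.cuda.is_available():
        image = image.cuda()
    return image.to(target_dtype)

# Usage sketch with a toy module standing in for the real model:
toy_model = torch.nn.Linear(4, 4).to(torch.bfloat16)
img = torch.randn(1, 4)          # float32 by default
img = to_model_dtype(img, toy_model)
print(img.dtype)                 # torch.bfloat16
```

This only rules out a dtype mismatch on the input side; if the garbling persists with a matching dtype, the problem more likely lies in the quantized fine-tuning itself.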