Is there a recommended way to do continued pretraining?
I think this is just fine-tuning that follows the original pretraining objective. For example, with the MLM task you would randomly mask some tokens, load the model weights with `AutoModelForMaskedLM`, and then fine-tune on your own corpus. I'm not sure whether there is a dedicated library for this.
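For reference, here is a minimal sketch of that idea using the Transformers `Trainer`; the checkpoint name (`bert-base-chinese`) and corpus file (`corpus.txt`) are placeholders, not anything from this thread:

```python
# Continued pretraining with the MLM objective (sketch).
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

model_name = "bert-base-chinese"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Placeholder: one text sample per line in corpus.txt.
dataset = load_dataset("text", data_files={"train": "corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# The collator randomly masks 15% of tokens, matching BERT-style pretraining.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="mlm-continued", num_train_epochs=1),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```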
If you are working with a large-scale LLM, take a look at Hugging Face's official PEFT library, which provides many popular efficient-tuning methods such as Low-Rank Adaptation (LoRA).
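A minimal sketch of wrapping a causal LM with LoRA via PEFT; the checkpoint name and hyperparameters below are illustrative assumptions, not from this thread:

```python
# LoRA fine-tuning setup with the PEFT library (sketch).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-560m")  # placeholder

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,              # rank of the low-rank update matrices
    lora_alpha=32,    # scaling factor applied to the LoRA updates
    lora_dropout=0.1,
)

# Freeze the base model and inject trainable low-rank adapters.
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the LoRA adapters are trainable
```

The wrapped model can then be trained with the usual `Trainer` loop; since only the adapter weights receive gradients, memory use is far lower than full fine-tuning.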