How should resuming training from a checkpoint be configured? #178
Comments
On learning-rate inheritance: I can only find where client_lr_scheduler is saved; I cannot find where client_lr_scheduler is loaded or invoked. Could you give some pointers? Thanks.
We are very sorry, but sat does not currently support restoring the optimizer from a checkpoint.
elesun2018 wrote on 2024-04-12 (replying by email; attached screenshot omitted):
So the optimizer does not need to inherit the state of the earlier training run, and the lr_scheduler inherits the earlier learning rate through args.iteration?
Because the optimizer state usually takes up a fairly large amount of disk space, we do not save the optimizer. If you want the learning rate to be inherited through the iteration count, you need to set the …
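Since sat does not restore the optimizer, a common workaround when resuming is to rebuild the optimizer from scratch and fast-forward a fresh LR scheduler to the saved iteration so the learning rate matches. Below is a minimal, hypothetical sketch in plain PyTorch; the checkpoint keys ("module", "iteration"), the file name, and the optimizer/scheduler choices are assumptions for illustration, not sat's actual resume logic:

```python
import torch

# Stand-in model; in practice this is the sat model you trained.
model = torch.nn.Linear(16, 16)

# Load the checkpoint; assumed to hold the model weights under "module"
# and the training step under "iteration" (key names are illustrative).
checkpoint = torch.load("checkpoint.pt", map_location="cpu")
model.load_state_dict(checkpoint["module"])

# Fresh optimizer: Adam statistics restart from zero, because the
# optimizer state was never saved.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Fast-forward a fresh scheduler so the learning rate matches the
# saved iteration count.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100_000)
for _ in range(checkpoint.get("iteration", 0)):
    scheduler.step()
```

Note that the optimizer's momentum/variance statistics still start from zero with this approach; only the learning-rate schedule is aligned with the previous run.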
I still could not find where the optimizer is saved. When resuming training from a checkpoint, how can the optimizer state be inherited?
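If you need the optimizer statistics themselves to survive a restart, one hedged workaround (not a built-in sat feature) is to save and reload the optimizer and scheduler state dicts yourself with standard PyTorch calls, at the cost of the extra disk space the maintainers mention above. The helper names, file name, and hook points here are assumptions:

```python
import torch

def save_training_state(optimizer, lr_scheduler, iteration,
                        path="optim_states.pt"):
    # Persist everything sat's own checkpoint omits.
    torch.save(
        {
            "iteration": iteration,
            "optimizer": optimizer.state_dict(),
            "lr_scheduler": lr_scheduler.state_dict(),
        },
        path,
    )

def load_training_state(optimizer, lr_scheduler, path="optim_states.pt"):
    # Restore the state saved above; returns the iteration to resume from.
    states = torch.load(path, map_location="cpu")
    optimizer.load_state_dict(states["optimizer"])
    lr_scheduler.load_state_dict(states["lr_scheduler"])
    return states["iteration"]
```

Call save_training_state next to wherever your training loop saves model checkpoints, and load_training_state right after rebuilding the optimizer when resuming.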