Hello, I want to keep some parts of the model in fp32. I set `torch_dtype` to `torch.float16` as described in the Transformers docs, but all parameters are set to fp32. Is there any other suitable method?
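Passing `torch_dtype=torch.float16` to `from_pretrained` casts the model uniformly, so it cannot express a per-module dtype on its own. One common workaround is to cast the whole model to fp16 first and then cast the precision-sensitive submodules back to fp32. A minimal sketch, using a toy `nn.Sequential` standing in for the real Transformers model (the module choice, LayerNorm, is an illustrative assumption, not a requirement):

```python
import torch
import torch.nn as nn

# Toy model standing in for a model loaded via from_pretrained(...).
model = nn.Sequential(
    nn.Linear(8, 8),
    nn.LayerNorm(8),
)

# Cast everything to fp16 first (what torch_dtype=torch.float16 would do)...
model = model.half()

# ...then selectively restore fp32 for the parts that need full precision.
# LayerNorm is a common choice for numerical stability; substitute your own
# isinstance check or module-name match for the parts you want in fp32.
for module in model.modules():
    if isinstance(module, nn.LayerNorm):
        module.float()

print(model[0].weight.dtype)  # torch.float16
print(model[1].weight.dtype)  # torch.float32
```

Note that during the forward pass you must still make sure the activations entering an fp32 module are cast appropriately (some architectures handle this internally; otherwise an explicit `.to()` or a small wrapper module is needed).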
![image](https://private-user-images.githubusercontent.com/58412102/398630947-6fb477ac-2b2d-4e1f-9765-efb5eee9c4fd.png)