
android fp16: severe accuracy drop with use_fp16_arithmetic=true #5771

Open
cbingdu opened this issue Nov 8, 2024 · 1 comment

Comments

@cbingdu

cbingdu commented Nov 8, 2024

detail | detailed description

A model trained in PyTorch was exported to ONNX and converted to an ncnn model with the onnx2ncnn tool, then optimized with ncnnoptimize into separate fp32 and fp16 ncnn models. At inference time, whichever model is loaded (fp32 or fp16), the results computed in fp32 differ only slightly from PyTorch. But once fp16 inference is enabled, and especially with use_fp16_arithmetic=true, the inference results diverge significantly.
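For context on why use_fp16_arithmetic in particular hurts: fp16 has only a 10-bit mantissa, so when intermediate sums inside a layer are also kept in fp16 (rather than just storing weights in fp16), small contributions are rounded away once the accumulator grows. A minimal numpy sketch of this accumulation effect (illustrative only, not ncnn's actual kernels):

```python
import numpy as np

def accumulate(values, dtype):
    """Sum values one at a time, rounding the accumulator to `dtype`
    after every addition -- mimicking arithmetic done in that precision."""
    acc = dtype(0.0)
    for v in values:
        acc = dtype(acc + dtype(v))
    return float(acc)

values = [0.01] * 10000          # true sum: 100.0
fp32_sum = accumulate(values, np.float32)
fp16_sum = accumulate(values, np.float16)

# The fp32 accumulator lands very close to 100.0; the fp16 accumulator
# stalls far below it once 0.01 falls under half the fp16 spacing at
# the accumulator's magnitude.
print(fp32_sum, fp16_sum)
```

This is the same failure mode as long reductions inside convolutions or inner products: fp16 *storage* alone perturbs each value by at most ~0.05%, but fp16 *arithmetic* lets rounding error compound across the whole reduction.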

@cbingdu
Author

cbingdu commented Nov 8, 2024

When use_fp16_arithmetic=true is set in ncnn, is there any technique that can be applied during PyTorch training to approximate fp16 arithmetic, so that the accuracy drop at inference time is not too severe?
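One common family of techniques (an assumption here, not something confirmed in this thread) is "fake fp16" simulation during training, analogous to quantization-aware training: round tensors to the nearest fp16 value and cast back to fp32 in the forward pass, so the network learns under the same rounding it will see at inference. A numpy sketch with a hypothetical `fake_fp16` helper and a toy linear layer:

```python
import numpy as np

def fake_fp16(x):
    """Round a float32 array to the nearest fp16 value, then cast back.
    Inserted after each layer's inputs/outputs during training, this
    exposes the model to fp16 rounding while keeping fp32 elsewhere."""
    return x.astype(np.float16).astype(np.float32)

def linear_fp16_sim(x, w, b):
    """Hypothetical linear layer forward pass with simulated fp16
    rounding on inputs, parameters, and output."""
    x, w, b = fake_fp16(x), fake_fp16(w), fake_fp16(b)
    return fake_fp16(x @ w + b)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8)).astype(np.float32)
w = rng.standard_normal((8, 3)).astype(np.float32)
b = np.zeros(3, dtype=np.float32)

exact = x @ w + b
approx = linear_fp16_sim(x, w, b)
print(np.max(np.abs(exact - approx)))  # small but nonzero rounding gap
```

In a real PyTorch training loop the cast is non-differentiable, so one would route gradients around it with a straight-through estimator, e.g. `x + (x.half().float() - x).detach()`. Note this sketch rounds operands and results but still multiplies in fp32, so it approximates, rather than exactly reproduces, ncnn's fp16 kernels.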
