
ONNX export #4

Open
ionymikler opened this issue Dec 3, 2024 · 1 comment

Comments

@ionymikler

Hi @falcon-xu

I am looking into how to export the model to ONNX, with the intention of benchmarking its performance on different machines and with optimization techniques such as quantization.
I see there is some minimal code in the repo suggesting that you attempted this (e.g., the DeiTOnnxConfig class in configuration_deit.py), but I can't find any scripts that actually use it.
Any recommendations on how to proceed?

I understand from reading about PyTorch's ONNX export that the default process, which uses torch.jit.trace(), does not capture the dynamic behavior of the model, so I guess an alternative that does is needed. Have you experimented with this?
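To make the trace limitation concrete, here is a minimal toy example (not from this repo): torch.jit.trace records only the operations executed for the example input, so a data-dependent branch gets baked in.

```python
import torch


class Branchy(torch.nn.Module):
    """Toy model with data-dependent control flow (stand-in, not DeiT)."""

    def forward(self, x):
        if x.sum() > 0:       # branch depends on the input's values
            return x + 1
        return x - 1


model = Branchy()
# Trace with a positive-sum example input: the `x + 1` branch is recorded.
traced = torch.jit.trace(model, torch.ones(3))

# Eager model takes the other branch for a negative-sum input...
print(model(-torch.ones(3)))   # tensor([-2., -2., -2.])
# ...but the traced graph still applies the baked-in `x + 1` branch.
print(traced(-torch.ones(3)))  # tensor([0., 0., 0.])
```

torch.jit.trace also emits a TracerWarning here, flagging that the trace may not generalize; that warning is exactly the symptom of the problem discussed above.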

@falcon-xu
Owner

Hi @ionymikler,

Currently, we do not export the model to ONNX. As you mentioned, torch.jit.trace() doesn't capture the full dynamics of the model, particularly when the model has control flow or other dynamic operations. Instead, you could use torch.onnx.export() with dynamic axes, or explore exporting through torch.jit.script(), which handles models with dynamic behavior.

If you want further guidance on the specifics of ONNX export or need help integrating it into your workflow, the ONNX community may be able to help.

Best regards,
Falcon
