Request: Need models in onnx and .pt format (not just .bin and .config) #38
I converted Marigold to ONNX and uploaded it to HF in case it's useful to you: https://huggingface.co/julienkay/Marigold
Would love to have it in CoreML too!
Please share scripts to automate these steps here; we will consider including these harness bits in the repository at a later stage.
@julienkay I see the folder structure is different: the VAE encoder and decoder were moved to the top level. Is there a way to keep the original structure? If this is done intentionally, how is this ONNX checkpoint used?
Essentially I've just used the conversion script from diffusers. Here is the code (probably not all packages required)
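As a minimal sketch of that kind of export (following the `torch.onnx.export` pattern used by diffusers' `scripts/convert_stable_diffusion_checkpoint_to_onnx.py`, not the exact snippet above), something like this; the repo id, sizes, opset, and output paths are placeholders:

```python
# Illustrative sketch only: export Marigold's UNet and VAE to ONNX in the
# separate-component layout used by diffusers ONNX pipelines.
# Assumptions: HF repo id, latent/image sizes, opset, and output paths.
from pathlib import Path

import torch
from diffusers import AutoencoderKL, UNet2DConditionModel

model_path = "prs-eth/marigold-v1-0"  # placeholder; point at your checkpoint
out = Path("marigold-onnx")
opset = 14

unet = UNet2DConditionModel.from_pretrained(model_path, subfolder="unet").eval()
vae = AutoencoderKL.from_pretrained(model_path, subfolder="vae").eval()


class UNetExport(torch.nn.Module):
    """Wrap the UNet so the exported graph returns a plain tensor."""

    def __init__(self, unet):
        super().__init__()
        self.unet = unet

    def forward(self, sample, timestep, encoder_hidden_states):
        return self.unet(sample, timestep, encoder_hidden_states, return_dict=False)[0]


class VaeEncoder(torch.nn.Module):
    def __init__(self, vae):
        super().__init__()
        self.vae = vae

    def forward(self, x):
        # mode() gives deterministic latents (the mean), avoiding sampling noise in the graph
        return self.vae.encode(x).latent_dist.mode()


class VaeDecoder(torch.nn.Module):
    def __init__(self, vae):
        super().__init__()
        self.vae = vae

    def forward(self, z):
        return self.vae.decode(z).sample


(out / "unet").mkdir(parents=True, exist_ok=True)
torch.onnx.export(
    UNetExport(unet),
    (
        # in_channels is read from the config (Marigold concatenates image and depth latents)
        torch.randn(1, unet.config.in_channels, 96, 96),
        torch.tensor([1], dtype=torch.int64),
        torch.randn(1, 77, unet.config.cross_attention_dim),
    ),
    str(out / "unet" / "model.onnx"),
    input_names=["sample", "timestep", "encoder_hidden_states"],
    output_names=["out_sample"],
    dynamic_axes={"sample": {0: "batch", 2: "height", 3: "width"}},
    opset_version=opset,
)

(out / "vae_encoder").mkdir(parents=True, exist_ok=True)
torch.onnx.export(
    VaeEncoder(vae), torch.randn(1, 3, 768, 768), str(out / "vae_encoder" / "model.onnx"),
    input_names=["sample"], output_names=["latent"], opset_version=opset,
)

(out / "vae_decoder").mkdir(parents=True, exist_ok=True)
torch.onnx.export(
    VaeDecoder(vae), torch.randn(1, 4, 96, 96), str(out / "vae_decoder" / "model.onnx"),
    input_names=["latent"], output_names=["sample"], opset_version=opset,
)
```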
Afaik, separate encoder/decoder seems to be the "standard" way for ONNX-based pipelines in diffusers.
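For running the exported UNet, something along these lines should work with onnxruntime; the input names are taken from the sketch above, and the 8 latent channels and 1024-dim text embedding are assumptions for the SD2-based UNet, not values verified against the uploaded checkpoint:

```python
# Rough usage sketch for the exported UNet (names/shapes assumed, see note above).
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("marigold-onnx/unet/model.onnx", providers=["CPUExecutionProvider"])
noise_pred = sess.run(
    None,
    {
        "sample": np.zeros((1, 8, 96, 96), dtype=np.float32),
        "timestep": np.array([999], dtype=np.int64),
        "encoder_hidden_states": np.zeros((1, 77, 1024), dtype=np.float32),
    },
)[0]
print(noise_pred.shape)  # expected (1, 4, 96, 96) if out_channels is 4
```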
I was requesting the model in another format because I cannot convert it without the proper model configuration file (I've tried). I need it in ONNX or .pt format specifically for a Unity application called Depthviewer. Can we make this happen?
Here is a list of models and the formats available; as you can see, depth-anything has ONNX, and I was hoping marigold could provide this as well: https://airtable.com/appjWiS91OlaXXtf0/shrchKmROzpsq0HFw/tblviBOLphAw5Befd
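For the .pt side, one rough option is tracing the UNet to TorchScript with `torch.jit.trace`; the repo id and example shapes below are placeholders (assuming the standard diffusers checkpoint layout), and I haven't verified whether Depthviewer can consume such a trace directly:

```python
# Minimal sketch: produce a TorchScript .pt of the Marigold UNet by tracing.
import torch
from diffusers import UNet2DConditionModel

unet = UNet2DConditionModel.from_pretrained(
    "prs-eth/marigold-v1-0", subfolder="unet"  # assumed repo id; use your local path
).eval()


class UNetTrace(torch.nn.Module):
    """Return a plain tensor so torch.jit.trace captures the graph cleanly."""

    def __init__(self, unet):
        super().__init__()
        self.unet = unet

    def forward(self, sample, timestep, encoder_hidden_states):
        return self.unet(sample, timestep, encoder_hidden_states, return_dict=False)[0]


example_inputs = (
    torch.randn(1, unet.config.in_channels, 96, 96),
    torch.tensor([1], dtype=torch.int64),
    torch.randn(1, 77, unet.config.cross_attention_dim),
)
traced = torch.jit.trace(UNetTrace(unet), example_inputs)
traced.save("marigold_unet.pt")
```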