ONNX inference issue #134
@baudm I am running into some issues and would appreciate your help. First, how can the problem above be resolved? Second, after converting an older version of the project to ONNX, I noticed a significant discrepancy between the ONNX output and the pre-trained PyTorch model's output. Could you offer any suggestions on this? Thank you very much for taking the time to respond.
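To quantify the discrepancy described above, one approach (a sketch, not something from this thread) is to feed the same input to both models and compare the raw output arrays. The helper below is hypothetical; `torch_out` and `onnx_out` stand for the logits returned by the PyTorch model and the ONNX session, respectively:

```python
import numpy as np

def max_abs_diff(torch_out, onnx_out):
    """Largest element-wise absolute difference between two output
    arrays, e.g. logits from the PyTorch and ONNX models."""
    a = np.asarray(torch_out, dtype=np.float64)
    b = np.asarray(onnx_out, dtype=np.float64)
    assert a.shape == b.shape, "outputs must have the same shape"
    return float(np.abs(a - b).max())
```

A difference on the order of 1e-5 is ordinary floating-point noise from export; a much larger difference suggests the exported graph does not match the PyTorch one.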
In parseq/models.py, line 117, change tensor.bool to tensor.float. The issue may depend on the versions of torch and onnxruntime-gpu.
When I run this code, I get a warning:

```python
import torch
import onnx
from strhub.models.utils import load_from_checkpoint  # from the parseq repo

# Load the pretrained model and disable iterative refinement and
# autoregressive decoding so the graph can be traced for export
parseq = load_from_checkpoint('pretrained=parseq').eval()
parseq.refine_iters = 0
parseq.decode_ar = False

# Export to ONNX using a dummy input of the model's expected image size
image = torch.rand(1, 3, *parseq.hparams.img_size)
parseq.to_onnx('parseq.onnx', image, do_constant_folding=True, opset_version=14)

# Verify the exported model
onnx_model = onnx.load('parseq.onnx')
onnx.checker.check_model(onnx_model, full_check=True)
```

(screenshot of the export warning)
Then, when I use this ONNX model for inference, I encounter the following error:

(screenshot of the onnxruntime error)

And this is my inference code:

(screenshot of the inference code)
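For reference, once inference succeeds, the model's output is a logits tensor, and decoding it is independent of the runtime. A minimal greedy-decoding sketch follows; the EOS index of 0 matches my reading of parseq's tokenizer, but treat it as an assumption and check your own charset:

```python
import numpy as np

EOS_ID = 0  # assumed index of the [E]/EOS token; verify against your tokenizer

def greedy_decode(logits):
    """Greedy-decode logits of shape (batch, seq_len, num_classes)
    into per-sample lists of token ids, stopping at EOS."""
    ids = np.asarray(logits).argmax(axis=-1)
    results = []
    for seq in ids:
        toks = []
        for t in seq:
            if t == EOS_ID:  # stop decoding at the EOS token
                break
            toks.append(int(t))
        results.append(toks)
    return results
```

Mapping the resulting ids back to characters then only requires the charset used at training time.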