This repository has been archived by the owner on Dec 7, 2022. It is now read-only.

tensorrt based craft returning 0 values. #2

Open
Justsubh01 opened this issue Mar 16, 2022 · 5 comments

Comments

@Justsubh01

I am trying to run inference with the TensorRT-based CRAFT, but the inference returns:

ic| bs: (1, 320, 560, 2)
ic| 'head: ': 'head: '
    y_out[0:np.array(bs).prod()]: array([0., 0., 0., ..., 0., 0., 0.], dtype=float32)
ic| 'tail: ': 'tail: '
    y_out[np.array(bs).prod():]: array([0., 0., 0., ..., 0., 0., 0.], dtype=float32)
ic| 5, 'y2: ', y.shape: torch.Size([1, 320, 560, 2])
ic| 'value: ': 'value: '
    y: tensor([[[[0., 0.],
                 [0., 0.],
                 [0., 0.],
                 ...,
                 [0., 0.],
                 [0., 0.],
                 [0., 0.]],
       
                [[0., 0.],
                 [0., 0.],
                 [0., 0.],
                 ...,
                 [0., 0.],
                 [0., 0.],
                 [0., 0.]],
       
                [[0., 0.],
                 [0., 0.],
                 [0., 0.],
                 ...,
                 [0., 0.],
                 [0., 0.],
                 [0., 0.]],
       
                ...,
       
                [[0., 0.],
                 [0., 0.],
                 [0., 0.],
                 ...,
                 [0., 0.],
                 [0., 0.],
                 [0., 0.]],
       
                [[0., 0.],
                 [0., 0.],
                 [0., 0.],
                 ...,
                 [0., 0.],
                 [0., 0.],
                 [0., 0.]],
       
                [[0., 0.],
                 [0., 0.],
                 [0., 0.],
                 ...,
                 [0., 0.],
                 [0., 0.],
                 [0., 0.]]]])

The ONNX-based CRAFT works fine...

Thanks, if you can help.
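
For reference, one quick way to confirm the discrepancy is to run the same preprocessed input through the exported ONNX model with onnxruntime and compare it with what the TensorRT wrapper returns. The snippet below is only a sketch of that check; the file name craft.onnx and the input shape (guessed from the (1, 320, 560, 2) output in the log) are assumptions, not this repo's actual code:

    import numpy as np
    import onnxruntime as ort

    # Assumed input shape: CRAFT outputs (N, H/2, W/2, 2), so (1, 320, 560, 2)
    # suggests an NCHW input of (1, 3, 640, 1120). Replace with the real
    # preprocessed image tensor that is fed to the TensorRT engine.
    x = np.random.rand(1, 3, 640, 1120).astype(np.float32)

    # "craft.onnx" is a placeholder name for the exported model.
    sess = ort.InferenceSession("craft.onnx", providers=["CPUExecutionProvider"])
    input_name = sess.get_inputs()[0].name
    y_onnx = sess.run(None, {input_name: x})[0]

    # If y_onnx contains sensible non-zero values while the TensorRT path returns
    # all zeros on the same input, the weights/export are fine and the problem is
    # in the TensorRT inference code (buffer copies, stream sync, etc.).
    print(y_onnx.shape, y_onnx.min(), y_onnx.max(), np.count_nonzero(y_onnx))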

@k9ele7en
Owner

Yeah, the inference function for TensorRT does not return the correct result yet; I will fix it when I have time. Thanks...

@Justsubh01
Author

Hi @k9ele7en,

Thank you very much for your response. In the meantime, if you could suggest a solution I can try myself, I will try it and report back.

Thank you in advance!

@k9ele7en
Owner

k9ele7en commented Mar 21, 2022

Someone else has faced the same issue. The error comes from an incorrect TensorRT engine inference process, which requires multiple steps of moving data between the host (CPU) and the device(s) (GPUs).
I commented in that issue; you can follow those steps to debug and find out...
#1 (comment)
Or you can try this new interface to convert and run TensorRT inference, though it is not as low-level and optimized as the traditional way: #1 (comment)
Thanks.
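
For anyone hitting the same problem, those "multiple steps" are roughly: copy the input from host to device, execute the engine, copy the outputs back from device to host, and synchronize the stream before reading the host buffers; skipping a copy or reading the buffers before synchronization typically leaves them at their initial values, i.e. all zeros. The snippet below is only a generic pycuda/TensorRT sketch of that sequence (static-shape engine, single input, and the engine file name are assumptions), not the repo's actual inference code:

    import numpy as np
    import pycuda.driver as cuda
    import pycuda.autoinit  # creates the CUDA context
    import tensorrt as trt

    TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

    def load_engine(path):
        # path is a serialized engine file, e.g. "craft.engine" (placeholder name)
        with open(path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
            return runtime.deserialize_cuda_engine(f.read())

    def infer(engine, x):
        # x: numpy array already matching the engine's input shape and dtype
        context = engine.create_execution_context()
        stream = cuda.Stream()
        bindings, inputs, outputs = [], [], []

        # Allocate page-locked host buffers and device buffers for every binding
        # (assumes a static-shape engine, so the binding shapes are fully known).
        for name in engine:
            shape = engine.get_binding_shape(name)
            dtype = trt.nptype(engine.get_binding_dtype(name))
            host_mem = cuda.pagelocked_empty(trt.volume(shape), dtype)
            dev_mem = cuda.mem_alloc(host_mem.nbytes)
            bindings.append(int(dev_mem))
            (inputs if engine.binding_is_input(name) else outputs).append((host_mem, dev_mem))

        # 1) host -> device: copy the input into GPU memory.
        np.copyto(inputs[0][0], x.ravel())
        cuda.memcpy_htod_async(inputs[0][1], inputs[0][0], stream)

        # 2) run the engine on the GPU.
        context.execute_async_v2(bindings=bindings, stream_handle=stream.handle)

        # 3) device -> host: copy every output back.
        for host_mem, dev_mem in outputs:
            cuda.memcpy_dtoh_async(host_mem, dev_mem, stream)

        # 4) wait for the async copies and kernels to finish BEFORE reading the
        #    host buffers; reading earlier returns stale/zero data.
        stream.synchronize()
        return [host_mem.copy() for host_mem, _ in outputs]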

@Justsubh01
Author

Hi @k9ele7en, any updates on the issue?

@k9ele7en
Owner

k9ele7en commented Apr 5, 2022

I still haven't had time to work on it yet; also, I currently don't have a machine with a GPU and CUDA/TensorRT set up on it.
