input_dim doesn't match buffersDIM[0] #154
The problem lies in these two lines in NetworkRT.cpp
Since I'm using a custom architecture, my input and output names don't match "data" and "out", so the input and output indices are never found. This causes a cascade of silent errors which eventually manifest when input_dim has the wrong dimensions, or (if that's set manually) the output buffers are never initialized.
You need to know the names of the input and output layers of the network and pass them to the getBindingIndex function.
Even if I know the names of my layers (I do), I have to modify NetworkRT.cpp in order to make use of them. So the next time I git pull the tkDNN repo, this bug comes back.
I built a tensorrt engine with a custom architecture and I'm using the NetworkRT class to simplify inference.
In the NetworkRT class, the dimensions of my input_dim are {n=1, c=0, h=0, w=0, l=1}, which are obviously incorrect. An input image can't be 0x0 pixels. But in buffersDIM[0], the dimensions are correct: {n=1, c=1, h=192, w=192, l=1}.
Similar story for output_dim and buffersDIM[0]: {n=1, c=0, h=0, w=0, l=1} and {n=1, c=1, h=2, w=0, l=1}, respectively.
Is this a problem with my TensorRT engine, or am I missing something when initializing my NetworkRT instance?