
TensRT issue #15

Open
Alpgirl opened this issue Oct 23, 2024 · 6 comments

Comments

@Alpgirl

Alpgirl commented Oct 23, 2024

Good day!
I successfully set up the environment. When I launch src/theia/scripts/preprocessing/feature_extraction.py, I get the warning: "tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT". I installed TensorRT from https://developer.nvidia.com/tensorrt and via pip, but I still get the same warning.
Could you please explain how critical it is to set up TensorRT, and how to install it properly?
Thank you.

@jshang-bdai
Contributor

Hi @Alpgirl ,

TLDR: you can safely ignore it.

This warning comes from TensorFlow. We do not use TensorFlow except for reading the Open-X-Embodiment dataset; we explored that dataset in an early study but did not end up using it.
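If the warning is noisy, it can usually be silenced by raising TensorFlow's native (C++) log threshold before the script starts. This is a config sketch: `TF_CPP_MIN_LOG_LEVEL` is a real TensorFlow environment variable, but whether it hides this particular TF-TRT line may depend on your TensorFlow version.

```shell
# Raise TensorFlow's C++ log threshold before launching the script:
# 0 = everything, 1 = hide INFO, 2 = also hide WARNING, 3 = also hide ERROR.
export TF_CPP_MIN_LOG_LEVEL=2
python src/theia/scripts/preprocessing/feature_extraction.py
```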

@Alpgirl
Author

Alpgirl commented Oct 23, 2024

Actually, I am interested in the Open-X-Embodiment dataset...
Do you know whether TensorRT accelerates feature computation?

@jshang-bdai
Contributor

I don't think so, because our model is based on PyTorch.

Also, a kind note: if you don't plan to use SAM as a teacher model, computing teacher features on the fly won't cost much extra time. SAM features can take a huge amount of storage for a large dataset like OXE.
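To give a sense of the scale, here is a back-of-envelope estimate. It assumes the SAM image encoder emits a 256×64×64 float32 feature map per frame (check the exact shape for the SAM variant you use); the dataset size of one million frames is an illustrative number, not an OXE statistic.

```python
# Back-of-envelope: uncompressed disk space for precomputed SAM encoder features.
# Assumed feature shape per frame: 256 x 64 x 64, float32 (4 bytes per value).
def sam_feature_storage_gib(num_frames: int,
                            channels: int = 256,
                            height: int = 64,
                            width: int = 64,
                            bytes_per_value: int = 4) -> float:
    """Return the uncompressed storage needed, in GiB."""
    per_frame = channels * height * width * bytes_per_value  # 4 MiB per frame
    return num_frames * per_frame / 2**30

# One million frames -> about 3906 GiB (~3.8 TiB) uncompressed.
print(f"{sam_feature_storage_gib(1_000_000):.0f} GiB")
```

By comparison, recomputing the same features on the fly costs only GPU time per epoch, which is why skipping precomputation can be the better trade-off for a dataset this large.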

@Alpgirl
Author

Alpgirl commented Oct 24, 2024

@jshang-bdai , thank you!

@jshang-bdai
Contributor

Sorry @Alpgirl, let me correct my earlier comment on TensorRT: it does not depend on TensorFlow or PyTorch.

After a brief search, I think you could use it, for example via ONNX Runtime: https://huggingface.co/docs/optimum/en/onnxruntime/usage_guides/gpu#accelerated-inference-on-nvidia-gpus. Sorry for the confusion!
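The linked guide boils down to creating an ONNX Runtime session with TensorRT as the preferred execution provider. A minimal pure-Python sketch of the provider-selection step (the provider names are ONNX Runtime's real identifiers; `pick_providers` is a hypothetical helper, and in a real session you would pass it the output of `onnxruntime.get_available_providers()`):

```python
# Hypothetical helper: build an ONNX Runtime provider priority list that
# prefers TensorRT, then CUDA, then falls back to CPU.
PREFERRED = [
    "TensorrtExecutionProvider",
    "CUDAExecutionProvider",
    "CPUExecutionProvider",
]

def pick_providers(available: list[str]) -> list[str]:
    """Keep only the providers actually available, in preference order."""
    ordered = [p for p in PREFERRED if p in available]
    return ordered or ["CPUExecutionProvider"]

# In a real setup (assumes onnxruntime-gpu built with TensorRT support):
#   import onnxruntime
#   session = onnxruntime.InferenceSession(
#       "model.onnx",
#       providers=pick_providers(onnxruntime.get_available_providers()),
#   )
print(pick_providers(["CPUExecutionProvider", "CUDAExecutionProvider"]))
```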

@Alpgirl
Author

Alpgirl commented Oct 25, 2024

@jshang-bdai, thank you for looking into it. I haven't managed to install TensorRT yet, but I will keep trying.
