TensorRT issue #15
Comments
Hi @Alpgirl, TL;DR: you can safely ignore it. This warning comes from TensorFlow. We do not use TensorFlow except for reading the Open-X-Embodiment dataset. We explored this dataset in our early study, but we ultimately didn't use it.
Actually, I am interested in the Open-X-Embodiment dataset...
I don't think so, because our model is based on PyTorch. Also a kind note: if you don't plan to use SAM as a teacher model, computing teacher features on the fly won't cost much more time. SAM features may take huge storage space for a large dataset like OXE.
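For intuition on that storage cost, here is a back-of-envelope sketch. The feature dimensions below (a 64×64 spatial grid, 256 channels, float32) and the frame count are hypothetical placeholders, not Theia's or SAM's actual export settings:

```python
# Back-of-envelope storage estimate for caching per-frame teacher features.
# All dimensions are assumptions (SAM-style 64x64 spatial grid, 256 channels,
# float32); adjust for the actual extractor configuration.

GRID = 64            # spatial resolution of the feature map (assumed)
CHANNELS = 256       # feature channels (assumed)
BYTES_PER_FLOAT = 4  # float32

bytes_per_frame = GRID * GRID * CHANNELS * BYTES_PER_FLOAT
n_frames = 1_000_000  # a hypothetical OXE-scale frame count

total_tib = bytes_per_frame * n_frames / 2**40
print(f"{bytes_per_frame / 2**20:.0f} MiB per frame, "
      f"{total_tib:.1f} TiB for {n_frames:,} frames")
```

Even at these modest assumed dimensions, caching features for every frame quickly reaches terabytes, which is why computing teacher features on the fly can be the better trade-off.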
@jshang-bdai, thank you!
Sorry @Alpgirl, let me correct my comment on TensorRT. It does not depend on TensorFlow or PyTorch. After a brief search, I think you could use it, for example via https://huggingface.co/docs/optimum/en/onnxruntime/usage_guides/gpu#accelerated-inference-on-nvidia-gpus. Sorry for the confusion!
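Following the approach in the linked guide (ONNX Runtime with a TensorRT execution provider), a minimal sketch of choosing a provider priority list so TensorRT is used only when the local onnxruntime build actually offers it. The provider names are ONNX Runtime's real identifiers; the helper function and model path are hypothetical:

```python
# Sketch: prefer TensorRT, then CUDA, then CPU when creating an ONNX Runtime
# session. Provider names are ONNX Runtime's; everything else is illustrative.

PREFERRED = ("TensorrtExecutionProvider", "CUDAExecutionProvider",
             "CPUExecutionProvider")

def pick_providers(available, preferred=PREFERRED):
    """Keep only the preferred providers this onnxruntime build offers,
    falling back to CPU if none of them are available."""
    chosen = [p for p in preferred if p in available]
    return chosen or ["CPUExecutionProvider"]

# With onnxruntime(-gpu) installed and a model exported to ONNX, usage would
# look roughly like this (not run here; "model.onnx" is a placeholder):
#   import onnxruntime as ort
#   providers = pick_providers(ort.get_available_providers())
#   session = ort.InferenceSession("model.onnx", providers=providers)
```

Passing an explicit fallback list means the same script degrades gracefully on machines where TensorRT is not installed, instead of failing at session creation.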
@jshang-bdai, thank you for the search. But I haven't been successful in installing TensorRT yet. I will try more.
Good day!
I successfully set up the environment. When I launch src/theia/scripts/preprocessing/feature_extraction.py, there is a warning: "tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT". I installed TensorRT from https://developer.nvidia.com/tensorrt and via pip, but I still get the same warning.
Could you please explain how critical it is to set up TensorRT, and how to install it properly?
Thank you.
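For context on the warning above: TF-TRT emits "Could not find TensorRT" when TensorFlow's dynamic loader cannot locate TensorRT's runtime library (libnvinfer). A small sketch to check whether that library is discoverable on the current machine; note that pip-installed TensorRT wheels may place the library in a package directory that is not on the default search path, in which case adding it to LD_LIBRARY_PATH is a common fix:

```python
import ctypes.util

def tensorrt_runtime_visible():
    # TF-TRT's warning fires when the loader cannot locate libnvinfer;
    # find_library mirrors that lookup on the current machine.
    return ctypes.util.find_library("nvinfer") is not None

print("libnvinfer discoverable:", tensorrt_runtime_visible())
```

If this prints False even after installing TensorRT, the install likely succeeded but the library directory is simply not visible to the loader, which matches the maintainer's point that the warning is harmless here.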