upload files for TensorRT and TF-TRT for yolo LP #508
Created two nodes for TensorRT inference. Requires TensorRT, protobuf==3.8.0 and pycuda installed (currently only tested on an Nvidia Jetson). Refer to the Notion page for more details: https://www.notion.so/TensorRT-d63035adb45c43e180a2517c9fce474d
yolo_TF_TRT:
- Builds a TensorRT-optimized saved_model using the TF-TRT API
- Requires the saved-model directories of yolov4 and yolov4tiny (TF2 saved_model format)
- The file path of the newly built TensorRT saved_model can be edited in the config file
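As a rough illustration of the build step above, the sketch below uses TensorFlow's TF-TRT converter (`TrtGraphConverterV2`) to produce an optimized saved_model. The paths and the FP16 precision choice are assumptions for illustration, not taken from this PR; the real paths come from the node's config file.

```python
# Hedged sketch of the TF-TRT conversion step; paths are hypothetical.
def convert_to_tftrt(saved_model_dir: str, output_dir: str) -> None:
    """Build a TensorRT-optimized saved_model from a TF2 saved_model."""
    # Import inside the function so the file can be imported on machines
    # without TensorFlow/TensorRT installed (e.g. for unit tests).
    from tensorflow.python.compiler.tensorrt import trt_convert as trt

    # FP16 is a common choice on Jetson-class hardware (assumption).
    params = trt.TrtConversionParams(precision_mode=trt.TrtPrecisionMode.FP16)
    converter = trt.TrtGraphConverterV2(
        input_saved_model_dir=saved_model_dir,
        conversion_params=params,
    )
    converter.convert()
    # The output path here is what would then be referenced in the config file.
    converter.save(output_dir)


if __name__ == "__main__":
    convert_to_tftrt("models/yolov4_saved_model", "models/yolov4_tftrt")
```

The same function would be called once per model (yolov4 and yolov4tiny), pointing at each saved-model directory in turn.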
tensorRT:
- Uses a preconverted .trt yolov4 or yolov4tiny model for inference
- Following the Notion page, the yolov4/v4tiny models are converted from darknet weights > onnx > .trt
- Requires precompiling a C++ plugin; currently the plugin is referenced from another repo found on the Notion page. If this specific method is adopted, the plugin folder has to be replicated from that repo.
- The plugin path then has to be updated in the config file
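To show how the pieces above fit together, here is a hedged sketch of loading the precompiled C++ plugin and deserializing a prebuilt .trt engine with the TensorRT Python API. It would only run on a machine with TensorRT (and, for actual inference, pycuda) installed, e.g. a Jetson; the engine and plugin paths are hypothetical stand-ins for the values in the config file.

```python
# Hedged sketch: load the compiled plugin, then deserialize a .trt engine.
# Paths are hypothetical; the real ones come from the config file.
import ctypes


def load_engine(engine_path: str, plugin_path: str):
    """Deserialize a prebuilt TensorRT engine, registering the YOLO plugin first."""
    # Import inside the function so this file can be read/imported on
    # machines without TensorRT installed.
    import tensorrt as trt

    # Loading the shared library registers the precompiled C++ YOLO plugin.
    ctypes.CDLL(plugin_path)
    logger = trt.Logger(trt.Logger.INFO)
    trt.init_libnvinfer_plugins(logger, "")
    with open(engine_path, "rb") as f, trt.Runtime(logger) as runtime:
        return runtime.deserialize_cuda_engine(f.read())


if __name__ == "__main__":
    # Plugin filename follows the referenced repo's convention (assumption).
    engine = load_engine("models/yolov4.trt", "plugins/libyolo_layer.so")
```

Inference itself would then allocate device buffers with pycuda and run an execution context on the deserialized engine, as described on the Notion page.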