Commit 217aa54 · upd readme for demo
max-unfinity committed Dec 13, 2024 · 1 parent 318b4e9
supervisely_integration/demo/README.md · 12 additions and 6 deletions
After you've trained a model in Supervisely, you can download the checkpoint from the platform and run inference locally:

1. **Set up environment**. Install [requirements](https://github.com/supervisely-ecosystem/RT-DETRv2/blob/main/rtdetrv2_pytorch/requirements.txt) manually, or use our pre-built docker image [DockerHub](https://hub.docker.com/r/supervisely/rt-detrv2/tags). Clone [RT-DETRv2](https://github.com/supervisely-ecosystem/RT-DETRv2) repository with model implementation.
2. **Download** your checkpoint and model files from Supervisely Platform.
3. **Run inference**. Refer to our demo scripts: [demo_pytorch.py](https://github.com/supervisely-ecosystem/RT-DETRv2/blob/main/supervisely_integration/demo/demo_pytorch.py), [demo_onnx.py](https://github.com/supervisely-ecosystem/RT-DETRv2/blob/main/supervisely_integration/demo/demo_onnx.py), [demo_tensorrt.py](https://github.com/supervisely-ecosystem/RT-DETRv2/blob/main/supervisely_integration/demo/demo_tensorrt.py)


## Step-by-step guide:

### 1. Set up environment

Install the [requirements](https://github.com/supervisely-ecosystem/RT-DETRv2/blob/main/rtdetrv2_pytorch/requirements.txt) manually, or use our pre-built docker image from [DockerHub](https://hub.docker.com/r/supervisely/rt-detrv2/tags). Then clone the [RT-DETRv2](https://github.com/supervisely-ecosystem/RT-DETRv2) repository with the model implementation:

```bash
git clone https://github.com/supervisely-ecosystem/RT-DETRv2
```
### 2. Download checkpoint and model files from Supervisely Platform

For RT-DETRv2, you need to download the following files:

**For PyTorch inference:**
- `checkpoint.pth` - model weights, for example `best.pth`
- `model_config.yml` - model configuration
- `model_meta.json` - class names

**ONNXRuntime and TensorRT inference require only \*.onnx and \*.engine files respectively.**
- Exported ONNX/TensorRT models can be found in the `export` folder in Team Files after training.

Go to Team Files in Supervisely Platform and download the files.

Files for PyTorch inference:

![team_files_download](https://github.com/user-attachments/assets/796bf915-fbaf-4e93-a327-f0caa51dced4)
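
If you prefer scripting the download, below is a minimal sketch using the Supervisely Python SDK. The team id and the remote paths are placeholders; replace them with the paths of your trained model's artifacts in Team Files.

```python
# Minimal sketch: download checkpoint files from Team Files with the Supervisely SDK.
# The team id and remote paths are placeholders -- replace them with your own.
import os
import supervisely as sly

api = sly.Api.from_env()  # reads SERVER_ADDRESS and API_TOKEN from the environment
team_id = 123             # your team id

remote_files = [
    "/path/in/team-files/best.pth",          # checkpoint weights
    "/path/in/team-files/model_config.yml",  # model configuration
    "/path/in/team-files/model_meta.json",   # class names
]

os.makedirs("model", exist_ok=True)
for remote_path in remote_files:
    api.file.download(team_id, remote_path, os.path.join("model", os.path.basename(remote_path)))
```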

### 3. Run inference

We provide several demo scripts to run inference with your checkpoint:

- [demo_pytorch.py](https://github.com/supervisely-ecosystem/RT-DETRv2/blob/main/supervisely_integration/demo/demo_pytorch.py) - simple PyTorch inference
- [demo_onnx.py](https://github.com/supervisely-ecosystem/RT-DETRv2/blob/main/supervisely_integration/demo/demo_onnx.py) - ONNXRuntime inference
- [demo_tensorrt.py](https://github.com/supervisely-ecosystem/RT-DETRv2/blob/main/supervisely_integration/demo/demo_tensorrt.py) - TensorRT inference
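
For reference, here is a minimal ONNXRuntime sketch. The input names (`images`, `orig_target_sizes`), the 640x640 input size, and the output order (labels, boxes, scores) are assumptions based on the reference RT-DETRv2 export; verify them against `demo_onnx.py` and your exported model.

```python
# Minimal ONNXRuntime sketch. Assumed export signature: inputs "images" and
# "orig_target_sizes", 640x640 input, outputs (labels, boxes, scores) -- see demo_onnx.py.
import numpy as np
import onnxruntime as ort
from PIL import Image

session = ort.InferenceSession("best.onnx", providers=["CPUExecutionProvider"])

image = Image.open("image.jpg").convert("RGB")
orig_size = np.array([[image.width, image.height]], dtype=np.int64)
blob = np.asarray(image.resize((640, 640)), dtype=np.float32).transpose(2, 0, 1)[None] / 255.0

labels, boxes, scores = session.run(None, {"images": blob, "orig_target_sizes": orig_size})

keep = scores[0] > 0.5  # confidence threshold
for label, box, score in zip(labels[0][keep], boxes[0][keep], scores[0][keep]):
    print(int(label), box.tolist(), float(score))
```

Class indices can be mapped back to names with the `model_meta.json` file downloaded in the previous step.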
