Repositories list
    • server (Public)
      The Triton Inference Server provides an optimized cloud and edge inferencing solution.
      Python · BSD 3-Clause "New" or "Revised" License · Updated Nov 28, 2024
    • C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 28, 2024
    • Triton CLI is an open source command line interface that enables users to create, deploy, and profile models served by the Triton Inference Server.
      Python · Updated Nov 27, 2024
    • Python · BSD 3-Clause "New" or "Revised" License · Updated Nov 27, 2024
    • backend (Public)
      Common source, scripts and utilities for creating Triton backends.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • client (Public)
      Triton Python, C++, and Java client libraries, and gRPC-generated client examples for Go, Java, and Scala (a minimal Python client sketch appears after this list).
      Python · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • common (Public)
      Common source, scripts and utilities shared across all Triton repositories.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • core (Public)
      The core library and APIs implementing the Triton Inference Server.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • C++ · Updated Nov 26, 2024
    • Triton Model Analyzer is a CLI tool that helps users understand the compute and memory requirements of models served by the Triton Inference Server.
      Python · Apache License 2.0 · Updated Nov 26, 2024
    • The Triton backend for the ONNX Runtime.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • Triton backend that enables pre-processing, post-processing, and other logic to be implemented in Python (see the model.py sketch after this list).
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • The Triton backend for PyTorch TorchScript models.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • TRITONCACHE implementation of a Redis cache.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • An example Triton backend that demonstrates sending zero, one, or multiple responses for each request.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • Third-party source packages that are modified for use in Triton.
      C · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • The Triton backend for TensorRT.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • The Triton backend for TensorFlow.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • tutorials (Public)
      This repository contains tutorials and examples for the Triton Inference Server.
      Python · BSD 3-Clause "New" or "Revised" License · Updated Nov 26, 2024
    • The Triton TensorRT-LLM backend.
      Python · Apache License 2.0 · Updated Nov 26, 2024
    • FIL backend for the Triton Inference Server.
      Jupyter Notebook · Apache License 2.0 · Updated Nov 22, 2024
    • OpenVINO backend for Triton.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 20, 2024
    • pytriton (Public)
      PyTriton is a Flask/FastAPI-like interface that simplifies Triton's deployment in Python environments (see the PyTriton sketch after this list).
      Python · Apache License 2.0 · Updated Nov 19, 2024
    • Simple Triton backend used for testing.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 19, 2024
    • Implementation of a local in-memory cache for the Triton Inference Server's TRITONCACHE API.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 19, 2024
    • Example Triton backend that demonstrates most of the Triton Backend API.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 19, 2024
    • The Triton repository agent that verifies model checksums.
      C++ · BSD 3-Clause "New" or "Revised" License · Updated Nov 19, 2024
    • The Triton backend that allows running GPU-accelerated data pre-processing pipelines implemented with DALI's Python API.
      C++ · MIT License · Updated Nov 5, 2024
    • Triton Model Navigator is an inference toolkit designed for optimizing and deploying deep learning models, with a focus on NVIDIA GPUs.
      Python · Apache License 2.0 · Updated Sep 10, 2024
    • contrib (Public)
      Community contributions to Triton that are not officially supported or maintained by the Triton project.
      Python · BSD 3-Clause "New" or "Revised" License · Updated Jun 5, 2024
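To give a sense of how the client libraries (the client repository above) are used, here is a minimal sketch of an HTTP inference request with the Python tritonclient package. The model name, tensor names, shape, and server address are illustrative assumptions, not values taken from this listing.

```python
# Minimal sketch: querying a Triton server over HTTP with tritonclient.
# Model name "simple_model", tensor names INPUT0/OUTPUT0, the shape, and
# the URL are assumptions for illustration.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Build the request input (hypothetical FP32 tensor of shape [1, 16]).
data = np.random.rand(1, 16).astype(np.float32)
infer_input = httpclient.InferInput("INPUT0", list(data.shape), "FP32")
infer_input.set_data_from_numpy(data)

# Ask for the (assumed) output tensor and run inference.
outputs = [httpclient.InferRequestedOutput("OUTPUT0")]
result = client.infer(model_name="simple_model", inputs=[infer_input], outputs=outputs)
print(result.as_numpy("OUTPUT0"))
```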
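For the Python backend entry, a model is implemented as a model.py file exposing a TritonPythonModel class. The sketch below is a pass-through execute function under assumed tensor names (INPUT0/OUTPUT0), which would have to match the model's config.pbtxt.

```python
# Minimal model.py sketch for the Python backend (illustrative only;
# tensor names INPUT0/OUTPUT0 are assumptions).
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    def initialize(self, args):
        # args carries the model configuration and instance info; nothing needed here.
        pass

    def execute(self, requests):
        responses = []
        for request in requests:
            # Read the input tensor and echo it back as the output tensor.
            in_tensor = pb_utils.get_input_tensor_by_name(request, "INPUT0")
            out_tensor = pb_utils.Tensor("OUTPUT0", in_tensor.as_numpy())
            responses.append(pb_utils.InferenceResponse(output_tensors=[out_tensor]))
        return responses

    def finalize(self):
        pass
```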
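For the pytriton entry, the sketch below binds a trivial Python function to a Triton endpoint, following the pattern of PyTriton's published examples. The model name, tensor names, shapes, and batch size are assumptions for illustration.

```python
# Minimal PyTriton sketch: serve a Python function as a Triton model.
# Model/tensor names, shapes, and max_batch_size are illustrative assumptions.
import numpy as np
from pytriton.decorators import batch
from pytriton.model_config import ModelConfig, Tensor
from pytriton.triton import Triton


@batch
def add_one(data):
    # "data" arrives as a batched numpy array; outputs are returned by name.
    return {"result": data + 1.0}


with Triton() as triton:
    triton.bind(
        model_name="AddOne",
        infer_func=add_one,
        inputs=[Tensor(name="data", dtype=np.float32, shape=(-1,))],
        outputs=[Tensor(name="result", dtype=np.float32, shape=(-1,))],
        config=ModelConfig(max_batch_size=8),
    )
    triton.serve()  # blocks, exposing the model over Triton's HTTP/gRPC endpoints
```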