Repositories

    • A Flexible Framework for Experiencing Cutting-edge LLM Inference Optimizations
      Python
      Apache License 2.0
      Updated Mar 7, 2025
    • vllm (Public)
      A high-throughput and memory-efficient inference and serving engine for LLMs (see the usage sketch after this list)
      Python
      Apache License 2.0
      Updated Mar 7, 2025
    • Mooncake (Public)
      Mooncake is the serving platform for Kimi, a leading LLM service provided by Moonshot AI.
      C++
      Apache License 2.0
      Updated Mar 7, 2025
    • FlashInfer: Kernel Library for LLM Serving
      Cuda
      Apache License 2.0
      Updated Feb 11, 2025
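For context, the vllm repository listed above tracks the upstream vLLM project, which offers an offline-inference API alongside its serving endpoints. The snippet below is a minimal usage sketch rather than code from this repository; the model name and sampling settings are illustrative placeholders, and it assumes vllm is installed in a GPU-capable environment.

    # Minimal sketch of vLLM's offline inference API (illustrative; assumes
    # `pip install vllm` and a GPU-capable environment).
    from vllm import LLM, SamplingParams

    llm = LLM(model="facebook/opt-125m")  # small example model, placeholder choice
    params = SamplingParams(temperature=0.8, max_tokens=64)

    # generate() runs batched inference and returns one result per prompt
    outputs = llm.generate(["What is paged attention?"], params)
    for out in outputs:
        print(out.outputs[0].text)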