[ChatQnA] Switch to vLLM as default llm backend on Gaudi #1119

Triggered via pull request on January 16, 2025 at 12:48
Status: Success
Total duration: 3m 1s
Workflow: check-online-doc-build.yml (on: pull_request)