[ChatQnA] Switch to vLLM as default llm backend on Gaudi #1125

Triggered via pull request on January 16, 2025, 13:59
Status: Success
Total duration: 3m 9s
Artifacts: —
Workflow: check-online-doc-build.yml
Trigger: on: pull_request
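The run was produced by a workflow triggered on `pull_request`. As a rough sketch only — the actual contents of `check-online-doc-build.yml` are not shown here, and the job name, runner, and build commands below are assumptions — such a doc-build check typically looks like:

```yaml
# Hypothetical sketch of check-online-doc-build.yml; step details are assumed,
# not taken from the repository.
name: check-online-doc-build
on: pull_request

jobs:
  build-docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # Install the doc toolchain and build; requirements file and build
      # command are placeholders for whatever the project actually uses.
      - name: Build online docs
        run: |
          pip install -r requirements.txt
          make html
```

A check like this fails the pull request if the documentation no longer builds, which matches the "Success" status recorded for this run.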