[ChatQnA] Switch to vLLM as default llm backend on Xeon #1117
Triggered via pull request: January 16, 2025 12:42
Status: Success
Total duration: 2m 54s
Artifacts: none
Workflow: check-online-doc-build.yml
on: pull_request