
[ChatQnA] Switch to vLLM as default llm backend on Gaudi #3148

Triggered via pull request January 16, 2025 12:48
@wangkl2 opened #1404
Status: Cancelled
Total duration: 32s
Artifacts: none

pr-docker-compose-e2e.yml

on: pull_request_target
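The pull_request_target trigger runs the workflow from the base repository, with base-repository secrets available, which is how a fork PR can reach the self-hosted Gaudi runners. A minimal sketch of how this trigger might be declared in pr-docker-compose-e2e.yml; the workflow name is inferred from the concurrency group in the cancellation message below, and the branch filter and activity types are assumptions, not taken from the actual file:

```yaml
# Hypothetical trigger section; the real pr-docker-compose-e2e.yml may differ.
name: E2E test with docker compose

on:
  pull_request_target:
    branches: [main]
    types: [opened, reopened, ready_for_review, synchronize]
```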
get-test-matrix / Get-test-matrix (7s)
Matrix: example-test
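The Get-test-matrix job computes which example/hardware pairs to test (here ChatQnA on gaudi), and the example-test jobs fan out over that result. A sketch of the usual dynamic-matrix pattern, assuming the matrix is published as a JSON job output named run_matrix; the output name, payload shape, and runner label are assumptions:

```yaml
# Hypothetical dynamic-matrix wiring; job and output names are assumptions.
jobs:
  get-test-matrix:
    runs-on: ubuntu-latest
    outputs:
      run_matrix: ${{ steps.matrix.outputs.run_matrix }}
    steps:
      - id: matrix
        # Emit e.g. {"include":[{"example":"ChatQnA","hardware":"gaudi"}]}
        run: |
          echo 'run_matrix={"include":[{"example":"ChatQnA","hardware":"gaudi"}]}' >> "$GITHUB_OUTPUT"

  example-test:
    needs: get-test-matrix
    strategy:
      matrix: ${{ fromJSON(needs.get-test-matrix.outputs.run_matrix) }}
    runs-on: ${{ matrix.hardware }}  # e.g. a self-hosted 'gaudi' runner label
    steps:
      - run: echo "E2E test for ${{ matrix.example }} on ${{ matrix.hardware }}"
```

Each matrix entry becomes one job instance, which is why the annotations below name the job as example-test (ChatQnA, gaudi).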

Annotations

2 errors and 1 warning
Error: example-test (ChatQnA, gaudi) / get-test-case
Canceling since a higher priority waiting request for 'E2E test with docker compose-1404' exists

Error: example-test (ChatQnA, gaudi) / get-test-case
The operation was canceled.

Warning: get-test-matrix / Get-test-matrix
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
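Both errors reflect GitHub Actions' standard concurrency preemption rather than a test failure: a newer run entered the same concurrency group, so this run was canceled after 32s before any test executed. Given the group name 'E2E test with docker compose-1404' in the message, the workflow plausibly declares a concurrency block like the following; the exact expressions are assumptions reconstructed from that group name:

```yaml
# Hypothetical concurrency block; expressions reconstructed from the group
# name 'E2E test with docker compose-1404', so treat them as assumptions.
concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
  cancel-in-progress: true
```

With cancel-in-progress: true, each new push to PR #1404 preempts any run already in flight for the same pull request, which matches the Cancelled status and 32s duration above.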