[ChatQnA] Switch to vLLM as the default LLM backend on Gaudi #3148
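For context, the change under test is a docker compose swap of the default LLM serving container. The snippet below is a minimal, hypothetical sketch of what a vLLM service on Gaudi can look like; the image tag, model, ports, and environment variables are illustrative assumptions, not the PR's exact diff.

```yaml
# Hypothetical sketch of a vLLM-on-Gaudi service in a ChatQnA compose file.
# Image tag, model, and ports are assumptions, not the PR's actual contents.
services:
  vllm-service:
    image: opea/vllm-gaudi:latest        # assumed image name
    container_name: vllm-gaudi-server
    ports:
      - "8007:80"
    runtime: habana                      # Gaudi containers run under the habana runtime
    cap_add:
      - SYS_NICE
    ipc: host
    environment:
      HABANA_VISIBLE_DEVICES: all
      OMPI_MCA_btl_vader_single_copy_mechanism: none
      HF_TOKEN: ${HF_TOKEN}
    command: --model meta-llama/Meta-Llama-3-8B-Instruct --host 0.0.0.0 --port 80
```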
Workflow: pr-docker-compose-e2e.yml
Trigger: on: pull_request_target
Job: get-test-matrix / Get-test-matrix (7s)
Matrix: example-test
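The run fans out through a generated matrix: the get-test-matrix job computes which (example, hardware) pairs to test, and the example-test job consumes that list as its strategy matrix. Below is a minimal sketch of this common generate-then-consume pattern; the job names mirror this run, but the script path and output wiring are assumptions.

```yaml
# Minimal sketch of a generated test matrix. The script path and the
# JSON shape it emits are assumptions, not the repo's actual workflow.
name: E2E test with docker compose
on: pull_request_target

jobs:
  get-test-matrix:
    runs-on: ubuntu-latest
    outputs:
      matrix: ${{ steps.gen.outputs.matrix }}
    steps:
      - uses: actions/checkout@v4
      - id: gen
        # Emits e.g. [{"example":"ChatQnA","hardware":"gaudi"}] as a JSON array
        run: echo "matrix=$(bash .github/scripts/get_test_matrix.sh)" >> "$GITHUB_OUTPUT"

  example-test:
    needs: get-test-matrix
    strategy:
      matrix:
        include: ${{ fromJSON(needs.get-test-matrix.outputs.matrix) }}
    runs-on: ${{ matrix.hardware }}   # e.g. a self-hosted "gaudi" runner label
    steps:
      - run: echo "Testing ${{ matrix.example }} on ${{ matrix.hardware }}"
```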
Annotations (2 errors, 1 warning)
Error: example-test (ChatQnA, gaudi) / get-test-case
  Canceling since a higher priority waiting request for 'E2E test with docker compose-1404' exists.

Error: example-test (ChatQnA, gaudi) / get-test-case
  The operation was canceled.
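These two errors are not test failures: a newer run in the same concurrency group preempted this one. That behavior comes from a workflow-level concurrency block with cancel-in-progress enabled. A minimal sketch follows; the group expression is an assumption chosen to match the message format above (the suffix seen here, 1404, is whatever the group expression resolves to for this run).

```yaml
# Minimal sketch: cancel an in-flight run when a newer run for the
# same group is queued. The group expression is an assumption.
name: E2E test with docker compose
on: pull_request_target
concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number }}
  cancel-in-progress: true
```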
Warning: get-test-matrix / Get-test-matrix
  ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
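The warning is informational: the ubuntu-latest runner label is migrating to the ubuntu-24.04 image (see the linked issue). A workflow that needs a stable image can pin the version explicitly instead of tracking the moving label, for example:

```yaml
# Pin the runner image rather than tracking the moving ubuntu-latest label.
jobs:
  get-test-matrix:
    runs-on: ubuntu-24.04
```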