
[ChatQnA] Update the default LLM to llama3-8B on cpu/gpu/hpu #1231

Triggered via pull request on January 20, 2025 at 13:33
Status: Success
Total duration: 1m 12s

pr-link-path-scan.yml
on: pull_request

Jobs:
check-the-validity-of-hyperlinks-in-README: 1m 2s
check-the-validity-of-relative-path: 11s
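For reference, a workflow with this trigger and these two jobs could be structured as in the minimal sketch below. This is an assumption about the shape of pr-link-path-scan.yml, not its actual contents: the job names and the pull_request trigger come from this run, while the checkout step and the check script paths are hypothetical placeholders.

```yaml
# Hypothetical sketch of pr-link-path-scan.yml; the real steps and scripts may differ.
name: pr-link-path-scan

on: pull_request

jobs:
  check-the-validity-of-hyperlinks-in-README:
    runs-on: ubuntu-latest   # source of the "ubuntu-24.04 soon" warning in the annotations
    steps:
      - uses: actions/checkout@v4
      - name: Check hyperlinks
        run: |
          # placeholder: scan README/markdown files for dead hyperlinks
          bash .github/workflows/scripts/check_hyperlinks.sh

  check-the-validity-of-relative-path:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Check relative paths
        run: |
          # placeholder: verify relative links resolve to files that exist in the repo
          bash .github/workflows/scripts/check_relative_paths.sh
```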

Annotations: 2 warnings

check-the-validity-of-relative-path:
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636

check-the-validity-of-hyperlinks-in-README:
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636