
[KUNLUNXIN] case config: llama3-8b, chatglm3-6b, interconnect #1024
Triggered via pull request October 4, 2024 18:40
@w4yne synchronize #757
w4yne:main
Status: Failure
Total duration: 1d 5h 1m 47s
Artifacts: none

Job: run-klx-training-test (0s)

Annotations

1 error
run-klx-training-test
This request was automatically failed because there were no enabled runners online to process the request for more than 1 days.
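This annotation is GitHub's standard failure for jobs that target a self-hosted runner and remain queued with no matching runner online; GitHub fails such queued requests after roughly one day. For context, below is a minimal sketch of what a job definition of this kind might look like. The runner labels, timeout, and steps are assumptions for illustration, not the repository's actual workflow.

```yaml
# Hypothetical sketch; label names and steps are assumptions,
# not the repository's actual configuration.
name: klx-training-test

on:
  pull_request:
    types: [opened, synchronize]   # "synchronize" matches the trigger shown above

jobs:
  run-klx-training-test:
    # The job queues until a runner carrying ALL of these labels is online.
    # If no such runner comes online, GitHub fails the queued request after
    # about one day, producing the annotation shown above.
    runs-on: [self-hosted, kunlunxin]
    timeout-minutes: 2880   # limits execution time once a runner has picked up the job
    steps:
      - uses: actions/checkout@v4
      - name: Run KUNLUNXIN training test
        run: bash run.sh    # placeholder for the actual test entry point
```

Resolving this particular failure requires bringing a runner with the matching labels back online and re-running the failed job; no change to the workflow file itself is implied.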