
Fix hf llama precision. #3634

Triggered via pull request, November 13, 2023 07:21
Status: Cancelled
Total duration: 2m 29s

main.yml

on: pull_request
Build-dipu-pytorch-for-ascend (0s)
Build-dipu-camb (0s)
Build-dipu-camb-latest-target (0s)
Build-dipu-cuda (0s)
Build-dipu-cuda-latest-target (0s)
Build-dipu-ascend (0s)
Build-dipu-ascend-latest-target (0s)
Test-dipu-camb (0s)
Test-one-iter-camb (0s)
Test-dipu-camb-latest-target (0s)
Test-dipu-cuda (0s)
Test-one-iter-cuda (0s)
Test-dipu-cuda-latest-target (0s)
Test-dipu-ascend (0s)
Test-one-iter-ascend (0s)
Test-dipu-ascend-latest-target (0s)

Annotations

2 errors:
Rsync code: Canceling since a higher priority waiting request for 'daoxin/fix_hf_transformer_precision' exists
Rsync code: The operation was canceled.
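
The cancellation message indicates a newer run for the same branch superseded this one: the concurrency group name matches the PR branch 'daoxin/fix_hf_transformer_precision'. This behavior typically comes from a concurrency setting in the workflow. A minimal sketch of such a block, assuming main.yml groups runs by head branch (the actual contents of main.yml are not shown on this page):

    # Hypothetical concurrency block; the real main.yml configuration may differ.
    concurrency:
      # Group runs by the PR's head branch, e.g. 'daoxin/fix_hf_transformer_precision'
      group: ${{ github.head_ref }}
      # Assumed setting: a newer push to the branch cancels runs still in flight
      cancel-in-progress: true

With a group like this in place, pushing a new commit to the PR branch cancels any run still queued or in progress for the previous commit, which matches the Cancelled status and the 0s job durations listed above.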