Fix hf llama precision. #3645
Triggered via pull request on November 14, 2023 02:43
Status: Cancelled
Total duration: 13m 42s
Artifacts: –
main.yml (on: pull_request)
Rsync code: 3m 19s
Test-dipu-camb-latest-target: 5m 4s
Test-dipu-cuda-latest-target: 3m 54s
Test-dipu-ascend-latest-target: 8s
Annotations (12 errors)

Test-one-iter-cuda: Canceling since a higher priority waiting request for 'daoxin/fix_hf_transformer_precision' exists
Test-one-iter-cuda: The operation was canceled.
Test-dipu-cuda: Canceling since a higher priority waiting request for 'daoxin/fix_hf_transformer_precision' exists
Test-dipu-cuda: The operation was canceled.
Test-dipu-camb: Canceling since a higher priority waiting request for 'daoxin/fix_hf_transformer_precision' exists
Test-dipu-camb: The operation was canceled.
Test-dipu-cuda-latest-target: Canceling since a higher priority waiting request for 'daoxin/fix_hf_transformer_precision' exists
Test-dipu-cuda-latest-target: The operation was canceled.
Test-dipu-camb-latest-target: Canceling since a higher priority waiting request for 'daoxin/fix_hf_transformer_precision' exists
Test-dipu-camb-latest-target: The operation was canceled.
Test-one-iter-camb: Canceling since a higher priority waiting request for 'daoxin/fix_hf_transformer_precision' exists
Test-one-iter-camb: The operation was canceled.
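All 12 errors are the same GitHub Actions concurrency cancellation: a newer run was queued for the same branch ('daoxin/fix_hf_transformer_precision'), so this run was superseded and its in-flight jobs were stopped. Below is a minimal sketch of the kind of concurrency block in a workflow like main.yml that produces these annotations; the group key and the job shown are assumptions for illustration, not the repository's actual configuration.

```yaml
# Hypothetical sketch, not the real main.yml.
# A concurrency group keyed on the PR head branch means a newer push to
# 'daoxin/fix_hf_transformer_precision' cancels the older in-flight run,
# which is what the "Canceling since a higher priority waiting request
# ... exists" annotations above report.
name: main
on: pull_request

concurrency:
  group: ${{ github.head_ref || github.ref }}  # assumed group key: the branch name
  cancel-in-progress: true                     # cancel the older run in the same group

jobs:
  rsync-code:                 # placeholder job mirroring the "Rsync code" step above
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: echo "rsync code to build machines"
```

With cancel-in-progress enabled, only the latest run per branch consumes runner time, which is why every job here ends with "The operation was canceled." rather than a test failure.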