[decoding_attention] Fix broken flash_attention and xformers #275
Triggered via pull request on December 6, 2024 at 18:23
Status: Cancelled
Total duration: 22m 27s
Artifacts: –
pr.yaml (on: pull_request)
h100-pytorch-test / linux-test-h100 (15m 59s)
h100-triton-main-test / linux-test-h100 (22m 11s)
Annotations: 4 errors
h100-triton-main-test / linux-test-h100: Canceling since a higher priority waiting request for 'TritonBench PR Test-99-false' exists
h100-triton-main-test / linux-test-h100: The operation was canceled.
h100-pytorch-test / linux-test-h100: Canceling since a higher priority waiting request for 'TritonBench PR Test-99-false' exists
h100-pytorch-test / linux-test-h100: The operation was canceled.