[decoding_attention] Fix broken flash_attention and xformers #273
Workflow: pr.yaml (on: pull_request)

Jobs:
- h100-pytorch-test / linux-test-h100 — 48m 5s
- h100-triton-main-test / linux-test-h100 — 48m 11s
Annotations (2 errors):
- h100-pytorch-test / linux-test-h100: Process completed with exit code 1.
- h100-triton-main-test / linux-test-h100: Process completed with exit code 1.