[decoding_attention] Fix broken flash_attention and xformers #299
Workflow: pr.yaml (on: pull_request)
h100-pytorch-test / linux-test-h100: 48s
h100-triton-main-test / linux-test-h100: 51s
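The two check names above ("<job> / linux-test-h100") suggest a workflow that calls a shared H100 test job from two top-level jobs. A minimal sketch of what such a pr.yaml could look like; the file path, input name, and values below are assumptions inferred from the check names, not the repository's actual configuration:

```yaml
# Hypothetical sketch, not the real pr.yaml.
name: pr
on: pull_request

jobs:
  h100-pytorch-test:
    # Reuses a shared H100 test workflow (assumed path).
    uses: ./.github/workflows/linux-test-h100.yml
  h100-triton-main-test:
    uses: ./.github/workflows/linux-test-h100.yml
    with:
      triton-ref: main   # assumed input: run against Triton main branch
```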
Annotations
2 errors
h100-pytorch-test / linux-test-h100: Process completed with exit code 1.
h100-triton-main-test / linux-test-h100: Process completed with exit code 1.