[decoding_attention] Fix broken flash_attention and xformers #300
Workflow: pr.yaml (on: pull_request)

Jobs:
- h100-pytorch-test / linux-test-h100 (50s)
- h100-triton-main-test / linux-test-h100 (50s)

Annotations (2 errors):
- h100-triton-main-test / linux-test-h100: Process completed with exit code 1.
- h100-pytorch-test / linux-test-h100: Process completed with exit code 1.