Fix broken flash_attention and xformers (#99) #318
pr.yaml (on: push)
h100-pytorch-test / linux-test-h100: 6m 47s
h100-triton-main-test / linux-test-h100: 2m 17s
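The check names above follow the GitHub Actions `caller-job / called-job` pattern, which suggests `pr.yaml` dispatches both H100 jobs to a shared reusable workflow. A minimal sketch of what such a `pr.yaml` could look like is below; the workflow path `linux-test-h100.yml` and all job contents are assumptions inferred from the check names, not the repository's actual configuration.

```yaml
# Hypothetical sketch of pr.yaml, assuming the "linux-test-h100" half of
# each check name is a reusable workflow invoked by both jobs.
name: pr
on: push

jobs:
  h100-pytorch-test:
    # Runs the shared H100 test workflow (path is an assumption).
    uses: ./.github/workflows/linux-test-h100.yml
  h100-triton-main-test:
    # Same reusable workflow; presumably pinned to Triton main via its inputs.
    uses: ./.github/workflows/linux-test-h100.yml
```

In this pattern, per-job differences (e.g. which Triton build to test against) would be passed through the reusable workflow's `with:` inputs rather than duplicated job definitions.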