
[decoding_attention] Fix broken flash_attention and xformers #303

Triggered via pull request on December 9, 2024 17:26
Status: Failure
Total duration: 12m 48s

pr.yaml

on: pull_request
h100-pytorch-test / linux-test-h100: 6m 12s
h100-triton-main-test / linux-test-h100: 6m 20s

Annotations

2 errors
h100-triton-main-test / linux-test-h100: Process completed with exit code 1.
h100-pytorch-test / linux-test-h100: Process completed with exit code 1.