Reinstall flash_attention if it is modified in the CI.
xuzhao9 committed Dec 6, 2024
1 parent de24cc4 commit 2b3f822
Showing 2 changed files with 13 additions and 1 deletion.
12 changes: 12 additions & 0 deletions .github/workflows/_linux-test-h100.yml
@@ -22,11 +22,23 @@ jobs:
         uses: actions/checkout@v3
         with:
           submodules: recursive
+      - uses: dorny/paths-filter@v3
+        id: flash_attn_updated
+        with:
+          filters: |
+            src:
+              - 'submodules/flash-attention/**'
       - name: Tune Nvidia GPU
         run: |
           sudo nvidia-smi -pm 1
           sudo ldconfig
           nvidia-smi
+      - name: Reinstall flash-attn (optional)
+        if: steps.flash_attn_updated.outputs.src == 'true'
+        run: |
+          . "${SETUP_SCRIPT}"
+          python install.py --fa2
+          python install.py --fa3
       - name: Install Tritonbench
         run: |
           # speedup install and skip compile by reusing the docker .so files
2 changes: 1 addition & 1 deletion submodules/flash-attention
Submodule flash-attention updated 104 files
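The `src` filter in the workflow above gates the reinstall step on whether any changed path falls under the flash-attention submodule. A minimal Python sketch of that predicate (the function name `flash_attn_updated` is illustrative only, and a plain prefix check is used rather than dorny/paths-filter's actual glob matcher):

```python
def flash_attn_updated(changed_paths):
    """Return True when any changed path lies under the flash-attention
    submodule, approximating the workflow's 'submodules/flash-attention/**'
    filter with a simple prefix check. A submodule pointer bump shows up
    as the bare path 'submodules/flash-attention', so that case is
    matched explicitly."""
    prefix = "submodules/flash-attention/"
    return any(
        p == prefix.rstrip("/") or p.startswith(prefix)
        for p in changed_paths
    )
```

When the filter matches, the CI re-runs `install.py --fa2` and `install.py --fa3` so the benchmark uses a flash-attention build that matches the updated submodule.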
