
Fix FlashAttension SLM path failure #2182

Merged
2 commits merged from quintin/fix_fa_slm into llvm-target on Sep 11, 2024
Conversation

quintinwang5 (Contributor)
@vlad-penkin vlad-penkin linked an issue Sep 10, 2024 that may be closed by this pull request
@etiotto etiotto changed the title [FIX]: fix FlashAttension SLM path failure Fix FlashAttension SLM path failure Sep 10, 2024
@whitneywhtsang whitneywhtsang enabled auto-merge (squash) September 11, 2024 03:05
@whitneywhtsang whitneywhtsang merged commit 0b30cf9 into llvm-target Sep 11, 2024
4 checks passed
@whitneywhtsang whitneywhtsang deleted the quintin/fix_fa_slm branch September 11, 2024 03:41
Successfully merging this pull request may close these issues:

[CI] FlashAttension SLM path failure
3 participants