Update vllm colab to work with asyncio requirements #33122

Merged: 3 commits into master on Nov 19, 2024

Conversation

damccorm
Contributor

In #32687 I updated the vLLM model handler to use an asyncio event loop. This led to large efficiency gains (around <batch_size>x faster), but it also broke the handler in environments where an asyncio event loop is already running, because of https://stackoverflow.com/questions/55409641/asyncio-run-cannot-be-called-from-a-running-event-loop-when-using-jupyter-no

This is generally fine (such environments are not the normal way to run Beam), but it does break things in Colab environments. It can be easily fixed with a small tweak, though - this PR makes that tweak to our example so it keeps working. I've confirmed we don't need the tweak to run on Dataflow or other runners.
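For context, the underlying problem is that `asyncio.run()` raises `RuntimeError` when called from a thread that already has a running event loop, which is exactly the situation inside Colab/Jupyter notebooks. A minimal sketch of one common workaround (detecting the running loop and dispatching to a fresh loop on a worker thread) is below; the helper names `run_coro` and `generate` are hypothetical and this is not necessarily the exact tweak this PR applied to the notebook:

```python
import asyncio
import concurrent.futures


async def generate():
    # Stand-in for an async inference call (e.g. to a vLLM engine).
    await asyncio.sleep(0)
    return "ok"


def run_coro(coro):
    """Run a coroutine whether or not an event loop is already running.

    In a plain script, asyncio.run() works. In Colab/Jupyter, a loop is
    already running on the main thread, so asyncio.run() would raise
    RuntimeError; instead, run the coroutine via asyncio.run() on a
    separate thread, which has no running loop of its own.
    """
    try:
        asyncio.get_running_loop()
    except RuntimeError:
        # No running loop in this thread: the simple path is safe.
        return asyncio.run(coro)
    # A loop is already running here: hand off to a worker thread.
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as ex:
        return ex.submit(asyncio.run, coro).result()


print(run_coro(generate()))  # prints: ok
```

Another widely used alternative in notebooks is the third-party `nest_asyncio` package (`nest_asyncio.apply()`), which patches the running loop to allow re-entrant `asyncio.run()` calls.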


Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:

  • Mention the appropriate issue in your description (for example: addresses #123), if applicable. This will automatically add a link to the pull request in the issue. If you would like the issue to automatically close on merging the pull request, comment fixes #<ISSUE NUMBER> instead.
  • Update CHANGES.md with noteworthy changes.
  • If this contribution is large, please file an Apache Individual Contributor License Agreement.

See the Contributor Guide for more tips on making the review process smoother.

To check the build health, please visit https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md

GitHub Actions Tests Status (on master branch)

Build python source distribution and wheels
Python tests
Java tests
Go tests

See CI.md for more information about GitHub Actions CI or the workflows README to see a list of phrases to trigger workflows.

@damccorm
Contributor Author

This shouldn't be merged before the release, but it can get reviewed now I guess

@damccorm damccorm marked this pull request as ready for review November 14, 2024 19:20
Contributor

Assigning reviewers. If you would like to opt out of this review, comment assign to next reviewer.

R: @robertwb added as fallback since no labels match configuration

Available commands:

  • stop reviewer notifications - opt out of the automated review tooling
  • remind me after tests pass - tag the comment author after tests pass
  • waiting on author - shift the attention set back to the author (any comment or push by the author will return the attention set to the reviewers)

The PR bot will only process comments in the main thread (not review comments).

@damccorm
Contributor Author

R: @shunping

Contributor

Stopping reviewer notifications for this pull request: review requested by someone other than the bot, ceding control. If you'd like to restart, comment assign set of reviewers

Contributor

@shunping shunping left a comment


LGTM!

@damccorm damccorm merged commit ca92726 into master Nov 19, 2024
3 checks passed
@damccorm damccorm deleted the users/damccorm/vllmColab branch November 19, 2024 15:19
damccorm added a commit that referenced this pull request Nov 19, 2024
damccorm added a commit that referenced this pull request Dec 2, 2024
…#33171)

* Revert "Revert "Update vllm colab to work with asyncio requirements (#33122)"…"

This reverts commit baa1591.

* Install triton as well
2 participants