
[Perf]: increase batch invoke by rewriting http call logic using aiohttp #1548

Merged
Merged 1 commit into main on Apr 23, 2024

Conversation

aybruhm (Member) commented Apr 23, 2024

Description

This PR improves the batch processing logic to run in parallel rather than sequentially.

Related Issue

Closes #1538
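The description says the batch invocations now run in parallel instead of sequentially. A minimal sketch of that pattern, using `asyncio.gather` to schedule every call at once (in the actual PR the per-call HTTP request is made with aiohttp; here `invoke_app`, `run_batch`, and the payload shape are illustrative stand-ins, and the real network call is simulated with a short sleep):

```python
import asyncio

async def invoke_app(session, payload):
    # In the real code this would be an aiohttp request, roughly:
    #   async with session.post(url, json=payload) as resp:
    #       return await resp.json()
    await asyncio.sleep(0.01)  # stand-in for network latency
    return {"input": payload, "output": f"result-{payload}"}

async def run_batch(payloads):
    # Schedule all invocations concurrently; total wall time is roughly
    # the slowest single call rather than the sum of all calls, which is
    # why a 10-item batch drops from ~10x one call to ~1x.
    session = None  # placeholder for aiohttp.ClientSession()
    tasks = [invoke_app(session, p) for p in payloads]
    return await asyncio.gather(*tasks)

results = asyncio.run(run_batch(list(range(10))))
```

`asyncio.gather` preserves input order in its results, so each batch entry still lines up with its payload even though the calls complete out of order.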

@dosubot dosubot bot added the size:M label (This PR changes 30-99 lines, ignoring generated files) on Apr 23, 2024

@dosubot dosubot bot added the Backend, enhancement (New feature or request), and python (Pull requests that update Python code) labels on Apr 23, 2024
@aybruhm aybruhm requested a review from mmabrouk April 23, 2024 08:45
aybruhm (Member, Author) commented Apr 23, 2024

@mmabrouk, I have deployed the PR branch to cloud.beta

mmabrouk (Member) left a comment


Thanks a lot for the PR @aybruhm, this is much faster (it now takes 1 second instead of 10).

  • Can you please add this (speeding up the evaluation) to the changelog (in a separate PR)? If there are any other updates we have forgotten (observability beta), please add those too. Thank you!

@dosubot dosubot bot added the lgtm label (This PR has been approved by a maintainer) on Apr 23, 2024
@mmabrouk mmabrouk merged commit 3641d5a into main Apr 23, 2024
7 checks passed
@mmabrouk mmabrouk deleted the 1538-bug-evaluation-speed-and-llm-call-latency branch April 23, 2024 10:06
Labels
  • Backend
  • enhancement (New feature or request)
  • lgtm (This PR has been approved by a maintainer)
  • python (Pull requests that update Python code)
  • size:M (This PR changes 30-99 lines, ignoring generated files)
Projects
None yet
Development

Successfully merging this pull request may close these issues:
  • [Bug] Evaluation speed and LLM call latency (#1538)
2 participants