
[Feature request]: Allow LiteLLM to track cost when accessing models from LiteLLM Proxy #4196

Closed
xingyaoww opened this issue Oct 3, 2024 · 4 comments
Labels
enhancement New feature or request fix-me Attempt to fix this issue with OpenHands Stale Inactive for 30 days

Comments

@xingyaoww
Collaborator

What problem or use case are you trying to solve?

Currently, litellm cannot track cost when invoking a model from a LiteLLM proxy.

Describe the UX of the solution you'd like

Do you have thoughts on the technical implementation?

When calling models through a LiteLLM proxy, the response of a litellm.completion call includes an x-litellm-response-cost response header.

We should modify openhands/llm/llm.py so that it first checks whether this response header is present; if it is, we skip the local cost calculation and use the header value directly as the completion cost.

[Screenshot of the x-litellm-response-cost response header]
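For reference, a minimal sketch (not the actual OpenHands change) of how openhands/llm/llm.py could prefer the proxy-reported cost and otherwise fall back to local calculation. How the header is surfaced on the response object (e.g. via `_hidden_params["additional_headers"]`) is an assumption and may differ:

```python
import litellm

def get_completion_cost(response) -> float:
    # The LiteLLM proxy may report the cost it already computed in the
    # "x-litellm-response-cost" response header. Where that header is
    # exposed on the response object is an assumption in this sketch.
    hidden = getattr(response, "_hidden_params", {}) or {}
    headers = hidden.get("additional_headers", {}) or {}
    proxy_cost = headers.get("x-litellm-response-cost")
    if proxy_cost is not None:
        # Trust the proxy's figure and skip any local cost calculation.
        return float(proxy_cost)
    # Otherwise fall back to litellm's client-side cost calculation.
    return litellm.completion_cost(completion_response=response)
```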

@xingyaoww added the enhancement (New feature or request), eval-this, and fix-me (Attempt to fix this issue with OpenHands) labels and removed the eval-this label on Oct 3, 2024
Contributor

github-actions bot commented Oct 3, 2024

OpenHands started fixing the issue! You can monitor the progress here.

Contributor

github-actions bot commented Oct 3, 2024

An attempt was made to automatically fix this issue, but it was unsuccessful. A branch named 'openhands-fix-issue-4196' has been created with the attempted changes. You can view the branch here. Manual intervention may be required.

openhands-agent added a commit that referenced this issue Oct 3, 2024
Contributor

github-actions bot commented Nov 5, 2024

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

github-actions bot added the Stale (Inactive for 30 days) label on Nov 5, 2024
@xingyaoww
Collaborator Author

This should already be addressed by self.model_info in the LLM class.
