
Support Sagemaker async endpoint deployment. #2567

Open
yinsong1986 opened this issue Nov 17, 2024 · 0 comments
Comments

@yinsong1986

Description

Someone reported an error when deploying models on LMI SageMaker containers behind a SageMaker async endpoint. Please refer to vllm-project/vllm#2912.

Expected Behavior

Error Message

2024-11-16T23:22:06.943:[sagemaker logs] [xxxxxxxxxxxxxxxxxxxxxx] The response from container primary did not specify the required Content-Length header

How to Reproduce?

(If you developed your own code, please provide a short script that reproduces the error. For existing examples, please provide a link.)

Steps to reproduce

See vllm-project/vllm#2912 for the commands that produced the error.
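For context, a minimal sketch of the kind of setup being described, assuming the SageMaker Python SDK: an LMI container deployed behind a SageMaker async endpoint and then invoked with invoke_endpoint_async. The image URI, model id, S3 paths, IAM role, instance type, and endpoint name below are placeholders, not values taken from this report.

```python
# Sketch only: deploy an LMI container as a SageMaker async endpoint.
# All identifiers (image URI, model id, bucket, role, instance type) are placeholders.
import boto3
import sagemaker
from sagemaker.model import Model
from sagemaker.async_inference import AsyncInferenceConfig

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder role

model = Model(
    image_uri="<LMI container image URI>",        # placeholder LMI (djl-inference) image
    role=role,
    env={
        "HF_MODEL_ID": "<hugging-face-model-id>",  # placeholder model configuration
    },
    sagemaker_session=session,
)

# Async inference writes responses to S3 instead of returning them inline.
async_config = AsyncInferenceConfig(
    output_path="s3://<bucket>/async-output/",
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",
    endpoint_name="lmi-async-endpoint",
    async_inference_config=async_config,
)

# Invoke the async endpoint: the request payload must already be in S3.
runtime = boto3.client("sagemaker-runtime")
response = runtime.invoke_endpoint_async(
    EndpointName="lmi-async-endpoint",
    InputLocation="s3://<bucket>/async-input/request.json",
    ContentType="application/json",
)
print(response["OutputLocation"])  # where the async result will be written
```

Based on the log line above, the container's response seems to omit the Content-Length header that the async endpoint requires before it can persist the result to S3 (for example, a chunked or streamed response would not carry one); this is an interpretation of the error message, not a confirmed root cause.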

What have you tried to solve it?

@yinsong1986 yinsong1986 added the bug Something isn't working label Nov 17, 2024