jupyter_ai_magics.providers.ChatOpenAIProvider() got multiple values for keyword argument 'max_tokens' #624
Comments
Thank you for opening your first issue in this project! Engagement like this is essential for open source projects! 🤗
@EduardDurech Thanks for opening this! I was able to reproduce this by passing a similar override on the command line using the tip of the `main` branch.
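For reference, an override of this shape passed on the command line would look roughly like the following (the flag follows jupyter-ai's traitlets-based configuration, so treat the exact syntax as an approximation rather than the commenter's verbatim command):

```
jupyter lab --AiExtension.model_parameters openai-chat:gpt-4-1106-preview='{"max_tokens": 4095}'
```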
I'm getting the same error with Bedrock-chat models; in fact, it happens even when giving a nonsense parameter name.
It looks like the parameter is being ingested into both the provider's own keyword arguments and the configured model parameters, so it reaches the constructor twice. This is true whether or not the parameter name is one the model actually accepts.
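A minimal sketch of that failure mode (hypothetical names, not jupyter-ai's actual code): when the same key is merged into two keyword-argument sources and both are expanded into one call, Python raises exactly this error.

```python
# Hypothetical sketch: the same parameter arrives via two dicts.
def build_provider(**kwargs):
    """Stand-in for a provider constructor such as ChatOpenAIProvider."""
    return kwargs

provider_params = {"model_id": "gpt-4-1106-preview", "max_tokens": 4095}
model_parameters = {"max_tokens": 4095}  # the configured override

# TypeError: build_provider() got multiple values for keyword argument 'max_tokens'
build_provider(**provider_params, **model_parameters)
```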
Not sure if relevant, but I realized that it seemed to autosave to the config file.
There is an error when setting `max_tokens` for `openai-chat` in the `jupyter_jupyter_ai_config.json`.
Error: jupyter-ai/packages/jupyter-ai/jupyter_ai/chat_handlers/base.py, lines 204 to 209 at commit 8ea0ff3.
Steps:
1. Create `jupyter_jupyter_ai_config.json` in a config path specified by `jupyter --paths` (I picked `~/.jupyter`); a sketch of the file follows this list.
2. Parameters are successfully loaded:
   `[I FooDate AiExtension] Configured model parameters: {'openai-chat:gpt-4-1106-preview': {'max_tokens': 4095}}`
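A plausible shape for that config file, reconstructed from the log line above (treat the exact nesting as an assumption rather than a verified copy of the reporter's file):

```json
{
  "AiExtension": {
    "model_parameters": {
      "openai-chat:gpt-4-1106-preview": {
        "max_tokens": 4095
      }
    }
  }
}
```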
Searching the codebase, I cannot find that `max_tokens` is set anywhere, and https://api.python.langchain.com/en/latest/chat_models/langchain_community.chat_models.openai.ChatOpenAI.html does not set a default for it either.
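A quick check that langchain's `ChatOpenAI` indeed leaves `max_tokens` unset by default (a minimal sketch assuming `langchain-community` and `openai` are installed; the API key is a placeholder, since construction only validates that a key is present and never calls the API):

```python
from langchain_community.chat_models import ChatOpenAI

# Constructing the model does not contact OpenAI, so a placeholder key is fine.
llm = ChatOpenAI(model="gpt-4-1106-preview", openai_api_key="sk-placeholder")

# Prints "None": ChatOpenAI does not apply a default max_tokens on its own.
print(llm.max_tokens)
```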