
Fixes custom model parameter overrides in config #630

Closed

Conversation

JasonWeill
Collaborator

Removes model_parameters, which were redundant with provider_params, from chat and completion code. Fixes #624.

Tested by passing a custom model parameter in my JupyterLab config:

jupyter lab --AiExtension.model_parameters openai-chat:gpt-4-1106-preview='{"max_tokens": 4095}'

In the chat UI and the completion interface, this model now works. Temporarily-added log statements confirm that the custom parameter is included in the llm object exactly once.

model_parameters was not redundant with provider_params in the magic command handler, so that code is not modified. Magic commands using a model with a custom model config continue to work.
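The failure mode from #624 ("got multiple values for keyword argument") comes from passing the same parameter to the provider from two sources. A minimal sketch of the problem and the fix, where ChatProvider is a hypothetical stand-in for jupyter-ai's provider classes, not the project's actual code:

```python
class ChatProvider:
    """Hypothetical provider; accepts arbitrary keyword arguments."""
    def __init__(self, **kwargs):
        self.kwargs = kwargs

# The same key arriving from two config sources:
provider_params = {"model_id": "gpt-4-1106-preview", "max_tokens": 4095}
model_parameters = {"max_tokens": 4095}  # from AiExtension.model_parameters

# Unpacking both dicts duplicates "max_tokens" and raises TypeError:
try:
    ChatProvider(**provider_params, **model_parameters)
except TypeError as e:
    print(e)  # "... got multiple values for keyword argument 'max_tokens'"

# The fix: merge the two sources exactly once before constructing the
# provider, letting the user's model_parameters take precedence.
merged = {**provider_params, **model_parameters}
llm = ChatProvider(**merged)
print(llm.kwargs["max_tokens"])  # 4095
```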

@JasonWeill added the bug and @jupyter-ai/chatui labels on Feb 8, 2024
Member

@dlqqq left a comment


This breaks traitlets configurability, because this PR simply deletes model_parameters, the trait exposed by the top-level AiExtension class in extension.py:

model_parameters = Dict(
    key_trait=Unicode(),
    value_trait=Dict(),
    default_value={},
    help="""Key-value pairs for model id and corresponding parameters that
    are passed to the provider class. The values are unpacked and passed to
    the provider class as-is.""",
    allow_none=True,
    config=True,
)
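For reference, the per-model override tested on the command line above can also be set via this trait in a Jupyter config file. A minimal sketch, assuming a standard jupyter_lab_config.py (the model ID and parameter values are taken from the PR description):

```python
# jupyter_lab_config.py — config-file equivalent of the
# --AiExtension.model_parameters CLI flag tested in this PR.
c.AiExtension.model_parameters = {
    # model ID -> keyword arguments passed to the provider class as-is
    "openai-chat:gpt-4-1106-preview": {"max_tokens": 4095},
}
```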

Please find another way to fix this bug that doesn't delete the traitlets configuration.

@JasonWeill
Collaborator Author

Closing in favor of #632, which was approved.

@JasonWeill closed this on Feb 8, 2024
Successfully merging this pull request may close these issues.

jupyter_ai_magics.providers.ChatOpenAIProvider() got multiple values for keyword argument 'max_tokens'