Allow usage without NVIDIA partner package #622
Conversation
@dlqqq Great job on finding a fix for this issue. Verified both cases work as expected, was able to launch JLab and work with JAI. Code looks good!
With partner provider installed:
[I 2024-02-06 08:57:11.719 AiExtension] Registered model provider `nvidia-chat`.
Without partner provider installed:
[W 2024-02-06 08:58:30.219 AiExtension] Unable to load model provider `nvidia-chat`. Please install the `langchain_nvidia_ai_endpoints` package.
Per discussion, let's use the term
The term "partners" is in reference to LangChain partner packages. As this term appears only in a single source directory and not in any user-facing strings, I think this is a non-issue. Proceeding to merge.
@meeseeksdev please backport to 1.x
Co-authored-by: david qiu <[email protected]>
Issue
#579 added a change that imports the partner package `langchain_nvidia_ai_endpoints` directly, which causes an `ImportError` to be raised when importing `jupyter_ai` without the partner package installed in the same environment.

PR description
This PR fixes that issue by defining the NVIDIA provider in an isolated module that is not imported by anything else in `jupyter_ai_magics`. This allows that module to import from `langchain_nvidia_ai_endpoints` directly. This branch also catches any `ImportError` raised while loading the entry points and prints a short, helpful warning to the terminal.

Reviewing this PR
You will need to re-install the package, then test both cases:
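One possible way to exercise both cases (illustrative commands only; the PyPI package name `langchain-nvidia-ai-endpoints` is inferred from the import name in the logs above):

```shell
# Case 1: partner package present — expect
# "Registered model provider `nvidia-chat`." in the JupyterLab logs.
pip install -e packages/jupyter-ai-magics
pip install langchain-nvidia-ai-endpoints
jupyter lab

# Case 2: partner package absent — expect the warning
# "Unable to load model provider `nvidia-chat`. ..." instead of an ImportError.
pip uninstall -y langchain-nvidia-ai-endpoints
jupyter lab
```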
Callout for future work
One issue with this PR is that it breaks our provider convention where everything is re-exposed at the top-level package. That is, the statement works for `AI21Provider`, but fails for `ChatNVIDIAProvider`.
In the future, I actually would like to revert our convention of exposing all the providers at the package root. We only do so now because I thought that entry points could only be exposed from the package root, which is untrue. This effort would require first removing all of these imports from `packages/jupyter-ai-magics/jupyter_ai_magics/__init__.py`:

Then, editing each entry point definition from:

To a definition that specifies the source module directly:
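The two definition styles are not shown in the captured page. As an illustration only (the entry-point group, provider names, and module path below are assumptions, not taken from the repository), the change would look roughly like this in `pyproject.toml`:

```toml
# Before: the entry point resolves through the package root, which forces
# jupyter_ai_magics/__init__.py to import every provider at import time.
[project.entry-points."jupyter_ai.model_providers"]
ai21 = "jupyter_ai_magics:AI21Provider"

# After (shown commented out, since TOML forbids duplicate keys): the entry
# point names the source module directly, so the provider module is only
# imported when that specific entry point is loaded.
# ai21 = "jupyter_ai_magics.providers:AI21Provider"
```

Because each entry point then points at its own module, a missing optional dependency only affects the one provider that needs it.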