
Adding Custom Model (o1-mini) error: Error: Model request failed: Maximum call stack size exceeded #828

Closed
pboiii opened this issue Nov 17, 2024 · 2 comments



pboiii commented Nov 17, 2024

I receive the following error when trying to add o1-mini as a custom model:
Error: Model request failed: Maximum call stack size exceeded

[Screenshot attached: Image 11-17-24 at 12:20 PM]

logancyang (Owner) commented Nov 17, 2024

OK, I see the issue: o1-preview and o1-mini do not support streaming at the moment.

One way to fix this would be to add a `stream: bool` field to CustomModel, defaulting to true but letting the user decide when filling out the form.
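For illustration, here is a minimal sketch of what that field could look like. The interface shape and names below are assumptions for this example, not the actual Obsidian Copilot source:

```typescript
// Hypothetical sketch of the suggested `stream` flag on CustomModel.
// Field names are assumptions, not the actual Obsidian Copilot code.
interface CustomModel {
  name: string;
  provider: string;
  // Whether the model supports streaming responses.
  // Defaults to true when omitted; users would set this to false
  // for models like o1-mini that reject streaming requests.
  stream?: boolean;
}

// Resolve the effective streaming setting, defaulting to true.
function supportsStreaming(model: CustomModel): boolean {
  return model.stream ?? true;
}

const o1Mini: CustomModel = { name: "o1-mini", provider: "openai", stream: false };
const gpt4o: CustomModel = { name: "gpt-4o", provider: "openai" };

console.log(supportsStreaming(o1Mini)); // false: use a non-streaming request
console.log(supportsStreaming(gpt4o));  // true: streaming stays the default
```

The chat call site would then branch on this flag, e.g. using the model's non-streaming invocation path instead of its streaming one whenever `supportsStreaming` returns false.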

Might be related to the part you are changing @Emt-lin. I found this: langchain-ai/langchainjs#7229 Not sure if it's released or not.

@TheBigFudge

Hey @pboiii, @logancyang,

I now have access to o1-preview and o1-mini as well. Could you help clarify the correct way to integrate these models into Obsidian Copilot?

Should I go with @logancyang’s suggestion of adding a stream: bool field to CustomModel (defaulting to true)? Could you explain this further? Or is there a preferred alternative that has already been tested?

Thanks in advance! 🙌

Status: Done

3 participants