To also support Ollama in automatic configurations, here is a PR:
Hi,
I just wanted to share an end-to-end example of using Ollama.
The relevant LangChain ChatOllama API reference is here: https://python.langchain.com/api_reference/ollama/chat_models/langchain_ollama.chat_models.ChatOllama.html#langchain_ollama.chat_models.ChatOllama
Python file (ex.py):
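Since ChatOllama follows the standard LangChain chat model interface, a minimal sketch of such an ex.py could look like the following. The model name (llama3.1), the prompt, and the temperature are placeholders I'm assuming, not necessarily what the original file used:

```python
# ex.py -- minimal sketch: chat with a local Ollama model through LangChain.
# Assumes the `langchain-ollama` package is installed and the model below
# has already been pulled into the local Ollama instance (model name is an assumption).
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3.1",  # any chat model available locally in Ollama works here
    temperature=0,
)

messages = [
    ("system", "You are a helpful assistant that answers concisely."),
    ("human", "What is Ollama, in one sentence?"),
]

# invoke() returns an AIMessage; .content holds the model's reply text.
response = llm.invoke(messages)
print(response.content)
```

Passing plain (role, content) tuples to invoke() is the simplest form; LangChain's SystemMessage/HumanMessage objects work the same way.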
Shell commands:
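With a local Ollama installation, the usual steps would be along the lines of installing the integration package with `pip install -U langchain-ollama`, pulling the model with `ollama pull llama3.1`, and then running `python ex.py` (package and model names here match the sketch above and are assumptions).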
The result is the model's reply printed to stdout; the exact text depends on the model and prompt used.