Replies: 5 comments 1 reply
-
Thank you @rhyswynn. Definitely want to see this happen. First, PSAI needs to be on parity with the OpenAI Python SDK; they just dropped v2, so that is my current focus.

The challenge for integration is that PSAI uses the new Assistants API, which is a conversational, stateless approach, while the others are completion-based and stateful. I have not come up with a good way to build this. The current frameworks do not use the stateless OpenAI APIs; they use the previous version. Plus Ollama, Gemini, and Claude use the older, stateful approach. https://gist.github.com/dfinke/9e5c10e8ad1333af1e4e426dd3dbb5e3 is a simple psllm.ps1 that works with Gemini, Claude, and the older OpenAI approach.

Right now I am on the hook to do a book for Manning, "AI Assisted PowerShell", so that is a priority. Happy to discuss and review PRs that could work.
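To make the completion-style flow concrete: in that older approach the conversation state lives on the client, which resends the full message history on every turn. Below is a minimal sketch in that style, assuming an OpenAI-style `/v1/chat/completions` endpoint and an API key in `$env:OPENAI_API_KEY`; the function and variable names are illustrative, not taken from psllm.ps1 or PSAI.

```powershell
# Minimal chat loop in the completion style: the client owns the
# conversation history and resends it on each call.
function Invoke-ChatCompletion {
    param(
        [System.Collections.Generic.List[hashtable]]$Messages,
        [string]$Model = 'gpt-4o-mini',
        [string]$Uri   = 'https://api.openai.com/v1/chat/completions'
    )

    $body = @{
        model    = $Model
        messages = $Messages
    } | ConvertTo-Json -Depth 5

    $response = Invoke-RestMethod -Uri $Uri -Method Post -Body $body `
        -ContentType 'application/json' `
        -Headers @{ Authorization = "Bearer $env:OPENAI_API_KEY" }

    $response.choices[0].message.content
}

# Usage: because the history is client-side, any provider that speaks
# this request/response shape can be swapped in via -Uri and -Model.
$history = [System.Collections.Generic.List[hashtable]]::new()
$history.Add(@{ role = 'user'; content = 'Say hello in one word.' })
$reply = Invoke-ChatCompletion -Messages $history
$history.Add(@{ role = 'assistant'; content = $reply })
```

This is what makes provider-swapping easy in the completion world, and why a thread-based Assistants design doesn't map onto it directly.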
-
@rhyswynn What kind of machine are you running Ollama on?
-
Hi all, I just managed to run Ollama on my desktop via Docker Desktop. (Unfortunately, it is CPU-only because I have an AMD graphics card and there is no support for it yet. However, for my use cases, the R9 3900X CPU with 32 GB RAM is decent enough.) I am also very interested in integrating a local, Docker-based LLM; I'm using "open-webui".

Currently, I am experimenting with the PSAI module and am highly impressed with the concept of offloading decision-making tasks within PowerShell scripts to an AI. (@dfinke Thank you very much for your work so far!)

My specific goal is to automate renaming and organizing documents that I have scanned as PDFs (with OCR) locally on my desktop using PowerShell. Any advice or guidance on this would be greatly appreciated. Thank you!
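For the rename-scanned-PDFs idea, one possible shape is: extract the OCR text, ask the local model for a filename, then rename. A rough sketch, assuming `pdftotext` (from Poppler) is on the PATH, Ollama is running locally with the `llama3` model pulled, and the paths, prompt, and filename convention are all illustrative:

```powershell
# Sketch: ask a local Ollama model to propose a filename for each
# scanned, OCR'd PDF, then rename it.
Get-ChildItem 'C:\Scans\*.pdf' | ForEach-Object {
    # Grab the OCR text of the first page via pdftotext (stdout).
    $text = (& pdftotext -l 1 $_.FullName -) -join "`n"
    $text = $text.Substring(0, [Math]::Min(1500, $text.Length))

    $body = @{
        model    = 'llama3'
        stream   = $false
        messages = @(
            @{ role = 'system'; content = 'Reply with only a short, safe filename (no extension) of the form YYYY-MM-DD_sender_topic for the document text you are given.' }
            @{ role = 'user';   content = $text }
        )
    } | ConvertTo-Json -Depth 5

    $r = Invoke-RestMethod -Uri 'http://localhost:11434/api/chat' `
        -Method Post -Body $body -ContentType 'application/json'

    # Strip characters Windows does not allow in filenames.
    $name = ($r.message.content.Trim() -replace '[\\/:*?"<>|]', '-')
    Rename-Item $_.FullName -NewName "$name.pdf" -WhatIf  # drop -WhatIf once satisfied
}
```

The `-WhatIf` keeps the loop dry-run until the model's suggestions look trustworthy; local models can return chatty output, so tightening the system prompt (or validating the reply against a regex) is worth the effort.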
-
Thank you @ABrauser. I haven't looked at the Ollama implementation; I don't think it would be hard. Like I said in the previous comment, I am thinking through an approach.

@ABrauser @rhyswynn As a note, GitHub is previewing access to ~all the models. I have a private working copy. It only works with the OpenAI models; other models have different request/response approaches, and GitHub has abstracted that away.

@ABrauser As a workaround — though it probably won't work after the plugins are implemented. Let me know how it goes or if you have questions.
-
Update: #48
-
Hi there! Great contribution to the community, as always, thank you!
Any thoughts on adding Ollama support? Ollama allows for integration with many other open source local LLMs, which can be valuable tools for developing LLM queries without spending on tokens. I have integrated Ollama support with a different Azure/OpenAI LLM query framework in Python, so it can definitely be done without too much effort. Thanks again!
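For anyone exploring this: Ollama also serves an OpenAI-compatible chat completions endpoint on `http://localhost:11434/v1`, so code written against the OpenAI chat-completions shape can often be retargeted just by swapping the base URL. A minimal sketch (model name and prompt are illustrative; the bearer token is a placeholder that Ollama ignores but some clients require):

```powershell
# Point an OpenAI-style chat completions request at a local Ollama server.
$body = @{
    model    = 'llama3'
    messages = @(@{ role = 'user'; content = 'Why is the sky blue?' })
} | ConvertTo-Json -Depth 5

$r = Invoke-RestMethod -Uri 'http://localhost:11434/v1/chat/completions' `
    -Method Post -Body $body -ContentType 'application/json' `
    -Headers @{ Authorization = 'Bearer ollama' }

$r.choices[0].message.content
```

This is why integration tends to be cheap for frameworks built on the completion-style APIs, and harder for ones built on the Assistants API.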