Integration with transformers
#13
Comments
https://github.com/oobabooga/text-generation-webui, which just runs transformers (and has support for LLaMA etc.), has an API that is pretty similar to the OpenAI shape. It probably wouldn't be hard to adapt loom to post requests to that local API; I was planning on doing this.
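A rough sketch of that idea: point loom-style completion calls at a local HTTP server instead of api.openai.com. The base URL, route, and payload fields below are assumptions about an OpenAI-like local endpoint, not text-generation-webui's actual schema, so check the server's docs and adjust.

```python
# Minimal sketch: request completions from a hypothetical local,
# OpenAI-shaped endpoint instead of the OpenAI API.
import requests

LOCAL_API_URL = "http://127.0.0.1:5000/v1/completions"  # hypothetical local endpoint

def local_completion(prompt: str, n: int = 3, max_tokens: int = 64, temperature: float = 1.0):
    payload = {
        "prompt": prompt,
        "n": n,                      # number of sibling continuations (loom branches)
        "max_tokens": max_tokens,
        "temperature": temperature,
    }
    resp = requests.post(LOCAL_API_URL, json=payload, timeout=120)
    resp.raise_for_status()
    data = resp.json()
    # OpenAI-shaped responses put continuations under "choices"
    return [choice["text"] for choice in data.get("choices", [])]

if __name__ == "__main__":
    for i, text in enumerate(local_completion("Once upon a time")):
        print(f"--- branch {i} ---\n{text}")
```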
I just discovered https://github.com/hyperonym/basaran which should accomplish this!
Interesting, that looks much better.
@mirrorcult Have you succeeded in launching loom on LLaMA?
I'd like to second this. Having read the blog posts on Loom, this thing borders on the legendary, so it would be very nice to get it working with a local LLM. Most LLM tools (including oobabooga's) are missing the multiverse features, which are kind of the point of this. SillyTavern with its Timelines extension comes close, for chat mode, but it'd be really nice to play around with base models with a tool like this.
It would be interesting to be able to use loom with open source LLMs such as GPT-NeoX, FLAN-UL2, and LLaMA. The transformers library by Hugging Face has support for almost every open source LLM through a standardized interface. One approach to accomplish this could be direct integration. Another approach, to keep the loom client thin, could be to develop (maybe this already exists?) a shim that adapts the OpenAI API shape to a transformers backend.
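For concreteness, here is a minimal sketch of that shim idea: a tiny HTTP server that accepts OpenAI-completions-shaped requests and serves them from a local transformers model. The model name, port, and exact fields handled are illustrative assumptions, not an existing project's API.

```python
# Sketch of a shim that exposes an OpenAI-completions-shaped route
# backed by a transformers text-generation pipeline.
from flask import Flask, jsonify, request
from transformers import pipeline

app = Flask(__name__)
generator = pipeline("text-generation", model="gpt2")  # swap in any causal LM

@app.post("/v1/completions")
def completions():
    body = request.get_json(force=True)
    outputs = generator(
        body["prompt"],
        max_new_tokens=body.get("max_tokens", 64),
        num_return_sequences=body.get("n", 1),
        do_sample=True,
        temperature=body.get("temperature", 1.0),
        return_full_text=False,  # return only the continuation, as OpenAI does
    )
    # Mimic the OpenAI response shape closely enough for a thin client
    return jsonify({
        "object": "text_completion",
        "choices": [
            {"index": i, "text": out["generated_text"], "finish_reason": "length"}
            for i, out in enumerate(outputs)
        ],
    })

if __name__ == "__main__":
    app.run(port=8000)
```

With something like this running, loom would only need its API base URL pointed at localhost; the client itself could stay unchanged.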