
Release new llama-cpp-haystack version #492

Closed
awinml opened this issue Feb 27, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@awinml
Contributor

awinml commented Feb 27, 2024

Describe the bug
The model_path parameter was renamed to model in #243. This change is not reflected in the current llama-cpp-haystack release on PyPI.
The documentation showcases the new API with model, causing a mismatch.

To Reproduce
Download the model:

```shell
curl -L -O "https://huggingface.co/TheBloke/OpenHermes-2.5-Mistral-7B-GGUF/resolve/main/openhermes-2.5-mistral-7b.Q4_K_M.gguf"
```

Run the generator with the new model parameter:

```python
from haystack_integrations.components.generators.llama_cpp import LlamaCppGenerator

generator = LlamaCppGenerator(
    model="openhermes-2.5-mistral-7b.Q4_K_M.gguf",
    n_ctx=512,
    n_batch=128,
    generation_kwargs={"max_tokens": 128, "temperature": 0.1},
)
generator.warm_up()
prompt = "Who is the best American actor?"
result = generator.run(prompt)
```
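Because the 0.2.1 release on PyPI still uses the old parameter name, the call above fails with a TypeError. A minimal sketch of the mismatch, using a hypothetical simplified function standing in for the 0.2.1 constructor (not the real signature):

```python
# Hypothetical stand-in for the 0.2.1 constructor, which still
# accepts the old parameter name `model_path`:
def legacy_init(model_path, n_ctx=512):
    return model_path

# Calling it with the new `model` keyword, as the docs show, raises TypeError:
try:
    legacy_init(model="openhermes-2.5-mistral-7b.Q4_K_M.gguf")
except TypeError as err:
    print(f"TypeError: {err}")
```

Until a new release is published, only the old model_path keyword works against the installed package.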

Describe your environment (please complete the following information):

  • OS: Linux
  • Haystack version: haystack-ai==2.0.0b8
  • Integration version: llama-cpp-haystack==0.2.1
@awinml awinml added the bug Something isn't working label Feb 27, 2024
@anakin87
Member

Thanks for reporting this!

I've just released a new version: https://pypi.org/project/llama-cpp-haystack/0.3.0/
