Support for additional LLM generators in Langfuse integration #887

Closed
mrm1001 opened this issue Jul 9, 2024 · 0 comments · Fixed by #1087
Labels: feature request (Ideas to improve an integration), P2

mrm1001 commented Jul 9, 2024

At the moment, we only support the OpenAI generator, but there has been demand in the community to use other models, such as Anthropic's Claude LLMs.

Below is an example from Ardaric on Discord:

import os

from dotenv import load_dotenv

# Load the .env file before importing Haystack: the
# HAYSTACK_CONTENT_TRACING_ENABLED flag is read when Haystack is first
# imported, so setting it afterwards has no effect.
load_dotenv()

from haystack import Pipeline
from haystack.components.builders import PromptBuilder
from haystack_integrations.components.connectors.langfuse import LangfuseConnector
from haystack_integrations.components.generators.anthropic import AnthropicGenerator

extract_question_prompt = "{{batch}} | How old is John?"

def make_pipeline():
    pipe = Pipeline()

    extract_prompt_builder = PromptBuilder(template=extract_question_prompt)
    extractor_llm = AnthropicGenerator(model="claude-3-haiku-20240307")

    # The tracer component records the whole pipeline run as a Langfuse trace.
    pipe.add_component("tracer", LangfuseConnector("Test Logging"))
    pipe.add_component(instance=extract_prompt_builder, name="extract_prompt_builder")
    pipe.add_component(instance=extractor_llm, name="extractor_llm")

    # PromptBuilder's "prompt" output feeds the generator's "prompt" input.
    pipe.connect("extract_prompt_builder", "extractor_llm")

    return pipe

if __name__ == "__main__":
    # Debug: confirm the Langfuse credentials and tracing flag are set.
    print(
        os.getenv("LANGFUSE_SECRET_KEY"),
        os.getenv("LANGFUSE_PUBLIC_KEY"),
        os.getenv("LANGFUSE_HOST"),
        os.getenv("HAYSTACK_CONTENT_TRACING_ENABLED"),
    )
    # os.environ["LANGFUSE_DEBUG"] = "True"
    pipe = make_pipeline()
    result = pipe.run(
        {"extract_prompt_builder": {"batch": "John is 34 years old"}}
    )
    print(result)
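
For context on what a fix could look like: the Langfuse tracer presumably needs to recognize additional generator types and pull the model name and token usage out of their output metadata. Below is a minimal, self-contained sketch of that idea; it is not the integration's actual code, and the names SUPPORTED_GENERATORS and extract_generation_metadata are illustrative assumptions, not the real API.

from typing import Any, Dict, Optional

# Illustrative set of generator component types whose output carries a
# `meta` list with "model" and "usage" entries (AnthropicGenerator's
# replies follow the same {"replies": [...], "meta": [...]} shape as
# OpenAIGenerator's).
SUPPORTED_GENERATORS = {
    "OpenAIGenerator",
    "AnthropicGenerator",  # the generator requested in this issue
}

def extract_generation_metadata(
    component_type: str, output: Dict[str, Any]
) -> Optional[Dict[str, Any]]:
    """Pull model name and token usage from a generator's output so a
    tracer could attach them to a Langfuse generation span."""
    if component_type not in SUPPORTED_GENERATORS:
        return None  # unknown generator: no usage info to record
    meta = output.get("meta") or []
    if not meta:
        return None
    return {"model": meta[0].get("model"), "usage": meta[0].get("usage")}

# Example with an AnthropicGenerator-shaped output (values made up):
output = {
    "replies": ["John is 34 years old."],
    "meta": [{
        "model": "claude-3-haiku-20240307",
        "usage": {"input_tokens": 20, "output_tokens": 9},
    }],
}
print(extract_generation_metadata("AnthropicGenerator", output))
# -> {'model': 'claude-3-haiku-20240307', 'usage': {'input_tokens': 20, 'output_tokens': 9}}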