Docs: Update CohereGenerator docstrings #960

Merged
merged 3 commits on Aug 8, 2024
@@ -15,12 +15,12 @@

@component
class CohereGenerator(CohereChatGenerator):
"""LLM Generator compatible with Cohere's generate endpoint.
"""Generates text using Cohere's models through Cohere's `generate` endpoint.
NOTE: Cohere discontinued the `generate` API, so this generator is a mere wrapper
around `CohereChatGenerator` provided for backward compatibility.
Example usage:
### Usage example
```python
from haystack_integrations.components.generators.cohere import CohereGenerator
@@ -40,6 +40,15 @@ def __init__(
):
"""
Instantiates a `CohereGenerator` component.
:param api_key: Cohere API key.
:param model: Cohere model to use for generation.
:param streaming_callback: Callback function that is called when a new token is received from the stream.
The callback function accepts [StreamingChunk](https://docs.haystack.deepset.ai/docs/data-classes#streamingchunk)
as an argument.
:param api_base_url: Cohere base URL.
:param **kwargs: Additional arguments passed to the model. These arguments are specific to the model;
you can find them in the model's documentation.
"""

# Note we have to call super() like this because of the way components are dynamically built with the decorator
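The parameter list documented above maps directly onto the constructor. Below is a minimal, illustrative sketch of instantiating the component as documented; the `command` model name, the `temperature` kwarg, and reading the API key from the `COHERE_API_KEY` environment variable are assumptions for illustration, not values fixed by this PR.

```python
# Minimal sketch, assuming the Cohere API key is exported as COHERE_API_KEY
# and picked up by the component's default; the model name and the extra
# kwarg are illustrative only.
from haystack_integrations.components.generators.cohere import CohereGenerator


def print_chunk(chunk):
    # Toy streaming_callback: print each StreamingChunk's content as it arrives.
    print(chunk.content, end="", flush=True)


generator = CohereGenerator(
    model="command",                # assumed model name
    streaming_callback=print_chunk,
    temperature=0.3,                # example of a model-specific **kwargs entry
)
```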
@@ -52,8 +61,8 @@ def run(self, prompt: str):
:param prompt: the prompt to be sent to the generative model.
:returns: A dictionary with the following keys:
- `replies`: the list of replies generated by the model.
- `meta`: metadata about the request.
- `replies`: A list of replies generated by the model.
- `meta`: Information about the request.
"""
chat_message = ChatMessage(content=prompt, role=ChatRole.USER, name="", meta={})
# Note we have to call super() like this because of the way components are dynamically built with the decorator
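For completeness, a short sketch of the documented `run` contract: a single `prompt` string in, a dictionary with `replies` and `meta` out. The prompt text and the `generator` from the previous sketch are illustrative assumptions.

```python
# Illustrative call against the documented run() contract; assumes the
# `generator` built in the previous sketch and a valid Cohere API key.
result = generator.run(prompt="What's the capital of France?")

print(result["replies"])  # list of replies generated by the model
print(result["meta"])     # information about the request
```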