Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Enhance trajectory reports with improved UI and stats display #586

Open
wants to merge 3 commits into
base: main
Choose a base branch
from

Conversation

mentatai[bot]
Copy link

@mentatai mentatai bot commented Nov 30, 2024

  • Added LLM response stats next to the ParsedModelEvent UUID for better visibility.
  • Implemented "Show More" and "Show Less" functionality that appears only when text is truncated, improving user experience.
  • Enhanced the visual distinction of "Model Action" events in the timeline to highlight key actions.

Closes #584

…ranscript

# Add message stats and expand/collapse functionality to conversation transcript

This commit adds the following improvements to the conversation transcript:

1. Displays message statistics (speed and cost) for each assistant message.
2. Adds an "Expand" button to messages that overflow their container, allowing the user to expand and collapse the message content.
3. Styles the expanded messages to have a maximum height and scrollable content.

These changes improve the user experience by providing more information about the model's performance and making it easier to read long messages in the transcript.
Title: Add streaming response support to Conversation class

This change adds support for streaming responses to the `Conversation` class. Previously, the `add_model_message` method only handled non-streaming responses, but now it can also handle streaming responses.

The key changes are:

1. Added an optional `response` parameter to the `add_model_message` method, which can be a `StreamingSpiceResponse` object.
2. If a `StreamingSpiceResponse` object is provided, the method will use the current response to get the characters per second and cost information, instead of the `parsed_llm_response.llm_response`.
3. Added support for the `Other/Proprietary License` to the `license_check.py` file, as this is a valid license type used in the project.

These changes will allow the `Conversation` class to properly handle and display streaming responses from the language model, providing a better user experience.
Title: Improve cost calculation in Conversation class

The changes made in this commit improve the cost calculation in the `Conversation` class. Specifically:

1. The `cost` variable is now checked for `None` before being used in the `stats` string. This ensures that the cost is only displayed if it is available.
2. The cost is now divided by 100 to display the cost in dollars instead of cents, as this is a more common way to represent monetary values.

These changes make the cost display more robust and user-friendly.
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
None yet
Projects
None yet
Development

Successfully merging this pull request may close these issues.

improve trajectory reports
0 participants