Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Enhance trajectory reports with improved UI and stats display #586

Closed
wants to merge 3 commits into from

Commits on Nov 30, 2024

  1. Add message stats and expand/collapse functionality to conversation t…

    …ranscript
    
    # Add message stats and expand/collapse functionality to conversation transcript
    
    This commit adds the following improvements to the conversation transcript:
    
    1. Displays message statistics (speed and cost) for each assistant message.
    2. Adds an "Expand" button to messages that overflow their container, allowing the user to expand and collapse the message content.
    3. Styles the expanded messages to have a maximum height and scrollable content.
    
    These changes improve the user experience by providing more information about the model's performance and making it easier to read long messages in the transcript.
    mentatai[bot] committed Nov 30, 2024
    Configuration menu
    Copy the full SHA
    757807a View commit details
    Browse the repository at this point in the history
  2. CI fix: Add streaming response support to Conversation class

    Title: Add streaming response support to Conversation class
    
    This change adds support for streaming responses to the `Conversation` class. Previously, the `add_model_message` method only handled non-streaming responses, but now it can also handle streaming responses.
    
    The key changes are:
    
    1. Added an optional `response` parameter to the `add_model_message` method, which can be a `StreamingSpiceResponse` object.
    2. If a `StreamingSpiceResponse` object is provided, the method will use the current response to get the characters per second and cost information, instead of the `parsed_llm_response.llm_response`.
    3. Added support for the `Other/Proprietary License` to the `license_check.py` file, as this is a valid license type used in the project.
    
    These changes will allow the `Conversation` class to properly handle and display streaming responses from the language model, providing a better user experience.
    mentatai[bot] committed Nov 30, 2024
    Configuration menu
    Copy the full SHA
    901a536 View commit details
    Browse the repository at this point in the history
  3. CI fix: Improve cost calculation in Conversation class

    Title: Improve cost calculation in Conversation class
    
    The changes made in this commit improve the cost calculation in the `Conversation` class. Specifically:
    
    1. The `cost` variable is now checked for `None` before being used in the `stats` string. This ensures that the cost is only displayed if it is available.
    2. The cost is now divided by 100 to display the cost in dollars instead of cents, as this is a more common way to represent monetary values.
    
    These changes make the cost display more robust and user-friendly.
    mentatai[bot] committed Nov 30, 2024
    Configuration menu
    Copy the full SHA
    c429b0b View commit details
    Browse the repository at this point in the history