0.19a0

Pre-release
@simonw simonw released this 20 Nov 04:25
· 4 commits to main since this release
  • Tokens used by a response are now logged to the new input_tokens and output_tokens integer columns and a token_details JSON string column, for the default OpenAI models and for models from other plugins that implement this feature. #610
  • llm prompt now takes a -u/--usage flag to display token usage at the end of the response.
  • llm logs -u/--usage shows token usage information for logged responses.
  • llm prompt ... --async responses are now logged to the database. #641
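With usage now stored in the logs database (its path is printed by `llm logs path`), the new columns can be queried directly with SQLite. The sketch below uses an in-memory database that mimics just the new columns; the real `responses` table has many more fields, and the sample model name and values are illustrative only:

```python
import json
import sqlite3

# Stand-in for llm's logs database: only the columns added in this
# release are modeled here (input_tokens, output_tokens, token_details).
db = sqlite3.connect(":memory:")
db.execute(
    """
    CREATE TABLE responses (
        id INTEGER PRIMARY KEY,
        model TEXT,
        input_tokens INTEGER,   -- new integer column
        output_tokens INTEGER,  -- new integer column
        token_details TEXT      -- new JSON string column
    )
    """
)
db.execute(
    "INSERT INTO responses (model, input_tokens, output_tokens, token_details)"
    " VALUES (?, ?, ?, ?)",
    ("gpt-4o-mini", 12, 34, json.dumps({"cached_tokens": 0})),
)

# Aggregate total tokens per model across all logged responses.
rows = db.execute(
    "SELECT model, SUM(input_tokens + output_tokens)"
    " FROM responses GROUP BY model"
).fetchall()
print(rows)  # → [('gpt-4o-mini', 46)]
```

Against the real database, the same `SELECT` run via `sqlite3 "$(llm logs path)"` reports cumulative token usage per model.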