
✨ installable elia #3

Merged Jul 28, 2023 (3 commits)

Conversation

@juftin (Contributor) commented Jul 27, 2023

Summary

TL;DR: Install and run Elia with pipx


I wanted to try ChatGPT as a TUI and stumbled upon Elia. Really fantastic work; I'm blown away by the app. I forked the project to make it installable, added a CLI, and came up with this feature branch. I hope you'll enjoy the new functionality; I think it will help other people try the TUI out too.

Changes

  • Prepares the elia repo to be installed directly from GitHub
    • Updates pyproject.toml to use textual from PyPI instead of a local dev branch
    • Adds documentation on how to install and run elia via pipx
  • Fixes a bug in chat timestamp calculation using the humanize library
  • Promotes the create_database functionality to a first-class member of the elia_chat package
    • The database file now lives next to the source code at elia_chat/database/elia.sqlite
  • Adds a click CLI to the project, enabling interactions like elia db reset, plus CLI-level documentation
  • Makes the built-in per-conversation system message customizable via an ELIA_DIRECTIVE environment variable
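The ELIA_DIRECTIVE override can be sketched roughly like this (the default message and function name here are illustrative, not the exact code in the branch):

```python
import os

# Illustrative default; the real built-in system message lives in the elia_chat package.
DEFAULT_SYSTEM_MESSAGE = "You are a helpful assistant."


def system_message() -> str:
    """Return the per-conversation system message, honoring the ELIA_DIRECTIVE override."""
    return os.environ.get("ELIA_DIRECTIVE", DEFAULT_SYSTEM_MESSAGE)
```

So `ELIA_DIRECTIVE="Talk like a pirate." elia` would start every conversation with that directive instead of the default.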

Considerations

  • I found the system message while poking around the database, thought it would be fun to play with, and came up with ELIA_DIRECTIVE. I'm happy to take that out or keep it as is.
  • I considered persisting the database file outside of the source code, at something like ~/.elia/chat.db. A database that persists between installs would be convenient, but it would require database migrations. Ultimately I kept it at elia_chat/database/elia.sqlite so that each install uses its own database.

@darrenburns (Owner) commented
Oh, this is awesome, I love it! 👏

I really like the CLI idea, but I think I'd prefer it if we flattened the commands so it would just be elia reset (without the db subcommand for now, with the possibility of adding it if the CLI was to grow).

We could even add the import_chatgpt.py script to the CLI so that people could import their ChatGPT conversations into their local database easily, without having to manually run the script: elia import path/to/chatgpt_export.json. (Not necessarily in this PR, just thinking out loud)
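The flattened CLI could be sketched roughly like this with click (command bodies are placeholders standing in for Elia's actual app entry point, reset logic, and import script):

```python
import click


@click.group(invoke_without_command=True)
@click.pass_context
def cli(ctx: click.Context) -> None:
    """elia: running with no subcommand launches the TUI."""
    if ctx.invoked_subcommand is None:
        click.echo("launching TUI")  # placeholder for the real app entry point


@cli.command()
def reset() -> None:
    """Delete and recreate the local chat database."""
    click.echo("database reset")  # placeholder for the real reset logic


@cli.command(name="import")
@click.argument("export_file", type=click.Path())
def import_chatgpt(export_file: str) -> None:
    """Import a ChatGPT JSON export into the local database."""
    click.echo(f"importing {export_file}")  # placeholder for the import script
```

With this shape, `elia` launches the TUI, `elia reset` clears the database, and `elia import path/to/chatgpt_export.json` pulls in an export, with room to grow subcommands later.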

For ELIA_DIRECTIVE - I'm happy to go with that for now. Ultimately I'd like to be able to customise the system message from within the TUI, so that you can have a different system message per chat - e.g. in one chat I want a "foreign language tutor", and in another chat I want a "programming expert who talks like a pirate".

Some other general info/context floating around my head that I haven't documented anywhere...

I haven't worked on it in a little while, but I'm hoping to integrate Textualize/textual#2931 when it's ready (hopefully I'll finish it next week) - it turns out having multi-line input is quite useful for interacting with LLMs :)

A little warning if you're playing with it: the biggest issue with Elia right now is that it always attempts to send the full chat to the ChatGPT API. Different models have different context lengths, and we should only send as many messages as fit into that context length, so expect errors from the ChatGPT API once a conversation surpasses the context limit. There's already some code for computing the token length of each message (you can see it in the message info pop-up modal). I think Elia should send the API only the most recent N messages that fit within the context window of the chosen model, instead of the entire thread.
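That windowing idea could look something like this (the token counter here is a crude stand-in; real code would use a proper tokenizer such as tiktoken, and the function names are hypothetical):

```python
def window_messages(messages, max_tokens, count_tokens):
    """Keep the most recent messages whose combined token cost fits within max_tokens.

    Walks the chat from newest to oldest, stopping once the budget is exhausted,
    then restores chronological order for the API request.
    """
    selected, total = [], 0
    for message in reversed(messages):
        cost = count_tokens(message["content"])
        if total + cost > max_tokens:
            break
        selected.append(message)
        total += cost
    return list(reversed(selected))


def approx_tokens(text: str) -> int:
    """Very rough heuristic (~4 characters per token); a stand-in for a real tokenizer."""
    return max(1, len(text) // 4)
```

A real version would also need to reserve budget for the system message and the model's reply, but the trimming logic itself is just this newest-first walk.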

@darrenburns darrenburns self-requested a review July 27, 2023 22:00
@juftin (Contributor, Author) commented Jul 28, 2023

I really like the CLI idea, but I think I'd prefer it if we flattened the commands so it would just be elia reset (without the db subcommand for now, with the possibility of adding it if the CLI was to grow).

✅ easy refactor, I agree.

We could even add the import_chatgpt.py script to the CLI so that people could import their ChatGPT conversations into their local database easily, without having to manually run the script: elia import path/to/chatgpt_export.json. (Not necessarily in this PR, just thinking out loud)

✅ Done, this was an easy one too

For ELIA_DIRECTIVE - I'm happy to go with that for now. Ultimately I'd like to be able to customise the system message from within the TUI, so that you can have a different system message per chat - e.g. in one chat I want a "foreign language tutor", and in another chat I want a "programming expert who talks like pirate".

Yeah, I like that a lot. A dropdown or something like that would be helpful. The per-chat header would also be a nice place to display the current system message.

I haven't worked on it in a little while, but I'm hoping to integrate Textualize/textual#2931 when it's ready (hopefully I'll finish it next week) - it turns out having multi-line input is quite useful for interacting with LLMs :)

I've been keeping an eye on this one. It's going to be awesome. Besides multi-line selection, I was thinking it would be neat to have a copy button show up on code snippets. Not sure how difficult this would be, but being able to dump code to the clipboard could be useful in lots of markdown use cases.

A little warning if you're playing with it: the biggest issue with Elia right now is that it always attempts to send the full chat to the ChatGPT API. Different models have different context lengths, and we should only send as many messages as fit into that context length, so expect errors from the ChatGPT API once a conversation surpasses the context limit. There's already some code for computing the token length of each message (you can see it in the message info pop-up modal). I think Elia should send the API only the most recent N messages that fit within the context window of the chosen model, instead of the entire thread.

🤔 I was wondering how that worked exactly. I've been keeping my conversations pretty short anyway. I'll try to take a look at fine-tuning that context length when I get a chance. A delete button would also be useful in the chat window; I find myself tidying up with elia reset quite often when just deleting in the TUI would work too.

@juftin juftin force-pushed the elia-external-install branch from 3bebb5e to 2c2b381 on July 28, 2023 00:32
@darrenburns (Owner) left a comment

Great work, thank you very much!

@darrenburns darrenburns merged commit 09040ba into darrenburns:master Jul 28, 2023