GPT model addition to conversation and separate trimmed history + formatting #13
Conversation
agentai/conversation.py (Outdated)

```diff
@@ -13,8 +13,11 @@ class Message(BaseModel):
 class Conversation:
-    def __init__(self, history: List[Message] = [], id: Optional[str] = None, max_history_tokens: int = 200):
+    def __init__(
+        self, history: List[Message] = [], id: Optional[str] = None, max_history_tokens: int = 200, model: str = "gpt2"
```
Let's default to `gpt-3.5-turbo` and use the corresponding encoder and tokenizer everywhere as well. This "gpt2" is confusing.
agentai/conversation.py (Outdated)

```python
def add_message(self, role: str, content: str, name: Optional[str] = None) -> None:
    message_dict = {"role": role, "content": content}
    if name:
        message_dict["name"] = name
    message = Message(**message_dict)
    self.history.append(message)
    self.trim_history()
```
This call should be optional; trimming should run only when the user explicitly turns it on.
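A minimal sketch of the opt-in behaviour suggested here. The `auto_trim` flag, the plain-dict messages, and the whitespace-based token count are all assumptions for illustration, not the project's actual API (the real code uses a `Message` model and a proper tokenizer):

```python
from typing import List, Optional


class Conversation:
    """Sketch: trimming only runs when the caller opts in via auto_trim."""

    def __init__(self, max_history_tokens: int = 200, auto_trim: bool = False):
        self.history: List[dict] = []
        self.max_history_tokens = max_history_tokens
        self.auto_trim = auto_trim  # hypothetical flag; trimming is off by default

    def add_message(self, role: str, content: str, name: Optional[str] = None) -> None:
        message = {"role": role, "content": content}
        if name:
            message["name"] = name
        self.history.append(message)
        if self.auto_trim:  # trim only when explicitly enabled
            self.trim_history()

    def trim_history(self) -> None:
        # Crude token estimate: whitespace-separated words, standing in for a real tokenizer.
        def tokens(m: dict) -> int:
            return len(m["content"].split())

        # Drop oldest messages first until under the budget; always keep the latest one.
        while len(self.history) > 1 and sum(tokens(m) for m in self.history) > self.max_history_tokens:
            self.history.pop(0)
```

With this shape, existing callers keep the current behaviour (no silent trimming) unless they pass `auto_trim=True`.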
agentai/conversation.py (Outdated)

```diff
-    def __init__(self, history: List[Message] = [], id: Optional[str] = None, max_history_tokens: int = 200):
+    def __init__(
+        self, history: List[Message] = [], id: Optional[str] = None, max_history_tokens: int = 200, model: str = "gpt2"
+    ):
```
If `max_history_tokens` is set to None (and the default should be None, not 200), do no trimming and let the call error out.
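A sketch of the semantics proposed here, assuming a simplified `Conversation` with string messages and a whitespace word count in place of the real tokenizer:

```python
from typing import List, Optional


class Conversation:
    """Sketch: max_history_tokens=None (the proposed default) disables trimming."""

    def __init__(self, max_history_tokens: Optional[int] = None):
        self.history: List[str] = []
        self.max_history_tokens = max_history_tokens

    def add_message(self, content: str) -> None:
        self.history.append(content)
        self.trim_history()

    def trim_history(self) -> None:
        if self.max_history_tokens is None:
            # No limit configured: keep everything and let any downstream
            # context-length error surface to the caller.
            return
        def count(text: str) -> int:
            return len(text.split())  # crude stand-in for a real tokenizer
        while len(self.history) > 1 and sum(count(t) for t in self.history) > self.max_history_tokens:
            self.history.pop(0)
```

The key point is that `None` is an explicit "do nothing" signal, distinct from any numeric budget.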
Conversation should have a default model, e.g. `gpt-3.5-turbo`, and an empty `__init__` call should work too.
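One way the constructor could look under this suggestion. The `DEFAULT_MODEL` constant and the dict-typed history are assumptions; note also that the diff's `history: List[Message] = []` default shares one list across instances (Python's mutable-default pitfall), which the `None`-plus-check pattern below avoids:

```python
from typing import List, Optional

DEFAULT_MODEL = "gpt-3.5-turbo"  # assumed default; the PR under review used "gpt2"


class Conversation:
    """Sketch: an empty Conversation() call works out of the box."""

    def __init__(
        self,
        history: Optional[List[dict]] = None,  # None avoids the mutable-default pitfall
        id: Optional[str] = None,
        max_history_tokens: Optional[int] = None,
        model: str = DEFAULT_MODEL,
    ):
        self.history = history if history is not None else []
        self.id = id
        self.max_history_tokens = max_history_tokens
        self.model = model
```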
````diff
@@ -83,7 +83,7 @@ Note that `agentai` automatically parses the Python Enum type (TemperatureUnit)
 3. **Create a Conversation object and add messages**

 ```python
-conversation = Conversation()
+conversation = Conversation(model=GPT_MODEL)
````
I've not formed a full opinion on whether Conversation should take a GPT_MODEL param at all. I'm inclined to believe it should be either more general, e.g. create an LLM class and build around that, or (more preferred) read this from the environment in some way.
In the meantime, merging this so that I can play with it and form stronger opinions.
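A sketch of the environment-driven option floated above. The `AGENTAI_MODEL` variable name and the `resolve_model` helper are purely hypothetical, not anything the project defines:

```python
import os
from typing import Optional

DEFAULT_MODEL = "gpt-3.5-turbo"  # assumed fallback


def resolve_model(explicit: Optional[str] = None) -> str:
    """Pick the model from (1) an explicit argument, (2) the environment, (3) a default."""
    if explicit:
        return explicit
    # AGENTAI_MODEL is a hypothetical env var name used only for this sketch.
    return os.environ.get("AGENTAI_MODEL", DEFAULT_MODEL)
```

With this, `Conversation.__init__` could call `resolve_model(model)` so callers, deployment config, and a sane default all compose without Conversation hard-coding a model name.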
This PR has the changes related to issues #10 and #11.