feat: add context to conversations #20
Conversation
Currently you store the user messages chunked, but they should be stored as one entry per actual user message.
app/bot/__init__.py
Outdated
args = split_message[1:]

await _handle_command(command, args, message)
return

# Add message to history
_bothist.bot_history.add_message(message)
One issue I realized on Ruby Bot is that this chat history isn't thread-safe, so the history can get a little out of order if the bot receives multiple messages while still processing an existing one.
There isn't really a great solution to that, other than maybe appending both the user message and the bot response only after the response is generated; but then you'd need a way to temporarily include the current message so it knows to generate the current response.
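One hedged way to approach the race the comment describes is to guard history mutations with an `asyncio.Lock` and append the user message and bot reply together, only after the response exists. This is a sketch, not the PR's code; `SafeHistory` and `add_exchange` are invented names for illustration.

```python
import asyncio


class SafeHistory:
    """Illustrative history that appends user/bot pairs atomically."""

    def __init__(self) -> None:
        self._messages: list[dict[str, str]] = []
        self._lock = asyncio.Lock()

    async def add_exchange(self, user_msg: str, bot_msg: str) -> None:
        # Append the user message and the bot reply together, after the
        # response is generated, so interleaved requests can't split a pair.
        async with self._lock:
            self._messages.append({"role": "user", "content": user_msg})
            self._messages.append({"role": "assistant", "content": bot_msg})

    def messages(self) -> list[dict[str, str]]:
        return list(self._messages)
```

This keeps each user/assistant pair adjacent, but as the comment notes, the in-flight user message still has to be passed to the generator separately since it isn't in the history yet.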
One possible way is to also include a timestamp along with the role and content, and then insert the message where it belongs chronologically instead of just appending it.
response = await llm.generate_response(
    message,
-   _botconf.botconfig.system_prompt +
-   f" The user's name is {message.author.name}",
+   _botconf.bot_config.system_prompt,
This doesn't make a lot of sense with how chat system prompts work, afaik. Essentially the first message in the chat history will be your system prompt, with a role of system and the content of your system prompt.
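The message format the comment describes can be sketched as follows; `build_messages` is an illustrative helper, but the role/content dict shape with the system prompt first matches the common chat-completions convention.

```python
def build_messages(system_prompt: str,
                   history: list[dict[str, str]]) -> list[dict[str, str]]:
    """Prepend the system prompt as the first message, role "system"."""
    return [{"role": "system", "content": system_prompt}] + history


msgs = build_messages(
    "You are a helpful club bot.",
    [{"role": "user", "content": "hi"}],
)
```

Per-user details (like the author's name) would then go into a user message's content rather than being concatenated onto the system prompt string.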
Yes, I add the system prompt to the messages before generating the response:
https://github.com/btc-raspberrypiclub/piclub-bot/pull/20/files#diff-de86459da278c0199e14e2432900816166309e862e9dee0dd929ae3307acfec4R36
app/bot/__init__.py
Outdated
)
logger.info(f"response: `{response}`")
if response is not None:
    if len(response) > 2000:
        # Split message into <=2000 character chunks
        for response_chunk in _split_text(response):
-           await message.reply(response_chunk)
+           _bothist.bot_history.add_message(
+               await message.reply(response_chunk),
You definitely do not want to chunk the messages in the history, this will cause it to think each chunk is a new message. Append the entire response as one string instead.
Ok, I might need to add the possibility of passing a list of messages as well, and then the contents can be concatenated.
@@ -31,17 +32,22 @@ async def generate_response(
    f"User prompt:\n{message.content}"
)

messages: list[dict[str, str]] = [
Hmm. Well, it's not the most efficient, but this does let you set the system prompt dynamically.
What would a more efficient way be? This is the exact same format that the API needs, so we would have to convert it to this eventually.
You're recalculating the message array. You could store the whole thing rather than concatenating the system prompt every time. Not a big deal though.
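The suggestion above amounts to keeping the system prompt permanently at index 0 of the stored list, so nothing needs rebuilding per request. `BotHistory` is an illustrative name; a trade-off is that changing the system prompt dynamically then means mutating index 0 rather than passing a fresh string.

```python
class BotHistory:
    """Illustrative history that stores the system prompt in place."""

    def __init__(self, system_prompt: str) -> None:
        # The system prompt lives permanently at index 0.
        self._messages = [{"role": "system", "content": system_prompt}]

    def add_message(self, role: str, content: str) -> None:
        self._messages.append({"role": role, "content": content})

    def messages(self) -> list[dict[str, str]]:
        # Already in the shape the API expects; no per-call concatenation.
        return self._messages
```

As the reviewer says, either way works; this just avoids allocating a new list on every generation.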
Close #8: Add channel-specific history
Close #11: Simply use Member.display_name instead of Member.name