
feat: add context to conversations #20

Merged: Agent-E11 merged 4 commits into main from 8-add-context-to-conversations on Jul 4, 2024
Conversation

Agent-E11 (Member)

Closes #8
Add channel-specific history

Closes #11
Simply use Member.display_name instead of Member.name

Agent-E11 linked an issue on Jun 27, 2024 that may be closed by this pull request
joshbuker (Contributor) left a comment

Currently you store the user messages chunked, but they should be stored as one entry per actual user message.

args = split_message[1:]

await _handle_command(command, args, message)
return

# Add message to history
_bothist.bot_history.add_message(message)
joshbuker (Contributor)

One issue that I realized on Ruby Bot is that this chat history isn't thread safe, so the history can get a little out of order if the bot receives multiple messages while still processing an existing one.

There isn't really a great solution to that, though, other than maybe appending both the user message and the bot response only after the response is generated; but you'd need a way to temporarily add the current message so it knows to generate the current response.
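A minimal sketch of that idea, assuming the history is a list of role/content dicts (the BotHistory name and add_exchange method here are hypothetical, not the PR's actual API):

import asyncio

class BotHistory:
    def __init__(self) -> None:
        self._messages: list[dict[str, str]] = []
        self._lock = asyncio.Lock()  # serializes concurrent message handlers

    async def add_exchange(self, user_content: str, bot_content: str) -> None:
        # Append the user message and the bot reply together, under the
        # lock, so interleaved handlers cannot split a conversation pair.
        async with self._lock:
            self._messages.append({"role": "user", "content": user_content})
            self._messages.append({"role": "assistant", "content": bot_content})

As noted above, this still requires passing the in-flight user message to the generator separately, since it is only committed to the history once the reply exists.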

Agent-E11 (Member, Author)

One possible way is to also include a timestamp along with the role and content, and then insert the message where it would go chronologically instead of just appending it.
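A rough sketch of that approach, assuming each history entry carries a timestamp (the HistoryEntry type and helper are hypothetical):

import bisect
from dataclasses import dataclass, field
from datetime import datetime

@dataclass(order=True)
class HistoryEntry:
    timestamp: datetime                  # the only field used for ordering
    role: str = field(compare=False)
    content: str = field(compare=False)

def insert_chronologically(history: list[HistoryEntry], entry: HistoryEntry) -> None:
    # bisect.insort keeps the list sorted by timestamp, so a reply that
    # finishes late still lands where its message originally arrived.
    bisect.insort(history, entry)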

    response = await llm.generate_response(
        message,
-       _botconf.botconfig.system_prompt +
-       f" The user's name is {message.author.name}",
+       _botconf.bot_config.system_prompt,
joshbuker (Contributor)

This doesn't make a lot of sense with how chat system prompts work, afaik. Essentially the first message in the chat history will be your system prompt, with a role of system, and the content of your system prompt.
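For reference, the shape being described is the standard chat-completions message array (the prompt strings here are placeholders):

messages = [
    {"role": "system", "content": system_prompt},        # system prompt always first
    {"role": "user", "content": "earlier user message"},
    {"role": "assistant", "content": "earlier bot reply"},
    {"role": "user", "content": message.content},        # current message last
]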

    )
    logger.info(f"response: `{response}`")
    if response is not None:
        if len(response) > 2000:
            # Split message into <=2000 character chunks
            for response_chunk in _split_text(response):
-               await message.reply(response_chunk)
+               _bothist.bot_history.add_message(
+                   await message.reply(response_chunk),
joshbuker (Contributor)

You definitely do not want to chunk the messages in the history, this will cause it to think each chunk is a new message. Append the entire response as one string instead.

Agent-E11 (Member, Author)

Ok, I might need to add the possibility of passing a list of messages as well, and then the contents can be concatenated.
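A sketch of that, assuming discord.py Message objects and a dict-based history (the class and attribute names are illustrative, not the PR's final API):

import discord

class BotHistory:
    def __init__(self) -> None:
        self._messages: list[dict[str, str]] = []

    def add_message(self, message: discord.Message | list[discord.Message]) -> None:
        # Accept either one Message or the list of chunk replies; either
        # way the history stores one concatenated entry, never one per chunk.
        if isinstance(message, list):
            content = "".join(m.content for m in message)
            author = message[0].author
        else:
            content = message.content
            author = message.author
        role = "assistant" if author.bot else "user"
        self._messages.append({"role": role, "content": content})

At the call site, the chunk replies would then be collected into a list inside the loop and added once after it finishes.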

@@ -31,17 +32,22 @@ async def generate_response(
        f"User prompt:\n{message.content}"
    )

+   messages: list[dict[str, str]] = [
joshbuker (Contributor)

Hmm. Well, it's not the most efficient, but this does let you set the system prompt dynamically.

Agent-E11 (Member, Author)

What would a more efficient way be? This is the exact same format that the API needs, so we would have to convert it to this eventually.

joshbuker (Contributor)

You're recalculating the message array on every request. You could store the whole thing rather than concatenating the system prompt every time. Not a big deal, though.
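A sketch of the cached variant, assuming the history owns the array (names are illustrative):

class BotHistory:
    def __init__(self, system_prompt: str) -> None:
        # The system prompt is baked in as the permanent first entry,
        # so the array is never rebuilt per request.
        self._messages: list[dict[str, str]] = [
            {"role": "system", "content": system_prompt},
        ]

    def add(self, role: str, content: str) -> None:
        self._messages.append({"role": role, "content": content})

    @property
    def messages(self) -> list[dict[str, str]]:
        return self._messages  # already in the shape the API expects

The tradeoff, as noted earlier in the thread, is that a baked-in first entry makes it harder to set the system prompt dynamically; mutating messages[0] before each call would be one compromise.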

Agent-E11 merged commit a98c665 into main on Jul 4, 2024
1 check passed
Agent-E11 deleted the 8-add-context-to-conversations branch on July 4, 2024 03:17
Development

Successfully merging this pull request may close these issues:
Use user's nickname if present instead of their username (#11)
Add context to bot conversations (#8)

2 participants