Add context to bot conversations #8

Closed
Agent-E11 opened this issue May 2, 2024 · 2 comments · Fixed by #20
Labels
enhancement New feature or request

Comments

@Agent-E11
Member

I tried to do this the first time I added LLM support, but I ran into the issue that you cannot easily fetch a chain of replies from a message. This probably means that we will have to keep our own cache of contexts and associate each new message with an existing one if it is replying to a message that was "in the conversation". Alternatively, we could store a single universal context for the day and clear it every midnight or so (though this might be pretty resource intensive).
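
A minimal sketch of the reply-chain cache idea in Python (the names and structure here are assumptions for illustration, not the bot's actual code):

    from typing import Optional

    # Hypothetical in-memory cache: maps a message id to the list of
    # chat messages (role/content dicts) that make up its conversation so far.
    context_cache: dict[int, list[dict[str, str]]] = {}

    def context_for(message_id: int, reply_to: Optional[int]) -> list[dict[str, str]]:
        """Return the conversation a new message belongs to.

        If the message replies to something we have cached, reuse (and extend)
        that conversation; otherwise start a fresh one.
        """
        if reply_to is not None and reply_to in context_cache:
            context = context_cache[reply_to]
        else:
            context = []
        # Register the new message id so later replies can find this conversation.
        context_cache[message_id] = context
        return context

    # Usage: append the user's message to the returned context, call the LLM,
    # then register the bot's reply id so replies to the bot also hit the cache.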

@Agent-E11 Agent-E11 added the enhancement New feature or request label May 2, 2024
@Agent-E11
Member Author

This is probably a good use for an SQLite database.
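
A minimal sketch of what that could look like with Python's built-in sqlite3 module; the table and column names are assumptions, not an actual schema from the project:

    import sqlite3

    # Hypothetical schema: one row per stored chat message, keyed by message id,
    # with a conversation id to group a reply chain together.
    conn = sqlite3.connect("bot_context.db")
    conn.execute(
        """
        CREATE TABLE IF NOT EXISTS messages (
            message_id      INTEGER PRIMARY KEY,
            conversation_id INTEGER NOT NULL,
            role            TEXT NOT NULL,      -- 'user' or 'assistant'
            content         TEXT NOT NULL,
            created_at      TEXT DEFAULT CURRENT_TIMESTAMP
        )
        """
    )
    conn.commit()

    def load_context(conversation_id: int) -> list[dict[str, str]]:
        """Rebuild the message list for a conversation, oldest first."""
        rows = conn.execute(
            "SELECT role, content FROM messages "
            "WHERE conversation_id = ? ORDER BY message_id",
            (conversation_id,),
        ).fetchall()
        return [{"role": role, "content": content} for role, content in rows]

This would also make the "clear everything at midnight" option a single DELETE statement instead of an in-memory sweep.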

@Agent-E11
Member Author

Regarding the storage format: a way to distinguish between users could be something like this (in the list of messages):

[
    { "role": "user", "content": "(agent_e11 <@0123456789>) Hello, this is my message." }, // (user_name <@user_id>)
    { "role": "assistant", "content": "Hello, this is my message." } // No markup
]

And then add something like this to the system prompt:

The user messages are prefixed with (user_name <@user_id>),
where the user's name is user_name and the user's id is user_id.
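
A minimal sketch of how that list could be assembled before calling the model; the function and field names here are assumptions, not the bot's actual code:

    SYSTEM_PROMPT = (
        "The user messages are prefixed with (user_name <@user_id>), "
        "where the user's name is user_name and the user's id is user_id."
    )

    def build_messages(context: list[dict[str, str]], user_name: str,
                       user_id: int, text: str) -> list[dict[str, str]]:
        """Prefix the new user message and assemble the full request payload."""
        prefixed = f"({user_name} <@{user_id}>) {text}"
        context.append({"role": "user", "content": prefixed})
        return [{"role": "system", "content": SYSTEM_PROMPT}, *context]

    # The assistant's reply would later be appended with no prefix:
    # context.append({"role": "assistant", "content": reply_text})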
