Prevent the bot from replying multiple text messages #40

Open
wants to merge 9 commits into main

Conversation

firelightning13

@firelightning13 commented Sep 21, 2023

Currently, the bot will reply to multiple people all at once, which can be an unintended effect, especially when there are multiple people in the text channel (assuming ALWAYS_REPLY is set to true).

For example:
[screenshot: the bot replying to multiple people at once]

This PR "should" fix this problem, where the bot will only reply once. The current AI model nowadays are pretty great at summarize what happened in the chat room (evident by just using listen-only mode).

The intended way for the bot to reply (note that I sent two replies to the bot instead of one):
[screenshot: the bot replying only once]

To-do checklist

  • Implement task creation and cancellation from the asyncio library
  • Use an aiohttp session for POST requests instead of a plain blocking POST request (see the sketch below)
  • Resolved the memory issue where the bot couldn't see the second-to-last reply
  • TextGen compatible (though cancelling can't stop the server-side generation because of an API limitation)
  • KoboldAI API compatible
  • Koboldcpp compatible (generates a unique key for the API to support multiple channel IDs)
  • Multiple channel ID support

Currently, this PR is untested for multiple channel IDs, so if you want to test this implementation, you can pull my main repo instead.
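
To illustrate the aiohttp point from the checklist, here is a minimal sketch of a non-blocking POST. The URL, payload fields, and response shape are placeholder assumptions for a KoboldAI-style backend, not this repo's actual code:

```python
import aiohttp

# Minimal sketch: an awaitable POST so generation doesn't block the event loop.
# The URL and payload fields below are placeholders, not the project's config.
async def generate_text(prompt: str) -> str:
    payload = {"prompt": prompt, "max_length": 200}  # hypothetical fields
    async with aiohttp.ClientSession() as session:
        async with session.post(
            "http://localhost:5000/api/v1/generate", json=payload
        ) as resp:
            data = await resp.json()
            # KoboldAI-style backends nest the text under "results";
            # adjust this for whichever backend is actually in use.
            return data["results"][0]["text"]
```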

- Implement asynchronous call in textgen.py
- Use asyncio to cancel the current task and start a new one each time a text message is received
@firelightning13 changed the title from "Prevent the bot from replying multiple replies" to "Prevent the bot from replying multiple text messages" on Sep 21, 2023
firelightning13 and others added 3 commits September 26, 2023 14:28
- Bot can now see the second-to-last user reply by storing the last message before sending another reply
- Avoid `NoneType` error in messagehandler.py, which I forgot to handle
- Implementation carried over to the KoboldAI API
- So far tested only with koboldcpp
@ausboss
Owner

ausboss commented Oct 4, 2023

👀 I like what I see. I am going to play around with this more and possibly merge soon.

- Made a `_stop` function in koboldai.py by invoking `/api/extra/abort`
- Doesn't work with official KoboldAI
@firelightning13
Author

> 👀 I like what I see. I am going to play around with this more and possibly merge soon.

@ausboss Thanks for the reply. I only had a short time to test it, but so far there are no problems on my end. However, I did notice that the bot supports multiple channel IDs, which may pose a problem: the current implementation will likely cancel a reply that belongs to another channel.

So when cancelling a task, it should check the channel ID as well. I'll look into a solution later, but I'm not a full-time programmer (I'm a graphic designer, lol).

@ausboss
Owner

ausboss commented Oct 5, 2023

I'm also not a full-time programmer, so I fully understand! You are doing some great stuff; extra kudos for being able to decipher my spaghetti code. I tried to address these types of issues with a while loop that checks messages and appends them, and it was okay but had issues. I'm liking this approach and the modifications you are adding.

That function already has the channel_id passed in, so maybe we can use that. For history and stop sequences I use the dictionaries self.histories and self.stop_sequences: self.histories uses channel_id as the key with the memory object as the value, and self.stop_sequences does the same thing with channel_id as the key but with a value that is a list of stop strings dynamically created for that channel (names get added to it). My thought is that we could create another dictionary for the messages, handled in a similar fashion, so the bot can check the channel before cancelling anything.
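
For what it's worth, here is a rough sketch of that idea (method and attribute names are illustrative, not the repo's actual ones): keep the in-flight tasks in a dict keyed by channel_id, just like self.histories and self.stop_sequences, and only cancel the task that belongs to the incoming message's channel.

```python
import asyncio

class MessageHandler:
    def __init__(self):
        # Keyed by channel_id, mirroring self.histories / self.stop_sequences.
        self.current_tasks: dict[int, asyncio.Task] = {}

    async def on_message(self, channel_id: int, message: str):
        # Only cancel the in-flight reply for *this* channel, so replies
        # being generated for other channels are left alone.
        previous = self.current_tasks.get(channel_id)
        if previous is not None and not previous.done():
            previous.cancel()

        task = asyncio.create_task(self.generate_reply(channel_id, message))
        self.current_tasks[channel_id] = task
        # Drop the entry once the task finishes so the dict doesn't grow.
        task.add_done_callback(
            lambda t, cid=channel_id: self.current_tasks.pop(cid, None)
        )

    async def generate_reply(self, channel_id: int, message: str):
        ...  # call the backend (TextGen / KoboldAI / koboldcpp) here
```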

@firelightning13
Author

firelightning13 commented Oct 5, 2023

I think I know what you mean. Maybe if I make self.current_task a dictionary and then loop through it to check the channel_id, that could solve the problem. I'll see whether asyncio can handle this sort of thing and figure out how to clear any tasks that have completed.

FYI, I actually solved the multiple-message issue a long time ago, but I didn't make a PR (you can see the date in the screenshot, haha) because I had heavily modified your code. Back then I changed it to be OpenAI-API-compatible, which defeats the purpose of this project, and that was before you implemented LangChain.

@firelightning13
Author

firelightning13 commented Oct 9, 2023

Concedo (the main developer of koboldcpp) released API documentation: https://lite.koboldai.net/koboldcpp_api#/ This might be useful for checking whether the endpoint is official KoboldAI or koboldcpp.
TODO: use "genkey" for koboldcpp with multi-user mode enabled (important for supporting multiple channel IDs)

Update: Right now, I've been too busy doing stuff IRL, so updating this PR might take a while.
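
One possible shape for the genkey idea (an assumption based on the linked docs, not tested code: koboldcpp is assumed to accept a "genkey" field on generation requests and the same key on `/api/extra/abort` when multi-user mode is enabled):

```python
import uuid
import aiohttp

# Sketch: one stable genkey per channel, so aborting a generation only
# affects that channel. Endpoint URL and key format are assumptions.
genkeys: dict[int, str] = {}

def get_genkey(channel_id: int) -> str:
    if channel_id not in genkeys:
        genkeys[channel_id] = f"KCPP{uuid.uuid4().hex[:10]}"
    return genkeys[channel_id]

async def abort_generation(session: aiohttp.ClientSession, channel_id: int):
    # Ask koboldcpp to stop only the generation tagged with this channel's key.
    async with session.post(
        "http://localhost:5001/api/extra/abort",
        json={"genkey": get_genkey(channel_id)},
    ) as resp:
        resp.raise_for_status()
```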

firelightning13 and others added 4 commits October 9, 2023 22:57
- Use a dict for current_tasks, with channel IDs as keys
- Check whether the endpoint is koboldcpp or KoboldAI
- TODO: support koboldcpp with multi-user requests
- Use a dict to generate unique keys for each channel ID
- Rolled the `_stop` function back to a normal POST request instead of an aiohttp session
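
Since one of the commits above mentions checking whether the endpoint is koboldcpp or official KoboldAI, here is one plausible way to do it (an assumption on my part, not necessarily what the commit does): probe a koboldcpp-only route such as `/api/extra/version`, which the official KoboldAI server is not expected to serve.

```python
import aiohttp

async def is_koboldcpp(session: aiohttp.ClientSession, base_url: str) -> bool:
    # A 200 on a koboldcpp-only route suggests koboldcpp; anything else
    # (404, connection error) is treated as official KoboldAI.
    try:
        async with session.get(f"{base_url}/api/extra/version") as resp:
            return resp.status == 200
    except aiohttp.ClientError:
        return False
```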