Sharing session while message is streaming will lead to truncated message #412

Open
kahkeng opened this issue Aug 8, 2023 · 0 comments

While a response is streaming in, we do not commit new tokens to the database, because doing so slows things down.

If we allow a share operation to take a snapshot during streaming, the shared message will be truncated and missing most of its tokens.

  • One fix is to prevent sharing in the UI (e.g. with a spinner, similar to ChatGPT) while a response is still incoming. (Requires BE support to signal when streaming has completed.)

  • Another fix is to have the BE commit periodically and automatically, which somewhat mitigates the issue (the majority of what is visible for a long response would be in the snapshot).

  • Another fix is to have the BE store a pointer to the original chat message, so appends from streaming still appear in the shared session. (This could save space and address the issue, but could end up accidentally doxxing the user if the streaming response wasn't previewed before sharing.)
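For the second option, a minimal sketch of a periodic-commit loop, assuming a hypothetical `db.commit_message` API and token iterable (these names are illustrative, not the project's actual interface):

```python
import time

# Commit cadence is an assumed tuning knob, not a value from this project.
COMMIT_INTERVAL_SECS = 2.0

def stream_response(tokens, db, message_id, now=time.monotonic):
    """Accumulate streamed tokens, committing to the DB at most once per
    COMMIT_INTERVAL_SECS so a concurrent share snapshot sees a mostly
    up-to-date message instead of an empty or stale one."""
    buffer = []
    last_commit = now()
    for token in tokens:
        buffer.append(token)
        if now() - last_commit >= COMMIT_INTERVAL_SECS:
            # Periodic flush: bounds how much of the message a
            # mid-stream snapshot can miss.
            db.commit_message(message_id, "".join(buffer))
            last_commit = now()
    # Final commit so the completed message is never left truncated.
    full_text = "".join(buffer)
    db.commit_message(message_id, full_text)
    return full_text
```

The final unconditional commit matters: it guarantees the database converges to the complete message even if streaming finishes between periodic flushes.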
