
[Bug]: Prompt Sizes Constantly Increasing (from a few K to a few hundred K) #6601

Open
1 task done
isolomatov-gd opened this issue Feb 4, 2025 · 3 comments
Labels
enhancement New feature or request

Comments


isolomatov-gd commented Feb 4, 2025

Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

Ask it to build something bigger, like an event planner application. Then ask it to build e2e tests. Observe that the prompt size keeps increasing and never decreases. The LLM should review its own context and remove obsolete or irrelevant content (like fixed errors).
One screenshot shows the prompt size increasing monotonically, which makes everything 50x more expensive and 10x slower.
The other screenshot shows that after resetting, it is back to normal.

![Image](https://github.com/user-attachments/assets/1b5e8ac9-e6f6-4be1-bc67-db918b5dc7d2)
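
For illustration, here is a rough sketch of the kind of pruning I have in mind, over a generic list of chat messages. This is not OpenHands code; the field names are made up:

```python
# Hypothetical sketch: drop error messages that a later message marks as resolved,
# so the context stops carrying obsolete failure/fix pairs forward.
def prune_resolved_errors(messages: list[dict]) -> list[dict]:
    """Remove messages whose errors were resolved by a later message."""
    resolved_ids = {m["resolves"] for m in messages if m.get("resolves") is not None}
    return [m for m in messages if m.get("id") not in resolved_ids]


history = [
    {"id": 1, "role": "assistant", "content": "Run the e2e tests"},
    {"id": 2, "role": "tool", "content": "Test failed: missing import"},
    {"id": 3, "role": "assistant", "content": "Fixed the import, tests pass", "resolves": 2},
]
print(prune_resolved_errors(history))  # message 2 is dropped
```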

OpenHands Installation

Docker command in README

OpenHands Version

0.22

Operating System

MacOS

Logs, Errors, Screenshots, and Additional Context

Let me know if anything else is needed; I can share the source code and the prompt I use. I don't think this is specific to my case.

mamoodi (Collaborator) commented Feb 4, 2025

@enyst Is what this user is seeing as designed, or is there an issue here?

enyst (Collaborator) commented Feb 4, 2025

Thank you for the report @isolomatov-gd! You are correct: prompt sizes do constantly increase at the moment. There are three things that may help:

  • Prompt caching is enabled by default, so if the LLM and provider you're using support it, it should help with costs. It can make a huge difference with Anthropic.
  • When the prompt reaches the context window limit, we cut it in half.
  • We are working on condensers, a mechanism to reduce the prompt size while keeping important information available to the agent. Several options are actually merged already; they are just not enabled by default. I thought it was possible to enable one from the browser, but I can't get it to work on main right now: I tried setting localStorage.setItem('ENABLE_DEFAULT_CONDENSER', 'true') in the browser console, and that didn't work, sorry. It is possible to enable it programmatically if you run in development mode (a rough sketch of the idea follows below).
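
To make this concrete, here is a minimal sketch of what an "amortized forgetting" style condenser does, written against a generic event list. This is not the merged OpenHands implementation and the names are made up; it only illustrates the mechanism:

```python
# Illustrative only -- not the OpenHands condenser API, just the idea behind it.
# Keep the earliest events (task setup) and the most recent ones, and drop the
# middle of the history once it grows past a bound, so the prompt stays bounded.
from dataclasses import dataclass


@dataclass
class Event:
    content: str


def condense(events: list[Event], keep_first: int = 4, max_size: int = 40) -> list[Event]:
    """Return a condensed view of the history once it exceeds max_size."""
    if len(events) <= max_size:
        return events
    tail_size = max_size - keep_first
    # Head = initial task/instructions, tail = most recent activity.
    return events[:keep_first] + events[-tail_size:]
```

The merged options go further than this (for example, summarizing the dropped span instead of discarding it), but the effect on prompt size is the same: the history sent to the model stays bounded instead of growing with every step.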

@csmith49 may have more insight on how to enable it?

mamoodi (Collaborator) commented Feb 4, 2025

Given enyst's comment, I'm going to switch the label from bug to enhancement.

mamoodi added the "enhancement" label and removed the "bug" label on Feb 4, 2025