
Support for adding context dynamically to existing chats #97

Open
naddeoa opened this issue Jan 30, 2025 · 3 comments
naddeoa commented Jan 30, 2025

I've been trying all of the Neovim AI plugins over the last few days, and I've found Parrot to be the simplest option that can chat, manage chats, switch models, and replace/insert code into buffers. I especially love that chats are just editable buffers and that switching between different models within a single conversation is trivial.

The one thing I did like in the other plugins (like avante and codecompanion) that I'm not sure how to get in Parrot is context management in chats. The closest thing I could find so far was custom hooks and template variables, but those have some drawbacks. Specifically, as far as I can tell there's no way to use them in an existing chat, and there's no way to convert a one-time hook answer into a persistent chat so you can continue the conversation.

Did I overlook something? I think the way the other plugins accomplish this (with either # or / in the chat window) is pretty simple and elegant. If that were directly ported to Parrot, you would just take the current template variables and allow them to be added inline in chats, e.g. /multifilecontent.
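To make the idea concrete, here is a purely hypothetical sketch of what that could look like: expanding inline tokens in a chat message before it is sent. None of this is parrot.nvim API today; the token names and the `resolvers` table are invented for illustration.

```lua
-- Hypothetical sketch only: parrot.nvim does not expose this today.
-- Expand inline tokens such as /filecontent in a chat message,
-- mirroring the existing template variables.
local resolvers = {
  -- each resolver returns the text its token should expand to
  ["/filecontent"] = function()
    return table.concat(vim.api.nvim_buf_get_lines(0, 0, -1, false), "\n")
  end,
  ["/filetype"] = function()
    return vim.bo.filetype
  end,
}

local function expand_tokens(message)
  for token, resolve in pairs(resolvers) do
    -- vim.pesc escapes Lua pattern magic characters in the token;
    -- the function replacement avoids `%` capture expansion in the result
    message = message:gsub(vim.pesc(token), function() return resolve() end)
  end
  return message
end
```

A real implementation would presumably run something like `expand_tokens` at the point where Parrot assembles the chat payload, so the tokens work in existing chats too.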

@frankroeder
Owner

Thank you very much for your feedback. This is indeed the goal of Parrot: it should be simple to use and understand.

That's right, and I am already working on a possible solution. However, I am not happy with it yet, so it will take more time. It will definitely include something like @buffer or @file to add further context.

The catch is that features like these make everything considerably more complicated and harder to use and understand.

Continuing a hook in a new chat should be possible with an autocommand listening for the PrtDone user event, a visual selection, and then :PrtChatNew.
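A minimal sketch of that autocommand, assuming `PrtDone` is fired as a `User` event once a hook finishes:

```lua
-- Sketch: after a Parrot hook finishes (User event `PrtDone`),
-- restore the last visual selection and continue in a fresh chat.
vim.api.nvim_create_autocmd("User", {
  pattern = "PrtDone",
  callback = function()
    vim.cmd("normal! gv") -- reselect the last visual selection
    vim.cmd("PrtChatNew") -- hand it off to a new chat
  end,
})
```

Whether `gv` still covers the hook's output depends on the hook's target, so treat this as a starting point rather than a finished recipe.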

It is possible to start a chat with additional "placeholder" context:

      TalkToFile = function(prt, params)
        local chat_prompt = [[
        Let us talk about this code in file {{filename}}
        ```{{filetype}}
        {{filecontent}}
        ```
        ]]
        prt.ChatNew(params, chat_prompt)
      end,

It is not the nicest solution, since everything gets squeezed into the system prompt shown at the top of the chat, but it works for now.

@naddeoa
Author

naddeoa commented Jan 31, 2025

@frankroeder Thanks for getting back to me. I just added a bunch of aliases that invoke variants of the chat and rewrite/append/prepend commands with more context from the current file. That unblocks a lot of my issues, so I'm happy to wait for the actual implementation. In case it helps anyone, below are my variants of the built-in commands that also include the full content of the current file. I'll probably make another variant that includes the content of all open buffers as well.

As a side note, might a template that includes the current file be a decent default? Without it, something like "make this function retry" wouldn't really work, because the LLM wouldn't know that you already have a retry library imported, for example, and might just implement a one-off function. Or pydoc styles would be inconsistent with the rest of the file when adding documentation.

                hooks = {
                    RewriteFullContext = function(prt, params)
                        local chat_prompt = [[
I have the following selection from {{filename}}:

```{{filetype}}
{{selection}}
```

This is the full file for context:

```{{filetype}}
{{filecontent}}
```

{{command}}
Respond exclusively with the snippet that should replace the selection above.
DO NOT RESPOND WITH ANY TYPE OF COMMENTS, JUST THE CODE!!!
]]
                        local model_obj = prt.get_model("command")
                        prt.Prompt(params, prt.ui.Target.rewrite, model_obj, "🤖 " .. model_obj.name .. " rewrite ~ ",
                            chat_prompt)
                    end,
                    AppendFullContext = function(prt, params)
                        local chat_prompt = [[
I have the following selection from {{filename}}:

```{{filetype}}
{{selection}}
```

This is the full file for context:

```{{filetype}}
{{filecontent}}
```

{{command}}
Respond exclusively with the snippet that should be appended after the selection above.
DO NOT RESPOND WITH ANY TYPE OF COMMENTS, JUST THE CODE!!!
DO NOT REPEAT ANY CODE FROM ABOVE!!!
]]
                        local model_obj = prt.get_model("command")
                        prt.Prompt(params, prt.ui.Target.append, model_obj, "🤖 " .. model_obj.name .. " append ~ ",
                            chat_prompt)
                    end,
                    PrependFullContext = function(prt, params)
                        local chat_prompt = [[
I have the following selection from {{filename}}:

```{{filetype}}
{{selection}}
```

This is the full file for context:

```{{filetype}}
{{filecontent}}
```

{{command}}
Respond exclusively with the snippet that should be prepended before the selection above.
DO NOT RESPOND WITH ANY TYPE OF COMMENTS, JUST THE CODE!!!
DO NOT REPEAT ANY CODE FROM ABOVE!!!
]]
                        local model_obj = prt.get_model("command")
                        prt.Prompt(params, prt.ui.Target.prepend, model_obj, "🤖 " .. model_obj.name .. " prepend ~ ",
                            chat_prompt)
                    end,
                    ChatNewFullContext = function(prt, params)
                        local chat_prompt = [[
For this conversation, I have the following context.

These are the files I currently have open.

{{multifilecontent}}

I'm currently in file {{filename}}
]]
                        prt.ChatNew(params, chat_prompt)
                    end,
                },

@frankroeder
Owner

frankroeder commented Jan 31, 2025

@naddeoa That's definitely a very good reference, thank you! However, I would not be in favor of setting it as the default, because it consumes a lot more tokens and some users might not want that; at least I don't want it by default. You might also simply change the default prompts for the interactive commands in your personal settings: https://github.com/frankroeder/parrot.nvim/blob/ed43675e95c0064c205672f28da475187e9a1c03/lua/parrot/config.lua#L225C1-L268C6.
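For example, overriding one of those default prompts in `setup()` could look roughly like this. The key name `template_rewrite` is an assumption based on the linked config.lua; verify the exact names against your installed version.

```lua
-- Sketch: override the default rewrite prompt via setup().
-- `template_rewrite` is assumed from the linked config.lua revision.
require("parrot").setup({
  template_rewrite = [[
I have the following content from {{filename}}:

{{filetype}} code:
{{selection}}

{{command}}
Respond exclusively with the snippet that should replace the selection above.
]],
})
```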
