feat!: use prompt library in chat buffer and better keymap support
Hopefully this release dramatically improves the usability of the
plugin.

Firstly, you can now mark specific prompts (from the prompt library) as
available to appear in the slash command completion menu in the chat
buffer via the `opts.is_slash_cmd` value. This will close #286.
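
As a sketch, marking a prompt this way might look like the following
in a user's config (this assumes `prompt_library` sits at the top
level of the setup table and mirrors the `Generate a Commit Message`
entry documented in RECIPES below):

```lua
require("codecompanion").setup({
  prompt_library = {
    ["Generate a Commit Message"] = {
      strategy = "chat",
      description = "Generate a commit message",
      opts = {
        -- Surfaces the prompt as /commit in the chat buffer's
        -- completion menu
        is_slash_cmd = true,
        short_name = "commit",
        auto_submit = true,
      },
      prompts = {
        -- Prompts go here
      },
    },
  },
})
```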

Secondly, the default keymaps that came with each prompt have been
removed in favour of `require("codecompanion").prompt("commit")`. This
makes it much cleaner to assign your own keymaps to prompts in the
prompt library.
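
For example, a short sketch of how one of the removed defaults could
be recreated (the `<LocalLeader>cm` binding is purely illustrative and
follows the keymap pattern added to the README in this commit):

```lua
-- Illustrative replacement for a removed default keymap: trigger the
-- "commit" prompt from the prompt library via the new helper
vim.api.nvim_set_keymap("n", "<LocalLeader>cm", "", {
  callback = function()
    require("codecompanion").prompt("commit")
  end,
  noremap = true,
  silent = true,
})
```
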
olimorris committed Oct 8, 2024
1 parent ade3924 commit b462c42
Showing 10 changed files with 249 additions and 178 deletions.
19 changes: 19 additions & 0 deletions README.md
@@ -230,6 +230,9 @@ vim.api.nvim_set_keymap("v", "ga", "<cmd>CodeCompanionChat Add<cr>", { noremap =
vim.cmd([[cab cc CodeCompanion]])
```

> [!NOTE]
> You can also assign prompts from the library to specific mappings. See the [prompt library](#clipboard-prompt-library) section for more information.
## :gear: Configuration

Before configuring the plugin, it's important to understand how it's structured.
@@ -441,6 +444,22 @@ require("codecompanion").setup({

The plugin comes with a number of pre-built prompts. As per [the config](https://github.com/olimorris/codecompanion.nvim/blob/main/lua/codecompanion/config.lua), these can be called via keymaps or slash commands (via the inline assistant). These prompts have been carefully curated to mimic those in [GitHub's Copilot Chat](https://docs.github.com/en/copilot/using-github-copilot/asking-github-copilot-questions-in-your-ide). Of course, you can create your own prompts and add them to the Action Palette. Please see the [RECIPES](doc/RECIPES.md) guide for more information.

**Using Keymaps**

You can call a prompt from the library via a keymap using the `prompt` helper:

```lua
vim.api.nvim_set_keymap("v", "<LocalLeader>ce", "", {
callback = function()
require("codecompanion").prompt("explain")
end,
noremap = true,
silent = true,
})
```

In the example above, we've set a visual keymap that will trigger the Explain prompt. Providing the `short_name` of the prompt as an argument to the helper (e.g. "commit") will resolve the strategy down to an action.

### :speech_balloon: The Chat Buffer

The chat buffer is where you converse with an LLM from within Neovim. The chat buffer has been designed to be turn-based, whereby you send a message and the LLM replies. Messages are segmented by H2 headers and once a message has been sent, it cannot be edited. You can also have multiple chat buffers open at the same time.
29 changes: 26 additions & 3 deletions doc/RECIPES.md
Expand Up @@ -106,7 +106,7 @@ require("codecompanion").setup({
opts = {
mapping = "<LocalLeader>ce",
modes = { "v" },
slash_cmd = "expert",
short_name = "expert",
auto_submit = true,
stop_context_insertion = true,
user_prompt = true,
@@ -145,14 +145,14 @@ At first glance there's a lot of new stuff in this. Let's break it down.
opts = {
mapping = "<LocalLeader>ce",
modes = { "v" },
slash_cmd = "expert",
short_name = "expert",
auto_submit = true,
stop_context_insertion = true,
user_prompt = true,
},
```

In the `opts` table we're specifying that we only want this action to appear in the _Action Palette_ if we're in visual mode. We're also asking the chat strategy to automatically submit the prompts to the LLM via the `auto_submit = true` value. We're also telling the picker that we want to get the user's input before we action the response with `user_prompt = true`. With the `slash_cmd = "expert"` option, the user can run `:CodeCompanion /expert` from the cmdline in order to trigger this prompt. Finally, as we define a prompt to add any visually selected text to the chat buffer, we need to add the `stop_context_insertion = true` option to prevent the chat buffer from duplicating this. Remember that visually selecting text and opening a chat buffer will result in that selection being added as a codeblock.
In the `opts` table we're specifying that we only want this action to appear in the _Action Palette_ if we're in visual mode. We're also asking the chat strategy to automatically submit the prompts to the LLM via the `auto_submit = true` value. We're also telling the picker that we want to get the user's input before we action the response with `user_prompt = true`. With the `short_name = "expert"` option, the user can run `:CodeCompanion /expert` from the cmdline in order to trigger this prompt. Finally, as we define a prompt to add any visually selected text to the chat buffer, we need to add the `stop_context_insertion = true` option to prevent the chat buffer from duplicating this. Remember that visually selecting text and opening a chat buffer will result in that selection being added as a codeblock.

### Prompt options and context

@@ -271,6 +271,29 @@ And to determine the visibility of actions in the palette itself:

## Other Configuration Options

**Allowing a Prompt to appear as a Slash Command**

It can be useful to have a prompt from the prompt library appear as a slash command in the chat buffer, as with the `Generate a Commit Message` action. This can be done by specifying an `is_slash_cmd = true` option on the prompt:

```lua
["Generate a Commit Message"] = {
strategy = "chat",
description = "Generate a commit message",
opts = {
index = 9,
is_default = true,
is_slash_cmd = true,
short_name = "commit",
auto_submit = true,
},
prompts = {
-- Prompts go here
}
}
```

In the chat buffer, if you type `/` you will see the value of `opts.short_name` appear in the completion menu for you to expand.

**Specifying an Adapter and Model**

```lua
34 changes: 30 additions & 4 deletions doc/codecompanion-recipes.txt
@@ -1,4 +1,4 @@
*codecompanion-recipes.txt* For NVIM v0.10.0 Last change: 2024 October 04
*codecompanion-recipes.txt* For NVIM v0.10.0 Last change: 2024 October 08

==============================================================================
Table of Contents *codecompanion-recipes-table-of-contents*
@@ -145,7 +145,7 @@ builtin to the plugin as the _Code Advisor_ action:
opts = {
mapping = "<LocalLeader>ce",
modes = { "v" },
slash_cmd = "expert",
short_name = "expert",
auto_submit = true,
stop_context_insertion = true,
user_prompt = true,
@@ -185,7 +185,7 @@ PALETTE OPTIONS ~
opts = {
mapping = "<LocalLeader>ce",
modes = { "v" },
slash_cmd = "expert",
short_name = "expert",
auto_submit = true,
stop_context_insertion = true,
user_prompt = true,
@@ -197,7 +197,7 @@ in the _Action Palette_ if we’re in visual mode. We’re also asking the chat
strategy to automatically submit the prompts to the LLM via the `auto_submit =
true` value. We’re also telling the picker that we want to get the user’s
input before we action the response with `user_prompt = true`. With the
`slash_cmd = "expert"` option, the user can run `:CodeCompanion /expert` from
`short_name = "expert"` option, the user can run `:CodeCompanion /expert` from
the cmdline in order to trigger this prompt. Finally, as we define a prompt to
add any visually selected text to the chat buffer, we need to add the
`stop_context_insertion = true` option to prevent the chat buffer from
@@ -331,6 +331,32 @@ And to determine the visibility of actions in the palette itself:

OTHER CONFIGURATION OPTIONS*codecompanion-recipes-other-configuration-options*

**Allowing a Prompt to appear as a Slash Command**

It can be useful to have a prompt from the prompt library appear as a slash
command in the chat buffer, as with the `Generate a Commit Message` action.
This can be done by specifying an `is_slash_cmd = true` option on the prompt:

>lua
["Generate a Commit Message"] = {
strategy = "chat",
description = "Generate a commit message",
opts = {
index = 9,
is_default = true,
is_slash_cmd = true,
short_name = "commit",
auto_submit = true,
},
prompts = {
-- Prompts go here
}
}
<

In the chat buffer, if you type `/` you will see the value of `opts.short_name`
appear in the completion menu for you to expand.

**Specifying an Adapter and Model**

>lua
23 changes: 22 additions & 1 deletion doc/codecompanion.txt
@@ -1,4 +1,4 @@
*codecompanion.txt* For NVIM v0.10.0 Last change: 2024 October 04
*codecompanion.txt* For NVIM v0.10.0 Last change: 2024 October 07

==============================================================================
Table of Contents *codecompanion-table-of-contents*
@@ -205,6 +205,9 @@ For an optimum workflow, I recommend the following keymaps:
<


[!NOTE] You can also assign prompts from the library to specific mappings. See
the |codecompanion-prompt-library| section for more information.

CONFIGURATION *codecompanion-configuration*

Before configuring the plugin, it’s important to understand how it’s
@@ -454,6 +457,24 @@ Chat
Of course, you can create your own prompts and add them to the Action Palette.
Please see the RECIPES <doc/RECIPES.md> guide for more information.

**Using Keymaps**

You can call a prompt from the library via a keymap using the `prompt` helper:

>lua
vim.api.nvim_set_keymap("v", "<LocalLeader>ce", "", {
callback = function()
require("codecompanion").prompt("explain")
end,
noremap = true,
silent = true,
})
<

In the example above, we’ve set a visual keymap that will trigger the Explain
prompt. Providing the `short_name` of the prompt as an argument to the helper
(e.g. "commit") will resolve the strategy down to an action.


THE CHAT BUFFER ~

57 changes: 47 additions & 10 deletions lua/cmp_codecompanion/slash_commands.lua
@@ -1,5 +1,6 @@
local config = require("codecompanion").config
local SlashCommands = require("codecompanion.strategies.chat.slash_commands")
local strategy = require("codecompanion.strategies")

local source = {}

@@ -20,12 +21,15 @@ function source:get_keyword_pattern()
end

function source:complete(params, callback)
local items = {}
local kind = require("cmp").lsp.CompletionItemKind.Function

for name, data in pairs(config.strategies.chat.slash_commands) do
if name ~= "opts" then
table.insert(items, {
local slash_commands = vim
.iter(config.strategies.chat.slash_commands)
:filter(function(name)
return name ~= "opts"
end)
:map(function(name, data)
return {
label = "/" .. name,
kind = kind,
detail = data.description,
@@ -34,12 +38,34 @@ function source:complete(params, callback)
bufnr = params.context.bufnr,
cursor = params.context.cursor,
},
})
end
end
}
end)
:totable()

local prompts = vim
.iter(config.prompt_library)
:filter(function(_, v)
return v.opts and v.opts.is_slash_cmd and v.strategy == "chat"
end)
:map(function(_, v)
return {
label = "/" .. v.opts.short_name,
kind = kind,
detail = v.description,
config = v,
from_prompt_library = true,
context = {
bufnr = params.context.bufnr,
cursor = params.context.cursor,
},
}
end)
:totable()

local all_items = vim.tbl_extend("force", slash_commands, prompts)

callback({
items = items,
items = all_items,
isIncomplete = false,
})
end
@@ -50,9 +76,20 @@ end
---@return nil
function source:execute(item, callback)
vim.api.nvim_set_current_line("")

item.Chat = require("codecompanion").buf_get_chat(item.context.bufnr)
SlashCommands:execute(item)

if item.from_prompt_library then
local prompts = strategy.evaluate_prompts(item.config.prompts, item.context)
vim.iter(prompts):each(function(prompt)
if prompt.role == "system" then
item.Chat:add_message(prompt, { visible = false })
elseif prompt.role == "user" then
item.Chat:append_to_buf(prompt)
end
end)
else
SlashCommands:execute(item)
end

callback(item)
vim.bo[item.context.bufnr].buflisted = false
90 changes: 2 additions & 88 deletions lua/codecompanion/actions/prompt_library.lua
@@ -1,7 +1,3 @@
local Strategy = require("codecompanion.strategies")
local context_utils = require("codecompanion.utils.context")
local api = vim.api

local M = {}

local _prompts = {}
@@ -14,6 +10,7 @@ local _prompts = {}
function M.resolve(context, config)
local sort_index = true

--TODO: Replace with vim.iter()
for name, prompt in pairs(config.prompt_library) do
if
not config.display.action_palette.opts.show_default_prompt_library and (prompt.opts and prompt.opts.is_default)
@@ -35,7 +32,7 @@ function M.resolve(context, config)
description = prompt.description(context)
end
if prompt.opts and prompt.opts.slash_cmd then
description = "(/" .. prompt.opts.slash_cmd .. ") " .. description
description = description
end

table.insert(_prompts, {
@@ -60,87 +57,4 @@ function M.resolve(context, config)
return _prompts
end

---@param desc table
---@return string|function|nil
local function resolve_description(desc)
if type(desc) == "string" then
return desc
end
if type(desc) == "function" then
return desc()
end
end

---Setup the keymap
---@param map_config table
---@param mode string
---@return nil
local function map(map_config, mode)
return api.nvim_set_keymap(mode, map_config.opts.mapping, "", {
callback = function()
local context = context_utils.get(api.nvim_get_current_buf())

return Strategy.new({
context = context,
selected = map_config,
}):start(map_config.strategy)
end,
desc = resolve_description(map_config.description),
noremap = true,
silent = true,
})
end

---Setup the keymaps for the prompt library
---@param config table
---@return nil
function M.setup_keymaps(config)
if not config.opts.set_prompt_library_keymaps then
return
end

local prompts = config.prompt_library

for _, prompt in pairs(prompts) do
if prompt.opts and prompt.opts.mapping then
if not config.display.action_palette.opts.show_default_prompt_library and prompt.opts.is_default then
goto continue
end
if prompt.opts.modes and type(prompt.opts.modes) == "table" then
for _, mode in ipairs(prompt.opts.modes) do
map(prompt, mode)
end
else
map(prompt, "n")
end
end
::continue::
end
end

---Setup the inline slash commands for the prompt library
---@param config table
---@return table
function M.setup_inline_slash_commands(config)
local slash_cmds = {}
local prompts = config.prompt_library

for name, prompt in pairs(prompts) do
if prompt.opts then
if not config.display.action_palette.opts.show_default_prompt_library and prompt.opts.is_default then
goto continue
end

if prompt.opts.slash_cmd then
prompt.name = name
slash_cmds[prompt.opts.slash_cmd] = prompt
end

::continue::
end
end

return slash_cmds
end

return M
