[Bug]: Custom prompt cannot specify model in v15.9.0 #1529

@rensftw

Description

Did you check the docs and existing issues?

  • I have read all the docs
  • I have updated the plugin to the latest version before submitting this issue
  • I have searched for existing issues/discussions
  • I have searched the existing issues of plugins related to this issue

Neovim version (nvim -v)

v0.11.1

Operating system/version

macOS v15.5

Adapter and model

Ollama and llama3:latest

Describe the bug

I have created a custom prompt and want it to run with a specific model by default (docs).
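
The relevant part of the prompt config (shown in full in the diff further down) sets the adapter as a table of name and model:

opts = {
    adapter = {
        name = 'ollama',
        model = 'llama3:latest',
    },
},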

This setup worked in CodeCompanion v15.8.0; however, in v15.9.0 I started getting this error:

E5108: Error executing lua: vim/shared.lua:0: s: expected string, got table
stack traceback:
        [C]: in function 'error'
        vim/shared.lua: in function 'validate'
        vim/shared.lua: in function 'startswith'
        ...codecompanion.nvim/lua/codecompanion/adapters/openai.lua:83: in function 'f'
        ...Cellar/neovim/0.11.1/share/nvim/runtime/lua/vim/iter.lua:335: in function 'map'
        ...codecompanion.nvim/lua/codecompanion/adapters/openai.lua:82: in function 'form_messages'
        .../nvim/lazy/codecompanion.nvim/lua/codecompanion/http.lua:95: in function 'request'
        ...ompanion.nvim/lua/codecompanion/strategies/chat/init.lua:891: in function 'submit'
        ...anion.nvim/lua/codecompanion/strategies/chat/keymaps.lua:216: in function 'rhs'
        ...y/codecompanion.nvim/lua/codecompanion/utils/keymaps.lua:80: in function <...y/codecompanion.nvim/lua/codecompanion/utils/keymaps.lua:77>
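
Judging by the traceback, form_messages in openai.lua iterates over the chat payload and calls vim.startswith on a value that, with this config, appears to be the { name, model } adapter table rather than a string. A minimal illustration of that failure (the table passed here is an assumption about what reaches vim.startswith, not a confirmed trace):

-- Run inside Neovim: vim.startswith validates that its first argument is a
-- string, so handing it a table raises the same error as above.
local adapter_opts = { name = 'ollama', model = 'llama3:latest' }
local ok, err = pcall(vim.startswith, adapter_opts, 'llama')
print(ok, err) -- false   vim/shared.lua: s: expected string, got table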

The only way I can get my custom prompt to work with CodeCompanion v15.9.0 is to stop specifying the model:

diff --git a/neovim/.config/nvim/lua/ai/utils/prompt_computer-networking.lua b/neovim/.config/nvim/lua/ai/utils/prompt_computer-networking.lua
index 64d47eb..64643c4 100644
--- a/neovim/.config/nvim/lua/ai/utils/prompt_computer-networking.lua
+++ b/neovim/.config/nvim/lua/ai/utils/prompt_computer-networking.lua
@@ -2,10 +2,7 @@ return {
     strategy = 'chat',
     description = 'Learning assistant',
     opts = {
-        adapter = {
-            name = 'ollama',
-            model = 'llama3:latest',
-        },
+        adapter = 'ollama',
     },
     prompts = {
         {

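If pinning the model per prompt is still the goal, a possible interim workaround (an untested sketch that reuses the adapters extend call already present in the repro below) is to register a second adapter entry with llama3:latest as its default model and reference it from the prompt by name; ollama_llama3 is a hypothetical name:

-- In require('codecompanion').setup({ adapters = { ... } }):
ollama_llama3 = function()
    return require('codecompanion.adapters').extend('ollama', {
        name = 'ollama_llama3',
        schema = {
            model = { default = 'llama3:latest' },
        },
    })
end,

-- In the custom prompt's opts, reference it as a plain string:
opts = {
    adapter = 'ollama_llama3',
},
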
Steps To Reproduce

  1. Use the repro.lua I have supplied
  2. :CodeCompanionActions
  3. Pick the Study buddy: computer networking action
  4. Send a simple chat message in CodeCompanion and observe the error

Video:

2025-05-30_11-48-45.mp4

Expected Behavior

Use the specified model for chats with the custom prompt.

init.lua file

vim.env["CODECOMPANION_TOKEN_PATH"] = vim.fn.expand("~/.config")

vim.env.LAZY_STDPATH = ".repro"
load(vim.fn.system("curl -s https://raw.githubusercontent.com/folke/lazy.nvim/main/bootstrap.lua"))()

-- Your CodeCompanion setup
local plugins = {
    {
        'olimorris/codecompanion.nvim',
        lazy = true,
        event = 'VeryLazy',
        cmd = {
            'CodeCompanion',
            'CodeCompanionChat',
            'CodeCompanionActions',
        },
        dependencies = {
            { "nvim-treesitter/nvim-treesitter", build = ":TSUpdate" },
            'nvim-lua/plenary.nvim',
        },
        config = function()
            require('codecompanion').setup({
                adapters = {
                    opts = {
                        log_level = 'DEBUG',
                    },
                    ollama = function()
                        return require('codecompanion.adapters').extend('ollama', {
                            name = 'ollama',
                            schema = {
                                model = {
                                    default = 'qwen2.5-coder:7b',
                                },
                            },
                        })
                    end,
                },
                strategies = {
                    chat = {
                        adapter = 'ollama',
                        roles = {
                            llm = function(adapter)
                                return '󱙺  ' .. adapter.formatted_name .. ' (' .. adapter.schema.model.default .. ')'
                            end,
                            user = '  Me',
                        },
                    },
                    inline = {
                        adapter = 'ollama',
                    },
                },
                prompt_library = {
                    ['Study buddy: computer networking'] = {
                        strategy = 'chat',
                        description = 'Learning assistant',
                        opts = {
                            adapter = {
                                name = 'ollama',
                                model = 'llama3:latest',
                            },
                        },
                        prompts = {
                            {
                                role = 'system',
                                content = [[You are an expert in Network Engineering. Your job is to be a study buddy for Comp Sci students.]],
                            },
                            {
                                role = "user",
                                content = "",
                            },
                        },
                    },
                },
            })
        end,
    },
}

require('lazy').setup(plugins)

Log output

CodeCompanion did not create .repro/state/nvim/codecompanion.log

Have you provided and tested with a repro.lua file?

  • Yes, I have tested and provided a repro.lua file
