[Bug]: inline strategy model overrides cause issue with v15.9.0 #1521

Closed
5 tasks done
dhruvinsh opened this issue May 29, 2025 · 11 comments · Fixed by #1536 or #1535
Labels
bug Something isn't working

Comments

@dhruvinsh
Contributor

Did you check the docs and existing issues?

  • I have read all the docs
  • I have updated the plugin to the latest version before submitting this issue
  • I have searched for existing issues/discussions
  • I have searched the existing issues of plugins related to this issue

Neovim version (nvim -v)

0.11.0

Operating system/version

macOS 15.5

Adapter and model

Copilot and gemini-2.5-pro

Describe the bug

According to the docs, I am overriding the adapter:

      strategies = {
        chat = { adapter = "copilot", slash_commands = { ["file"] = { opts = { provider = "fzf_lua" } } } },
        inline = { adapter = "copilot" },
        agent = { adapter = "copilot" },
      },

But when I try to run the inline strategy, I get the error below.

E5108: Error executing lua vim/shared.lua:0: s: expected string, got table
stack traceback:
        [C]: in function 'error'
        vim/shared.lua: in function 'validate'
        vim/shared.lua: in function 'startswith'
        ...odecompanion.nvim/lua/codecompanion/adapters/copilot.lua:361: in function 'condition'
        ...y/codecompanion.nvim/lua/codecompanion/adapters/init.lua:142: in function 'make_from_schema'
        ...y/codecompanion.nvim/lua/codecompanion/adapters/init.lua:218: in function 'map_schema_to_params'
        ...panion.nvim/lua/codecompanion/strategies/inline/init.lua:391: in function 'prompt'
        [string ":lua"]:1: in main chunk

The issue happens with v15.9.0 only; if I roll back to v15.8.0, it's fine.

According to the source code and docs, string values are allowed:

elseif type(adapter) == "string" then
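
For context, the "s: expected string, got table" in the traceback comes from vim.startswith(), which validates that its first argument is a string. A minimal standalone sketch of the failing call shape (the model table here is illustrative):

-- vim.startswith() rejects a non-string first argument, which is exactly
-- what the traceback above reports from copilot.lua's condition check.
local model = { name = "gemini-2.5-pro" }
local ok, err = pcall(vim.startswith, model, "o1")
print(ok, err) -- false, plus an error like "s: expected string, got table"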

Steps To Reproduce

  1. repro.lua is attached
  2. try to run inline strategy command.

Expected Behavior

Inline strategy model override with string name should work.

init.lua file

--[[
NOTE: Set the config path to enable the copilot adapter to work.
It will search the following paths for a token:
  - "$CODECOMPANION_TOKEN_PATH/github-copilot/hosts.json"
  - "$CODECOMPANION_TOKEN_PATH/github-copilot/apps.json"
--]]
vim.env["CODECOMPANION_TOKEN_PATH"] = vim.fn.expand("~/.config")

vim.env.LAZY_STDPATH = ".repro"
load(vim.fn.system("curl -s https://raw.githubusercontent.com/folke/lazy.nvim/main/bootstrap.lua"))()

-- Your CodeCompanion setup
local plugins = {
  {
    "olimorris/codecompanion.nvim",
    dependencies = {
      { "nvim-treesitter/nvim-treesitter", build = ":TSUpdate" },
      { "nvim-lua/plenary.nvim" },
      -- Test with blink.cmp (delete if not required)
      {
        "saghen/blink.cmp",
        lazy = false,
        version = "*",
        opts = {
          keymap = {
            preset = "enter",
            ["<S-Tab>"] = { "select_prev", "fallback" },
            ["<Tab>"] = { "select_next", "fallback" },
          },
          cmdline = { sources = { "cmdline" } },
          sources = {
            default = { "lsp", "path", "buffer", "codecompanion" },
          },
        },
      },
      -- Test with nvim-cmp
      -- { "hrsh7th/nvim-cmp" },
    },
    opts = {
      adapters = {
        opts = {
          show_model_choices = true,
        },
        copilot = function()
          return require("codecompanion.adapters").extend("copilot", {
            schema = {
              model = {
                default = "gemini-2.5-pro",
              },
            },
          })
        end,
      },
      strategies = {
        -- setting provider till snacks.nvim related PR gets merged
        chat = { adapter = "copilot", slash_commands = { ["file"] = { opts = { provider = "fzf_lua" } } } },
        inline = { adapter = "copilot" },
        agent = { adapter = "copilot" },
      },
      prompt_library = {
        ["git commits"] = {
          strategy = "inline",
          description = "Generate git commit for staged changes",
          opts = {
            placement = "replace",
            short_name = "orion_commit",
            auto_submit = true,
            adapter = {
              name = "copilot",
              model = "gpt-4.1",
            },
          },
          prompts = {
            {
              role = "user",
              content = function()
                return string.format(
                  [[You are an expert at following the Conventional Commit specification. Given the git diff listed below, please generate a commit message for me:


%s

]],
                  vim.fn.system("git diff --no-ext-diff --staged")
                )
              end,
              opts = {
                contains_code = true,
              },
            },
          },
        },
      },
    },
  },
}

-- Leaving this comment in to see if the issue author notices ;-)
-- This is so I can tell if they've really tested with their own repro.lua file

require("lazy.minit").repro({ spec = plugins })

-- Setup Tree-sitter
local ts_status, treesitter = pcall(require, "nvim-treesitter.configs")
if ts_status then
  treesitter.setup({
    ensure_installed = { "lua", "markdown", "markdown_inline", "yaml", "diff" },
    highlight = { enable = true },
  })
end

-- Setup nvim-cmp
-- local cmp_status, cmp = pcall(require, "cmp")
-- if cmp_status then
--   cmp.setup({
--     mapping = cmp.mapping.preset.insert({
--       ["<C-b>"] = cmp.mapping.scroll_docs(-4),
--       ["<C-f>"] = cmp.mapping.scroll_docs(4),
--       ["<C-Space>"] = cmp.mapping.complete(),
--       ["<C-e>"] = cmp.mapping.abort(),
--       ["<CR>"] = cmp.mapping.confirm({ select = true }),
--       -- Accept currently selected item. Set `select` to `false` to only confirm explicitly selected items.
--     }),
--   })
-- end

Log output

No response

Have you provided and tested with a repro.lua file?

  • Yes, I have tested and provided a repro.lua file
@dhruvinsh added the bug label May 29, 2025
Contributor

Important

If your issue does NOT contain a valid minimal.lua then this issue may be closed without a response.
Thanks for respecting my time and efforts.

Thanks @dhruvinsh. I'll get to this as soon as I can.

In the meantime, please ensure:

  • This is a plugin related issue and not an issue with your configuration
  • You've searched for similar issues (try the discussions too)
  • You've checked out the documentation
  • The tables in your configuration are nested correctly (again, check out the documentation)
  • The issue title is accurate
  • There is a valid minimal.lua file included so I can try and recreate the issue

@olimorris
Owner

[screenshot]

I can't recreate this with the minimal.lua file.

@lazybobcat

I have the same issue with Copilot after just installing codecompanion. I also rolled back to v15.8.0 and it fixed the issue.

@olimorris reopened this May 29, 2025
@olimorris
Owner

I'll open this up, but I cannot recreate this.

@olimorris marked this as a duplicate of #1524 May 29, 2025
@KingMichaelPark

@olimorris I can try your minimal file if you'd like. Just to confirm the above: is it the one they provided? They too are using Gemini for inline (overriding the copilot adapter to point at gemini-2.5-pro).

@olimorris
Owner

The minimal file provided by @dhruvinsh didn't throw an error for me as seen in the screenshot I shared. So either it doesn't contain the error or the instructions on how to cause the error are erroneous.

@KingMichaelPark

Interesting...

So I just prompted it this way:

[screenshot]

And that works fine.

It only failed when I used Gemini this way:

[screenshots]

@KingMichaelPark

So I have just added this:

        -- Prompt for input and send it straight to an inline CodeCompanion request
        local function prompt_codecompanion_gemini()
            local query = vim.fn.input("Query: ")
            if query ~= "" then
                vim.cmd(string.format("CodeCompanion gemini %s", query))
            end
        end
        vim.keymap.set({ "n", "v" }, "<leader>ai", prompt_codecompanion_gemini, { noremap = true, silent = true })

and it works around the code action failure.

@dhruvinsh
Contributor Author

dhruvinsh commented May 29, 2025

It sucks that @olimorris can't reproduce the issue; I will work on a better reproduction method. For now I am doing some debugging on the side, focusing on Copilot models only, and the o1 check doesn't seem okay. It should be return not vim.startswith(model.name, "o1").
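
For illustration, a defensive version of that check which tolerates both the old string form and the new table form of the model; the helper name and surrounding callback shape are assumptions, not the plugin's actual code:

-- Hypothetical guard: accept either "gpt-4.1" or { name = "gpt-4.1" }
-- before handing the value to vim.startswith(), which requires a string.
local function is_o1_family(model)
  if type(model) == "table" then
    model = model.name
  end
  return type(model) == "string" and vim.startswith(model, "o1")
end

-- A condition callback could then return: not is_o1_family(resolved_model)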

Once I applied the patch manually, I ran into a different issue, shown below:

Error executing vim.schedule lua callback: ...codecompanion.nvim/lua/codecompanion/adapters/openai.lua:282: attempt to index field 'choices' (a nil value)
stack traceback:
        ...codecompanion.nvim/lua/codecompanion/adapters/openai.lua:282: in function 'inline_output'
        ...panion.nvim/lua/codecompanion/strategies/inline/init.lua:406: in function 'cb'
        .../Documents/codecompanion.nvim/lua/codecompanion/http.lua:146: in function <.../Documents/codecompanion.nvim/lua/codecompanion/http.lua:143>

After inspecting the JSON response, it looks like this:

{
  error = {
    code = "invalid_request_body",
    message = "invalid request body, failed to validate schema: (1) Reason: got object, want string, Location: /properties/model/type."
  }
}

Update: the last piece of the puzzle. When we send the payload, we send the model information as a table, "model":{"name":"gemini-2.5-pro"}, and that is what causes the last failure.
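
To make that concrete, here is a minimal sketch (not the plugin's actual code) of the normalisation the fix needs: extract the string identifier before the request body is encoded, so the API receives "model": "gemini-2.5-pro" rather than an object.

-- Hypothetical helper: flatten a { name = "..." } model table into the
-- plain string the API schema expects.
local function normalize_model(payload)
  if type(payload.model) == "table" and payload.model.name then
    payload.model = payload.model.name
  end
  return payload
end

-- normalize_model({ model = { name = "gemini-2.5-pro" } })
-- returns { model = "gemini-2.5-pro" }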

dhruvinsh added a commit to dhruvinsh/codecompanion.nvim that referenced this issue May 29, 2025
…is#1521

This commit refactors the handling of model identifiers to consistently
support model objects. Previously, models were often treated as simple
strings. This change ensures that if a model is represented as an
object (e.g., `{ name = "model-id", ... }`), its `name` property is
correctly used.

Key changes:
- In `http.lua`, the request building logic now explicitly extracts
  `model.name` if `model` is an object, ensuring the correct string
  identifier is sent in the API request.
- Adapters for Copilot and OpenAI have been updated to access
  `model.name` for internal logic that relies on the model identifier,
  such as conditional parameter availability or message role
  transformations.

This change improves the robustness and flexibility of model handling
within the system, paving the way for more structured model metadata.
dhruvinsh added a commit to dhruvinsh/nvim that referenced this issue May 29, 2025
Reference: olimorris/codecompanion.nvim#1521

Updated the version of the 'codecompanion.nvim' plugin from '*' to 'v15.8.0' to ensure compatibility and stability.
@olimorris marked this as a duplicate of #1529 May 30, 2025
@ahmedelgabri

I also have a problem with :CodeCompanion /commit that started after the update. This is what I now get when running the command:

[ERROR] 2025-05-30 11:37:02
Error 400: {"type":"error","error":{"type":"invalid_request_error","message":"model: Input should be a valid string"}}
Press ENTER or type command to continue

But if I open the chat window :CodeCompanionChat and then type /commit, it works fine.

prompt_library = {
	['Generate a Commit Message'] = {
		opts = {
			adapter = {
				name = 'anthropic',
				model = 'claude-3-5-sonnet-20241022',
			},
		},
	},
},
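
For what it's worth, a possible interim workaround on v15.9.0 (assuming the string form still resolves, as the source snippet quoted in the original report suggests) is to drop the table form and let the adapter's default model apply:

-- Hypothetical workaround, not the confirmed fix: the plain string form
-- keeps a { name = ..., model = ... } table out of the request body.
prompt_library = {
	['Generate a Commit Message'] = {
		opts = {
			adapter = 'anthropic', -- adapter name only; its default model is used
		},
	},
},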

@olimorris
Owner

This should now be closed.

Some feedback on this bug report:

  1. The steps to reproduce weren't clear to me
  2. I interpreted "try to run inline strategy command" as :CodeCompanion <my inline prompt>
  3. When you actually meant :CodeCompanionActions followed by selecting "git commits"

This is why I couldn't reproduce this at first. So please be explicit and non-assuming in the future.
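
For future readers, the reproduction that actually triggers the error, per the discussion above:

-- Using the repro init.lua from this issue (a prompt_library entry whose
-- opts.adapter is the table { name = "copilot", model = "gpt-4.1" }):
vim.cmd("CodeCompanionActions") -- then pick "git commits"
-- or run the prompt's short name directly:
-- vim.cmd("CodeCompanion /orion_commit")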
