
Autocompletion uses dummy template with Qwen Coder + Ollama #3353

Closed as not planned
@AnnoyingTechnology


Relevant environment info

- OS: MacOS
- Continue version: main
- IDE version: VS Code
- Model: N/A
- config.json:
  "tabAutocompleteModel": {
    "title": "Qwen 2.5 Coder 3B",
    "provider": "ollama",
    "model": "qwen2.5-coder:3b",
    "contextLength": 2048,
    "completionOptions": {
      "temperature": 0.15,
      "topP": 0.6,
      "topK": 10,
      "maxTokens": 256,
      "keepAlive": 14400
    }
  },

Description

When using Qwen Coder via Ollama, a dummy prompt format (madeupFimPromptTemplate) is used instead of Qwen's proper FIM prompt template.
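For reference, Qwen 2.5 Coder's fill-in-the-middle format uses the special tokens `<|fim_prefix|>`, `<|fim_suffix|>`, and `<|fim_middle|>`. A minimal sketch of what the autocomplete prompt should look like (the function name and structure below are illustrative, not Continue's actual template code):

```python
# Sketch of Qwen 2.5 Coder's FIM prompt layout. The special tokens are
# Qwen's documented FIM markers; the helper itself is hypothetical.
def qwen_fim_prompt(prefix: str, suffix: str) -> str:
    # Model is asked to generate the "middle" that fits between
    # the code before the cursor (prefix) and after it (suffix).
    return f"<|fim_prefix|>{prefix}<|fim_suffix|>{suffix}<|fim_middle|>"

prompt = qwen_fim_prompt("def add(a, b):\n    return ", "\n\nprint(add(1, 2))")
```

A generic or made-up template instead of this token layout would explain degraded completions with the Ollama backend.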

To reproduce

This doesn't happen with LM Studio as a backend. I was not able to provide a PR.

Log output

No response

Metadata


Labels

- area:autocomplete (relates to the autocomplete feature)
- ide:vscode (relates specifically to the VS Code extension)
- kind:bug (indicates an unexpected problem or unintended behavior)
- priority:medium (indicates medium priority)
- stale
