Closed as not planned
Before submitting your bug report
- I believe this is a bug. I'll try to join the Continue Discord for questions
- I'm not able to find an open issue that reports the same bug
- I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: macOS
- Continue version: main
- IDE version: VS Code
- Model: N/A
- config.json:
"tabAutocompleteModel": {
"title": "Qwen 2.5 Coder 3B",
"provider": "ollama",
"model": "qwen2.5-coder:3b",
"contextLength": 2048,
"completionOptions": {
"temperature": 0.15,
"topP": 0.6,
"topK": 10,
"maxTokens": 256,
"keepAlive": 14400
}
},
Description
When using Qwen Coder via Ollama, a dummy prompt format (madeupFimPromptTemplate) is used instead of Qwen's proper FIM prompt template.
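
For reference, Qwen 2.5 Coder's documented FIM format is the prefix/suffix/middle layout built from its `<|fim_prefix|>`, `<|fim_suffix|>`, and `<|fim_middle|>` special tokens. A minimal TypeScript sketch of what the expected template would produce (the function name and shape are illustrative only, not Continue's actual code):

```typescript
// Illustrative sketch only -- not Continue's implementation.
// Builds a FIM prompt in the PSM layout that Qwen 2.5 Coder expects.
function qwenFimPrompt(prefix: string, suffix: string): string {
  return `<|fim_prefix|>${prefix}<|fim_suffix|>${suffix}<|fim_middle|>`;
}

// Example: asking the model to fill in a function body.
const prompt = qwenFimPrompt(
  "def add(a, b):\n    return ",
  "\n\nprint(add(1, 2))\n"
);
```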
To reproduce
This doesn't happen with LM Studio as the backend. I wasn't able to put together a PR myself.
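
One way to compare (my own suggestion, not steps from the original report) is to send a raw FIM prompt straight to Ollama's /api/generate endpoint and check it against the prompt Continue emits. The sketch below reuses the model name and sampling options from the config above; everything else is illustrative:

```typescript
// Direct request to Ollama's /api/generate with raw mode enabled,
// so the server applies no template of its own.
async function rawFimCompletion(prefix: string, suffix: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "qwen2.5-coder:3b",
      prompt: `<|fim_prefix|>${prefix}<|fim_suffix|>${suffix}<|fim_middle|>`,
      raw: true,
      stream: false,
      options: { temperature: 0.15, top_p: 0.6, top_k: 10, num_predict: 256 },
    }),
  });
  const data = (await res.json()) as { response: string };
  return data.response;
}
```

If the raw FIM prompt yields sensible completions while Continue's Ollama path does not, that points at the template selection rather than the model.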
Log output
No response