Commit 331211b

dimodi and yordan-mitev authored
docs(AI): Connect Telerik MCP server to local LLM (#3013)
* docs(AI): Connect Telerik MCP server to local LLM
* Update ai/mcp-server.md

Co-authored-by: Yordan <[email protected]>
1 parent 330fc4d commit 331211b

File tree: 1 file changed, +4 -0 lines changed


ai/mcp-server.md

Lines changed: 4 additions & 0 deletions
@@ -171,6 +171,10 @@ The following list describes how your prompts may look like:
 
 @[template](/_contentTemplates/common/ai-coding-assistant.md#number-of-requests)
 
+## Connect to Local AI Model
+
+You can use the Telerik Blazor MCP server with local large language models (LLMs). For example, run your local model through [Ollama](https://ollama.com) and use a third-party package such as [MCP-LLM Bridge](https://github.com/patruff/ollama-mcp-bridge) to connect the model to the Telerik MCP server. This allows you to use the Telerik AI Coding Assistant without a cloud-based AI model.
+
 ## See Also
 
 * [Telerik Blazor extension for GitHub Copilot](slug:ai-copilot-extension)
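For illustration, the local-model setup described in the added section might be wired up with a bridge configuration along the following lines. This is a minimal sketch, not a verified setup: the field names follow the general shape of MCP bridge configs, and the npm package name `@progress/telerik-blazor-mcp`, the `TELERIK_LICENSE_PATH` variable, and the `llama3.1` model choice are assumptions to be checked against the respective projects' documentation.

```json
{
  "mcpServers": {
    "telerik-blazor": {
      "command": "npx",
      "args": ["-y", "@progress/telerik-blazor-mcp"],
      "env": {
        "TELERIK_LICENSE_PATH": "/path/to/telerik-license.txt"
      }
    }
  },
  "llm": {
    "provider": "ollama",
    "model": "llama3.1",
    "baseUrl": "http://localhost:11434"
  }
}
```

With a config of this shape, the bridge launches the Telerik MCP server as a child process over stdio and forwards tool calls from the locally served Ollama model (which listens on port 11434 by default).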
