A repository of LLM prompts that can be served over MCP to your LLM. Write once, prompt anywhere.
- A collection of useful LLM prompts (see `src/llm-prompts`)
- LLM prompts are served via MCP
- Easily extendable for additional LLM prompts
- Built with TypeScript and the official MCP SDK
- Description: Provides detailed, step-by-step instructions for planning an LLM-assisted code change. Use this tool to help a software developer and LLM collaboratively create a robust, actionable plan for implementing a code change, including best practices for clarifying requirements, structuring tasks, and managing dependencies.
- Returns: The full contents of `code-change-planning-instructions.md`
- Description: Returns best-practice instructions and examples for writing semantic git commit messages. Use this tool to help a developer or LLM generate clear, conventional commit messages that communicate the intent and context of code changes, following the semantic commit format.
- Returns: The full contents of `git-commit-instructions.md`
- Description: Provides detailed, step-by-step instructions for completing a code change task. Use this tool to get instructions for completing a code change task defined in the plan.md file. Instructions include best practices for testing, debugging, and committing changes.
- Returns: The full contents of `code-change-task-completion-instructions.md`
- Clone the repository
  ```shell
  git clone [email protected]:BrentLayne/llm-prompts.git
  ```
- Install dependencies
  ```shell
  yarn install
  ```
- Build the project
  Compile the source code so it's ready to run:
  ```shell
  yarn build
  ```
- Add the MCP server to Cursor (or another MCP client of your choice) and have your LLM pull in prompts, which you can then use to prompt it!
Example Cursor usage:
- Add to MCP config
  ```jsonc
  // in Cursor's mcp.json file
  {
    ...other MCPs,
    "llm-prompts": {
      "command": "node",
      "args": ["/your/path/to/llm-prompts/build/server.js"]
    }
  }
  ```
- Prompt the agent to prompt itself
- Example: "Read the instructions on how to plan a feature and then follow those instructions to help me write a new feature"
- Add a new prompt to `src/llm-prompts`
- Register the prompt in `src/mcp-server/server.ts`
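Registration in `src/mcp-server/server.ts` might look roughly like the sketch below. This is not the repository's actual code: the `promptToolResult` helper and the `"my-new-prompt"` tool name are hypothetical, and the commented-out registration assumes the official `@modelcontextprotocol/sdk` `McpServer` API.

```typescript
import { readFileSync } from "node:fs";
import { join } from "node:path";

// Hypothetical helper (not from this repo): read a prompt file from
// src/llm-prompts and wrap its text in the MCP tool-result shape.
export function promptToolResult(promptsDir: string, filename: string) {
  const text = readFileSync(join(promptsDir, filename), "utf8");
  return { content: [{ type: "text" as const, text }] };
}

// Registration might then look like this (assuming the official SDK's
// McpServer instance is in scope as `server`):
//
//   server.tool(
//     "my-new-prompt",
//     "Returns the full contents of my-new-prompt.md",
//     async () => promptToolResult(PROMPTS_DIR, "my-new-prompt.md")
//   );
```

The helper keeps file reading in one place, so each new prompt only needs a one-line registration.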
- Useful commands:
  ```shell
  yarn dev
  yarn build
  yarn inspector
  ```