
Integration Patterns

This guide covers common patterns for integrating MCP Memory Service with other tools and workflows.

Overview

MCP Memory Service can be integrated into various workflows to create a seamless knowledge management system. These patterns help you connect memories with your existing tools.

Development Tool Integrations

Git Hooks Integration

Automatically store commit information:

#!/bin/bash
# .git/hooks/post-commit

COMMIT_MSG=$(git log -1 --pretty=%B)
BRANCH=$(git branch --show-current)
FILES=$(git diff-tree --no-commit-id --name-only -r HEAD)

# Store in memory service
echo \"Store commit memory: Branch: $BRANCH, Message: $COMMIT_MSG, Files: $FILES\" | \\
  mcp-memory-cli store --tags \"git,commit,$BRANCH\"

VS Code Extension Pattern

Create a command to store code snippets:

// extension.js
vscode.commands.registerCommand('mcp.storeSnippet', async () => {
    const editor = vscode.window.activeTextEditor;
    if (!editor) {
        return; // no active editor, nothing to store
    }
    const selection = editor.document.getText(editor.selection);
    const language = editor.document.languageId;

    await mcpClient.storeMemory({
        content: `Code snippet:\n\`\`\`${language}\n${selection}\n\`\`\``,
        tags: ['code-snippet', language, 'vscode']
    });
});

CI/CD Pipeline Integration

Store deployment information:

# .github/workflows/deploy.yml
- name: Store Deployment Memory
  run: |
    MEMORY=\"Deployment to ${{ github.event.inputs.environment }}
    Version: ${{ github.sha }}
    Status: ${{ job.status }}
    Timestamp: $(date -u +\"%Y-%m-%dT%H:%M:%SZ\")\"
    
    curl -X POST http://localhost:8080/memory/store \\
      -H \"Content-Type: application/json\" \\
      -d \"{\\\"content\\\": \\\"$MEMORY\\\", \\\"tags\\\": [\\\"deployment\\\", \\\"${{ github.event.inputs.environment }}\\\"]}\"

Automation Patterns

Scheduled Memory Collection

Daily summary automation:

# daily_summary.py
import asyncio
import time
from datetime import datetime

import schedule

async def daily_memory_summary():
    # Collect today's memories
    today = datetime.now().date()
    memories = await memory_service.recall("today")

    # Generate summary
    summary = f"Daily Summary for {today}:\n"
    summary += f"- Total memories: {len(memories)}\n"
    summary += f"- Key topics: {extract_topics(memories)}\n"
    summary += f"- Completed tasks: {count_completed(memories)}\n"

    # Store summary
    await memory_service.store(
        content=summary,
        tags=["daily-summary", str(today)]
    )

# Schedule for 6 PM daily
schedule.every().day.at("18:00").do(lambda: asyncio.run(daily_memory_summary()))

# Keep the scheduler running
while True:
    schedule.run_pending()
    time.sleep(60)

Event-Driven Memory Creation

Automatically capture important events:

// error_logger.js
class MemoryErrorLogger {
    constructor(memoryService) {
        this.memory = memoryService;
    }
    
    async logError(error, context) {
        // Store error details
        await this.memory.store({
            content: `Error: ${error.message}\nStack: ${error.stack}\nContext: ${JSON.stringify(context)}`,
            tags: ['error', 'automated', context.service]
        });
        
        // Check for similar errors
        const similar = await this.memory.search(`error ${error.message.split(' ')[0]}`);
        if (similar.length > 0) {
            console.log('Similar errors found:', similar.length);
        }
    }
}

API Integration

REST API Wrapper

Simple HTTP interface for memory operations:

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/memory/store', methods=['POST'])
async def store_memory():
    data = request.json
    result = await memory_service.store(
        content=data['content'],
        tags=data.get('tags', [])
    )
    return jsonify({\"id\": result.id})

@app.route('/memory/search', methods=['GET'])
async def search_memories():
    query = request.args.get('q')
    results = await memory_service.search(query)
    return jsonify([r.to_dict() for r in results])
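
To try the wrapper, any HTTP client works. Below is a minimal sketch using Python's requests library; the base URL (localhost:5000, Flask's default port) is an assumption and depends on how you run the app:

# client_example.py -- illustrative client for the REST wrapper above;
# the host/port is assumed, adjust to your deployment.
import requests

BASE_URL = "http://localhost:5000"

# Store a memory
resp = requests.post(
    f"{BASE_URL}/memory/store",
    json={"content": "Refactored the auth module", "tags": ["refactor", "auth"]},
)
print("Stored memory id:", resp.json()["id"])

# Search memories
resp = requests.get(f"{BASE_URL}/memory/search", params={"q": "auth"})
for memory in resp.json():
    print(memory)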

Webhook Integration

Trigger memory storage from external services:

// webhook_handler.js
app.post('/webhook/github', async (req, res) => {
    const { action, pull_request, repository } = req.body;
    
    if (action === 'closed' && pull_request.merged) {
        await memoryService.store({
            content: `PR Merged: ${pull_request.title}\nRepo: ${repository.name}\nFiles changed: ${pull_request.changed_files}`,
            tags: ['github', 'pr-merged', repository.name]
        });
    }
    
    res.status(200).send('OK');
});

Workflow Examples

Documentation Workflow

Automatically document decisions:

class DecisionLogger:
    def __init__(self, memory_service):
        self.memory = memory_service
    
    async def log_decision(self, decision_type, title, rationale, alternatives):
        content = f\"\"\"
        Decision: {title}
        Type: {decision_type}
        Date: {datetime.now().isoformat()}
        
        Rationale: {rationale}
        
        Alternatives Considered:
        {chr(10).join(f'- {alt}' for alt in alternatives)}
        \"\"\"
        
        await self.memory.store(
            content=content,
            tags=['decision', decision_type, 'architecture']
        )

Learning Workflow

Track learning progress:

class LearningTracker {
    async trackProgress(topic, resource, notes, understood = true) {
        const entry = {
            content: `Learning: ${topic}\nResource: ${resource}\nNotes: ${notes}\nUnderstood: ${understood}`,
            tags: ['learning', topic, understood ? 'understood' : 'review-needed']
        };
        
        await this.memory.store(entry);
        
        // Check previous learning on topic
        const previous = await this.memory.search(`learning ${topic}`);
        return {
            stored: entry,
            previousEntries: previous.length
        };
    }
}

Team Knowledge Sharing

Broadcast important updates:

async def share_team_update(update_type, content, team_members):
    # Store in memory with team visibility
    memory = await memory_service.store(
        content=f\"Team Update ({update_type}): {content}\",
        tags=['team-update', update_type, 'shared']
    )
    
    # Notify team members (example with Slack)
    for member in team_members:
        await notify_slack(
            channel=member.slack_id,
            message=f\"New {update_type} update stored: {memory.id}\"
        )

Best Practices for Integration

  1. Use Consistent Tagging: Establish tag conventions for automated entries (e.g. source, event type, environment)
  2. Rate Limiting: Throttle automated writes so they don't flood the store with low-value memories
  3. Error Handling: Handle memory service failures gracefully so they never break the primary workflow (see the sketch after this list)
  4. Async Operations: Use async patterns to avoid blocking the calling tool
  5. Batch Operations: Group related memories into a single entry when possible
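
As a rough illustration of points 2 and 3, the sketch below wraps the async memory_service.store() assumed in the earlier examples in a hypothetical SafeMemoryWriter that drops writes arriving too quickly and logs storage failures instead of propagating them:

# safe_store.py -- illustrative only; memory_service and its async store()
# signature are assumptions carried over from the examples above.
import logging
import time

logger = logging.getLogger("mcp-integration")

class SafeMemoryWriter:
    """Hypothetical wrapper adding rate limiting and graceful failure handling."""

    def __init__(self, memory_service, min_interval_seconds=5.0):
        self.memory = memory_service
        self.min_interval = min_interval_seconds
        self._last_store = 0.0

    async def store(self, content, tags=None):
        # Rate limiting: skip writes that arrive too quickly after the last one
        now = time.monotonic()
        if now - self._last_store < self.min_interval:
            logger.debug("Rate limit hit, skipping memory write")
            return None
        self._last_store = now

        # Graceful error handling: never let a memory failure break the caller
        try:
            return await self.memory.store(content=content, tags=tags or [])
        except Exception as exc:
            logger.warning("Memory store failed, continuing without it: %s", exc)
            return None

Automations such as the error logger or Git hook above could route their writes through a wrapper like this so a memory service outage never interrupts the primary workflow.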

Conclusion

These integration patterns demonstrate how MCP Memory Service can become a central knowledge hub in your workflow. Start with simple integrations and gradually build more sophisticated automations as your needs grow.
