Platform-Specific Setup Guide

Detailed platform-specific instructions for optimal MCP Memory Service configuration on Windows, macOS, and Linux.

🍎 macOS Setup

Apple Silicon Macs (M1/M2/M3) - Recommended Setup

Optimal Configuration:

# Install with MPS acceleration
python install.py

# Verify Metal Performance Shaders is working
python -c "import torch; print('MPS available:', torch.backends.mps.is_available())"

# Configuration for best performance
export MCP_MEMORY_STORAGE_BACKEND=sqlite_vec
export PYTORCH_ENABLE_MPS_FALLBACK=1

Performance Benefits:

  • 3-5x faster embedding generation with MPS compared with CPU-only inference
  • Native ARM optimization
  • Excellent memory efficiency
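
To sanity-check embedding throughput on your machine, a rough timing one-liner like the one below can help. It assumes the sentence-transformers package is installed and uses the all-MiniLM-L6-v2 model referenced later in this guide; exact numbers will vary, and on recent sentence-transformers releases the model should be placed on MPS automatically.

# Time 100 embeddings on the default device (rough benchmark)
python -c "from sentence_transformers import SentenceTransformer; import time; m = SentenceTransformer('all-MiniLM-L6-v2'); t = time.time(); m.encode(['benchmark sentence'] * 100); print('100 embeddings in %.2fs' % (time.time() - t))"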

Intel Macs (2015-2020) - Legacy Hardware Setup

For 2015 MacBook Pro and similar legacy systems:

# Use legacy hardware optimization
python install.py --legacy-hardware

# Alternative: Use Homebrew PyTorch for better compatibility
python install.py --legacy-hardware --homebrew-torch

Legacy Hardware Optimizations:

  • CPU-only inference (no GPU acceleration)
  • Reduced memory footprint
  • Compatible with older macOS versions (10.15+)
  • Uses Homebrew PyTorch if available for better stability
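
After installing, you can confirm which accelerators PyTorch sees; on legacy Intel Macs both checks should report False (the MPS check requires PyTorch 1.12 or newer):

# Verify the PyTorch build in use (CPU-only is expected here)
python -c "import torch; print('Torch:', torch.__version__, '| CUDA:', torch.cuda.is_available(), '| MPS:', torch.backends.mps.is_available())"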

Homebrew PyTorch Setup (if needed):

# Install Homebrew if not present
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

# Install PyTorch via Homebrew
brew install pytorch

# Then run installer
python install.py --homebrew-torch

macOS Service Installation

LaunchAgent Setup (User Service):

# Install as user service (starts on login)
python scripts/install_service.py

# Service will be created at:
# ~/Library/LaunchAgents/com.mcp.memory-service.plist
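
To confirm the installer wrote a valid plist, you can lint it:

# Optional: validate the generated LaunchAgent plist
plutil -lint ~/Library/LaunchAgents/com.mcp.memory-service.plist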

Service Management:

# Start service
launchctl load ~/Library/LaunchAgents/com.mcp.memory-service.plist

# Stop service
launchctl unload ~/Library/LaunchAgents/com.mcp.memory-service.plist

# Check status
launchctl list | grep com.mcp.memory-service

# View logs
tail -f ~/.mcp_memory_service/logs/mcp-memory-service.log

System Service (requires sudo):

# Install as system service (starts on boot)
sudo python scripts/install_service.py --system

# Management with sudo
sudo launchctl load /Library/LaunchDaemons/com.mcp.memory-service.plist

🪟 Windows Setup

Windows 10/11 Standard Setup

Installation:

# Clone repository
git clone https://github.com/doobidoo/mcp-memory-service.git
cd mcp-memory-service

# Install (may detect CUDA automatically)
python install.py

# For service installation (requires Administrator)
python install.py --service

GPU Acceleration (if available):

# Check for CUDA support
python -c "import torch; print('CUDA available:', torch.cuda.is_available())"

# Or DirectML support
python -c "import torch_directml; print('DirectML available')"

Windows Service Installation

Install as Windows Service:

# Run as Administrator
python scripts/install_windows_service.py

# Or use the unified installer
python scripts/install_service.py

Service Management:

# Using Windows commands
net start MCPMemoryService
net stop MCPMemoryService
sc query MCPMemoryService

# Using PowerShell
Get-Service MCPMemoryService
Start-Service MCPMemoryService
Stop-Service MCPMemoryService

Service Configuration Location:

  • Config: %USERPROFILE%\.mcp_memory_service\service_config.json
  • Logs: %USERPROFILE%\.mcp_memory_service\logs\
  • Service Registry: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\MCPMemoryService
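
A quick way to inspect the configuration and recent log output from PowerShell (the *.log file name pattern is an assumption; check the logs directory for the actual names):

# View the service configuration
Get-Content "$env:USERPROFILE\.mcp_memory_service\service_config.json"

# Show the last 20 lines of the log files
Get-Content "$env:USERPROFILE\.mcp_memory_service\logs\*.log" -Tail 20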

Windows-Specific Features

PowerShell Scripts (located in scripts/windows/):

  • start_service.ps1 - Start service
  • stop_service.ps1 - Stop service
  • service_status.ps1 - Check status
  • install_service.ps1 - Install service
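
For example, to check service status from an elevated PowerShell prompt (the exact parameters each script accepts may vary):

# Run the status script without changing the system-wide execution policy
powershell.exe -ExecutionPolicy Bypass -File .\scripts\windows\service_status.ps1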

Environment Variables (PowerShell):

# Set environment variables
[Environment]::SetEnvironmentVariable("MCP_MEMORY_STORAGE_BACKEND", "sqlite_vec", "User")
[Environment]::SetEnvironmentVariable("MCP_HTTP_ENABLED", "true", "User")

🐧 Linux Setup

Ubuntu/Debian Setup

System Dependencies:

# Update system
sudo apt update

# Install Python and dependencies
sudo apt install python3.10 python3.10-venv python3.10-dev python3-pip git build-essential

# Install additional dependencies for AI/ML
sudo apt install libblas3 liblapack3 liblapack-dev libblas-dev gfortran

Installation:

# Create virtual environment (recommended)
python3 -m venv venv
source venv/bin/activate

# Install MCP Memory Service
python install.py

# For CUDA acceleration (if NVIDIA GPU present)
python install.py --cuda

GPU Acceleration on Linux

NVIDIA CUDA Setup:

# Check for CUDA installation
nvidia-smi

# Install the CUDA toolkit if needed
# See: https://developer.nvidia.com/cuda-downloads

# Install with CUDA support
python install.py --cuda

# Verify CUDA is working
python -c "import torch; print('CUDA available:', torch.cuda.is_available())"

AMD ROCm Setup:

# Install ROCm (AMD GPUs)
# See: https://docs.amd.com/bundle/ROCm-Installation-Guide

# Install with ROCm support
python install.py --rocm

Linux Service Installation (Systemd)

User Service (recommended):

# Install as user service
python scripts/install_service.py

# Service file location: ~/.config/systemd/user/mcp-memory.service
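
The installer generates the unit file for you. For reference, a minimal hand-written user unit might look like the sketch below; the ExecStart path and command are assumptions and may differ from what install_service.py actually writes:

# Example only: create a minimal user unit (adjust paths to your installation)
cat > ~/.config/systemd/user/mcp-memory.service <<'EOF'
[Unit]
Description=MCP Memory Service

[Service]
ExecStart=/path/to/mcp-memory-service/venv/bin/python /path/to/mcp-memory-service/scripts/run_server.py
Restart=on-failure

[Install]
WantedBy=default.target
EOF

# Reload user units so systemd picks up the file
systemctl --user daemon-reload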

System Service:

# Install as system service (requires sudo)
sudo python scripts/install_service.py --system

# Service file location: /etc/systemd/system/mcp-memory.service

Service Management:

# User service management
systemctl --user start mcp-memory
systemctl --user stop mcp-memory
systemctl --user status mcp-memory
systemctl --user enable mcp-memory  # Auto-start on login

# System service management (with sudo)
sudo systemctl start mcp-memory
sudo systemctl stop mcp-memory
sudo systemctl status mcp-memory
sudo systemctl enable mcp-memory  # Auto-start on boot

# View logs
journalctl --user -u mcp-memory -f  # User service
sudo journalctl -u mcp-memory -f    # System service
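
Once the service is running, you can also confirm it is listening; port 8000 is assumed here to match the firewall examples later in this guide, so substitute your configured port:

# Check for a listener on the expected port
ss -tlnp | grep 8000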

Linux Distribution-Specific Notes

CentOS/RHEL/Fedora:

# Install dependencies
sudo dnf install python3-devel python3-pip git gcc gcc-c++ blas-devel lapack-devel

# Continue with standard installation
python install.py

Arch Linux:

# Install dependencies
sudo pacman -S python python-pip git base-devel blas lapack

# Continue with standard installation
python install.py

🔧 Advanced Platform Configuration

Performance Optimization by Platform

macOS Intel (Legacy Hardware):

# Optimize for limited resources
export OMP_NUM_THREADS=2
export MKL_NUM_THREADS=2
export OPENBLAS_NUM_THREADS=2

# Use smaller embedding model
export MCP_EMBEDDING_MODEL="all-MiniLM-L6-v2"

macOS Apple Silicon:

# Optimize for MPS acceleration
export PYTORCH_ENABLE_MPS_FALLBACK=1
export MPS_CACHE_SIZE=2048

# Use full-size embedding model
export MCP_EMBEDDING_MODEL="all-mpnet-base-v2"

Windows with GPU:

# Optimize for CUDA/DirectML
$env:CUDA_VISIBLE_DEVICES="0"
$env:MCP_EMBEDDING_MODEL="all-mpnet-base-v2"

Linux with CUDA:

# Optimize for CUDA acceleration
export CUDA_VISIBLE_DEVICES=0
export MCP_EMBEDDING_MODEL="all-mpnet-base-v2"
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512

Memory and CPU Optimization

Low Memory Systems (8GB or less):

# Reduce memory usage
export MCP_BATCH_SIZE=16
export MCP_MAX_CONTEXT_LENGTH=512
export MCP_EMBEDDING_MODEL="all-MiniLM-L6-v2"

High Memory Systems (16GB+):

# Increase performance
export MCP_BATCH_SIZE=64
export MCP_MAX_CONTEXT_LENGTH=1024
export MCP_EMBEDDING_MODEL="all-mpnet-base-v2"
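
If you are unsure which profile applies, check the installed memory first:

# Linux: show total RAM
free -h

# macOS: show total RAM in bytes
sysctl -n hw.memsize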

🔒 Security Configuration by Platform

macOS Security

Firewall Configuration:

# Allow MCP Memory Service through firewall
sudo /usr/libexec/ApplicationFirewall/socketfilterfw --add /path/to/python
sudo /usr/libexec/ApplicationFirewall/socketfilterfw --unblockapp /path/to/python

Keychain Integration:

# Store API key in Keychain (optional)
security add-generic-password -a mcp-memory -s mcp-api-key -w "your-api-key"
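
The stored key can be read back later with:

# Retrieve the stored API key from the Keychain
security find-generic-password -a mcp-memory -s mcp-api-key -w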

Windows Security

Windows Firewall:

# Allow through Windows Firewall
New-NetFirewallRule -DisplayName "MCP Memory Service" -Direction Inbound -LocalPort 8000 -Protocol TCP -Action Allow

User Account Control:

  • Service installation requires Administrator privileges
  • Regular operation runs under user context

Linux Security

UFW Firewall (Ubuntu):

# Allow MCP Memory Service port
sudo ufw allow 8000/tcp
sudo ufw reload

SELinux (CentOS/RHEL):

# Allow service to bind to network ports
sudo setsebool -P httpd_can_network_connect 1

# Set SELinux context for service files
sudo semanage fcontext -a -t bin_t "/path/to/mcp-memory-service/scripts/run_server.py"
sudo restorecon -v /path/to/mcp-memory-service/scripts/run_server.py

🚨 Platform-Specific Troubleshooting

macOS Issues

"Cannot verify developer" error:

# Allow unverified apps (use with caution)
sudo spctl --master-disable
# Re-enable after installation: sudo spctl --master-enable

MPS acceleration not working:

# Check macOS version (requires macOS 12.3+)
sw_vers

# Reset MPS cache
rm -rf ~/.cache/torch/

Windows Issues

PowerShell execution policy:

# Enable script execution (run as Administrator)
Set-ExecutionPolicy RemoteSigned

Visual C++ Runtime missing:

  • Install Microsoft Visual C++ Redistributable
  • Or install Visual Studio Build Tools

Linux Issues

Permission denied errors:

# Fix permissions
chmod +x scripts/*.sh
sudo chown -R $USER:$USER ~/.mcp_memory_service/

Library version conflicts:

# Create clean virtual environment
python3 -m venv --clear venv
source venv/bin/activate
pip install --upgrade pip
python install.py

📋 Platform Verification Commands

Test Platform-Specific Features

macOS:

# Test MPS (Apple Silicon)
python -c "import torch; print('MPS:', torch.backends.mps.is_available())"

# Test service
launchctl list | grep com.mcp.memory-service

Windows:

# Test CUDA/DirectML
python -c "import torch; print('CUDA:', torch.cuda.is_available())"

# Test service
Get-Service MCPMemoryService

Linux:

# Test CUDA
python -c "import torch; print('CUDA:', torch.cuda.is_available())"

# Test service
systemctl --user status mcp-memory

Following the platform-specific steps above gives you an installation tuned for the best performance and compatibility on your system.
