Memory Service
Universal MCP memory service with intelligent triggers, OAuth team collaboration, and semantic search for AI assistants
Production-ready MCP memory service with zero database locks, hybrid backend (fast local + cloud sync), and intelligent memory search for AI assistants. Features v8.9.0 auto-configuration for multi-client access, 5ms local reads with background Cloudflare sync, Natural Memory Triggers with 85%+ accuracy, and OAuth 2.1 team collaboration. Works with Claude Desktop, VS Code, Cursor, Continue, and 13+ AI applications.
What's New: Database Maintenance & Type Consolidation - professional-grade tools for maintaining memory database health and organization.

📖 Full Details: CHANGELOG.md | Maintenance Guide | All Releases
```bash
# One-command installation with auto-configuration
git clone https://github.com/doobidoo/mcp-memory-service.git
cd mcp-memory-service && python install.py
# Choose option 4 (Hybrid - RECOMMENDED) when prompted

# Installer automatically configures:
# ✅ SQLite pragmas for concurrent access
# ✅ Cloudflare credentials for cloud sync
# ✅ Claude Desktop integration

# Done! Fast local + cloud sync with zero database locks
```
Install from PyPI:
```bash
# Install latest version from PyPI
pip install mcp-memory-service

# Or with uv (faster)
uv pip install mcp-memory-service
```
Then configure Claude Desktop by adding to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or equivalent:
{ "mcpServers": { "memory": { "command": "memory", "args": ["server"], "env": { "MCP_MEMORY_STORAGE_BACKEND": "hybrid" } } } }
For advanced configuration with the interactive installer, clone the repo and run python scripts/installation/install.py.
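As an optional sanity check after the PyPI install, you can confirm that the distribution and its server module resolve before wiring up Claude Desktop. This is a minimal sketch using only the package and module names shown elsewhere in this README:

```bash
# Verify the PyPI distribution is installed
pip show mcp-memory-service

# Verify the MCP server module used in the config examples is importable
python -c "import mcp_memory_service.server; print('mcp_memory_service.server OK')"
```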
Universal Installer (Most Compatible):
```bash
# Clone and install with automatic platform detection
git clone https://github.com/doobidoo/mcp-memory-service.git
cd mcp-memory-service

# Lightweight installation (SQLite-vec with ONNX embeddings - recommended)
python install.py

# Add full ML capabilities (torch + sentence-transformers for advanced features)
python install.py --with-ml

# Install with hybrid backend (SQLite-vec + Cloudflare sync)
python install.py --storage-backend hybrid
```
📝 Installation Options Explained:
- `--with-ml`: Adds PyTorch + sentence-transformers for advanced ML features - heavier but more capable
- `--storage-backend hybrid`: Hybrid backend with SQLite-vec + Cloudflare sync - best for multi-device access

Docker (Fastest):
```bash
# For MCP protocol (Claude Desktop)
docker-compose up -d

# For HTTP API + OAuth (Team Collaboration)
docker-compose -f docker-compose.http.yml up -d
```
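To confirm the container came up, the standard Compose commands are sufficient (pass the same `-f docker-compose.http.yml` flag if you started the HTTP/OAuth variant; service names depend on the compose file):

```bash
# List running services and follow their logs
docker-compose ps
docker-compose logs -f
```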
Smithery (Claude Desktop):
```bash
# Auto-install for Claude Desktop
npx -y @smithery/cli install @doobidoo/mcp-memory-service --client claude
```
Updating from an older version? Scripts have been reorganized for better maintainability:
- Use `python -m mcp_memory_service.server` in your Claude Desktop config (no path dependencies!)
- Use `uv run memory server` with UV tooling
- `scripts/run_memory_server.py` has moved to `scripts/server/run_memory_server.py`

On your first run, you'll see some warnings that are completely normal:
These warnings disappear after the first successful run. The service is working correctly! For details, see our First-Time Setup Guide.
sqlite-vec may not have pre-built wheels for Python 3.13 yet. If installation fails:
- Install an earlier Python via Homebrew and use it instead: `brew install python@3.12`
- Or use the Cloudflare backend: `--storage-backend cloudflare`

macOS users may encounter `enable_load_extension` errors with sqlite-vec. Workarounds (a quick check follows below):
- Use Homebrew Python: `brew install python && rehash`
- Rebuild Python with loadable-extension support: `PYTHON_CONFIGURE_OPTS='--enable-loadable-sqlite-extensions' pyenv install 3.12.0`
- Or switch backends: `--storage-backend cloudflare` or `--storage-backend hybrid`

👉 Visit our comprehensive Wiki for detailed guides.
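The quick check mentioned above: whether a given Python build supports loadable SQLite extensions (what sqlite-vec needs, and what's missing when `enable_load_extension` errors appear) can be tested with a standard-library one-liner. This is a generic diagnostic, not part of the project's own tooling:

```bash
# Prints OK on Python builds with loadable-extension support;
# fails (AttributeError/OperationalError) on builds without it, e.g. the stock macOS system Python.
python -c "import sqlite3; c = sqlite3.connect(':memory:'); c.enable_load_extension(True); print('loadable extensions: OK')"
```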
- Auto-configured SQLite pragmas (`busy_timeout=15000,cache_size=20000`)
- `/session-start` command for manual session initialization (workaround for issue #160)

Note: All heavy ML dependencies (PyTorch, sentence-transformers) are now optional to dramatically reduce build times and image sizes. SQLite-vec uses lightweight ONNX embeddings by default. Install with `--with-ml` for full ML capabilities.
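If you're not sure whether the optional ML extras made it into your environment, a quick import check is enough (torch and sentence-transformers are the packages named above; this is only a local sanity check):

```bash
# Succeeds only when the --with-ml extras (PyTorch + sentence-transformers) are installed
python -c "import torch, sentence_transformers; print('ML extras available, torch', torch.__version__)"
```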
```bash
# Start server with web interface
uv run memory server --http

# Access interactive dashboard
open http://127.0.0.1:8888/

# Upload documents via CLI
curl -X POST http://127.0.0.1:8888/api/documents/upload \
  -F "file=@document.pdf" \
  -F "tags=documentation,reference"

# Search document content
curl -X POST http://127.0.0.1:8888/api/search \
  -H "Content-Type: application/json" \
  -d '{"query": "authentication flow", "limit": 10}'
```
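For readability at the terminal, the same search request can be piped through Python's built-in JSON formatter. The endpoint and payload match the example above; `json.tool` is standard library, not part of this project:

```bash
# Pretty-print the semantic search response
curl -s -X POST http://127.0.0.1:8888/api/search \
  -H "Content-Type: application/json" \
  -d '{"query": "authentication flow", "limit": 5}' | python -m json.tool
```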
```bash
# Start OAuth-enabled server for team collaboration
export MCP_OAUTH_ENABLED=true
uv run memory server --http

# Claude Code team members connect via HTTP transport
claude mcp add --transport http memory-service http://your-server:8000/mcp
# → Automatic OAuth discovery, registration, and authentication
```
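Team members can then confirm the server is registered with Claude Code's standard MCP listing command (a sketch; output varies by Claude Code version):

```bash
# The memory-service entry added above should appear in the list
claude mcp list
```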
```bash
# Store a memory
uv run memory store "Fixed race condition in authentication by adding mutex locks"

# Search for relevant memories
uv run memory recall "authentication race condition"

# Search by tags
uv run memory search --tags python debugging

# Check system health (shows OAuth status)
uv run memory health
```
Recommended approach - Add to your Claude Desktop config (~/.claude/config.json):
{ "mcpServers": { "memory": { "command": "python", "args": ["-m", "mcp_memory_service.server"], "env": { "MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec" } } } }
Alternative approaches:
```jsonc
// Option 1: UV tooling (if using UV)
{
  "mcpServers": {
    "memory": {
      "command": "uv",
      "args": ["--directory", "/path/to/mcp-memory-service", "run", "memory", "server"],
      "env": {
        "MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec"
      }
    }
  }
}

// Option 2: Direct script path (v6.17.0+)
{
  "mcpServers": {
    "memory": {
      "command": "python",
      "args": ["/path/to/mcp-memory-service/scripts/server/run_memory_server.py"],
      "env": {
        "MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec"
      }
    }
  }
}
```
Hybrid Backend (v8.9.0+ RECOMMENDED):
```bash
# Hybrid backend with auto-configured pragmas
export MCP_MEMORY_STORAGE_BACKEND=hybrid
export MCP_MEMORY_SQLITE_PRAGMAS="busy_timeout=15000,cache_size=20000"

# Cloudflare credentials (required for hybrid)
export CLOUDFLARE_API_TOKEN="your-token"
export CLOUDFLARE_ACCOUNT_ID="your-account"
export CLOUDFLARE_D1_DATABASE_ID="your-db-id"
export CLOUDFLARE_VECTORIZE_INDEX="mcp-memory-index"

# Enable HTTP API
export MCP_HTTP_ENABLED=true
export MCP_HTTP_PORT=8000

# Security
export MCP_API_KEY="your-secure-key"
```
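If you don't already have a value for `MCP_API_KEY`, generating a random one with plain `openssl` works; this is a generic sketch, not project tooling:

```bash
# Generate a 64-character hex API key for the current shell session
export MCP_API_KEY="$(openssl rand -hex 32)"
echo "MCP_API_KEY length: ${#MCP_API_KEY}"
```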
SQLite-vec Only (Local):
```bash
# Local-only storage
export MCP_MEMORY_STORAGE_BACKEND=sqlite_vec
export MCP_MEMORY_SQLITE_PRAGMAS="busy_timeout=15000,cache_size=20000"
```
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   AI Clients    │    │  MCP Memory     │    │ Storage Backend │
│                 │    │  Service v8.9   │    │                 │
│ • Claude Desktop│◄──►│ • MCP Protocol  │◄──►│ • Hybrid 🌟     │
│ • Claude Code   │    │ • HTTP Transport│    │   (5ms local +  │
│   (HTTP/OAuth)  │    │ • OAuth 2.1 Auth│    │    cloud sync)  │
│ • VS Code       │    │ • Memory Store  │    │ • SQLite-vec    │
│ • Cursor        │    │ • Semantic      │    │ • Cloudflare    │
│ • 13+ AI Apps   │    │   Search        │    │                 │
│ • Web Dashboard │    │ • Doc Ingestion │    │ Zero DB Locks ✅│
│   (Port 8888)   │    │ • Zero DB Locks │    │ Auto-Config ✅  │
└─────────────────┘    └─────────────────┘    └─────────────────┘
mcp-memory-service/
├── src/mcp_memory_service/    # Core application
│   ├── models/                # Data models
│   ├── storage/               # Storage backends
│   ├── web/                   # HTTP API & dashboard
│   └── server.py              # MCP server
├── scripts/                   # Utilities & installation
├── tests/                     # Test suite
└── tools/docker/              # Docker configuration
See CONTRIBUTING.md for detailed guidelines.
- Run `python scripts/validation/validate_configuration_complete.py` to check your setup

Real-world metrics from active deployments:
Apache License 2.0 - see LICENSE for details.
Ready to supercharge your AI workflow? 🚀
👉 Start with our Installation Guide or explore the Wiki for comprehensive documentation.
Transform your AI conversations into persistent, searchable knowledge that grows with you.