
# Graphiti

🌟 A powerful knowledge graph server for AI agents, built with Neo4j and integrated with the Model Context Protocol (MCP).
## 🚀 Quick Start

Clone the repository:

```bash
git clone https://github.com/gifflet/graphiti-mcp-server.git
cd graphiti-mcp-server
```

Set up environment variables:

```bash
cp .env.sample .env
```

Edit `.env` with your configuration:

```env
# Required for LLM operations
OPENAI_API_KEY=your_openai_api_key_here
MODEL_NAME=gpt-4o

# Optional: Custom OpenAI endpoint
# OPENAI_BASE_URL=https://api.openai.com/v1

# Neo4j Configuration (defaults work with Docker)
NEO4J_URI=bolt://neo4j:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=demodemo
```
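Before starting the services, a quick pre-flight check can catch a missing or placeholder value in `.env`. A minimal sketch — the sample file written here only makes the snippet self-contained, and the `env-check-sample` file name is illustrative:

```shell
# Illustrative pre-flight check for a .env-style file.
# In a real checkout you would point ENV_FILE at your actual .env.
ENV_FILE=env-check-sample
printf 'OPENAI_API_KEY=sk-test\nMODEL_NAME=gpt-4o\n' > "$ENV_FILE"

for var in OPENAI_API_KEY MODEL_NAME NEO4J_URI; do
  if grep -q "^${var}=" "$ENV_FILE"; then
    echo "$var: set"
  else
    echo "$var: missing (a default may apply)"
  fi
done
```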
Start the services:

```bash
docker compose up -d
```

Verify the installation:

```bash
# Check if services are running
docker compose ps

# Check logs
docker compose logs graphiti-mcp
```
You can run with environment variables directly:

```bash
OPENAI_API_KEY=your_key MODEL_NAME=gpt-4o docker compose up
```
## 📊 Services

| Service | Port | Purpose |
|---|---|---|
| Neo4j Browser | 7474 | Web interface for graph visualization |
| Neo4j Bolt | 7687 | Database connection |
| Graphiti MCP | 8000 | MCP server endpoint |
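For scripting against these services, the port mapping can be captured in a small helper. A sketch — the `port_for` function is hypothetical and not part of the repository:

```shell
# Hypothetical lookup mirroring the service/port table above.
port_for() {
  case "$1" in
    neo4j-browser) echo 7474 ;;
    neo4j-bolt)    echo 7687 ;;
    graphiti-mcp)  echo 8000 ;;
    *) echo "unknown service: $1" >&2; return 1 ;;
  esac
}

port_for graphiti-mcp   # prints 8000
```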
## 🔧 Configuration

### Environment Variables

| Variable | Required | Default | Description |
|---|---|---|---|
| `OPENAI_API_KEY` | ✅ | - | Your OpenAI API key |
| `MODEL_NAME` | ❌ | `gpt-4o` | OpenAI model to use |
| `OPENAI_BASE_URL` | ❌ | - | Custom OpenAI endpoint |
| `NEO4J_URI` | ❌ | `bolt://neo4j:7687` | Neo4j connection URI |
| `NEO4J_USER` | ❌ | `neo4j` | Neo4j username |
| `NEO4J_PASSWORD` | ❌ | `demodemo` | Neo4j password |
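The defaulting behavior in this table can be reproduced in shell with parameter expansion. A sketch of how a startup script might apply it — this is illustrative, not the server's actual startup code:

```shell
# Apply the documented defaults when a variable is unset (illustrative).
: "${MODEL_NAME:=gpt-4o}"
: "${NEO4J_URI:=bolt://neo4j:7687}"
: "${NEO4J_USER:=neo4j}"
: "${NEO4J_PASSWORD:=demodemo}"

# OPENAI_API_KEY has no default and must be provided.
[ -n "$OPENAI_API_KEY" ] || echo "OPENAI_API_KEY is required" >&2

echo "model=$MODEL_NAME uri=$NEO4J_URI user=$NEO4J_USER"
```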
### Neo4j Settings

Default configuration for Neo4j:

- **Username:** `neo4j`
- **Password:** `demodemo`
- **URI:** `bolt://neo4j:7687` (within Docker network)
## 🔌 MCP Integration

With stdio transport:

```json
{
  "mcpServers": {
    "Graphiti": {
      "command": "uv",
      "args": ["run", "graphiti_mcp_server.py"],
      "env": {
        "OPENAI_API_KEY": "your_key_here"
      }
    }
  }
}
```
With SSE transport:

```json
{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```
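Before pointing a client at a config like this, it can be worth confirming the JSON parses. One way from the shell, using Python's stdlib parser — the `mcp-config-check.json` file name is illustrative:

```shell
# Write the SSE config and validate it with Python's JSON parser.
cat > mcp-config-check.json <<'EOF'
{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
EOF

python3 -m json.tool mcp-config-check.json > /dev/null \
  && echo "config is valid JSON" \
  || echo "config is invalid JSON"
```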
Cursor rules are available in `graphiti_cursor_rules.mdc`.

The server supports standard MCP transports:

- SSE: `http://localhost:8000/sse`
- WebSocket: `ws://localhost:8000/ws`
## 🛠️ Development

Install dependencies:

```bash
# Using uv (recommended)
curl -LsSf https://astral.sh/uv/install.sh | sh
uv sync

# Or using pip
pip install -r requirements.txt
```
Start a local Neo4j instance:

```bash
docker run -d \
  --name neo4j-dev \
  -p 7474:7474 -p 7687:7687 \
  -e NEO4J_AUTH=neo4j/demodemo \
  neo4j:5.26.0
```
Run the server:

```bash
# Set environment variables
export OPENAI_API_KEY=your_key
export NEO4J_URI=bolt://localhost:7687

# Run with stdio transport
uv run graphiti_mcp_server.py

# Or with SSE transport
uv run graphiti_mcp_server.py --transport sse --use-custom-entities
```
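The two invocation styles above can be wrapped in a tiny launcher script. A sketch — the wrapper itself is hypothetical (the flags come from the commands above), and it echoes the command instead of executing it so the snippet stays side-effect free:

```shell
# Hypothetical launcher selecting a transport for local development.
TRANSPORT="${1:-stdio}"
case "$TRANSPORT" in
  stdio) ARGS="" ;;
  sse)   ARGS="--transport sse --use-custom-entities" ;;
  *)     echo "unknown transport: $TRANSPORT" >&2; exit 1 ;;
esac
echo "uv run graphiti_mcp_server.py $ARGS"
```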
Test the setup:

```bash
# Run basic connectivity test
curl http://localhost:8000/health

# Test MCP endpoint
curl http://localhost:8000/sse
```
## 🔍 Troubleshooting

If services fail to start:

```bash
# Clean up and restart
docker compose down -v
docker compose up --build

# Check disk space
docker system df
```
To inspect logs:

```bash
# View all logs
docker compose logs -f

# View specific service logs
docker compose logs -f graphiti-mcp
docker compose logs -f neo4j

# Enable debug logging (docker compose up has no -e flag,
# so pass the variable through the environment instead)
LOG_LEVEL=DEBUG docker compose up
```
You can also set `LOG_LEVEL` in `docker-compose.yml`.
## 🏗️ Architecture

```text
┌─────────────────┐    ┌──────────────────┐    ┌─────────────────┐
│   MCP Client    │    │   Graphiti MCP   │    │     Neo4j       │
│    (Cursor)     │◄──►│     Server       │◄──►│    Database     │
│                 │    │   (Port 8000)    │    │   (Port 7687)   │
└─────────────────┘    └──────────────────┘    └─────────────────┘
                                │
                                ▼
                       ┌──────────────────┐
                       │   OpenAI API     │
                       │   (LLM Client)   │
                       └──────────────────┘
```
## 🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## 📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

---

Need help? Open an issue or check the Troubleshooting section above.