
Cognee
Local AI knowledge management server for Claude Desktop
cognee-mcp - Run cognee's memory engine as a Model Context Protocol server
Demo · Learn more · Join Discord · Join r/AIMemory
Build memory for Agents and query from any client that speaks MCP – in your terminal or IDE.
Please refer to our documentation here for further information.
git clone https://github.com/topoteretes/cognee.git
cd cognee/cognee-mcp
pip install uv
uv sync --dev --all-extras --reinstall
source .venv/bin/activate
export LLM_API_KEY="YOUR_OPENAI_API_KEY"
python src/server.py
or stream responses over SSE
python src/server.py --transport sse
or run with Streamable HTTP transport (recommended for web deployments)
python src/server.py --transport http --host 127.0.0.1 --port 8000 --path /mcp
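Once the server is running, you can sanity-check it from any MCP client. Below is a minimal sketch using the official mcp Python SDK over stdio; it assumes you run it from the cognee-mcp directory with the virtual environment active and LLM_API_KEY exported.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server as a stdio subprocess.
server_params = StdioServerParameters(command="python", args=["src/server.py"])

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            # Expect tools such as cognify, codify, search, list_data, delete, prune.
            print([tool.name for tool in tools.tools])

asyncio.run(main())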
You can apply more advanced configuration by creating a .env file from our template. To use different LLM providers or database configurations, and for more information, check out our documentation.
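As a starting point, a minimal .env needs only the API key. Variable names beyond LLM_API_KEY depend on your chosen provider and databases, so treat the commented lines below as placeholders and take the exact names from the template:

LLM_API_KEY="YOUR_OPENAI_API_KEY"
# Optional: LLM provider / model settings - see the .env template for exact names
# Optional: graph / vector / relational database settings - see the documentation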
If you’d rather run cognee-mcp in a container, you have two options: pull the prebuilt image from Docker Hub or build it locally. In both cases you’ll need a .env file containing only your LLM_API_KEY (and your chosen settings).

Remove any old image and rebuild (if building locally):

docker rmi cognee/cognee-mcp:main || true
docker build --no-cache -f cognee-mcp/Dockerfile -t cognee/cognee-mcp:main .

Then run the container with the transport you need:

# For HTTP transport (recommended for web deployments)
docker run -e TRANSPORT_MODE=http --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main

# For SSE transport
docker run -e TRANSPORT_MODE=sse --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main

# For stdio transport (default)
docker run -e TRANSPORT_MODE=stdio --env-file ./.env --rm -it cognee/cognee-mcp:main
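If you prefer Docker Compose, the HTTP-transport run command above translates to a short docker-compose.yml. This is a sketch under assumptions (the service name and file layout are illustrative, not part of the repo):

services:
  cognee-mcp:
    image: cognee/cognee-mcp:main
    env_file:
      - ./.env
    environment:
      - TRANSPORT_MODE=http
    ports:
      - "8000:8000"

Start it with docker compose up.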
Docker uses environment variables, not command-line arguments:

-e TRANSPORT_MODE=http   ✓ works
--transport http         ✗ won't work

Direct Python usage uses command-line arguments, not environment variables:

python src/server.py --transport http   ✓ works
-e TRANSPORT_MODE=http                   ✗ won't work

After starting your Cognee MCP server with Docker, you need to configure your MCP client to connect to it.
Start the server with SSE transport:
docker run -e TRANSPORT_MODE=sse --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
Configure your MCP client:
claude mcp add cognee-sse -t sse http://localhost:8000/sse
Verify the connection:
claude mcp list
You should see your server connected:
Checking MCP server health...
cognee-sse: http://localhost:8000/sse (SSE) - ✓ Connected
Claude (~/.claude.json):

{
  "mcpServers": {
    "cognee": {
      "type": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}
Cursor (~/.cursor/mcp.json):

{
  "mcpServers": {
    "cognee-sse": {
      "url": "http://localhost:8000/sse"
    }
  }
}
Start the server with HTTP transport:
docker run -e TRANSPORT_MODE=http --env-file ./.env -p 8000:8000 --rm -it cognee/cognee-mcp:main
Configure your MCP client:
claude mcp add cognee-http -t http http://localhost:8000/mcp
Verify the connection:
claude mcp list
You should see your server connected:
Checking MCP server health...
cognee-http: http://localhost:8000/mcp (HTTP) - ✓ Connected
Claude (~/.claude.json):

{
  "mcpServers": {
    "cognee": {
      "type": "http",
      "url": "http://localhost:8000/mcp"
    }
  }
}
Cursor (~/.cursor/mcp.json):

{
  "mcpServers": {
    "cognee-http": {
      "url": "http://localhost:8000/mcp"
    }
  }
}
You can configure both transports simultaneously for testing:
{ "mcpServers": { "cognee-sse": { "type": "sse", "url": "http://localhost:8000/sse" }, "cognee-http": { "type": "http", "url": "http://localhost:8000/mcp" } } }
Note: Only enable the server you're actually running to avoid connection errors.
The MCP server exposes its functionality through tools. Call them from any MCP client (Cursor, Claude Desktop, Cline, Roo and more).
cognify: Turns your data into a structured knowledge graph and stores it in memory
codify: Analyzes a code repository, builds a code graph, and stores it in memory
search: Queries memory – supports GRAPH_COMPLETION, RAG_COMPLETION, CODE, CHUNKS, INSIGHTS
list_data: Lists all datasets and their data items, with IDs for deletion operations
delete: Deletes specific data from a dataset (supports soft and hard deletion modes)
prune: Resets cognee for a fresh start (removes all data)
cognify_status / codify_status: Track pipeline progress
Data Management Examples:
# List all available datasets and data items
list_data()

# List data items in a specific dataset
list_data(dataset_id="your-dataset-id-here")

# Delete specific data (soft deletion - safer, preserves shared entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="soft")

# Delete specific data (hard deletion - removes orphaned entities)
delete(data_id="data-uuid", dataset_id="dataset-uuid", mode="hard")
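From code, the same tools can be invoked with call_tool on an initialized MCP client session (see the stdio sketch earlier). The UUIDs are placeholders; argument names follow the examples above:

# Inside an initialized ClientSession:
result = await session.call_tool("list_data", {})
print(result.content)

# Soft-delete one item, using IDs taken from a prior list_data call.
result = await session.call_tool(
    "delete",
    {"data_id": "data-uuid", "dataset_id": "dataset-uuid", "mode": "soft"},
)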
To use the debugger, run:

mcp dev src/server.py
Open the inspector with a timeout passed as a query parameter:
http://localhost:5173?timeout=120000
To apply new changes while developing cognee, run:
uv sync --dev --all-extras --reinstall
mcp dev src/server.py
To use a local cognee build, uncomment the following line in the cognee-mcp pyproject.toml file and set the cognee root path:

#"cognee[postgres,codegraph,gemini,huggingface,docs,neo4j] @ file:/Users/<username>/Desktop/cognee"

Remember to replace file:/Users/<username>/Desktop/cognee with your actual cognee root path.
Then install dependencies with uv in the mcp folder:
uv sync --reinstall
We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.