CodeAlive
Deep context provider for AI clients with codebase semantic search and understanding.
Connect your AI assistant to CodeAlive's powerful code understanding platform in seconds!
This MCP (Model Context Protocol) server enables AI clients like Claude Code, Cursor, Claude Desktop, Continue, VS Code (GitHub Copilot), Cline, Codex, OpenCode, Qwen Code, Gemini CLI, Roo Code, Goose, Kilo Code, Windsurf, Kiro, Qoder, and Amazon Q Developer to access CodeAlive's advanced semantic code search and codebase interaction features.
The most accurate and comprehensive Context Engine as a service, optimized for large codebases, powered by advanced GraphRAG and accessible via MCP. It enriches the context for AI agents like Cursor, Claude Code, Codex, etc., making them 35% more efficient and up to 84% faster.
It's like Context7, but for your (large) codebases.
Once connected, your AI coding agent has access to these powerful tools:
get_data_sources - List your indexed repositories and workspaces
codebase_search - Semantic code search across your indexed codebase (main/master branch)
codebase_consultant - AI consultant with full project expertise
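For reference, MCP clients call these tools over JSON-RPC behind the scenes. A minimal sketch of what a codebase_search call might look like on the wire (the "query" argument name is illustrative, not taken from this README):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "codebase_search",
    "arguments": {
      "query": "where is request authentication handled?"
    }
  }
}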
After setup, try these tools with your AI assistant: get_data_sources, codebase_search, and codebase_consultant.

The fastest way to get started - no installation required! Our remote MCP server at https://mcp.codealive.ai/api provides instant access to CodeAlive's capabilities.
Select your preferred AI client below for instant setup:
You may ask your AI agent to install the CodeAlive MCP server for you by pasting a prompt like this:
Here is the CodeAlive API key: PASTE_YOUR_API_KEY_HERE
Add the CodeAlive MCP server by following the installation guide from the README at https://raw.githubusercontent.com/CodeAlive-AI/codealive-mcp/main/README.md
Find the section "AI Client Integrations" and locate your client (Claude Code, Cursor, Gemini CLI, etc.). Each client has specific setup instructions:
- For Gemini CLI: Use the one-command setup with `gemini mcp add`
- For Claude Code: Use `claude mcp add` with the --transport http flag
- For other clients: Follow the configuration snippets provided
Prefer the Remote HTTP option when available. If no API key is provided above, help me obtain a CodeAlive API key first.
Then approve the commands your agent proposes to run.
Option 1: Remote HTTP (Recommended)
claude mcp add --transport http codealive https://mcp.codealive.ai/api --header "Authorization: Bearer YOUR_API_KEY_HERE"
Option 2: Docker (STDIO)
claude mcp add codealive-docker -- /usr/bin/docker run --rm -i -e CODEALIVE_API_KEY=YOUR_API_KEY_HERE ghcr.io/codealive-ai/codealive-mcp:v0.3.0
Replace YOUR_API_KEY_HERE with your actual API key.
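To confirm the server was added, list your configured MCP servers:

claude mcp list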
Option 1: Remote HTTP (Recommended)
Open Cursor Settings (Cmd+, or Ctrl+,) and add this to your MCP configuration:

{
  "mcpServers": {
    "codealive": {
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
Option 2: Docker (STDIO)
{ "mcpServers": { "codealive": { "command": "docker", "args": [ "run", "--rm", "-i", "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE", "ghcr.io/codealive-ai/codealive-mcp:v0.3.0" ] } } }
OpenAI Codex CLI supports MCP via ~/.codex/config.toml.
~/.codex/config.toml (Docker stdio – recommended)
[mcp_servers.codealive]
command = "docker"
args = ["run", "--rm", "-i", "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE", "ghcr.io/codealive-ai/codealive-mcp:v0.3.0"]
Experimental: Streamable HTTP (requires experimental_use_rmcp_client)
Note: Streamable HTTP support requires enabling the experimental Rust MCP client in your Codex configuration.
[mcp_servers.codealive]
url = "https://mcp.codealive.ai/api"
headers = { Authorization = "Bearer YOUR_API_KEY_HERE" }
One-command setup:
gemini mcp add --transport http codealive https://mcp.codealive.ai/api --header "Authorization: Bearer YOUR_API_KEY_HERE"
Replace YOUR_API_KEY_HERE with your actual API key. That's it - no config files needed! 🎉
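To double-check the entry, you can list Gemini CLI's configured MCP servers (assuming a recent Gemini CLI release that includes the mcp subcommands):

gemini mcp list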
Option 1: Remote HTTP (Recommended)
Add to .continue/config.yaml in your project or ~/.continue/config.yaml:

mcpServers:
  - name: CodeAlive
    type: streamable-http
    url: https://mcp.codealive.ai/api
    requestOptions:
      headers:
        Authorization: "Bearer YOUR_API_KEY_HERE"
Option 2: Docker (STDIO)
mcpServers:
  - name: CodeAlive
    type: stdio
    command: docker
    args:
      - run
      - --rm
      - -i
      - -e
      - CODEALIVE_API_KEY=YOUR_API_KEY_HERE
      - ghcr.io/codealive-ai/codealive-mcp:v0.3.0
Option 1: Remote HTTP (Recommended)
Note: VS Code supports both Streamable HTTP and SSE transports, with automatic fallback to SSE if Streamable HTTP fails.
Open the Command Palette (Ctrl+Shift+P or Cmd+Shift+P) to add an MCP server, or use this configuration:

{
  "servers": {
    "codealive": {
      "type": "http",
      "url": "https://mcp.codealive.ai/api",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
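If you prefer the command line, the same remote server can be registered with VS Code's --add-mcp flag; this mirrors the self-hosted example later in this README, with the hosted URL substituted:

code --add-mcp "{\"name\":\"codealive\",\"type\":\"http\",\"url\":\"https://mcp.codealive.ai/api\",\"headers\":{\"Authorization\":\"Bearer YOUR_API_KEY_HERE\"}}"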
Option 2: Docker (STDIO)
Create .vscode/mcp.json in your workspace:
{ "servers": { "codealive": { "command": "docker", "args": [ "run", "--rm", "-i", "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE", "ghcr.io/codealive-ai/codealive-mcp:v0.3.0" ] } } }
Note: Claude Desktop remote MCP requires OAuth authentication. Use Docker option for Bearer token support.
Docker (STDIO)
Edit your config file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

Add this configuration:
{ "mcpServers": { "codealive": { "command": "docker", "args": [ "run", "--rm", "-i", "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE", "ghcr.io/codealive-ai/codealive-mcp:v0.3.0" ] } } }
Option 1: Remote HTTP (Recommended)
{ "mcpServers": { "codealive": { "url": "https://mcp.codealive.ai/api", "headers": { "Authorization": "Bearer YOUR_API_KEY_HERE" } } } }
Option 2: Docker (STDIO)
{ "mcpServers": { "codealive": { "command": "docker", "args": [ "run", "--rm", "-i", "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE", "ghcr.io/codealive-ai/codealive-mcp:v0.3.0" ] } } }
Add CodeAlive as a remote MCP server in your opencode.json.
{ "$schema": "https://opencode.ai/config.json", "mcp": { "codealive": { "type": "remote", "url": "https://mcp.codealive.ai/api", "enabled": true, "headers": { "Authorization": "Bearer YOUR_API_KEY_HERE" } } } }
Qwen Code supports MCP via mcpServers in its settings.json and multiple transports (stdio/SSE/streamable-http). Use streamable-http when available; otherwise use Docker (stdio).
~/.qwen/settings.json (Streamable HTTP)
{ "mcpServers": { "codealive": { "type": "streamable-http", "url": "https://mcp.codealive.ai/api", "requestOptions": { "headers": { "Authorization": "Bearer YOUR_API_KEY_HERE" } } } } }
Fallback: Docker (stdio)
{ "mcpServers": { "codealive": { "type": "stdio", "command": "docker", "args": ["run", "--rm", "-i", "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE", "ghcr.io/codealive-ai/codealive-mcp:v0.3.0"] } } }
Roo Code reads a JSON settings file similar to Cline.
Global config: mcp_settings.json (Roo) or cline_mcp_settings.json (Cline-style)
Option A — Remote HTTP
{ "mcpServers": { "codealive": { "type": "streamable-http", "url": "https://mcp.codealive.ai/api", "headers": { "Authorization": "Bearer YOUR_API_KEY_HERE" } } } }
Option B — Docker (STDIO)
{ "mcpServers": { "codealive": { "type": "stdio", "command": "docker", "args": [ "run", "--rm", "-i", "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE", "ghcr.io/codealive-ai/codealive-mcp:v0.3.0" ] } } }
Tip: If your Roo build doesn't honor HTTP headers, use the Docker/STDIO option.
UI path: Settings → MCP Servers → Add → choose Streamable HTTP
Streamable HTTP configuration:
Name: codealive
Endpoint URL: https://mcp.codealive.ai/api
Header: Authorization: Bearer YOUR_API_KEY_HERE

Docker (STDIO) alternative:
Add a STDIO extension with:
Command: docker
Arguments: run --rm -i -e CODEALIVE_API_KEY=YOUR_API_KEY_HERE ghcr.io/codealive-ai/codealive-mcp:v0.3.0

UI path: Manage → Integrations → Model Context Protocol (MCP) → Add Server
HTTP
{ "mcpServers": { "codealive": { "type": "streamable-http", "url": "https://mcp.codealive.ai/api", "headers": { "Authorization": "Bearer YOUR_API_KEY_HERE" } } } }
STDIO (Docker)
{ "mcpServers": { "codealive": { "type": "stdio", "command": "docker", "args": [ "run", "--rm", "-i", "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE", "ghcr.io/codealive-ai/codealive-mcp:v0.3.0" ] } } }
File: ~/.codeium/windsurf/mcp_config.json
{ "mcpServers": { "codealive": { "type": "streamable-http", "serverUrl": "https://mcp.codealive.ai/api", "headers": { "Authorization": "Bearer YOUR_API_KEY_HERE" } } } }
Note: Kiro does not yet support remote MCP servers natively. Use the mcp-remote workaround to connect to remote HTTP servers.
Prerequisites:
npm install -g mcp-remote
UI path: Settings → MCP → Add Server
Global file: ~/.kiro/settings/mcp.json
Workspace file: .kiro/settings/mcp.json
Remote HTTP (via mcp-remote workaround)
{ "mcpServers": { "codealive": { "type": "stdio", "command": "npx", "args": [ "mcp-remote", "https://mcp.codealive.ai/api", "--header", "Authorization: Bearer ${CODEALIVE_API_KEY}" ], "env": { "CODEALIVE_API_KEY": "YOUR_API_KEY_HERE" } } } }
Docker (STDIO)
{ "mcpServers": { "codealive": { "type": "stdio", "command": "docker", "args": [ "run", "--rm", "-i", "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE", "ghcr.io/codealive-ai/codealive-mcp:v0.3.0" ] } } }
UI path: User icon → Qoder Settings → MCP → My Servers → + Add (Agent mode)
SSE (remote HTTP)
{ "mcpServers": { "codealive": { "type": "sse", "url": "https://mcp.codealive.ai/api", "headers": { "Authorization": "Bearer YOUR_API_KEY_HERE" } } } }
STDIO (Docker)
{ "mcpServers": { "codealive": { "type": "stdio", "command": "docker", "args": [ "run", "--rm", "-i", "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE", "ghcr.io/codealive-ai/codealive-mcp:v0.3.0" ] } } }
Q Developer CLI
Config file: ~/.aws/amazonq/mcp.json or workspace .amazonq/mcp.json
HTTP server
{ "mcpServers": { "codealive": { "type": "http", "url": "https://mcp.codealive.ai/api", "headers": { "Authorization": "Bearer YOUR_API_KEY_HERE" } } } }
STDIO (Docker)
{ "mcpServers": { "codealive": { "type": "stdio", "command": "docker", "args": [ "run", "--rm", "-i", "-e", "CODEALIVE_API_KEY=YOUR_API_KEY_HERE", "ghcr.io/codealive-ai/codealive-mcp:v0.3.0" ] } } }
Q Developer IDE (VS Code / JetBrains)
Global: ~/.aws/amazonq/agents/default.json
Local (workspace): .aws/amazonq/agents/default.json
Minimal entry (HTTP):
{ "mcpServers": { "codealive": { "type": "http", "url": "https://mcp.codealive.ai/api", "headers": { "Authorization": "Bearer YOUR_API_KEY_HERE" }, "timeout": 60000 } } }
Use the IDE UI: Q panel → Chat → tools icon → Add MCP Server → choose http or stdio.
Note: JetBrains AI Assistant requires the mcp-remote workaround for connecting to remote HTTP MCP servers.
Prerequisites:
npm install -g mcp-remote
Config file: Settings/Preferences → AI Assistant → Model Context Protocol → Configure
Add this configuration:
{ "mcpServers": { "codealive": { "command": "npx", "args": [ "mcp-remote", "https://mcp.codealive.ai/api", "--header", "Authorization: Bearer ${CODEALIVE_API_KEY}" ], "env": { "CODEALIVE_API_KEY": "YOUR_API_KEY_HERE" } } } }
For self-hosted deployments, replace the URL:
{ "mcpServers": { "codealive": { "command": "npx", "args": [ "mcp-remote", "http://your-server:8000/api", "--header", "Authorization: Bearer ${CODEALIVE_API_KEY}" ], "env": { "CODEALIVE_API_KEY": "YOUR_API_KEY_HERE" } } } }
See JetBrains MCP Documentation for more details.
For developers who want to customize or contribute to the MCP server.
# Clone the repository
git clone https://github.com/CodeAlive-AI/codealive-mcp.git
cd codealive-mcp

# Setup with uv (recommended)
uv venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
uv pip install -e .

# Or setup with pip
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -e .
Once installed locally, configure your AI client to use the local server:
claude mcp add codealive-local --env CODEALIVE_API_KEY=YOUR_API_KEY_HERE -- /path/to/codealive-mcp/.venv/bin/python /path/to/codealive-mcp/src/codealive_mcp_server.py
Replace the Docker command and args with:
{ "command": "/path/to/codealive-mcp/.venv/bin/python", "args": ["/path/to/codealive-mcp/src/codealive_mcp_server.py"], "env": { "CODEALIVE_API_KEY": "YOUR_API_KEY_HERE" } }
# Start local HTTP server
export CODEALIVE_API_KEY="your_api_key_here"
python src/codealive_mcp_server.py --transport http --host localhost --port 8000

# Test health endpoint
curl http://localhost:8000/health
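Beyond the health check, you can exercise the MCP endpoint with a bare initialize handshake. This is only a sketch: it assumes the local server exposes the protocol at /api (matching the hosted and self-hosted URLs elsewhere in this README) and speaks the standard Streamable HTTP transport:

curl -X POST http://localhost:8000/api \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-06-18","capabilities":{},"clientInfo":{"name":"curl-check","version":"0.1"}}}'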
Auto-install for Claude Desktop via Smithery:
npx -y @smithery/cli install @CodeAlive-AI/codealive-mcp --client claude
Repo: https://github.com/akolotov/gemini-cli-codealive-extension
Gemini CLI extension that wires CodeAlive into your terminal with prebuilt slash commands and MCP config. It includes:
GEMINI.md guidance so Gemini knows how to use CodeAlive tools effectively
Slash commands: /codealive:chat, /codealive:find, /codealive:search

Install
gemini extensions install https://github.com/akolotov/gemini-cli-codealive-extension
Configure
# Option 1: .env next to where you run `gemini`
CODEALIVE_API_KEY="your_codealive_api_key_here"

# Option 2: environment variable
export CODEALIVE_API_KEY="your_codealive_api_key_here"
gemini
Deploy the MCP server as an HTTP service for team-wide access or integration with self-hosted CodeAlive instances.
The CodeAlive MCP server can be deployed as an HTTP service using Docker. This allows multiple AI clients to connect to a single shared instance, and enables integration with self-hosted CodeAlive deployments.
Create a docker-compose.yml file based on our example:
# Download the example
curl -O https://raw.githubusercontent.com/CodeAlive-AI/codealive-mcp/main/docker-compose.example.yml
mv docker-compose.example.yml docker-compose.yml

# Edit configuration (see below)
nano docker-compose.yml

# Start the service
docker compose up -d

# Check health
curl http://localhost:8000/health
Configuration Options:
For CodeAlive Cloud (default):
CODEALIVE_BASE_URL environment variable: leave unset (uses the default https://app.codealive.ai)
API key: passed by each client in the Authorization: Bearer YOUR_KEY header

For Self-Hosted CodeAlive:
Set CODEALIVE_BASE_URL to your CodeAlive instance URL (e.g., https://codealive.yourcompany.com)
API key: passed by each client in the Authorization: Bearer YOUR_KEY header

See docker-compose.example.yml for the complete configuration template.
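For orientation, a docker-compose.yml for this HTTP deployment might look roughly like the sketch below. Treat it as an illustration only: the command override assumes the image's entrypoint is the MCP server and accepts the same --transport/--host/--port flags as the local server shown earlier; docker-compose.example.yml remains the authoritative template.

services:
  codealive-mcp:
    image: ghcr.io/codealive-ai/codealive-mcp:v0.3.0
    # Assumption: the image entrypoint runs the MCP server and accepts these flags
    command: ["--transport", "http", "--host", "0.0.0.0", "--port", "8000"]
    ports:
      - "8000:8000"
    environment:
      # Self-hosted CodeAlive only; omit to use the default https://app.codealive.ai
      - CODEALIVE_BASE_URL=https://codealive.yourcompany.com
    restart: unless-stopped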
Once deployed, configure your AI clients to use your HTTP endpoint:
Claude Code:
claude mcp add --transport http codealive http://your-server:8000/api --header "Authorization: Bearer YOUR_API_KEY_HERE"
VS Code:
code --add-mcp "{\"name\":\"codealive\",\"type\":\"http\",\"url\":\"http://your-server:8000/api\",\"headers\":{\"Authorization\":\"Bearer YOUR_API_KEY_HERE\"}}"
Cursor / Other Clients:
{ "mcpServers": { "codealive": { "url": "http://your-server:8000/api", "headers": { "Authorization": "Bearer YOUR_API_KEY_HERE" } } } }
Replace your-server:8000 with your actual deployment URL and port.
Test the hosted service:
curl https://mcp.codealive.ai/health
Check your API key:
curl -H "Authorization: Bearer YOUR_API_KEY" https://app.codealive.ai/api/v1/data_sources
Enable debug logging: Add --debug to local server args
MIT License - see LICENSE file for details.
Ready to supercharge your AI assistant with deep code understanding?
Get started now →