LangGraph Coding Agent
A team of coding agents, built with LangGraph and MCP, that produces multiple candidate implementations
This project implements a small team of coding agents using LangGraph and the Model Context Protocol (MCP). The agents use MCP servers to provide tools and capabilities through a unified gateway.
The overall objective of this agent team is to take requirements and code context and create multiple implementations of proposed features; human operators can then choose their preferred approach and proceed, discarding the others.
This project originated from the Anthropic MCP Hackathon in NYC on 12/11/2024 and has since evolved into its own standalone project.
The system consists of three main components:
MCP Gateway Server: A single entry point that launches the configured MCP servers and exposes their tools to the agents through one unified endpoint.
MCP Servers: Individual servers that provide specific capabilities; by default, a filesystem server and a knowledge-graph memory server.
Coding Agents: Three agents that collaborate to accomplish coding tasks, each producing its own implementation of the requested feature.
```bash
# Install the agent package
pip install -e .

# Install the gateway package
cd gateway
pip install -e .
cd ..
```
The agent supports multiple LLM providers through environment variables:
```bash
# LLM Configuration - supports multiple providers:
LLM_MODEL=provider/model-name

# Supported providers and example models:
# - Anthropic:  anthropic/claude-3-5-sonnet-20240620
# - OpenAI:     openai/gpt-4
# - OpenRouter: openrouter/openai/gpt-4o-mini
# - Google:     google/gemini-1.5-pro

# API keys for the different providers
OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
OPENROUTER_API_KEY=your_openrouter_api_key
GOOGLE_API_KEY=your_google_api_key

# OpenRouter configuration (if using OpenRouter)
OPENROUTER_BASE_URL=https://openrouter.ai/api/v1
```
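The `LLM_MODEL` value is a provider-prefixed model string. A minimal sketch of how such a string can be split (the helper `parse_llm_model` is hypothetical, not part of the project's code):

```python
def parse_llm_model(value: str) -> tuple[str, str]:
    """Split a provider-prefixed model string into (provider, model).

    OpenRouter itself routes to an upstream provider, so its model names
    keep an extra prefix (e.g. "openrouter/openai/gpt-4o-mini").
    """
    provider, _, model = value.partition("/")
    if not model:
        raise ValueError(f"expected 'provider/model-name', got {value!r}")
    return provider, model

print(parse_llm_model("anthropic/claude-3-5-sonnet-20240620"))
# → ('anthropic', 'claude-3-5-sonnet-20240620')
```

Splitting only on the first `/` is what keeps the OpenRouter case intact: `openrouter/openai/gpt-4o-mini` yields `("openrouter", "openai/gpt-4o-mini")`.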
The gateway server is configured through `gateway/config.json`. By default, it starts two MCP servers:
```json
{
  "mcp": {
    "servers": {
      "filesystem": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/directory"]
      },
      "memory": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-memory"]
      }
    }
  }
}
```
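Each entry under `mcp.servers` is a command line the gateway runs to launch that MCP server (presumably over stdio, as is typical for `npx`-launched servers). A small sketch of reading the config and listing what would be spawned; the JSON is embedded as a string here so the snippet runs standalone, whereas in the project it lives in `gateway/config.json`:

```python
import json

CONFIG = json.loads("""
{
  "mcp": {
    "servers": {
      "filesystem": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/directory"]
      },
      "memory": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-memory"]
      }
    }
  }
}
""")

# Print the command line each configured server would be launched with.
for name, spec in CONFIG["mcp"]["servers"].items():
    print(name, "->", spec["command"], *spec["args"])
```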
You can add more servers from the official MCP servers repository.
```bash
cd gateway
python -m mcp_gateway.server
```
The server will start on port 8808 by default.
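A quick way to confirm the gateway is up is to check that something accepts TCP connections on port 8808. This helper is a generic hypothetical check, not part of the gateway's API:

```python
import socket

def gateway_is_listening(host: str = "localhost", port: int = 8808) -> bool:
    """Return True if a TCP connection to the gateway port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=2):
            return True
    except OSError:
        return False

print(gateway_is_listening())
```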
The agent's connection to the gateway is configured in `langgraph.json`:
```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./src/react_agent/graph.py:graph"
  },
  "env": ".env",
  "mcp": {
    "gateway_url": "http://localhost:8808"
  }
}
```
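Two fields in this file do most of the work: `graphs.agent` points at the graph object (`path/to/module.py:variable`), and `mcp.gateway_url` tells the agent where the gateway listens. A sketch of pulling both out (the JSON is embedded as a string so the snippet is self-contained):

```python
import json

LANGGRAPH_JSON = """{
  "dependencies": ["."],
  "graphs": {"agent": "./src/react_agent/graph.py:graph"},
  "env": ".env",
  "mcp": {"gateway_url": "http://localhost:8808"}
}"""

config = json.loads(LANGGRAPH_JSON)
gateway_url = config["mcp"]["gateway_url"]
# The graph reference is "file path : variable name".
module_path, graph_name = config["graphs"]["agent"].split(":")
print(gateway_url, module_path, graph_name)
```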
Open the folder in LangGraph Studio! The agent will automatically connect to the gateway and discover the tools exposed by the MCP servers.
The agent has access to tools from both MCP servers:
From the filesystem server:

- `read_file`: Read file contents
- `write_file`: Create or update files
- `list_directory`: List directory contents
- `search_files`: Find files matching patterns

From the memory server:

- `create_entities`: Add entities to the knowledge graph
- `create_relations`: Link entities together
- `search_nodes`: Query the knowledge graph

Key files for customization:

- `gateway/config.json`
- `src/react_agent/prompts.py`
- `src/react_agent/graph.py`
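As an illustration, the memory tools take structured arguments. The shapes below follow the `@modelcontextprotocol/server-memory` schema as documented upstream; the field names may change between versions, and the entity/relation values are made up for this example:

```python
# Arguments for create_entities: a list of named entities, each with a
# type and free-form observation strings.
create_entities_args = {
    "entities": [
        {
            "name": "react_agent",
            "entityType": "component",
            "observations": ["LangGraph agent that calls MCP tools"],
        }
    ]
}

# Arguments for create_relations: directed edges between entity names.
create_relations_args = {
    "relations": [
        {"from": "react_agent", "to": "mcp_gateway", "relationType": "connects_to"}
    ]
}

print(create_entities_args["entities"][0]["name"])  # → react_agent
```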
This project is licensed under the MIT License - see the LICENSE file for details.