Chain of Draft Prompt
Powerful tool that enhances LLM reasoning by transforming prompts into Chain of Draft format.
The MCP Chain of Draft (CoD) Prompt Tool is a Model Context Protocol tool that enhances LLM reasoning by transforming standard prompts into either Chain of Draft (CoD) or Chain of Thought (CoT) format: the problem is restated so the model reasons in short, word-limited draft steps instead of verbose explanations. This approach significantly improves reasoning quality while reducing token usage and maintaining high accuracy.
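To make the difference concrete, here is a hypothetical TypeScript sketch of the kind of compact, word-limited reasoning trace a CoD-style prompt encourages, contrasted with a verbose chain-of-thought answer. The instruction wording, the `####` separator, and the trace format are illustrative assumptions, not the tool's actual output.

```typescript
// Hypothetical illustration only; the tool's actual prompt wording and output
// format may differ.
const codInstruction =
  "Think step by step, but keep each step to a short draft of a few words. " +
  "Return the final answer after '####'.";

// A verbose chain-of-thought answer spells out every step in full sentences...
const verboseCoT =
  "First, add the ones digits: 7 plus 4 is 11, so write 1 and carry 1. " +
  "Next, the tens digits: 4 plus 9 plus the carry is 14, write 4 and carry 1. " +
  "Finally, the hundreds digits: 2 plus 3 plus 1 is 6, so the answer is 641.";

// ...while a CoD-style draft keeps each step to a handful of tokens.
const compactCoD = [
  "247 + 394",
  "7 + 4 = 11, carry 1",
  "4 + 9 + 1 = 14, carry 1",
  "2 + 3 + 1 = 6",
  "#### 641",
].join("\n");
```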
This tool supports a "Bring Your Own LLM" approach, allowing you to use any language model of your choice:
Cloud Services
```bash
# For Anthropic Claude
export ANTHROPIC_API_KEY=your_key_here

# For OpenAI
export OPENAI_API_KEY=your_key_here

# For Mistral AI
export MISTRAL_API_KEY=your_key_here
```
Local Models with Ollama
```bash
# First install Ollama
curl https://ollama.ai/install.sh | sh

# Pull your preferred model
ollama pull llama2   # or ollama pull mistral, or any other model

# Configure the tool to use Ollama
export MCP_LLM_PROVIDER=ollama
export MCP_OLLAMA_MODEL=llama2   # or your chosen model
```
Custom Local Models
```bash
# Point to your local model API
export MCP_LLM_PROVIDER=custom
export MCP_CUSTOM_LLM_ENDPOINT=http://localhost:your_port
```
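For reference, the sketch below shows one way such environment-driven provider selection could be wired up. It is illustrative only, assumes nothing beyond the environment variables documented above, and is not the tool's actual implementation.

```typescript
// Illustrative sketch of environment-driven provider selection; not the tool's
// actual internals. Only the environment variables documented above are assumed.
type Provider = "anthropic" | "openai" | "mistral" | "ollama" | "custom";

interface LlmConfig {
  provider: Provider;
  model?: string;
  endpoint?: string;
}

function resolveLlmConfig(): LlmConfig {
  const provider = (process.env.MCP_LLM_PROVIDER ?? "anthropic") as Provider;

  switch (provider) {
    case "ollama":
      // Local Ollama server; model selected via MCP_OLLAMA_MODEL
      return {
        provider,
        model: process.env.MCP_OLLAMA_MODEL ?? "llama2",
        endpoint: "http://localhost:11434",
      };
    case "custom":
      // Any local model API reachable at MCP_CUSTOM_LLM_ENDPOINT
      return { provider, endpoint: process.env.MCP_CUSTOM_LLM_ENDPOINT };
    default:
      // Cloud providers read their own *_API_KEY environment variables
      return { provider };
  }
}
```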
This project implements the Chain of Draft (CoD) reasoning approach as a Model Context Protocol (MCP) prompt tool for Claude. The core Chain of Draft implementation is based on the work by stat-guy. We extend our gratitude for their pioneering work in developing this efficient reasoning approach.
Original Repository: https://github.com/stat-guy/chain-of-draft
- Core Chain of Draft Implementation
- Performance Analytics
- Adaptive Word Limits (see the sketch after this list)
- Comprehensive Example Database
- Format Enforcement
- Hybrid Reasoning Approaches
- OpenAI API Compatibility
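The adaptive word limits and format enforcement features refer to keeping each reasoning step within a small word budget (compare the `max_words_per_step` parameter in the client examples further below). A simplified, hypothetical sketch of what per-step enforcement could look like, not the tool's actual code, is:

```typescript
// Simplified, hypothetical sketch of per-step word-limit enforcement; the tool's
// actual adaptive-limit logic is not shown here.
function enforceWordLimit(step: string, maxWords: number): string {
  const words = step.trim().split(/\s+/);
  // Truncate any draft step that exceeds the configured word budget
  return words.length <= maxWords ? step : words.slice(0, maxWords).join(" ");
}

// Example: clamp every step of a reasoning trace to 5 words
const steps = ["7 + 4 = 11, carry 1", "add the tens column together with the carry"];
const clamped = steps.map((s) => enforceWordLimit(s, 5));
```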
To set up the Python implementation, install the dependencies:

```bash
pip install -r requirements.txt
```

Configure your API key in a `.env` file:

```
ANTHROPIC_API_KEY=your_api_key_here
```

Then start the server:

```bash
python server.py
```
To set up the JavaScript/TypeScript implementation, install the dependencies:

```bash
npm install
```

Configure your API key in a `.env` file:

```
ANTHROPIC_API_KEY=your_api_key_here
```

Build and start the server:

```bash
# Build TypeScript files using Nx
npm run nx build

# Start the server
npm start

# For development with auto-reload:
npm run dev
```
Available scripts:

- `npm run nx build`: Compiles TypeScript to JavaScript using the Nx build system
- `npm run build:sea`: Creates Single Executable Applications for all platforms
- `npm start`: Runs the compiled server from `dist`
- `npm test`: Runs the test query against the server
- `npm run dev`: Runs the TypeScript server directly using ts-node (useful for development)

The project uses Nx as its build system.
This project supports building Single Executable Applications (SEA) using Node.js 22+ and the @getlarge/nx-node-sea plugin. This allows you to create standalone executables that don't require Node.js to be installed on the target system.
The project includes several scripts for building SEA executables:
```bash
# Build for all platforms
npm run build:sea

# Build for specific platforms
npm run build:macos    # macOS
npm run build:linux    # Linux
npm run build:windows  # Windows
```
The project uses Nx for managing the build process. The SEA configuration is handled through the nx-node-sea plugin, which provides a streamlined way to create Node.js single executable applications.
Once built, the SEA executables can be found in the `dist` directory. They are standalone binaries and do not require Node.js to be installed on the target system.
For Claude Desktop integration with SEA executables, update your configuration to use the executable path:
{ "mcpServers": { "chain-of-draft-prompt-tool": { "command": "/path/to/mcp-chain-of-draft-prompt-tool", "env": { "ANTHROPIC_API_KEY": "your_api_key_here" } } } }
To integrate with Claude Desktop:
Install Claude Desktop from claude.ai/download
Create or edit the Claude Desktop config file:
~/Library/Application Support/Claude/claude_desktop_config.json
Add the tool configuration (Python version):
{ "mcpServers": { "chain-of-draft-prompt-tool": { "command": "python3", "args": ["/absolute/path/to/cod/server.py"], "env": { "ANTHROPIC_API_KEY": "your_api_key_here" } } } }
Or for the JavaScript version:
{ "mcpServers": { "chain-of-draft-prompt-tool": { "command": "node", "args": ["/absolute/path/to/cod/index.js"], "env": { "ANTHROPIC_API_KEY": "your_api_key_here" } } } }
Restart Claude Desktop
You can also use the Claude CLI to add the tool:
```bash
# For Python implementation
claude mcp add chain-of-draft-prompt-tool -e ANTHROPIC_API_KEY="your_api_key_here" "python3 /absolute/path/to/cod/server.py"

# For JavaScript implementation
claude mcp add chain-of-draft-prompt-tool -e ANTHROPIC_API_KEY="your_api_key_here" "node /absolute/path/to/cod/index.js"
```
Dive is an excellent open-source MCP Host Desktop Application that provides a user-friendly GUI for interacting with MCP tools like this one. It supports multiple LLMs including ChatGPT, Anthropic Claude, Ollama, and other OpenAI-compatible models.
Download and install Dive from their releases page
Configure the Chain of Draft tool in Dive's MCP settings:
{ "mcpServers": { "chain-of-draft-prompt-tool": { "command": "/path/to/mcp-chain-of-draft-prompt-tool", "enabled": true, "env": { "ANTHROPIC_API_KEY": "your_api_key_here" } } } }
If you're using the non-SEA version:
{ "mcpServers": { "chain-of-draft-prompt-tool": { "command": "node", "args": ["/path/to/dist/index.js"], "enabled": true, "env": { "ANTHROPIC_API_KEY": "your_api_key_here" } } } }
Using Dive provides a convenient way to interact with the Chain of Draft tool through a modern, feature-rich interface while maintaining all the benefits of the MCP protocol.
The project includes integration with the MCP Inspector tool, which provides a visual interface for testing and debugging MCP tools. This is especially useful during development or when you want to inspect the tool's behavior.
You can start the MCP Inspector using the provided npm script:
```bash
# Start the MCP Inspector with the tool
npm run test-inspector

# Or run it manually
npx @modelcontextprotocol/inspector -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY -- node dist/index.js
```
This starts the Inspector connected to the Chain of Draft server. The Inspector provides a visual interface for exploring the server's tools, invoking them with test inputs, and inspecting requests and responses, which makes it an invaluable aid for development and debugging. It is available at http://localhost:5173 by default.
The Chain of Draft server provides the following tools:
| Tool | Description |
|------|-------------|
| `chain_of_draft_solve` | Solve a problem using Chain of Draft reasoning |
| `math_solve` | Solve a math problem with CoD |
| `code_solve` | Solve a coding problem with CoD |
| `logic_solve` | Solve a logic problem with CoD |
| `get_performance_stats` | Get performance stats for CoD vs. CoT |
| `get_token_reduction` | Get token reduction statistics |
| `analyze_problem_complexity` | Analyze problem complexity |
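As one way to exercise these tools programmatically, the sketch below calls `chain_of_draft_solve` over stdio using the MCP TypeScript SDK (`@modelcontextprotocol/sdk`). The server path and the tool's argument names (`problem`, `domain`) are assumptions based on the examples elsewhere in this README and may need adjusting to your setup.

```typescript
// Minimal sketch: invoke chain_of_draft_solve through the MCP TypeScript SDK.
// The server entry point and argument names are assumptions; adjust them to
// match your local build and the tool's actual input schema.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the (non-SEA) JavaScript server over stdio
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/dist/index.js"],
    env: { ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY ?? "" },
  });

  const client = new Client({ name: "cod-example-client", version: "1.0.0" });
  await client.connect(transport);

  const result = await client.callTool({
    name: "chain_of_draft_solve",
    arguments: { problem: "Solve: 247 + 394 = ?", domain: "math" },
  });

  console.log(JSON.stringify(result, null, 2));
  await client.close();
}

main().catch(console.error);
```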
If you want to use the Chain of Draft client directly in your Python code:
```python
from client import ChainOfDraftClient

# Create client with a specific LLM provider
cod_client = ChainOfDraftClient(
    llm_provider="ollama",  # or "anthropic", "openai", "mistral", "custom"
    model_name="llama2"     # specify your model
)

# Use directly
result = await cod_client.solve_with_reasoning(
    problem="Solve: 247 + 394 = ?",
    domain="math"
)

print(f"Answer: {result['final_answer']}")
print(f"Reasoning: {result['reasoning_steps']}")
print(f"Tokens used: {result['token_count']}")
```
For TypeScript/Node.js applications:
```typescript
import { ChainOfDraftClient } from './lib/chain-of-draft-client';

// Create client with your preferred LLM
const client = new ChainOfDraftClient({
  provider: 'ollama',                  // or 'anthropic', 'openai', 'mistral', 'custom'
  model: 'llama2',                     // your chosen model
  endpoint: 'http://localhost:11434'   // for custom endpoints
});

// Use the client
async function solveMathProblem() {
  const result = await client.solveWithReasoning({
    problem: "Solve: 247 + 394 = ?",
    domain: "math",
    max_words_per_step: 5
  });

  console.log(`Answer: ${result.final_answer}`);
  console.log(`Reasoning: ${result.reasoning_steps}`);
  console.log(`Tokens used: ${result.token_count}`);
}

solveMathProblem();
```
The server is available in both Python and JavaScript implementations, each consisting of several integrated components.
Both implementations follow the same core principles and provide identical MCP tools, making them interchangeable for most use cases.
This project is open-source and available under the MIT license.