Rubber Duck
MCP server bridging multiple OpenAI-compatible LLMs for rubber duck debugging perspectives
An MCP (Model Context Protocol) server that acts as a bridge to query multiple OpenAI-compatible LLMs. Just like rubber duck debugging, explain your problems to various AI "ducks" and get different perspectives!
```
     __
   <(o )___
    ( ._> /
     `---'  Quack! Ready to debug!
```
Any provider with an OpenAI-compatible API endpoint, including:
👉 Complete Claude Desktop setup instructions below in Claude Desktop Configuration
```bash
npm install -g mcp-rubber-duck
```
```bash
# Clone the repository
git clone https://github.com/nesquikm/mcp-rubber-duck.git
cd mcp-rubber-duck

# Install dependencies
npm install

# Build the project
npm run build

# Run the server
npm start
```
Create a .env file in the project root:
```bash
# OpenAI
OPENAI_API_KEY=sk-...
OPENAI_DEFAULT_MODEL=gpt-4o-mini           # Optional: defaults to gpt-4o-mini

# Google Gemini
GEMINI_API_KEY=...
GEMINI_DEFAULT_MODEL=gemini-2.5-flash      # Optional: defaults to gemini-2.5-flash

# Groq
GROQ_API_KEY=gsk_...
GROQ_DEFAULT_MODEL=llama-3.3-70b-versatile # Optional: defaults to llama-3.3-70b-versatile

# Ollama (Local)
OLLAMA_BASE_URL=http://localhost:11434/v1  # Optional
OLLAMA_DEFAULT_MODEL=llama3.2              # Optional: defaults to llama3.2

# Together AI
TOGETHER_API_KEY=...

# Custom Providers (you can add multiple)
# Format: CUSTOM_{NAME}_* where NAME becomes the provider key (lowercase)
# Example: Add provider "myapi"
CUSTOM_MYAPI_API_KEY=...
CUSTOM_MYAPI_BASE_URL=https://api.example.com/v1
CUSTOM_MYAPI_DEFAULT_MODEL=custom-model    # Optional
CUSTOM_MYAPI_MODELS=model1,model2          # Optional: comma-separated list
CUSTOM_MYAPI_NICKNAME=My Custom Duck       # Optional: display name

# Example: Add provider "azure"
CUSTOM_AZURE_API_KEY=...
CUSTOM_AZURE_BASE_URL=https://mycompany.openai.azure.com/v1

# Global Settings
DEFAULT_PROVIDER=openai
DEFAULT_TEMPERATURE=0.7
LOG_LEVEL=info

# MCP Bridge Settings (Optional)
MCP_BRIDGE_ENABLED=true     # Enable ducks to access external MCP servers
MCP_APPROVAL_MODE=trusted   # always, trusted, or never
MCP_APPROVAL_TIMEOUT=300    # seconds

# MCP Server: Context7 Documentation (Example)
MCP_SERVER_CONTEXT7_TYPE=http
MCP_SERVER_CONTEXT7_URL=https://mcp.context7.com/mcp
MCP_SERVER_CONTEXT7_ENABLED=true

# Per-server trusted tools
MCP_TRUSTED_TOOLS_CONTEXT7=*  # Trust all Context7 tools

# Optional: Custom Duck Nicknames (Have fun with these!)
OPENAI_NICKNAME="DUCK-4"           # Optional: defaults to "GPT Duck"
GEMINI_NICKNAME="Duckmini"         # Optional: defaults to "Gemini Duck"
GROQ_NICKNAME="Quackers"           # Optional: defaults to "Groq Duck"
OLLAMA_NICKNAME="Local Quacker"    # Optional: defaults to "Local Duck"
CUSTOM_NICKNAME="My Special Duck"  # Optional: defaults to "Custom Duck"
```
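As a sketch of the `CUSTOM_{NAME}_*` convention described above, a hypothetical parser (not the project's actual config loader) could collect those variables into per-provider entries like this:

```javascript
// Hypothetical sketch: collect CUSTOM_{NAME}_* environment variables into
// provider configs, lowercasing NAME to form the provider key.
// Illustrates the naming convention only; not the server's real loader.
function parseCustomProviders(env) {
  const providers = {};
  const pattern = /^CUSTOM_([A-Z0-9]+)_(API_KEY|BASE_URL|DEFAULT_MODEL|MODELS|NICKNAME)$/;
  for (const [key, value] of Object.entries(env)) {
    const match = key.match(pattern);
    if (!match) continue;
    const name = match[1].toLowerCase(); // provider key becomes lowercase
    const field = match[2];
    providers[name] = providers[name] || {};
    if (field === "MODELS") {
      providers[name].models = value.split(","); // comma-separated list
    } else {
      // API_KEY -> apiKey, BASE_URL -> baseUrl, etc.
      const camel = field.toLowerCase().replace(/_([a-z])/g, (_, c) => c.toUpperCase());
      providers[name][camel] = value;
    }
  }
  return providers;
}
```

With the example above, `parseCustomProviders` would yield a `myapi` entry carrying `apiKey`, `baseUrl`, `models`, and `nickname`, while non-`CUSTOM_` variables are ignored.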
Note: Duck nicknames are completely optional! If you don't set them, you'll get the charming defaults (GPT Duck, Gemini Duck, etc.). If you use a config.json file, those nicknames take priority over environment variables.
Create a config/config.json file based on the example:
```bash
cp config/config.example.json config/config.json
# Edit config/config.json with your API keys and preferences
```
This is the most common setup method for using MCP Rubber Duck with Claude Desktop.
First, ensure the project is built:
```bash
# Clone the repository
git clone https://github.com/nesquikm/mcp-rubber-duck.git
cd mcp-rubber-duck

# Install dependencies and build
npm install
npm run build
```
Edit your Claude Desktop config file:
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`

Add the MCP server configuration:
```json
{
  "mcpServers": {
    "rubber-duck": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-rubber-duck/dist/index.js"],
      "env": {
        "MCP_SERVER": "true",
        "OPENAI_API_KEY": "your-openai-api-key-here",
        "OPENAI_DEFAULT_MODEL": "gpt-4o-mini",
        "GEMINI_API_KEY": "your-gemini-api-key-here",
        "GEMINI_DEFAULT_MODEL": "gemini-2.5-flash",
        "DEFAULT_PROVIDER": "openai",
        "LOG_LEVEL": "info"
      }
    }
  }
}
```
Important: Replace the placeholder API keys with your actual keys:
- `your-openai-api-key-here` → Your OpenAI API key (starts with `sk-`)
- `your-gemini-api-key-here` → Your Gemini API key from Google AI Studio

Note: `MCP_SERVER: "true"` is required - this tells rubber-duck to run as an MCP server for any MCP client (not related to the MCP Bridge feature).
Once restarted, test these commands in Claude:
- Use the `list_ducks` tool with `check_health: true` (should list each configured duck and its health status)
- Use the `list_models` tool
- Use the `ask_duck` tool with `prompt: "What is rubber duck debugging?"`, `provider: "openai"`
- Use the `compare_ducks` tool with `prompt: "Explain async/await in JavaScript"`
- Use the `ask_duck` tool with `prompt: "Hello"`, `provider: "openai"`, `model: "gpt-4"`
Run `ls -la dist/index.js` to confirm the project built successfully.

If ducks show as unhealthy:
The MCP Bridge allows your ducks to access tools from other MCP servers, extending their capabilities beyond just chat. Your ducks can now search documentation, access files, query APIs, and much more!
Note: This is different from the MCP server integration above:
- MCP Bridge (`MCP_BRIDGE_ENABLED`): Ducks USE external MCP servers as clients
- MCP Server (`MCP_SERVER`): Rubber-duck SERVES as an MCP server to any MCP client

Add these environment variables to enable MCP Bridge:
```bash
# Basic MCP Bridge Configuration
MCP_BRIDGE_ENABLED="true"    # Enable ducks to access external MCP servers
MCP_APPROVAL_MODE="trusted"  # always, trusted, or never
MCP_APPROVAL_TIMEOUT="300"   # 5 minutes

# Example: Context7 Documentation Server
MCP_SERVER_CONTEXT7_TYPE="http"
MCP_SERVER_CONTEXT7_URL="https://mcp.context7.com/mcp"
MCP_SERVER_CONTEXT7_ENABLED="true"

# Trust all Context7 tools (no approval needed)
MCP_TRUSTED_TOOLS_CONTEXT7="*"
```
- `always`: Every tool call requires approval (with session-based memory)
- `trusted`: Only untrusted tools require approval
- `never`: All tools execute immediately (use with caution)
Configure trust levels per MCP server for granular security:
```bash
# Trust all tools from Context7 (documentation server)
MCP_TRUSTED_TOOLS_CONTEXT7="*"

# Trust specific filesystem operations only
MCP_TRUSTED_TOOLS_FILESYSTEM="read-file,list-directory"

# Trust specific GitHub tools
MCP_TRUSTED_TOOLS_GITHUB="get-repo-info,list-issues"

# Global fallback for servers without specific config
MCP_TRUSTED_TOOLS="common-safe-tool"
```
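These settings read naturally as a per-server allowlist with a global fallback. A minimal hypothetical sketch of the matching logic (helper name and exact semantics are assumptions, not the project's code):

```javascript
// Hypothetical sketch of trusted-tool matching: "*" trusts every tool on a
// server; otherwise the comma-separated value is an exact-name allowlist.
// Falls back to the global MCP_TRUSTED_TOOLS when no per-server value exists.
function isTrusted(env, server, tool) {
  const perServer = env[`MCP_TRUSTED_TOOLS_${server.toUpperCase()}`];
  const list = perServer !== undefined ? perServer : env.MCP_TRUSTED_TOOLS;
  if (list === undefined) return false; // no config: require approval
  if (list === "*") return true;        // wildcard: trust everything
  return list.split(",").map((t) => t.trim()).includes(tool);
}
```

Under this reading, `isTrusted(env, "filesystem", "read-file")` would pass while `isTrusted(env, "filesystem", "delete-file")` would still require approval in `trusted` mode.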
Configure MCP servers using environment variables:
```bash
MCP_SERVER_{NAME}_TYPE="http"
MCP_SERVER_{NAME}_URL="https://api.example.com/mcp"
MCP_SERVER_{NAME}_API_KEY="your-api-key"  # Optional
MCP_SERVER_{NAME}_ENABLED="true"
```
```bash
MCP_SERVER_{NAME}_TYPE="stdio"
MCP_SERVER_{NAME}_COMMAND="python"
MCP_SERVER_{NAME}_ARGS="/path/to/script.py,--arg1,--arg2"
MCP_SERVER_{NAME}_ENABLED="true"
```
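The comma-separated `ARGS` value maps naturally onto the argv array handed to the stdio server process. A hypothetical sketch of that split (helper name assumed, not the project's code):

```javascript
// Hypothetical sketch: turn MCP_SERVER_{NAME}_COMMAND and the
// comma-separated MCP_SERVER_{NAME}_ARGS into a spawnable command + argv.
function buildStdioCommand(env, name) {
  const prefix = `MCP_SERVER_${name.toUpperCase()}_`;
  return {
    command: env[prefix + "COMMAND"],
    args: (env[prefix + "ARGS"] || "")
      .split(",")
      .filter((a) => a.length > 0), // empty ARGS -> no arguments
  };
}
```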
```bash
# Enable MCP Bridge
MCP_BRIDGE_ENABLED="true"
MCP_APPROVAL_MODE="trusted"

# Configure Context7 server
MCP_SERVER_CONTEXT7_TYPE="http"
MCP_SERVER_CONTEXT7_URL="https://mcp.context7.com/mcp"
MCP_SERVER_CONTEXT7_ENABLED="true"

# Trust all Context7 tools
MCP_TRUSTED_TOOLS_CONTEXT7="*"
```
Now your ducks can search and retrieve documentation from Context7:
Ask: "Can you find React hooks documentation from Context7 and return only the key concepts?"
Duck: *searches Context7 and returns focused, essential React hooks information*
Smart Token Management: Ducks can retrieve comprehensive data from MCP servers but return only the essential information you need, saving tokens in your host LLM conversations:
Example Workflow:
You: "Find Express.js routing concepts from Context7, keep it concise"
Duck: *Retrieves full Express docs, processes, and returns only routing essentials*
Result: 500 tokens instead of 5,000+ tokens of raw documentation
When using `always` mode, the system remembers your approvals within a session:

- First call to `search-docs` - Approve? ✅
- Later calls to `search-docs` run automatically (no new approval needed)
- First call to `get-examples` - Approve? ✅

This eliminates approval fatigue while maintaining security!
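The session memory described above can be sketched as a gate that remembers approved tool names (a hypothetical helper for illustration, not the actual implementation):

```javascript
// Hypothetical sketch: in `always` mode, each tool name is approved once per
// session; later calls to the same tool skip the prompt.
function makeApprovalGate(askUser) {
  const approved = new Set(); // session memory of approved tool names
  return function approve(tool) {
    if (approved.has(tool)) return true; // remembered: no new approval
    const ok = askUser(tool);            // e.g. show "Approve search-docs?"
    if (ok) approved.add(tool);
    return ok;
  };
}
```

Calling the gate twice for `search-docs` would prompt only once; a new tool like `get-examples` triggers a fresh prompt.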
Ask a single question to a specific LLM provider. When MCP Bridge is enabled, ducks can automatically access tools from connected MCP servers.
```json
{
  "prompt": "What is rubber duck debugging?",
  "provider": "openai",  // Optional, uses default if not specified
  "temperature": 0.7     // Optional
}
```
Have a conversation with context maintained across messages.
```json
{
  "conversation_id": "debug-session-1",
  "message": "Can you help me debug this code?",
  "provider": "groq"  // Optional, can switch providers mid-conversation
}
```
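Maintaining context across messages essentially means replaying the prior history keyed by `conversation_id`. A minimal hypothetical sketch of that idea (the real server does more, e.g. provider switching; this is only an illustration):

```javascript
// Hypothetical sketch: accumulate messages per conversation_id so each
// request carries the full history, which is what preserves context.
const conversations = new Map();

function appendMessage(conversationId, role, content) {
  if (!conversations.has(conversationId)) {
    conversations.set(conversationId, []);
  }
  const history = conversations.get(conversationId);
  history.push({ role, content });
  return history; // the array sent as `messages` to the provider
}
```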
Clear all conversation history and start fresh. Useful when switching topics or when context becomes too large.
```json
{
  // No parameters required
}
```
List all configured providers and their health status.
```json
{
  "check_health": true  // Optional, performs fresh health check
}
```
List available models for LLM providers.
```json
{
  "provider": "openai",   // Optional, lists all if not specified
  "fetch_latest": false   // Optional, fetch latest from API vs cached
}
```
Ask the same question to multiple providers simultaneously.
```json
{
  "prompt": "What's the best programming language?",
  "providers": ["openai", "groq", "ollama"]  // Optional, uses all if not specified
}
```
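Fanning one prompt out to several providers concurrently can be sketched with `Promise.allSettled`, so one failing duck doesn't sink the whole comparison (hypothetical helper, not the tool's actual code):

```javascript
// Hypothetical sketch: query every provider concurrently and collect
// per-provider results side by side, keeping errors instead of throwing.
async function compareDucks(prompt, providers, askProvider) {
  const results = await Promise.allSettled(
    providers.map((p) => askProvider(p, prompt))
  );
  return providers.map((provider, i) => ({
    provider,
    ok: results[i].status === "fulfilled",
    response:
      results[i].status === "fulfilled"
        ? results[i].value
        : String(results[i].reason),
  }));
}
```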
Get responses from all configured ducks - like a panel discussion!
```json
{
  "prompt": "How should I architect a microservices application?"
}
```
```javascript
// Ask the default duck
await ask_duck({ prompt: "Explain async/await in JavaScript" });
```
```javascript
// Start a conversation
await chat_with_duck({
  conversation_id: "learning-session",
  message: "What is TypeScript?"
});

// Continue the conversation
await chat_with_duck({
  conversation_id: "learning-session",
  message: "How does it differ from JavaScript?"
});
```
```javascript
// Get different perspectives
await compare_ducks({
  prompt: "What's the best way to handle errors in Node.js?",
  providers: ["openai", "groq", "ollama"]
});
```
```javascript
// Convene the council for important decisions
await duck_council({
  prompt: "Should I use REST or GraphQL for my API?"
});
```
```bash
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Pull a model
ollama pull llama3.2

# Ollama automatically provides an OpenAI-compatible endpoint at localhost:11434/v1
```
- Gemini: `GEMINI_API_KEY=...`
- Groq: `GROQ_API_KEY=gsk_...`
- Together AI: `TOGETHER_API_KEY=...`

To check if a provider is OpenAI-compatible:
- Look for a `/v1/chat/completions` endpoint in their API docs
- Test with curl:

```bash
curl -X POST "https://api.provider.com/v1/chat/completions" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "model-name",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```
```bash
npm run dev        # Run in development mode
npm test           # Run tests
npm run lint       # Lint
npm run typecheck  # Type check
```
MCP Rubber Duck provides multi-platform Docker support, working on macOS (Intel & Apple Silicon), Linux (x86_64 & ARM64), Windows (WSL2), and Raspberry Pi 3+.
The easiest way to get started is with our pre-built multi-architecture image:
```bash
# Pull the image (works on all platforms)
docker pull ghcr.io/nesquikm/mcp-rubber-duck:latest

# Create environment file
cp .env.template .env
# Edit .env and add your API keys

# Run with Docker Compose (recommended)
docker compose up -d
```
```bash
# Use desktop-optimized settings
./scripts/deploy.sh --platform desktop

# Or with more resources and local AI
./scripts/deploy.sh --platform desktop --profile with-ollama
```
```bash
# Use Pi-optimized settings (memory limits, etc.)
./scripts/deploy.sh --platform pi

# Or copy optimized config directly
cp .env.pi.example .env
# Edit .env and add your API keys
docker compose up -d
```
```bash
# Deploy to remote Raspberry Pi
./scripts/deploy.sh --mode ssh --ssh-host [email protected]
```
The scripts/deploy.sh script auto-detects your platform and applies optimal settings:
```bash
# Auto-detect platform and deploy
./scripts/deploy.sh

# Options:
./scripts/deploy.sh --help
```
Available options:
- `--mode`: docker (default), local, or ssh
- `--platform`: pi, desktop, or auto (default)
- `--profile`: lightweight, desktop, with-ollama
- `--ssh-host`: For remote deployment

```bash
# .env.pi.example - Optimized for Pi 3+
DOCKER_CPU_LIMIT=1.5
DOCKER_MEMORY_LIMIT=512M
NODE_OPTIONS=--max-old-space-size=256
```
```bash
# .env.desktop.example - Optimized for powerful systems
DOCKER_CPU_LIMIT=4.0
DOCKER_MEMORY_LIMIT=2G
NODE_OPTIONS=--max-old-space-size=1024
```
```bash
# Default profile (lightweight, good for Pi)
docker compose up -d

# Desktop profile (higher resource limits)
docker compose --profile desktop up -d

# With local Ollama AI
docker compose --profile with-ollama up -d
```
For developers who want to build and publish their own multi-architecture images:
```bash
# Build for AMD64 + ARM64
./scripts/build-multiarch.sh --platforms linux/amd64,linux/arm64

# Build and push to GitHub Container Registry
./scripts/gh-deploy.sh --public
```
Connect Claude Desktop to MCP Rubber Duck running on a remote system:
```json
{
  "mcpServers": {
    "rubber-duck-remote": {
      "command": "ssh",
      "args": [
        "user@remote-host",
        "docker exec -i mcp-rubber-duck node /app/dist/index.js"
      ]
    }
  }
}
```
| Platform | Architecture | Status | Notes | 
|---|---|---|---|
| macOS Intel | AMD64 | ✅ Full | Via Docker Desktop | 
| macOS Apple Silicon | ARM64 | ✅ Full | Native ARM64 support | 
| Linux x86_64 | AMD64 | ✅ Full | Direct Docker support | 
| Linux ARM64 | ARM64 | ✅ Full | Servers, Pi 4+ | 
| Raspberry Pi 3+ | ARM64 | ✅ Optimized | Memory-limited config | 
| Windows | AMD64 | ✅ Full | Via Docker Desktop + WSL2 | 
If you prefer not to use Docker Compose:
```bash
# Raspberry Pi
docker run -d \
  --name mcp-rubber-duck \
  --memory=512m --cpus=1.5 \
  --env-file .env \
  --restart unless-stopped \
  ghcr.io/nesquikm/mcp-rubber-duck:latest

# Desktop/Server
docker run -d \
  --name mcp-rubber-duck \
  --memory=2g --cpus=4 \
  --env-file .env \
  --restart unless-stopped \
  ghcr.io/nesquikm/mcp-rubber-duck:latest
```
```
mcp-rubber-duck/
├── src/
│   ├── server.ts           # MCP server implementation
│   ├── config/             # Configuration management
│   ├── providers/          # OpenAI client wrapper
│   ├── tools/              # MCP tool implementations
│   ├── services/           # Health, cache, conversations
│   └── utils/              # Logging, ASCII art
├── config/                 # Configuration examples
└── tests/                  # Test suites
```
- Check provider health with `list_ducks({ check_health: true })`
- Adjust the `max_retries` and `timeout` settings

🦆 Want to help make our duck pond better?
We love contributions! Whether you're fixing bugs, adding features, or teaching our ducks new tricks, we'd love to have you join the flock.
Check out our Contributing Guide to get started. We promise it's more fun than a regular contributing guide - it has ducks! 🦆
Quick start for contributors:
MIT License - see LICENSE file for details
See CHANGELOG.md for a detailed history of changes and releases.
MCP Rubber Duck is available through multiple channels:
- `io.github.nesquikm/rubber-duck`

🦆 Happy Debugging with your AI Duck Panel! 🦆