
# Cognition Wheel
AI reasoning server consulting multiple language models and synthesizing their responses.
A Model Context Protocol (MCP) server that implements a "wisdom of crowds" approach to AI reasoning by consulting multiple state-of-the-art language models in parallel and synthesizing their responses.
```bash
# Run directly with npx (no installation needed)
npx mcp-cognition-wheel

# Or install globally
npm install -g mcp-cognition-wheel
mcp-cognition-wheel
```
To install from source:

1. Install dependencies: `pnpm install`
2. Copy `.env.example` to `.env` and add your API keys
3. Build the project: `pnpm run build`
The Cognition Wheel follows a three-phase process:
1. Parallel Consultation: Simultaneously queries three different AI models, one each from Anthropic, Google, and OpenAI.
2. Anonymous Analysis: Uses code names (Alpha, Beta, Gamma) to eliminate bias during the synthesis phase.
3. Smart Synthesis: Randomly selects one of the models to act as a synthesizer, which analyzes all responses and produces a final, comprehensive answer.
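The flow can be sketched in a few lines of TypeScript. This is an illustration only, not the project's actual implementation; the model names and the `askModel` helper are placeholders:

```typescript
// Minimal sketch of the three-phase flow (illustrative, not the real code).
// `askModel` stands in for whatever function sends a prompt to one provider.
type ModelName = "claude" | "gemini" | "gpt";

async function cognitionWheel(
  askModel: (model: ModelName, prompt: string) => Promise<string>,
  prompt: string
): Promise<string> {
  const models: ModelName[] = ["claude", "gemini", "gpt"];

  // Phase 1: Parallel Consultation — query every model at the same time.
  const answers = await Promise.all(models.map((m) => askModel(m, prompt)));

  // Phase 2: Anonymous Analysis — hide real identities behind code names.
  const codeNames = ["Alpha", "Beta", "Gamma"];
  const anonymized = answers
    .map((answer, i) => `Response from ${codeNames[i]}:\n${answer}`)
    .join("\n\n");

  // Phase 3: Smart Synthesis — a randomly chosen model merges the responses.
  const synthesizer = models[Math.floor(Math.random() * models.length)];
  return askModel(
    synthesizer,
    `Question:\n${prompt}\n\n${anonymized}\n\nSynthesize a single, comprehensive answer.`
  );
}
```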
This is an MCP server designed to be used with MCP-compatible clients like Claude Desktop or other MCP tools.
The server requires the following environment variables:

- `ANTHROPIC_API_KEY`: Your Anthropic API key
- `GOOGLE_GENERATIVE_AI_API_KEY`: Your Google AI API key
- `OPENAI_API_KEY`: Your OpenAI API key
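As a quick sanity check, you can confirm the keys are visible to Node before wiring the server into a client. This is just an illustrative sketch, not part of the server's own startup code:

```typescript
// Illustrative check: verify the three provider keys listed above are set.
// The actual server may report missing keys differently.
const requiredKeys = [
  "ANTHROPIC_API_KEY",
  "GOOGLE_GENERATIVE_AI_API_KEY",
  "OPENAI_API_KEY",
];

const missing = requiredKeys.filter((key) => !process.env[key]);
if (missing.length > 0) {
  throw new Error(`Missing environment variables: ${missing.join(", ")}`);
}
```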
Based on the guide from this dev.to article, here's how to integrate with Cursor:

1. Open Cursor Settings.
2. Configure the server:
   - Name: `cognition-wheel`
   - Command: `npx`
   - Args: `["-y", "mcp-cognition-wheel"]`
3. Example configuration:

```json
{
  "cognition-wheel": {
    "command": "npx",
    "args": ["-y", "mcp-cognition-wheel"],
    "env": {
      "ANTHROPIC_API_KEY": "your_anthropic_key",
      "GOOGLE_GENERATIVE_AI_API_KEY": "your_google_key",
      "OPENAI_API_KEY": "your_openai_key"
    }
  }
}
```
Alternatively, to run the server from a local build:

1. Build the project (if not already done): `pnpm run build`
2. Configure the server:
   - Name: `cognition-wheel`
   - Command: `node`
   - Args: `["/absolute/path/to/your/cognition-wheel/dist/app.js"]`
3. Example configuration:

```json
{
  "cognition-wheel": {
    "command": "node",
    "args": [
      "/Users/yourname/path/to/cognition-wheel/dist/app.js"
    ],
    "env": {
      "ANTHROPIC_API_KEY": "your_anthropic_key",
      "GOOGLE_GENERATIVE_AI_API_KEY": "your_google_key",
      "OPENAI_API_KEY": "your_openai_key"
    }
  }
}
```
To test the integration, ask a question in Cursor; the `cognition_wheel` tool should be triggered automatically.

The server provides a single tool called `cognition_wheel` with the following parameters:
- `context`: Background information and context for the problem
- `question`: The specific question you want answered
- `enable_internet_search`: Boolean flag to enable web search capabilities
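For illustration, the sketch below calls the tool programmatically with the MCP TypeScript SDK. The client name and argument values are made up, and most users will never do this directly, since an MCP client such as Cursor or Claude Desktop issues the call for you:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server over stdio, passing along the provider keys it needs.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "mcp-cognition-wheel"],
    env: {
      ANTHROPIC_API_KEY: process.env.ANTHROPIC_API_KEY ?? "",
      GOOGLE_GENERATIVE_AI_API_KEY: process.env.GOOGLE_GENERATIVE_AI_API_KEY ?? "",
      OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "",
    },
  });

  const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);

  // Invoke cognition_wheel with its three documented parameters.
  const result = await client.callTool({
    name: "cognition_wheel",
    arguments: {
      context: "We are evaluating databases for a write-heavy telemetry workload.",
      question: "Which trade-offs matter most when choosing between them?",
      enable_internet_search: false,
    },
  });

  console.log(JSON.stringify(result, null, 2));
  await client.close();
}

main().catch(console.error);
```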
Development scripts:

- `pnpm run dev`: Watch mode for development
- `pnpm run build`: Build the TypeScript code
- `pnpm run start`: Run the server directly with tsx

Build and run with Docker:
```bash
# Build the image
docker build -t cognition-wheel .

# Run with environment variables
docker run --rm \
  -e ANTHROPIC_API_KEY=your_key \
  -e GOOGLE_GENERATIVE_AI_API_KEY=your_key \
  -e OPENAI_API_KEY=your_key \
  cognition-wheel
```
License: MIT