Multi LLM Cross-Check
A Model Context Protocol (MCP) server that allows cross-checking responses from multiple LLM providers simultaneously. This server integrates with Claude Desktop as an MCP server to provide a unified interface for querying different LLM APIs.
First, install the uv package manager if you don't already have it:

```bash
pip install uv
```
To install Multi LLM Cross-Check Server for Claude Desktop automatically via Smithery:
```bash
npx -y @smithery/cli install @lior-ps/multi-llm-cross-check-mcp-server --client claude
```
To install manually, clone the repository and install the dependencies:

```bash
git clone https://github.com/lior-ps/multi-llm-cross-check-mcp-server.git
cd multi-llm-cross-check-mcp-server
uv venv
uv pip install -r requirements.txt
```
To configure Claude Desktop, create a file named `claude_desktop_config.json` in your Claude Desktop configuration directory with the following content:
{ "mcp_servers": [ { "command": "uv", "args": [ "--directory", "/multi-llm-cross-check-mcp-server", "run", "main.py" ], "env": { "OPENAI_API_KEY": "your_openai_key", // Get from https://platform.openai.com/api-keys "ANTHROPIC_API_KEY": "your_anthropic_key", // Get from https://console.anthropic.com/account/keys "PERPLEXITY_API_KEY": "your_perplexity_key", // Get from https://www.perplexity.ai/settings/api "GEMINI_API_KEY": "your_gemini_key" // Get from https://makersuite.google.com/app/apikey } } ] }
Notes:

- You may need to use the full path to the `uv` executable in the `command` field; you can find it by running `which uv` on macOS/Linux or `where uv` on Windows.

Once configured, you can use the `cross_check` tool in your conversations by asking to "cross check with other LLMs". The server returns a dictionary with responses from each LLM provider:
{ "ChatGPT": { ... }, "Claude": { ... }, "Perplexity": { ... }, "Gemini": { ... } }
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.