
# Ultra MCP

Multi-AI model MCP server with OpenAI, Gemini integration and built-in analytics dashboard
All Models. One Interface. Zero Friction.
🚀 Ultra MCP - A Model Context Protocol server that exposes OpenAI, Gemini, Azure OpenAI, and xAI Grok AI models through a single MCP interface for use with Claude Code and Cursor.
Stop wasting time in meetings with humans. Now it's time to ask AI models to do the work.
This project is inspired by zen-mcp-server. While the two share goals, Ultra MCP offers several key advantages; see the comparison at the end of this README.
Get started:

```bash
# Install globally via npm
npm install -g ultra-mcp

# Or run directly with npx
npx ultra-mcp

# Configure API keys (stored locally via the conf library)
npx -y ultra-mcp config

# View usage statistics
npx ultra-mcp db:stats

# Launch the analytics dashboard
npx ultra-mcp dashboard
```
Set up your API keys interactively:

```bash
npx -y ultra-mcp config
```

This launches an interactive menu where you can enter and update keys for each provider; they are saved to your local configuration file.
New in v0.5.10:
```bash
# Run the MCP server
npx -y ultra-mcp

# Or after building locally
bun run build
node dist/cli.js
```
Ultra MCP provides several powerful commands:
### `config` - Interactive Configuration

```bash
npx -y ultra-mcp config
```
Configure API keys interactively with a user-friendly menu system.
### `dashboard` - Web Dashboard

```bash
npx -y ultra-mcp dashboard

# Custom port
npx -y ultra-mcp dashboard --port 4000

# Development mode
npx -y ultra-mcp dashboard --dev
```
Launch the web dashboard to view usage statistics, manage configurations, and monitor AI costs.
### `install` - Install for Claude Code

```bash
npx -y ultra-mcp install
```
Automatically install Ultra MCP as an MCP server for Claude Code.
### `doctor` - Health Check

```bash
npx -y ultra-mcp doctor

# Test connections to providers
npx -y ultra-mcp doctor --test
```
Check installation health and test API connections.
### `chat` - Interactive Chat

```bash
npx -y ultra-mcp chat

# Specify model and provider
npx -y ultra-mcp chat -m gpt-5 -p openai
npx -y ultra-mcp chat -m grok-4 -p grok
```
Chat interactively with AI models from the command line.
### `db:show` - Show Database Info

```bash
npx -y ultra-mcp db:show
```
Display database file location and basic statistics.
### `db:stats` - Usage Statistics

```bash
npx -y ultra-mcp db:stats
```
Show detailed usage statistics for the last 30 days including costs by provider.
### `db:view` - Database Viewer

```bash
npx -y ultra-mcp db:view
```
Launch Drizzle Studio to explore the usage database interactively.
```bash
# Install Ultra MCP for Claude Code
npx -y ultra-mcp install
```
This command registers Ultra MCP as an MCP server in your Claude Code settings. To configure it manually instead, add the following to your Claude Code settings:
```json
{
  "mcpServers": {
    "ultra-mcp": {
      "command": "npx",
      "args": ["-y", "ultra-mcp@latest"]
    }
  }
}
```
First configure your API keys:
npx -y ultra-mcp config
Then add to your Cursor MCP settings:
```json
{
  "mcpServers": {
    "ultra-mcp": {
      "command": "npx",
      "args": ["-y", "ultra-mcp@latest"]
    }
  }
}
```
Ultra MCP will automatically use the API keys you configured with the `config` command.
Ultra MCP provides powerful AI tools accessible through Claude Code and Cursor. New in v0.7.0: All tools are now also available as discoverable prompts in Claude Code.
All Ultra MCP tools are now exposed as discoverable prompts in Claude Code, making them even easier to use. To use them, type `/` in Claude Code to see the available prompts. This makes Ultra MCP's powerful AI capabilities more accessible than ever!
- **Deep Reasoning** (`deep-reasoning`): Leverage advanced AI models for complex problem-solving and analysis.
- **Investigate** (`investigate`): Thoroughly investigate topics with configurable depth levels.
- **Research** (`research`): Conduct comprehensive research with multiple output formats.
- **List AI Models** (`list-ai-models`): View all available AI models and their configuration status.
```javascript
// In Claude Code or Cursor with MCP
await use_mcp_tool('ultra-mcp', 'deep-reasoning', {
  provider: 'openai',
  prompt: 'Design a distributed caching system for microservices',
  reasoningEffort: 'high',
});
```
```bash
# Clone the repository
git clone https://github.com/RealMikeChong/ultra-mcp
cd ultra-mcp

# Install dependencies
bun install

# Build TypeScript
bun run build

# Run tests
bun run test

# Development mode with watch
bun run dev

# Test with MCP Inspector
npx @modelcontextprotocol/inspector node dist/cli.js
```
Ultra MCP acts as a bridge between multiple AI model providers and MCP clients:
- `src/cli.ts` - CLI entry point built with commander
- `src/server.ts` - MCP server implementation
- `src/config/` - Configuration management with schema validation
- `src/handlers/` - MCP protocol handlers
- `src/providers/` - Model provider implementations
- `src/utils/` - Shared utilities for streaming and error handling

Ultra MCP stores configuration in your system's default config directory:
- macOS: `~/Library/Preferences/ultra-mcp-nodejs/`
- Linux: `~/.config/ultra-mcp/`
- Windows: `%APPDATA%\ultra-mcp-nodejs\`
You can also set API keys and base URLs via environment variables:
- `OPENAI_API_KEY` / `OPENAI_BASE_URL`
- `GOOGLE_API_KEY` / `GOOGLE_BASE_URL`
- `AZURE_API_KEY` / `AZURE_BASE_URL` (base URL required for Azure)
- `XAI_API_KEY` / `XAI_BASE_URL`
Note: Configuration file takes precedence over environment variables.
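For example, a shell profile could configure two providers using the variables above; the keys and the Azure endpoint below are placeholders, not real values:

```shell
# Placeholder keys: substitute your real credentials.
export OPENAI_API_KEY="sk-..."
export XAI_API_KEY="xai-..."

# Azure additionally requires a base URL (hypothetical endpoint shown).
export AZURE_API_KEY="..."
export AZURE_BASE_URL="https://my-resource.openai.azure.com"

# Verify that the keys are picked up.
npx -y ultra-mcp doctor --test
```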
Ultra MCP supports vector embeddings for semantic code search. By default, it uses text-embedding-3-small for cost efficiency (6.5x cheaper than the large model).
You can customize the embedding models in your configuration:
```jsonc
{
  "vectorConfig": {
    "embeddingModel": {
      "openai": "text-embedding-3-small", // or "text-embedding-3-large"
      "azure": "text-embedding-3-small",  // or "text-embedding-3-large"
      "gemini": "text-embedding-004"
    }
  }
}
```
| Model | Cost | Dimensions | MTEB Score | Best For |
|---|---|---|---|---|
| text-embedding-3-small | $0.02/1M tokens | 1536 | 62.3% | Cost-effective code search |
| text-embedding-3-large | $0.13/1M tokens | 3072 | 64.6% | Maximum accuracy |
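The per-token pricing above implies that embedding cost scales linearly with corpus size. As an illustration (prices hard-coded from the table, not fetched from Ultra MCP):

```typescript
// Per-million-token prices, copied from the table above.
const PRICE_PER_1M_USD: Record<string, number> = {
  "text-embedding-3-small": 0.02,
  "text-embedding-3-large": 0.13,
};

// Estimate the cost of embedding `tokens` tokens with a given model.
function embeddingCostUSD(model: string, tokens: number): number {
  const price = PRICE_PER_1M_USD[model];
  if (price === undefined) throw new Error(`unknown model: ${model}`);
  return (tokens / 1_000_000) * price;
}

// Embedding a 5M-token codebase costs about $0.10 with the small model
// versus about $0.65 with the large one (the 6.5x gap mentioned above).
```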
If you previously indexed your codebase with `text-embedding-3-large`, it will continue to work, but those vectors won't be compatible with new embeddings from `text-embedding-3-small`. Consider re-indexing if you want to switch to the smaller model by changing `embeddingModel` in your vector config.

To contribute:

1. Create a feature branch: `git checkout -b feature-name`
2. Run the tests: `npm test`
3. Commit your changes: `git commit -m "Add feature"`
4. Push the branch: `git push origin feature-name`
```bash
# Run all tests
bun run test

# Run tests with UI
bun run test:ui

# Run tests with coverage
bun run test:coverage
```
MIT License - see LICENSE file for details.
👋 Mike Chong - Building tools to amplify human potential through AI.
As one of the earliest users of GitHub Copilot (personally invited by Nat Friedman, former GitHub CEO), I've witnessed firsthand how AI-assisted development can transform the way we build software. My journey as a former engineer on Outlook iOS/Android taught me the importance of creating tools that genuinely improve people's daily lives.
Ultra MCP represents my vision of democratizing access to the best AI models, making cutting-edge AI capabilities accessible to every developer through a unified, simple interface. I believe that by removing barriers between developers and AI models, we can accelerate innovation and create a better world for everyone.
"The future belongs to those who can seamlessly orchestrate human creativity with AI capabilities."
While both projects aim to enhance AI development workflows, Ultra MCP brings unique advantages:
- **Written in TypeScript** - Full type safety, excellent IDE support, and a more maintainable codebase
- **Vector Search Support** - Built-in semantic code search using vector embeddings:

  ```bash
  npx ultra-mcp index
  npx ultra-mcp search "authentication logic"
  ```

- **Built-in Dashboard & Usage Tracking** - Comprehensive analytics and cost monitoring
- **Advanced Pricing System** - Real-time cost management:

  ```bash
  npx ultra-mcp pricing show gpt-4o
  ```
Unlike many MCP implementations, Ultra MCP includes built-in vector search and a pricing-aware dashboard out of the box, making it particularly well suited for developers who want robust tooling, built-in cost visibility, and intelligent code search for responsible AI usage.
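Under the hood, semantic code search of this kind boils down to ranking indexed chunks by embedding similarity. A minimal, illustrative sketch (not Ultra MCP's actual implementation):

```typescript
// Cosine similarity between two embedding vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

interface Chunk {
  id: string;     // e.g. a file path plus line range
  vec: number[];  // the chunk's embedding
}

// Return the k indexed chunks most similar to a query embedding.
function topK(query: number[], chunks: Chunk[], k: number) {
  return chunks
    .map((c) => ({ id: c.id, score: cosine(query, c.vec) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

This is also why embeddings from different models cannot be mixed: vectors of different dimensions (1536 vs. 3072) are not comparable, which is the re-indexing caveat mentioned earlier.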