# Gemini Bridge

An MCP server that connects AI coding assistants to Gemini over STDIO.
A lightweight MCP (Model Context Protocol) server that enables AI coding assistants to interact with Google's Gemini AI through the official CLI. Works with Claude Code, Cursor, VS Code, and other MCP-compatible clients. Designed for simplicity, reliability, and seamless integration.
Requirements: `mcp>=1.0.0` and the Gemini CLI.

Install the Gemini CLI:
```bash
npm install -g @google/gemini-cli
```
Authenticate with Gemini:
```bash
gemini auth login
```
Verify installation:
```bash
gemini --version
```
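The same preflight checks can be scripted. A minimal sketch (the `cli_version` helper is illustrative, not part of this project):

```python
import shutil
import subprocess

def cli_version(cmd):
    """Return a CLI's --version output, or None if the command is missing."""
    if shutil.which(cmd) is None:
        return None
    out = subprocess.run([cmd, "--version"], capture_output=True, text=True)
    return out.stdout.strip() or None

# cli_version("gemini") returns a version string once the CLI is installed
```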
### 🎯 Recommended: PyPI Installation
```bash
# Install from PyPI
pip install gemini-bridge

# Add to Claude Code with uvx (recommended)
claude mcp add gemini-bridge -s user -- uvx gemini-bridge
```
### Alternative: From Source
```bash
# Clone the repository
git clone https://github.com/shelakh/gemini-bridge.git
cd gemini-bridge

# Build and install locally
uvx --from build pyproject-build
pip install dist/*.whl

# Add to Claude Code
claude mcp add gemini-bridge -s user -- uvx gemini-bridge
```
### Development Installation
```bash
# Clone and install in development mode
git clone https://github.com/shelakh/gemini-bridge.git
cd gemini-bridge
pip install -e .

# Add to Claude Code (development)
claude mcp add gemini-bridge-dev -s user -- python -m src
```
Gemini Bridge works with any MCP-compatible AI coding assistant - the same server supports multiple clients through different configuration methods.
```bash
# Recommended installation
claude mcp add gemini-bridge -s user -- uvx gemini-bridge

# Development installation
claude mcp add gemini-bridge-dev -s user -- python -m src
```
Global Configuration (~/.cursor/mcp.json):
```json
{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}
```
Project-Specific (.cursor/mcp.json in your project):
```json
{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}
```
Go to: Settings → Cursor Settings → MCP → Add new global MCP server
Configuration (.vscode/mcp.json in your workspace):
```json
{
  "servers": {
    "gemini-bridge": {
      "type": "stdio",
      "command": "uvx",
      "args": ["gemini-bridge"]
    }
  }
}
```
Alternative: Through Extensions, using the command `uvx gemini-bridge`.

Add to your Windsurf MCP configuration:
```json
{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}
```
`cline_mcp_settings.json`:

```json
{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}
```
Go to: Settings → MCP → Add MCP Server
```json
{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}
```
Add an MCP server with:

- Name: `gemini-bridge`
- Type: STDIO
- Command: `uvx`
- Args: `["gemini-bridge"]`

Using the UI: configure the command `uvx gemini-bridge`.

Manual Configuration:
```json
"augment.advanced": {
  "mcpServers": [
    {
      "name": "gemini-bridge",
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  ]
}
```
`mcp_settings.json`:

```json
{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {}
    }
  }
}
```
```json
{
  "command": "uvx",
  "args": ["gemini-bridge"],
  "env": {}
}
```
For pip-based installations:
```json
{
  "command": "gemini-bridge",
  "args": [],
  "env": {}
}
```
For development/local testing:
```json
{
  "command": "python",
  "args": ["-m", "src"],
  "env": {},
  "cwd": "/path/to/gemini-bridge"
}
```
For npm-style installation (if needed):
```json
{
  "command": "npx",
  "args": ["gemini-bridge"],
  "env": {}
}
```
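All of the JSON variants above share one entry shape: a `command` string plus optional `args` and `env`. A quick sanity check of a config file can be sketched as follows (the `validate_server_entry` helper is hypothetical, not part of the project):

```python
import json

def validate_server_entry(entry):
    """Verify an mcpServers entry has the fields MCP clients expect."""
    if not isinstance(entry.get("command"), str):
        raise ValueError("'command' must be a string")
    if not isinstance(entry.get("args", []), list):
        raise ValueError("'args' must be a list")
    return True

# Parse a config in the shape shown above and check each server entry
config = json.loads("""
{
  "mcpServers": {
    "gemini-bridge": {"command": "uvx", "args": ["gemini-bridge"], "env": {}}
  }
}
""")
for name, entry in config["mcpServers"].items():
    validate_server_entry(entry)  # raises ValueError on a malformed entry
```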
Once configured with any client, use the same two tools:
The server implementation is identical - only the client configuration differs!
By default, Gemini Bridge uses a 60-second timeout for all CLI operations. For longer queries (large files, complex analysis), you can configure a custom timeout using the GEMINI_BRIDGE_TIMEOUT environment variable.
Example configurations:
```bash
# Add with custom timeout (120 seconds)
claude mcp add gemini-bridge -s user --env GEMINI_BRIDGE_TIMEOUT=120 -- uvx gemini-bridge
```
```json
{
  "mcpServers": {
    "gemini-bridge": {
      "command": "uvx",
      "args": ["gemini-bridge"],
      "env": {
        "GEMINI_BRIDGE_TIMEOUT": "120"
      }
    }
  }
}
```
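The precedence between the per-call override and the environment variable can be sketched in Python. This is illustrative, not the project's actual code, and it assumes the per-call `timeout_seconds` beats `GEMINI_BRIDGE_TIMEOUT`, which beats the 60-second default; the `gemini -p` prompt invocation is also an assumption:

```python
import os
import subprocess

DEFAULT_TIMEOUT = 60  # seconds, the documented default

def resolve_timeout(override=None):
    """Per-call override beats GEMINI_BRIDGE_TIMEOUT, which beats the default."""
    if override is not None:
        return int(override)
    return int(os.environ.get("GEMINI_BRIDGE_TIMEOUT", DEFAULT_TIMEOUT))

def run_gemini(prompt, directory=".", timeout_seconds=None):
    """Invoke the gemini CLI with the resolved timeout (sketch)."""
    try:
        result = subprocess.run(
            ["gemini", "-p", prompt],
            cwd=directory,
            capture_output=True,
            text=True,
            timeout=resolve_timeout(timeout_seconds),
        )
        return result.stdout
    except subprocess.TimeoutExpired:
        return "Gemini CLI timed out; raise GEMINI_BRIDGE_TIMEOUT or pass timeout_seconds"
```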
Timeout Options:

- Set the `GEMINI_BRIDGE_TIMEOUT` environment variable for a persistent default
- Pass `timeout_seconds` to either tool for one-off extensions

### consult_gemini

Direct CLI bridge for simple queries.
Parameters:
- `query` (string): The question or prompt to send to Gemini
- `directory` (string): Working directory for the query (default: current directory)
- `model` (string, optional): Model to use - "flash" or "pro" (default: "flash")
- `timeout_seconds` (int, optional): Override the execution timeout for this request

Example:
```python
consult_gemini(
    query="Find authentication patterns in this codebase",
    directory="/path/to/project",
    model="flash"
)
```
### consult_gemini_with_files

CLI bridge with file attachments for detailed analysis.
Parameters:
- `query` (string): The question or prompt to send to Gemini
- `directory` (string): Working directory for the query
- `files` (list): List of file paths relative to the directory
- `model` (string, optional): Model to use - "flash" or "pro" (default: "flash")
- `timeout_seconds` (int, optional): Override the execution timeout for this request
- `mode` (string, optional): Either "inline" (default) to stream file contents or "at_command" to let Gemini CLI resolve @path references itself

Example:
```python
consult_gemini_with_files(
    query="Analyze these auth files and suggest improvements",
    directory="/path/to/project",
    files=["src/auth.py", "src/models.py"],
    model="pro",
    timeout_seconds=180
)
```
Tip: When scanning large trees, switch to mode="at_command" so the Gemini CLI handles file globbing and truncation natively.
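The difference between the two modes can be pictured as prompt assembly. The sketch below is illustrative only: `build_prompt` and the byte cap are hypothetical stand-ins, not the project's actual implementation (which is configured via `GEMINI_BRIDGE_MAX_INLINE_TOTAL_BYTES`):

```python
from pathlib import Path

MAX_INLINE_TOTAL_BYTES = 1_000_000  # illustrative cap

def build_prompt(query, directory, files, mode="inline"):
    """inline: stream file contents into the prompt;
    at_command: emit @path references for the Gemini CLI to resolve itself."""
    if mode == "at_command":
        return query + " " + " ".join(f"@{f}" for f in files)
    parts, total = [query], 0
    for f in files:
        text = (Path(directory) / f).read_text()
        total += len(text.encode())
        if total > MAX_INLINE_TOTAL_BYTES:
            raise ValueError("inline payload too large; use mode='at_command'")
        parts.append(f"--- {f} ---\n{text}")
    return "\n\n".join(parts)
```

In `at_command` mode the prompt stays tiny regardless of file size, which is why it suits large trees.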
```python
# Simple research query
consult_gemini(
    query="What authentication patterns are used in this project?",
    directory="/Users/dev/my-project"
)
```
```python
# Analyze specific files
consult_gemini_with_files(
    query="Review these files and suggest security improvements",
    directory="/Users/dev/my-project",
    files=["src/auth.py", "src/middleware.py"],
    model="pro"
)
```
```python
# Compare multiple implementation files
consult_gemini_with_files(
    query="Compare these database implementations and recommend the best approach",
    directory="/Users/dev/my-project",
    files=["src/db/postgres.py", "src/db/sqlite.py", "src/db/redis.py"],
    mode="at_command"
)
```
Notes:

- Inline attachments are size-capped (tunable via environment variables such as `GEMINI_BRIDGE_MAX_INLINE_TOTAL_BYTES`); prefer mode="at_command" for bigger payloads
- Requires the `gemini` command on your PATH
- The @ mode delegates file resolution to Gemini CLI tooling

## Project Structure

```
gemini-bridge/
├── src/
│   ├── __init__.py              # Entry point
│   ├── __main__.py              # Module execution entry point
│   └── mcp_server.py            # Main MCP server implementation
├── .github/                     # GitHub templates and workflows
├── pyproject.toml               # Python package configuration
├── README.md                    # This file
├── CONTRIBUTING.md              # Contribution guidelines
├── CODE_OF_CONDUCT.md           # Community standards
├── SECURITY.md                  # Security policies
├── CHANGELOG.md                 # Version history
└── LICENSE                      # MIT license
```
```bash
# Install in development mode
pip install -e .

# Run directly
python -m src

# Test CLI availability
gemini --version
```
The server automatically integrates with Claude Code when properly configured through the MCP protocol.
## Troubleshooting

```bash
# Install Gemini CLI
npm install -g @google/gemini-cli

# Authenticate
gemini auth login

# Test
gemini --version
```
- Ensure the `gemini` command is in your PATH
- Re-authenticate with `gemini auth login` if needed

## Contributing

We welcome contributions from the community! Please read our Contributing Guidelines for details on how to get started.
This project is licensed under the MIT License - see the LICENSE file for details.
See CHANGELOG.md for detailed version history.
Documentation: see the `docs/` directory.

Focus: A simple, reliable bridge between Claude Code and Gemini AI through the official CLI.