# MCP Ollama Server
A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients, communicating over STDIO.
Ollama must be installed, with at least one model pulled, for example:

```bash
ollama pull llama2
```
Add the server to your Claude Desktop configuration (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\Claude\claude_desktop_config.json` on Windows):
{ "mcpServers": { "ollama": { "command": "uvx", "args": [ "mcp-ollama" ] } } }
Install in development mode:
```bash
git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync
```
Test with MCP Inspector:
```bash
mcp dev src/mcp_ollama/server.py
```
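For a scripted check instead of the Inspector UI, the same STDIO client pattern shown above can call a tool directly. This is a sketch, not part of the project: it assumes the checkout exposes the same `mcp-ollama` entry point the uvx config relies on, and the `ask_model` argument names are guesses based on the tool descriptions below, so adjust them to whatever `list_tools()` reports.

```python
"""Sketch: scripted smoke test against the development checkout."""
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Run from the repository root so uv picks up the project;
# the `mcp-ollama` entry point name is assumed from the config above.
server_params = StdioServerParameters(command="uv", args=["run", "mcp-ollama"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Argument names are assumed; check the schema reported by list_tools().
            result = await session.call_tool(
                "ask_model",
                arguments={"model": "llama2", "question": "Why is the sky blue?"},
            )
            print(result.content)

asyncio.run(main())
```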
The server provides the following main tools:

- `list_models` - List all downloaded Ollama models
- `show_model` - Get detailed information about a specific model
- `ask_model` - Ask a question to a specified model
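As an illustration of how tools like these are typically wired up, here is a hedged sketch using the MCP Python SDK's FastMCP helper together with the `ollama` Python client. It is not the project's actual source; the argument names (`model`, `question`) and the use of the `ollama` package are assumptions.

```python
"""Illustrative sketch only: not the actual mcp-ollama implementation."""
import ollama  # assumes a recent `ollama` Python client (>= 0.4)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ollama")

@mcp.tool()
def list_models() -> list[str]:
    """List all downloaded Ollama models."""
    return [m.model for m in ollama.list().models]

@mcp.tool()
def show_model(model: str) -> str:
    """Get detailed information about a specific model."""
    return str(ollama.show(model))

@mcp.tool()
def ask_model(model: str, question: str) -> str:
    """Ask a question to a specified model."""
    response = ollama.chat(model=model, messages=[{"role": "user", "content": question}])
    return response.message.content

if __name__ == "__main__":
    mcp.run()  # FastMCP defaults to the STDIO transport
```

Saving a file like this and pointing `mcp dev` at it exposes the same three tools over STDIO.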
License: MIT