Ollama Integration

A local LLM integration service connecting Ollama and MCP over STDIO.
🚀 A powerful bridge between Ollama and the Model Context Protocol (MCP), enabling seamless integration of Ollama's local LLM capabilities into your MCP-powered applications.
🔄 Model Management – pull models and create custom ones from a Modelfile (see the sketch below)
🤖 Model Execution – one-shot prompts and multi-turn chat completions
🛠 Server Control – manage the connection to the underlying Ollama server
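For a quick feel of the tool interface, the sketch below lists locally installed models. Note that the `list` tool name is an assumption here (mirroring the `ollama list` CLI command); the tools confirmed by the examples further down are `pull`, `run`, `chat_completion`, and `create`.

```typescript
// Hypothetical sketch: list locally installed models.
// The "list" tool name is assumed to mirror the `ollama list` CLI command.
const models = await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "list",
  arguments: {}
});
console.log(models);
```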
Install dependencies and build the server:

```bash
pnpm install
pnpm run build
```
Add the server to your MCP configuration:
- MacOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%/Claude/claude_desktop_config.json`
{ "mcpServers": { "ollama": { "command": "node", "args": ["/path/to/ollama-server/build/index.js"], "env": { "OLLAMA_HOST": "http://127.0.0.1:11434" // Optional: customize Ollama API endpoint } } } }
Once configured, the server's tools can be invoked like any other MCP tool:

```typescript
// Pull a model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "pull",
  arguments: { name: "llama2" }
});

// Run the model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "run",
  arguments: {
    name: "llama2",
    prompt: "Explain quantum computing in simple terms"
  }
});
```
For multi-turn conversations, use the `chat_completion` tool:

```typescript
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "chat_completion",
  arguments: {
    model: "llama2",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "What is the meaning of life?" }
    ],
    temperature: 0.7
  }
});
```
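The call resolves to a regular MCP tool result. A sketch of extracting the reply text, assuming the standard MCP content-array shape (this shape comes from the MCP specification rather than from this server's docs):

```typescript
const result = await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "chat_completion",
  arguments: {
    model: "llama2",
    messages: [{ role: "user", content: "Hello!" }]
  }
});

// MCP tool results carry a content array; text replies live in entries
// of type "text" (shape assumed from the MCP spec).
const reply = result.content?.find((c) => c.type === "text")?.text;
console.log(reply);
```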
To build a custom model from a Modelfile, use the `create` tool:

```typescript
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "create",
  arguments: {
    name: "custom-model",
    modelfile: "./path/to/Modelfile"
  }
});
```
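For reference, a minimal Modelfile might look like this (the `FROM`, `PARAMETER`, and `SYSTEM` directives are standard Ollama Modelfile syntax; the specific values are illustrative):

```
FROM llama2
PARAMETER temperature 0.8
SYSTEM You are a concise technical assistant.
```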
`OLLAMA_HOST`: Configure a custom Ollama API endpoint (default: `http://127.0.0.1:11434`)
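When running the server by hand (for example, while debugging), the same variable can be set inline. A sketch, assuming the build output lands at `build/index.js` as in the configuration above:

```bash
# Point the server at a non-default Ollama instance (address is illustrative)
OLLAMA_HOST=http://192.168.1.50:11434 node build/index.js
```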
Contributions are welcome!

MIT License - feel free to use in your own projects!
Built with ❤️ for the MCP ecosystem