# Deepseek Thinker
MCP provider for Deepseek reasoning content, supporting OpenAI API and Ollama local modes.
An MCP (Model Context Protocol) server that provides Deepseek reasoning content to MCP-enabled AI clients, such as Claude Desktop. It supports accessing Deepseek's thought processes either through the Deepseek API service or from a local Ollama server.
## Features

- 🤖 **Dual Mode Support**: works with the Deepseek OpenAI-compatible API or a local Ollama server
- 🎯 **Focused Reasoning**: captures Deepseek's thinking process and provides the reasoning content to the client
## Tool Parameters

- `originPrompt` (string): the user's original prompt

## Environment Configuration

### OpenAI API Mode

Set the following environment variables:
```
API_KEY=<Your OpenAI API Key>
BASE_URL=<API Base URL>
```
### Ollama Mode

Set the following environment variable:

```
USE_OLLAMA=true
```
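How the server switches between these two modes is internal to the package, but the selection logic implied by the variables above can be sketched as follows. This is a minimal illustration, not the package's actual code: `resolveMode` and the Ollama default URL are assumptions.

```javascript
// Sketch only: hypothetical mode selection mirroring the env vars above.
function resolveMode(env) {
  if (env.USE_OLLAMA === "true") {
    // Ollama exposes an OpenAI-compatible endpoint locally; no API key needed.
    // The URL below is Ollama's conventional default, assumed here.
    return { mode: "ollama", baseURL: "http://localhost:11434/v1" };
  }
  // OpenAI API mode requires both variables from the section above.
  if (!env.API_KEY || !env.BASE_URL) {
    throw new Error("API_KEY and BASE_URL are required unless USE_OLLAMA=true");
  }
  return { mode: "openai", baseURL: env.BASE_URL, apiKey: env.API_KEY };
}

console.log(resolveMode({ USE_OLLAMA: "true" }).mode); // ollama
console.log(resolveMode({ API_KEY: "sk-xxx", BASE_URL: "https://api.deepseek.com" }).mode); // openai
```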
## Usage with Claude Desktop

Add the following configuration to your `claude_desktop_config.json`:
**OpenAI API mode (via npx):**

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": ["-y", "deepseek-thinker-mcp"],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}
```
**Ollama local mode (via npx):**

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "npx",
      "args": ["-y", "deepseek-thinker-mcp"],
      "env": {
        "USE_OLLAMA": "true"
      }
    }
  }
}
```
**Local server mode (built from source):**

```json
{
  "mcpServers": {
    "deepseek-thinker": {
      "command": "node",
      "args": ["/your-path/deepseek-thinker-mcp/build/index.js"],
      "env": {
        "API_KEY": "<Your API Key>",
        "BASE_URL": "<Your Base URL>"
      }
    }
  }
}
```
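With any of the configurations above in place, the client talks to the server over MCP, which uses JSON-RPC 2.0. A `tools/call` request carrying the `originPrompt` parameter might look like the fragment below; the tool name shown is illustrative (an assumption, not taken from this document) — check the server's tool listing for the actual name.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get-deepseek-thinker",
    "arguments": {
      "originPrompt": "Why is the sky blue?"
    }
  }
}
```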
## Development

```bash
# Install dependencies
npm install

# Build project
npm run build

# Run service
node build/index.js
```
## FAQ

**Why does the request time out?** This happens when the Deepseek API response is too slow or the reasoning content output is too long, causing the MCP server to exceed its timeout.
## License

This project is licensed under the MIT License. See the LICENSE file for details.