
# Ollama Deep Researcher

Ollama-powered MCP server for deep research using local LLMs and web search, running over the stdio transport.
Ollama Deep Researcher is a Desktop Extension (DXT) that enables advanced topic research using web search and LLM synthesis, powered by a local MCP server. It supports configurable research parameters, status tracking, and resource access, and is designed for seamless integration with the DXT ecosystem.
## Project Structure

```
.
├── manifest.json        # DXT manifest (see MANIFEST.md for spec)
├── src/
│   ├── index.ts         # MCP server entrypoint (Node.js, stdio transport)
│   └── assistant/       # Python research logic
│       └── run_research.py
├── README.md            # This documentation
└── ...
```
## Installation

Clone the repository and install Node dependencies:

```bash
git clone <your-repo-url>
cd mcp-server-ollama-deep-researcher
npm install
```
Install Python dependencies for the assistant:

```bash
cd src/assistant
pip install -r requirements.txt  # or use pyproject.toml/uv if preferred
```
Set the required environment variables for the web search APIs (`TAVILY_API_KEY` for Tavily, `PERPLEXITY_API_KEY` for Perplexity):

```bash
export TAVILY_API_KEY=your_tavily_key
export PERPLEXITY_API_KEY=your_perplexity_key
```
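As a minimal sketch of how the server entrypoint might fail fast when neither key is present (the variable names come from the setup above; the helper itself is hypothetical, not part of this repo):

```typescript
// Hypothetical startup check in src/index.ts: verify that at least one
// search API key is configured before accepting MCP requests.
function requireSearchApiKey(): void {
  const hasTavily = Boolean(process.env.TAVILY_API_KEY);
  const hasPerplexity = Boolean(process.env.PERPLEXITY_API_KEY);
  if (!hasTavily && !hasPerplexity) {
    // Log to stderr: stdout is reserved for MCP protocol messages.
    console.error("Set TAVILY_API_KEY or PERPLEXITY_API_KEY before starting.");
    process.exit(1);
  }
}

requireSearchApiKey();
```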
Build the TypeScript server (if needed):

```bash
npm run build
```
Run the extension locally for testing:

```bash
node dist/index.js
# Or use the DXT host to load the extension per the DXT documentation.
```
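Because the server speaks MCP over stdio (newline-delimited JSON-RPC), you can smoke-test it without a full MCP host. The following is a sketch under that assumption; the client name in `clientInfo` is made up:

```typescript
// Hypothetical smoke test: spawn the built server and send an MCP
// `initialize` request over stdio (newline-delimited JSON-RPC).
import { spawn } from "node:child_process";

const server = spawn("node", ["dist/index.js"], {
  stdio: ["pipe", "pipe", "inherit"], // keep server stderr visible for debugging
});

const initialize = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05",
    capabilities: {},
    clientInfo: { name: "smoke-test", version: "0.0.1" }, // hypothetical client name
  },
};

server.stdout.on("data", (chunk) => {
  process.stdout.write(chunk); // should contain the server's initialize result
  server.kill();
});

server.stdin.write(JSON.stringify(initialize) + "\n");
```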
## Usage

- Call the `research` tool with `{ "topic": "Your subject" }` to start a research run (see the example below).
- Call the `get_status` tool to check on progress.
- Call the `configure` tool with any of `maxLoops`, `llmModel`, or `searchApi`.
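For instance, here is a hedged sketch of invoking the `research` tool via the MCP TypeScript SDK (`@modelcontextprotocol/sdk`); the SDK import paths and the `dist/index.js` entrypoint are assumptions based on the layout above:

```typescript
// Sketch: connect to the server over stdio and invoke its tools.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"], // assumed build output path
});

const client = new Client({ name: "example-client", version: "0.0.1" });
await client.connect(transport);

// Start a research run on a topic.
const result = await client.callTool({
  name: "research",
  arguments: { topic: "Your subject" },
});
console.log(result);

// Check progress.
console.log(await client.callTool({ name: "get_status", arguments: {} }));

await client.close();
```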
See `manifest.json` for the full DXT manifest, including tool schemas and resource templates; it follows the DXT MANIFEST.md specification.
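As an illustration of the configuration surface, the `configure` tool's arguments could be typed roughly as follows. The parameter names come from the tool list above; the types and the allowed `searchApi` values are assumptions:

```typescript
// Hypothetical shape of the `configure` tool's arguments.
interface ConfigureArgs {
  maxLoops?: number;   // how many research/summarize iterations to run
  llmModel?: string;   // name of the local Ollama model to use
  searchApi?: "tavily" | "perplexity"; // assumed to match the supported API keys
}
```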
## Troubleshooting

- The server logs to `stderr` for debugging.
- If searches fail, make sure `TAVILY_API_KEY` or `PERPLEXITY_API_KEY` is set in your environment.
- If the extension does not start, check the `stderr` output for errors.
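Logging to `stderr` matters because, for a stdio MCP server, stdout carries the JSON-RPC protocol stream. A sketch of a safe logging helper (the helper name is hypothetical):

```typescript
// Hypothetical logging helper: write diagnostics to stderr so they never
// corrupt the JSON-RPC messages on stdout.
function logDebug(message: string): void {
  process.stderr.write(`[ollama-deep-researcher] ${message}\n`);
}

logDebug("research loop started");
```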
## License

© 2025 Your Name or Organization. Licensed under MIT.