
Deep Research
Agent-based web search and research tool with PDF, image, and YouTube analysis capabilities
Deep Research is an agent-based tool that provides web search and advanced research capabilities. It leverages HuggingFace's smolagents
and is implemented as an MCP server.
This project is based on HuggingFace's open_deep_research example.
Requires the uv package manager. Clone the repository and set up the environment:

```bash
git clone https://github.com/Hajime-Y/deep-research-mcp.git
cd deep-research-mcp
uv venv
source .venv/bin/activate  # For Linux or Mac
# .venv\Scripts\activate   # For Windows
uv sync
```
Create a .env file in the root directory of the project and set the following environment variables:

```bash
OPENAI_API_KEY=your_openai_api_key
HF_TOKEN=your_huggingface_token
SERPER_API_KEY=your_serper_api_key
```
You can obtain a SERPER_API_KEY by signing up at Serper.dev.
Start the MCP server:
```bash
uv run deep_research.py
```

This will launch the deep_research agent as an MCP server.
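Once the server is running over stdio, any MCP client can connect to it. The snippet below is a minimal sketch using the official MCP Python SDK; the tool name and its query argument are assumptions, so check the output of list_tools() for the names this server actually exposes:

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server over stdio; run this from the repository root so uv finds the project
    params = StdioServerParameters(command="uv", args=["run", "deep_research.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

            # Hypothetical tool name and argument; replace with the names reported above
            result = await session.call_tool(
                "deep_research", arguments={"query": "Recent advances in agentic web search"}
            )
            print(result)

asyncio.run(main())
```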
You can also run this MCP server in a Docker container:
```bash
# Build the Docker image
docker build -t deep-research-mcp .

# Run with required API keys
docker run -p 8080:8080 \
  -e OPENAI_API_KEY=your_openai_api_key \
  -e HF_TOKEN=your_huggingface_token \
  -e SERPER_API_KEY=your_serper_api_key \
  deep-research-mcp
```
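If you already created the .env file described above, you can pass it to the container with docker run's --env-file option instead of repeating each key on the command line; this sketch assumes the same image name and port as the commands above:

```bash
# Reuse the .env file (KEY=value lines) created earlier
docker run -p 8080:8080 --env-file .env deep-research-mcp
```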
To register this Docker container as an MCP server in different clients:
Add the following to your Claude Desktop configuration file (typically located at ~/.config/Claude/claude_desktop_config.json on Linux, ~/Library/Application Support/Claude/claude_desktop_config.json on macOS, or %APPDATA%\Claude\claude_desktop_config.json on Windows):
{ "mcpServers": { "deep-research-mcp": { "command": "docker", "args": [ "run", "-i", "--rm", "-e", "OPENAI_API_KEY=your_openai_api_key", "-e", "HF_TOKEN=your_huggingface_token", "-e", "SERPER_API_KEY=your_serper_api_key", "deep-research-mcp" ] } } }
For Cursor IDE, add the following configuration:
{ "mcpServers": { "deep-research-mcp": { "command": "docker", "args": [ "run", "-i", "--rm", "-e", "OPENAI_API_KEY=your_openai_api_key", "-e", "HF_TOKEN=your_huggingface_token", "-e", "SERPER_API_KEY=your_serper_api_key", "deep-research-mcp" ] } } }
If you're running the MCP server on a remote machine or exposing it as a service, you can use the URL-based configuration:
{ "mcpServers": { "deep-research-mcp": { "url": "http://your-server-address:8080/mcp", "type": "sse" } } }
- deep_research.py: Entry point for the MCP server
- create_agent.py: Agent creation and configuration
- scripts/: Various tools and utilities
  - text_web_browser.py: Text-based web browser
  - text_inspector_tool.py: File inspection tool
  - visual_qa.py: Image analysis tool
  - mdconvert.py: Converts various file formats to Markdown

This project is provided under the Apache License 2.0.
This project uses code from HuggingFace's smolagents and Microsoft's autogen projects.