
Open WebSearch
Multi-engine web search MCP server with article fetching, no API keys required
A Model Context Protocol (MCP) server that aggregates results from multiple search engines, providing free web search without API keys.
Install dependencies and build:
npm install
npm run build
Cherry Studio:
{ "mcpServers": { "web-search": { "name": "Web Search MCP", "type": "streamableHttp", "description": "Multi-engine web search with article fetching", "isActive": true, "baseUrl": "http://localhost:3000/mcp" } } }
VSCode (Claude Dev Extension):
{ "mcpServers": { "web-search": { "transport": { "type": "streamableHttp", "url": "http://localhost:3000/mcp" } }, "web-search-sse": { "transport": { "type": "sse", "url": "http://localhost:3000/sse" } } } }
Claude Desktop:
{ "mcpServers": { "web-search": { "transport": { "type": "streamableHttp", "url": "http://localhost:3000/mcp" } }, "web-search-sse": { "transport": { "type": "sse", "url": "http://localhost:3000/sse" } } } }
Quick deployment using Docker Compose:
docker-compose up -d
Or use Docker directly:
docker run -d --name web-search -p 3000:3000 -e ENABLE_CORS=true -e CORS_ORIGIN=* ghcr.io/aas-ee/open-web-search:latest
Environment variable configuration:
# Enable CORS (default: false)
ENABLE_CORS=true

# CORS origin configuration (default: *)
CORS_ORIGIN=*

# Default search engine (options: bing, duckduckgo, exa, brave; default: bing)
DEFAULT_SEARCH_ENGINE=duckduckgo

# Enable HTTP proxy (default: false)
USE_PROXY=true

# Proxy server URL (default: http://127.0.0.1:10809)
PROXY_URL=http://your-proxy-server:port
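As a sketch of how these variables map to server behavior, the snippet below reads each one with the documented default (hypothetical code; the project's actual configuration module may differ):

// config.ts — illustrative only; names follow the documented env vars
interface ServerConfig {
  enableCors: boolean;
  corsOrigin: string;
  defaultSearchEngine: string;
  useProxy: boolean;
  proxyUrl: string;
}

export function loadConfig(): ServerConfig {
  return {
    // CORS is opt-in (default: false)
    enableCors: process.env.ENABLE_CORS === "true",
    // Allow all origins unless restricted (default: *)
    corsOrigin: process.env.CORS_ORIGIN ?? "*",
    // Fall back to bing, the documented default engine
    defaultSearchEngine: process.env.DEFAULT_SEARCH_ENGINE ?? "bing",
    // Proxy support is opt-in (default: false)
    useProxy: process.env.USE_PROXY === "true",
    // Documented default proxy address
    proxyUrl: process.env.PROXY_URL ?? "http://127.0.0.1:10809",
  };
}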
Then configure in your MCP client:
{ "mcpServers": { "web-search": { "name": "Web Search MCP", "type": "streamableHttp", "description": "Multi-engine web search with article fetching", "isActive": true, "baseUrl": "http://localhost:3000/mcp" }, "web-search-sse": { "transport": { "name": "Web Search MCP", "type": "sse", "description": "Multi-engine web search with article fetching", "isActive": true, "url": "http://localhost:3000/sse" } } } }
The server provides three tools: search, fetchCsdnArticle, and fetchLinuxDoArticle.

search
Searches the configured engines and returns a list of results.
Parameters:
{
  "query": string,      // Search query
  "limit": number,      // Optional: number of results to return (default: 10)
  "engines": string[]   // Optional: engines to use (bing, baidu, linuxdo, csdn, duckduckgo, exa, brave; default: bing)
}
Usage example:
use_mcp_tool({
  server_name: "web-search",
  tool_name: "search",
  arguments: {
    query: "search content",
    limit: 3, // Optional parameter
    engines: ["bing", "csdn", "duckduckgo", "exa", "brave"] // Optional parameter, supports multi-engine combined search
  }
})
Response example:
[ { "title": "Example Search Result", "url": "https://example.com", "description": "Description text of the search result...", "source": "Source", "engine": "Engine used" } ]
fetchCsdnArticle
Used to fetch the complete content of CSDN blog articles.
Parameters:
{
  "url": string // URL of a CSDN result returned by the search tool
}
Usage example:
use_mcp_tool({
  server_name: "web-search",
  tool_name: "fetchCsdnArticle",
  arguments: {
    url: "https://blog.csdn.net/xxx/article/details/xxx"
  }
})
Response example:
[ { "content": "Example search result" } ]
fetchLinuxDoArticle
Used to fetch the complete content of Linux.do forum articles.
Parameters:
{
  "url": string // URL of a linuxdo result returned by the search tool
}
Usage example:
use_mcp_tool({
  server_name: "web-search",
  tool_name: "fetchLinuxDoArticle",
  arguments: {
    url: "https://xxxx.json"
  }
})
Response example:
[ { "content": "Example search result" } ]
Since this tool works by scraping multi-engine search results, please note the following important limitations:
Rate Limiting: frequent consecutive searches may cause the underlying search engines to temporarily block requests
Result Accuracy: parsing depends on each engine's HTML structure and may break when pages are redesigned
Legal Terms: this tool is intended for personal learning and research; please comply with each search engine's terms of service
Search Engine Configuration: the default engine can be set via the DEFAULT_SEARCH_ENGINE environment variable
Proxy Configuration: if the engines are unreachable from your network, set USE_PROXY=true and point PROXY_URL at your proxy server
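For illustration, here is a minimal sketch of how such a proxy setting could be honored in a Node.js process using undici (the project's actual HTTP stack and proxy wiring may differ):

import { ProxyAgent, setGlobalDispatcher } from "undici";

// Route all undici/fetch traffic through the proxy when USE_PROXY is set,
// falling back to the documented default proxy address.
if (process.env.USE_PROXY === "true") {
  setGlobalDispatcher(new ProxyAgent(process.env.PROXY_URL ?? "http://127.0.0.1:10809"));
}

// Requests made with Node's global fetch now go through the proxy.
const res = await fetch("https://www.bing.com/search?q=test");
console.log(res.status);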
Issue reports and feature improvement suggestions are welcome!
If you want to fork this repository and publish your own Docker image, you will need the following configuration:
To enable automatic Docker image building and publishing, please add the following secrets in your GitHub repository settings (Settings → Secrets and variables → Actions):
Required Secrets:
GITHUB_TOKEN: automatically provided by GitHub (no setup needed)

Optional Secrets (for Alibaba Cloud ACR):
ACR_REGISTRY: your Alibaba Cloud Container Registry URL (e.g., registry.cn-hangzhou.aliyuncs.com)
ACR_USERNAME: your Alibaba Cloud ACR username
ACR_PASSWORD: your Alibaba Cloud ACR password
ACR_IMAGE_NAME: your image name in ACR (e.g., your-namespace/open-web-search)
The repository includes a GitHub Actions workflow (.github/workflows/docker.yml) that automatically builds and publishes images.

Trigger Conditions:
Push to the main branch
Push of version tags (v*)

Build and Push to:
GitHub Container Registry (ghcr.io)
Alibaba Cloud ACR (if configured)

Image Tags:
ghcr.io/your-username/open-web-search:latest
your-acr-address/your-image-name:latest (if ACR is configured)

After configuring the secrets, push to the main branch or create a version tag to trigger a build. Users can then run your image:
docker run -d --name web-search -p 3000:3000 -e ENABLE_CORS=true -e CORS_ORIGIN=* ghcr.io/your-username/open-web-search:latest
If you find this project helpful, please consider giving it a ⭐ Star!