Higress AI Search
A Model Context Protocol (MCP) server that provides an AI search tool to enhance AI model responses with real-time search results from various search engines, powered by the Higress ai-search feature.
Demo videos:

- https://github.com/user-attachments/assets/60a06d99-a46c-40fc-b156-793e395542bb
- https://github.com/user-attachments/assets/5c9e639f-c21c-4738-ad71-1a88cc0bcb46
The server can be configured using environment variables:
- `HIGRESS_URL` (optional): URL for the Higress service (default: `http://localhost:8080/v1/chat/completions`).
- `MODEL` (required): LLM model to use for generating responses.
- `INTERNAL_KNOWLEDGE_BASES` (optional): Description of internal knowledge bases.
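Since `HIGRESS_URL` points at an OpenAI-compatible chat-completions endpoint, the kind of request the server forwards can be sketched roughly as follows. This is an illustration only, not the project's actual implementation; the `httpx` dependency and the payload fields are assumptions based on the standard chat-completions format.

```python
import os

import httpx

# Read the same environment variables the MCP server is configured with.
HIGRESS_URL = os.getenv("HIGRESS_URL", "http://localhost:8080/v1/chat/completions")
MODEL = os.environ["MODEL"]  # required, e.g. "qwen-turbo"


def search_enhanced_completion(query: str) -> str:
    """Send an OpenAI-style chat-completions request through the Higress gateway,
    which enriches the prompt with real-time search results before the LLM answers."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": query}],
    }
    response = httpx.post(HIGRESS_URL, json=payload, timeout=60)
    response.raise_for_status()
    # Assumes the standard chat-completions response shape.
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(search_enhanced_completion("What happened in AI news today?"))
```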
{ "mcpServers": { "higress-ai-search-mcp-server": { "command": "uvx", "args": [ "higress-ai-search-mcp-server" ], "env": { "HIGRESS_URL": "http://localhost:8080/v1/chat/completions", "MODEL": "qwen-turbo", "INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents" } } } }
Using uv requires cloning the repository locally and specifying the path to the source code.
{ "mcpServers": { "higress-ai-search-mcp-server": { "command": "uv", "args": [ "--directory", "path/to/src/higress-ai-search-mcp-server", "run", "higress-ai-search-mcp-server" ], "env": { "HIGRESS_URL": "http://localhost:8080/v1/chat/completions", "MODEL": "qwen-turbo", "INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents" } } } }
This project is licensed under the MIT License - see the LICENSE file for details.