# Ragdocs
A Model Context Protocol (MCP) server that enables semantic search and retrieval of documentation using a vector database (Qdrant). This server allows you to add documentation from URLs or local files and then search through them using natural language queries.
## Quick Start

Install the package globally:
```bash
npm install -g @qpd-v/mcp-server-ragdocs
```
Start Qdrant (using Docker):
```bash
docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant
```
Ensure Ollama is running with the default embedding model:
```bash
ollama pull nomic-embed-text
```
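After pulling the model, you can verify that Ollama can actually serve embeddings with a direct API call (this uses Ollama's standard `/api/embeddings` endpoint, not anything specific to this server):

```shell
# Request an embedding for a test string; the model pulled above
# must be available locally for this to succeed.
curl -s http://localhost:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "hello world"}'
```

A successful response is a JSON object containing an `embedding` array of floats.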
Add to your configuration file:
Depending on your client, the settings file is:

- Cline: `%AppData%\Roaming\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json`
- Roo-Code: `%AppData%\Roaming\Code\User\globalStorage\rooveterinaryinc.roo-cline\settings\cline_mcp_settings.json`
- Claude Desktop: `%AppData%\Claude\claude_desktop_config.json`

```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["C:/Users/YOUR_USERNAME/AppData/Roaming/npm/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```
Verify installation:
```bash
# Check Qdrant is running
curl http://localhost:6333/collections

# Check Ollama has the model
ollama list | grep nomic-embed-text
```
Current version: 0.1.6
## Installation

Install globally using npm:
```bash
npm install -g @qpd-v/mcp-server-ragdocs
```
This will install the server in your global npm directory, which you'll need for the configuration steps below.
Start Qdrant (using Docker):

```bash
docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant
```
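The command above keeps vectors only for the container's lifetime. If you want indexed documentation to survive container restarts, you can mount a host directory at Qdrant's storage path (a general Qdrant/Docker pattern, not specific to this server):

```shell
# Persist Qdrant data in ./qdrant_storage on the host
docker run -p 6333:6333 -p 6334:6334 \
  -v "$(pwd)/qdrant_storage:/qdrant/storage" \
  qdrant/qdrant
```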
## Configuration

The server can be used with both Cline/Roo and Claude Desktop. Configuration differs slightly between them:
Add to your Cline settings file (`%AppData%\Roaming\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json`)

AND/OR

Add to your Roo-Code settings file (`%AppData%\Roaming\Code\User\globalStorage\rooveterinaryinc.roo-cline\settings\cline_mcp_settings.json`):
```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["C:/Users/YOUR_USERNAME/AppData/Roaming/npm/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```
For OpenAI instead of Ollama:
```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["C:/Users/YOUR_USERNAME/AppData/Roaming/npm/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "openai",
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```
If running from a local clone of the repository:

```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["PATH_TO_PROJECT/mcp-ragdocs/build/index.js"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```
Add to your Claude Desktop config file:

- Windows: `%AppData%\Claude\claude_desktop_config.json`
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`

Windows setup with Ollama:

```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "C:\\Program Files\\nodejs\\node.exe",
      "args": [
        "C:\\Users\\YOUR_USERNAME\\AppData\\Roaming\\npm\\node_modules\\@qpd-v/mcp-server-ragdocs\\build\\index.js"
      ],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```
Windows Setup with OpenAI:
```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "C:\\Program Files\\nodejs\\node.exe",
      "args": [
        "C:\\Users\\YOUR_USERNAME\\AppData\\Roaming\\npm\\node_modules\\@qpd-v/mcp-server-ragdocs\\build\\index.js"
      ],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "openai",
        "OPENAI_API_KEY": "your-openai-api-key"
      }
    }
  }
}
```
macOS setup with Ollama:

```json
{
  "mcpServers": {
    "ragdocs": {
      "command": "/usr/local/bin/node",
      "args": [
        "/usr/local/lib/node_modules/@qpd-v/mcp-server-ragdocs/build/index.js"
      ],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama",
        "OLLAMA_URL": "http://localhost:11434"
      }
    }
  }
}
```
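The `command` and `args` paths above vary between machines. On macOS/Linux you can look up the correct values for your own system with two standard commands:

```shell
# Absolute path to the node binary (use as "command")
which node

# Global npm module directory (the package lives under this path)
npm root -g
```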
For either Cline or Claude Desktop, when using Qdrant Cloud, modify the env section:
With Ollama:
```json
{
  "env": {
    "QDRANT_URL": "https://your-cluster-url.qdrant.tech",
    "QDRANT_API_KEY": "your-qdrant-api-key",
    "EMBEDDING_PROVIDER": "ollama",
    "OLLAMA_URL": "http://localhost:11434"
  }
}
```
With OpenAI:
```json
{
  "env": {
    "QDRANT_URL": "https://your-cluster-url.qdrant.tech",
    "QDRANT_API_KEY": "your-qdrant-api-key",
    "EMBEDDING_PROVIDER": "openai",
    "OPENAI_API_KEY": "your-openai-api-key"
  }
}
```
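To confirm the cloud credentials work before wiring them into the config, you can query the cluster directly; Qdrant authenticates REST requests via the `api-key` header (substitute your own cluster URL and key):

```shell
# Lists collections; an auth failure returns an error instead
curl -s https://your-cluster-url.qdrant.tech/collections \
  -H "api-key: your-qdrant-api-key"
```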
## Environment Variables

- `QDRANT_URL` (required): URL of your Qdrant instance
- `QDRANT_API_KEY` (required for cloud): Your Qdrant Cloud API key
- `EMBEDDING_PROVIDER` (optional): Choose between `ollama` (default) or `openai`
- `EMBEDDING_MODEL` (optional)
- `OLLAMA_URL` (optional): URL of your Ollama instance (defaults to `http://localhost:11434`)
- `OPENAI_API_KEY` (required if using OpenAI): Your OpenAI API key

## Tools

- `add_documentation`
  - `url`: URL of the documentation to fetch
- `search_documentation`
  - `query`: Search query
  - `limit` (optional): Maximum number of results to return (default: 5)
- `list_sources`
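MCP clients normally invoke these tools for you, but for reference, a call to `search_documentation` follows the standard MCP `tools/call` JSON-RPC shape over stdio, roughly like this (the query text is an illustrative placeholder):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_documentation",
    "arguments": {
      "query": "how do I authenticate?",
      "limit": 5
    }
  }
}
```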
## Usage

In Claude Desktop or any other MCP-compatible client, try prompts such as:

- Add this documentation: https://docs.example.com/api
- Search the documentation for information about authentication
- What documentation sources are available?
## Development

```bash
git clone https://github.com/qpd-v/mcp-server-ragdocs.git
cd mcp-server-ragdocs
npm install
npm run build
npm start
```
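During development you can exercise the server without a full client by using the MCP Inspector, which speaks the stdio protocol and provides a UI for listing and calling tools (run from the repository root after `npm run build`):

```shell
# Launch the inspector against the local build
npx @modelcontextprotocol/inspector node build/index.js
```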
## License

MIT
## Troubleshooting

**Qdrant Connection Error**

```
Error: Failed to connect to Qdrant at http://localhost:6333
```

Check that the Qdrant container is running:

```bash
docker ps | grep qdrant
```

**Ollama Model Missing**

```
Error: Model nomic-embed-text not found
```

Pull the model and verify it is installed:

```bash
ollama pull nomic-embed-text
ollama list
```

**Configuration Path Issues**

Make sure you have replaced `YOUR_USERNAME` with your actual Windows username.

**npm Global Install Issues**

Verify npm and the installed package:

```bash
npm -v
npm list -g @qpd-v/mcp-server-ragdocs
```

For other issues, please check:

```bash
docker logs $(docker ps -q --filter ancestor=qdrant/qdrant)
ollama list
node -v  # should be 16 or higher
```

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.