# MCP LLMS-TXT Documentation Server

An MCP server for accessing and retrieving llms.txt documentation sources over stdio and HTTP-SSE.
llms.txt is a website index for LLMs, providing background information, guidance, and links to detailed markdown files. IDEs like Cursor and Windsurf or apps like Claude Code/Desktop can use llms.txt to retrieve context for tasks. However, these apps use different built-in tools to read and process files like llms.txt. The retrieval process can be opaque, and there is not always a way to audit the tool calls or the context returned.
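For reference, an llms.txt file is plain markdown: a title, an optional blockquote summary, and sections of links to detailed markdown pages. A minimal sketch following the llmstxt.org convention (the project name, sections, and URLs here are illustrative, not real):

```
# ExampleLib

> A short summary of ExampleLib that gives an LLM background context.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): installation and first steps
- [API reference](https://example.com/docs/api.md): detailed API documentation
```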
MCP offers a way for developers to have full control over the tools used by these applications. Here, we create an open source MCP server to provide MCP host applications (e.g., Cursor, Windsurf, Claude Code/Desktop) with (1) a user-defined list of llms.txt files and (2) a simple `fetch_docs` tool to read URLs within any of the provided llms.txt files. This allows the user to audit each tool call as well as the context returned.
## Quickstart

Install uv:

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```
Choose an llms.txt file to use. For example, here we use the LangGraph llms.txt file. Then, run the server with your llms.txt file of choice:

```bash
uvx --from mcpdoc mcpdoc \
  --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt \
  --transport sse \
  --port 8082 \
  --host localhost
```
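With these flags, the server runs over SSE at http://localhost:8082.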
You can test the server with the MCP inspector:

```bash
npx @modelcontextprotocol/inspector
```
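The inspector gives you a UI; you can also drive the tools from a small script. Here is a minimal sketch using the official MCP Python SDK (the `mcp` package). The tool names `list_doc_sources` and `fetch_docs` come from this README, while the `url` argument name is an assumption:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch mcpdoc over stdio, the same way an MCP host application would.
server = StdioServerParameters(
    command="uvx",
    args=[
        "--from", "mcpdoc", "mcpdoc",
        "--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport", "stdio",
    ],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Audit what the server exposes, then call each tool.
            sources = await session.call_tool("list_doc_sources", {})
            print(sources.content)  # the configured llms.txt file(s)
            docs = await session.call_tool(
                "fetch_docs",
                {"url": "https://langchain-ai.github.io/langgraph/llms.txt"},
            )
            print(docs.content)  # fetched page content

asyncio.run(main())
```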
Either way, you can test tool calls and inspect the context returned before wiring up a host application.

## Connect to Cursor

- Open Cursor Settings and the MCP tab. This will open the `~/.cursor/mcp.json` file.
- Paste the following into the file (we use the `langgraph-docs-mcp` name and link to the LangGraph llms.txt):

```json
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport",
        "stdio",
        "--port",
        "8081",
        "--host",
        "localhost"
      ]
    }
  }
}
```

- Confirm that the server is running in your Cursor Settings/MCP tab.
- Open chat with CMD+L (on Mac).
- Ensure `agent` is selected.

Then, try an example prompt, such as:

```
use the langgraph-docs-mcp server to answer any LangGraph questions -- 
+ call list_doc_sources tool to get the available llms.txt file
+ call fetch_docs tool to read it
+ reflect on the urls in llms.txt 
+ reflect on the input question 
+ call fetch_docs on any urls relevant to the question
+ use this to answer the question
what are types of memory in LangGraph?
```

## Connect to Windsurf

- Open Cascade with CMD+L (on Mac).
- Click Configure MCP to open the config file, `~/.codeium/windsurf/mcp_config.json`.
- Update with the `langgraph-docs-mcp` server config, as noted above.
- Open Cascade with CMD+L (on Mac) and refresh MCP servers.
- Confirm that `langgraph-docs-mcp` shows as connected.

Then, try the example prompt above.
## Connect to Claude Desktop

- Open Settings/Developer to update `~/Library/Application Support/Claude/claude_desktop_config.json`.
- Update with the `langgraph-docs-mcp` server config, as noted above.
- Restart the Claude Desktop app.

Then, try the example prompt above.
## Connect to Claude Code

In a terminal after installing Claude Code, run this command to add the MCP server to your project:

```bash
claude mcp add-json langgraph-docs '{"type":"stdio","command":"uvx","args":["--from", "mcpdoc", "mcpdoc", "--urls", "langgraph:https://langchain-ai.github.io/langgraph/llms.txt"]}' -s local
```
You will see `~/.claude.json` updated. Test by launching Claude Code and running `/mcp` to view your tools:

```bash
$ claude
$ /mcp
```
Then, try the example prompt above.
## Command-line Interface

The `mcpdoc` command provides a simple CLI for launching the documentation server.
You can specify documentation sources in three ways, and these can be combined:
1. Using a YAML config file, such as the `sample_config.yaml` file in this repo:

   ```bash
   mcpdoc --yaml sample_config.yaml
   ```

2. Using a JSON config file, such as the `sample_config.json` file in this repo:

   ```bash
   mcpdoc --json sample_config.json
   ```

3. Directly specifying llms.txt URLs, either as plain URLs or with optional names in `name:url` format. This is how we loaded the llms.txt for the MCP server above:

   ```bash
   mcpdoc --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt
   ```
You can also combine these methods to merge documentation sources:
```bash
mcpdoc --yaml sample_config.yaml --json sample_config.json --urls https://langchain-ai.github.io/langgraph/llms.txt
```
Additional options:

- `--follow-redirects`: Follow HTTP redirects (defaults to False)
- `--timeout SECONDS`: HTTP request timeout in seconds (defaults to 10.0)

Example with additional options:
```bash
mcpdoc --yaml sample_config.yaml --follow-redirects --timeout 15
```
This will load the LangGraph Python documentation with a 15-second timeout and follow any HTTP redirects if necessary.
## Configuration Format

Both YAML and JSON configuration files should contain a list of documentation sources.
Each source must include an llms_txt URL and can optionally include a name:
YAML example:

```yaml
# Sample configuration for mcp-mcpdoc server
# Each entry must have a llms_txt URL and optionally a name
- name: LangGraph Python
  llms_txt: https://langchain-ai.github.io/langgraph/llms.txt
```
[ { "name": "LangGraph Python", "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt" } ]
## Programmatic Usage

```python
from mcpdoc.main import create_server

# Create a server with documentation sources
server = create_server(
    [
        {
            "name": "LangGraph Python",
            "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt",
        },
        # You can add multiple documentation sources
        # {
        #     "name": "Another Documentation",
        #     "llms_txt": "https://example.com/llms.txt",
        # },
    ],
    follow_redirects=True,
    timeout=15.0,
)

# Run the server
server.run(transport="stdio")
```
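To use this from a host application, save the snippet as a script (the filename, say `server.py`, is up to you) and point the host's `command`/`args` config at `python server.py`, analogous to the `uvx`-based configs above.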