MCP LLMS-TXT Documentation Server

Overview

llms.txt is an index of website contents for LLMs. As an example, LangGraph's llms.txt provides a list of LangGraph doc URLs with descriptions. An LLM can use this file to decide which docs to read when accomplishing tasks, which pairs well with IDE agents like Cursor and Windsurf or apps like Claude Code/Desktop.
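
For illustration, an llms.txt file is simply a markdown index of links with short descriptions. A hypothetical excerpt (the entries below are illustrative, not copied from the real LangGraph file):

# LangGraph

- [LangGraph quickstart](https://langchain-ai.github.io/langgraph/tutorials/introduction/): Build a basic chatbot graph
- [Memory](https://langchain-ai.github.io/langgraph/concepts/memory/): Short-term and long-term memory in LangGraph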

However, these apps use different built-in tools to read and process files like llms.txt; sometimes IDEs will reflect on the llms.txt file and use it to formulate web search queries rather than simply retrieving the URLs listed! More broadly, there can be poor visibility into what apps are doing with their built-in retrieval / search tools.

MCP offers a way for developers to define tools that give full control over how context is retrieved and displayed to LLMs in these apps. Here, we create a simple MCP server that defines a few tools that these apps can use, such as a list_doc_sources tool to load any llms.txt you provide and a fetch_docs tool to read any URLs within llms.txt. This simple MCP server has two benefits: (1) it allows the user to customize context retrieval, and (2) it allows the user to audit each tool call as well as the context returned.
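
Conceptually, these two tools are thin wrappers around the configured llms.txt index and an HTTP fetch. A minimal Python sketch of the idea (illustrative only; the actual mcpdoc implementation differs in its details):

import httpx

# Illustrative source registry; mcpdoc builds this from --urls or config files.
DOC_SOURCES = {"LangGraph": "https://langchain-ai.github.io/langgraph/llms.txt"}

def list_doc_sources() -> str:
    """Return the configured llms.txt sources so the model can pick one."""
    return "\n".join(f"{name}: {url}" for name, url in DOC_SOURCES.items())

def fetch_docs(url: str) -> str:
    """Fetch a URL listed in llms.txt and return its raw contents."""
    response = httpx.get(url, timeout=10.0)
    response.raise_for_status()
    return response.text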

Quickstart

Install uv:

curl -LsSf https://astral.sh/uv/install.sh | sh

Select an llms.txt file to use.

  • For example, here's the LangGraph llms.txt file.

Run the MCP server locally with your llms.txt file of choice:

uvx --from mcpdoc mcpdoc \
    --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt \
    --transport sse \
    --port 8082 \
    --host localhost
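
With these flags the server runs over SSE; the MCP Python SDK's SSE transport typically exposes its endpoint at http://localhost:8082/sse, which is the URL to point the inspector at below.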

Run MCP inspector and connect to the running server:

npx @modelcontextprotocol/inspector

Here, you can test the tool calls.
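
For example, calling list_doc_sources should return the LangGraph entry configured above, and passing its URL to fetch_docs should return the contents of that llms.txt file.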

Finally, add the server to any MCP host applications of interest.

Below, we walk through each one, but here are the config files that are updated for each:

*Cursor*
`~/.cursor/mcp.json` 

*Windsurf*
`~/.codeium/windsurf/mcp_config.json`
 
*Claude Desktop*
`~/Library/Application\ Support/Claude/claude_desktop_config.json`
 
*Claude Code*
`~/.claude.json`

These will be updated with our server, as shown below.

NOTE: It appears that stdio transport is required for Windsurf and Cursor.

{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport",
        "stdio",
        "--port",
        "8081",
        "--host",
        "localhost"
      ]
    }
  }
}

Usage

Cursor

Setup:

  • Settings -> MCP to add a server.
  • Update ~/.cursor/mcp.json with langgraph-docs-mcp as noted above.
  • Settings -> MCP to confirm that the server is connected.
  • Control-L to open chat.
  • Ensure agent is selected.

Then, try an example prompt:

use the langgraph-docs-mcp server to answer any LangGraph questions -- 
+ call list_doc_sources tool to get the available llms.txt file
+ call fetch_docs tool to read it
+ reflect on the urls in llms.txt 
+ reflect on the input question 
+ call fetch_docs on any urls relevant to the question
+ use this to answer the question

what are types of memory in LangGraph?

  • It will ask to approve tool calls as it processes your question.

  • Consider adding some of these instructions to Cursor Rules.

Windsurf

Setup:

  • Control-L to open Cascade and click Configure MCP to open the config file.
  • Update ~/.codeium/windsurf/mcp_config.json with langgraph-docs-mcp as noted above.
  • Control-L to open Cascade and refresh MCP servers.
  • Available MCP servers will be listed, showing langgraph-docs-mcp as connected.

Then, try the example prompt:

  • It will perform your tool calls.

Claude Desktop

Setup:

  • Open Settings -> Developer to update ~/Library/Application\ Support/Claude/claude_desktop_config.json.
  • Restart Claude.

  • Your tools will be visible in the bottom right of your chat input.

Then, try the example prompt:

  • It will ask to approve tool calls as it processes your request.

Claude Code

Setup:

  • After installing Claude Code, run the following in a terminal to add the MCP server to your project:
claude mcp add-json langgraph-docs '{"type":"stdio","command":"uvx","args":["--from", "mcpdoc", "mcpdoc", "--urls", "langgraph:https://langchain-ai.github.io/langgraph/llms.txt"]}' -s project
  • You will see ~/.claude.json updated.
  • Test by launching Claude Code and running /mcp to view your tools:
$ claude
$ /mcp

Then, try the example prompt:

  • It will ask to approve tool calls.

Command-line Interface

The mcpdoc command provides a simple CLI for launching the documentation server. You can specify documentation sources in three ways, and these can be combined:

  1. Using a YAML config file:
mcpdoc --yaml sample_config.yaml

This will load the LangGraph Python documentation from the sample_config.yaml file.

  2. Using a JSON config file:
mcpdoc --json sample_config.json

This will load the LangGraph Python documentation from the sample_config.json file.

  3. Directly specifying llms.txt URLs with optional names:
mcpdoc --urls https://langchain-ai.github.io/langgraph/llms.txt LangGraph:https://langchain-ai.github.io/langgraph/llms.txt

URLs can be specified either as plain URLs or with optional names using the format name:url.

You can also combine these methods to merge documentation sources:

mcpdoc --yaml sample_config.yaml --json sample_config.json --urls https://langchain-ai.github.io/langgraph/llms.txt

Additional Options

  • --follow-redirects: Follow HTTP redirects (defaults to False)
  • --timeout SECONDS: HTTP request timeout in seconds (defaults to 10.0)

Example with additional options:

mcpdoc --yaml sample_config.yaml --follow-redirects --timeout 15

This will load the LangGraph Python documentation with a 15-second timeout and follow any HTTP redirects if necessary.

Configuration Format

Both YAML and JSON configuration files should contain a list of documentation sources. Each source must include an llms_txt URL and can optionally include a name:

YAML Configuration Example (sample_config.yaml)

# Sample configuration for the mcpdoc server
# Each entry must have an llms_txt URL and optionally a name
- name: LangGraph Python
  llms_txt: https://langchain-ai.github.io/langgraph/llms.txt

JSON Configuration Example (sample_config.json)

[
  {
    "name": "LangGraph Python",
    "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt"
  }
]

Programmatic Usage

from mcpdoc.main import create_server

# Create a server with documentation sources
server = create_server(
    [
        {
            "name": "LangGraph Python",
            "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt",
        },
        # You can add multiple documentation sources
        # {
        #     "name": "Another Documentation",
        #     "llms_txt": "https://example.com/llms.txt",
        # },
    ],
    follow_redirects=True,
    timeout=15.0,
)

# Run the server
server.run(transport="stdio")
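
If the object returned by create_server is a standard MCP FastMCP server (an assumption consistent with the stdio call above), the SSE transport from the Quickstart should also work programmatically:

# Assumption: the server object supports the same transports as the CLI,
# so SSE (as used in the Quickstart) can be selected instead of stdio.
server.run(transport="sse")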