MCP LLMS-TXT Documentation Server

A Model Context Protocol (MCP) server for serving documentation from llms.txt files.
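
An llms.txt file is a plain Markdown index of documentation links that language models can consume. A minimal illustrative example (the URLs below are hypothetical) looks like this:

# LangGraph

> Framework for building stateful, multi-actor LLM applications.

## Docs

- [Quickstart](https://example.com/langgraph/quickstart.md): Build your first graph
- [Concepts](https://example.com/langgraph/concepts.md): Core ideas and terminology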

Installation

pip install mcp-mcpdoc
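
If you prefer an isolated environment, an illustrative setup is:

python -m venv .venv
source .venv/bin/activate
pip install mcp-mcpdoc
mcpdoc --help

Assuming the CLI follows standard conventions, mcpdoc --help prints the flags described below.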

Usage

Command-line Interface

The mcpdoc command provides a simple CLI for launching the documentation server. You can specify documentation sources in three ways, and these can be combined:

  1. Using a YAML config file:
mcpdoc --yaml sample_config.yaml

This will load the LangGraph Python documentation from the sample_config.yaml file.

  2. Using a JSON config file:
mcpdoc --json sample_config.json

This will load the LangGraph Python documentation from the sample_config.json file.

  3. Directly specifying llms.txt URLs with optional names:
mcpdoc --urls https://langchain-ai.github.io/langgraph/llms.txt LangGraph:https://langchain-ai.github.io/langgraph/llms.txt

URLs can be specified either as plain URLs or with optional names using the format name:url.

You can also combine these methods to merge documentation sources:

mcpdoc --yaml sample_config.yaml --json sample_config.json --urls https://langchain-ai.github.io/langgraph/llms.txt

Additional Options

  • --follow-redirects: Follow HTTP redirects (defaults to False)
  • --timeout SECONDS: HTTP request timeout in seconds (defaults to 10.0)

Example with additional options:

mcpdoc --yaml sample_config.yaml --follow-redirects --timeout 15

This will load the LangGraph Python documentation with a 15-second timeout and follow any HTTP redirects if necessary.

Configuration Format

Both YAML and JSON configuration files should contain a list of documentation sources. Each source must include an llms_txt URL and can optionally include a name:

YAML Configuration Example (sample_config.yaml)

# Sample configuration for mcp-mcpdoc server
# Each entry must have an llms_txt URL and optionally a name
- name: LangGraph Python
  llms_txt: https://langchain-ai.github.io/langgraph/llms.txt

JSON Configuration Example (sample_config.json)

[
  {
    "name": "LangGraph Python",
    "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt"
  }
]

Programmatic Usage

from mcpdoc.main import create_server

# Create a server with documentation sources
server = create_server(
    [
        {
            "name": "LangGraph Python",
            "llms_txt": "https://langchain-ai.github.io/langgraph/llms.txt",
        },
        # You can add multiple documentation sources
        # {
        #     "name": "Another Documentation",
        #     "llms_txt": "https://example.com/llms.txt",
        # },
    ],
    follow_redirects=True,
    timeout=15.0,
)

# Run the server
server.run(transport="stdio")
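
Because the example above runs the server over the stdio transport, an MCP-capable client (for example an IDE assistant) typically launches it as a subprocess. As an illustrative sketch only, using a hypothetical server name, a client that reads the common mcpServers JSON layout could be pointed at mcpdoc like this; consult your client's documentation for the exact file location and schema:

{
  "mcpServers": {
    "langgraph-docs": {
      "command": "mcpdoc",
      "args": ["--urls", "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt"]
    }
  }
}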