Mirror of https://github.com/langchain-ai/mcpdoc.git, synced 2025-10-19 03:18:14 +03:00

Minor update (README.md)
## Overview
[llms.txt](https://llmstxt.org/) is an index of website contents for LLMs. As an example, [LangGraph's llms.txt](https://langchain-ai.github.io/langgraph/llms.txt) provides a list of LangGraph doc URLs with descriptions. An LLM can use this file to decide which docs to read when accomplishing tasks, which pairs well with IDE agents like Cursor and Windsurf or apps like Claude Code/Desktop.
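As context, an `llms.txt` file is just a markdown index: a title plus a list of links with one-line descriptions. A minimal sketch of the format (these entries are invented for illustration, not copied from the LangGraph file):

```
# Project Name

> One-line summary of the project.

- [Quickstart](https://example.com/docs/quickstart): Build a first app
- [Concepts](https://example.com/docs/concepts): Core concepts and terminology
```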
However, these apps use different built-in tools to read and process files like `llms.txt`; sometimes IDEs will reflect on the `llms.txt` file and use it to formulate *web search queries* rather than just retrieving the URLs listed! More broadly, there can be poor visibility into what apps are doing with their built-in retrieval / search tools.
[MCP](https://github.com/modelcontextprotocol) offers a way for developers to define tools that give *full control* over how context is retrieved and displayed to LLMs in these apps. Here, we create [a simple MCP server](https://github.com/modelcontextprotocol) that defines a few **tools** that these apps can use, such as a `list_doc_sources` tool to load any `llms.txt` you provide and a `fetch_docs` tool to read any URLs within `llms.txt`. This simple MCP server has two benefits: (1) it allows the user to customize context retrieval and (2) it allows the user to audit each tool call as well as the context returned.
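To make the two tools concrete, here is a rough shell sketch of what they do conceptually. This is not the server's actual implementation, and the index contents below are invented placeholders:

```bash
# list_doc_sources (conceptually): show the llms.txt index the server was given.
cat > sample_llms.txt <<'EOF'
# Sample Docs
- [Quickstart](https://example.com/docs/quickstart): Getting started guide
- [Memory](https://example.com/docs/memory): Memory concepts
EOF
cat sample_llms.txt

# fetch_docs (conceptually): pick one of the listed URLs and retrieve it.
# Extracting the candidate URLs from the index:
grep -o 'https://[^)]*' sample_llms.txt
```

In the real server these are MCP tool calls, so the host app decides when to invoke them and the user can see (and approve) each call.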

## Quickstart
Install uv:
* Please see [official uv docs](https://docs.astral.sh/uv/getting-started/installation/#installation-methods) for other ways to install `uv`.
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```
Select an `llms.txt` file to use.
* For example, [here's](https://langchain-ai.github.io/langgraph/llms.txt) the LangGraph `llms.txt` file.
Run the MCP server locally with your `llms.txt` file of choice:
```bash
uvx --from mcpdoc mcpdoc \
--urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt \
--transport sse \
--port 8082 \
--host localhost
```
* This should run at: http://localhost:8082
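Multiple `llms.txt` sources can be served at once by passing additional `name:url` pairs to `--urls`. A sketch of such an invocation (the second index URL below is an assumed example; substitute whichever `llms.txt` files you want to expose):

```bash
uvx --from mcpdoc mcpdoc \
    --urls LangGraph:https://langchain-ai.github.io/langgraph/llms.txt \
           LangChain:https://python.langchain.com/llms.txt \
    --transport sse \
    --port 8082 \
    --host localhost
```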
< add photo >
Run [MCP inspector](https://modelcontextprotocol.io/docs/tools/inspector) and connect to the running server:
```bash
npx @modelcontextprotocol/inspector
```
< add photo >
Here, you can test the `tool` calls.
Finally, add the server to any MCP host applications of interest.

Below, we walk through each one, but here are the config files that are updated:
```
`~/.claude.json`
```
These will be updated with our server, as shown below.
> NOTE: It appears that `stdio` transport is required for Windsurf and Cursor.
```
{
  "mcpServers": {
    "langgraph-docs-mcp": {
      "command": "uvx",
      "args": [
        "--from",
        "mcpdoc",
        "mcpdoc",
        "--urls",
        "langgraph:https://langchain-ai.github.io/langgraph/llms.txt",
        "--transport",
        "stdio"
      ]
    }
  }
}
```

### Cursor
Setup:
* `Settings -> MCP` to add a server.
* Update `~/.cursor/mcp.json` with `langgraph-docs-mcp` as noted above.
* `Settings -> MCP` to confirm that the server is connected.
* `Control-L` to open chat.
* Ensure `agent` is selected.

Then, try the example prompt:

```
use the langgraph-docs-mcp server to answer any LangGraph questions --
what are types of memory in LangGraph?
```
* It will ask to approve tool calls as it processes your question.

* Consider adding some of these instructions to [Cursor Rules](https://docs.cursor.com/context/rules-for-ai).
### Windsurf
Setup:
* `Control-L` to open Cascade and click `Configure MCP` to open the config file.
* Update `~/.codeium/windsurf/mcp_config.json` with `langgraph-docs-mcp` as noted above.
* `Control-L` to open Cascade and refresh MCP servers.
* Available MCP servers will be listed, showing `langgraph-docs-mcp` as connected.

### Claude Desktop
Setup:
* Open `Settings -> Developer` to update `~/Library/Application\ Support/Claude/claude_desktop_config.json`.
* Restart Claude.

* You will see your tools visible in the bottom right of your chat input.

Then, try the example prompt:
* It will ask to approve tool calls as it processes your request.

### Claude Code
Setup:
* In a terminal, after installing [Claude Code](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/overview), run the following to add the MCP server to your project:
```
claude mcp add-json langgraph-docs '{"type":"stdio","command":"uvx","args":["--from", "mcpdoc", "mcpdoc", "--urls", "langgraph:https://langchain-ai.github.io/langgraph/llms.txt"]}' -s project
```
* You will see `~/.claude.json` updated.
* Test by launching Claude Code and running the following to view your tools:
```
$ claude
$ /mcp
```
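After the `add-json` command above runs, the project-scoped config should contain an entry equivalent to the JSON passed on the command line, along these lines (shape shown for orientation, mirroring the command's argument; the generated file may include additional keys):

```
{
  "mcpServers": {
    "langgraph-docs": {
      "type": "stdio",
      "command": "uvx",
      "args": ["--from", "mcpdoc", "mcpdoc", "--urls", "langgraph:https://langchain-ai.github.io/langgraph/llms.txt"]
    }
  }
}
```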