Adds Release links to README and revised screenshot

Matte Noble
2025-04-25 15:16:28 -07:00
parent 65ae349558
commit 8a6d2ef646
2 changed files with 2 additions and 2 deletions


@@ -20,13 +20,13 @@ This is our very first Technical Preview so bear in mind things will be rough ar
 - MacOS (Sequoia 15.0 or higher recommended)
 - [Ollama](https://ollama.com/) (Either local or remote, you can configure any Ollama URL in settings)
-- [Download the latest release of Tome](#)
+- [Download the latest release of Tome](https://github.com/runebookai/tome/releases/download/v0.1.0/Tome_0.1.0_aarch64.dmg)
 ## Quickstart
 We'll be updating our [home page](https://runebook.ai) in the coming weeks with docs and an end-to-end tutorial, here's a quick getting started guide in the meantime.
-1. Install [Tome](#) and [Ollama](https://ollama.com)
+1. Install [Tome](https://github.com/runebookai/tome/releases/download/v0.1.0/Tome_0.1.0_aarch64.dmg) and [Ollama](https://ollama.com)
 2. Install a [Tool supported model](https://ollama.com/search?c=tools) (we're partial to Qwen2.5, either 14B or 7B depending on your RAM)
 3. Open the MCP tab in Tome and install your first [MCP server](https://github.com/modelcontextprotocol/servers) (Fetch is an easy one to get started with, just paste `uvx mcp-server-fetch` into the server field)
 4. Chat with your MCP-powered model! Ask it to fetch the top story on Hacker News.
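
For reference, the command-line side of steps 2 and 3 in the updated quickstart boils down to a couple of shell commands. A minimal sketch, assuming Ollama and `uv` are already installed; the `qwen2.5:7b` tag is just one example of a tool-capable model, swap in the 14B variant if your RAM allows:

```sh
# Step 2: pull a tool-capable model with Ollama (example tag)
ollama pull qwen2.5:7b

# Step 3: the command to paste into the server field of Tome's MCP tab
uvx mcp-server-fetch
```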

Binary file not shown (screenshot updated: 995 KiB before, 934 KiB after)