Quickstart

  1. Install Tome
  2. Connect your preferred LLM provider. OpenAI, Ollama, and Gemini are preset, but you can also add other providers, such as LM Studio, by using http://localhost:1234/v1 as the URL.
  3. Open the MCP tab in Tome and install your first MCP server. (Fetch is an easy one to get started with; just paste uvx mcp-server-fetch into the server field.)
  4. Chat with your MCP-powered model! Ask it to fetch the top story on Hacker News.
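The string you paste into the server field in step 3 is simply a launch command. As a rough sketch of the common MCP client convention (command plus arguments, as used in Claude Desktop-style configs; Tome's internal format may differ), it breaks down like this:

```python
import json
import shlex

# The string pasted into Tome's server field is a launch command.
server_field = "uvx mcp-server-fetch"

# Split it into the command + args shape many MCP clients use in their
# configuration files (a common convention, not necessarily Tome's own).
command, *args = shlex.split(server_field)

config = {"mcpServers": {"fetch": {"command": command, "args": args}}}
print(json.dumps(config, indent=2))
```

Here uvx (from the uv toolchain) downloads and runs the mcp-server-fetch package on demand, which is why a single pasted command is enough to install the server.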

Next steps

Now that you have Tome installed, explore these key features:

Chat with Tome

Learn about how to chat with Tome.

Create an App

Learn how to create and use apps.

Learn about Models

Learn about the types of models you can use.

Use MCP Servers

Learn how to use MCP servers with Tome.