Requirements
- macOS or Windows
- An LLM provider of your choice: Ollama (local) and a Gemini API key are both easy, free options
- Download the latest release of Tome
Quickstart
- Install Tome
- Connect your preferred LLM provider - OpenAI, Ollama, and Gemini are preset, but you can also add OpenAI-compatible providers like LM Studio by using `http://localhost:1234/v1` as the URL
- Open the MCP tab in Tome and install your first MCP server (Fetch is an easy one to get started with; just paste `uvx mcp-server-fetch` into the server field)
- Chat with your MCP-powered model! Ask it to fetch the top story on Hacker News.
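If you are curious what the custom provider URL is for: providers added this way speak the OpenAI-compatible chat completions API. The sketch below shows the request shape Tome would send to such an endpoint; the base URL matches the LM Studio default mentioned above, but the model name and message are illustrative assumptions, and nothing is actually sent.

```python
import json

# Sketch of an OpenAI-compatible chat request to a local provider.
# The base URL is LM Studio's default; the model name is a placeholder
# for whatever model the local server has loaded.
base_url = "http://localhost:1234/v1"
endpoint = f"{base_url}/chat/completions"

payload = {
    "model": "local-model",  # assumption: replace with your loaded model
    "messages": [
        {"role": "user", "content": "Fetch the top story on Hacker News."}
    ],
}

body = json.dumps(payload)
print(endpoint)  # http://localhost:1234/v1/chat/completions
print(body)
```

Any server exposing this endpoint shape (LM Studio, llama.cpp's server, and others) can be added as a provider the same way.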
Next steps
Now that you have Tome installed, explore these key features:
Chat with Tome
Learn how to chat with Tome.
Create an App
Learn how to create and use apps.
Learn about Models
Learn about the types of models you can use.
Use MCP Servers
Learn how to use MCP servers with Tome.
