Requirements
- macOS or Windows
- An LLM provider of your choice: Ollama or a Gemini API key are easy, free options
- Download the latest release of Tome
Quickstart
- Install Tome
- Connect your preferred LLM provider. OpenAI, Ollama, and Gemini are preset, but you can also add providers such as LM Studio by using http://localhost:1234/v1 as the URL.
- Open the MCP tab in Tome and install your first MCP server (Fetch is an easy one to start with; just paste `uvx mcp-server-fetch` into the server field).
- Chat with your MCP-powered model! Ask it to fetch the top story on Hacker News.
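For reference, the two values the steps above ask you to paste can be summarized as a small settings fragment. The URL and the server command come from the list above; the field labels here are illustrative and may differ slightly between Tome versions:

```
# Custom LLM provider (e.g. LM Studio's OpenAI-compatible endpoint)
Provider URL:   http://localhost:1234/v1

# MCP server field (Fetch server, run via uv's `uvx` launcher)
Server command: uvx mcp-server-fetch
```

Note that `uvx` downloads and runs `mcp-server-fetch` on demand, so the Fetch server needs no separate installation step beyond having uv available.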
