oterm, the text-based terminal client for Ollama.
- intuitive and simple terminal UI, no need to run servers or frontends, just type `oterm` in your terminal.
- multiple persistent chat sessions, stored together with system prompt & parameter customizations in sqlite.
- can use any of the models you have pulled in Ollama, or your own custom models (see the example after this list).
- allows for easy customization of the model's system prompt and parameters.
- supports tools integration for providing external information to the model.
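As a quick illustration of the model bullets above (assuming Ollama is already installed and running locally, and using `llama3.1` purely as an example model name), a typical first run looks like this:

```bash
# pull a model with Ollama; any model you have pulled becomes selectable in oterm
ollama pull llama3.1

# start the terminal client
oterm
```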
You can run oterm without installing it, using uvx:

```bash
uvx oterm
```
See Installation for more details.
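If you prefer a persistent install rather than an ad hoc run, a minimal sketch (assuming the package is published on PyPI under the name `oterm`) is:

```bash
# install oterm as a standalone command-line tool (assumes the PyPI package name `oterm`)
uv tool install oterm

# or, equivalently, with pipx
pipx install oterm
```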
- Create custom commands that can be run from the terminal using oterm. Each of these commands is a chat, customized to your liking and connected to the tools of your choice.
- Support for Model Context Protocol (MCP) tools. You can now use any of the MCP tools to provide external information to the model (see the configuration sketch after this list).
- Support for the `<thinking/>` tag in reasoning models.
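As a rough sketch of how an MCP tool might be wired up, the example below registers the reference `git` MCP server (the same one shown in the screenshots below). It assumes oterm follows the common `mcpServers` JSON convention for declaring MCP servers and that the `mcp-server-git` package is runnable via `uvx`; the repository path is a placeholder, and the exact configuration file and schema are described in the MCP documentation.

```json
{
  "mcpServers": {
    "git": {
      "command": "uvx",
      "args": ["mcp-server-git", "--repository", "/path/to/your/repo"]
    }
  }
}
```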
Screenshots:

- The splash screen animation that greets users when they start oterm.
- A view of the chat interface, showcasing the conversation between the user and the model.
- The model selection screen, allowing users to choose and customize available models.
- oterm using the `git` MCP server to access its own repo.
- The image selection interface, demonstrating how users can include images in their conversations.
- oterm supports multiple themes, allowing users to customize the appearance of the interface.
This project is licensed under the MIT License.