A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server, model switching, streaming responses, tool management, human-in-the-loop, thinking mode, model params config, MCP prompts, custom system prompt and saved preferences. Built for developers working with local LLMs.

agentic-ai ai command-line-tool generative-ai linux llm local-llm macos mcp mcp-client mcp-server model-context-protocol ollama open-source pypi-package sse stdio streamable-http tool-management windows
3 Open Issues Need Help · Python · Last updated: Jan 13, 2026

AI Summary: This issue proposes adding a setting to control the minimum input length for messages sent to the Ollama model. Currently, there's a hardcoded minimum, which the user finds restrictive for simple testing scenarios like sending a single word. Allowing this to be configurable would provide more flexibility.

Complexity: 2/5
Labels: enhancement, good first issue
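The change the issue asks for boils down to replacing a hardcoded constant with a value read from the user's saved preferences. A minimal sketch of that pattern is below; the key name `min_input_length` and the helper functions are hypothetical illustrations, not ollmcp's actual API.

```python
# Sketch of a configurable minimum input length, as the issue proposes.
# The config key and function names here are assumptions for illustration.
import json
from pathlib import Path

# Hypothetical default that would replace the current hardcoded minimum.
DEFAULT_MIN_INPUT_LENGTH = 1

def min_input_length(config_path: Path) -> int:
    """Read min_input_length from a JSON config, falling back to the default."""
    try:
        config = json.loads(config_path.read_text())
    except (FileNotFoundError, json.JSONDecodeError):
        return DEFAULT_MIN_INPUT_LENGTH
    return int(config.get("min_input_length", DEFAULT_MIN_INPUT_LENGTH))

def is_sendable(message: str, minimum: int) -> bool:
    """With minimum=1, a single word like 'hi' is accepted for quick tests."""
    return len(message.strip()) >= minimum
```

With the default lowered to 1 and user-overridable, the single-word testing scenario from the issue would pass validation.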

AI Summary: The `ollmcp` client sends a `prompts/list` request to MCP servers even when the server does not support prompts, which can crash the client when the server returns an error response. The issue suggests that the client should check the server's advertised capabilities and skip unsupported requests instead of making them unconditionally.

Complexity: 2/5
Labels: bug, good first issue
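MCP servers declare what they support (prompts, tools, resources) during initialization, so the fix amounts to gating the request on that advertisement and degrading gracefully on failure. The sketch below assumes a session object exposing `server_capabilities` and `list_prompts()`; the exact attribute shape is illustrative, not the MCP SDK's precise API.

```python
# Hedged sketch of capability gating, as the issue suggests.
# The session object and capability attributes are illustrative assumptions.
def list_prompts_safely(session) -> list:
    """Only request prompts if the server advertised the prompts capability."""
    caps = getattr(session, "server_capabilities", None)
    if not caps or not getattr(caps, "prompts", None):
        # Server never advertised prompts; skip the request entirely.
        return []
    try:
        return session.list_prompts()
    except Exception:
        # Degrade gracefully instead of crashing the TUI on an error response.
        return []
```

A server without the prompts capability then simply shows an empty prompt list rather than triggering a crash.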

AI Summary: This issue requests the ability to configure the host for a remote Ollama server directly within the `config.json` file. Currently, the host can only be specified as a command-line argument, and there's no apparent way to set a default remote host for persistent use.

Complexity: 2/5
Labels: good first issue, feature request
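One way to support this is a resolution order where an explicit command-line flag wins, then the config file, then the environment, then Ollama's standard default of `http://localhost:11434`. The sketch below assumes a config key named `ollama_host`; both the key name and the helper are hypothetical, not ollmcp's current behavior.

```python
# Illustrative host resolution: CLI flag > config.json > OLLAMA_HOST env > default.
# The "ollama_host" config key is an assumption for this sketch.
import json
import os
from pathlib import Path
from typing import Optional

DEFAULT_HOST = "http://localhost:11434"  # Ollama's standard local default

def resolve_host(cli_host: Optional[str], config_path: Path) -> str:
    """Pick the Ollama host from the highest-priority source available."""
    if cli_host:
        return cli_host  # an explicit --host flag always wins
    try:
        config = json.loads(config_path.read_text())
        if config.get("ollama_host"):
            return config["ollama_host"]  # persistent remote host from config.json
    except (FileNotFoundError, json.JSONDecodeError):
        pass
    return os.environ.get("OLLAMA_HOST", DEFAULT_HOST)
```

This ordering keeps one-off overrides cheap while letting a remote default persist across sessions in `config.json`, which is exactly the gap the issue describes.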
