Open Issues Needing Help
AI Summary: This issue proposes adding a setting to control the minimum input length for messages sent to the Ollama model. Currently, there's a hardcoded minimum, which the user finds restrictive for simple testing scenarios like sending a single word. Allowing this to be configurable would provide more flexibility.
A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server, model switching, streaming responses, tool management, human-in-the-loop, thinking mode, model params config, MCP prompts, custom system prompt and saved preferences. Built for developers working with local LLMs.
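One way such a setting could behave, sketched with hypothetical names (this is not ollmcp's actual code — the function and the `min_length` parameter are illustrative):

```python
def validate_input(text: str, min_length: int = 1) -> bool:
    """Return True if the stripped message meets the configured minimum.

    min_length stands in for the proposed setting: a value of 1 would
    allow single-word test messages, while a larger value reproduces
    the current stricter, hardcoded behavior.
    """
    return len(text.strip()) >= min_length
```

With `min_length=1`, `validate_input("hi")` passes, while a whitespace-only message is still rejected; raising `min_length` restores the stricter check.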
AI Summary: The `ollmcp` client incorrectly requests the `/prompts/list` endpoint from MCP servers, even if the server doesn't support it. This can lead to crashes when the server returns an error response. The issue suggests that the client should gracefully handle unsupported capabilities instead of making unnecessary requests.
AI Summary: This issue requests the ability to configure the host for a remote Ollama server directly within the `config.json` file. Currently, the host can only be specified as a command-line argument, and there's no apparent way to set a default remote host for persistent use.