THE Copilot in Obsidian

6.4K stars · 583 forks · 6.4K watchers · TypeScript · GNU Affero General Public License v3.0
#ai #aiagent #chatgpt #copilot #obsidian-plugin
4 open issues need help · Last updated: Mar 11, 2026

Open Issues Need Help


AI Summary: The user is asking how prompt caching is implemented within the plugin, specifically when using OpenRouter as a provider. They want to know if the plugin enables prompt caching by default, as it's important for reducing token costs with certain hosted models.

Complexity: 2/5
Labels: good first issue, question
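As a rough illustration of what the asker is after, caching-capable providers reached through OpenRouter typically accept Anthropic-style `cache_control` breakpoints on message content, so the (often long) system prompt can be cached across requests. The sketch below only builds such a request body; the helper name `buildCachedRequest`, the message shape, and the model choice are assumptions for illustration, not the plugin's actual code.

```typescript
type CacheControl = { type: "ephemeral" };
type ContentPart = { type: "text"; text: string; cache_control?: CacheControl };
type ChatMessage = { role: string; content: ContentPart[] };

// Hypothetical helper: mark the system prompt as a cacheable prefix.
function buildCachedRequest(
  systemPrompt: string,
  userPrompt: string
): { model: string; messages: ChatMessage[] } {
  return {
    model: "anthropic/claude-3.5-sonnet", // assumption: any caching-capable hosted model
    messages: [
      {
        role: "system",
        content: [
          // cache_control marks a breakpoint: the provider caches everything
          // up to here, so repeated requests reuse the prefix at lower token cost.
          { type: "text", text: systemPrompt, cache_control: { type: "ephemeral" } },
        ],
      },
      { role: "user", content: [{ type: "text", text: userPrompt }] },
    ],
  };
}
```

Whether the plugin sets such breakpoints by default, and for which providers, is exactly what the issue is asking.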


AI Summary: The user is requesting that the "thinking" token display, currently supported for OpenAI-format providers such as LM Studio and Ollama, be extended to also work with llama.cpp/llama-swap providers. This would improve the user experience by providing visual feedback during model inference.

Complexity: 2/5
Labels: good first issue, feature request
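To see the shape of the feature, note that some models served through llama.cpp wrap their chain-of-thought in `<think>…</think>` tags inside the text stream. A minimal sketch of separating that reasoning from the visible answer, so a UI could render it in a collapsible "thinking" panel, might look like this; the tag name and the `splitThinking` helper are assumptions, not the plugin's actual implementation.

```typescript
// Hypothetical helper: split "thinking" tokens from the final answer text.
function splitThinking(raw: string): { thinking: string; answer: string } {
  // Match the first <think>…</think> block, non-greedily, across newlines.
  const match = raw.match(/<think>([\s\S]*?)<\/think>/);
  if (!match) return { thinking: "", answer: raw.trim() };
  const thinking = match[1].trim();
  // Remove the thinking block so only the user-facing answer remains.
  const answer = raw.replace(match[0], "").trim();
  return { thinking, answer };
}
```

In a streaming setting the same idea applies incrementally: buffer tokens while inside the tag and flush them to the thinking display, rather than splitting once at the end.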


AI Summary: This issue proposes adding Kilo Gateway as a new LLM provider for Obsidian Copilot. The user is a contributor to the Kilo project and is willing to implement the integration themselves. Kilo Gateway is described as a backend for OpenRouter, where the user already has credits.

Complexity: 2/5
Labels: good first issue
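Since gateways of this kind usually expose an OpenAI-compatible API, the integration would likely amount to registering a new provider entry with a custom base URL. The sketch below is purely illustrative: the `ProviderConfig` shape, the `makeGatewayProvider` helper, and the placeholder URL are assumptions, and the issue itself does not specify the gateway's endpoint.

```typescript
// Hypothetical provider-config shape for an OpenAI-compatible gateway.
interface ProviderConfig {
  name: string;
  baseUrl: string;
  apiKey: string;
  openAiCompatible: boolean;
}

function makeGatewayProvider(apiKey: string): ProviderConfig {
  return {
    name: "Kilo Gateway",
    // Placeholder: the real endpoint is not given in the issue.
    baseUrl: "https://example.invalid/v1",
    apiKey,
    // Gateways that front OpenRouter typically speak the OpenAI chat API,
    // so existing OpenAI-format request code can be reused as-is.
    openAiCompatible: true,
  };
}
```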
