A beautiful local-first coding agent running in your terminal - built by the community for the community ⚒

TypeScript · #ai #ai-agents #ai-coding #coding-agents #llm #llm-inference #ollama #openai #openrouter

4 Open Issues Need Help · Last updated: Sep 16, 2025

Labels: bug, help wanted

Labels: enhancement, help wanted


AI Summary: Nanocoder's token count display is inaccurate because it omits the system prompt's tokens, so the shown total underestimates actual API costs and context usage. The bug affects all AI providers; the fix is to factor the system prompt's token count into the displayed total.

Complexity: 2/5
Labels: bug, good first issue
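The fix described in the summary can be sketched in TypeScript. All names below (`estimateTokens`, `displayedTokensBuggy`, `displayedTokensFixed`) are hypothetical stand-ins for illustration, not Nanocoder's actual implementation:

```typescript
interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

// Crude tokenizer approximation (~4 characters per token), used only
// so the sketch is self-contained; a real client would use the
// provider's tokenizer.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Buggy display: sums only the conversation messages and never sees
// the system prompt, so the shown total underestimates context usage.
function displayedTokensBuggy(messages: Message[]): number {
  return messages.reduce((sum, m) => sum + estimateTokens(m.content), 0);
}

// Fixed display: factors the system prompt's tokens into the total.
function displayedTokensFixed(systemPrompt: string, messages: Message[]): number {
  return estimateTokens(systemPrompt) + displayedTokensBuggy(messages);
}
```

With a non-empty system prompt, the fixed total is strictly larger than the buggy one, which is exactly the underestimation the issue describes.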

