Local, reliable doc Q&A bot that cites sources and refuses when unsure. Powered by Ollama, RAG, and confidence scoring.

1 open issue needs help · Last updated: Sep 16, 2025



AI Summary: This issue proposes a foundational abstraction layer to support multiple LLM and embedding providers (e.g., OpenAI, Groq, Mistral), moving away from the current tightly coupled Ollama setup. Key features include automatic fallback between providers, a provider manager, a configuration system, and legal/regional compliance handling, all while maintaining backward compatibility.

Complexity: 4/5
enhancement · good first issue · architecture · provider-support
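The provider manager described above could be sketched as follows. This is a minimal illustration, not the issue's actual design: all class and method names (`LLMProvider`, `ProviderManager`, `generate`) are hypothetical, and the "providers" here are stubs rather than real Ollama or OpenAI clients.

```python
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Hypothetical provider interface; a real implementation would wrap
    the Ollama, OpenAI, Groq, or Mistral client libraries."""

    def __init__(self, name: str) -> None:
        self.name = name

    @abstractmethod
    def generate(self, prompt: str) -> str:
        ...


class FlakyProvider(LLMProvider):
    """Stand-in for a provider that is down or rate-limited."""

    def generate(self, prompt: str) -> str:
        raise ConnectionError(f"{self.name} unavailable")


class EchoProvider(LLMProvider):
    """Stand-in for a healthy provider that returns a canned answer."""

    def generate(self, prompt: str) -> str:
        return f"[{self.name}] answer to: {prompt}"


class ProviderManager:
    """Tries providers in configured order and falls back on failure."""

    def __init__(self, providers: list[LLMProvider]) -> None:
        self.providers = providers

    def generate(self, prompt: str) -> str:
        errors = []
        for provider in self.providers:
            try:
                return provider.generate(prompt)
            except Exception as exc:
                errors.append(f"{provider.name}: {exc}")
        raise RuntimeError("all providers failed: " + "; ".join(errors))


# The first provider fails, so the manager falls back to the second.
manager = ProviderManager([FlakyProvider("ollama"), EchoProvider("openai")])
print(manager.generate("What does the README say?"))
```

Keeping the existing Ollama path as the first entry in the provider list is one way the backward compatibility mentioned in the issue could be preserved: current behavior is unchanged unless Ollama fails.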


Python
#ollama #rag