Open Issues Need Help
AI Summary: The `semantic_code_search` function fails with an "input length exceeds context length" error when processing codebases with over 100 files. The error appears to stem from how the function batches or concatenates file content before sending it for embedding, rather than embedding files individually. The Ollama models themselves support larger contexts, so the bottleneck appears to be the `semantic_code_search` implementation.
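One way the batching described above could be fixed is to group files into size-bounded batches so that no single embedding request exceeds the model's context window. The sketch below is an assumption about a possible fix, not Context+'s actual code; `MAX_CHARS`, `FileEntry`, and `batchFiles` are illustrative names.

```typescript
// Hypothetical sketch: split files into size-bounded batches instead of
// concatenating the whole codebase into one embedding request.
const MAX_CHARS = 8000; // stand-in for the embedding model's context window

type FileEntry = { path: string; content: string };

function batchFiles(files: FileEntry[], maxChars = MAX_CHARS): FileEntry[][] {
  const batches: FileEntry[][] = [];
  let current: FileEntry[] = [];
  let size = 0;
  for (const f of files) {
    // Close the current batch when adding this file would overflow it.
    // (A single file larger than the window would need to be split
    // upstream; here it simply gets a batch of its own.)
    if (size + f.content.length > maxChars && current.length > 0) {
      batches.push(current);
      current = [];
      size = 0;
    }
    current.push(f);
    size += f.content.length;
  }
  if (current.length > 0) batches.push(current);
  return batches;
}
```

Each batch can then be embedded in a separate request, so the total codebase size no longer determines the input length of any single call.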
Semantic Intelligence for Large-Scale Engineering. Context+ is an MCP server designed for developers who demand 99% accuracy. By combining Tree-sitter AST parsing, Spectral Clustering, and Obsidian-style linking, Context+ turns a massive codebase into a searchable, hierarchical feature graph.
AI Summary: The Ollama JavaScript client incorrectly ignores the `OLLAMA_HOST` environment variable specified in `.mcp.json`. This prevents users from connecting to remote Ollama instances, as the client hardcodes the connection to `localhost:11434`.
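The expected behavior could look like the following sketch: read `OLLAMA_HOST` from the environment (e.g. as passed through an `env` block in `.mcp.json`) and fall back to the default local address only when it is unset. This is an assumption about a possible fix, not the client's actual code; `resolveOllamaHost` is an illustrative name.

```typescript
// Hypothetical sketch: honor OLLAMA_HOST instead of hardcoding localhost.
function resolveOllamaHost(
  env: Record<string, string | undefined>
): string {
  const raw = env.OLLAMA_HOST?.trim();
  if (!raw) return "http://localhost:11434"; // default only as a fallback
  // Accept bare host:port as well as full URLs.
  return raw.startsWith("http://") || raw.startsWith("https://")
    ? raw
    : `http://${raw}`;
}
```

With this resolution order, a remote instance configured in `.mcp.json` would take precedence, and existing local setups would keep working unchanged.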
AI Summary: This issue requests a dark mode for the user interface: a toggle button in the header that lets users switch between light and dark themes. The primary requirement is that the submitted code, regardless of its origin, must be clean and follow the project's guidelines.
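The toggle logic behind such a header button could be sketched as below, kept free of DOM and framework specifics; the function names and the preference-resolution order are illustrative assumptions, not part of the issue.

```typescript
// Hypothetical sketch of the requested light/dark toggle.
type Theme = "light" | "dark";

// Flip between the two themes on each button click.
function toggleTheme(current: Theme): Theme {
  return current === "dark" ? "light" : "dark";
}

// Resolve the initial theme: stored user preference first,
// then the system preference, then light as the default.
function initialTheme(stored: string | null, prefersDark: boolean): Theme {
  if (stored === "light" || stored === "dark") return stored;
  return prefersDark ? "dark" : "light";
}
```

In the browser, the header button's click handler would call `toggleTheme`, persist the result (e.g. to `localStorage`), and apply it to the document, for instance via a `data-theme` attribute that the stylesheets key off.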