Open Issues Need Help
AI Summary: This feature request aims to add command-line interface (CLI) support for managing collection ignore patterns. Currently, these patterns can only be managed through manual YAML file edits, which is error-prone. The goal is to provide a first-class CLI experience for adding, removing, and inspecting these patterns while maintaining backward compatibility with existing YAML configurations.
Local-first hybrid CLI search engine (BM25 + Vector) for personal knowledge bases and agentic workflows. Runs entirely on-device via node-llama-cpp.
AI Summary: This issue proposes adding a new `kindx doctor` command to the KINDX CLI. This command will provide a centralized diagnostics report, consolidating information about sqlite-vec availability, backend mode, remote API configuration, MCP authentication, and the health of daemon/watch processes. The goal is to streamline troubleshooting and improve the onboarding experience for KINDX users.
AI Summary: This issue proposes making model-backed tests capability-aware to prevent noisy failures on local machines lacking a usable `node-llama-cpp` runtime. The goal is to introduce a shared capability probe that allows these tests to skip cleanly with actionable reasons, distinguishing between environment limitations and actual code regressions, while preserving CI behavior and existing test coverage.
AI Summary: This issue proposes adding a new GitHub Actions CI job to test the installation and runtime behavior of the project's packaging. It aims to catch regressions that might be missed by existing tests by utilizing existing `smoke-install.sh` and `Containerfile` assets. The goal is to ensure a practical subset of these checks are run in CI, providing clear failure indications and local execution guidance.
AI Summary: This issue proposes adding robust validation to the `index.yml` configuration file during loading. The goal is to catch errors like incorrect data types, unknown keys, and legacy array formats early, providing clear error messages to prevent silent misconfigurations and confusing runtime behavior.
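The three error classes the issue names (wrong types, unknown keys, legacy array format) can be sketched as a small validator. The key names (`collections`, `ignore`) and message wording here are assumptions about `index.yml`'s shape, not the project's real schema:

```typescript
// Illustrative validator: collects human-readable errors instead of failing
// silently. KNOWN_KEYS is a placeholder for whatever the real schema allows.
const KNOWN_KEYS = new Set(["collections", "ignore"]);

function validateIndexConfig(raw: unknown): string[] {
  const errors: string[] = [];
  if (Array.isArray(raw)) {
    // Catch the legacy array format early with a specific message.
    errors.push("index.yml: legacy array format is not supported; use a mapping");
    return errors;
  }
  if (typeof raw !== "object" || raw === null) {
    errors.push("index.yml: top level must be a mapping");
    return errors;
  }
  for (const key of Object.keys(raw)) {
    if (!KNOWN_KEYS.has(key)) errors.push(`index.yml: unknown key "${key}"`);
  }
  const ignore = (raw as Record<string, unknown>).ignore;
  if (
    ignore !== undefined &&
    (!Array.isArray(ignore) || !ignore.every((p) => typeof p === "string"))
  ) {
    errors.push('index.yml: "ignore" must be a list of strings');
  }
  return errors;
}
```

Returning all errors at once, rather than throwing on the first, lets the CLI print a complete report of what needs fixing in one pass.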
AI Summary: This issue requests the addition of a beginner-friendly quickstart guide to the repository's README. The guide should explain how to configure the `remote` LLM backend to work with services like Ollama, LM Studio, or other OpenAI-compatible endpoints, including necessary environment variables and API key usage. It should also clarify the rerank fallback behavior when the `/v1/rerank` endpoint is unavailable.
AI Summary: This is a beginner-level documentation issue to update the README's data storage and schema documentation. The current documentation is outdated and does not reflect the actual SQLite database schema and the use of YAML files for collection configuration. The goal is to make the documentation accurately represent the current system design.
AI Summary: This issue proposes to add support for tilde (~) expansion in YAML-defined collection paths, allowing them to correctly resolve to the user's home directory. Currently, tilde paths are not expanded, leading to misleading examples in `sample-catalog.yml` and potentially broken user configurations. The fix involves modifying path resolution logic and updating the sample catalog to reflect the new supported behavior.
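The path-resolution change amounts to a few lines. This is a sketch, not the project's actual helper; the function name `expandTilde` is illustrative, and the `~user` form is deliberately left unhandled:

```typescript
import { homedir } from "node:os";
import { join } from "node:path";

// Expand a leading "~" or "~/" to the user's home directory; leave absolute
// and relative paths untouched. The `home` parameter exists for testability.
function expandTilde(p: string, home: string = homedir()): string {
  if (p === "~") return home;
  if (p.startsWith("~/")) return join(home, p.slice(2));
  return p;
}
```

With this in place, a `sample-catalog.yml` entry like `path: ~/notes` would resolve to `/home/alice/notes` for user `alice` instead of being treated as a literal relative directory named `~`.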
AI Summary: This issue proposes adding `--pattern` as a backward-compatible alias for the `--mask` flag in the `collection add` command. The goal is to align CLI terminology with the config model and documentation, which currently use "pattern" more frequently, thereby reducing user confusion and improving consistency.
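Most CLI frameworks can declare an alias directly on the option; as a framework-neutral illustration, the alias can also be handled by normalizing argv before parsing. `normalizeArgs` is a hypothetical helper, not part of the project:

```typescript
// Rewrite --pattern (and --pattern=value) to --mask so the existing parser
// for `collection add` sees only the canonical flag.
function normalizeArgs(argv: string[]): string[] {
  return argv.map((arg) =>
    arg === "--pattern"
      ? "--mask"
      : arg.startsWith("--pattern=")
        ? "--mask=" + arg.slice("--pattern=".length)
        : arg,
  );
}
```

Keeping `--mask` as the canonical internal name preserves backward compatibility while letting help text and docs lead with the "pattern" terminology.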
AI Summary: This issue proposes adding regression tests for the `collection include` and `exclude` commands in the CLI. The goal is to ensure that default queries correctly reflect the inclusion or exclusion of collections, preventing regressions in this contributor-facing feature. The tests will verify that excluded collections are ignored by default queries, re-included collections are searched again, and explicit collection filters still work as expected.
AI Summary: This issue aims to make two existing but hidden CLI options, `collection add --mask` and `get --from`, visible in the main help output and the README documentation. The goal is to improve discoverability and ensure the help text accurately reflects the command's capabilities, reducing user confusion and support burden.
AI Summary: This issue requests documentation for the existing YAML `ignore` patterns feature in the project's README.md. The goal is to explain how to configure and use `ignore` patterns in `index.yml` to exclude files or directories during indexing and search, providing a clear example for manual `index.yml` editors.
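A README example of the kind requested might look like the fragment below. The exact key placement within `index.yml` is an assumption for illustration; the real file layout should be confirmed against the loader before documenting it:

```yaml
# Hypothetical index.yml excerpt — key names and nesting are illustrative.
ignore:
  - "node_modules/**"   # skip dependency trees during indexing
  - "*.log"             # skip log files
  - ".git/**"           # skip VCS metadata
```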
AI Summary: This issue addresses a broken path in the `specs/smoke-install.sh` script. The script incorrectly references a non-existent file for Bun's preload helper, while the actual file is `engine/preloader.ts`. The fix involves updating the script to point to the correct file, ensuring the smoke test accurately catches regressions.
AI Summary: This issue is a "good first issue" focused on updating the project's README documentation. The goal is to audit the environment variable table against the actual code defaults and ensure all user-facing variables are documented, while also correcting any inaccuracies in existing documentation.