Open-source AI chat app built with Convex and TanStack Router

TypeScript · 2 open issues need help · Last updated: Jun 19, 2025



AI Summary: Implement the ability to stop a running LLM generation mid-stream in the Hyperwave AI chat application. The Convex Agent component (linked in the issue description) does not currently support this, so the solution will likely involve modifying the Convex Agent code and integrating that change into Hyperwave.

Complexity: 4/5
Labels: enhancement, good first issue
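
Below is a minimal sketch of one possible approach to stopping a generation mid-stream, using an AbortController that the streaming loop checks between chunks. It is not the Convex Agent API; the names (`runGeneration`, `stopGeneration`, `fakeTokenStream`) and the in-memory registry are hypothetical, for illustration only.

```ts
// Hypothetical sketch: cancel a streaming generation via AbortController.

// Simulated token stream standing in for the model provider's stream.
async function* fakeTokenStream(prompt: string): AsyncGenerator<string> {
  for (const token of `${prompt} and some generated text`.split(" ")) {
    await new Promise((resolve) => setTimeout(resolve, 100));
    yield token;
  }
}

// Registry of in-flight generations so a "stop" request can find them.
const activeGenerations = new Map<string, AbortController>();

// Started when a generation begins; writes chunks until done or aborted.
async function runGeneration(
  generationId: string,
  prompt: string,
  onChunk: (text: string) => Promise<void>,
): Promise<void> {
  const controller = new AbortController();
  activeGenerations.set(generationId, controller);
  try {
    for await (const token of fakeTokenStream(prompt)) {
      if (controller.signal.aborted) break; // stop requested mid-stream
      await onChunk(token);
    }
  } finally {
    activeGenerations.delete(generationId);
  }
}

// Called from a "stop" endpoint or mutation triggered by the UI.
function stopGeneration(generationId: string): void {
  activeGenerations.get(generationId)?.abort();
}
```

In a real integration, the abort flag would likely live in the database rather than in process memory, so that a stop request issued from any client or server instance can interrupt the running stream.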
