Local LLM agent for vintage Mac hardware. Built on a 2013 Mac Pro trashcan with dual AMD FirePro D500s. Elyan Labs.

1 open issue needing help · Last updated: Mar 16, 2026



AI Summary: This GitHub issue requests the integration of Ollama and LM Studio as new backend options for the TrashClaw application, which currently supports only a raw llama-server backend. The task involves auto-detecting which backend type is running, adapting to potential differences in tool-calling formats between the backends, and documenting the setup process. The solution must be tested end-to-end against Ollama with a model that supports tool use.
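One way the auto-detection step could work is to probe each backend's distinctive HTTP endpoint on the configured base URL. The sketch below is a hypothetical approach, not code from the TrashClaw repo; the endpoint choices are assumptions from public docs (Ollama serves `/api/tags`, llama-server serves `/health`, and LM Studio exposes an OpenAI-compatible `/v1/models`), and the probe order matters because several backends also answer the generic OpenAI-style routes.

```python
import urllib.request
import urllib.error

# Probe order: most distinctive endpoint first, since Ollama and
# llama-server can also answer OpenAI-compatible /v1/* routes.
PROBES = [
    ("ollama", "/api/tags"),        # Ollama-specific model listing
    ("llama-server", "/health"),    # llama.cpp server health check
    ("lm-studio", "/v1/models"),    # OpenAI-compatible fallback
]

def _http_ok(url, timeout=2.0):
    """Return True if the URL answers HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

def detect_backend(base_url, http_ok=_http_ok):
    """Return the name of the first backend whose probe answers, else None.

    The http_ok parameter is injectable so the detection logic can be
    unit-tested without a live server.
    """
    for name, path in PROBES:
        if http_ok(base_url.rstrip("/") + path):
            return name
    return None
```

A usage sketch: `detect_backend("http://localhost:11434")` would return `"ollama"` if an Ollama server is listening on its default port, and `None` if nothing answers any probe.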

Complexity: 4/5
Labels: good first issue, bounty


Python