Zig-based Ollama alternative for running LLMs locally. Built on top of llama.cpp.zig bindings

ai cli cpp igllama inference library llama llama-cpp llamacpp llm ollama zig
1 open issue needs help · Last updated: Feb 24, 2026


AI Summary: On Windows terminals, the `igllama pull` command displays garbled characters instead of a progress bar. The tool emits UTF-8 block characters, which a terminal running a legacy code page misinterprets. Possible fixes include forcing the console to UTF-8 output, detecting whether the terminal supports Unicode, or adding a flag that disables Unicode rendering.

Complexity: 2/5
Labels: bug, enhancement, help wanted, good first issue
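One of the mitigations the summary mentions — falling back to plain ASCII when the terminal cannot render Unicode — can be sketched as follows. This is an illustrative Python sketch, not igllama's actual implementation (the project itself is written in Zig); `bar_glyph` and `render_bar` are hypothetical helper names chosen for the example.

```python
import sys

def bar_glyph(stream=sys.stdout):
    """Pick a progress-bar glyph: a Unicode full block if the stream's
    encoding can represent it, plain '#' otherwise.
    (Hypothetical helper illustrating the fallback strategy.)"""
    enc = getattr(stream, "encoding", None) or "ascii"
    try:
        "\u2588".encode(enc)   # can this encoding represent the block char?
        return "\u2588"        # full block: █
    except (UnicodeEncodeError, LookupError):
        return "#"             # ASCII fallback for legacy code pages

def render_bar(done, total, width=20, glyph="#"):
    """Render a fixed-width progress bar string."""
    filled = int(width * done / total)
    return "[" + glyph * filled + " " * (width - filled) + f"] {100 * done // total}%"

print(render_bar(7, 10, glyph=bar_glyph()))
```

A per-stream capability check like this avoids mojibake on legacy-code-page consoles while still rendering block glyphs where the terminal supports them; a `--no-unicode` style flag could simply force the ASCII branch.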
