Open Issues (Help Wanted)
Add support for fp8 model (opened 2 months ago)
AI Summary: The task is to add support for an FP8 (or similarly quantized) version of the FLUX.1-Kontext-dev model to the Ghibli At Home application. This means changing the code to download and use a smaller, quantized checkpoint (e.g., a GGUF file) instead of the full-precision model, cutting the VRAM requirement from approximately 47 GB to around 12 GB (see the sketch below).
Complexity: 4/5
Labels: enhancement, help wanted
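The change boils down to swapping the download/load path for a quantized checkpoint. Below is a minimal sketch, assuming the generation backend is Python and loads the model with Hugging Face diffusers; the library choice, the GGUF repository path, and the file name are illustrative assumptions, not the project's confirmed stack.

```python
# Hedged sketch, not the project's actual code: assumes a Python backend that
# loads FLUX.1-Kontext-dev via Hugging Face diffusers, which can read
# GGUF-quantized transformer weights directly.
import torch
from diffusers import FluxKontextPipeline, FluxTransformer2DModel, GGUFQuantizationConfig

# Illustrative checkpoint location only; the actual quantized file (Q8_0, Q6_K,
# fp8, etc.) would be chosen by the maintainers.
GGUF_URL = (
    "https://huggingface.co/QuantStack/FLUX.1-Kontext-dev-GGUF"
    "/blob/main/flux1-kontext-dev-Q8_0.gguf"
)

# Load only the transformer (the largest component) from the quantized file.
transformer = FluxTransformer2DModel.from_single_file(
    GGUF_URL,
    quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
    torch_dtype=torch.bfloat16,
)

# Keep the text encoders, VAE, and scheduler from the original repository and
# swap in the quantized transformer.
pipe = FluxKontextPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-Kontext-dev",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)

# Offload idle components to system RAM so peak VRAM stays close to the size
# of the quantized transformer weights.
pipe.enable_model_cpu_offload()
```

Quantizing only the transformer is what shrinks the weight footprint; offloading the remaining components keeps peak VRAM near that figure. Whether the backend actually uses diffusers is an assumption here, so the real integration point depends on how the project invokes the model.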
The GPT-4o image generation we have at home. A powerful, self-hosted AI photo stylizer built for performance and privacy.
Language: JavaScript