Open Issues Need Help
AI Summary: A user is attempting to run a custom OpenVINO model on an NPU device but receives no output when making a completion request. The logs indicate that no generation, tokenization, or other processing occurs, with all timing metrics showing 0.00, suggesting a silent failure in the inference pipeline before any actual computation takes place.
Repository for OpenVINO's extra modules
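The summary does not show how the completion request was issued. A minimal sketch of what such a request might look like against an Ollama-compatible endpoint is given below; the default port 11434, the model name `my-model`, and the prompt are assumptions for illustration and are not taken from the issue.

```python
# Hedged reproduction sketch: send a non-streaming completion request and dump
# the response text together with the timing/counter fields, assuming an
# Ollama-compatible server on the default port and a hypothetical model name.
import json
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "my-model", "prompt": "Hello", "stream": False},
    timeout=120,
)
resp.raise_for_status()
body = resp.json()

# An empty "response" together with zeroed duration/count fields would match
# the behaviour described in the summary (no tokenization or generation).
print(json.dumps(
    {k: body.get(k) for k in
     ("response", "eval_count", "eval_duration", "total_duration")},
    indent=2,
))
```

If this also returns an empty `response` with zeroed durations, the failure is on the server side rather than in the client making the request.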
AI Summary: A user is encountering an issue where `ollama.exe serve` produces no output when run in PowerShell on Windows, even after setting `GODEBUG=cgocheck=0` and running `setupvars.bat`. Why the command provides no feedback, or fails to start at all, still needs to be diagnosed.
Repository for OpenVINO's extra modules
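As a rough way to check whether the server emits anything at all, the sketch below launches `ollama.exe serve` with the same `GODEBUG=cgocheck=0` environment and captures its combined output; the assumptions that `ollama.exe` is on `PATH` and that a 15-second read window is enough are illustrative, not taken from the issue.

```python
# Hedged sketch: run "ollama.exe serve" with GODEBUG=cgocheck=0 (mirroring what
# the reporter tried) and capture stdout/stderr, assuming ollama.exe is on PATH.
import os
import subprocess

env = dict(os.environ, GODEBUG="cgocheck=0")

proc = subprocess.Popen(
    ["ollama.exe", "serve"],
    env=env,
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)

try:
    # Ollama normally logs something on startup (e.g. the address it listens
    # on), so total silence here points at an early exit or a crash before
    # logging is initialized.
    out, _ = proc.communicate(timeout=15)
    print(out or "<no output>")
except subprocess.TimeoutExpired:
    proc.kill()
    out, _ = proc.communicate()
    print(out or "<no output before timeout>")
```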