Open Issues Need Help
AI/ML • Inference Platforms
[Good First Issue]: No output from ollama.exe (opened 22 days ago)
AI Summary: A user reports that `ollama.exe serve` produces no output when run in PowerShell on Windows, even after setting `GODEBUG=cgocheck=0` and running `setupvars.bat`. The issue needs investigation to determine why the command gives no feedback and does not appear to start as expected (see the diagnostic sketch after this card).
Complexity: 2/5
good first issue
Repository for OpenVINO's extra modules
C++
#arm #inference-engine #java #nvidia-gpu #openvino #pytorch
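As a starting point for reproducing the report, a minimal sketch like the one below can launch the server with the environment variable mentioned in the summary and capture anything it writes, so a truly empty output is visible as such. This is not from the issue itself: it assumes Python is available on the machine, that `ollama.exe` is on the PATH, and that `GODEBUG=cgocheck=0` is the setting the reporter used.

```python
# Hypothetical diagnostic sketch: run `ollama.exe serve` with GODEBUG set,
# capture stdout/stderr, and print them so an empty output is unmistakable.
import os
import subprocess

env = dict(os.environ, GODEBUG="cgocheck=0")  # value taken from the report

proc = subprocess.Popen(
    ["ollama.exe", "serve"],   # assumes ollama.exe is on PATH
    env=env,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    text=True,
)

try:
    # A healthy server keeps running; give it a short window, then stop it
    # so we can inspect what, if anything, it printed during startup.
    out, err = proc.communicate(timeout=15)
except subprocess.TimeoutExpired:
    proc.kill()
    out, err = proc.communicate()

print("exit code:", proc.returncode)
print("stdout:", repr(out))   # repr() makes an empty string obvious
print("stderr:", repr(err))
```

If both streams come back empty while the process exits immediately, that points to a startup failure before any logging; if the timeout fires with empty output, the server may be running but logging elsewhere.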
AI/ML • Inference Platforms
Unable to be used offline (opened 24 days ago)
good first issue
Repository for OpenVINO's extra modules
C++
#arm #inference-engine #java #nvidia-gpu #openvino #pytorch