1. Ollama’s openness is questioned
“Ollama is quasi‑open source.” – DiabloD3
The project is viewed as “quasi‑open source” because it claims ownership of code that is heavily derived from llama.cpp without clear credit.
2. Performance debates
“Ollama is slower.” – logicalele
Benchmark chatter (e.g., “Ollama ended up slowest on the 9B…” – dminik) points to measurable speed gaps and fuels debate about its real‑world performance.
3. Preference for alternatives due to usability and hardware support
“I really like LM Studio when I can use it under Windows but for people like me with Intel Macs + AMD GPU ollama is the only option because it can leverage the GPU using MoltenVK aka Vulkan, unofficially.” – alifeinbinary
Users cite LM Studio’s server mode, broader GPU support, and easier setup as reasons to prefer it over Ollama where their hardware allows; on less common setups such as Intel Macs with AMD GPUs, Ollama remains the only workable choice.