Hacker News
Bigsy | 14 days ago | on: April 2026 TLDR Setup for Ollama and Gemma 4 26B o...

For MLX I'd guess.
wronglebowski | 14 days ago

That also comes upstream from llama.cpp:
https://github.com/ggml-org/llama.cpp/discussions/4345
redrove | 14 days ago

https://omlx.ai/
leftnode | 14 days ago

Does this have a CLI-only interface?
redrove | 13 days ago

Yes. You could also look at the README.md.