mohsen1 | 24 days ago
At this point, if you're paying out of pocket, you should use Kimi or GLM for it to make sense.
andai | 24 days ago
GLM is OK (haven't used it heavily but seems alright so far) - a bit slow with ZAI's coding plan, amazingly fast on Cerebras, but their coding plan is sold out. Haven't tried Kimi, hear good things.
bluerooibos | 24 days ago
These are super slow to run locally, though, unless you've got some great hardware - right? At least, my M1 Pro seems to struggle and take forever using them via Ollama.

corysama | 24 days ago
Try this https://unsloth.ai/docs/models/qwen3-coder-next