item 46626498


mangamadaiyan | 1 month ago

Hm. I thought LLMs weren't free. Am I missing something?


selfhoster11 | 1 month ago

1. You can run decent LLMs locally now (see /r/LocalLlama). You pay only the electricity cost plus the hardware capex, which isn't that expensive for smaller models.

2. Chinese APIs like Moonshot and DeepSeek have extremely cheap pricing, with optional subscriptions that grant a fixed number of requests at any context size for under $10 a month. Claude Code is the bourgeois option; GLM-4.7 does quite well at vibe coding and is extremely cheap.
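A back-of-the-envelope sketch of both options. All wattage and pricing figures below are illustrative placeholders I picked for the example, not quotes from any provider:

```python
# Rough cost sketch for "local vs. cheap API".
# Every number here is an assumed placeholder, not a real rate.

def local_cost_per_hour(watts: float, price_per_kwh: float) -> float:
    """Electricity cost of running local hardware for one hour."""
    return watts / 1000.0 * price_per_kwh

def api_cost(tokens_in: int, tokens_out: int,
             in_per_million: float, out_per_million: float) -> float:
    """Cost of one API request at per-million-token rates."""
    return (tokens_in / 1e6 * in_per_million
            + tokens_out / 1e6 * out_per_million)

# Assumed: a 350 W GPU at $0.15/kWh.
print(round(local_cost_per_hour(350, 0.15), 4))   # -> 0.0525 ($/hour)

# Assumed budget-API rates: $0.30/M input, $1.20/M output tokens.
print(round(api_cost(8_000, 2_000, 0.30, 1.20), 4))  # -> 0.0048 ($/request)
```

Under these made-up numbers, an hour of local inference costs about a nickel of electricity (ignoring the hardware capex), and a single moderate-sized API request costs a fraction of a cent, which is why both routes can come in far under a $10/month budget.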