clvx | 7 months ago
And, is there an open source implementation of an agentic workflow (search tools and others) to use it with local LLM’s?
dent9 | 7 months ago
Also, none of this is worth the money, because it's simply not possible to run the same kinds of models you pay for online on a standard home system. Models like ChatGPT-4o use more VRAM than you'll ever be able to scrounge up unless your budget is closer to $10,000-25,000+. Think multiple RTX A6000 cards or similar. So ultimately you're better off just paying for the online hosted services.
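The VRAM claim can be sanity-checked with a back-of-the-envelope calculation: weight memory is roughly parameter count times bytes per parameter, plus overhead for the KV cache and activations. A minimal sketch (the 20% overhead figure and the example model sizes are assumptions, not measurements):

```python
# Rough VRAM estimate for serving an LLM locally.
# Rule of thumb: weights = params * bytes-per-param; add ~20% (assumed)
# for KV cache and activation overhead.

def estimate_vram_gb(params_billions: float, bytes_per_param: float,
                     overhead: float = 0.2) -> float:
    weights_gb = params_billions * bytes_per_param  # 1e9 params * bytes ~= GB
    return weights_gb * (1 + overhead)

if __name__ == "__main__":
    for name, params, bpp in [
        ("7B @ 4-bit",  7,  0.5),   # fits on a single consumer GPU
        ("70B @ 4-bit", 70, 0.5),   # needs a high-end workstation card
        ("70B @ fp16",  70, 2.0),   # needs multiple datacenter-class GPUs
    ]:
        print(f"{name}: ~{estimate_vram_gb(params, bpp):.0f} GB")
```

By this estimate a 70B model at fp16 needs on the order of 170 GB of VRAM, which is indeed multi-A6000 territory, while an aggressively quantized 7B model fits in a few gigabytes, which is the trade-off the smaller-local-model argument below rests on.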
beefnugs | 7 months ago
Of course, the economics are completely at odds with any real engineering: nobody wants you to use smaller local models, and nobody wants you to consider cost or efficiency savings.
apparent | 7 months ago
Seems like there would be cost advantages and always-on advantages. And the risk of a desktop computer getting damaged or stolen is much lower than for a laptop.
haiku2077 | 7 months ago
https://zed.dev/blog/fastest-ai-code-editor