elevation|16 days ago
I've achieved 3 to 4 orders of magnitude CPU performance improvements and 50% RAM reductions by using C in places I normally wouldn't and by selecting or designing efficient data structures. TUIs are a good example of this trend: for internal engineering, being able to present the information we need while bypassing the millions of SLoC in the web stack is more efficient in almost every regard.
coldtea|16 days ago
That's what makes them great. As opposed to modern "minimal" waste of space UIs or the Electron crappage.
elevation|16 days ago
If your business requirements are stable and you have a good test suite, you're living in a golden age for leveraging your current access to LLMs to reduce your future operational costs.
coldtea|16 days ago
Making 50 SOTA AI requests per day ≈ running a 10W LED bulb for about 2.5 hours per day.
Given I usually have 2-3 lights on all day in the house, that's like 1500 LLM requests per day (which is quite a bit more than I actually make).
So even a month's worth of requests for building some software doesn't sound like much. Having a beefy local traditional build server compiling or running tests for 4 hours a day would be like ~7,600 requests/day.