I had my formative years in programming when memory usage was something you still worried about as a programmer. And then memory expanded so much that all kinds of “optimal” patterns for programming became nearly irrelevant. Will we start to actually consider this in software solutions again as a result?
fulafel|4 days ago
But in performance work, the speed of RAM relative to computation has dropped so much that it's common wisdom to treat today's cache as the RAM of old (and today's RAM as the disk of old, etc.).
In software performance work it's been all about hitting the cache for a long time. LLMs aren't too amenable to caching though.
KellyCriterion|3 days ago
;-)
dgxyz|3 days ago
Back to native apps without bloated toolkits!
emeril|3 days ago
my issue is that my company won't issue laptops with more than 16 GB of RAM
guess I'm not virtualizing anything...
KellyCriterion|3 days ago
Also RAMsan will have a renaissance then? :-D
jacquesm|4 days ago
I don't think that ever happened. Using a relatively sparse amount of memory translates into better cache utilization, which in turn usually improves performance drastically.
And in embedded stuff being good with memory management can make the difference between 'works' and 'fail'.
close04|3 days ago
Not for fun but for convenience (laziness occasionally?). Someone needed to "pay" for the app being available on all platforms. Either the programmer by coding and optimizing multiple times, or the user by using a bloated unoptimized piece of software. The choice was made to have the user pay. It's been so long I doubt recent generations of coders could even do it differently.
hulitu|2 days ago
Maybe a bit of engineering and planning could help here. Always shipping half-finished products is usually not a recipe for success.
StopDisinfo910|3 days ago
They actively prefer keeping comfortable margins to competing with each other. They have already been condemned for active collusion in the past.
New actors from China could shake things up a bit but the geopolitical situation makes that complicated. The market can stay broken for a long time.
jooz|3 days ago
But well, I think there is no right answer; there will always be a trade-off, case by case, depending on the context.
throw0101a|3 days ago
As 'just' a user of MS-DOS in the 1990s, fiddling with QEMM was a bit of a craft to get what you wanted to run in the memory you had.
* https://en.wikipedia.org/wiki/QEMM
(Also, DESQview was awesome.)
eulers_secret|3 days ago
I do embedded Linux, and RAM usage is a major concern; same for other embedded applications.
I’m partying like it’s the 90s, on a 32-bit processor and a couple hundred MB of RAM.
NooneAtAll3|3 days ago
it's just a cartel cycle: reap the profits, then eliminate all investment in competitors when a flood of cheap RAM "suddenly" appears