top | item 47162021


travisgriggs | 4 days ago

I had my formative years in programming when memory usage was something you still worried about as a programmer. And then memory expanded so much that all kinds of “optimal” patterns for programming just become nearly irrelevant. Will we start to actually consider this in software solutions again as a result?


fulafel|4 days ago

You're right in terms of fitting your program to memory, so that it can run in the first place.

But in performance work, the speed of RAM relative to computation has dropped so much that it's common wisdom to treat today's cache as the RAM of old (and today's RAM as the disk of old, etc.).

In software performance work it's been all about hitting the cache for a long time. LLMs aren't too amenable to caching though.

makapuf|3 days ago

AFAIK, you can't explicitly allocate cache the way you allocate RAM, however. It's a bit as if you could only work on files, and RAM were used as the cache. Maybe I'm mistaken? (Edit: typo)

seanmcdirmid|3 days ago

LLMs need memory bandwidth to stream lots of data through quickly, not so much caching. Well, this is basically the same way that a GPU uses memory.

dahcryn|3 days ago

I've actively started using Outlook and Teams through Chrome to free up some of my RAM; it easily saves 3-4 GB. It's gotten ridiculous how much RAM basic tools are using, leaving nothing for actual real work.

christophilus|3 days ago

People get on me all the time about not installing programs on my computer. I run everything in the browser, if I can. Partly so I can kill it properly without it misbehaving, and partly because I don't trust their software at all. Zoom, Slack, Gmail, etc-- if I can run it in the browser, then that's the only way I'll run it.

dgxyz|3 days ago

Every app ships with its own isolated web browser now. That idea needs to die.

Back to native apps without bloated toolkits!

emeril|3 days ago

I've found the web versions use a similar amount of memory and have fewer features.

My issue is that my company won't issue laptops with more than 16 GB of RAM.

Guess I'm not virtualizing anything...

mushufasa|3 days ago

I doubt it. I predict that in a few years, maybe sooner, one or some of the AI companies buying up the supply will either have achieved their goal or collapsed, and then the market will be flooded with a glut of memory, driving prices low again. Or, conversely, demand stays high for a sustained period and the suppliers just increase supply. There's no hard bill-of-materials or technical reason for memory prices to be this high, unlike 20+ years ago.

Lalabadie|3 days ago

And in the meantime, major buyers (government, big orgs) adjust by extending the planned lifespan of their computers, and upping the IT wage budget a bit to support that. That adjustment probably won't go away after supply returns.

eumenides1|3 days ago

AI companies aren't buying RAM; they are buying the wafers themselves and then making special AI stuff. So the RAM never exists, and there will be no glut of memory coming. Maybe some DDR5 will dribble out, but HBM isn't something we can use (at the moment).

KellyCriterion|3 days ago

Well, at least then there would be enough RAM to run Windows 7 and Crysis from a RAM disk, I'd guess?

Also, RamSan will have a renaissance then? :-D

jacquesm|4 days ago

> And then memory expanded so much that all kinds of “optimal” patterns for programming just become nearly irrelevant.

I don't think that ever happened. Using a relatively sparse amount of memory translates into better cache utilization, which in turn usually improves performance drastically.

And in embedded work, being good with memory management can make the difference between 'works' and 'fails'.

zeta0134|3 days ago

The need to use optimal patterns didn't go away, but the techniques certainly did. Just as a quick example, it's usually a bad idea now to use lookup tables to accelerate small math workloads. The lookup table creates memory pressure on the cache, which ends up degrading performance on modern systems. Back in the 1980s, lookup tables were by far the dominant technique because math was *slow.*

_fizz_buzz_|3 days ago

It obviously never became completely irrelevant. But I think programmers spend a lot less time thinking about memory than they used to. People used to do a lot of gymnastics and crazy optimizations to fit stuff into memory. I do quite a bit of embedded programming, and most of the time it seems easier to simply upgrade the MCU and spend 10 cents more (or whatever) than to make any crazy optimizations. But of course there are still cases where it makes sense.

yread|3 days ago

When was the last time you used mergesort because you had to?

rTX5CMRXIfFG|4 days ago

I never really bought into the anti-Leetcode crowd's sentiment that it's irrelevant. It has always mattered as a competitive edge: against other job candidates if you're an employee, or against the competition if you're a company. It only looked irrelevant because opportunities were everywhere during ZIRP, but good times never last.

raw_anon_1111|3 days ago

Most developers work at banks, insurance companies and other “enterprise” jobs. Even most developers at BigTech and who are working “at scale” are building on top of scalable infrastructure and aren’t worrying about reversing a btree on a whiteboard.

ponector|3 days ago

It mattered for passing the interview, but not for the job itself. With all the leetcode geniuses at Microsoft, why are Teams and Windows so shitty?

cyberrock|3 days ago

It's not like most developers are wasting memory for fun by using Electron etc. It's just the simplest way to deploy applications that require frequent multiplatform changes. Until you get Apple to approve native app changes faster and Linux users to agree on a framework, app distribution, etc., it's the optimal way to ship a product and not just a program.

close04|3 days ago

> for fun

Not for fun but for convenience (laziness, occasionally?). Someone needed to "pay" for the app being available on all platforms: either the programmer, by coding and optimizing multiple times, or the user, by running a bloated, unoptimized piece of software. The choice was made to have the user pay. It's been so long, I doubt recent generations of coders could even do it differently.

hulitu|2 days ago

> applications that require frequent multiplatform changes

Maybe a bit of engineering and planning could help here. Always shipping half-finished products is usually not a recipe for success.

zarzavat|3 days ago

RAM didn't get more expensive to produce. It just got more desirable. The prices will come down again when supply responds. It may take some time, but it will happen eventually.

StopDisinfo910|3 days ago

RAM production is highly inelastic and controlled by an oligopoly. They have little desire to increase production considering the lead time and the risk that the AI demand might be transient.

They actively prefer keeping comfortable margins over competing with each other. They have already been convicted of active collusion in the past.

New actors from China could shake things up a bit, but the geopolitical situation makes that complicated. The market can stay broken for a long time.

zozbot234|3 days ago

RAM actually got more expensive to produce in the medium term because production is bottlenecked. It takes years to expand production.

yxhuvud|3 days ago

We would have, if expensive memory were a long-term trend. It is not: eventually supply will expand to match demand. There is no fundamental lack of raw materials underlying the issue; it is just a demand shock.

junon|3 days ago

Also, it's not like we have regressed in the process itself, which was historically the limiting factor. As you said, this is purely an economics thing, resulting from a greedy shift in business focus by, e.g., Micron.

ReedorReed|3 days ago

I just heard a podcast where they talked about how powerful our devices are today, yet they don't feel faster than they did 15 years ago, and it's because of what you write here.

enaaem|3 days ago

I have a 2020 Intel Mac (quad core, 16 GB RAM) and it feels as slow as the Packard Bell from 2000 that I had as a kid. Launchpad takes 1-2 seconds to show a bunch of icons. Absolutely insane!

AdamN|3 days ago

A lot of that is on the OS vendors (and security requirements drive some inefficiencies that didn't used to be needed either).

jooz|3 days ago

When I practice leetcode problems, I remember the best solution was usually the one that optimized CPU (time) instead of memory: adding a data index in memory instead of iterating over the main data structure. I thought, OK, that's fine, it's normal; you can (could) always buy more RAM, but you can't buy more time.

But well, I think there is no right answer; there will always be a trade-off, case by case, depending on the context.

throw0101a|3 days ago

> I had my formative years in programming when memory usage was something you still worried about as a programmer.

As 'just' a user of MS-DOS in the 1990s, fiddling with QEMM was a bit of a craft to get what you wanted to run in the memory you had.

* https://en.wikipedia.org/wiki/QEMM

(Also, DESQview was awesome.)

lmcd|3 days ago

I've recently started a side project for the N64, and this is very relatable! Working within such tight constraints is most of the fun.

eulers_secret|3 days ago

Depends on the machine you’re targeting.

I do embedded Linux, and RAM usage is a major concern; same for other embedded applications.

I'm partying like it's the 90s, on a 32-bit processor and a couple hundred MB of RAM.

nostrademons|3 days ago

Android's investing significantly in reducing the memory usage of the next release simply because the BOM cost of RAM for their low-end partners is becoming prohibitive.

halJordan|3 days ago

But is that new or different because of this event? No, it's not; Android has had several initiatives to enable low-end devices, from optimizing full-fat Android to inventing new versions of Android.

NooneAtAll3|3 days ago

most likely this bubble will pop in a couple of years, just like 8 and 16 years ago

it's just the cartel cycle: gain profits now, then eliminate all investment in competitors when the flood of cheap RAM "suddenly" appears

thfuran|3 days ago

This is coming from an insane demand spike, not some nefarious plot by the RAM manufacturers.

seanmcdirmid|3 days ago

Eventually new capacity will come online, and the money the DRAM companies are making is going to accelerate even more new capacity. If you can get your new capacity going before your competitors, maybe you can avoid the bubble bursting on you. If you don't build new capacity, your competitors will, etc., etc…