top | item 41501146

emehex | 1 year ago

May I interest you in Jevons paradox:

> In economics, the Jevons paradox occurs when technological progress increases the efficiency with which a resource is used (reducing the amount necessary for any one use), but the falling cost of use induces increases in demand enough that resource use is increased, rather than reduced.

Source: https://en.wikipedia.org/wiki/Jevons_paradox
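A toy numeric sketch of the mechanism (illustrative numbers and a hypothetical constant-elasticity demand model, not anything from the Wikipedia article): whether total resource use rises or falls depends on whether demand grows faster than efficiency.

```python
# Toy illustration of the Jevons paradox (all numbers made up).
# Model: cost per unit of output falls as efficiency rises, and
# demand responds to cost with a constant price elasticity.

def resource_use(efficiency, elasticity, base_demand=100.0):
    """Total resource consumed when unit cost ~ 1/efficiency."""
    cost = 1.0 / efficiency                        # cost per unit of output
    demand = base_demand * cost ** (-elasticity)   # demand rises as cost falls
    return demand / efficiency                     # resource needed to meet demand

# Inelastic demand (elasticity < 1): doubling efficiency cuts resource use.
print(resource_use(2.0, 0.5))   # ~70.7, down from 100
# Elastic demand (elasticity > 1): doubling efficiency *raises* resource use.
print(resource_use(2.0, 1.5))   # ~141.4, up from 100
```

The crossover at elasticity = 1 is the whole paradox in miniature: below it, efficiency saves the resource; above it, efficiency grows total consumption.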

rachofsunshine|1 year ago

As a concrete application: software development has gotten infinitely easier in recent decades. Better IDEs, fewer performance constraints, better virtualization, fewer worries about on-prem deployments, autocomplete, easy version control, you name it. Any given engineer should be orders of magnitude more productive - yet demand has (recent slump aside) only grown.

visarga|1 year ago

And computers have become 1 million times better in the last 2-3 decades. Higher frequency, more RAM, more bandwidth, more network peers. And yet we make even more of them.

> "I think there is a world market for maybe five computers." Thomas Watson, president of IBM, 1943

rightbyte|1 year ago

> software development has gotten infinitely easier in recent decades.

I don't agree here. It was way simpler in the 90s. The programmer experience probably peaked around the transition from TUIs to Win32, when you could do either. Differing screen resolutions are probably what made programming GUIs suck. And all the churn of Microsoft and Oracle frameworks didn't help.

Nowadays the overhead of making an app that passes procurement is insurmountable. And consumers seem to not buy apps at full price anymore.

dmitrygr|1 year ago

> software development has gotten infinitely easier

Writing good software is as hard as it has ever been. IDEs don’t help you with anything that makes proper software difficult. The only thing that has changed is that users have been conditioned to accept shit.

incrudible|1 year ago

> Better IDEs

We are now getting to a point where IDEs are as good as the ones we had in the 90s.

> performance constraints

Evened out by higher fidelity and less efficient programming languages and paradigms.

> deployment

More robust, perhaps, but also much more complex.

> version control

An improvement in some respects, a regression in others.

> more productive

Hard constraints make people productive. Being productive is about what not to do; impossibilities make for easy decisions.

rvense|1 year ago

Typing on a typewriter is five to six times faster than handwriting. Imagine if we still spent all our time writing by hand! How silly that would be.

amy-petrik-214|1 year ago

As another concrete application, roads:

https://bangaloremirror.indiatimes.com/opinion/others/easyno...

The idea is that more lanes on the highway means more traffic, which means slower travel (even though you added a lane!). Following from this is what's called "Palin's Corollary": to make traffic faster, it's best to have fewer lanes. Politicians apply various techniques for this, such as perpetual construction or allocating vast swaths of asphalt to bicycles, to make the traffic flow faster.

So it does make sense that in fact slower chips will make AI faster, and punch cards will make software development faster, as the inverse of these proposed trends.

meiraleal|1 year ago

Productivity which is then cancelled out by the procrastination and mental-health issues social media brought to us all.

codesnik|1 year ago

well, it's time for another abstraction layer.

binalpatel|1 year ago

My personal take (an overly biased view after reading Chip War recently) is pretty much this: it seems like a lot of the same early dynamics of semiconductors are playing out here.

Very large R&D expenditures for the next iterations of the models at the leading edge (the "fabs" of the world), everything downstream getting much cheaper and better with demand increasing as a result.

Like a world where Claude Opus 3.5 is incredibly expensive to train and run, but also results in a Claude Haiku that's on net better than the Opus of the prior generation, occurring every cycle.

skzv|1 year ago

One of my favourite economic paradoxes. It changes the way you think about efficiency and consumption.

My colleague introduced me to this idea. He had been studying ways to increase computing efficiency out of concern for the environment. Making programs more efficient would reduce energy consumption, right?

His advisor introduced him to Jevons paradox and he realized such efforts could have the exact opposite effect. So he dropped that research entirely. If you're worried about energy consumption, you need to make energy production more green, not machines more efficient.

Making data centers more efficient will probably cause us to build more data centers and use more power overall, not less.

01HNNWZ0MV43FF|1 year ago

Unless it's something with a relatively fixed demand

SoftTalker|1 year ago

Sounds like a variant on the "induced demand" theory that people who are opposed to road building always trot out.

J_Shelby_J|1 year ago

Well, we’re not going to roll it out.

But it’s not really a theory so much as an established fact that the only way to reduce traffic is to have viable alternatives to driving.

obelos|1 year ago

If you have ever not gone somewhere because “there's too much traffic” or chosen to go to a store because it has easier parking than an equivalent alternative store, you've experienced the rudiments of induced demand.

jessriedel|1 year ago

It is, but in both cases it's not a good reason by itself to reduce investment in supply (roads or GPUs).

rcxdude|1 year ago

One is indeed an example of the other

hangsi|1 year ago

Interesting to consider GPUs as the coal of the AI revolution.

This is worth it for the mental image of heaping them into a boiler fire by the shovel load alone.

EasyMark|1 year ago

This definitely made me laugh after reading (well, scrolling) endless complaint threads on the price and unavailability of cheap graphics cards thanks to $coin mining and AI usage. I would love to shovel a few into the fire to produce energy for the next generation of overpowered cards while I play old-school games on my $250 laptop.

EasyMark|1 year ago

I bring this up all the time with coworkers. When a new generation of processors comes out with amazing speed/core-count/power improvements, developers get lazier. I'm all for meaningful improvements, and I'm grudgingly on the side of stuff like Electron that allows easy cross-platform dev, but please, for the love of God, quit stacking on garbage features and useless GUI mods, pointless graphics, endless pulling in of huge libraries to do one little thing, etc. I try my best to keep my C++ and Rust dev as small as possible, with as few dependencies as possible. If a dependency would save me more than a week of work writing it myself, I'll give it strong consideration; otherwise I write it myself.