
Computers Reduce Efficiency: Case Studies of the Solow Paradox (2023)

97 points | gtt | 1 year ago | scottlocklin.wordpress.com

87 comments


virtue3|1 year ago

I think this article is drastically downplaying how dramatically more complicated designs and things are now than they were before.

I don't believe it's computers that are to blame; I believe it's complexity nightmare problem.

We have much tighter tolerances for everything now; everything "does more" and relies on more components.

Back when we used pen and paper to create military vehicles, it was mostly JUST about performance and completing the objective. There weren't thousands upon thousands of other requirements and features (whether or not this is a good thing is debatable).

mcphage|1 year ago

> Back when we used pen and paper to create military vehicles, it was mostly JUST about performance and completing the objective. There weren't thousands upon thousands of other requirements and features (whether or not this is a good thing is debatable).

It makes me wonder if this piling-on of requirements is also enabled or encouraged by computers. I agree that things are becoming more complex, but I’m thinking computers might be partially to blame for that as well.

hulitu|1 year ago

> everything "does more"

The Windows 2000 GUI was much more complex than the Windows 10 or 11 GUI. Yet Windows 10 and 11 have difficulty painting the screen (glitches, black screens for a second, and so on).

7thaccount|1 year ago

Agreed. Complexity could ultimately be our downfall. Everything is drastically more complicated than before and the margins for safety are getting more and more reduced.

Take my own industry of electricity markets, for example. It used to be that you had large, vertically integrated utilities that handled the generation of power as well as its transmission to the residential grid (distribution). They would run the grid and factor all costs (fixed and variable) into residential and industrial rates. This is easy to explain to someone in a minute or so.

In the 1970s and 80s, though, deregulation took off and you could finally build fairly efficient, smaller gas plants, so there was a push to have much larger grid operators optimize over a much larger region and introduce competition amongst those in the market, so the public wouldn't suffer from unwise investments by the utilities. This system is more efficient, but it is supposed to operate off of a "free market" system. The only problem is that it has never worked very well overall. It does schedule power more efficiently, but you have all these power plants, needed for just a handful of events, that are no longer solvent because they can't earn enough money in the markets. So the grid operators are dealing with mass-scale retirements (some of these would've occurred anyway due to EPA rulings) and spending tons of time and money trying to fix a problem that didn't use to exist.

These organizations have thousands of pages of legal documents, run enormously complex power auctions, and have to employ hundreds of employees to administer all of it. Very few people understand how the cake is made anymore, so to speak. Does it save more money? Yes, but the cost is a massive increase in complexity that grows each year as new rules are made. So we took something conceptually simple and made it 10x more complex in order to squeak out more savings. I'm not saying it was the wrong path, but doing this sort of thing all over society/the economy has its own costs.

rini17|1 year ago

And what about the financial services mentioned in the article? I doubt that "everything does more" there.

constantcrying|1 year ago

It definitely is true that a design which could easily be created by hand is harder to create on a computer. The things where computers shine are actually managing the complexity of large and complicated systems.

What I think the article leaves unspoken (but implied) is the "curse of tools": if you give a person tools, he is likely to use them, even if they might not be applicable. Meaning that someone might decide to create a complex solution to a problem simply because the tools he has been given allow him to do so. I think it is always very important to keep in mind what was achieved with the very limited tools of the past, and the immense ingenuity of the people who worked within those limits.

dimask|1 year ago

> The things where computers shine are actually managing the complexity of large and complicated systems.

I would argue that where computers shine, first and foremost, is in automating repeated tasks. Even if a task is fairly simple and doing it by hand takes less time, if you have to repeat the same task over and over, it may be beneficial to use computer tools that allow some automation, even if the first couple of runs take more time. In this sense, something being easier to do by hand (once) does not necessarily imply that it is better to do it by hand.

But I do agree that an increase in complexity comes as a curse of tools. People with less technical understanding may find it easy to see some of the benefits of such tools, but it takes them longer to catch the problems that increased complexity brings.
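The trade-off described above, where automating a task costs time up front but pays off over repetitions, can be sketched as simple break-even arithmetic. All numbers here are hypothetical illustrations, not measurements from the article:

```python
def breakeven_runs(manual_minutes, automated_minutes, setup_minutes):
    """Number of repetitions after which automation becomes a net win.

    Returns None if the automated run is no faster than doing it by hand,
    in which case automation never pays for its setup cost.
    """
    saved_per_run = manual_minutes - automated_minutes
    if saved_per_run <= 0:
        return None
    # Ceiling division: the first run count where total savings >= setup cost.
    return -(-setup_minutes // saved_per_run)

# Hypothetical example: 10 min by hand, 2 min automated,
# 120 min to build and debug the automation.
print(breakeven_runs(10, 2, 120))  # -> 15
```

So under these made-up numbers, the tool only wins if the task recurs at least 15 times, which is exactly why a one-off job is often faster by hand.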

docfort|1 year ago

Complexity is the outcome of misunderstanding. The misunderstanding can come from lots of areas.

It could be from a requirements perspective: “I understand what I can build easily, but not what you want.”

It could be from an engineering perspective: “I understand what you want, but I don’t understand how to build that cohesively.”

It could be from a scientific perspective: “No one knows what tools we need to investigate this.”

I saw mentioned in other comments that CAD software doesn’t allow for sketching. As someone who was originally trained in drafting the old way, and who has used modern CAD systems to produce models for fantastically large physical simulations, I largely agree that sketching is lost. But the sketching that I can do on paper is just not at the same level of complexity as I can kinda do on my computer.

But the complexity of using the new tool obscures the fact that my model is much more complicated than I could otherwise manage using old tools. And that’s because I’m still learning. In fact, I have to “deploy” while I’m still in learning mode about a problem, unlike before, where I had to understand the problem in order to use the tools to draft the thing.

Being able to do something with a half-formed idea sounds like sketching, but when non-experts rely upon it, it’s pretty fragile. Because it wasn’t done.

Building a memex (something the author disparages multiple times) is super hard because we still don’t understand how to represent ideas separately from language, our original mental sketching tool. But people built Altavista and Google and LLMs anyway. And yeah, they’re super complex.

How does TCP/IP work over wireless connections? Poorly and with a lot of complexity. Why? Because the concept of a connection is ill-defined when communication relies on ephemeral signaling.

But despite the complexity, it is useful and fun to use only half-baked ideas. Just like it’s fun to use language to describe stuff I don’t understand, but merely experience. Graduation. Being a parent to a sick child. Losing a loved one.

pas|1 year ago

The problem is that we are flooded with low-quality tools.

In general it is almost universally true of software nowadays. (Because change/progress leapfrogged any kind of teleological end-to-end design, OR it's simply unmaintained; for example, see any US government IT system. Or the ubiquitous, extremely fragile corporate synthetic snowflake software that only runs on Windows XP SP1 plus those 3 particular patches, with that exact vc6.dll copied over from an old CD.)

A good quality information processing tool is designed for the process that it's meant to augment, ideally considering the original problem, and ultimately improving the process itself.

(Just digitizing a form likely leads to worse outcomes, because those forms were designed for pen and paper. Screens and keyboard navigation require different flows.

And the usual process is even worse, which consists of reinventing the wheel without context, as in speedrunning through all the problems of corporate politics and procurement, delivering some half-assed milestone-driven monstrosity, and forcing it on employees.

Of course, due to the aforementioned universal internal incompetence-fest and quarter-driven development, budgets are bad, required time and effort is underestimated, learning curves are ignored, software gets set in stone too soon, and thus efficiency remains well below what was expected, planned, possible, and hoped for.)

Nevermark|1 year ago

Never underestimate the personal satisfaction of spending a day, or a week, installing, playing with, or configuring one's tools!

In the past, we customized our workflow while in the flow. Now, to approximate that freedom, we have to futz around up front with the limited control levers they give us, in a slow feedback loop with any actual work, to get our tools to help us in the way we want.

Which for complex tools and work, can rapidly become its own end.

api|1 year ago

The curse of tools applies in computing too. A virtue of languages like Go is that they reduce incidental and gratuitous complexity through the simplicity of the language. A complex language with a powerful type system like Rust or C++ will tempt programmers to use every facet of the language and create baroque overly complex code.
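That temptation is easy to demonstrate. Here is a hypothetical Python sketch (the languages named above are Go, Rust, and C++, but the temptation is language-agnostic): the same trivial task written plainly, and then rebuilt with abstractions simply because the language offers them.

```python
from functools import reduce
from operator import add

# Plain: the minimal, "simple language" style of solution.
def total(prices):
    t = 0.0
    for p in prices:
        t += p
    return t

# Baroque: identical behavior, built only because the tools allow it.
class PriceVisitor:
    """A strategy-pattern abstraction nobody asked for."""
    def __init__(self, strategy=add):
        self.strategy = strategy

    def visit(self, prices):
        # Left fold over the prices with the injected strategy.
        return reduce(self.strategy, prices, 0.0)

prices = [9.99, 4.50, 12.00]
assert total(prices) == PriceVisitor().visit(prices)
```

Both versions compute the same sum; only one of them will need a design document.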

paulsutter|1 year ago

The SpaceX Raptor engine was designed using a full engine combustion simulator, without which the engine would have been impossible [1]. Not to mention the rapid evolution from Raptor 1 to 3 [2].

Jet aircraft are 70% more efficient than in 1967, largely thanks to simulation [3]; it's similar in automotive.

It's unclear how the NVIDIA H100, with its 80 billion transistors, could have been designed by hand.

Net-net: computers are necessary, but we need much better UIs and systems. Maybe AI will help us improve this.

[1] https://youtu.be/vYA0f6R5KAI?si=SG1vLMMl8l3DuCYN

[2] https://www.nextbigfuture.com/2024/04/spacex-raptor-3-engine...

[3] https://en.m.wikipedia.org/wiki/Fuel_economy_in_aircraft

> Jet airliners became 70% more fuel efficient between 1967 and 2007, 40% due to improvements in engine efficiency and 30% from airframes.

matheweis|1 year ago

There are at least two dimensions to this that I believe that the author has overlooked:

1. Economies of scale. It may be that drafting something up in CAD takes more cycles to get right up front, but once you have an established design, it is orders of magnitude easier to reproduce.

2. Changes in software. Software companies are forever changing their interfaces, decreasing productivity every time their users encounter a new learning curve.

pas|1 year ago

And CAD models benefit from better technology "for free", better visualization, better heuristics (for rule/code/safety/conformance checking), and so on.

Did it make sense for military nuclear submarines back then? Well, maybe not, who knows. (Submarines are definitely not mass produced.)

But what this 'insightful essay' ignores is that productivity decreased overall in the 'West' (as the post-WWII boom ended) but then picked up right around the dot-fucking-com boom. Oh, wait computers. But maybe this glorified shitpost should have used recent datasets instead of spending its eruditeness budget on extra spicy and fancy words. (propitiate!)

https://www.mckinsey.com/~/media/mckinsey/mckinsey%20global%...

https://cepr.org/sites/default/files/styles/flexible_wysiwyg...

https://www.caixabankresearch.com/sites/default/files/styles...

joeatwork|1 year ago

Something this article leaves out is that mostly, when people are given better tools, they don’t just produce more widgets per unit time: often instead they build different (more complex, better) widgets. When I was in school I read a study about this - a design shop had N draftsmen, they introduced CAD tools anticipating reducing the staff, and when researchers went back to the shop they had the same staff, but they were designing things that wouldn’t have been practical or possible before.

Nevermark|1 year ago

Underappreciated: automation creates, or dramatically enhances, the need to fully understand problems and solutions at the most detailed and practical levels, because automation removes the valuable manual, ad hoc flexibility to adapt to most wonkiness.

1. When a job requires a mix of human and computer work, productivity changes are very dependent on interface details. Even one slightly confusing GUI, slowness of feedback, a tool that isn't quite as flexible as a job needs, or an inability to see/edit/use related information at the same time, can greatly harm productivity.

2. When a job is completely automated, productivity can go way up. But this productivity doesn't get attributed to human workers, it is corporate productivity. And then only if this highly optimized task really provides value. There is a lot of performative information processing, with conjectured long term payoffs, serving the needs of management and tech workers to look busy, and believe they are valuable.

For both human and corporate productivity, automation makes it extremely easy to decrease productivity due to the most subtle mismatches between problems and solutions.

When work is done by hand, these mismatches tend to be glaringly obvious, less tolerated, and more easily mitigated or eliminated.

flavaz|1 year ago

A classic example of this would be how some roles require endless spreadsheets, or individual updates to a CRM tool like Pipedrive.

CRM tools add a lot of overhead to what should be a simple process: letting your manager know what you're up to.

onthecanposting|1 year ago

If that bookkeeping overhead is fed into analysis and process mining to drive improvement, it might be a net gain. More often, though, I see yet-another-spreadsheet applied as a panacea; then it's forgotten in a few months, and the process repeats over many years.

pnut|1 year ago

I'm not a CRM end user, but I'd be very grateful for such a tool if I had to suddenly cover for a coworker, or inherit an existing business relationship. What is the alternative, each person individually cobbles together some godawful workflow management system? With no centralised repository of information?

Totally unsustainable, and not at all related to keeping your manager informed.

ElevenLathe|1 year ago

Good to know sales people have their own version of JIRA hell.

geysersam|1 year ago

Nobody can convince me computerization has not improved efficiency in industrial manufacturing. But computerization has probably led to fewer people working in manufacturing. Did overall efficiency decrease or increase?

pydry|1 year ago

Offshoring did 90% of that, not computers.

The effect of computerization was just to keep some industrial production at home - mostly the unique kind of manufacturing where labor costs weren't dominant.

I think the US is in for an epic shock in a few years' time when they realize just how much getting cut off from Chinese factories will hurt.

Either demand for labor to substitute for those Chinese factories will spike, or the US will economically spiral as inflation takes off like a rocket and US elites try yet again to shift the burden onto the politically impotent working classes.

gieksosz|1 year ago

At first I thought it was written in the '80s; then I saw the author mention 1995, and it began to feel very strange that someone from the mid '90s would rant against computers. Then I reached a section about LLMs…

AndrewKemendo|1 year ago

I build architectures for major systems and in every case I start with a blank paper in a Strathmore 400series Sketchbook

About 4 years ago I made a wall of my office into a chalkboard and that’s been where I work out massively complex interdependencies and data workflows

Nothing on a computer remotely compares to the speed and specificity of pen or chalk in hand

Almondsetat|1 year ago

Except a tablet with pen support

nitwit005|1 year ago

Case studies from the 80s, as productivity started improving again in the 90s.

I can actually remember Alan Greenspan discussing this, despite how young I was.

Manfred|1 year ago

To be precise, the first study mentioned is from 2011, then 1989, then 1987, then studies done in the 1990s.

dahart|1 year ago

The financial services industry has been revolutionized by computers; I have no idea why the author thought it would make a good example. Today's stock markets & high-frequency trading & online banks & international finance couldn't (and can't) exist without computers. The explosion of personal investing and day trading that has changed trading doesn't exist without computers.

The entire computer industry itself has accelerated and grown because of computers; nowhere has he accounted for the "productivity" attributable to sales of computers. Fields I've worked in, video games and CG films, have absolutely increased efficiency with computers: for equal-sized productions, the quality has gone up and the workforce needed has gone down, consistently, for decades.

The article has only a single, completely vague data point that includes anything from the last 30 years; that's a major red flag. The invective portmanteaus and insult words are also a red flag and very weak argumentation. Are they supposed to make up for the complete lack of any relevant data? Not to mention some of the insults are worse than iffy by today's standards and don't reflect well on the author.

Call me rather unconvinced, I guess.

superfunny|1 year ago

Agreed - I was very surprised to hear this. From Dividend.com:

"In the late 1960s, the volume of trading activity increased dramatically. With the drastic increase in volume, the NYSE had $4 billion in unprocessed transactions by 1968. To catch up, the exchange closed every Wednesday from June 12, 1968 to December 31, 1968. During this crisis, over 100 brokers failed due to the high volume of transaction that could not be processed."

Today, the NYSE processes trading volumes of 3-4 billion shares per day.

rdlecler1|1 year ago

This analysis ignores the impact of competition. A car produced today is better (and more complex to make) than a car produced in 1980.

Technological productivity isn't just about improving the number of units or dollar value produced per hour of input. Technology can make products more competitive without any increase in productivity, by making them better, and therefore more attractive, to customers, even if unit cost or volume stays fixed.

beretguy|1 year ago

I don't know. We need to define "better" first. My friend has a 1970s F150 that still drives. "How long will the car run, so that I don't have to buy a new one?" - that's my definition of "better". Will modern cars still run 50 years from now?

smeej|1 year ago

I'm noticing a version of this as I pilot switching to a handwritten bullet journal for task management. I'll still sit down at the computer to brainstorm and organize the tasks of my projects, because being able to move list items around is a huge advantage, but when it actually comes to doing the darn things? It's been so much more effective to track them in the notebook. Planning out my daily schedule, figuring out what I can do in which timeframe, and making sure things don't fall through the cracks has worked so much better on paper.

cushychicken|1 year ago

Something tells me this guy is one of those people who thinks version control is a newfangled process step and not a useful piece of a development cycle.

courseofaction|1 year ago

There are better explanations for the general downturn of productivity despite better tools: Increased focus on extraction since the expansion of neoliberal policies in the 70s.

https://wtfhappenedin1971.com/

Samtidsfobiker|1 year ago

I too have wondered why it takes so long to make stuff in CAD, and why even suggesting that everything should fit together the first time is laughable at best.

My theory is that computers can't do rough sketching. No CAD software suite (I think) can iterate on and evaluate rough ideas as fast and flexibly as a whiteboard pen in a meeting room can.

JKCalhoun|1 year ago

That's the way I see it.

Just as an example I am familiar with: so many people appear to begin a project like a MAME cabinet with SketchUp.

I like "Cardboard Assisted Design" and have literally built several MAME cabinet prototypes in cardboard where iteration is easy with merely a box-cutter knife.

When the ergonomics and part-fitting are "go", I take measurements from the cardboard proto and move to wood.

Designing acrylic parts for later laser-cutting I have also used "CAD" for prototyping — sometimes even flat-bed scanning the chipboard prototype and then moving to a vector drawing app to overlay the laser-friendly beziers.

Even for PCB layout I often will laser-print the PCB as a final sanity check: punching holes in the paper and shoving the actual electronic components in to see that everything will fit before I send the file off to Taiwan.

nurple|1 year ago

Your theory seems sound. I see this theme of free-flow expressivity vs. stricture and formalism in the choice of programming languages. CAD is like Rust, where it never fits together the first time and parametric stricture lines the road of progress with infinite caltrops, in abdication to correctness; JavaScript is a sketchbook, where ideas and mutations happen easily and often, and correctness is a beautiful illusion brought about through the evolutionary nature of experimentation.

marcosdumay|1 year ago

Mechanical CADs aren't designed for maintainability.

They are designed for expressivity first and ease of learning (for people with industry knowledge) second. And those are the only two goals.

Just try to adapt a mechanical design for a slightly different task. It's usually better to start from scratch.

(Anyway, that's an example of computers not being fully used. Going from the Solow paradox to "computers are bad" - like lots of people like to do, even here - is just stupid.)

sobellian|1 year ago

Am I nuts, or does the first graph show exactly the opposite of his claim? He says it shows declining productivity, but labor productivity rises. Costs also rise but this is exactly what one should expect from labor-saving devices, no?

hcks|1 year ago

“Computa*” this is insane

TazeTSchnitzel|1 year ago

What's with a lot of the mentions of “computer” using what looks like a portmanteau with “retard”?! I know some famous people like Stallman do this, but I don't think it's perceived positively.

sourcepluck|1 year ago

Provide a link to Stallman using that term then please? Otherwise, I'm calling nonsense. I'd be 95+% sure you're making that up, or confused in this case.

doubloon|1 year ago

A lack of consideration for others feelings.

eru|1 year ago

Well, Stallman ain't perceived positively, either.

The author clearly has an axe to grind. I haven't read enough yet to decide whether they have a valid point.

drewcoo|1 year ago

[deleted]

dopylitty|1 year ago

For similar reasons to those mentioned in the article, it's possible the past century will be seen as a dark age by future humans. Computers are incredibly fragile and depend on complex systems (e.g. the electricity grid) to even operate. They also can't reliably persist data across even a few decades. Yet we've created a society where nothing can be done without a computer, and all our data is stored in computers instead of physically.

When those complex systems fail and the computers stop working we'll be left without any traces of the knowledge generated in the past century or the people who generated it. We'll also have lost all the previous knowledge that was moved from physical to digital storage.

All future humans will see from this century is a whole lot of microplastics.