top | item 43488436

baazaa | 11 months ago

I think people need to get used to the idea that the West is just going backwards in capability. Go watch CGI in a movie theatre and it's worse than 20 years ago, go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more. And these are industries which should be seeing the most progress; things are even worse in hard-tech at Boeing or whatever.

Whenever people see old systems still in production (say things that are over 30 years old) the assumption is that management refused to fund the replacement. But if you look at replacement projects, so many of them are such dismal failures that management's reluctance to engage in fixing stuff is understandable.

From the outside, decline always looks like a choice, because the exact form the decline takes was chosen. The issue is that all the choices are bad.

nisa|11 months ago

My personal theory is that this is the result of an incompetent management class where no self corrections are happening.

In my work experience I've realized everybody fears honesty in their organization be it big or small.

Customers can't admit the project is failing, so it churns on. Workers/developers want to keep their job and either burn out or adapt and avoid talking about obvious deficits. Management is preoccupied with softening words and avoiding decisions because they lack knowledge of the problem or process.

Additionally, there has been a growing pipeline of people who move directly from university, where they've been told only to manage other people and not to care about the subject, into positions of power where they are helpless and can't admit it.

Even in university, working for the administration, I've watched people congratulate themselves on doing design-thinking seminars every other week and work on preserving their jobs instead of doing useful things, while the money for teaching assistants or technical personnel is not there.

I've seen that so often that I think it's almost universal. The result is mediocre broken stuff where everyone pretends everything is fine. Everyone wants to manage, nobody wants to do the work or god forbid improve processes and solve real problems.

I've got some serious ADHD symptoms, and as a sysadmin it's pretty obvious when you fail to deliver. I messed up big time more than once, and it was always sweet-talked, excused, or bullshitted away by higher-ups.

Something is really off and everyone is telling similar stories about broken processes.

Feels like a collective passivity that captures everything, where nobody is willing to admit that something doesn't work. And a huge misallocation of resources.

Not sure how it used to be, but I'm pessimistic about how this will end.

AnthonyMouse|11 months ago

> My personal theory is that this is the result of an incompetent management class where no self corrections are happening.

This is really a cultural problem that has infected management along with everyone else.

It used to be that you were expected to be able to fix your own car or washing machine, and moreover that a machine you couldn't fix would be rejected by customers. Things were expected to come with documentation and be made of modular parts you could actually obtain for less than three quarters of the price of the entire machine.

Now everything is a black box you're expected to never open, and if it breaks and the manufacturer doesn't deign to fix it, you go to the store and buy another one.

The problem with this is that it poisons the well. Paying money to make the problem go away instead of learning how to fix it yourself means that, at scale, you lose the ability to fix it yourself. The knowledge and infrastructure to choose differently decays, so that you have to pay someone else to fix the problem, even if that's not what you would have chosen.

The result is a helplessness that stems from a lack of agency. Once the ability to do something yourself has atrophied, you can no longer even tell whether the person you're having do it for you is doing it well. Which, of course, causes them to not. And in turn to defend the opacity so they can continue to not.

Which brings us back to management. The C suite doesn't actually know how the company works. If something bad happens, they may not even find out about it, or if they do it's through a layer of middle management that has put whatever spin on it necessary to make sure the blame falls on the designated scapegoat. Actually fixing the cause of the problem is intractable because the cause is never identified.

But to fix that you'd need an economy with smaller companies, like a machine with modular parts and documented interfaces, instead of an opaque monolith that can't be cured because it can't be penetrated by understanding.

somenameforme|11 months ago

I think a way to sum this up is simply metric optimizing. As organizations and companies grow larger, the need to evaluate people at scale becomes necessary. And so metrics are used, and people then naturally start to optimize around those metrics. But it seems to invariably turn out that any metric you create will stop effectively measuring progress towards the goal you want to achieve once that metric is being optimized for.

The traditional term for this is cobra effect. [1] When the Brits were occupying India they wanted to reduce the cobra population, so they simply created a bounty on cobra heads. Sounds reasonable, but you need to have foresight to think about what comes next. This now created a major incentive for entrepreneurial Indians to start mass breeding cobras to then turn in their heads. After this was discovered, the bounty program was canceled, and the now surging cobra farm industry mostly just let their cobras go wild.

I think the fundamental problem is that things just don't work so well at scale, after a point. This is made even worse by the fact that things work really well at scale before they start to break down. So we need a large economy that remains relatively decentralized. But that's not so easy, because the easiest way to make more money is to just start assimilating other companies/competitors with your excess revenue. Anti-trust is the knee jerk answer but even there, are we even going to pretend there's a single person alive who e.g. Google (or any other mega corp) doesn't have the resources to 'sway'?

[1] - https://en.wikipedia.org/wiki/Perverse_incentive

baazaa|11 months ago

While I suspect the root cause is managerial dysfunction ultimately the disease spreads everywhere. I've stopped honing my technical skills because I don't expect to ever work in an organisation sufficiently well-managed for it to matter. So then you end up with the loss of genuine technical expertise from generation to generation as well.

bsenftner|11 months ago

This crisis, and it is one, is caused by the unrecognized necessity of effective communication within science, technology, and business, which is not taught. Not really; only a lite "presentation skill" is taught.

Fact of the matter: communication is everything for humans, including dealing with one's own self. Communication is how our internal self-conversation, mired in bias, encourages or discourages behavior; communication is how peers lead, mislead, inform, misinform, and omit key information, including that critical problem information that people are too often afraid to relate.

An effective communicator can talk to anyone, regardless of stature, and convey understanding. If the information is damningly negative, the effective communicator is thanked for their insight and not punished nor ignored.

Effective communications is everything in our complex society, and this critical skill is simply ignored.

__oh_es|11 months ago

I would caveat that it's not affordable to be passionate anymore. The top engineers (mech, chem, civil, etc.) I know work in finance or consulting instead of doing things they care about.

Closer to tech, I feel we have had a big influx of non-tech people joining the tech workforce, and the quality has suffered as a result of a lack of fundamentals and passion.

fijiaarone|11 months ago

The cause of an incompetent management class is a subservient worker class. A worker class becomes subservient either because it is incompetent or because it doesn't have access to capital, meaning workers can't strike out on their own and leave management to suffer the consequences of its incompetence.

whatever1|11 months ago

This started when companies decided that labor is fungible.

The moment you admit failure as an employee, you are out of the company. And no, for most people it is not easy to find a new job without disrupting their lives (moving cities, redoing financial planning, even changing health insurance).

So employees do what they have to do. They will lie till the last moment and pretend that the initiatives they are working on are huge value add for the company.

In the past you knew you would retire from your company, and the compensation differential across levels was not that huge, so there was little incentive to BS.

Today everything is optimized with a horizon of a financial quarter. Then a pandemic hits, and we realize that we don't even know how to make freaking masks and don’t even have supplies of things for more than a week.

roenxi|11 months ago

> Something is really off and everyone is telling similar stories about broken processes.

There are people out there who are pretty conflict-avoidant by nature, and any group tends toward pretty significant levels of cohesion because of it. There are some classic stories about when that goes particularly wrong and spirals into a bad case of groupthink.

In the economy there are supposed to be some slightly cruel feedback mechanisms where companies (effectively big groups) that get off track are defunded and their resources reallocated to someone more competent. The west has been on a campaign to disable all those feedback mechanisms and let companies just keep trudging on. We've pretty much disabled recessions by this point. A bunch of known-incompetent management teams have been bailed out so they can just keep plodding along destroying value. There is not so much advantage in being honest about competence in this environment, if anything it is a bad thing because it makes it harder to take bailout money with a straight face.

I cite the Silicon Valley Bank collapse as an interesting case study. A lot of companies should have gone bust with that one, because they were imprudent with their money. They didn't.

zosima|11 months ago

This is it. There is a mass hypnosis in the West where reality is at best completely ignored and at worst actively treated in a very hostile manner.

choeger|11 months ago

I can confirm nearly everything you say, and I'd like to add that it's a cultural phenomenon. We don't seem to value competence anymore. I cannot recall when I last heard someone say something positive about another person's competence. Be it a craftsman, industry worker, knowledge worker, or even a teacher. There doesn't seem to be any value in doing a good job.

lenerdenator|11 months ago

> My personal theory is that this is the result of an incompetent management class where no self corrections are happening.

Close. They're not incompetent; we just redefined competence.

It used to be that competence was a mix of a lot of distinct, but interdependent, qualities. The end result was synergy that allowed for people and organizations (including companies) to compete and move society forward.

In the 1970s, we started to allow a bunch of psychopaths (I'm saying this in the clinical sense) to redefine competence. Instead of this array of distinct qualities, they just defined it in terms of ability to create monetary value, particularly if that value was then transferred to shareholders. That was it.

We also switched to quarterly reporting for for-profit companies, shrinking the window to evaluate this new definition of competence to 90 days. Three months.

An end result of this was that you could simply do whatever made the most money in 90 days and be considered competent.

Jack Welch was the paragon of this. GE shareholders saw massive gains during the latter half of his tenure at the helm. This wasn't because of groundbreaking new products or services; quite the opposite: Jack realized that selling off divisions and cutting costs by any means necessary was a good way to make money in the 90 day period. Institutional knowledge and good business relationships in the market - two of the elements of the former definition of competence - were lost, while money - the sole element under which competence was judged in the new definition - went up.

You also had managers avoiding a lot of real management, as you describe. Instead of betting on a new product or trying to enter a new market, they took a Six Sigma course, learned a bunch of jargon, and cut costs at the expense of the business past the 90-day period.

If you do this enough (and we did, far beyond just GE), that expense is taken at the societal level. Existence extends beyond 90 days. You can't mortgage the future forever. It's now the future, the payment is due, and we have an empty account to draw from.

Theoretically, we could go back to a more in-depth evaluation of competence and reward its display over the long term. In practice, there are a bunch of people who got unfathomably wealthy off of the shift to the "new" competence, and now they're in charge and don't want to switch back, so we won't.

nostrademons|11 months ago

They're responding to incentives. The only user that matters is the marginal user, the person who didn't previously use your product but now does. They even teach this in MBA programs and economics classes. And so the only efforts that matter are those that create a customer, and hence management spends a great deal on promotions, marketing, new-customer discounts, advertising, gamification, addictive usage mechanics, lock-in, etc., but basically zero on making existing customers happy. It's almost better if they aren't happy: an "ideally run" company is one whose users hate its product but don't hate it quite enough to quit using it (or if they do, they have no alternatives).

Enshittification in action.

amadeuspagel|11 months ago

If only people were allowed to start their own companies.

mitjam|11 months ago

It's the age of thinking instead of doing. Thinking doesn't solve doing problems, but we can think and talk them away, or at least outsource the doing. (Hmm, what an interesting thought. Let's think about it some more.)

forgotoldacc|11 months ago

One big problem is becoming a manager is seen as the end goal, and pay often reflects that.

Being a great engineer or researcher doesn't pay. You won't get your name known for your work. All your achievements will be attributed to whoever manages you at best, or attributed to the corporation above you with not a single human name at worst.

People like being recognized for their work. Every great achiever wants to have their name remembered long after they leave this world. Everyone wants to be the next Isaac Newton. The next Bill Gates. The next Steve Jobs. The next Elon Musk. It's a constant downhill path from being known for using your brain and busting your ass to discover or create something, to being known for managing someone who created something, to being known as someone who bought the company that managed people who created something. Motivations are all fucked up. No matter what you discover or create these days, there's a feeling that you're not going to have your name written in history books. Your best options are to join a grift or to manage someone who's doing the hard work.

fungiblecog|11 months ago

this is exactly my experience

Art9681|11 months ago

You're just getting older and looking at the past with rose-colored glasses. No one is going backwards in capability. It is about how accessible and cheap the thing is. In the 90's, a license to install Maya, 3D Studio Max, or Lightwave was extremely expensive; those products were neither promoted to nor available to the general public. They would cost tens of thousands of dollars for the software alone, not to mention the hardware.

Today it is a commodity. So we are flooded with low effort productions.

With that being said, we have more capability than ever, at the cheapest cost ever. Whether businesses use that wisely is a different story.

There will always be outliers. I see many comments with people who derived value from whatever they perceived as something uncommon and unique they could do. Now AI has made those skills a commodity. So they lose their motivation since it becomes harder to attain some sort of adoration.

In any case, going forward, no matter what, there will be those who adopt the new tools and use them passionately to create things that are above and beyond the average. And folks will be on HN reminiscing about those people, 30 years from now.

mitthrowaway2|11 months ago

But for example Toy Story (1995) had a budget of 30 million. Today's Disney box office flops have budgets closer to 250 million.
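For scale, adjusting that 1995 budget for inflation (the multiplier below is an assumed round figure for 1995-to-2024 price growth, not sourced CPI data):

```python
# Toy Story's 1995 budget expressed in rough 2024 dollars.
# cpi_multiplier is an assumed approximation, not official CPI data.
toy_story_1995 = 30_000_000
cpi_multiplier = 2.0              # assumed rough 1995 -> 2024 price-level change
adjusted = toy_story_1995 * cpi_multiplier
print(f"${adjusted / 1e6:.0f}M")  # prints "$60M": still well under $250M
```

Even doubled for inflation, the Toy Story budget is a fraction of what today's flops cost.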

somenameforme|11 months ago

This is a tangent, but I don't think the cost of things like 3DS or Maya was ever a major barrier to entry. They were widely available for 'free' download. I think the companies involved were basically treating this as what would eventually become the modern 'free for entities with less than $xxx annual revenue' license, as there was seemingly zero effort to ever enforce their copyrights. To say nothing of the countless commercial books available for both, which simply would not have had a market if they were only selling to people with real licenses for the software.

johnnyanmac|11 months ago

If it's a commodity why is everything worse in quality? Commodification doesn't explain drop in objective metrics like performance, security, and complexity. It doesn't even explain the decline in stuff like customer satisfaction.

I don't think talent is the problem either. There's a lot more talent now than in the 90's.

yubblegum|11 months ago

Boeing calls to say hello...

bko|11 months ago

I was thinking about examples of where things got worse over time. They include some common appliances that use water, due to water use regulations. No reason my dishwasher should take over 2 hours to run. But then there's other things like food delivery.

I used to deliver pizzas in the early 2000s. I would get paid:

- $4/hour (later bumped to $5/hour)

- $1/delivery (passed through to the customer)

- tips

I had good days / times where I was pretty much always busy and made around $20/hour by the end.

So delivery cost the customer $1 + tip (usually ~$3), cost the business maybe $40 a night (~2.5 drivers for 3 hours), and I made out pretty well.

I can't compare exactly but I feel like today the business pays more, the customer pays more, the drivers get paid less and it's all subsidized by investors to boot. Am I totally wrong on this? But I feel like delivery got so much worse and I don't know where the money is going.
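Working through those numbers (all figures are the commenter's rough recollections from the early 2000s; the deliveries-per-hour pace is an assumed value, not from the original post):

```python
# Rough consistency check of the pizza-delivery numbers above.
# All inputs are the commenter's estimates; deliveries_per_hour is assumed.
base_wage = 5.0          # $/hour after the bump
per_delivery = 1.0       # $ per delivery, passed through to the customer
avg_tip = 3.0            # typical tip per delivery, $
deliveries_per_hour = 4  # assumed pace on a busy shift

driver_hourly = base_wage + deliveries_per_hour * (per_delivery + avg_tip)
print(driver_hourly)     # 21.0, close to the claimed ~$20/hour

# Business side: ~2.5 drivers for a 3-hour rush, paying only the base wage
# (the delivery fee and tips come from the customer, not the business).
business_cost = 2.5 * 3 * base_wage
print(business_cost)     # 37.5, close to the claimed ~$40/night
```

So the old numbers are at least internally consistent: a busy driver clearing about four deliveries an hour lands right around the $20/hour figure.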

rurp|11 months ago

I'm glad I'm not the only one noticing appliances getting worse across the board. I don't buy enough of them to really know if the trend holds overall, but the correlation is pretty much perfect between how new an appliance is and how much I hate it. For example, the controls on my new LG washer and dryer are incredibly bad. They've made it hard to impossible to just set the run level manually, to push you into preset programs for bedding or whatever. But those never work right! We've given up on using them entirely because they are terrible.

The main culprits I've seen are cheaping out on quality, replacing traditional controls with touch screens or "AI" magic buttons, and squeezing in more monetization streams or adding gimmicky features that actively make the product worse.

Maybe things will turn around someday. There are a few rays of hope, like the touchscreen fad in cars gradually losing its luster, but it seems like we've been on the wrong path for a long time and I'm not sure it will ever correct.

maxsilver|11 months ago

> I can't compare exactly but I feel like today the business pays more, the customer pays more, the drivers get paid less and it's all subsidized by investors to boot. Am I totally wrong on this?

It is exactly that! Food delivery is an excellent example of 'things just got worse'.

In 2019, 'delivery' was a specialty a restaurant would have to focus on to offer. Pizza places (Papa Johns, Pizza Hut, etc) and other specific delivery-focused restaurants (such as Panera Bread, Jimmy Johns, or your local Chinese restaurant) would have actual W2 employees who did delivery driving, as part of their job. The restaurant would want deliveries to go well (for both the customer, as well as the driver), so would make sure their own staff had reasonable access to food, some light training, and would ensure they could deliver it somewhat well. (They would reject orders too far away, they wouldn't serve food that wouldn't survive a delivery trip well, etc)

In post-COVID 2025, "every" restaurant offers delivery, but almost no restaurant still employs their own delivery drivers (locally, Jimmy Johns might be the only one left). Everyone else just outsourced to DoorDash. DoorDash drivers are employees who are 'legally-not-employees' (1099 employees), so they no longer have any direct access to the restaurants, and they can't train well for any specific service, because they might have to visit any-of-50 restaurants on any given day, all of which have entirely different procedures (even if they are the same brand or chain). Restaurants have zero incentive to ensure deliveries go well (the drivers aren't their employees, so they no longer care about turnover, and customers have to use DoorDash or Uber Eats or equivalent, because almost every restaurant uses it, so there's no downside to a DoorDash delivery going bad).

Prices to consumers are double what they were in 2019 or higher, depending on the restaurant. Wages are down; employment security is entirely eliminated. Quality and service have tanked.

Presumably, investors make slightly more money off of all of this?

gmac|11 months ago

> They include some common appliances that use water, due to water use regulations. No reason my dishwasher should take over 2 hours to run

I don't think this is a great example, because saving water (and thus the energy needed to heat the water) is both a social good and a private good.

Your new dishwasher program might take longer because, for example, (a) it's more efficient to soak residues than to keep blasting away at them, but it takes longer, and (b) if you alternate between shooting water at the top and bottom drawers (but not both at once) then you can get away with using half the water, in twice the time.

Most dishwashers have an 'express' programme that uses more water and energy to finish faster, so if that matters you can still have it. If it doesn't matter to you (e.g. because you're running the dishwasher overnight, or while you're at work), you and everyone else benefits from the greater efficiency.

So I think this is an unambiguous improvement. :)

The average quality of appliances is a separate question. Anecdotally, I finally had to replace a 22-year-old Neff dishwasher. I got a new Bosch one (same firm, different logo), and have been pleasantly surprised that the new model is still made in Germany, seems pretty solid, washes well, and is guaranteed for 5 years.

mym1990|11 months ago

Not sure about price comparisons, but what I can say is that many experiences feel worse: paying $50-60 for 3 tacos to be delivered, going out to basically any restaurant, pricing models on almost any subscription service (Adobe is a good example).

It's led me to learn to DIY as much as possible, making my own fun and experiences, so to speak.

marcosdumay|11 months ago

> Am I totally wrong on this?

You are probably getting more, and the difference goes entirely into rent, and then some.

Real estate is destroying the world's economy.

woah|11 months ago

The money is going to the driver's rent.

nradov|11 months ago

Donald Trump made appliance water use regulations an issue in the 2024 presidential campaign. Of course his opponents mocked him for it, and it probably was a little silly, but the messaging was effective in getting voters fired up about government overreach.

https://www.msn.com/en-us/news/politics/biden-allies-call-tr...

immibis|11 months ago

AFAIK the reason newer dishwashers (and "newer" here already means more than a decade old) take a 2-hour cycle is that it's a more efficient cycle in both energy and water, but not in time.

arkh|11 months ago

> Whenever people see old systems still in production (say things that are over 30 years old) the assumption is that management refused to fund the replacement.

The problem is not refusing to fund replacements. The problem is refusing to fund maintenance.

A lot of managers in old-school businesses were sold on IT as a tool. And tools? You buy them, use them, and replace them only when they break. Maintenance is minimal and you sure don't evolve them.

That's how you get couple-of-decades-old software chugging along, so key to operations that everything you want to add has to be aware of it and its warts, which will then infect whatever touches it. And replacement projects cannot work, because usually they mean changing how things are done.

But 20 years of rot are a symbiosis between users and tools:

- some tool does not allow a workflow, so users manage and find a workaround

- there is a workaround so next version of the software landscape cannot break it

- people want to do some new thing which is not in the software, changing it could break the previous workaround. So either people don't do the new thing or adapt and create other workarounds

Multiple rounds of this and you have a fossilized organization and IT where nothing can be easily changed. The business cannot adapt. The software cannot be modified to allow adaptation, because it could break the business. Now a new competitor emerges, the business starts losing, and that's when everyone starts blaming everyone else for the problems. But in reality? The cause is 20 years ago, when some management decided to treat IT as a cost center.

My solution to this problem? Create your own competitor and kill the old business.

ergonaught|11 months ago

I’m sure that’s a factor, but we probably also need to acknowledge that “younger people” (whether developers, managers, or whoever) lack exposure to things that were genuinely better previously (and where technology is concerned there are many examples), and thus have no mental model for them. They literally don’t know any better, and they’re operating within that framework.

Crude oversimplification: if all you’ve ever known are slow and bloated web app UIs on mobile phones, you’re simply not going to know how to make good design/development choices outside that environment.

andai|11 months ago

I don't think that's the reason.

If it were necessary to have seen something to know it's possible, nothing would ever improve, and nothing new would get made.

Cthulhu_|11 months ago

> Go watch CGI in a movie theatre and it's worse than 20 years ago

Objectively this isn't true as CGI technology has improved by leaps and bounds (think e.g. subsurface skin scattering in new vs old Gollum), however there's a lot of other factors at play; old CGI used film tricks to make it blend better, new CGI uses full CGI and digital whatsits and doesn't care anymore. It also depends on budget and what studio takes care of it. Good CGI is invisible, and there's a number of non-superhero films where the CGI just isn't visible / you're not even aware of it. Anyway, what 20 year old CGI are you thinking about, and what are you comparing it with? I'm thinking The Spirits Within (2001) or Beowulf (2007); the former did not age well, the latter was already panned as having poor CGI when it came out. Avatar (2009) pushed the frontier again I think.

> go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more.

This is a blinkered view of reality; there's thousands of game developers outside of this bubble, from single person developers making modern classics like Stardew Valley or even Minecraft when it first came out, to teams of developers that are bigger than those that made the games of 20 years ago.

Also, your opinion isn't fact; in the top 20 best selling games of 2024 [0] there is only one arguable remaster (GTA 5, which is on its 3rd remaster) and two complete remakes (FFVII Rebirth and CoD 3), with the former being a completely different game compared to the original. I share your cynicism about the "top of the line" video game market today, but you're not correct.

(meanwhile I'm playing a 2007 video game (Supreme Commander))

[0] https://www.gamespot.com/gallery/2024s-best-selling-games-in...

ViktorRay|11 months ago

Actually I think your examples show that it is you who may be incorrect.

Stardew Valley is 9 years old.

Minecraft is almost 16 years old. The current version of the game has not dramatically changed in terms of the experience of most players in over 10 years. (Hardcore players of any game will always make a big deal of any minor changes.)

I was born in the 1990’s. I was playing games regularly in the 2000’s and the 2010’s although I don’t play as much today.

Hardly anyone in 2005 was playing 1996 games or 1989 games regularly.

Even in 2015 not many were playing 2006 or 1999 games regularly. (I think World of Warcraft was the only very popular old game in 2015)

But now in 2025 you bring up a 2016 game and 2009 game to argue with that other guy?

Hell what happened to the major big budget games? I remember playing Witcher 3, Red Dead Redemption 2 and Cyberpunk 2077…but even those games are ancient now. Witcher 3 is 10 years old, RDR 2 is 7 years old, Cyberpunk is 5 years old…

In 2015 I was playing games more often but I was playing games that were more recently released…. Not really games from 2010, 2008 and 2005….

Hell, the most popular game for kids now is Fortnite, which is 8 years old and came out in 2017! I wasn’t playing Mass Effect (2007) much in 2015. The gap between Mass Effect 1 or Elder Scrolls: Oblivion and The Witcher 3 is the same as the gap between Fortnite’s release and 2025!

pimlottc|11 months ago

In many cases, quality is being driven down by automation that’s drastically cheaper and produces results that are deemed “good enough”.

Some of this is inevitable as new products and services move from being high-end to mass-market, and it's perhaps a bit chicken-and-egg to determine whether we accept this because most people never really cared about quality that much anyway, or because we just learn to accept what we're given.

But it feels like there could be a world where automation still reduces costs while still maintaining a high level of quality, even if it’s not quite as cheap as it is now.

baazaa|11 months ago

I once found some old price catalogues (early 20th century) for shoes etc. and estimated that the items there are barely any cheaper today in real terms. Now obviously that's partly because we have cheaper substitutes today, so we've lost economies of scale when building things the old-fashioned way and the modern equivalent has to be made bespoke... but it's still pretty alarming given we should be ~10x richer.

But consider an example which can't be blamed on that. My city (Melbourne) has a big century-old tram network. The network used to cover the city, now it covers only the inner city because it hasn't ever been expanded. We can't expand it because it's too expensive. Why could we afford to cover the whole city a century ago when we were 10x poorer? With increasing density it should be even more affordable to build mass-transit.

Obviously people blame the latter example on declining state capacity, but I'm not sure state capacity is doing any worse than Google capacity or General Electric capacity.

bluescrn|11 months ago

> Go watch CGI in a movie theatre and it's worse than 20 years ago, go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more.

Even more glaring is TV shows, where you now get an 8-episode 'season' every 2-3 years rather than the old days of 20+ episode seasons every year, often non-stop for 5 or more years.

It's not so much about capability/competence as pushing production values to unsustainable levels. You could get away with much less expensive VFX, sets, and costumes when filming in standard definition. Now every pixel is expected to look flawless at 4K.

Another more controversial factor is that everyone brings their politics/activism to work and injects them into everything that they do. Now everything has to be pushing for social change, nothing can just be entertainment for the sake of entertainment.

ngetchell|11 months ago

Is that a change? George Lucas certainly brought his politics around Vietnam to Star Wars. The 70s were a very radical and political time for movies.

BeFlatXIII|11 months ago

> Even more glaring is TV shows, where you now get an 8-episode 'season' every 2-3 years rather than the old days of 20+ episode seasons every year, often non-stop for 5 or more years.

That's often a good change. Less filler for the sake of having another full season.

plondon514|11 months ago

I'm in Japan right now and I see a ton of automation everywhere, self checkout at grocery stores and restaurants, but what you also see is a live human being assigned to the machines to help you if you have issues.

throwaway150|11 months ago

> I'm in Japan right now and I see a ton of automation everywhere, self checkout at grocery stores and restaurants, but what you also see is a live human being assigned to the machines to help you if you have issues.

Isn't that how self checkout happens in every part of the world that has self checkout? I'm failing to see what's special about self checkout in Japan.

nicbou|11 months ago

Japan has a LOT of make-work jobs for the elderly.

giantg2|11 months ago

To be fair, the replacement projects we've outsourced to multiple Indian companies have been utter failures too.

carlmr|11 months ago

>But if you look at replacement projects so many of them are such dismal failures

The problem with replacement projects is when and why they're started: usually only once there's a fixed deadline on some technology ceasing to exist, creating the appropriate urgency.

Usually the people who wrote the original software are long gone, the last few people who were able to maintain it are nearing retirement age or already gone as well, and the system uses ancient technologies for which it's hard to find documentation on the internet today.

Now you're tasked with writing a replacement, and everything that doesn't work on day 1 is deemed a failure. It might have worked if you had started earlier. Because if your original codebase is COBOL and assembly written for a mainframe, it's really hard to find anyone who can fully understand what it does and rewrite it cleanly now.

If you had updated from COBOL and mainframe assembly to C, and from C to 90s Java, and from 90s Java to modern Java/Go/Rust/Node, you'd have plenty of institutional knowledge available at each step, and you would have people who know both the old and the new world at each step. Jumping half a century in computing technology is harder than making small jumps every 10-15 years.

antifa|11 months ago

"Don't fix what isn't broken" is often becoming a survival tactic these days. You never know what new modern puchase you could make is spying on you, only works online (when no useful reason for that), made of cheaper materials, planning to remotely disable a feature and charge you for it, vendor locks resupply/maintenance, etc..

It increasingly applies to nearly all aspects of the economy. Everybody wants to lock you in and take a cut. Almost all new innovation these days is just rent-seeking gatekeeping. Even genuine innovators are unable to get their innovations out without recreating entire software stacks (or supply chains) that are under feudalistic/parasitic control, so they often remain niche and undermonetized. This will have an effect on the economy the way a yearly percentage reduction in atmospheric oxygen would destroy biodiversity.

chii|11 months ago

> Go watch CGI in a movie theatre and it's worse than 20 years ago, go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more.

None of the things you said is actually true, except superficially, because you've only seen the mass-market crap.

Good movies are still around, and you don't even notice the CGI, because it's cleverly done. For crap like the recently released Snow White, it's obvious that the CGI is badly done, but that doesn't make it an indictment of all movies released of late!

Same with games: just because there are lots of AAA studio flops that look terrible doesn't mean the medium is all terrible. There are so many good indie games that you can never truly play them all.

But if your exposure to these products is only the mass-market crap, then you might certainly feel that way.

snozolli|11 months ago

> Good movies are still around

Compare 1997 to today.

Major hit after major hit was being released that year, and they were overwhelmingly original and creative. There had been a boom in independent filmmaking and many of the big production houses had started up smaller studios to attract the talent. Unfortunately, Hollywood did what Hollywood does and killed everything that made them good.

Nowadays, we have endless releases of super hero sequels that are, fundamentally, the same movie over and over. We have endless remakes and reboots because nobody wants to take a chance.

Yes, you can find creativity if you look hard enough, but in 1997 it was everywhere, and in your face. You can't pretend that it doesn't matter or that it doesn't mark an enormous shift in culture (business and society).

corimaith|11 months ago

At least in terms of media, profits are higher (due to increasing mainstream acceptance rather than opinionated subcultures), and budgets are also higher (hence more risk), so why put in the effort? It's not just the West; the same thing is happening in Japan, and even in places like mass-produced literature and webtoons in China and Korea. Gacha/live-service games are unfortunately a lot more profitable for less effort than an ambitious single-player game like BG3. Then there's also the poignant quote from the Square Enix CEO that any investment into media needs to be compared with the opportunity cost of just investing in the S&P 500 instead. It's not enough to just be profitable; you need to make at least 2x/3x over that 5-year dev time to break even against that benchmark.

So a lot of this, unfortunately, is a choice that consumers have made. Even in terms of media again, a lot of modern viewers watch media more as self-insert fantasies, so quality of writing or novelty is often going to be worthless or even detrimental to them. I don't share that mindset, but having talked to many on /a/, /v/ or reddit, there are many who are just there to consume rather than out of actual interest.

johnnyanmac|11 months ago

It's hard comparing GaaS to single-player games. It's not less effort, it's different effort.

Your average gacha may look lower effort, but it has to sustain that effort longer instead of patching the game for a few months and moving on. It has to do a lot more marketing to get players in, because many are this pseudo-MMO experience, complete with PvP and guild content to manage.

At the highest end, Hoyoverse's operating costs would make even Activision blush. But those games make billions to compensate.

SkipperCat|11 months ago

I feel that the West is backsliding because for the past decade, we've addicted ourselves to social media dopamine hits and we stopped observing the outside world because we've glued our attention to our phones. Seems like this has hit the under 30 group the hardest.

I remember being bored and having to create my own fun. I remember being aware of my surroundings and being curious about them because I didn't have my favorite entertainment media attached to my palm. I remember learning about things such as what was in my Cheerios because the box was the only thing in front of me when I ate my breakfast.

It would be a joke to say that AI exists to fill the void from what I mentioned above, but it does kinda sorta feel correct in a weird sci-fi conspiracy way.

gtirloni|11 months ago

Good thing there aren't phones and social media in the East (or whatever you want to call non-West these days).

meander_water|11 months ago

This is largely the conclusion drawn in Stolen Focus by Johann Hari (well worth the read). Although he argues not just in the West, but across the world.

fijiaarone|11 months ago

People used to get dopamine hits from writing code that works, fixing cars, climbing mountains, playing music, and asking other people out on dates.

Dopamine addiction isn’t the problem.

jose-incandenza|11 months ago

I believe it's not related to capability, but to how investment works. Aggregate demand is composed of consumption and investment (I'm referring to the global economy, combining both public and private expenditure). Investment is the money extracted by capitalist actors in the system that is reinvested to generate profits. These capitalist actors need an incentive (the promise of generating more money) to invest the money they have extracted, and this incentive is usually the latest hot technology.

For example, when the internet emerged, everyone wanted to be online; when smartphones appeared, everyone wanted to have an app; and when VR emerged, Facebook changed its name and lost half of its value in the stock market. Now, it's AI. Capitalists do not focus on the details of the technologies; to them, every new technology looks the same. They see new tech as a growing opportunity and old tech as a saturated market. Obviously, this perspective is flawed, but it doesn’t matter.

In my opinion, AI is not going to create more value. The only real impact it will have is reducing the amount of workforce needed to generate that value, which will ultimately push the economy into a recession. As consumption declines, I don’t see what new technology could come after AI to offset this effect through further investment.

bloqs|11 months ago

What's happening with games and movies is that investors have figured out through the decades that taking big risks often does not pay off, and that 25 to 45 year olds are the biggest consumer group.

With broad changes in the distribution of wealth, and government spending on education sharply declining, levels of critical thinking and open-mindedness have declined.

So now, if something can be made that's part of an existing franchise or a consumer-favoured product, that's lower risk, and it attracts more capital. Full-on remakes again and again, with idiots generally accepting bad games on nostalgia value, mean that sales even of a bad game remain palatable.

I don't think the West is going backwards in capability, but people seem incapable of pinpointing what has changed.

anonzzzies|11 months ago

I think in the West all C-level execs/managers (it happens on all levels, including workers, they just tend not to get very far) just want to get rich (and/or powerful, same thing) and have zero pride or vision: they just pick the path that makes them the most; if that's good for the company/country/etc., that's a nice coincidence, but if not, fine. There are enough examples around, and they are not even ashamed that it costs lives; as long as they get the money it is fine. There is no vision, no real plan beyond however long they think they will stay on.

bee_rider|11 months ago

I don’t think it is a good standard for judging a civilization, really. But CGI, 20 years ago? A lot of it was really quite bad. CGI has always had bad and good instances because of the interplay between increasing technical skill and totally random director-determined skill at selecting shots.

I mean, like, Disney has been getting worse at CGI, but only because the whole company has given up. This is just normal companies shifting around, though.

tbyehl|11 months ago

Anything from 20+ years ago that someone still thinks holds up as great CGI probably wasn't CGI in the first place.

The problem with CGI today is that it's over-used and mis-applied in areas that still have Uncanny Valley type issues (fight scenes, car chases/crashes, etc).

johnnyanmac|11 months ago

>I mean, like, Disney has been getting worse at CGI, but only because then whole company has given up.

I think that's the main point, yes. There's a sense that companies before were trying to push the envelope. These days it's just a shrug and cynical min-maxing of funds to the shareholders. CGI 20 years ago was objectively worse, but you can tell they had ways to hide the flaws or redirect the eye away from them. Now... Ehh, who cares? Just get the first pass through.

If you want a relevant example: some people say the Lilo & Stitch live action has a weird-looking Stitch model. Part of that is because back in 2002, the original Stitch was simply never meant to be looked at in profile for an extended time. Art directors made sure to avoid that angle in every frame they drew. Two decades later... meh. Ship it. Never mind the outsourced CGI trying to model something better, the cinematography being careful about angles, or any reaction from "nitpickers". We got the IP, it'll make money.

It's not a franchise killer, but it's just one example of the many broken windows.

0x1ceb00da|11 months ago

CGI today is so good that you don't even notice it most of the time.

jfengel|11 months ago

The Pixar division, at least, is extraordinary and continues to push for more.

hackernoops|11 months ago

Decline was a set of choices, it's just that the average person didn't make those choices and we're not allowed to discuss them or the people that made them.

galaxyLogic|11 months ago

Someone has to shout out: the Emperor has no clothes!

In recent years lying has been normalized. Black is white, etc. 1984 is here.

mjevans|11 months ago

I suspect a key to success is:

Don't replace an existing solution with exactly the same thing on a different platform.

Think larger. Solve today's and (near) tomorrow's problems BETTER. That's probably going to require changes to process too: a full evaluation of the most effective way forward, given the capabilities and needs that exist now.

Then bring up interfaces that provide what the old system did, verify the data round trips, and when it's approved cut over.
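The "verify the data round trips" step can be sketched as a shadow comparison: keep serving answers from the old system while also running the new one on the same requests and diffing the results before cutting over. A minimal sketch in Python, where `legacy_lookup` and `replacement_lookup` are hypothetical stand-ins for calls into the two systems, not any real API:

```python
def legacy_lookup(account_id: int) -> dict:
    # Placeholder for a call into the old system.
    return {"id": account_id, "balance": account_id * 100}

def replacement_lookup(account_id: int) -> dict:
    # Placeholder for a call into the new system.
    return {"id": account_id, "balance": account_id * 100}

def shadow_compare(ids):
    """Serve every request from the legacy path, but also run the
    replacement in parallel and record every mismatch for review."""
    mismatches = []
    for account_id in ids:
        old = legacy_lookup(account_id)
        new = replacement_lookup(account_id)
        if old != new:
            mismatches.append((account_id, old, new))
    return mismatches

if __name__ == "__main__":
    diffs = shadow_compare(range(1, 1000))
    print(f"{len(diffs)} mismatches out of 999 requests")
```

The cut-over then only happens once the mismatch list stays empty over a representative window of real traffic.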

KaiserPro|11 months ago

> Go watch CGI in a movie theatre and it's worse than 20 years

Sorry, but that's just not true. Sure, there are shit VFX films, but I guarantee that the "serious" movies people hold up as "all in-camera effects" have hundreds of shots with digital set extensions and all sorts of VFX magic.

If you look at TV, where there has been huge competition, the use of VFX has exploded, mainly as a cost saving but also as a story enhancer. Stuff that would have cost £20m ten years ago is being done for £500k. That's huge innovation.

> remasters of 20 year old games because no-one knows how to do anything any more

They are remasters because the people putting the money up are conservative.

Innovation is happening, just not where you expect. Look at the indie games market.

Much as I don't like it, a huge amount of innovation is happening in the world of YouTube and TikTok. New editing styles, almost a completely new genre of moving picture, have emerged.

Where there is competition, there is innovation.

exe34|11 months ago

> if you look at replacement projects so many of them are such dismal failures that's management's reluctance to engage in fixing stuff is understandable

This makes it sound as if management only decide whether or not to engage in modernizing. I think it's only fair to also give them full credit for the failures: profit over people, dogma over pragmatism, etc. is their fault.

robertlagrant|11 months ago

> go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more

People do, but they aren't in AAA studios. They're doing indie games, because their large corporations were captured by the professional glom-onto-success management class.

dgfitz|11 months ago

> releases are all remasters of 20 year old games

This just isn’t true. It’s not nice to make things up.

bigstrat2003|11 months ago

It's far less nice to accuse people of lying without any evidence. If someone is wrong, say so. But there's no reason to call people liars.

theshackleford|11 months ago

Yup. It’s just nostalgia huffing and increasing age.

whoknowsidont|11 months ago

>But if you look at replacement projects so many of them are such dismal failures that's management's reluctance to engage in fixing stuff is understandable.

This is most definitely still the fault of management.

nindalf|11 months ago

> go home to play video games and the new releases are all remasters of 20 year old games

This annoyed me, because it's so manifestly untrue. Consider the games of the year of the last few years (https://en.wikipedia.org/wiki/List_of_Game_of_the_Year_award...):

- 2024: Astro Bot

- 2023: Baldur's Gate 3

- 2022: Elden Ring

- 2021: No consensus pick, but It Takes Two stands out to me

- 2020: Hades

All of these, with the exception of BG3, are original IP. A lot of them have really unique game mechanics that I haven't seen before. Hades has some of the tightest combat that never gets old even after hundreds of runs. It also has extraordinary music and voice acting. Truly a labour of love.

It Takes Two is a co-op story adventure. Every single level has a new fun mechanic. In one of them you literally control time. Please, do tell me which game from 20 years ago was a co-op adventure where every level was unique? The best co-op was probably Halo 2 (2004), but that's just shooting from beginning to end.

You're thinking "well, OK, there's one sequel in there. That's proof that video game companies want to play it safe." But you'd still be incorrect. BG3 is inspired by its prequels BG1 and 2, but those were released over 20 years ago. Open YouTube and check out how different they are in every single way. I'll bet there isn't even a single line of code common between BG3 and the originals. BG3 exists because the developers grew up playing BG1 and 2 and wanted to make a homage to the games that shaped them. And they succeeded, good for them.

I will admit that I didn't play Elden Ring. I didn't even attempt to, because I already have a full time job. But that's great too, because it shows that there are games being made for people who love a punishingly difficult challenge. That's not me, but you can find that now if you want.

Your comment is just rose-tinted whingeing. It's so easy to write a comment like "man, the good old days were really good weren't they". But ... no. I can play all of the games from the good old days and I can also play Hades, It Takes Two and BG3. And that's just the surface! There are so many incredible games being made and released. Factorio is great in many ways, but the most remarkable part is how they've optimised their game to a mind-boggling extent.

No one knows how to do anything anymore? Then how did these incredibly innovative, flawlessly executed games get made?

ViktorRay|11 months ago

I posted a reply to another user here.

https://news.ycombinator.com/item?id=43493740

I don’t want to retype everything I posted in that reply but it kind of applies to your comment as well.

In 2015, if we were having this discussion, I could easily pull out dozens of groundbreaking, innovative games from 2010 to 2015.

In 2005, if we were having this discussion, I could have easily pulled out dozens of groundbreaking, innovative games from 2000 to 2005.

But we are having this discussion in 2025, and I know both you and I would struggle to pull out a dozen high-quality, innovative new games that have come out in the past 5 years.

Clearly things have gotten worse.

nottorp|11 months ago

> All of these, with the exception of BG3 are original IP.

Elden Ring is just Demon's Souls 4 from 2009. It's good to the point that I'll still preorder its successor, but nothing is original there any more.

Edit: not 4, more like 7?

Edit 2: Hades seemed more difficult to me than Elden Ring. Maybe you shouldn't trust the marketing and check for yourself.

Telemakhos|11 months ago

> Go watch CGI in a movie theatre and it's worse than 20 years ago, go home to play video games and the new releases are all remasters of 20 year old games because no-one knows how to do anything any more. And these are industries

Maybe the arts shouldn't have been industries. Look at sculpture or painting from the Renaissance and then at postmodern sculpture and painting and you'll see a similar decline, despite the improvement of tools. We still have those techniques, and occasionally someone will produce a beautiful work as satire. We could be CNC-milling stone buildings more beautiful and detailed than any palace or cathedral, buildings that would last for generations, but brutalism killed the desire to do so, despite the technology and skill being available. There's something about industrialized/democratized art being sold to the masses that leads to a decline in quality, and it's not "because no-one knows how to do anything any more." It's because no one cares about or wants to pay for anything beautiful when there are cheaper yet sufficient alternatives.

WillieCubed|11 months ago

I don't think the issue is that the West is going backwards in capability; rather, it's that although it has the capability to produce great products (software, media, etc.), it deliberately chooses not to because it's not as cost effective, because the people with expertise are overworked and understaffed, or because management had other priorities (see AAA game development).

In other words, the capitalists won.

baazaa|11 months ago

AAA games are eye-wateringly expensive though; management aren't imagining it. My point is that things becoming more expensive is a symptom of decline. I'm sure the late Romans consoled themselves that they could build another Pantheon, they just cared more about efficiency now.

Where I work in government we've stopped paying for important data from vendors (think sensors around traffic etc.) because the quotes are eye-wateringly expensive. But I've worked in data long enough to know the quotes probably reflect genuine costs, because data engineers are so incompetent (and if it's a form of price gouging, it's not working, because the government isn't paying up). So it looks like we're choosing to be in the dark about important data, but it's not entirely a choice.

Saying we can do stuff but it's unaffordable is imo just another way of saying we can't do stuff.

QuadmasterXLII|11 months ago

That was the case even six or seven years ago, but the warning signs are there that we've chosen not to for so long that the abilities have rotted.

fijiaarone|11 months ago

Who did the capitalists beat? America wasn't a socialist utopia 20 years ago, or even 50 years ago.

throwaway6734|11 months ago

The result of boomer cultural and capital domination. Millennials need to grow up and yank power from them, starting with the massive government handouts given to the old.

nradov|11 months ago

Should I take it that you support privatizing Social Security and eliminating Medicare? That probably won't be a winning political platform with any generation.

hackernoops|11 months ago

Even worse is the entitlement spending imbalance along racial lines.

kaycey2022|11 months ago

The West isn't capitalist any more. Sure, the number goes up, but at what cost?

sloowm|11 months ago

This is the absolute opposite take of what is actually happening. We are in the most capitalist time ever. There are a few people who own all the wealth, 2 of them are also in control of the most powerful government. This is capitalism baby, enjoy the full wrath.

trashtester|11 months ago

With all due respect, this attitude typically comes with age. I see it in myself, too (I'm over 50).

You're right that it's hard to replace those 30+ year old systems, and part of the reason is that current devs are not necessarily at the same level as those who built the originals. But at least in part, this is due to survivorship bias.

Plenty of the systems that were built 30-50 years ago HAVE been shut down, and those that were not tend to be the most useful ones.

A more important tell, though, is that you see traditional IT systems as the measuring stick for progress. If you do a review of history, you'll see that what is seen as the measuring stick changes over time.

For instance, in the 50s and 60s, the speed of cars and airplanes was a key measuring stick. Today, we don't even HAVE planes in operation that match the SR-71 or Concorde, and car improvements are more incremental and practical than spectacular.

In the 70s and into the 80s, space exploration and flying cars had the role. We still don't have flying cars, and very little happened in space from 1985 until Elon (who grew up in that era) resumed it, based on his dream of going to Mars.

In the 90s, as Gen-X'ers (who had been growing up with C64/Amiga's) grew up, computers (PC) were the rage. But over the last 20 years little has happened with the hardware (and traditional software) except that the number of cores/socket has been going up.

In the 2000s, mobile phones were the New Thing, alongside apps like social media, uber, etc. Since 2015, that has been pretty slow, too, though.

Every generation tends to devalue the breakthroughs that came after they turned 30.

Boomers were not impressed by computers. Many loved their cars, but remained nostalgic about the old ones.

X-ers would often stay with PCs as the millennials switched to phones only. Some X-ers may still be a bit disappointed that there are no flying cars, no Moon base and no Mars colony yet (though Elon, an X-er, is working on those).

And now, some Millennials do not seem to realize that we're in the middle of the greatest revolution in human history (or pre-history, for that matter).

And developers (both X'ers and millennials) in particular seem to resist it more than most. They want to keep their dependable von Neumann architecture computing paradigm. The skills they have been building up over their career. The source of their pride and their dignity.

They don't WANT AI to be the next paradigm. Instead, they want THEIR paradigm to improve even further. They hold on to it as long as they can get away with it. They downplay how revolutionary it is.

The fact, though, is that every kid today walks around with R2-D2 and C-3PO in their pockets. And production of physical robots has gone exponential, too. A few more years at this rate, and it will be everywhere.

Walking around today, 2025 isn't all that different from 2015. But 2035 may well be as different from 2025 as 2025 is to 1925.

And you say the West is declining?

Well, for Europe (including Russia), this is true. Apart from DeepMind (London), very little happens in Europe now.

Also, China is a competitor now. But so was the USSR a couple of generations ago, especially with Sputnik.

The US is still in the leadership position, though, if only barely. China is catching up, but they're still behind in many areas.

Just like with Sputnik, the US may need to pull itself together to maintain the lead.

But if you think all development has ended, you're like a boomer in 2010, using planes and cars as the measuring stick that thinks that nothing significant happened since 1985.

anovikov|11 months ago

Look at space and it was obviously broken 20 years ago in the West: the Shuttle finally proved itself, after 20 years of trying to conceal it, to be a generational, unfixable mistake, yet there was stubborn insistence on pushing through with a replacement based on the same technology and the same people, burning billions while staying stuck.

Now it is in the best shape ever, and progress seems unstoppable. The West thoroughly dominates it in every dimension, and that dominance seems to only be accelerating.

Boeing just failed at what was an inherently unfair game: it tried to compete with a state-funded Airbus that could burn unlimited cash without worrying about real profitability. Boeing tried doing it by cutting costs, and failed.