top | item 42473864


CliveBloomers | 1 year ago

Another meaningless benchmark, another month—it’s like clockwork at this point. No one’s going to remember this in a month; it’s just noise. The real test? It’s not in these flashy metrics or minor improvements. The only thing that actually matters is how fast it can wipe out the layers of middle management and all those pointless, bureaucratic jobs that add zero value.

That’s the true litmus test. Everything else? It’s just fine-tuning weights, playing around the edges. Until it starts cutting through the fat and reshaping how organizations really operate, all of this is just more of the same.

oytis | 1 year ago

So far the AI market seems focused on replacing meaningful jobs, while meaningless ones look safe (which kind of makes sense if you think about it).

handfuloflight | 1 year ago

Agreed, but isn't it management who decides whether this gets implemented? Are they going to propagate their own removal?

zamadatix | 1 year ago

Middle-manager types are probably interested in their salary more than anything. "Real" management (those whose wealth comes more from their ownership of the company than from a salary) will override them if it's truly the best-performing operating model for the company.

akra | 1 year ago

It's a common view among the "doers" (the people who created most of the value in the past, the hard workers, etc.) that this will make management redundant. Sadly, with a basic understanding of economics you can see this is probably wrong. The "doers" have handed more power to the management class at their own expense with this solution: if I can get the AI to "do," all I need are the people who "decide what to do." Market power belongs with scarcity, and all else being equal, AI lowers the barrier to development, meaning less scarcity on the "doing" side. In general, technological developments have increased inequality, especially since the '90s onwards.

Generally, I think the top of society stands to gain a lot more from AI than the middle or bottom, for a whole host of reasons. If you conclude anything different, the framework you used to reach that conclusion is probably wrong, at least IMO.

I don't like saying this, but there is a reason why the "AI bros," VCs, big-tech CEOs, etc. are all very, very excited about this, while many employees (some commenting here) are filled with dread and fear. The salespeople, the managers, the MBAs, etc. stand to gain a lot from this. Fear also serves as the best marketing tool; it gets people talking and spreads OpenAI's news more than anything else. That's a reason why targeting coding jobs, or any jobs, is so effective. I want to be wrong, of course.