In this scenario the person who wants to be paid owns the output of the agent. So it’s closer to a contractor and subcontractor arrangement than employment.
1. They built the agent and it's somehow competitive. If so, they shouldn't just replace their own job with it; they should replace many more jobs and get a lot richer than one salary allows.
2. They rent the agent. If so, why wouldn't the rental company rent directly to their boss, maybe even at a business premium?
I see no scenario where there's an "agent to do my work while I keep getting a paycheck."
An open question is which side agents will achieve human-level skill at first. It wouldn’t surprise me if doing the work itself end-to-end (to a market-ready standard) remains in the uncanny valley for quite some time, while “fuzzier” roles like management can be more readily replaced.
It’s like how we all once thought blue collar work would be first, but it turned out that knowledge work is much easier. Right now everyone imagines managers replacing their employees with AI, but we might have the order reversed.
> This begs the question of which side agents will achieve human-level skill at first.
I don't agree; it's perfectly possible, given chasing0entropy's... let's say 'feature request', that either side might gain that skill level first.
> It wouldn’t surprise me if doing the work itself end-to-end (to a market-ready standard) remains in the uncanny valley for quite some time, while “fuzzier” roles like management can be more readily replaced.
Agreed - and for many of us, that's exactly what seems to be happening. My agent is vaguely closer to the role that a good manager has played for me in the past than it is to the role I myself have played - it keeps better TODO lists than I can, that's for sure. :-)
> It’s like how we all once thought blue collar work would be first, but it turned out that knowledge work is much easier. Right now everyone imagines managers replacing their employees with AI, but we might have the order reversed.
Perfectly stated IMO.
Some humans will be rich, and they'll buy things: for example, those who own AI or fabs. And the humans who serve them (assuming there remain services not replaced by AI, for example prostitution) will also buy things.
If 99.99% of the other humans become poor and eventually die, that will certainly change the economy a lot.
> How are businesses going to get money if there are no humans that are able to pay for goods?
By transacting with other businesses. In theory, comparative advantage will always ensure that some degree of trade takes place between completely automated enterprises and comparatively inefficient human labor. In practice, the transactions might not be worth it for either party: for the AI because the utility it derives is so minimal, and for the humans because the transactions cannot sustain their needs. This gets even more fraught if we assume an AGI takes control before cheap spaceflight is available, because at a certain point keeping insufficiently productive humans on any area of sea or land becomes less efficient than replacing them with automatons (particularly when you account for the risk of their behaving in unexpected ways).
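The comparative-advantage claim above can be made concrete with a toy Ricardian arithmetic sketch (all productivities and the trade below are made-up illustrative numbers, not data from the thread): even when an automated firm out-produces a human at everything in absolute terms, differing opportunity costs leave room for trade that benefits both sides.

```python
# Toy Ricardian model (all numbers hypothetical): an automated firm is
# absolutely more productive at both goods, yet both parties still gain
# from specialization and trade, because their opportunity costs differ.

HOURS = 10  # hours of effort available to each party

# Output per hour (hypothetical productivities).
ai_rate = {"chips": 60, "food": 30}    # better at everything
human_rate = {"chips": 10, "food": 20}

# Opportunity cost of one unit of food, in chips forgone.
assert ai_rate["chips"] / ai_rate["food"] == 2.0       # AI: 2 chips per food
assert human_rate["chips"] / human_rate["food"] == 0.5  # human: 0.5 chips per food
# The human has the lower opportunity cost for food, hence the
# comparative advantage in food, despite being slower at both goods.

# Autarky (no trade): each party splits its hours evenly between the goods.
autarky_ai = {g: (HOURS / 2) * r for g, r in ai_rate.items()}
autarky_human = {g: (HOURS / 2) * r for g, r in human_rate.items()}

# Trade: the human fully specializes in food; the AI mostly in chips.
# They exchange 100 food for 100 chips (a price of 1 chip per food,
# which lies between the two opportunity costs).
ai_output = {"chips": 8 * ai_rate["chips"], "food": 2 * ai_rate["food"]}
human_output = {"chips": 0, "food": HOURS * human_rate["food"]}
traded = 100

ai_bundle = {"chips": ai_output["chips"] - traded,
             "food": ai_output["food"] + traded}
human_bundle = {"chips": traded,
                "food": human_output["food"] - traded}

# Both parties end up with at least as much of everything as under autarky.
for good in ("chips", "food"):
    assert ai_bundle[good] >= autarky_ai[good]
    assert human_bundle[good] >= autarky_human[good]
```

Note the flip side visible in the numbers: the AI's gain over autarky (80 chips and 10 food) is tiny relative to its total output, which is exactly why the comment doubts such trades would be worth the AI's while in practice.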
There is a set of people who own, well, in the past we would have said "means of production", but let's not. So, they own the physical capital and the AI worker-robots, and this combination produces various goods for human use. These owners then trade those goods among each other, since nobody owns the full range of production chains.
The people who used to be hired workers? Eh, they still own their ability to work (which is now completely useless in the market economy) and not much else, so... well, they can go sleep under a bridge, or go extinct, or do whatever else peacefully, as long as they don't trespass on private property, the sanctity and inviolability of which is obviously crucial for societal harmony.
So yeah, the global population would probably shrink to something in the hundreds of millions in the end, and ironically, the economy may very well end up self-sustaining and environmentally green and all that nice stuff, since it won't have to support the living standards of ~10 billion people, although the process of getting there could be quite tumultuous.
The AI agents don’t appear to know how & where to be economically productive. That still appears to be a uniquely human domain of expertise.
So the human is there to decide which job is economically productive to take on. The AI is there to execute the day-to-day tasks involved in the job.
It’s symbiotic. The human doesn’t labour unnecessarily, and the AI provides productive output and a revenue-generating opportunity for OpenAI/Anthropic/whoever.
You are welcome to try to cut them out and start your own business. But I suspect you might find it a bit harder than your employer signing up for a SaaS AI agent. Actually wait, isn't that what this website is? Does it work?
This is backwards. Those people got into the positions they have by having money to spend, not because someone wanted to pay them to do something. (Or they had a way to control how someone else's money is spent.)
They are a bridge between those with money and those with skill. Plus they can aggregate information and act as a repository of knowledge and decision maker for their teams.
These are valuable skills, though perhaps nowhere near as valuable as they end up being in a free market.
chasing0entropy|3 months ago
I believe that once AI scales, my theory will be proven universal.
My wife believes there will eventually also be a third job created to do the job.
zwnow|3 months ago
Lots of us are not cut out for blue collar work.