dwallin|1 day ago
When did Ben Thompson go so far down the path to autocratic sympathizer? This is such an anti-democratic, anti-free-market, anti-free-speech view of this whole situation.
First, everything revolves around a core conceit that "might makes right". The idea that entities might push back with the tools at their disposal is treated as a fool's errand; you should just acquiesce.
The role of the legislative branch in deciding what private entities are allowed or not allowed to do is treated as a side note. He treats the dictates of the executive branch as if they were the will of the United States itself, above even the Constitution.
It's dismissive of the right of private companies and individuals to make decisions for themselves about the actions they take, and whether or how they choose to transact, within the law, with parts of the executive branch.
He acts as if it's a foregone conclusion that every AI company should be considered an arm of the executive branch of the US government. The analogy to nuclear weapons is deeply flawed; there are multiple laws on the books (written into law by Congress) specifically regulating nuclear research and development.
And most astonishingly, he ends it by dropping an implied threat of violence toward Anthropic (and presumably anyone else who doesn't agree with his point of view):
> I don’t want that, and, more pertinently, the ones with guns aren’t going to tolerate it.
Wow.
silverlake|12 hours ago
Yep, once you accept “might makes right”, the laws in a democracy become polite suggestions. Oh, your town is in the way of hydropower? Too bad, the gov’t has more guns than you. That’s how you get the Three Gorges Dam in China. Meanwhile, the Trump Mafia is demonstrating how paper-thin democracy and the rule of law really are in the US.
harry19023|1 day ago
Ben says over and over again that he's not making the argument about the importance of democratic oversight, but he clearly is. He doesn't like Amodei and is happy to see him fail, and it comes through loud and clear in the piece.
Anthropic, like any other US company, should be free to not sell to the government if they don't want to. These other arguments about oversight are nonsense.
tguedes|1 day ago
I don't think he ever said in the article that he is not making an argument about the importance of democratic oversight. If anything, in the conclusion, he says:
"The way to address this new reality, however, is with new laws and through strengthening accountable oversight; cheering or even demanding that an unelected executive decide how and where such powerful capabilities can be used is the road to an even more despotic future."
But I think you drastically misunderstood the point of this article. Ben is pointing out the implications of Amodei's analogy of advanced AI being like nuclear weapons. The government has a monopoly on nuclear weapons and has extreme regulations and oversight on the companies that help build nuclear weapons for it. And those companies do not tell the government how or when they can use the nukes.
So if advanced AI is like nuclear weapons, why can an unelected executive tell a democratically elected government how to use it?
grvdrm|1 day ago
I thought this was a flimsy piece. Agree with your conclusion.
Also - I'm surprised the government didn't say "ok" and then use Claude (as they wished) anyway. I don't know the details and am oversimplifying, but it seems like a plausible path without much recourse/oversight.
joshstrange|7 hours ago
> At the same time, what is the standard by which it should be decided what is allowed and not allowed if not laws, which are passed by an elected Congress?
And that might be a compelling argument if the rule of law hadn’t been thrown out the window. If secret courts didn’t exist. If “laws” couldn’t be twisted into a pretzel or just outright ignored. If things, oh you know, like the name of the department he invokes throughout this article, could only be changed by the elected Congress instead of unilaterally changed by a mad king. Yeah, maybe then your article would hold water.
What an incredibly naive take on the current state of affairs.
unknown|1 day ago
[deleted]