I am directing the Department of War to designate Anthropic a supply-chain risk
1344 points| jacobedawson | 1 day ago |twitter.com
https://www.cnbc.com/2026/02/27/trump-anthropic-ai-pentagon....
_fat_santa|1 day ago
I would assume the original terms the DoW is now railing against were in the original contracts they signed. In that case it looks like the DoW is acting in bad faith here: they signed the original contract and agreed to those terms, then went back and said no, you need to remove those safeguards, to which Anthropic is (rightly so) saying no.
Am I missing something here?
EDIT: Re-reading Dario's post[1] from this morning, I'm not missing anything. Those use cases were never part of the original contracts:
> Two such use cases have never been included in our contracts with the Department of War
So yeah, this seems pretty cut and dry. The DoW signed a contract with Anthropic and agreed to those terms. Then they decided to go back and renege on those original terms, to which Anthropic said no. Then they promptly threw a temper tantrum on social media and designated Anthropic a supply-chain risk as retaliation.
My final opinion on this is that Dario and Anthropic are in the right, and the DoW is acting in bad faith by trying to alter the terms of the original contracts. And this doesn't even take into consideration the moral and ethical implications.
[1]: https://www.anthropic.com/news/statement-department-of-war
pinkmuffinere|1 day ago
[1] https://www.anthropic.com/news/statement-department-of-war
labrador|1 day ago
In fact, as a patriotic American veteran, I'd be ok with Anthropic moving to Europe. It might be better for Claude and AGI, which are overriding issues for me.
Rutger Bregman @rcbregman
This is a huge opportunity for Europe. Welcome Anthropic with open arms. Roll out the red carpet. Visa for all employees.
Europe already controls the AI hardware bottleneck through ASML. Add the world's leading AI safety lab and you have the foundations of an AI superpower.
https://x.com/rcbregman/status/2027335479582925287
Someone1234|1 day ago
But how do you even begin to discuss that tweet or this topic without talking about ideology, and without contextualizing it against other seemingly unrelated things currently going on in the US?
I genuinely don't think I'm conversationally agile enough to both discuss this topic while still able to avoid the political/ideological rabbit-hole.
0xbadcafebee|1 day ago
This is the new McCarthyism. Do what the administration says, or you will be blacklisted, or worse.
nickysielicki|1 day ago
The designation says any contractor, supplier, or partner doing business with the US military can’t conduct any commercial activity with Anthropic. Well, AWS has JWCC. Microsoft has Azure Government. Google has DoD contracts. If that language is enforced broadly, then Claude gets kicked off Bedrock, Vertex, and potentially Azure… which is where all the enterprise revenue lives. Claude cannot survive on $200/mo individual powerusers. The math just doesn’t math.
rushcar|1 day ago
This is authoritarian behavior. You're having trouble negotiating a contract, so instead of just canceling it, you basically ban all of the Fortune 500 from doing business with that firm.
easton|1 day ago
I’m sure the lawyers just got paged, but does this mean the hyperscalers (AWS, GCP) can’t resell Claude anymore to US companies that aren’t doing business with the DoD? That’s rough.
general1465|1 day ago
When the first politician is blown to bits by an autonomous AI FPV drone, there will be sheer panic among every politician in the world to put the genie back in the bottle. It will be too late at that point.
Anthropic is correct with its no killbot rule.
linuxhansl|1 day ago
"Supply-chain risk" means "the potential for adversaries to sabotage, subvert, or disrupt the integrity and delivery of defense systems, including software, hardware, and services, to degrade national security".
So now Anthropic is an adversary, because it does not want "fully autonomous weapons" or automated mass surveillance? Sure thing, DoD. Go use Grok or whatever, I'm sure that will go great.
cube00|1 day ago
So OpenAI will also be marked as a supply chain risk, right?[1]
[1]: https://www.axios.com/2026/02/27/altman-openai-anthropic-pen...
txrx0000|1 day ago
Open-source everything. Papers, code, weights, financial records. Do all of your research in the open. Run a 100% transparent organization so that there's nothing to take from you. Level the playing field for good and bad actors alike, otherwise the bad actors will get their hands on it while everyone else is left behind.
Stop comparing AI capabilities to nuclear weapons. A nuke cannot protect against or reverse the damage of another nuke. AI capabilities are not like nukes. Diffuse it as much as possible. Give it to everyone and the good will prevail.
Build a world where millions of AGIs run on millions of gaming PCs, aligned with millions of different individuals. It is a necessary condition for humanity's survival.
dang|1 day ago
Statement from Dario Amodei on our discussions with the Department of War - https://news.ycombinator.com/item?id=47173121 - Feb 2026 (1508 comments)
phs318u|1 day ago
Dario is Lando, complaining “We had a deal!” Only to be told, “I’m altering the deal. Pray I don’t alter it any further.”
cannabis_sam|1 day ago
I wish I thought enough Americans had the spine required to stand up to this, and I know for a fact that a lot do... the solution is literally written into your constitution.
kylecazar|1 day ago
This administration consistently exploits what were designed to be emergency powers because no such requirement exists. Leave no room for interpretation.