lbrandy | 5 months ago
While I see what you are getting at, and I think it's super important we come up with philosophical frameworks to push back on the central idea in question (i.e., the moral hazard of "it's gonna happen anyway, so why not pour a little more into the river")... I think your writing/responses miss the central point.
As I see it, the fundamental issue with this essay, and your responses, is that you keep conflating "impossible" with "probability zero." People are saying "this is inevitable" to mean it has probability 1 of occurring, with basic game-theoretic reasoning (it's a giant iterated prisoner's dilemma), and your response is "but it's possible." Yes, with measure zero.
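The game-theory claim above can be made concrete. A minimal one-shot payoff sketch (the payoff numbers here are illustrative assumptions, not from the thread) shows why "race" dominates "restrain" for each actor considered in isolation, which is the intuition behind the inevitability argument:

```python
# One-shot prisoner's-dilemma payoffs for two AGI developers.
# Tuples are (row player, column player); numbers are illustrative only.
PAYOFFS = {
    ("restrain", "restrain"): (3, 3),  # mutual restraint
    ("restrain", "race"):     (0, 5),  # you hold back, rival takes the lead
    ("race",     "restrain"): (5, 0),
    ("race",     "race"):     (1, 1),  # mutual race: worse for both than mutual restraint
}

def best_response(opponent_move):
    """Return the row player's payoff-maximizing reply to a fixed opponent move."""
    return max(["restrain", "race"],
               key=lambda mine: PAYOFFS[(mine, opponent_move)][0])

# "Race" is a dominant strategy: it is the best reply whatever the rival does.
assert best_response("restrain") == "race"
assert best_response("race") == "race"
```

In the iterated version of the game, cooperation can in principle be sustained by conditional strategies, which is exactly where the two sides of this thread disagree.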
Telling us that such a path surely exists isn't useful. If you want to push back on "inevitability" you need to find a credible path with probability > 0 (which is not the same thing as merely "not impossible").
top256 | 5 months ago
We actually agree: even if the probability of successful coordination is only 10%, accepting inevitability makes it 0%. That difference matters enormously given the stakes. My argument isn't "coordination is definitely possible" but rather "believing it's impossible guarantees failure." When tech leaders say "AGI is inevitable," they're not describing reality; they're shaping it by discouraging attempts to coordinate. Human cloning hasn't happened because we maintain active resistance despite technical feasibility.
You're asking for credible paths with P > 0. I'm saying: knowing P with certainty is impossible, so accepting P = 1 narratives makes alternative paths invisible. The path emerges through trial and error, not before it.
naasking | 5 months ago
No, they're describing reality. As I posted in another comment, progress in technology drops capital requirements for innovation. Even if there's global coordination to stop AGI development right now, progress in tech means that in 30 years someone in their basement could do what OpenAI is doing right now but with commodity hardware. Preventing this would require an oppressive regime controlling basic information technology and knowledge to an extent that isn't palatable to anyone.