aosaigh | 3 months ago
What if AI isn’t like the other changes you’ve experienced? What if a superintelligence is developed (or emerges)? What are the implications of that? Can it be controlled?
The people running the biggest AI companies in the world are themselves worried about these questions, so we should be too.
HN is exactly the type of place that should be interested in discussing issues like this and it’s unfair to just dismiss the concerns.
bruce511 | 3 months ago
So sure, if you want to fret about the future, if you want to be anxious about hypothetical things like singularities or super-AIs or whatever, then go for it. Whatever makes you happy.
All I'm suggesting is that this path is not particularly unique. I've lived through the cold war. Through the moral panic of the 80s and 90s with the emergence of personal computers. Through cell phones. Through Y2K.
Yes, this is different. But it's also exactly the same. Forgive me for being sanguine. The future, whatever it is, is coming. Will worrying about it add a single day to your life?
If you want to change the trajectory of the future, then I recommend running for public office. Perhaps others will follow your lead.
The genie though is out of the bottle. It cannot be put back.
(For what it's worth, I don't think a big statistics machine will end civilization.)