arnioxux | 7 years ago

Tangentially related, this reminds me of the spy recordings[1] between incarcerated German nuclear physicists and their reactions to the atomic bomb dropping.

At first they were incredulous and thought it must be a bluff. The bomb should have been "impossible", given that Germany had failed at its own nuclear project a few years earlier. Then they slowly worked out how it could have been done, and realized that the Americans must have had hundreds of thousands of people working on it. "Which is a hundred times more than we had." This was followed by a lot of regret over what they could have done better, and over the various implications of a world where such a bomb now exists.

I originally found that link from a tweet[2] by someone working at OpenAI. I am sure AI scientists are feeling similar anxiety about their research.

It's easier than ever for someone with a hundred times your computational resources to achieve things that are supposed to be "impossible", at least to the unsuspecting public who haven't grasped the rate of progress in AI.

And I am not even talking about some mass-murdering AGI. It's the boring stuff that scares me, like astroturfing chatbots whose sole purpose is to psychoanalyze individuals in order to manipulate voting behavior. This asymmetric power might already be available to those who are willing to throw a few hundred GPU-years at the problem, and I am not sure how the common man can defend against it.

[1] "Transcript of Surreptitiously Taped Conversations among German Nuclear Physicists at Farm Hall, August 1945" http://germanhistorydocs.ghi-dc.org/pdf/eng/English101.pdf

[2] https://twitter.com/karpathy/status/778286393441169408

baq | 7 years ago

I find your analogy very concerning. It looks like this is already the case with Russian troll factories, and who knows who else.