top | item 35290846


Shindi | 2 years ago

Open source does NOT equal safe; it's actually worse. If you release something that can wreak havoc and it's open source, it will ALWAYS be out there, no off switch.

Imagine we later discover that an open source LLM is way more powerful than we realize. For example, GPT-3 is pretty powerful, but it really hasn't been out that long. Imagine what we discover it can do in 3-5 years, without even accounting for more advanced models like GPT-4, which is already out. Imagine someone discovers some really powerful, dangerous capability years down the line.

People can't imagine what can go wrong with LLMs, but think about the recourse we have for bad behavior online now: arresting people, forcing legal action against individuals/companies, sanctions or financial repercussions. Notice these aren't technical barriers, these are social barriers. You can't do these things against language models!


JohnFen | 2 years ago

> it will ALWAYS be out there, no off switch.

Well, to be fair, that's where we are anyway -- open source or not.