DaedPsyker | 2 years ago
Whatever your own opinion, Google did it out of what they perceived to be good intentions (and very likely business sense, given a global audience for their products). Yet their intentions directly led to unintended consequences. Google is, in essence, being a baby with a gun. Like he says, what if they ask it to solve climate change and it decides to wipe humans out?
Obviously it's still very theoretical and can't do anything like that, but the point is more that perhaps Google doesn't necessarily have the culture to truly interrogate their actions.
therealjumbo | 2 years ago
>This event is significant because it is major demonstration of someone giving a LLM a set of instructions and the results being totally not at all what they predicted.
Replace "LLM" with "computer" in that sentence: is it still novel? Laughably far from it. Unexpected results are one of the defining features of moderately complex software, going all the way back to the first person to program a computer. And some of these results aren't even unexpected, because the model is literally doing what the prompt injection tells it to. Which isn't all that surprising, but sure, anyway...
>Obviously it's still very theoretical and can't do anything like that, but the point is more that perhaps Google doesn't have the culture necessarily to truly interrogate their actions.
Oh that's definitely true.
rayiner | 2 years ago
That makes even less sense, because most countries "globally" are internally quite homogeneous. If someone in Bangladesh or China writes "show me pictures of people walking outside," it's even more jarring to deliberately insert random Latinos, East Asians, and Africans.
Given Google’s global audience, it might want to detect the customer’s location and show Chinese people pictures of Chinese people, and Japanese people pictures of Japanese people. That actually makes a lot of sense. But that’s not what they did.
hollerith | 2 years ago