top | item 12684362

LukeB_UK | 9 years ago

Microsoft's bot (Tay) learned as people talked with it. People took advantage of that and flooded it with racist messages, so it ended up learning to be racist.

StavrosK | 9 years ago

Actually, IIRC someone discovered a debug command, "repeat" or something like that, so people would just tell it to repeat offensive sentences.
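Tay's internals were never published, so the class and command names below are hypothetical; this toy sketch only illustrates why an unfiltered repeat-and-learn loop is trivially abusable by anyone who talks to the bot:

```python
import random

class NaiveLearningBot:
    """Toy chatbot that learns by storing user utterances verbatim.

    Hypothetical sketch, not Tay's actual implementation: it shows how
    a literal "repeat" command plus unfiltered learning lets attackers
    inject arbitrary text into the bot's future replies.
    """

    def __init__(self):
        self.memory = []  # everything users have ever said, unfiltered

    def respond(self, message):
        if message.startswith("repeat "):
            # Debug-style command: echo attacker-chosen text directly.
            reply = message[len("repeat "):]
        elif self.memory:
            # Otherwise parrot something previously "learned" at random.
            reply = random.choice(self.memory)
        else:
            reply = "Hi!"
        self.memory.append(message)  # learn with no content filtering
        return reply

bot = NaiveLearningBot()
bot.respond("repeat some offensive phrase")  # echoed back verbatim
bot.respond("hello")  # may now replay the injected phrase to anyone
```

The failure mode is the combination: the echo command lets attackers choose the text, and the filter-free memory makes that text part of the bot's vocabulary for all later users.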

ljk | 9 years ago

Off topic, but how does someone "discover" it? By sheer luck?

dorfsmay | 9 years ago

So we're rediscovering parenting via AI!

Practicality | 9 years ago

I can see AI parenting becoming a new career.

JonnieCache | 9 years ago

I remember FYAD doing the same thing to an ELIZA implementation that had learning, back in the day, in like 2004. Plus ça change...

gohrt | 9 years ago

It's pretty well established that modern AI's main contributions are (a) massively larger datasets, (b) algorithms and technology to handle those massively larger datasets, and (c) boring but important parameter tuning.

The core learning algorithms are not changing.