Despicable. I'm still appalled at how hysterical the reaction is to this incredibly minor threat. A crime doesn't get more victimless than talking to a chatbot. This is just plain entrapment for a harmless thoughtcrime.
I once read that paedophiles have, on average, a lower IQ; I wonder if that's because they only catch the dumber ones. I know a chatbot wouldn't fool me. (And don't misquote me on this, I'm not admitting I'm a paedophile, you know what I mean.)
It's not "victimless" when the same sexual predator engages with a real person and starts trying to obtain sexy pics and blackmail material with a view to meeting later down the line. I'm not a fan of the specific implementation of the bot described in the paper, but I'm quite happy with the basic principle that people trying to sexually corrupt algorithms should be in jail alongside people trying to hire imaginary hitmen or traffic imaginary slaves.
Besides, if chatrooms were widely suspected of being full of anti-paedophile chatbots, it would be a pretty good deterrent from paedophile participation even if they were actually pretty ineffectual at catching people. From what I remember of chatrooms from when I actually was about 14 - not many of the human participants would have passed the Turing test - the standard of conversation in a room full of bots might actually improve...
I feel ridiculous having to claim this up front, but I am also not a pedophile... however, I would very much like to find this chatbot and have a very long conversation. In fact, I might even be inclined to write a reverse chat bot that actually initiated conversations to this chat bot from a myriad of IP addresses, just to illustrate the innocuousness of it all.
The quality of the research leaves, in my opinion, much to be desired: many fuzzy assumptions, a lot of hand-waving, and plenty of the "and then, magic happens" syndrome. This wasn't published at an ACM or IEEE conference, which is typically a first warning sign.
Sadly, the BBC (and other major media outlets, though it's the second time I've observed it on the BBC website this month) has a knack for taking academic papers from a year ago (or more) and writing subpar articles about them.
Completely agree. Also, the use of Google Translate is not going to scale at all. Many of the premises are theoretical, and the criteria used to determine the "level of disturbing content" are at best laughable.
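To illustrate the point: a naive keyword-weighted scorer (purely my own sketch with made-up keywords and weights, not what the paper actually implements) assigns the same "disturbing" score to a sincere message and an ironic one, because bag-of-words criteria have no notion of context:

```python
# Hypothetical sketch of a keyword-weighted "disturbing content" scorer.
# This is NOT the paper's implementation -- the keywords and weights are
# invented here to show why such criteria struggle with context and irony.

DISTURBING_WEIGHTS = {
    "meet": 2,
    "alone": 2,
    "secret": 3,
    "photo": 3,
}

def disturbing_score(message: str) -> int:
    """Sum the weights of flagged keywords appearing in the message."""
    words = message.lower().split()
    return sum(DISTURBING_WEIGHTS.get(w, 0) for w in words)

sincere = "send me a photo and keep it secret"
ironic = "oh sure a secret photo because bots totally ask for that"

print(disturbing_score(sincere))  # 6
print(disturbing_score(ironic))   # 6 as well -- the irony changes nothing
```

Both messages score identically, so any threshold that flags the first also flags the second.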
That's an interesting question. I wonder if that could be a usable defense. "What did I do wrong? I just thought I was talking to one of those bots, not a real person!"
The worst part is that 14, 15, 16 were very common marriage ages for women in our own civilization until a few generations ago. Since being physically attracted to girls that age is now considered a perversion, the inescapable conclusion is that the majority of the great people in history, and of our own ancestors, were pedophiles. We are all descended from pedophiles, and our civilization stands on the shoulders of giant pedophiles. That should make people more than a little uncomfortable.
"But researchers admit that it does have limitations and will need to be monitored. Although it has broad conversational abilities, it is not yet sophisticated enough to detect certain human traits like irony."
Like many others are saying, this does sure look like entrapment.
I cannot recommend the movie Outing [1] enough! It's a documentary following a paedophile in Germany. It turns out there are a lot of "non-practicing" paedophiles who recognize that they shouldn't act on their desires, but still have those desires. I think they'd be very easy targets for entrapment if someone approached them like this.
It's funny, but it really would make for a fabulous Stross-ian sci-fi novel. What if a strong AI were accidentally developed whose "prime directive" was to find paedophiliac tendencies in people and legally entrap them? Give it lesser god-like powers and you've got yourself a novel!
This seems really potentially dangerous, which the article hints at. Potential for entrapment and the software can't detect irony. How could this possibly go wrong?
| For example, if the suspect does not appear to be
| enticed into having a conversation, the software
| can appear offended or get more insistent.
This almost makes it sound like it was designed for entrapment. If someone doesn't want to talk to the chatbot, it will try and force a conversation? Really?
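The "more insistent" behaviour reads like a simple escalation policy. A toy sketch (my own guess at the general shape, not Negobot's actual code) shows why it feels like entrapment: mere silence is enough to push the bot into pushier and pushier stages:

```python
# Toy escalation policy -- a guess at the general shape, not the paper's
# actual state machine. The bot moves to a pushier stage whenever the
# other party disengages, and backs off one stage when they reply.

STAGES = ["neutral", "offended", "insistent", "desperate"]

def next_stage(current: str, suspect_replied: bool) -> str:
    """Escalate one stage on silence; de-escalate toward neutral on a reply."""
    i = STAGES.index(current)
    if suspect_replied:
        return STAGES[max(i - 1, 0)]
    return STAGES[min(i + 1, len(STAGES) - 1)]

# Someone who simply ignores the bot is driven up the ladder anyway:
stage = "neutral"
for _ in range(3):
    stage = next_stage(stage, suspect_replied=False)
print(stage)  # desperate
```

Under a policy like this, not wanting to talk is itself what triggers the most aggressive behaviour.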
"But researchers admit that it does have limitations and will need to be monitored. Although it has broad conversational abilities, it is not yet sophisticated enough to detect certain human traits like irony."
Luckily people using informal internet chatrooms never make ironic statements, so this software will be effective.
The conventional wisdom is that adults who solicit minors online typically use deceit (they have to, since minors have no interest in sex), and if you want to deceive someone into having sex in Spain, your victim has to be at least 16 years old. (Source: your link) So the chatbot would still catch criminal activity.
But I think this paedophile-hunting behaviour around the world is going absurdly far, and is causing more damage than it fixes.
If you want to get at genuinely problematic people, go after those Halliburton guys kidnapping girls under 10 to sell in Africa (yes, that not only happened, it is still happening; the US government decided their services are worth more than the problems they create).
Chasing the small number of people who chat with minors, or who like lolicon, or even films of real people, is more likely to catch the wrong people than to do any good.
"For example, if the suspect does not appear to be enticed into having a conversation, the software can appear offended or get more insistent."
This is a bit concerning: this feature sounds like it may try to provoke people into conversation in order to purposefully trap them. I really hope that's not what happens...
This. I don't recall the source or many details, but there was a guy who had chatted with a law enforcement official posing as underage in a chat room, but he never showed up for their "date". They sent the SWAT team and news media to his house and he shot himself. They made the point that lots of people suspend disbelief to live out a fantasy in a chatroom, knowing the other party is likely not who they say they are. Was this any different? Certainly a moral gray area, but not worth the guy's life for some sensationalism.
lol entrapment bot. maybe they can also release terrorist bot that denounces the west at set intervals and constantly private messages you "Blow up the embassy y/n? ... pls hold for FBI operator"
I can't believe so many people in these comments are apologists for paedophiles (and related terms). Having a problem with the way it is policed is okay. Accepting the act is not.
Direct link to PDF: http://paginaspersonales.deusto.es/claorden/publications/201...
Talking to Negobot would probably be the equivalent of this: http://gizmodo.com/bank-of-americas-twitter-account-is-one-r...
It's a cool sign of the times that such questions can even be asked somewhat seriously and not just as a sci-fi future contemplation exercise.
Something tells me they don't really know what "paedophile" means.
http://dailycaller.com/2013/06/27/texas-teen-makes-violent-j...
Yeah, I don't think I want this program to be trying to entrap people then turn the police on them.
[1] http://www.imdb.com/title/tt2076292/
Future AIs will have a predilection for Bieber and meeting with strangers.
Unless, of course, the evaluator understands TXTinese.
http://en.wikipedia.org/wiki/Ages_of_consent_in_Europe#Spain
Reminds me of the classic Monkey Dust sketch: https://www.youtube.com/watch?v=APG0rNedEwk
Easy congratulations all around, more funding, happy PTA parents, "justice".
Also, and this is debatable, talking with a chat "robot" in no way constitutes the crime of pedophilia.