
'Virtual Lolita' Aims to Trap Chatroom Paedophiles

28 points | ytNumbers | 12 years ago | bbc.co.uk | reply

48 comments

[+] kmm|12 years ago|reply
Despicable. I'm still appalled at how hysterical the reaction is to this incredibly minor threat. Talking to a chatbot, it doesn't get more victimless a crime. This is just plain entrapment for a harmless thoughtcrime.

I once read that paedophiles have on average a lower IQ, I wonder if that is because they only catch the dumber ones. I know a chatbot wouldn't fool me. (And don't misquote me on this, I'm not admitting I'm a paedophile, you know what I mean.)

[+] unoti|12 years ago|reply
Citizen! You yourself are perilously close to committing one or more Thoughtcrimes now, and don't think we're not watching!
[+] notahacker|12 years ago|reply
It's not "victimless" when the same sexual predator engages with a real person and starts trying to obtain sexy pics and blackmail material with a view to meeting later down the line. I'm not a fan of the specific implementation of the bot described in the paper, but I'm quite happy with the basic principle that people trying to sexually corrupt algorithms should be in jail alongside people trying to hire imaginary hitmen or traffic imaginary slaves.

Besides, if chatrooms were widely suspected of being full of anti-paedophile chatbots, it would be a pretty good deterrent from paedophile participation even if they were actually pretty ineffectual at catching people. From what I remember of chatrooms from when I actually was about 14 - not many of the human participants would have passed the Turing test - the standard of conversation in a room full of bots might actually improve...

[+] bmelton|12 years ago|reply
I feel ridiculous having to claim this up front, but I am also not a pedophile... however, I would very much like to find this chatbot and have a very long conversation. In fact, I might even be inclined to write a reverse chat bot that actually initiated conversations to this chat bot from a myriad of IP addresses, just to illustrate the innocuousness of it all.
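For what it's worth, the "reverse chat bot" idea would be trivial to sketch. Here's a toy Python version; everything in it (the `send_to_chatbot` stand-in, the canned lines) is invented for illustration, since nobody here knows what protocol the real chatroom speaks:

```python
import itertools

# Canned, deliberately innocuous small talk for the reverse bot to send.
CANNED_LINES = [
    "hi, how was school today?",
    "did you see the game last night?",
    "what's your favourite band?",
]

def send_to_chatbot(message):
    # Stand-in for a network call to the entrapment bot; just echoes a reply.
    return "bot reply to: " + message

def reverse_bot_session(session_id, turns=3):
    """One innocuous conversation: send `turns` canned lines, log the replies."""
    lines = itertools.cycle(CANNED_LINES)
    transcript = []
    for _ in range(turns):
        msg = next(lines)
        transcript.append((session_id, msg, send_to_chatbot(msg)))
    return transcript

def run_sessions(n_sessions=4):
    # A real version might run these concurrently from many addresses;
    # here they just run in sequence.
    return [reverse_bot_session(i) for i in range(n_sessions)]
```

The point of the sketch is just that flooding the bot with harmless traffic is a few dozen lines of code.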
[+] GuiA|12 years ago|reply
Google scholar page for the paper: http://scholar.google.com/citations?view_op=view_citation&hl...

Direct link to PDF: http://paginaspersonales.deusto.es/claorden/publications/201...

In my opinion, the quality of the research leaves much to be desired: many fuzzy assumptions, a lot of hand-waving, and a bad case of the "and then, magic happens" syndrome. This wasn't published at an ACM or IEEE conference, which is typically a first warning sign.

Sadly, the BBC (and other major media outlets, but this is the second time I've observed it on the BBC website this month) has a knack for taking academic papers from a year ago (or more) and writing subpar articles about them.

[+] unoti|12 years ago|reply
Is it illegal in Spain to engage in lewd discussions with a computer program?

It's a cool sign of the times that such questions can even be asked somewhat seriously and not just as a sci-fi future contemplation exercise.

[+] BHSPitMonkey|12 years ago|reply
That's an interesting question. I wonder if that could be a usable defense. "What did I do wrong? I just thought I was talking to one of those bots, not a real person!"
[+] jcromartie|12 years ago|reply
> posing as a 14-year-old girl to spot paedophiles

Something tells me they don't really know what "paedophile" means.

[+] Camillo|12 years ago|reply
The worst part is that 14, 15, 16 were very common marriage ages for women in our own civilization until a few generations ago. Since being physically attracted to girls that age is now considered a perversion, the inescapable conclusion is that the majority of the great people in history, and of our own ancestors, were pedophiles. We are all descended from pedophiles, and our civilization stands on the shoulders of giant pedophiles. That should make people more than a little uncomfortable.
[+] outworlder|12 years ago|reply
Not only that, but in some countries, 14 is the age of consent.
[+] mgkimsal|12 years ago|reply
"But researchers admit that it does have limitations and will need to be monitored. Although it has broad conversational abilities, it is not yet sophisticated enough to detect certain human traits like irony"

http://dailycaller.com/2013/06/27/texas-teen-makes-violent-j...

Yeah, I don't think I want this program to be trying to entrap people then turn the police on them.

[+] bajsejohannes|12 years ago|reply
Like many others are saying, this sure does look like entrapment.

I cannot recommend the movie Outing [1] enough! It's a documentary following a paedophile in Germany. It turns out there are a lot of "non-practicing" paedophiles who recognize that they shouldn't act on their desires, but still have them. I think they'd be very easy targets for entrapment if someone approached them like this.

[1] http://www.imdb.com/title/tt2076292/

[+] Zikes|12 years ago|reply
Headlines of tomorrow: "Paedophile entrapment bot first in history to pass Turing test"

Future AIs will have a predilection for Bieber and meeting with strangers.

[+] noonespecial|12 years ago|reply
It's funny, but it really would make for a fabulous Stross-ian sci-fi novel. What if a strong AI accidentally developed whose "prime directive" was to find pedophiliac tendencies in people and legally entrap them? Give it lesser god-like powers and you've got yourself a novel!
[+] astrodust|12 years ago|reply
This sort of bot has a better chance of passing the Turing test than an average girl.

Unless, of course, the evaluator understands TXTinese.

[+] joshdotsmith|12 years ago|reply
This seems potentially quite dangerous, as the article itself hints: there's potential for entrapment, and the software can't detect irony. How could this possibly go wrong?
[+] pyre|12 years ago|reply

  | For example, if the suspect does not appear to be
  | enticed into having a conversation, the software
  | can appear offended or get more insistent. 
This almost makes it sound like it was designed for entrapment. If someone doesn't want to talk to the chatbot, it will try and force a conversation? Really?
[+] mpeg|12 years ago|reply
Yeah. It mentions the bot getting more insistent if "the suspect does not appear to be enticed into having a conversation", how is this a good thing?
[+] samrift|12 years ago|reply
"But researchers admit that it does have limitations and will need to be monitored. Although it has broad conversational abilities, it is not yet sophisticated enough to detect certain human traits like irony"

Luckily people using informal internet chatrooms never make ironic statements, so this software will be effective.

[+] elchief|12 years ago|reply
Oddly enough, the age of consent in Spain is 13.

http://en.wikipedia.org/wiki/Ages_of_consent_in_Europe#Spain

[+] haakon|12 years ago|reply
The conventional wisdom is that adults who solicit minors online typically use deceit (they have to, since minors have no interest in sex), and if you want to deceive someone into having sex in Spain, your victim has to be at least 16 years old. (Source: your link) So the chatbot would still catch criminal activity.
[+] speeder|12 years ago|reply
The technology part is interesting...

But I think this paedophile-hunting behaviour around the world is going absurdly far, and causing more damage than it fixes.

If you want to get really problematic people, go after those Halliburton guys kidnapping girls under 10 to sell in Africa (yes, that not only happened, it is still happening; the US government decided their services are worth more than the problems they create).

Chasing the small number of people who chat with minors, or who like lolicon, or even videos of real people, is more likely to catch the wrong people than to do any good.

[+] gadders|12 years ago|reply
Well, Halliburton isn't alone in this: UN staff trade food for sexual favours in famines.
[+] computer|12 years ago|reply
You're going to have to add some references if you're making claims that the US government is contracting with people/child smugglers.
[+] beachstartup|12 years ago|reply
why? it's a lot easier to nab chatroom losers than go after the largest contractors in the world.

easy congratulations all around, more funding, happy PTA parents, "justice"

[+] eliasmacpherson|12 years ago|reply
Can't find the link between Halliburton and sex trafficking to Africa, can you provide one please?
[+] miffin|12 years ago|reply
"For example, if the suspect does not appear to be enticed into having a conversation, the software can appear offended or get more insistent." This is a bit concerning; this feature sounds like it may try to provoke people into conversation to purposefully trap them. I really hope that's not what happens...
[+] njharman|12 years ago|reply
"Trap" as in entrapment makes using this to arrest or expose people wrong.

Also, and this is debatable, talking with a chat "robot" in no way constitutes the crime of pedophilia.

[+] TallGuyShort|12 years ago|reply
This. I don't recall the source or many details, but there was a guy who had chatted with a law enforcement official posing as underage in a chat room, but he never showed up for their "date". They sent the SWAT team and news media to his house and he shot himself. They made the point that lots of people suspend disbelief to live out a fantasy in a chatroom, knowing the other party is likely not who they say they are. Was this any different? Certainly a moral gray area, but not worth the guy's life for some sensationalism.
[+] dobbsbob|12 years ago|reply
lol entrapment bot. maybe they can also release terrorist bot that denounces the west at set intervals and constantly private messages you "Blow up the embassy y/n? ... pls hold for FBI operator"
[+] drunkenmasta|12 years ago|reply
Not too long ago, the same kind of witch-hunts were going on against homosexuals (Alan Turing comes to mind). And now?
[+] JEVLON|12 years ago|reply
I can't believe so many people in these comments are apologists for paedophiles (and related terms). Having a problem with the way it is policed is okay. Accepting the act is not.