This is an awfully contrived title for an article that could be summarized as "people can find out whether or not you recognize something shown to you by monitoring electrical activity along the scalp."
If you manage to hack out a list of all the 4-digit numbers you recognize, it's trivial to brute-force which of those are PINs for your cards or for other security systems.
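To make that concrete, here's a minimal sketch (in Python, with made-up names, scores, and threshold - nothing from the paper) of how a list of per-number recognition scores would narrow 10,000 candidates to a handful worth trying:

```python
# Hypothetical sketch: an attacker has a "recognition" score for each
# 4-digit number from an EEG side channel and keeps only the familiar ones.
# The function name, scores, and threshold are illustrative assumptions.

def recognized_pins(scores, threshold=0.8):
    """Keep 4-digit numbers whose recognition score clears the threshold."""
    return [pin for pin, s in scores.items() if s >= threshold]

# Simulated scores: the victim "recognizes" their real PIN and a birth year.
scores = {f"{n:04d}": 0.1 for n in range(10000)}
scores["4921"] = 0.95   # the real card PIN
scores["1987"] = 0.90   # a birth year, also familiar

candidates = recognized_pins(scores)
print(candidates)  # ['1987', '4921'] - trivial to try against a card
```

The point is that the EEG doesn't have to identify the PIN outright; it only has to shrink the search space enough that a few online guesses suffice.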
Also, it has other practical uses - think of it as a better-than-polygraph test for questions of the type "have you seen this person?" or "does this account password belong to you?"
Well, the sci-fi-future scenario is valid, though. Imagine a world in which people use such devices regularly. It's not hard to envision some social media application or game that extracts information without you being aware of it.
This relies on an unsuspecting victim wearing a complicated nonstandard headset and then looking at a series of images / numbers slowly enough to register each of them consciously.
In what world would the victim not become suspicious?
(I appreciate things may change in the future, and if brain control headsets become common then a malware model (ad popups, for example) could provide a plausible vector for this attack.)
It's my understanding that the headset is in fact standard:
(from the actual paper) "The experiments are implemented and tested using a Emotiv EPOC BCI device"
(from the hyperbole article) "For $200-300, you can buy an Emotiv"
In what world would the victim not become suspicious? I think this result is framed as "if BCI-controlled gaming takes off, it doesn't take much to harvest personal data from gamers".
Also, I wonder what the implications are for interrogation methods (think CIA, not local police). They didn't test what happens if the victim is actively trying to resist, or if the victim has had guidance on how to resist. I would love to know.
The research (both in this paper and the previous one at Usenix Security 2012) is overhyped bullshit. The experiment was: remember this PIN to enter at the end of the experiment, and then we show you numbers and look for a recognition signal. Or they check that you recognize an image of your bank.
This is just image/text recognition research from 1980s and '90s neuroscience, regurgitated as security publications with far shittier experimental methodology and consumer equipment.
At no point did they actually demonstrate they got access to secrets you knew (e.g., your real PIN), and they certainly didn't demonstrate they could do so surreptitiously. There is no reason to believe you could actually do this, and these experiments tell us nothing we didn't already know from real experiments done by real clinical researchers: you can use the P300 signal to tell if someone recognizes a specified stimulus.
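For context, the clinical technique being referenced works roughly like this: average many stimulus-locked EEG epochs and compare the amplitude in the window around 300 ms for familiar vs. unfamiliar stimuli. A rough sketch with simulated data (sample rate, noise level, bump shape, and window are all illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                       # sample rate in Hz (illustrative)
t = np.arange(0, 0.8, 1 / fs)  # 800 ms epoch after stimulus onset

def p300_bump(t):
    # Positive deflection peaking around 300 ms, as in the oddball paradigm
    return 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))

def simulate_epochs(n, recognized):
    # Recognized stimuli carry the bump; unrecognized ones are pure noise
    signal = p300_bump(t) if recognized else np.zeros_like(t)
    return signal + rng.normal(0, 2.0, size=(n, t.size))

def mean_p300_amplitude(epochs):
    erp = epochs.mean(axis=0)            # average across trials kills noise
    window = (t >= 0.25) & (t <= 0.45)   # P300 window
    return erp[window].mean()

target = simulate_epochs(40, recognized=True)      # e.g. the victim's bank logo
distractor = simulate_epochs(40, recognized=False)

print(mean_p300_amplitude(target) > mean_p300_amplitude(distractor))  # True
```

Single epochs are buried in noise; the averaging across repeated presentations is what makes the recognition signal pop out, which is also why the attack needs the victim to sit through many slow stimulus presentations.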
This implies that "something you know" may be only as secure as "something you have."
As people integrate and evolve to include technology, the security aspects of bio-technical interfaces are going to get really interesting and damn important.
"Thought crime" will soon have a much darker and more dangerous meaning. Of course NSA will want to tap everything people are thinking, just like they're already treating all human communications "to keep us safe". I don't think it's a stretch to think they'll want to do that, too, if nothing changes, and people continue to let them do anything they want in the name of "national security".
Wow, I wasn't aware that EEGs are this cheap. Does anyone know how well these $200-300 thingies play with Linux, and how easy it is to hack around with them generally?
I'd love to log my brain activities while learning, reading or playing poker :D
Edit:
Seems like the Emotiv EPOC has an SDK that supports Linux, and also an open-source library called Emokit that was built by reverse engineering the device's communication :D
Turns out they aren't actually that cheap. To get a real EEG from Emotiv, it's $750 just for the device - the $300 version doesn't seem to actually be an EEG; they call it an EPOC and don't exactly explain what it is, but they do mention that it will not give you access to raw EEG data, which is what you need for any sort of legitimate experiment. On top of that, if you want to use the SDK, properly licensed, you need to pay an additional $500 or more. So if you want to play with an EEG and its API, the minimum price you're really paying is $1250 - far from the $300 mentioned in the article.
In addition, these cheaper consumer EEGs don't produce research-grade data, so while they are good for messing around and experimenting, if you want to get serious, you'll need to upgrade to a more expensive headset.
This is pretty common for how Emotiv presents itself. If you look through their site and write ups about their Epoc headset, you'll find the same kind of overhyped and misleading information.
It's cool that home BCI is so cheap now; I just wish they weren't trying to capitalize so heavily on it.
This is how it will go down. First, the government is going to own these companies. Then they are going to declare the technology illegal to use in private hands. Third, they will train operatives that can only be certified by government agencies to use these devices.
Sensationalist title designed to gain unjustified views. An accurate title would be "$200-$300 buys you an off-the-shelf polygraph test". Same principles - this has been known as a "lie detector" test for years, and it's defeatable.
It seems completely different from a lie detector. Classic polygraphs, in essence, measure stress responses. This measures [success of] pattern recognition. You can't use it for many yes/no lie-detector questions; however, it has the potential to be much more accurate (and less spoofable) for questions like "Do you remember this face?" or "Have you seen 'ox9j$lkjew' before? It's a password to a child-porn site we found on your computer - wondering if you have used it."
Assuming something like this actually works some day, I wonder if you could avoid it by having your secret be something that can't be encoded visually - e.g., haptic feedback or a gesture rather than a password.
Neat idea. The debit card PIN bit does not seem feasible, though, at least in a brute-force setting - finding a 6-digit PIN by showing each candidate for one second takes more than 11 days in the worst case.
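The back-of-the-envelope arithmetic, assuming one second per candidate over all 10^6 six-digit PINs:

```python
# Worst case: show every 6-digit PIN for one second each
combinations = 10 ** 6
seconds = combinations * 1.0      # one second per candidate
days = seconds / 86400            # 86,400 seconds in a day
print(round(days, 1))             # 11.6
```

Even digit-by-digit probing (10 stimuli per position, six positions) only helps if recognition of isolated digits out of context is reliable, which is doubtful.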
In any case, showing PINs that way wouldn't work - most people have muscle memory for their PINs but would not recognize them when written down.
0-Day? I knew I shouldn't have upgraded from primate.
But really, it looks like this experiment could be totally derailed by closing your eyes, or by thinking about irrelevant topics.
Still pretty neat though.
Related, the MRI lie detector: http://www.ncbi.nlm.nih.gov/pubmed/19092066