Researchers Find 'Mind-Control' Gaming Headsets Can Leak Users' Secrets

Sparrowvsrevolution writes "At the Usenix security conference in Seattle last week, a group of researchers from the University of California at Berkeley, Oxford University, and the University of Geneva presented a study that hints at the darker side of a future where we control computers with our minds rather than a mouse. In a study of 28 subjects wearing brain-machine interface headsets built by companies like Neurosky and Emotiv and marketed to consumers for gaming and attention exercises, the researchers were able to extract hints directly from the electrical signals of the test subjects' brains that partially revealed private information: the location of their homes, faces they recognized, and even sequences of numbers they recognized. For the moment, this experimental theft of private information from brain signals is more science fiction than a real security vulnerability: it requires tricking the victim into thinking about the target information at a certain time, and it still doesn't work reliably (though it performs much better than random chance). But as BMIs become more sophisticated and mainstream, the researchers say their study should serve as a warning about the privacy issues such interfaces raise."
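
For the technically curious: attacks like this lean on the P300 event-related potential, a characteristic voltage deflection roughly 300 ms after the brain sees a stimulus it recognizes as meaningful. Flash candidate digits, faces, or map locations and watch which one evokes the response. Below is a minimal sketch of that idea in Python, assuming you already have epoched single-channel EEG around each flash; the epoch layout, time windows, and scoring are illustrative stand-ins, not the researchers' actual trained classifier.

    import numpy as np

    SAMPLE_RATE = 256      # Hz, typical of consumer EEG headsets
    EPOCH_START = -0.1     # each epoch is assumed to span -100 ms..+600 ms
                           # around stimulus onset

    def p300_score(epoch, sr=SAMPLE_RATE):
        # Mean amplitude 250-450 ms after stimulus onset, minus the
        # pre-stimulus baseline: a crude stand-in for the classifiers
        # the researchers actually trained.
        onset = int(-EPOCH_START * sr)          # sample index of t = 0
        baseline = epoch[:onset].mean()
        lo, hi = onset + int(0.25 * sr), onset + int(0.45 * sr)
        return epoch[lo:hi].mean() - baseline

    def guess_recognized(stimuli, epochs):
        # Rank the flashed stimuli by P300-like response; the top item
        # is the best guess for the one the user recognized.
        scores = [p300_score(np.asarray(e)) for e in epochs]
        return [stimuli[i] for i in np.argsort(scores)[::-1]]

    # e.g. flash the digits 0-9 for one PIN position while recording,
    # then guess_recognized(list(range(10)), epochs) returns the digits
    # ordered from most to least likely recognized.

In the study this ranking only beat random guessing by a modest margin, which is why the authors frame it as a warning rather than a working exploit.
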
  • by i_ate_god ( 899684 ) on Friday August 17, 2012 @09:32AM (#41022847)

    ...then you have nothing to hide!

    I guess in the future, lucid dreaming will be mandatory learning at a young age, so we are forced to control our dreams to prevent deviancy.

  • by flaming error ( 1041742 ) on Friday August 17, 2012 @09:48AM (#41023007) Journal

    Thanks to this research it seems pretty clear that interfacing to the brain reveals much more than where you want to move a cursor.

    Anybody working with classified info won't be allowed anywhere near these things.

  • by Coisiche ( 2000870 ) on Friday August 17, 2012 @09:52AM (#41023043)

    Now it seems to me that this could make quite a useful interrogation tool, and I'd therefore be very surprised if such things are not already in use by constabulary forces.

  • by fustakrakich ( 1673220 ) on Friday August 17, 2012 @09:52AM (#41023047) Journal

    Anybody working with classified info won't be allowed anywhere near these things.

    For everybody else, they'll be mandatory as protection against pre-crime.

  • by ledow ( 319597 ) on Friday August 17, 2012 @10:07AM (#41023209) Homepage

    Not watched the Mythbusters episode about the lie detector? Apart from the absolute rubbish asserting that polygraphs work (despite there being NO scientific evidence that they have ever or could ever work, vast evidence to the contrary, and HUGE problems with their experimental setup in the first place), they do a bit where they stick people in MRIs, EEGs, fMRIs, etc.

    Basically, even to say whether someone is lying you need a very good MRI scan happening *as* the person lies, real consequences if they are found out, complete amateurs as test subjects taking no counter-measures, huge amounts of analysis, etc. And if someone doesn't want it known that they are lying, it's almost impossible to tell from any external measurement.

    And if you can't do it with medical-grade EEG or room-sized MRI results, you can't do it with a gaming headset for the next 30 years.

    Hell, the US is just about the only "first-world" country that's EVER allowed polygraph results to be used as "evidence" in a court of law.

    It's just that shite and unreliable a method of detecting what someone is thinking. And if you can lie automatically and convincingly, then you have nothing *TO* detect in the brain. At least until you can *literally* read people's thoughts as if they were sentences being spoken aloud about what they intend to do.

    Hell, our knowledge of the brain at the moment stems mainly from waiting for someone to have a bolt fired through their brain by accident and seeing what faculty they lose and what parts of the brain were damaged. Above and beyond that, the brain's a black box whose "activity" we can only measure by way of electromagnetic changes. That's like trying to tell what colour object is inside an opaque box that you can't touch, by waving a metal detector near it.

  • by tgd ( 2822 ) on Friday August 17, 2012 @10:15AM (#41023297)

    I'm not certain why this is modded funny instead of insightful. We have been programmed by popular media and life in general to devalue privacy.

    Actually, you've been programmed by the media into believing that privacy is something historically "normal". As a general rule in human history, privacy has been totally foreign: people always knew what their tribe, hamlet, neighborhood, or building were up to. There was no expectation of privacy for anything, from actions to sexual activities to hygiene. It simply didn't happen.

    Privacy, as a popular expectation, has a lot more to do with manipulating people. Shame is a powerful method of control. When society convinces you that you should be embarrassed about something, the person who knows it gains a lot of power over you; if everyone knew it, there would be no power. Shame, and the associated need for a concept of privacy, are constructs that arise over and over as ways of controlling a population.

  • by kaiser423 ( 828989 ) on Friday August 17, 2012 @10:39AM (#41023635)

    Shame is a powerful method of control. When society convinces you that you should be embarrassed about something, the person who knows it gains a lot of power over you; if everyone knew it, there would be no power. Shame, and the associated need for a concept of privacy, are constructs that arise over and over as ways of controlling a population.

    This is an important note to make. It's really the pass/fail criterion behind DoD security clearances (barring other big issues). They don't necessarily care that you slept with another dude in college, or smoked some marijuana, or did X or Y. They care about whether that knowledge can be used to blackmail or shame you into revealing secrets. If you admit it up front and have no problem with everyone in the world knowing you did it, then they don't care either. But if you're a straight arrow who is absolutely ashamed of the time you took an extra-long lunch break but still charged normal hours, then they're going to think twice about you. It's all about there being nothing to shame, blackmail, or bribe you with in your background (and not having erratic or bad judgement), not about how goody-two-shoes you are.

    In this sense, social media makes the vetting process easier: it's easy to check whether your judgement is good or bad, and easy to see whether you're ashamed of something (i.e., did you, or would you, post about it).

  • Ghost in the Shell (Score:2, Interesting)

    by Gyorg_Lavode ( 520114 ) on Friday August 17, 2012 @10:58AM (#41023877)
    This is the first step towards the world described in Ghost in the Shell. At some point, security appliances for brain-machine interfaces will be needed and created. Then both the interface and the security appliance will move to embedded solutions within our bodies. At that point, hacking will be a lot more dangerous, because the stakes of attacking, defending, and counter-attacking will include the confidentiality, integrity, and availability of people's own brains.
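
One plausible shape for such a security appliance is a gateway that never exposes raw EEG to applications, only the coarse, high-level intents the user has granted them, so the recognition side channel described above never reaches untrusted code. A hypothetical sketch; every name and API here is invented for illustration:

    from dataclasses import dataclass, field

    @dataclass
    class BciGateway:
        # app name -> set of intents the user has granted it
        grants: dict = field(default_factory=dict)

        def grant(self, app, intents):
            self.grants[app] = set(intents)

        def deliver(self, app, intent, payload):
            # Forward a decoded, high-level intent to an app only if it
            # was granted; raw EEG samples never cross this boundary,
            # which is where the recognition side channel lives.
            if intent not in self.grants.get(app, set()):
                raise PermissionError(f"{app!r} was not granted {intent!r}")
            return (app, intent, payload)

    gw = BciGateway()
    gw.grant("game", {"move_cursor"})
    gw.deliver("game", "move_cursor", (3, -1))   # allowed
    # gw.deliver("game", "raw_eeg", b"...")      # would raise PermissionError

The design choice mirrors mobile OS permission models: untrusted code sees derived events, never the sensor stream they were decoded from.
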
