
Researchers Find 'Mind-Control' Gaming Headsets Can Leak Users' Secrets

Sparrowvsrevolution writes "At the Usenix security conference in Seattle last week, a group of researchers from the University of California at Berkeley, Oxford University and the University of Geneva presented a study that hints at the darker side of a future where we control computers with our minds rather than a mouse. In a study of 28 subjects wearing brain-machine interface headsets built by companies like Neurosky and Emotiv and marketed to consumers for gaming and attention exercises, the researchers found they were able to extract hints directly from the electrical signals of the test subjects' brains that partially revealed private information like the location of their homes, faces they recognized and even sequences of numbers they recognized. For the moment, the experimental theft of users' private information from brain signals is more science fiction than a real security vulnerability, since it requires tricking the victim into thinking about the target information at a certain time, and it still doesn't work reliably, though it does much better than random chance. But as BMIs get more sophisticated and mainstream, the researchers say their study should serve as a warning about the privacy issues around such interfaces."
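The attack, as the summary describes it, amounts to a "guilty knowledge" test: flash candidate stimuli (PIN digits, photos of faces or locations) while the headset records, and see which candidate evokes the strongest recognition response. Below is a minimal sketch of that idea in Python; it is not the researchers' code, the numbers are ballpark guesses, and record_epoch is a hypothetical acquisition callback standing in for whatever a real headset SDK actually provides.

import numpy as np

FS = 128                   # sample rate (Hz); assumed typical for consumer EEG headsets
P300_WINDOW = (0.3, 0.5)   # seconds after stimulus onset where a recognition response tends to appear

def p300_score(epoch):
    """Mean amplitude in the P300 window minus a baseline from the first 100 ms of the epoch."""
    baseline = epoch[: int(0.1 * FS)].mean()
    start, end = int(P300_WINDOW[0] * FS), int(P300_WINDOW[1] * FS)
    return float(epoch[start:end].mean() - baseline)

def rank_candidates(candidates, record_epoch, repeats=20):
    """Flash each candidate `repeats` times, average the evoked responses,
    and return the candidates sorted from most to least 'recognized'."""
    scores = {}
    for c in candidates:
        epochs = np.stack([record_epoch(c) for _ in range(repeats)])
        scores[c] = p300_score(epochs.mean(axis=0))
    return sorted(candidates, key=lambda c: scores[c], reverse=True)

# e.g. best_guess_for_pin_digit = rank_candidates(range(10), record_epoch)[0]
# A weak but better-than-chance bias toward the recognized digit is all the
# study claims, which is why the attack needs the victim to be looking at the
# right stimuli at the right time.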
  • by i_ate_god ( 899684 ) on Friday August 17, 2012 @08:32AM (#41022847)

    ...then you have nothing to hide!

    I guess in the future, lucid dreaming will be mandatory learning at a young age so we are forced to control our dreams to prevent deviancy.

    • by mark-t ( 151149 )
      In the future, people will be communicating via technology-enabled telepathy, as commonplace as cellphones are now.
      • by Yvanhoe ( 564877 ) on Friday August 17, 2012 @09:01AM (#41023139) Journal
        Technology-enabled telepathy is actually what I call cellphones today: you hold a talisman, another talisman rings and transmits speech. Just implant it if you want, but the magic is already done.
        • Not entirely telepathy, IMO. It still requires actual speech. It's functionally equivalent to telepathy in that another person knows what you intended to convey, but the requirement of actual physical speech brings it back over the line.
        • by mark-t ( 151149 )

          Cellphones transmit sound, and speech is just a subcategory of that.

          When I said "telepathy", I meant what a person is *thinking*. This could include spoken words, but it could also include what is being seen, or even imagined or remembered.

        • by N0Man74 ( 1620447 ) on Friday August 17, 2012 @09:56AM (#41023849)

          Technology-enabled telepathy is actually what I call cellphones today: you hold a talisman, another talisman rings and transmits speech. Just implant it if you want, but the magic is already done.

          You misunderstand the meaning of the word. Telepathy is transference of thought or experience. It isn't simply the transfer of voice, words, or even expression of ideas. The roots would be "tele" and "pathe", which would translate as "distant" and "experience" respectively. For your cell phone, I think "distant voice" would be far more accurate, or "telephone"...

          Though, smart phones with cameras might also fall under "distant sight", or... "television".

          • Yes, but we are speaking English, not Greek. In English, telepathy refers specifically to the direct transference of thought messages from one mind to another mind, without first traveling through intermediate mediums such as text or speech, perhaps even disregarding the need for language. To suggest that using a cellphone is a form of telepathy is much the same as suggesting that shouting to a person across a large room from you is a form of telepathy. While this is apparently true for the definition of "t

        • Woo imagines stuff: telepathy, speaking with the dead, chasing the bad spirit away. Science and engineering instead find solutions: mobile phones, mass storage memory, medicine.
    • by Anonymous Coward

      I guess in the future, lucid dreaming will be mandatory learning at a young age so we are forced to control our dreams to prevent deviancy.

      Newspeak has a word for something like that: crimestop.

    • by mcgrew ( 92797 ) *

      I guess in the future, lucid dreaming will be mandatory learning at a young age

      And we'll have a new form of entertainment. [wikipedia.org]

    • If you thought nothing wrong then you have nothing to hide!

      It's about marketing, not crime. Your browser shows you A and then B, or your video game shows you A and then B, etc. The parts of your brain correlated with "wants" show more activity during B. Targeted marketing starts delivering ads related to B.

      This technology may not be accurate enough for a court of law but marketing does not need that level of accuracy.

    • by WTFmonkey ( 652603 ) on Friday August 17, 2012 @11:16AM (#41024705)
      That, or Faraday-cage helmets will be all the fashion rage...
      • by HeX314 ( 570571 )

        I prefer to militarize my subconscious just in case someone tries to steal my secrets in a dream.

  • by second_coming ( 2014346 ) on Friday August 17, 2012 @08:35AM (#41022871)
    take off the headset before going to the ATM :)
    • hell, with these "privacy concerns", you better take off the headset before using facebook or even google!

      • by Anonymous Coward

        And then somebody will make a law where it will be illegal to *not* use the headset!

      • hell, with these "privacy concerns", you better take off the headset before using facebook or even google!

        If you're using facebook, then privacy probably isn't important to you. Just leave it on.

        • You know that it is possible to use facebook without sacrificing too much in the way of privacy and personal details.

          1. don't put much (or any) personal details up on your page. I think mine knows the city and state I currently live in, the city and state I was born in, who I'm married to, and my birthday. that's it. FB doesn't have my detailed address or phone numbers. It doesn't know where I went to school. It doesn't know where I work or have worked. It doesn't know what I'm interested in. I'm a member o

    • by Jeng ( 926980 )

      The Google glasses you shouldn't wear in front of an ATM.

      I don't see the risk of wearing one of these helms to the ATM; well, no actual risk, just paranoia. There is more of a risk of someone putting the helm on you and flashing random numbers in front of you until they get your PIN.

    • take off the headset before going to the ATM :)

      This might be less of a problem: http://science.slashdot.org/story/04/03/18/0132222/nasa-develops-tech-to-hear-words-not-yet-spoken [slashdot.org]

  • by fuzzyfuzzyfungus ( 1223518 ) on Friday August 17, 2012 @08:36AM (#41022877) Journal

    But now ve hav vays only of collectink unemployment...

  • ...don't tell Miniluv.
  • by QilessQi ( 2044624 ) on Friday August 17, 2012 @08:47AM (#41022989)

    ...people voluntarily reveal private information like the location of their homes, what they had for breakfast, favorite sexual positions, etc.

    • by Seumas ( 6865 )

      If you don't use Facebook and Twitter, you are suspicious and may be a terrorist, serial killer, etc.

      If you don't participate in the new generation of mind-control(and reading) devices, you are suspicious and may be a terrorist, serial killer, etc.

    • by __aaeihw9960 ( 2531696 ) on Friday August 17, 2012 @08:59AM (#41023129)
      I'm not certain why this is modded funny instead of insightful. We have been programmed by popular media and life in general to devalue privacy.

      We've been taught that the only people who need privacy are terrorists or pedophiles.

      So, why would anyone need to go through the trouble of reading our minds when we've been pretty well conditioned to just hand out our personal identifiers without thought?

      It seems to me that if I need to know where you live, what your passwords are, and what you had for breakfast, I just make a NEW AND IMPROVED SUPER FUN SOCIAL MEDIA POWERED GAME!!!

      • by tgd ( 2822 ) on Friday August 17, 2012 @09:15AM (#41023297)

        I'm not certain why this is modded funny instead of insightful. We have been programmed by popular media and life in general to devalue privacy.

        Actually, you've been programmed by the media into believing privacy is something historically "normal". As a general rule in human history, privacy has been totally foreign. People always knew what their tribe, hamlet, neighborhood or building were up to. There wasn't an expectation of any sort of privacy, for anything from actions, to sexual activities, to hygiene. It just simply didn't happen.

        Privacy, as a popular expectation, has a lot more to do with manipulating people. Shame is a powerful method of control. When society convinces you that you should be embarrassed about something, the person who knows it gains a lot of power over you. If everyone knew it, there's no power. Shame, and the associated need for a concept of privacy, are constructs that arise over and over as ways of controlling a population.

        • You're basically right: as I understand it, in America around either pre-Colonial or Colonial times, you were expected to keep your windows open (or at least unshuttered and with curtains open) so that neighbors could peek in and make sure you weren't up to anything ungodly. Failure to do so would have been regarded as suspicious.

          But there was still shame in those days, despite the expected lack of privacy. You could easily be shamed for what you did right out in the open. The concepts of shame and priva

          • by captjc ( 453680 )

            I would have figured it was because there was no such thing as an electric fan, air conditioner, common forms of odor control outside of expensive incense and oils, or soundproofing. Soundproofing is kind of a prerequisite for home privacy. If people can hear you doing something they might just as well see you doing it, especially if one of the many town gossips hear you. It also helps if most homes had multiple bedrooms, much less multiple rooms, which from what I read most colonial homes did not. Actually

        • Re: (Score:3, Interesting)

          by kaiser423 ( 828989 )

          Shame is a powerful method of control. When society convinces you that you should be embarrassed about something, the person who knows it gains a lot of power over you. If everyone knew it, there's no power. Shame, and the associated need for a concept of privacy, are constructs that arise over and over as ways of controlling a population.

          This is an important note to make. It's really the pass/fail criterion behind DoD security clearances (barring other big issues). They don't necessarily care that you slept with another dude in college or smoked some marijuana or did X or did Y. They care about whether that knowledge can be used to blackmail/shame you into revealing secrets. If you admit it up front and have no problem with everyone in the world knowing that you did that, then they don't care either. But if you're a straight arrow and yo

        • As a general rule in human history, privacy has been totally foreign. People always knew what their tribe, hamlet, neighborhood or building were up to. There wasn't an expectation of any sort of privacy, for anything from actions, to sexual activities, to hygiene. It just simply didn't happen.

          And that's why there has never been an unsolved crime in all of human history!

          • by tgd ( 2822 )

            As a general rule in human history, privacy has been totally foreign. People always knew what their tribe, hamlet, neighborhood or building were up to. There wasn't an expectation of any sort of privacy, for anything from actions, to sexual activities, to hygiene. It just simply didn't happen.

            And that's why there has never been an unsolved crime in all of human history!

            Privacy and secrecy are not the same thing. You can have secrets without an expectation or right of privacy.

      • No, they just pull out the dogs and the pipe. They don't need this. It's just for the political leaders they can't use the torture thing on directly, or coerce with a job in 4 years. If you have seen that episode of 60 Minutes where the lobbyist says that once he offered them a job he OWNED them: at one point he owned HALF of the Senate and the House. And Barbara was shocked; I wasn't. I am ready for my soma and blue pill now, please leave me and the cat alone...
    • by nurb432 ( 527695 )

      Not everyone. Some only give out minimal information.

  • I am a wererabbit!
  • by flaming error ( 1041742 ) on Friday August 17, 2012 @08:48AM (#41023007) Journal

    Thanks to this research it seems pretty clear that interfacing to the brain reveals much more than where you want to move a cursor.

    Anybody working with classified info won't be allowed anywhere near these things.

    • by fustakrakich ( 1673220 ) on Friday August 17, 2012 @08:52AM (#41023047) Journal

      Anybody working with classified info won't be allowed anywhere near these things.

      For everybody else they'll be mandatory as protection against pre-crime.

    • Anybody working with classified info won't be allowed anywhere near these things.

      But they also have lives... they might have cool toys at home like these headsets for playing video games...

      shouldn't we be looking at ways to really build secure software around these devices rather than preventing people from using them...

      it is a defeatist attitude where software folks, instead of fixing their software, say don't use it...

      • The only way to be sure that any software around these devices is secure is to compile it yourself after analyzing the source code. Bonus points for writing the source code yourself.
        • That works for software, I guess. What about the hardware? Should people set up a chip foundry in their garage?

        • by Dr Fro ( 169927 )

          http://cm.bell-labs.com/who/ken/trust.html

          Not even that. How do you know the compiler isn't inserting backdoors?

    • by Anonymous Coward

      Actually, it shows they truly have no idea what input they need for the controllers, so they take everything and extrapolate instead.

      Seems to me, right now it's only useful for medical purposes.

      Still, if they could find out the exact signals they need to tap to make it work only for the game, without grabbing anything else, it would reduce the number of sensors and build costs enormously.

  • by Coisiche ( 2000870 ) on Friday August 17, 2012 @08:52AM (#41023043)

    Now it seems to me that this could make quite a useful interrogation tool, and I'd therefore be very surprised if such things are not already in use by constabulary forces.

    • by illaqueate ( 416118 ) on Friday August 17, 2012 @08:56AM (#41023093)

      This is another one of those cases where the authors want to write about a science-fiction scenario that doesn't exist, like direct neural input, so they do an EEG study that in no way resembles the scenario they are imagining. EEG is terrible for extracting information, so there's not much to worry about.

      • A lot depends on the required statistical accuracy of the information you are trying to extract, and the costs associated with a false positive or false negative.

        If you are trying to prove guilt/innocence in a court of law, or planning to bomb some remote village based on "intelligence", then it's a pretty poor method, as with torture.

        But if all you are after is a better-than-average statistical correlation, then it might catch the attention of advert-targeters and fraudsters. From inside an EEG controlled MMOG, y

      • EEGs are terrible when used as a polygraph in a court of law. This is because they are not perfect and make mistakes fairly often, and therefore cannot be used as evidence beyond reasonable doubt. However, if you want to get "some" information about a person, with only a degree of certainty, they are pretty damn good.

    • by ledow ( 319597 ) on Friday August 17, 2012 @09:07AM (#41023209) Homepage

      Not watched the Mythbusters episode about the lie detector? Apart from the absolute rubbish asserting that polygraphs work (despite there being NO scientific evidence that they have ever or could ever work, vast evidence to the contrary, and HUGE problems with their experimental setup in the first place), they do a bit where they stick people in MRIs, EEGs, fMRIs, etc.

      Basically, without a very good MRI scan happening *as* the person lies, real consequences if they are found out, complete amateurs being tested, no counter-measures being taken by them, huge amounts of analysis, etc., it's hard to say whether someone is lying. But if someone doesn't want it to be known they are lying, it's almost impossible to tell from any external measurement.

      And if you can't do it with medical-grade EEG or room-sized MRI results, you can't do it with a gaming headset for the next 30 years.

      Hell, the US is just about the only "first-world" country that's EVER allowed polygraph results to be used as "evidence" in a court of law.

      It's just that shite and unreliable a method for detecting what someone is thinking. And if you can lie automatically and convincingly, then you have nothing *TO* detect in the brain. At least until you can *literally* read people's thoughts as if they were sentences being spoken aloud about what they intend to do.

      Hell, our knowledge of the brain at the moment stems mainly from waiting for someone to have a bolt fired through their brain by accident and seeing what faculty they lose and what parts of the brain were damaged. Above and beyond that, the brain's a black box of which we can only measure "activity" by way of measuring electromagnetic changes. That's like trying to tell what colour object is inside an opaque box that you can't touch by waving a metal detector near it.

      • And if you can't do it with medical-grade EEG or room-sized MRI results, you can't do it with a gaming headset for the next 30 years.

        That is why the "it" is likely to be targeted advertising not the extraction of evidence for use in court. Marketing doesn't need the level of accuracy that a court would.

    • by jovius ( 974690 )

      The problem is that you'd first have to record a huge amount of EEG data correlated with the verbal output and the perceptions of the person to be questioned, to construct any sort of model to be used later in the interrogation room. Otherwise the EEG data is mostly meaningless noise. Besides, the person with the EEG headset can obfuscate the interrogation ad infinitum by pressing their teeth together, for example (thus enhancing alpha waves), or by moving their facial muscles in other ways.

    • Now it seems to me that could make quite a useful interrogation tool, and I'd be therefore very surprised if such things are not already in use by constabulary forces.

      I don't see us being anywhere close to this yet, but it's yet another reason to fight to defend 5th Amendment rights (for Americans in the audience).

  • I'll really worry when Google starts investing heavily in the technology.

  • Hm (Score:4, Funny)

    by rumith ( 983060 ) on Friday August 17, 2012 @09:10AM (#41023251)
    I predict a sharp growth of tinfoil hat making companies' share price.
    Anyway, this technology is amazing. How long until we (as a species) can do the same from a distance? How long until such devices are miniaturized and cost so little that it is feasible to make them ubiquitous?
  • ...Google announces it is buying Neurosky and Emotiv for an undisclosed sum of money

  • Blurb Misleading (Score:5, Informative)

    by Stormy Dragon ( 800799 ) on Friday August 17, 2012 @09:58AM (#41023871)

    If you RTFA, you discover that they can use it to confirm that you recognize particular things, so the system doesn't "leak" secrets. They can only "steal" things they already know.

  • This is the first step towards the world described in Ghost in the Shell. At some point a need for security appliances on brain-machine interfaces will arise, and they will be created. Then the brain-machine interface and the security appliance will move to an embedded solution within bodies. At that point, hacking will be a lot more dangerous, as one of the impacts of attacking, defending, and counter-attacking will be loss of confidentiality, integrity, and availability of people's own brains.
  • Comment removed based on user account deletion
    • Yeah, and we were all supposed to have flying cars "in 20 years" in the 70s and robot housemaids "in 20 years" in the 50s.

      Trying to find citation.

      I think you misspelled "sound-bite given to tabloid journalist by excitable academic."

  • If so, it could now be possible for anyone to experiment with it using these cheap headsets... great...

  • You mean to tell me that devices created to read your thoughts are, in fact, capable of reading your thoughts. What a wacky, unforeseen outcome!
  • Used in a different way, this can be fun. See Necomimi ears. [neurowear.com] These are cosplay ears which are controlled by a CPU that's reading basic brain activity. They swing up to the "pricked" position when brain activity indicates the wearer has their active attention on something, and slowly droop if the wearer isn't doing much.

    These were sold at Fanime 2012. If you call out the wearer's name, or their phone rings, the ears prick up. There are reports that people playing video games have their Necomimi ears pri

  • Works in theory, but good luck trying to keep the 12-year-olds on Xbox Live from cursing insults first.
  • Yes, I am glad to see you.

    And, no, I'm not thinking about work right now.

    Why do you ask?

  • a train that will take you far away...
  • 20 years from now, some user tries out the new free game on facebook. It's got some number puzzles, some geography aspects, and it inserts pictures of his friends here and there as he walks around a virtual city.

    On the other side, the malicious company gathers information about which people on your friends list are closest to you by how you react when you see their pictures; they probably know what city you live in from facebook, so unknown to you the city you've been walking through has been a simplified version of
