How To Index and Search a Video By Emotion

robotsrule writes "Here's a demonstration video of EmoRate, a software program that uses the Emotiv 14-electrode EEG headset to record your emotions via your facial expressions. In the video you'll see EmoRate record my emotions while I watch a YouTube video, then index that video by emotion, and then navigate that video simply by remembering a feeling. The web page for EmoRate explains how I used Emotiv's SDK to build the software program, and how I trained the system by watching emotionally evocative videos on YouTube while wearing the headset."
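
For readers who want a concrete picture of what "index that video by emotion, then navigate it by remembering a feeling" amounts to, here is a minimal sketch of the indexing idea in Python. It is not the EmoRate code and does not use Emotiv's SDK; the sample data, labels, and function names are illustrative assumptions, with the headset's output stubbed as (timestamp, emotion, intensity) tuples.

    # Minimal sketch: tag timestamps with emotion labels, then seek to the peak.
    from collections import defaultdict

    def build_emotion_index(samples):
        """samples: iterable of (timestamp_sec, emotion_label, intensity 0..1)."""
        index = defaultdict(list)
        for t, emotion, strength in samples:
            index[emotion].append((t, strength))
        return index

    def seek_by_emotion(index, emotion):
        """Return the timestamp where the given emotion peaked, or None."""
        hits = index.get(emotion)
        if not hits:
            return None
        return max(hits, key=lambda pair: pair[1])[0]

    # Pretend these samples were recorded while watching a clip.
    samples = [(12.0, "amused", 0.4), (95.5, "amused", 0.9), (140.0, "sad", 0.7)]
    idx = build_emotion_index(samples)
    print(seek_by_emotion(idx, "amused"))   # -> 95.5; tell the player to jump here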

  • by Kitkoan ( 1719118 ) on Thursday August 26, 2010 @06:14PM (#33387556)
    is right here. [ratethatemo.com]
  • Brain-computer interfaces really taking off via emos?

    This is not the augmented / transhuman future I imagined...

    • Brain-computer interfaces really taking off via emos?

      For me this raises the question: if an emo watches EmoRate, should EmoRate show the emo video material matching the emo's emotional state, thus amplifying and affirming the emo's emotions and entering a self-referencing emo-loop...

      Or should one design an algorithm detecting emotional changes of users watching emoRate, and serve emo content countering emo's current emotional state in the opposite direction?

      Say, distilling your facebook data:

      "subject x in ager

      • For me this raises the question: if an emo watches EmoRate, should EmoRate show the emo video material matching the emo's emotional state, thus amplifying and affirming the emo's emotions and entering a self-referencing emo-loop...

        Or should one design an algorithm detecting emotional changes of users watching emoRate, and serve emo content countering emo's current emotional state in the opposite direction?

        I'd say that depends on whether it's a positive or negative emotion. If I feel extraordinarily happy, I certai

    • by flyneye ( 84093 )

      I think you could use the headset for assigning particular functions to particular facial expressions as they have. I'll take the "magic" out of this contraption by noting that it doesn't read emotions; it merely senses pre-defined facial expressions and compiles the data. Over many years of research with EEG, both Dr. Walter and I found that even one sensor was enough, and holding it between fingertips (where nerve endings are plentiful) rather than pasting it on the face or scalp produced far better sen

  • It would be interesting to run this a few times... not sequentially, but maybe once every few months to give yourself time to reset (for lack of a better word), then rewatch whatever it was you recorded with this device and diff the results to see if there was a drift, and in what areas. I have to assume you wouldn't always respond the same way, and the results could be highly interesting, perhaps even more so to the field of psychiatry in allowing a more exact gauging of the effectiveness of whatever drug they are administering to a patient.
    • by Tynin ( 634655 )
      Sorry for replying to myself. After a casual search I found it was naive of me to think EEGs hadn't already been used in psychiatry for a while now. Seems it is rather commonplace. Still, it's cool the tech is being used in new and more accessible ways.
    • Re: (Score:3, Interesting)

      by Kitkoan ( 1719118 )

      It would be interesting to run this a few times... not sequentially, but maybe once every few months to give yourself time to reset (for lack of a better word), then rewatch whatever it was you recorded with this device and diff the results to see if there was a drift, and in what areas. I have to assume you wouldn't always respond the same way, and the results could be highly interesting, perhaps even more so to the field of psychiatry in allowing a more exact gauging of the effectiveness of whatever drug they are administering to a patient.

      I doubt you'd get nearly the same reactions. Things like boredom (reruns don't always get everyone's attention) and (since these are static videos) predictability can and will detach your emotions, and their intensity, from what you're watching. Think of horror movies: sure, they can make you feel great fear during the first watch, but they rarely cause that much fear during the second viewing, let alone the third, fourth, etc...

      • Unfortunately, it's probably harder for these things to detect boredom vs. interest than simpler emotions, but it would be cool if you could set it so your player runs faster when you're bored and slower when you're interested (and there are already sound-adjuster programs out there so you can run faster or slower without distorting the sound pitch badly). A rough sketch of such a speed mapping follows this comment.

        You want to use this in interactive mode, not batch, so it's reacting to what you're interested in or bored about now, not what you felt about it last time.
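
        The idea above, mapping a live boredom reading onto playback speed, can be sketched in a few lines. This is only a hypothetical illustration: the boredom score, its 0..1 range, and the rate limits are assumptions, and nothing here talks to a real headset or player.

            # Hypothetical: map a live "boredom" score (0..1) to a playback rate,
            # speeding up dull stretches and slowing down engaging ones.
            def playback_rate(boredom, min_rate=0.85, max_rate=2.0):
                """0 = fully engaged -> slow, 1 = bored stiff -> fast (linear)."""
                clamped = max(0.0, min(1.0, boredom))
                return min_rate + (max_rate - min_rate) * clamped

            # Feed the result to any player that supports rate changes with pitch correction.
            for score in (0.1, 0.5, 0.9):
                print(score, round(playback_rate(score), 2))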

  • So the guy moves his face in accordance with his emotions and then, guess what, he can make the same gestures to go back to the place in the video where he previously made those gestures. Woot? If anything this only shows just how far off this sort of technology really is. Last I checked, I'm pretty sure I had more than 4 emotions... Am I missing something about how amazing this is?

    • Re: (Score:1, Interesting)

      by Anonymous Coward

      Yea, I get the feeling the software trained the person rather than the other way around. But this is the first step in software thinking just like people; it'll be here in 15 years.

  • by BitHive ( 578094 ) on Thursday August 26, 2010 @06:30PM (#33387686) Homepage

    If you want the raw EEG data, you have to buy Emotiv's $10,000 SDK. I'll stick with my NeuroSky headset for now.

    • Re: (Score:3, Interesting)

      by c0lo ( 1497653 )
      Neither has a Linux port for the SDK (wink - wonder why this is posted on /. then?)
      Seriously, anyone with some references on similar devices that have Linux support?
      • There's an SDK version in the works, with a beta promised any day now. Closed-source drivers, though, which of course sucks, but this is a small company trying to make it in a very small market; could have been worse, could have needed a dongle. Oh, wait :(

        So yes, the Linux support could be better, but they are at least making an effort in this area rather than having to be browbeaten about it for many years.

    • by MrBandersnatch ( 544818 ) on Thursday August 26, 2010 @07:14PM (#33388002)

      Nonsense!

      The version to access EEG data is $750. They have a $500 developer version and a $299 consumer version - I don't even think they have a $10K product! As for Neurosky, do you mean that toy where you move the pong ball up and down? Sorry, genuinely interested since I hadn't thought they had done much beyond that.

      • by BitHive ( 578094 )

        It was $10,000 a year or so ago. I seem to remember a $15k "research" version too but I'm guessing there weren't very many takers.

      • by Idiomatick ( 976696 ) on Thursday August 26, 2010 @10:33PM (#33389050)
        Still completely fucking retarded. I was really excited about this product before it came out. Was a part of the forums. Then they decided they would charge HEAVILY for the SDK to regular users. Who is their audience? People that want to toy with the thing. That is their ONLY audience until there are several thousand apps for the thing and it is integrated into a bunch of games like the Star Wars MMO. And that will only happen if you have lots of developers. Which Emotiv is shitting on.

        They likely made 20-50 grand selling the dev kits. Whereas the Star Wars MMO having mind control in it would sell at least 1,000 headsets, likely way, way more. They would have to try not to sell an additional 500-1,000 headsets if they opened the SDK.

        Explaining this obvious business failure on the forums got me tossed and my post deleted. And now the old forums are gone. (When I explained it back then I was much more encouraging.)

        Seriously, this thing was super hyped at release, but if you google them or look their crap up on YouTube, they almost completely died within weeks of release. The forum has 2-3 active devs.

        So sad to see such a cool toy ruined by such a stupid, stupid, obvious business decision. :( That, and proving that Ballmer is smarter than you has to be embarrassing. DEVELOPERS!
        • by RattFink ( 93631 )

          Correct me if I'm wrong but isn't this what you are looking for?

          http://www.emotiv.com/apps/sdk/179/ [emotiv.com]

          $500 may not be cheap but it's a lot more reachable for the hobbyist than $10k.

          • Still biting the hand that feeds them. The choice to charge $500 for it was suicide, simple as that. Charging 10k may have been bad enough that they would sell ZERO copies. Which would have been sort of embarrassing.
          • The developer edition doesn't include raw EEG data. Just their blackbox interpretation of those signals.
        • So, load up some USB monitoring software and make your own open-source SDK.
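
          For anyone tempted to try that, a hypothetical starting point with the Python hidapi bindings (the "hidapi" package) is sketched below: enumerate HID devices, then dump raw reports. The vendor/product IDs are placeholders, not Emotiv's real ones, and nothing here decodes the actual protocol, which may well be encrypted or obfuscated.

              # Hypothetical HID snooping sketch; IDs below are placeholders.
              import hid

              # List attached HID devices so you can find your headset's IDs.
              for info in hid.enumerate():
                  print(hex(info["vendor_id"]), hex(info["product_id"]), info["product_string"])

              VENDOR_ID = 0x0000   # placeholder: replace with the value enumerate() shows
              PRODUCT_ID = 0x0000  # placeholder

              dev = hid.device()
              dev.open(VENDOR_ID, PRODUCT_ID)
              dev.set_nonblocking(True)
              try:
                  for _ in range(100):
                      report = dev.read(64)      # raw report bytes, if any are pending
                      if report:
                          print(bytes(report).hex())
              finally:
                  dev.close()
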
        • by Santzes ( 756183 )
          Wow. Doing some face recognition and other image recognition stuff lately, I left this article open for the night as I was probably going to spend some time with it later. Not gonna happen now.
      • Not entirely correct either. The EnterprisePlus edition with raw EEG output is $7,500. The research $750 version is for individuals, research institutions and companies with turnover less than $100,000.

        The other thing that isn't clear to me is: if you develop an application using the raw EEG output, do you need other $7,500 headsets to use that application, or can you actually use it with the cheaper consumer headsets? If we develop something on our $7,500 headset and then want to implement it globally within our organizatio

        • I was trying to post in a hurry - the above is supposed to read "turnover less than $100,000"
  • What was his reaction to goatse videos? Actually, I think I'd rather not know. I'm sure such an experiment with a normal person would push the Emotiv beyond its capabilities.

  • So let me get this straight: you hook this up, train it, and then in the future you can hook it up again and it will tell you how you feel. Because at some time in the future you won't know what you are feeling, so you have to ask the computer about what you just experienced.

    I understand that there is a market for this in testing products, say video games, but who else would use this stuff? I just don't see this as being very commonly used. I would guess it is about as useful as a "lie detector", which doe

    • There's a BIG market for this technology, and it's only going to grow. It's already being used in neuro-marketing (market research), I'm personally looking to apply it to usability, and down the line context-sensitive affective interactions are going to play a big part in how you interact with software (think of a computer that can tell it gave you an unsatisfactory answer by the tone of your voice and thus does another search in the background to try and improve the results).

      I'm personally still sceptical regar

    • For this "EmoRate" program that is true, but for the device itself, not so much. Using their app, "EmoKey", you can bind neurological impulses (I think they really doomed their product by saying it was based on emotions) to key combinations. One application of this technology is simulating a forward feeling, backwards, sideways, etc. (I know I for one can send fake impulses) and binding it to a WASD setup. Setting it up like this would actually help you game faster as it removes the physical lag. I know on their sit
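
      To make that EmoKey-style binding concrete, here is a hypothetical sketch that maps detected command labels to WASD keypresses using the pynput library. The detector is stubbed out; the command names, the bindings, and the on_command hook are assumptions made up for the example, not Emotiv's or EmoKey's actual API.

          # Hypothetical EmoKey-style binding: detected "commands" -> keypresses.
          from pynput.keyboard import Controller

          BINDINGS = {"push": "w", "pull": "s", "left": "a", "right": "d"}  # WASD mapping
          keyboard = Controller()

          def on_command(label):
              """Call this whenever the (stubbed) headset reader reports a command label."""
              key = BINDINGS.get(label)
              if key:
                  keyboard.press(key)
                  keyboard.release(key)

          # Example: pretend the headset just reported a "push" (move forward) impulse.
          on_command("push")
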
  • by Kitkoan ( 1719118 ) on Thursday August 26, 2010 @06:38PM (#33387750)
    The test video he's watching is Sintel [blender.org], a free, open-source CG movie that is soon to be finished.
    • by blair1q ( 305137 )

      If it's open source, can I make a fork of it where Han shoots first?

      • Re: (Score:3, Informative)

        by Kitkoan ( 1719118 )
        Sure, here is the download link for their previous movie, Big Buck Bunny [bigbuckbunny.org], where you can download the movie in multiple formats and video sizes, and at the bottom is the entire studio backup (over 200 GB) containing every part of the movie made and used.
  • When I'm watching a video alone, I don't usually have facial expressions, unless something is insanely funny, or I've got into the scotch.

    • An EEG doesn't read facial expressions; its input is brain activity, which the program then translates into emotions. The facial expressions were just an exaggeration so it would look good on camera.
      • However, it has been shown that your facial expression does affect your emotions. So if he was making those facial expressions intentionally during the actual test, it may well have affected the results.

      • Re: (Score:3, Interesting)

        by blair1q ( 305137 )

        Don't complain to me. That's what the summary said.

        As for EEG, I wonder what mine looks like when I'm playing Poker.

        I bet it's not too readable. I'm pretty good; mechanistic even when I'm bluffing.

  • Who wants to bet the porn industry is the first to monetize this?
  • That's awesome. This would become very powerful once these results are amalgamated with those of millions of other viewers. It would also be a very effective way of improving search results: rather than simply clicking the close button, there could be feedback to say the results sucked, and, combined with other signals, exactly which results sucked. A rough sketch of that feedback loop follows this comment.

    I remember seeing a documentary about this technology on Beyond 2000 years ago. It's great to see that it has made it into the consumer world.
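
    As a purely hypothetical illustration of that feedback loop, the snippet below aggregates per-result reaction scores from many viewers and reranks results accordingly. The score range, data structures, and function names are all assumptions made up for the example.

        # Hypothetical emotion-weighted relevance feedback for search results.
        from collections import defaultdict
        from statistics import mean

        feedback = defaultdict(list)   # result_id -> list of reaction scores in [-1, 1]

        def record_reaction(result_id, score):
            feedback[result_id].append(score)

        def rerank(result_ids):
            """Order results by average viewer reaction; unrated results stay neutral."""
            return sorted(result_ids,
                          key=lambda r: mean(feedback[r]) if feedback[r] else 0.0,
                          reverse=True)

        record_reaction("video_a", -0.8)   # viewers hated it
        record_reaction("video_b", 0.6)
        print(rerank(["video_a", "video_b", "video_c"]))  # -> ['video_b', 'video_c', 'video_a']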

  • by Anonymous Coward

    Since you enjoyed the video "2girls1cup", EmoRate thinks you might also enjoy...

  • Seriously, the first time you hear a joke is much more effective than subsequent times. So how would you be able to find jokes then?
  • Now I have the perfect tool to fine tune my propaganda and advertising!
  • by flimflammer ( 956759 ) on Thursday August 26, 2010 @07:35PM (#33388126)

    I can't say I see the benefit to this sort of system. My facial expression rarely changes throughout movies, unless I laugh over something funny or flinch due to a movie trying to scare me with a loud noise. I'm hardly alone; I showed the video to a few people and they had similar concerns.

    I can't see myself giving off a heartwarming smile when I see something happy or frowning when I'm sad. At that point it seems like I'm merely trying to appease the technology to make it work, instead of just doing my natural thing and it picking up on that.

    • I can't see myself giving off a heartwarming smile when I see something happy or frowning when I'm sad.

      Well that's too bad. Where's the fun in watching a movie if you can't get lost enough in it to actually feel something? I mean, sure, it's fiction. That doesn't mean you can't let yourself empathize with the characters, or smile at their triumphs, or beetle your brow at one of their more perplexing decisions. Don't get me wrong, movies are not a substitute for real life, but being able to watch a performance and not feel anything is ... well ... sad.

      I hope you can open yourself up a bit someday, for y

    • The device is an EEG reader; it does not read facial expressions! I don't understand why the submitter had to mention facial expressions just to confuse poor slashdotters. Anyway, the applications you can develop with a consumer-priced and accurate real-time EEG reader are boundless. Marketers could use it to find out which commercials are the funniest, all psychology programs at all universities could use it for a limitless number of experiments, you could use it for your porn collection or music collection to

      • There have been a couple of products like this out there on the market, with varying numbers and locations of sensors. Some of them are doing EEG type detection to try to see what your brain is doing, while others are mostly sensing facial muscles. It's hard to keep track of which products are which, especially when they're initially marketed toward gamers because they think that's a potential market.

        If this is the one doing actual brain behaviour detection, and if the SDK weren't so expensive*, it'd be

    • My facial expression rarely changes throughout movies[...].

      I would bet that you give off more readable facial expressions than you realize...

  • ....make the software explode.

  • by Anonymous Coward

    while watching the demonstration video. The result is: 'meh'

  • I see a great market in helping guys understand women. The computer could watch her face and text him, "She's bored. Enough football talk." "She's getting upset. Stop talking about her mother." "Uh-oh. She didn't REALLY want to know if she looks fat." Could revolutionize relationships. Pam http://www.thebrewmag.com/ [thebrewmag.com]
  • Pavlov's Orgasm Recall Navigation.

    -

  • Finally something I can use to easily sort my porn.
