
HP To Package Leap Motion Sensor Into — Not Just With — Some Devices

Posted by timothy
from the unhand-me-sir dept.
cylonlover writes "It hasn't even been released yet, but the Leap Motion could already be considered something of a success, at least with PC manufacturers. Following in the footsteps of Asus, which announced in January that it would bundle the 3D motion controller with some of its PCs, the world's biggest PC manufacturer has joined the gesture-control party. But HP has gone one step further, promising to build the Leap Motion technology into some future HP devices." (See this video for scenes of users scrabbling with their hands in empty air, and get ready for more of it.)
  • by pellik (193063) on Thursday April 18, 2013 @09:15AM (#43482297)
    While I appreciate what was said in the video, I can't imagine that I'd ever prefer Leap Motion, or even touch screens, over a mouse. It all boils down to the physical exertion of lifting my arm to perform input vs. resting my arm on a desk and lightly moving my wrist. I can overcome unintuitive input by learning to use the system, but I will never overcome a more physically exertive system being more exertive.

    I recall hearing that chefs learn to manipulate a whisk with their wrist because the smaller muscles are less tiring in the long run, even though it's more natural to want to use your shoulder and arm to perform the action. This seems analogous to these new input styles.
    • by Kildjean (871084)

      Have you thought that using this in combination with a mouse would give you more liberty during a presentation or general computer usage? I wouldn't use this alone without a mouse, but I think it works better than the Kinect does, and it will be integrated into laptops. That is big.

      • by gl4ss (559668)

        Have you thought that using this in combination with a mouse would give you more liberty during a presentation or general computer usage? I wouldn't use this alone without a mouse, but I think it works better than the Kinect does, and it will be integrated into laptops. That is big.

        Well, taking into consideration that the range for this isn't that great (which is the tradeoff compared to the Kinect), I doubt he would be using this on stage standing up.

        what it does give is depth, which would be useful for some tasks. but lying on the sofa.. I'll still prefer kb + mouse for this.

        • "what it does give is depth, which would be useful for some tasks. but lying on the sofa.. I'll still prefer kb + mouse for this."

          Jesus Christ man! Your mom has to sit on that sofa when you are done. Can't you at least take your laptop down to the damn basement?

    • by Sockatume (732728)

      It's assumed that it's going to be a supplementary input device, and not the main one. There are times I wish I had a more intuitive way of rotating 3D objects, for example.

    • by gregor-e (136142)
      It also doesn't solve the problem of switching between keyboard and mouse. If Leap could implement human-interface software that somehow seamlessly integrated a chording keyboard with their positional interface, and if Herman Miller created a workstation chair with arm supports that could suspend the user's arms weightlessly in front of them, this could offer some big advantages.
    • "I recall hearing that chefs learn to manipulate a whisk with their wrist because the smaller muscles are less tiring in the long run, even though it's more natural to want to use your shoulder and arm to perform the action."

      You heard wrong. Operating a whisk is like playing the drums. Using the wrist allows much faster action and more subtle control. It may also be true that it is less tiring, but that is incidental, rather than the driving reason.

    • I worked on the Xbox Kinect game Your Shape 2012. The menu system in that game was the single best use of the Kinect I've ever seen. At first, you make these huge gestures with your hands and arms, swiping really obviously, pushing the buttons with big strokes.

      But after you get used to it, you sort of lift your hand and twitch your fingers to flip through menus. Pushing a button means pushing your palm forward ever so slightly. It became a really good way to move through on-screen menus without reaching for

    • While I appreciate what was said in the video, I can't imagine that I'd ever prefer Leap Motion, or even touch screens, over a mouse. It all boils down to the physical exertion of lifting my arm to perform input vs. resting my arm on a desk and lightly moving my wrist. I can overcome unintuitive input by learning to use the system, but I will never overcome a more physically exertive system being more exertive.

      I recall hearing that chefs learn to manipulate a whisk with their wrist because the smaller muscles are less tiring in the long run, even though it's more natural to want to use your shoulder and arm to perform the action. This seems analogous to these new input styles.

      I agree with you for desktop work. But other situations would benefit from Leap. If I'm cooking in a kitchen, looking something up, like a recipe, would be much easier using Leap. Or the usual living-room entertainment PC: I don't want to have to track down a mouse and find a surface to use it on. A few simple air gestures to select a movie is all I need. Or conference presentations: gestures to highlight areas or move to the next slide. Etc., etc.

      Couple this with google glass and you can use your gesture

    • While I appreciate what was said in the video, I can't imagine that I'd ever prefer Leap Motion, or even touch screens, over a mouse. It all boils down to the physical exertion of lifting my arm to perform input vs. resting my arm on a desk and lightly moving my wrist. I can overcome unintuitive input by learning to use the system, but I will never overcome a more physically exertive system being more exertive. I recall hearing that chefs learn to manipulate a whisk with their wrist because the smaller muscles are less tiring in the long run, even though it's more natural to want to use your shoulder and arm to perform the action. This seems analogous to these new input styles.

      Precisely. Well said.

      I will give you a tip. Buy a "low-profile" mouse instead of those big-humped things that come with most machines. Your forearm will then be able to rest on the desk. Thus, instead of using your wrist muscles, you will only need to use your finger muscles to mouse. That is, it will be even less physically exertive.

      If you do this, you will eventually feel a need to turn the sensitivity of the mouse up, due to the decreased effort, and will end up never really using your wrist

      • by Sir Holo (531007)
        Sorry to double-post...

        These days, I have given up my mouse almost entirely in favor of a trackpad.

        I now use only two fingers to interact with all the usual types of software. Additionally, I am now quite fast with Photoshop, Google SketchUp, PowerPoint (ugh), various games, and advanced scientific software like LabVIEW (a graphical programming language). All with two fingers. The wrist is only involved in moving my hand from the keyboard to the very close-by trackpad (well, really it's more of
    • It all boils down to the physical exertion of lifting my arm to perform input vs resting my arm on a desk and lightly moving my wrist.

      You shouldn't have to. Considering the precision of Leap Motion, it could behave like sort of a 3D touchpad that you mainly operate by moving your finger(s) slightly (although large arm based motions should be easy to support simultaneously). It should even be possible to put a Leap Motion unit in a monitor and then interpret the movements of your hand on your desk as if you were moving a mouse.

      The awesomeness of the sensing technology simply cannot be contested. The challenges lie in where to physically pu
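      A minimal sketch of that "3D touchpad" idea, purely as an assumption of how it might work: the names, thresholds, and the flat linear sensor-to-screen mapping below are all illustrative, not the actual Leap SDK.

```python
# Hypothetical sketch: map a fingertip position reported by a Leap-style
# sensor (x/y in millimetres, sensor at the origin) onto screen pixels,
# and treat a height threshold as "contact". All constants are assumptions.

TOUCH_HEIGHT_MM = 120       # below this height, treat the finger as "down"
SENSOR_RANGE_MM = 200       # usable half-width of the interaction box
SCREEN_W, SCREEN_H = 1920, 1080

def finger_to_cursor(x_mm, y_mm):
    """Linearly map sensor x/y in [-range, +range] to pixel coordinates."""
    u = (x_mm + SENSOR_RANGE_MM) / (2 * SENSOR_RANGE_MM)   # normalise to 0..1
    v = (y_mm + SENSOR_RANGE_MM) / (2 * SENSOR_RANGE_MM)
    px = min(max(u, 0.0), 1.0) * (SCREEN_W - 1)            # clamp to screen
    py = min(max(v, 0.0), 1.0) * (SCREEN_H - 1)
    return round(px), round(py)

def is_touching(height_mm):
    """A simple z-threshold stands in for the 'contact' decision."""
    return height_mm < TOUCH_HEIGHT_MM
```

      A real implementation would need smoothing and hysteresis around the threshold, but the mapping itself is just this kind of linear transform.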

  • With all the fancy touch enabled UIs floating around on tablets and phones, the PC can finally make a similar evolutionary advancement. Touch screens on the desktop or a laptop are annoying to reach for, but a device sitting in the keyboard is even closer at hand than a mouse. Windows 8 doesn't seem so silly now.
  • If this sort of thing were built into touch-screen devices, when the screen is touched, it would be possible for the device to identify exactly which finger was responsible for the touch. This could considerably increase the versatility of using a touch-screen as an input medium.

    Also, you could get hover-detection practically for free.

    • by h4rr4r (612664)

      Will it magically get rid of fingerprints too?
      How about solving the low accuracy of touch issue?

      I can understand touch when you don't have better controls available, but at a PC you have much better controls right there.

      • by mark-t (151149)

        Fingerprints can be dealt with by regularly wiping the screen. If you wipe down a touch screen just once a day, it will make a huge difference.

        Touch screens involve the very natural gesture of pointing... a communication mechanism that human beings learn to use even before they've learned to talk. It's admittedly imprecise, but not every type of application requires any more precision than that. Conversely, however, some types of application *DO* require more precision than that, and it's a grievous u

    • by EdZ (755139)

      it would be possible for the device to identify exactly which finger was responsible for the touch

      Unfortunately, the Leap Motion cannot do this. Or at least, if this functionality exists, it is not available to developers working with the Leap. You don't get any sort of point cloud, or even raw camera data; all you get is a series of vectors where the Leap has detected linear highlights (deduced, using stereo cameras, to be cylindrical objects) and the positions where it has detected them. You know where fingers are, but not which fingers are which.

      • by mark-t (151149)
        You can deduce which fingers are which by counting the number of fingers to the left and right of the one that touched the screen, and assuming a particular handedness (which could be provided to the program as a user profile setting, for instance).
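        A rough sketch of that counting heuristic, assuming all five fingertips are visible and reported only as anonymous x-positions (every name here is hypothetical; as noted elsewhere in the thread, the real device reports positions without identity):

```python
# Toy version of the counting heuristic: name a finger by how many detected
# fingertips lie to its left, given a known handedness. Assumes all five
# fingers are visible, which (as the replies note) is the weak point.

RIGHT_HAND_ORDER = ["thumb", "index", "middle", "ring", "pinky"]

def identify_finger(finger_xs, touch_x, handedness="right"):
    """finger_xs: x-coordinates of every visible fingertip; touch_x: the one that touched."""
    if len(finger_xs) != 5:
        raise ValueError("heuristic needs all five fingers visible")
    left_to_right = sorted(finger_xs)
    rank = left_to_right.index(touch_x)   # number of fingers left of the touch
    if handedness == "left":              # a left hand is simply mirrored
        rank = 4 - rank
    return RIGHT_HAND_ORDER[rank]
```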
        • by EdZ (755139)
          Only if all fingers are currently present and visible, or all have been present (and externally identified, either using manual calibration or an additional camera and some sort of classifier) at once and the remaining fingers have been continuously visible since (no occlusions), and the hand has not changed orientation. Forming a fist and then extending a single finger would prevent identification.
          • Only if all fingers are currently present and visible, or all have been present (and externally identified, either using manual calibration or an additional camera and some sort of classifier) at once and the remaining fingers have been continuously visible since (no occlusions), and the hand has not changed orientation. Forming a fist and then extending a single finger would prevent identification.

            ...and we could all guess which finger that would be.

          • by mark-t (151149)
            If your hand is close enough to the device for any finger to touch the screen, given the actual range that the device detects on, how would any fingers be not within the field of view of such a sensor?
          • "Forming a fist and then extending a single finger" is not a very good gesture, so that is not a major concern.

            A good variety of user interfaces can be developed without exact identification of all fingers in all possible positions. Identifying a finger on a touchscreen can be done if that finger is the thumb, in a natural resting position; then, the other fingers can be identified from their relative distance.

            This in particular allows for chording gestures, the ones used for touch-typing and that could be used for ot

      • "You know where fingers are, but not which fingers are which"

          I know exactly what you mean, but surely you can use heuristics for this, at least some of the time?

        Now I'm curious about how reliably this is able to detect and track a pinch gesture....

    • You don't need 3D space recognition to identify which finger is being used - it can be done from their size and relative positions for a good deal of versatility.

      • by mark-t (151149)
        Relative position to what, exactly? The other fingers aren't touching the screen.
        • Relative to the thumb, which can be recognized on its own. The other fingers will touch the screen later at some point after the thumb; all fingers have a fixed position and distance from it, so you can identify each finger after calibrating for hand size.

          If you add the temporal dimension, you can recognize a variety of chords and multi-touch positions. Sure, it's not perfect tracking of all fingers the all time, but you don't need that to recognize a high number of hand positions, enough to provide a vari
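          A toy sketch of that thumb-anchored scheme, with made-up coordinates: record each finger's offset from the thumb once during a calibration step, then label later touches by the nearest stored offset.

```python
# Illustrative sketch (hypothetical names and distances): calibrate finger
# offsets relative to the thumb, then classify a later touch by whichever
# calibrated offset it lands closest to.

import math

def calibrate(thumb_pos, finger_positions):
    """finger_positions: {"index": (x, y), ...} captured while the hand rests naturally."""
    tx, ty = thumb_pos
    return {name: (x - tx, y - ty) for name, (x, y) in finger_positions.items()}

def classify_touch(thumb_pos, touch_pos, offsets):
    """Label a touch by the nearest calibrated thumb-relative offset."""
    tx, ty = thumb_pos
    dx, dy = touch_pos[0] - tx, touch_pos[1] - ty
    return min(offsets, key=lambda n: math.hypot(dx - offsets[n][0],
                                                 dy - offsets[n][1]))
```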

  • While the press release for Leap doesn't go into specifics, I can only imagine how badly HP will screw this up, like they've done with their printer drivers. They haven't bothered to give you real drivers, just some generic universal drivers which, pointedly, suck. Even for their newest printers, you get only a Universal driver. I guess they need to pay all those executives who keep halving the stock price every few years the big bonuses they've come to expect, rather than invest in producing the software.

    No

    • by vswee (2040690)
      I actually wouldn't mind waving at my printer to perform tasks from across the room, maybe. As for computers, I can only see it really being useful for giving presentations or displaying 3D models or something of that sort, and even then it's slightly gimmicky.
  • by swschrad (312009) on Thursday April 18, 2013 @09:25AM (#43482403) Homepage Journal

    interpret THIS, buddy

  • Tactile feedback (Score:5, Insightful)

    by PSVMOrnot (885854) on Thursday April 18, 2013 @09:50AM (#43482657)

    People keep coming up with these nice shiny user interface devices, but they always seem to forget how important tactile feedback is.

    Sure I can type on a touchscreen keyboard, but it takes twice as long, because I have to actually look at the screen and check that a) it has noticed I am typing, and b) it has correctly recognised what I had intended to type. With a proper physical keyboard I can pick up such information purely by proprioception, audio and tactile feedback.

    The same sort of issue applies with any sort of hand waving interface: there is a much greater potential for the computer getting it wrong, and it takes longer to recognise & fix it when it occurs.

    Until these things can be made as reliable as a physical push button, I think people should be a lot more careful about where and for what they use them.

    • You're doing it wrong. They have this technology called Swype now. Before Swype you would be correct, but now the issue is that you don't know how to properly type (i.e. swype) on a smartphone screen.
      • by PSVMOrnot (885854)

        For basic typing, which is only one specific case, Swype comes close to being suitable. However, Swype is still limited. It guesses what you are typing based on a weighted dictionary of common words. While it may be fairly accurate, it is still only a guess. Add to that, it will not handle uncommon words or symbols well.

        In other words: programming on one of those is a pain. Accurately entering lots of numbers is a pain. Playing Doom would be a pain.

        To sum it up: swype may be good for inputting
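        A toy version of that weighted-dictionary guessing, with a made-up word list: keep only words whose letters appear, in order, along the swiped key path, then pick the most frequent. (Real Swype-style engines also model path geometry; this only shows why the output is a guess.)

```python
# Hypothetical mini word list with invented frequency weights.
FREQ = {"hello": 9000, "hole": 4000, "ho": 200, "help": 7000}

def matches_path(word, path):
    """True if the word's letters occur in order within the swiped letter path."""
    it = iter(path)
    return all(ch in it for ch in word)   # `in` consumes the iterator: subsequence check

def guess(path):
    """Return the highest-weighted dictionary word consistent with the path."""
    candidates = [w for w in FREQ if matches_path(w, path)]
    return max(candidates, key=FREQ.get) if candidates else None
```

        Note how "hole" also fits many paths that produce "hello"; the frequency weight, not the path, breaks the tie, which is exactly why uncommon words lose.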

      • by dkf (304284)

        You're doing it wrong. They have this technology called Swype now.

        How does that work when writing Perl? Heck, how does it work even with something as verbose as Java? (No, having to go back to using COBOL would not be a step forward!)

        • You use an IDE designed specifically to work with it. Since those haven't been invented yet, I cannot give you the specifics. However, the discussion is about typical use cases. Programmers and people here in general tend to forget that they are not the typical use case.
    • by chispito (1870390)
      Seems like it would be better than a keyboard for, say, learning sign language.
      • by PSVMOrnot (885854)

        Seems like it would be better than a keyboard for, say, learning sign language.

        Yes! That! The right technology in the right place, rather than just because it's shiny.

    • by antdude (79039)

      I love them, but apparently females seem to hate them [slashdot.org]. :(

  • Had this news come from Apple, Razer, Samsung, Huawei, or any other tech manufacturer, I'd be fairly excited.

    But this is fucking HP, for crying out loud. The same HP that lost its lead in desktop PC sales, had serious QC issues with its notebooks recently, botched what was to be the biggest merger in the PC industry, drove its lucrative digital camera business into the ground, and is on the fast track to demolish its lead in the undisputed cash cow, printers, as well.

    Should we really give them another chan

    • Should we really give them another chance on a new product, when there are alternatives??

      Depends on the alternatives, and intended use. While HP has made some missteps, I'd still put their hardware over most of their competitors', and right alongside the rest. The Leap Motion *could* be a great product for artists. I'm wondering how it could integrate with Autodesk's SketchBook, Google's SketchUp, ZBrush, 3ds Max, etc. This plus a Wacom tablet could offer quite a few methods of creative expression. In time we will see how widely it is supported, and that will be the difference-maker.

      I'm pict

      • by Misagon (1135)

        It can't match 1:1 to the screen. There is no eye tracking, which would be required for proper eye-hand-screen coordination.
        Sony managed to be first to patent a combo with eye tracking, however.

        • It can't match 1:1 to the screen. There is no eye tracking, which would be required for proper eye-hand-screen coordination. Sony managed to be first to patent a combo with eye tracking, however.

          Of course, I forgot the eye tracking, but with the webcams these things come with, it wouldn't be out of reach...
