


Intelligent Thimble Could Replace the Mouse In 3D Virtual Reality Worlds

New submitter anguyen8 (3736553) writes with news of an interesting experimental spatial input device. From the article: "The mouse is a hugely useful device, but it is also a two-dimensional one. What of the three-dimensional world and the long-standing, but growing, promise of virtual reality? What kind of device will take the place of the mouse when we begin to interact in three dimensions? Anh Nguyen and Amy Banic ... have created an intelligent thimble that can sense its position accurately in three dimensions and respond to a set of pre-programmed gestures that allow the user to interact with objects in a virtual three-dimensional world. ... The result is the 3DTouch, a thimble-like device that sits on the end of a finger, equipped with a 3D accelerometer, a 3D magnetometer, and a 3D gyroscope. That allows the data from each sensor to be compared and combined to produce a far more precise estimate of orientation than any single measurement alone. In addition, the 3DTouch has an optical flow sensor that measures the movement of the device against a two-dimensional surface, exactly like the one inside an ordinary mouse." The prototype is wired up to an Arduino Uno, with a program on the host machine polling the device and converting the data into input events. A video of it in action is below the fold, a pre-print of the research paper is on arXiv, and a series of weblog entries explains some of the development.
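The summary mentions combining accelerometer, magnetometer, and gyroscope readings to get a better orientation estimate than any one sensor gives alone. A common way to do this is a complementary filter: integrate the fast-but-drifting gyro, and continuously nudge the result toward the noisy-but-drift-free angle derived from the accelerometer. The sketch below is a minimal single-axis illustration of that idea; the function name and the `alpha` constant are illustrative assumptions, not the fusion algorithm the 3DTouch authors actually use.

```python
import math

def fuse_orientation(gyro_rate, accel, dt, prev_angle, alpha=0.98):
    """One-axis complementary filter (pitch, in degrees).

    gyro_rate  -- angular rate from the gyroscope, in deg/s
    accel      -- (ax, ay, az) accelerometer reading, in g
    dt         -- time since the last sample, in seconds
    prev_angle -- previous fused angle estimate, in degrees
    alpha      -- trust placed in the gyro; (1 - alpha) goes to the accel
    """
    ax, ay, az = accel
    # Tilt angle implied by gravity: drift-free but noisy under motion.
    accel_angle = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    # Gyro integration: smooth and responsive, but drifts over time.
    gyro_angle = prev_angle + gyro_rate * dt
    # Blend: the accel term slowly pulls the gyro estimate back to truth.
    return alpha * gyro_angle + (1 - alpha) * accel_angle
```

With the device at rest, repeated calls decay any accumulated gyro drift toward the accelerometer's tilt reading; a magnetometer would be blended the same way for heading, where gravity gives no information.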



  • LEAP Motion (Score:5, Interesting)

    by aaronb1138 ( 2035478 ) on Monday July 07, 2014 @09:03PM (#47404285)
LEAP promised similar things. Logically, their technology should work well, but the execution was piss poor. The trick to getting 3D finger interaction to work will be either higher immersion, such as 3D displays proportional to the controller, or Oculus Rift-style implementations where you can see your hand interacting. Another issue LEAP has is defining the horizontal and vertical ground planes. Their controller would work better if it detected and calibrated to your monitor, with activation occurring, in many cases, when you touched the screen.

3D gesture identification and intent management seem to be a stumbling block so far as well. It seems programmers largely figured out the hand's skeletal structure and then immediately ignored that musculature, tendons, and fine motor control are not the same in all positions and directions.

Some example dumb hand/finger gestures for 3D control (I see these in LEAP Motion software and in proposed hand-gesture libraries for similar technology):
  - Triggering a thumb against the side of the index finger - most of the hand moves, especially the index finger (which cursor position is typically keyed off of)
  - Triggering by pulling the index finger like a trigger - surprisingly inconsistent when there is no resistive grip or button
  - Holding a splayed-out hand (or hands) horizontally in mid-air as a default centered position
  - Keying z-rotation off a hand pointed at the screen as if one's arm protruded from the chest
  - Expecting the hand to translate in mid-air like a camera dolly & track
  - Lots of other ergonomically/kinematically ignorant ideas. I think they modeled everything with those articulated wooden hands for clay sculpture. And no arms.
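The false-trigger problems described above (the whole hand shifting during a thumb click, inconsistent trigger pulls with no physical resistance) are usually mitigated in software with hysteresis and a hold requirement: fire only after the trigger signal stays above an on-threshold for several consecutive samples, and re-arm only after it drops below a lower off-threshold. The class below is an illustrative sketch of that pattern, with made-up thresholds; it is not taken from LEAP's SDK or from the 3DTouch paper.

```python
class HysteresisTrigger:
    """Debounced gesture trigger with hysteresis.

    Fires once when a normalized trigger signal (e.g. finger flexion
    in [0, 1]) stays at or above `on_thresh` for `hold` consecutive
    samples; re-arms only after the signal falls to `off_thresh`.
    The gap between thresholds absorbs incidental hand motion.
    """

    def __init__(self, on_thresh=0.8, off_thresh=0.4, hold=3):
        self.on_thresh = on_thresh
        self.off_thresh = off_thresh
        self.hold = hold
        self.count = 0
        self.active = False

    def update(self, signal):
        """Feed one sample; returns True only on the sample that fires."""
        if not self.active:
            # Count consecutive samples above the on-threshold.
            self.count = self.count + 1 if signal >= self.on_thresh else 0
            if self.count >= self.hold:
                self.active = True
                return True
        elif signal <= self.off_thresh:
            # Released: re-arm for the next gesture.
            self.active = False
            self.count = 0
        return False
```

A single noisy spike never fires, a sustained pull fires exactly once, and jitter between the two thresholds neither fires nor releases the trigger.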

    Just some things to consider before creating your own 3D motion controller...
