
MIT Develops Holographic, Glasses-Free 3D TV

MrSeb writes "Engineers at the Massachusetts Institute of Technology (MIT) are busy working on a type of 3D display capable of presenting a 3D image without eye gear. What you've been presented with at your local cinema (with 3D glasses) or on your Nintendo 3DS console (with your naked eye) pales in comparison to what these guys and gals are trying to develop: a truly immersive 3D experience, not unlike a hologram, that changes perspective as you move around. The project is called High Rank 3D (HR3D). To begin with, HR3D involved a sandwich of two LCD displays and advanced algorithms for generating top and bottom images that change with varying perspectives. With literally hundreds of perspectives needed to accommodate a moving viewer, maintaining a realistic 3D illusion would require a display with a 1,000Hz refresh rate. To get around this issue, the MIT team introduced a third LCD screen to the mix. This third layer brings the refresh rate requirement down to a much more manageable 360Hz — almost within range of commercially produced LCD panels."
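The refresh rates in the summary come down to time multiplexing: the panels cycle through several mask patterns per 3D content frame, so the panel rate divided by the content frame rate gives the number of mask sub-frames available. A rough illustrative calculation (the 30fps content rate is an assumption, not from the article):

```python
# Illustrative time-multiplexing budget for a layered 3D display.
# The content frame rate below is an assumption, not from the article.
content_fps = 30  # assumed target rate for the 3D content

for panel_hz in (1000, 360):
    subframes = panel_hz // content_fps
    print(f"{panel_hz} Hz panel -> {subframes} mask sub-frames per 3D frame")
```

The point of the third layer is that far fewer sub-frames are needed to cover the same range of viewing angles, which is what pulls the requirement down from 1,000Hz to 360Hz.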
  • by CyberVenom ( 697959 ) on Thursday July 12, 2012 @09:34PM (#40634589)

    The article at the first link gives a better explanation than the second.

    This is not quite a hologram, but it is a true multi-viewer solution with no need for head tracking or other dynamic tricks. It is a precomputed video stream displayed on a stack of precisely spaced, otherwise normal LCD panels running at a somewhat higher refresh rate than your living-room TV.

    Basically, the MIT guys have come up with algorithms to compute a set of three overlay transparencies, which selectively occlude or reveal certain pixels when viewed from certain angles due to parallax, such that one of many possible perspective images of a scene is produced depending on the angle from which this stack of overlays is viewed.

    The part they seem most proud of is that, because these different perspective views are all of the same scene, many of the pixels are the same color from one perspective to the next. The parallax trick therefore only has to make a select few pixels vary with angle, reducing the complexity of the problem to the point where it can actually be realized with consumer-resolution LCD panels and attainable data rates.
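The occlusion-by-parallax idea above can be sketched with a toy 1D model: two stacked transmissive layers, where the image seen from a given angle is the per-pixel product of the front layer and a parallax-shifted back layer. All names and numbers here are illustrative, not from the paper:

```python
# Toy model of viewing a two-layer transmissive stack from different
# angles: the back layer appears shifted relative to the front layer
# (parallax), and light passes through where both layers are open.
import numpy as np

def view(front, back, shift):
    """Perceived 1D image when the back layer is shifted by `shift`
    pixels relative to the front layer due to viewing angle."""
    return front * np.roll(back, shift)

# Transmittance of each layer: 1.0 = fully open, 0.0 = fully opaque.
front = np.array([1.0, 0.5, 1.0, 0.0, 1.0])
back  = np.array([0.0, 1.0, 1.0, 1.0, 0.5])

straight_on = view(front, back, 0)   # viewer directly in front
from_left   = view(front, back, 1)   # viewer off to one side

# The same two physical layers yield different images at different angles.
print(straight_on)
print(from_left)
```

The real system optimizes the layer patterns (and time-multiplexes several of them per frame) so that each viewing angle reconstructs the correct perspective of the scene; this sketch only shows why a fixed stack produces angle-dependent images at all.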

  • Waste of time. (Score:0, Informative)

    by Anonymous Coward on Thursday July 12, 2012 @10:11PM (#40634855)

    Today Viacom has induced me to swear off all the cheesy garbage TV of theirs I'd been watching on the net. I realize now what a colossal waste of my time that was, that could be better used doing other things.

    The advancement of 3D viewing technology is, to my mind, completely pointless. The quality of what passes for television nowadays, and most movies too, is utter dogshit, with so many annoying, offensive, stupid, repetitious commercials that an hour-long show is not much longer in reality than a HALF hour. Then I am expected to put up with all the ham-handed product placements (mostly Apple) that I don't even want to watch anymore. So spending all this time and money and effort trying to make it possible to watch something where what you see depends on where you sit is pointless, polishing a fat, stinking turd. They should focus their efforts on the economic model behind our visual entertainments, rather than worrying about the delivery medium, and come up with a way to have an hour-long show that actually takes up 50+ minutes with interesting, valuable content, instead of this throw-away Shitivision we have now. I'm done. I'm cutting the cord, AND I'm not watching this garbage online anymore either.

    Fuck Viacom, fuck television, I'm going to go READ. You should all join me too, send Viacom a big fat bird, say "suck a hairy syphilitic cock, you fucking pieces of greedy shit, you've lost me as a customer, drop dead you scum!" Everybody pick up a book, and let your brain come back to life!

  • by zalas ( 682627 ) on Thursday July 12, 2012 @10:31PM (#40634987) Homepage

    Oh interesting, so they finally gave it a name. I remember coming across the 2-layer version of the display some time ago. Looks like they also have an interesting theoretical foundation to go with it; the abstract of the first paper on Gordon Wetzstein's page [mit.edu] gives a nice overview.

    What essentially is going on is that you can model light (at least when talking about things much larger than the wavelength of light) as a four-dimensional function (i.e. the intensity of light along all the possible rays that fill space), which is referred to in this research area as a "light field." Putting a mask somewhere in space masks out a 2D extrusion of the mask shape in 4D space. Putting multiple masks at different planes masks out the product of these 2D extrusions (and the extrusion angle varies as a function of depth). Hence, what they are doing is attempting to reconstruct the original 4D function by combining the unmasked portions across time frames.

    For a more simplified view, you can think of this as trying to create a 2D picture through a sequence of special single-color 2D pictures created by placing stripe patterns oriented at a fixed set of angles on top of a light panel.

    If you've taken linear algebra, it is somewhat like decomposing a matrix into a sum of rank-one matrices, except here each component needs to be positive (masks cannot create "negative" light).
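The nonnegative rank decomposition described above is essentially what nonnegative matrix factorization (NMF) does, so a standard NMF sketch illustrates the idea. This uses the classic Lee-Seung multiplicative updates on a random stand-in matrix, purely as an analogy to the attenuation-layer factorization, not as the paper's actual algorithm:

```python
# Sketch: approximate a nonnegative matrix L as a sum of `rank`
# nonnegative rank-one terms (W @ H), using Lee-Seung multiplicative
# updates. Analogous to factoring a light field into mask layers,
# where masks cannot create "negative" light.
import numpy as np

rng = np.random.default_rng(0)
L = rng.random((8, 8))          # stand-in for the target light-field data
rank = 3
eps = 1e-9                      # avoids division by zero in the updates

W = rng.random((8, rank))
H = rng.random((rank, 8))

for _ in range(500):
    # Multiplicative updates keep W and H nonnegative by construction.
    H *= (W.T @ L) / (W.T @ W @ H + eps)
    W *= (L @ H.T) / (W @ H @ H.T + eps)

approx = W @ H                  # sum of `rank` nonnegative rank-one terms
print(np.linalg.norm(L - approx))
```

A plain SVD would give the best rank-`rank` approximation but with negative entries, which physical masks cannot realize; the nonnegativity constraint is exactly what makes the display problem harder than ordinary low-rank approximation.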

  • THEY'VE DONE IT (Score:5, Informative)

    by wisebabo ( 638845 ) on Thursday July 12, 2012 @11:26PM (#40635325) Journal

    This is really a significant breakthrough. I mean good looking, glasses free 3D (please look at the video) which means MULTIPLE SIMULTANEOUS VIEWERS using CHEAP components. The only difficulty is the compute power requirement is a little high but that's nothing that won't be solved quickly thanks to Dr. Moore. (I think they are also able to use GPUs so massive cheap parallelism can overwhelm the problem).

    A previous poster brought up the good point that it wasn't clear if the scene was pre-rendered. If/when it can be done on the fly (just a matter of CPU power), think of the applications. CAD, GAMES!

    In 10 years (or less hopefully) we should have really large (80") true 3D displays that a bunch of people can stand around and touch (like what those guys in Perceptive Pixel, recently bought by Microsoft*, do). Talk about science fiction.

    I actually submitted this story a day or two ago but I didn't understand how it worked (and still really don't get it, the math is beyond me). Anyway I'm glad it's getting the attention it deserves.

    *Let's hope that Microsoft doesn't kill it, or use the patents it acquired to block progress.

  • by Exrio ( 2646817 ) on Friday July 13, 2012 @03:26AM (#40636455)

    The camera that films video for this display is a light-field camera: https://en.wikipedia.org/wiki/Light-field_camera [wikipedia.org]

    Surprisingly, they're already being sold to mere mortals, but those are early models that are not mature enough for video production (the Lytro is for consumers but can only take pictures; the Raytrix can take video but is for industrial applications).

    In the meantime while these cameras mature, any way you can turn imagery into 3D models is fair game, maybe a wide-angle high resolution Kinect, or interpolation from two normal cameras (it's a bit more complex than interpolation but you get the idea), or mere image recognition a la gimmicky 2D-to-3D conversion, etc.

  • Re:Just one viewer? (Score:5, Informative)

    by White Flame ( 1074973 ) on Friday July 13, 2012 @05:04AM (#40636823)

    No, this effectively broadcasts many views of the image through the entire range. Any viewer at any valid angle within the field of view should see a properly tracked perspective.
