
Display Makers To Use Quantum Dots For Efficiency and Color Depth

ArmageddonLord writes with this news from the IEEE Spectrum, reporting on display industry gathering Display Week: "Liquid crystal displays dominate today's big, bright world of color TVs. But they're inefficient and don't produce the vibrant, richly hued images of organic light-emitting diode (OLED) screens, which are expensive to make in large sizes. Now, a handful of start-up companies aim to improve the LCD by adding quantum dots, the light-emitting semiconductor nanocrystals that shine pure colors when excited by electric current or light. When integrated into the back of LCD panels, the quantum dots promise to cut power consumption in half while generating 50 percent more colors. Quantum-dot developer Nanosys says an LCD film it developed with 3M is now being tested, and a 17-inch notebook incorporating the technology should be on shelves by year's end."
  • Enjoy your phone in new psychedelic colors!

    -AI

    • This technology is nothing new. It's been used heavily since the sixties to bring out vivid colors in all manner of displays (it's actually even older than traditional color TV displays). Sometimes they refer to the technology as microdots [wikipedia.org]. I'm not sure I need an LSD screen yet, or one that uses a PCB bus instead of a PCI bus.

    • Great! I always thought my phone was missing out on colors.
  • Static images (Score:5, Interesting)

    by AlphaWolf_HK ( 692722 ) on Saturday June 16, 2012 @04:04AM (#40343075)

    Any word on burn-in, permanent image persistence, or uneven aging? That's my main concern with OLED and Plasma.

    LCD can get image persistence if it shows the same image for very long periods of time (e.g. 24 hours), but on most displays it is only temporary.

    I'd be interested to hear if quantum dot might have any of these issues.

    • Since this is BEHIND the LCD, the light passes through it first. It will degrade evenly, and not be affected by the image displayed on the LCD.

  • the light-emitting semiconductor nanocrystals that shine pure colors

    What the hell is a pure color? Something that matches the frequency response of our cones? Fully saturated colors?

    • Re: (Score:2, Informative)

      by Anonymous Coward

      A pure color is light with a narrow spectral bandwidth. It doesn't matter which color, just that there is ONLY that color.

      • by jpapon ( 1877296 )
        What good is light with a narrow spectral bandwidth?? The point of a TV is to make images life-like. Light sources in real life have wide bandwidth, and objects generally reflect relatively large swaths of frequency. It would be a nightmare to produce images using lots of pixels with 1 nm bandwidth... it's much better to just choose 3 or 4 primaries and mix them... but mixing works just fine with wide bandwidth primaries.
        • by Anonymous Coward

          What good is light with a narrow spectral bandwidth?? The point of a TV is to make images life-like. Light sources in real life have wide bandwidth, and objects generally reflect relatively large swaths of frequency. It would be a nightmare to produce images using lots of pixels with 1 nm bandwidth... it's much better to just choose 3 or 4 primaries and mix them... but mixing works just fine with wide bandwidth primaries.

          Good god, the ignorance up in this thread.

          One word: Gamut. [wikipedia.org]

          Mixing works much better with tight primaries. sRGB cannot correctly depict the selective yellow headlights of an old French car, the ubiquitous green LEDs of early '90s electronics, the GaN blue LEDs Shuji Nakamura cursed us with since, nor an LPS streetlight. Not what I'd call "just fine".
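
          To make the gamut point concrete, here is a minimal Python sketch (not from TFA) that tests whether a chromaticity falls inside the sRGB/Rec. 709 triangle; the primaries are the standard CIE xy values, but the LPS-lamp coordinate is an assumed illustrative number:

              # Point-in-triangle test on the CIE 1931 xy chromaticity diagram.
              def sign(p, a, b):
                  return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

              def in_srgb(p):
                  # sRGB/Rec. 709 primary chromaticities (CIE 1931 xy)
                  R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)
                  s1, s2, s3 = sign(p, R, G), sign(p, G, B), sign(p, B, R)
                  return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

              print(in_srgb((0.3127, 0.3290)))  # D65 white point -> True (inside)
              print(in_srgb((0.57, 0.42)))      # assumed xy near a 589 nm LPS lamp -> False (outside)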

    • Fully saturated, i.e. a single wavelength, as a laser produces.

      The color filters of LCD panels let through a narrow but not-single-wavelength bandwidth of color. This restricts the color gamut you can reproduce. As explained in TFA.

    • Single frequency peak. That is a pure colour. When you look at a typical incandescent light, it is a broadband signal spread across the visible range and well into the infrared (hence they're inefficient at lighting a room despite being a very efficient way of converting electrical energy into photons). For an LCD displaying pure red, the peaks actually look rather fat around the red, with minor peaks in the green and blue range as well as the backlight bleeding through the display. These imperfections are what make t
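
      Here is a toy Python model of that "fat peak" effect (an editorial sketch; the flat backlight and the 620 nm, 40 nm-sigma Gaussian filter are assumed numbers, not measurements of any real panel):

          import math

          def gaussian(x, mu, sigma):
              return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

          wavelengths = range(400, 701, 10)                 # nm, visible range
          backlight = {w: 1.0 for w in wavelengths}         # idealized flat white backlight
          red_filter = {w: gaussian(w, 620, 40) for w in wavelengths}  # assumed broad "red" filter

          total = sum(backlight[w] * red_filter[w] for w in wavelengths)
          leak = sum(backlight[w] * red_filter[w] for w in wavelengths if w < 550)
          print(f"{100 * leak / total:.1f}% of the 'red' subpixel's light is below 550 nm")  # ~3% here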

  • Soooo, any idea what they mean by "50% more colours"? Do these allow the screen to display a wider set of the visible spectrum than LCD screens? Do they allow the same set but at a higher bit depth? Do they simply display the desired colour more precisely? Is this "extra" in the range that consumer GPUs and OSes can display?

    • The whole field of computing is built on three-primary color specification anyway. Either RGB, or HSV, or YUV, or some variant of them. Or CMYK, in which the K is really a fudge factor used to account for real inks not behaving like mathematically ideal inks. So even if someone built a display of a wider gamut, good luck finding any content to use it. I suspect this is just marketing being allowed to write the press report.
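
      As a concrete illustration of that point, every one of those encodings is three channels; here is a minimal sketch of the BT.601 RGB-to-YUV transform (the coefficients are the standard's, the function name is ours):

          # BT.601 luma plus two color-difference channels: still three primaries underneath.
          def rgb_to_yuv(r, g, b):
              y = 0.299 * r + 0.587 * g + 0.114 * b
              u = 0.492 * (b - y)
              v = 0.877 * (r - y)
              return y, u, v

          print(rgb_to_yuv(1.0, 0.0, 0.0))  # pure red expressed in YUV
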
      • But if they refer to your average TN display that often can't even display TrueColor without dithering, 50% more colors may essentially mean "it's not the trash your laptop probably uses".
      • The whole field of computing is built on three-primary color specification anyway. Either RGB, or HSV, or YUV, or some variant of them. Or CMYK, in which the K is really a fudge factor used to account for real inks not behaving like mathematically ideal inks. So even if someone built a display of a wider gamut, good luck finding any content to use it. I suspect this is just marketing being allowed to write the press report.

        RGB has nothing to do with computing, but everything to do with the physics of light. Printing uses CMYK, also because of the physics of light. The difference is that RGB applies when light is emitted and CMYK when it is reflected. That is why blue and yellow paint make green, but blue and yellow light make white. With light, mixing colors is additive; with painting/printing, it is subtractive.
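
        A tiny Python sketch of that additive/subtractive distinction (a toy three-channel model with idealized lights and filters, not a colorimetric calculation):

            def mix_additive(c1, c2):
                # Light adds: channel-wise sum, clamped to full intensity.
                return tuple(min(1.0, a + b) for a, b in zip(c1, c2))

            def mix_subtractive(c1, c2):
                # Idealized inks/filters multiply their transmittances per channel.
                return tuple(a * b for a, b in zip(c1, c2))

            blue, yellow = (0.0, 0.0, 1.0), (1.0, 1.0, 0.0)
            print(mix_additive(blue, yellow))     # (1.0, 1.0, 1.0) -> white light
            print(mix_subtractive(blue, yellow))  # (0.0, 0.0, 0.0) -> ideal filters give black; real broadband paints leak green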

        • RGB were the chosen wavelengths only because by mixing them in appropriate ratios it is possible to reproduce the perception of most colors to human vision. If there were any non-human animals smart enough to judge, they'd tell you that all the colors on television look wrong. Humans see subjective colors, not spectrographs. To represent a color with precision would require storing the entire spectrum, which is impractical.

          The mathematics of CMYK say that if you have full use of C, M and Y all absorbing yo
      • by Prune ( 557140 )
        RGB is a small subset of the visible color gamut. It's the triangle in this graph: http://www.antigrain.com/doc/introduction/cie_1931.jpg [antigrain.com]
    • by Dunbal ( 464142 ) *
      They mean "we need funding and investors, and 50% more sounds great without actually saying anything".
    • It doesn't matter, because 24-bit color produces more color variants than the human brain can actually distinguish. Having more colors won't look any different.

      • by QQBoss ( 2527196 )

        I have a buddy who used to teach ophthalmic surgery at Georgetown U. and did research in this area. He also did computer animation as a hobby (one that actually made him good money, to the point where I think his teaching later became the hobby). Wish I could locate one of the papers, but most of his work was done pre-WWW and probably has never been put up. His info showed that most people can resolve 8 bits of red, 9 bits of green, and 8 bits of blue. That extra bit sucks from a memory usage point of view, though,
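
        Back-of-envelope arithmetic for those figures (the 8/9/8 split is the parent's claim; the Python below just counts combinations):

            print(2 ** (8 + 9 + 8))  # 33,554,432 combinations implied by 8R/9G/8B
            print(2 ** 24)           # 16,777,216 colors in standard 24-bit True Color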

    • Soooo, any idea what they mean by "50% more colours"?

      It means they let someone with a marketing degree write a blurb about technology.

  • by outsider007 ( 115534 ) on Saturday June 16, 2012 @04:26AM (#40343101)

    You won't know how many pixels are dead until you open the box.

    • They are working fine until you look at the TV.
    • Which raises the question: perhaps the quantum dot monitor will display the correct color only when you're not looking at it.
  • by Anonymous Coward

    " beautiful displays that would be inexpensive and easy to manufacture."

    But expensive to buy for sure. And will only be slightly cheaper when the next superior tech is at the door. Rinse and repeat...

    • by Jeremi ( 14640 )

      But expensive to buy for sure. And will only be slightly cheaper when the next superior tech is at the door. Rinse and repeat...

      Well, yes, that's how capitalism works. Someone invents something useful, and then they try to maximize the profit from their labor by selling it for as much as the market will bear. Eventually the price comes down due to competition. You can either pay top dollar for the new hotness now, or wait a while for the price to come down, your choice.

      It's a feature. Note that you can buy a $99 LCD display at Walmart today that performs better in all respects than the $9,000 LCD display of the same size you cou

  • LCD TVs already easily match the Rec. 709 color primaries (similar to the sRGB used in standard desktop color monitors).

    Since TV signals and Blu-rays all use this standard, using a non-standard wider-gamut emitter just gets you unnatural colors.

    If you like artificial, oversaturated hues, great, but if you want natural-looking color this does nothing for you.

    IIRC, LG's new 55" OLED TV will match the Rec. 709 color primaries, not the outlandish neon of OLED smartphones.

    For a TV, what you want is prope

    • by Prune ( 557140 )
      Uh, Rec. 709 is a small portion of the visible color gamut. It's represented by the triangle in this graph: http://upload.wikimedia.org/wikipedia/commons/8/8f/CIExy1931_sRGB.svg [wikimedia.org] Note that the area it covers is maybe 50% of the overall visible gamut.
      • by guidryp ( 702488 )

        Uh, Rec. 709 is a small portion of the visible color gamut.

        Uh, So?

        Standards exist for a reason. Just about all available media is produced for Rec. 709/sRGB.

        Showing it with wider color primaries will not make it look more real, it will make it look more unnatural.

        Wide gamut PC monitors were all the rage 3 or 4 years ago, until people started realizing it made it nearly impossible to get neutral color and the tide turned back to sRGB screens.

        Gamut isn't simply a case of more == better. In the vast majority of cases, more == worse.

        • by Prune ( 557140 )
          Gamut is about being able to represent all colors that are perceptible in the real world. You're making an argument that because of immature technology, we should handicap our displays. It's the dumbest thing I've read in a long time on this site. The difficulties with neutral color are based on 1) poor calibration and, more importantly, 2) insufficient quantization -- due to the extended gamut you need around 10- to 12-bit quantization _per channel_ to have sufficient precision. Considering most LCDs and O
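
          A minimal Python sketch of that quantization point (toy numbers; "span" is an assumed linear measure of a channel's range, not a perceptual unit):

              # At fixed bit depth, stretching the same code values over a wider
              # gamut makes each quantization step coarser; more bits restore precision.
              def step_size(span, bits):
                  return span / (2 ** bits - 1)

              print(step_size(1.00, 8))   # baseline gamut at 8 bits  -> ~0.0039
              print(step_size(1.35, 8))   # ~35% wider span, 8 bits   -> ~0.0053 (coarser)
              print(step_size(1.35, 10))  # same wide span at 10 bits -> ~0.0013 (finer)
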
          • by guidryp ( 702488 )

            No, I am simply arguing for standards, instead of an ill-defined "wider" color gamut that is simply bigger-number-is-better nonsense.

            Unless you have media AND players AND displays all working in lockstep, you get worse results, not better.

            You need a standards body to meet, create a new wider-gamut standard, and build new products at all stages to meet it.

            Going it alone is pointless spec whoring.

  • Why do we need this? The power savings is a plus, but the human brain can only "see" and distinguish an estimated 10 million colors ( http://hypertextbook.com/facts/2006/JenniferLeong.shtml [hypertextbook.com] ) and current display technology produces 16.7M colors (24-bit True Color). Having a display show 25M colors (a 50% increase) won't look any different, since current technology already exceeds our ability to perceive the differences.

    • by Nyder ( 754090 )

      Why do we need this? The power savings is a plus, but the human brain can only "see" and distinguish an estimated 10 million colors ( http://hypertextbook.com/facts/2006/JenniferLeong.shtml [hypertextbook.com] ) and current display technology produces 16.7M colors (24-bit True Color). Having a display show 25M colors (a 50% increase) won't look any different, since current technology already exceeds our ability to perceive the differences.

      You answered your own question. It's worth it for the power savings, IMO; the fact that it shows colors possibly better than we can see them is just a bonus.

    • Apparently, from all the other posts, the 16.7M colors we can get now do not overlap 100% with the 10M colors we can see. I believe this is the gamut of colors being produced vs. the gamut we can see.

      Supposedly these light emitters can create a gamut of light frequencies (colors) that overlaps more, and thus can produce more colors (that we can see).

    • by gstrickler ( 920733 ) on Saturday June 16, 2012 @10:54AM (#40344625)

      Because the gamut [wikipedia.org] of 24-bit RGB doesn't cover the entire range of visible colors and intensities. While we can only distinguish ~ 8M colors, we can distinguish a huge range of intensities. 24-bit displays cover 16M colors AND intensities, so in this case, 16M is not > 8M because they're counting different things.

      While current displays are adequate for most purposes, they do not display all of the colors we can see, nor all the intensities we can see. Typical displays only cover 45%-75% of the AdobeRGB (1998) color-space [wikipedia.org], which itself is a subset of the visible gamut. Some (more expensive) displays cover a greater percentage of the visible range, but none cover the entire range.
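
      For a rough sense of scale, here is a Python sketch comparing the CIE xy areas of the sRGB/Rec. 709 and Adobe RGB (1998) primary triangles with the shoelace formula (standard published chromaticities; xy area is only a crude proxy for "number of colors"):

          def tri_area(a, b, c):
              return abs(a[0] * (b[1] - c[1]) + b[0] * (c[1] - a[1]) + c[0] * (a[1] - b[1])) / 2.0

          srgb  = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]  # R, G, B in CIE xy
          adobe = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]  # Adobe RGB swaps in a greener green
          print(tri_area(*adobe) / tri_area(*srgb))           # ~1.35x the xy area of sRGB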

      • Because the gamut [wikipedia.org] of 24-bit RGB doesn't cover the entire range of visible colors and intensities. While we can only distinguish ~ 8M colors, we can distinguish a huge range of intensities. 24-bit displays cover 16M colors AND intensities, so in this case, 16M is not > 8M because they're counting different things.

        While current displays are adequate for most purposes, they do not display all of the colors we can see, nor all the intensities we can see. Typical displays only cover 45%-75% of the AdobeRGB (1998) color-space [wikipedia.org], which itself is a subset of the visible gamut. Some (more expensive) displays cover a greater percentage of the visible range, but none cover the entire range.

        As stated in another post, the color problem you are referencing is one of physics -- producing the various wavelengths. What we see, however, is a matter of biology, and the human brain cannot differentiate between similar wavelengths. Therefore, including all of them does not mean that we will see the image any better. Intensity is an issue, but the summary is talking about color, not intensity, although they are related.

        The limiting factor in all of this is not going to be the production of the visible wavel

        • Yes, the human eye and the brain are going to be the limits. And given the range of intensities (e.g. contrast) one can see at any given time, and the ability to discern continuous color gradients, it appears that we'll need somewhere between 24 and 36 bits driving displays with ~5000:1 contrast, using at least 3 narrow-band color sources centered on the frequencies to which the eye's cones respond, and capable of delivering more than 1000 lux [wikipedia.org] (the brightness of an overcast day) at the viewer's position
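
          A back-of-envelope Python check of those numbers (the ~1% just-noticeable luminance step is an assumed Weber fraction, a common rule of thumb, not a figure from the parent):

              import math

              contrast = 5000.0  # parent's assumed display contrast ratio
              weber = 0.01       # assumed just-noticeable relative luminance step
              steps = math.log(contrast) / math.log(1.0 + weber)
              print(math.ceil(steps))             # ~856 distinguishable intensity steps
              print(math.ceil(math.log2(steps)))  # ~10 bits per channel to cover them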

    • by Prune ( 557140 )
      This is one of the dumbest comments I've read on slashdot. You're confusing quantization with extent. The article is very obviously talking about covering a larger part of the visible color gamut. RGB is represented by the triangle in this graph: http://upload.wikimedia.org/wikipedia/commons/8/8f/CIExy1931_sRGB.svg [wikimedia.org] You'll note it doesn't even cover 50% of visible colors. Most TVs and displays can't even reproduce the full RGB space. The 24-bit/16.7M merely refers to the number of colors and affects how smooth gradients are, and has nothing to do with the range of colors that can be reproduced.
      • This is one of the dumbest comments I've read on slashdot. You're confusing quantization with extent. The article is very obviously talking about covering a larger part of the visible color gamut. RGB is represented by the triangle in this graph: http://upload.wikimedia.org/wikipedia/commons/8/8f/CIExy1931_sRGB.svg [wikimedia.org] You'll note it doesn't even cover 50% of visible colors. Most TVs and displays can't even reproduce the full RGB space. The 24-bit/16.7M merely refers to the number of colors and affects how smooth gradients are, and has nothing to do with the range of colors that can be reproduced.

        For fuck's sake, I didn't expect this level of stupidity from someone with a sub-1M user ID!

        It has nothing to do with how much the TV or screen can reproduce. It has everything to do with how well the brain can discriminate the various wavelengths. So while it is theoretically true that the technique may produce more colors, whatever that means exactly, if the human brain cannot discriminate between them, what good does it do?

        This is not an issue of physics, but of biology, but then maybe I'm just too much of a dumb fuck to know what I'm talking about.

        • by Prune ( 557140 )
          It _is_ an issue of biology. And that's exactly what the larger encompassing graph represents: the perceptual color space for humans. Humans can see all colors in that; RGB can only represent the colors in the triangle, and most monitors are a subset of the triangle. This has nothing to do with physics so I'm not sure why you brought physics into the discussion. Next time I recommend counting to 10 before letting an itchy Submit-clicking finger take action. It gives you time to save later embarrassment.
  • by Culture20 ( 968837 ) on Saturday June 16, 2012 @12:04PM (#40345109)
    I don't care about colors or power savings. Get me better DPI or just more pixels overall.
    • by Prune ( 557140 )
      Troll much? RGB doesn't even cover 50% of colors visible to the human eye. It's represented by the triangle here: http://upload.wikimedia.org/wikipedia/commons/8/8f/CIExy1931_sRGB.svg [wikimedia.org] The larger superset is the full CIE XYZ color space visible to the human eye.
      • That's nice. I personally have no need for more colors in the currently limited screen space. I need a bigger view, even if it drops to 16-bit color.
        • by Prune ( 557140 )
          It's not the number of colors but the color gamut. You seem to lack reading comprehension. The issue is not quantization (bit depth) but the saturation that can be achieved. One is completely unrelated to the other.
          • It's not the number of colors but the color gamut. You seem to lack reading comprehension. The issue is not quantization (bit depth) but the saturation that can be achieved. One is completely unrelated to the other.

            And you seem to lack comprehension of simple concepts. I said I want pixels. Not color. Hell, give me monochrome, but give me 19200x12000.
