A Billion-Color Display
The Future of Things covered the introduction last month of HP's DreamColor display, with 30 bits/pixel, developed in conjunction with DreamWorks Animation. The display is aimed at the video production, animation, and graphic arts industries. HP promises blacker blacks and whiter whites — though TFoT quotes one source who notes that if they deliver this, it will be due to the back-lighting and not to the number of bits/pixel. No word on the size of the displays that will actually be delivered, or on the price.
Re:To what end? (Score:3, Informative)
Once you increase the range of colours that you are going to display it means the gaps between distinct colours become larger and so more bits are required to compensate. I'm way too lazy to actually look at the "article" but they've probably shifted from a fixed point representation for colour components to a floating-point one. This produces a colour-space that maps much better onto what we perceive.
Come back after you've turned off anti-aliasing. (Score:5, Informative)
Most new displays have a resolution of 96 dpi, whereas low-end printers can easily pull off 300 dpi. The same goes for color depth: black-and-white screen images at 8 bits/pixel simply can't match the range of black-and-white print and film.
When you think about it, techniques such as anti-aliasing are really just hacks to work around the limitations of today's monitors. If monitors could pull off 300dpi, you wouldn't need anti-aliasing.
Re:To what end? (Score:5, Informative)
Just as quantizing in audio becomes a problem only in very low-level passages, fine greyscale, especially in the blackest image areas, will benefit from more bits/pixel.
For example, an ordinary CD (16 bits) can sound rather gritty on quiet recordings such as the low-level passages of classical music. That's because you're probably only using two or three bits of sample depth down there. To combat the issue, 24-bit audio elevates the sample depth everywhere but shows itself best at low levels. Dither (essentially noise) is used to randomize and mask the problem, but that's a cheat.
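A back-of-the-envelope sketch of that "two or three bits" claim, using the standard ~6.02 dB-per-bit rule of thumb (the helper function is just for illustration):

```python
def effective_bits(level_dbfs, sample_bits):
    # Each bit of depth buys roughly 6.02 dB of dynamic range, so a
    # passage sitting level_dbfs below full scale only exercises the
    # bits left over after that headroom is subtracted.
    return max(0.0, sample_bits - (-level_dbfs) / 6.02)

# A very quiet passage at -80 dBFS:
print(round(effective_bits(-80, 16), 1))  # ~2.7 bits on a CD
print(round(effective_bits(-80, 24), 1))  # ~10.7 bits in 24-bit audio
```

Same signal, same level; the extra depth all lands where quantization hurts most.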
In video, fine greyscale performance comes from higher bit depth per pixel and is visible throughout the entire luminance range. The issue shows itself on flat (un-textured) areas with minute lighting changes across the surface, like a softly lit painted wall. You'll see annular rings on the surface as the bit values step through their range. Again, dither may be used to randomize the quantized transitions.
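The banding on that softly lit wall is easy to reproduce numerically. Here's a minimal sketch (hypothetical helper; a 2% luminance sweep stands in for the wall) counting how many distinct output codes an 8-bit versus 10-bit quantizer produces across the gradient:

```python
import random

def quantize(x, bits, dither=False):
    # Map x in [0, 1] to an integer code at the given bit depth.
    levels = (1 << bits) - 1
    if dither:
        # Roughly +/- 0.5 LSB of noise before rounding breaks the
        # hard band edges up into fine grain instead.
        x = x + (random.random() - 0.5) / levels
    return min(levels, max(0, round(x * levels)))

# A gentle 2% luminance sweep across a flat surface:
ramp = [i / 999 * 0.02 for i in range(1000)]
bands8 = len(set(quantize(v, 8) for v in ramp))
bands10 = len(set(quantize(v, 10) for v in ramp))
print(bands8, bands10)  # the 10-bit version has ~4x as many steps
```

At 8 bits the whole sweep collapses into a handful of flat bands, which is exactly what you see as rings on the wall.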
24-bit video is really 8 bits per primary color - so it's not that good to start with. In professional applications, it's not unusual to work with 10-bit [per channel] or even up to 16-bit [per channel] images, mostly to be more friendly to post production.
Fortunately, analog humans are fairly blind to minute color changes. Unfortunately, our system of digital video happily shows you everything wrong with it.
Re:To what end? (Score:5, Informative)
The range of colors that can be reproduced by a 24-bit RGB device is always going to be different from the range of colors that a 24-bit CMY device can reproduce.
By the same note, a 24-bit RGB display can produce colors that the CMY printer cannot.
One color space isn't bigger than the other; they're simply different. Once you increase the bit-depth far enough to encompass the full spectrum of visible light for both color spaces, the distinction can finally be dropped.
Mod parent (or his sibling) up... however,... (Score:5, Informative)
However, a larger bit depth doesn't do anything for color space. It simply determines the granularity of that color space. If with 16 bits you get 65,536 individual colors within the RGB gamut (typically with slightly higher granularity in the green channel), and with 24 bits you get 16,777,216 individual possible colors within the RGB gamut, then with 30 bits (10 bits per channel; it's not new, really) you get 1,073,741,824 individual possible colors... but still within the RGB gamut (of the device at hand).
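The arithmetic behind those figures, for anyone checking (5-6-5 is the common channel split for 16-bit color, with the extra bit on green):

```python
def color_count(channel_bits):
    # Total addressable colors: the product of each channel's levels.
    total = 1
    for b in channel_bits:
        total *= 2 ** b
    return total

counts = {
    "16-bit (5-6-5)": color_count((5, 6, 5)),
    "24-bit": color_count((8, 8, 8)),
    "30-bit": color_count((10, 10, 10)),
}
for name, n in counts.items():
    print(f"{name}: {n:,} individual colors")
```

More codes, finer granularity; the gamut's boundary doesn't move an inch.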
An HDR display (whether by using a very bright backlight, more localized control of LED backlights, etc.) also doesn't change the gamut of that device - it simply allows for much brighter values within it.
Now, if they were to make an LCD panel that, aside from the R, G, B pixel elements, also had C, M, Y pixel elements, then you most certainly could increase the gamut. It would also be a much more difficult switch than a simple bit-depth change.
Re:Oh no, not again (Score:3, Informative)
Again, read this. [wikipedia.org]
As for the 1600, the trade-off you make for a true 24 bpp display is a narrower viewing angle and slower response time; this is due to the physics of the crystals. Check out the National Semi [national.com] page for lots of info on what exactly a liquid crystal is, what the different types are and how they're driven, and lots of amusing info on the guts of LCD panels.
As for the dithering, it's sort of like buying CDs with 16-bit samples when CD players only have 12-bit DACs, without that being written down anywhere. But then, if no one can tell, why choose 16 bits in the first place? This reminds me of the waning days of MiniDisc, when suddenly everyone here became a very critical, golden-eared audiophile who could tell the difference between a CD and an MD, yet the same people turn around to their 18-bit displays, can't tell the difference, and go on thinking they are 24-bit.
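To make the DAC analogy concrete, here's a toy sketch (hypothetical function; a real converter would likely round or dither rather than plain-truncate) of what discarding the low bits does:

```python
def play_through_dac(sample_16bit, dac_bits=12):
    # A naive 12-bit DAC just drops the low 4 bits of each 16-bit
    # sample, so samples differing only in those bits come out the same.
    return sample_16bit >> (16 - dac_bits)

# Two samples 7 LSBs apart at 16 bits collapse to one output code:
print(play_through_dac(1000), play_through_dac(1007))
```

All the detail living in those low 4 bits is silently gone, whether or not anyone notices.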
Life on this planet never ceases to amaze and befuddle me.
Re:To what end? (Score:3, Informative)
There are 3 types of cone receptors, and 3 numbers are sufficient to describe any color the human eye can perceive, but those 3 numbers do not correspond to actual physical colors.
Your cones do not each just detect one monochromatic color; each type has its own response curve across varying frequencies, and they're not even nice simple bell curves (one even has two peaks). To represent the entire visible color space with 3 numbers, as the CIE 1931 XYZ color space does [wikipedia.org], you need to allow things like negative luminance, which don't exist in the real world.
As you can see here [wikipedia.org], using three colors you can represent a subset of what is actually visible, represented as a triangle within the chromaticity diagram. If you used a set of 5 monochromatic colors instead, you could represent a larger subset of the full visible range, which could be visualized as a pentagon in that chromaticity diagram (and, of course, using even more colors would let you add more points to the shape representing the colors you can display, letting it conform even more closely to the full range of visible colors).
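One way to see the gamut gain is to compare polygon areas in the xy chromaticity plane with the shoelace formula. The Rec.709/sRGB primary coordinates below are the standard ones; the extra "yellow-ish" and "cyan-ish" primaries are purely illustrative guesses pushed toward the spectral locus:

```python
def gamut_area(points):
    # Shoelace formula: area of the polygon spanned by the primaries'
    # (x, y) chromaticity coordinates - a rough proxy for gamut size.
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1]
            - points[(i + 1) % n][0] * points[i][1] for i in range(n))
    return abs(s) / 2

# Rec.709/sRGB primaries: R, G, B.
rgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
# Same three plus hypothetical yellow-ish and cyan-ish primaries,
# ordered around the polygon: R, Y, G, C, B.
five = [(0.64, 0.33), (0.44, 0.55), (0.30, 0.60),
        (0.07, 0.40), (0.15, 0.06)]
print(round(gamut_area(rgb), 3), round(gamut_area(five), 3))
```

The pentagon's area comes out noticeably larger than the triangle's, with no change in bit depth at all.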
Couple of things (Score:3, Informative)
Then of course there's the problem of wider-gamut and wider-dynamic-range displays. Right now most displays show a fairly small subset of the total range of colours humans can perceive, and also have a fairly narrow contrast range. Well, we'd like to increase that, and I'm sure we will succeed with newer technology (there has already been some success; I'm typing this on a wide-gamut LCD). The only problem is that the more range a display has to cover, the larger, and thus more noticeable, an individual step is.
As an analogy, say we were trying to measure distance. We have a 1 metre range and we measure it using 8 bits of precision. Ok, no problem, this gives us steps of about 4 millimetres. However, now say we expand that range to 100 metres. Well, now our resolution went to shit; it's only slightly better than half a metre. If we want to get back down to the millimetre range, we need more steps, more bits of precision.
The same thing happens as displays improve the range of colours they can show: the individual steps between colours will get larger and more noticeable unless we add more steps.
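The measuring-tape analogy, worked out numerically (the helper function is made up for illustration):

```python
def step_size(full_range_m, bits):
    # Size of one quantization step: the full range split into
    # 2**bits levels.
    return full_range_m / (2 ** bits)

print(step_size(1.0, 8))    # ~0.0039 m: about 4 mm over a 1 m range
print(step_size(100.0, 8))  # ~0.39 m: the same 8 bits over 100 m
print(step_size(100.0, 17)) # ~0.00076 m: 17 bits gets back under 1 mm
```

Widen the range a hundredfold and the step widens a hundredfold too, unless the bit count grows to match.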