
HP Introduces First-Ever 30-bit, 1 Billion Color Display

justechn writes "I recently had the opportunity to see, firsthand, HP's new 30-bit, 1 billion color LCD display. I have to say I am impressed. Not only is the HP DreamColor LP2480zx capable of displaying so much more than standard LCDs, but it is considered a Color Critical display. This means that if you work with video or photos, you can be guaranteed that what you see is what it is supposed to look like. With six built-in color spaces (NTSC, SMPTE, sRGB, Rec. 709, Adobe RGB and DCI), you can easily switch to the one that best suits your applications and process. At $3,499, it is too expensive to be a consumer-level LCD, but compared to other Color Critical displays (which can cost between $15,000 and $25,000) this is a real bargain. The display is the result of a joint venture between HP and DreamWorks Animation. When I talked to the executives of DreamWorks, they were very excited about this display because it solved a huge problem for them."

Comments:
  • by It doesn't come easy ( 695416 ) * on Tuesday June 10, 2008 @10:44AM (#23725935) Journal
    Don't have time to find all of the references, but most of the human race cannot distinguish that many colors, except possibly the few who have an extra color cone in their eyes. Most of us cannot see more than about 1 million colors, I believe.

    Cool technology, though.
  • Re:Link? (Score:5, Informative)

    by frankie ( 91710 ) on Tuesday June 10, 2008 @10:45AM (#23725971) Journal
    Lots and lots of links for your perusal. Google makes all computing simple [google.com]
  • Oh, really? (Score:5, Informative)

    by Timothy Brownawell ( 627747 ) <tbrownaw@prjek.net> on Tuesday June 10, 2008 @10:46AM (#23725997) Homepage Journal

    An LED-backlit 24-inch widescreen monitor, the DreamColor features 30-bit imaging with over a billion colors. That's 64 times the standard LCD color gamut
    No it isn't. Gamut is something like how far apart the most different colors it can show are, and depends on what colors the actual pixel elements are. The number of bits just determines how close together the most similar colors it can show are.
  • by justechn ( 821584 ) on Tuesday June 10, 2008 @11:00AM (#23726249) Homepage
    Actually I did see it in person. I apologize about my website going down. It looks like I got slashdotted.
  • by jcupitt65 ( 68879 ) on Tuesday June 10, 2008 @11:09AM (#23726421)

    That's not quite right.

    CIELAB colour space codes colours as L (lightness) with a 0 - 100 range, and a/b (red-green / yellow-blue) each with about a +/- 100 range for physically realisable colours. A pair of colours which are just distinguishable are a unit apart, so we can distinguish very roughly 100 * 100 * 100 colours, or a million.

    However those are surface reflectances under a single illuminant. In a natural scene, your eye is adapting constantly as you look around. Your iris changes size, your retina changes sensitivity, and so on. The range of lightnesses in a natural scene is up to about 10 billion to 1 if you compare direct sunlight to deep shadow. You can distinguish a million colours at each of these points of adaptation.

    If you want a display that can show a full range of dark colours and a full range of light colours, you need more than a million to 1.
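A minimal back-of-envelope sketch (Python) of the numbers in the comment above: the rough count of distinguishable colours per adaptation state, and how many bits a code would need to span the quoted 10-billion-to-1 scene range. The ~1% contrast step used for the perceptual estimate is a common rule of thumb, not a figure from the post.

```python
import math

# Rough CIELAB counting from the comment above: L* spans ~0-100 and a*/b* each
# span roughly +/-100, with a just-noticeable difference of about one unit,
# so very roughly 100 * 100 * 100 distinguishable colours per adaptation state.
distinguishable = 100 * 100 * 100
print(f"distinguishable colours per adaptation state: ~{distinguishable:,}")

# Scene range from direct sunlight to deep shadow, per the comment.
scene_contrast = 1e10
print(f"bits for a linear lightness code over {scene_contrast:.0e}:1: "
      f"{math.log2(scene_contrast):.1f}")

# With perceptually spaced (logarithmic) steps of ~1% -- an assumed Weber-style
# threshold, not a number from the post -- far fewer code values are needed.
steps = math.log(scene_contrast) / math.log(1.01)
print(f"~1% log steps over the same range: {steps:.0f} (~{math.log2(steps):.1f} bits)")
```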

  • Re:Registration (Score:5, Informative)

    by justechn ( 821584 ) on Tuesday June 10, 2008 @11:09AM (#23726427) Homepage
    The website does not require registration. It just defaults to that page when it is overloaded. I apologize about my website going down. It looks like I got slashdotted. I am working on it.
  • by mikael ( 484 ) on Tuesday June 10, 2008 @11:09AM (#23726431)
    The monitor is designed to be color calibrated with color printers and scanners.

    We had some art friends who used a system like this. One time, they discovered there was a market for their paintings as prints rather than as originals, so they decided to set up their own print shop.

    However, the problem was making sure the scanned input matched what was on the screen and what was printed out. So they bought a system calibrator which had a photosensor that attached to the screen. You basically scanned in a pre-supplied test image, placed the photosensor on the screen and then onto the printed output. Each time the system would readjust the gamma correction for each color channel of every device until they all matched.

    This was in accordance with the Pantone Matching System [wikipedia.org]

    A company like DreamWorks will want to be able to visualize 3D characters as designed by the artists, and to use this information to create merchandise like wall posters, bean bag toys, plastic models and accessories.
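A hedged sketch of the closed-loop calibration described in the comment above: measure a mid-grey patch on each device and nudge that channel's gamma correction until the measurement matches a reference. `measure_patch()` is a hypothetical stand-in for the photosensor/scanner reading, not a real colorimeter API.

```python
def measure_patch(device, channel, drive_level):
    """Hypothetical photosensor/scanner reading: normalised output (0..1)
    for one channel of one device driven at the given level."""
    raise NotImplementedError  # stand-in for real measurement hardware

def calibrate_channel(device, channel, target_gamma=2.2, level=0.5,
                      step=0.02, tolerance=0.005, max_iter=100):
    """Find a correction exponent so the device's end-to-end response for a
    mid-grey patch matches level ** target_gamma (the reference response)."""
    target = level ** target_gamma
    correction = 1.0
    for _ in range(max_iter):
        measured = measure_patch(device, channel, level ** correction)
        error = measured - target
        if abs(error) < tolerance:
            break
        # Too bright: raise the exponent (darkens mid-tones); too dark: lower it.
        correction += step if error > 0 else -step
    return correction

# Usage idea (device and channel names are placeholders):
# luts = {(d, c): calibrate_channel(d, c)
#         for d in ("monitor", "scanner", "printer") for c in ("R", "G", "B")}
```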
  • by Doc Ruby ( 173196 ) on Tuesday June 10, 2008 @11:17AM (#23726579) Homepage Journal
    This display might work for reliable color matching, but not for the reasons supplied.

    The main problem with getting color on one object, say a display monitor, to look exactly the same as on another object, say a magazine page, is mostly gamma [wikipedia.org], a nonlinear contrast response at different light levels. And, of course, the differing illumination of the two objects in different places, which is the actual source of the range of colors that can be seen coming from each object.

    The human eye is very sensitive to differences in the spectral content of light coming from objects. Sunlight starts out with different colors than the light shining on a display monitor or generated by the display. The magazine in sunlight filters a range of colors through its ink, reflects the light off the paper (which is itself some color, even if that color is "close" to "white"), passes it back through the ink, and sends it to the eye. The display monitor's light starts out a different color from the sunlight, then is filtered through and reflected from very different materials than ink and paper. By the time the light reaches the eye from each object, the two are very different. And each instance is a little different again, owing to manufacturing quality variations.

    And then gamma has to be factored in, which tends to dominate the color content reaching the eye. The gamma is a kind of nonlinear "contrast" (as in a TV control) at different frequencies, varying as the intensity of the same illumination is increased. But even that illumination generally isn't the same color at all intensities, because it's emitted from some manufactured material that has its own gamma (or emission equivalent) and "color temperature [wikipedia.org]" bias, which is in turn different from sunlight, which is more stable in its source color range than most manufactured materials (except lasers, a completely different kind of illumination that looks completely different from sunlight).

    Color calibration works best when there's a feedback loop of data passed between the different output objects (like paper/ink and a display monitor), linked by a video sensor (which has its own color calibration problems). It's an extremely hard problem. When I was a member of the Joint Photographic Experts Group (JPEG, which created the image file format - I helped with the color spaces spec), we spent a lot of time getting it close enough for commercial use. But we knew enough to tell that "solving" the problem 100% was not going to work. And even now, almost two decades later, it's still not solved. But every few years new tech makes it affordable for industries to add another "9" to what was once 99.999% accurate. The 30-bit gamut [wikipedia.org] of this display monitor means that it doesn't constrain the range of colors as much as older technologies did. But the calibration requires sophisticated processes and software to automate them, as well as a method for comparing to actual outputs. And it still can't account for variances in manufacturing the target output media.

    For Hollywood, though, this problem might be close to solved, because movies are moving to digital projection, which can be manufactured to high consistency in materials and their interaction with light, and from the same parts as the production display monitors. If all the theaters used the same DLP chips, LEDs and image surfaces (or ones built to precisely the same standard specs) for their projectors as the studios did for their display monitors, and as everyone did for their home TVs, then colors would be pretty close to identical in all those environments (except for the variable ambient lighting). These display monitors might flexibly replicate a lot of different environments to match, but the matched objects are still highly variable. For $3,500, they had better deliver something good.
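To make the gamma discussion above concrete, here is the nominal sRGB transfer function pair (the standard published formulas). Real panels and prints deviate from this curve, which is exactly why the measurement feedback loop described above matters.

```python
# Standard sRGB transfer functions: the nominal nonlinearity between code
# values and linear light. Actual devices deviate from this curve.

def srgb_encode(linear: float) -> float:
    """Linear light (0..1) -> sRGB-encoded value (0..1)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded: float) -> float:
    """sRGB-encoded value (0..1) -> linear light (0..1)."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

# 18% grey encodes to roughly 0.46: about half the code range for less than a
# fifth of the light, which is the nonlinearity being discussed.
print(round(srgb_encode(0.18), 3))
```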
  • by mark-t ( 151149 ) <markt@nerdflat.com> on Tuesday June 10, 2008 @11:35AM (#23727029) Journal

    The range of lightnesses in a natural scene is up to about 10 billion to 1 if you compare direct sunlight to deep shadow. You can distinguish a million colours at each of these points of adaptation.
    While true, this overlooks the fact that there will be an absolutely HUGE number of hues at one level of illumination that do not produce different optical characteristics from different hues at other levels of illumination. This sort of thing _drastically_ reduces the color space required for a full set of representable colors. 8 bits per color isn't actually sufficient to represent every human-perceivable shade, because the human eye has different levels of sensitivity to different colors, even though it does represent more colors in total than the eye can discern. If one uses the same number of bits for each color, however, I have heard that about 10 bits per primary color would be sufficient to represent every shade distinguishable by a normal human eye (i.e., one that does not have an extra color cone).
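A small sketch of the arithmetic behind the "about 10 bits per channel" estimate, assuming gamma-encoded (perceptually spaced) code values and a ~1% just-noticeable contrast step, which is a common rule of thumb rather than a figure from the comment.

```python
import math

def steps_needed(contrast_ratio: float, weber_fraction: float = 0.01) -> int:
    """Code values needed so adjacent perceptual steps stay below threshold."""
    return math.ceil(math.log(contrast_ratio) / math.log(1 + weber_fraction))

for contrast in (100, 1000, 10000):
    n = steps_needed(contrast)
    print(f"{contrast:>6}:1 display -> {n} steps (~{math.ceil(math.log2(n))} bits/channel)")

# A 1000:1 display needs roughly 700 steps: more than 8 bits (256 levels)
# but comfortably inside 10 bits (1024 levels).
```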
  • by TheSync ( 5291 ) * on Tuesday June 10, 2008 @11:48AM (#23727367) Journal
    To date, I have not seen any LCD or plasma monitor that can perform as well as certain projection D-ILAs in terms of the combination of luminance range, black levels, contrast ratio, gamma accuracy, viewing angle, and coverage of the Rec. 709 gamut. But don't take my word for it: here [plasmadisp...lition.org] the Plasma Display Coalition admits they can only cover 80% of Rec. 709 with their best displays, with many more falling in the 75% range.

    From a digital television perspective I am much more interested in a monitor gamut that effectively covers the Rec. 709 color space, because that is all I can put on TV. Sure, it's fine to have extended gamut outside Rec. 709, but if you can't actually cover all of the Rec. 709 gamut, I don't care whether you cover colors outside that space. Similarly, I'm sure digital cinema people want the DCI gamut covered well before worrying about coverage outside that gamut.

    On the LCD side, the production lines are changing so rapidly that two versions of the same type of panel from different months will have different results. I have seen a $300 Dell LCD computer monitor perform better than some professional television LCD displays that are priced 10 times as much.

    My suggestion is to measure displays yourself and ignore the marketing literature. Of course, you need a good broadcast engineering lab to do that, and not all networks have such a thing...

    If you want to know what you need in a good monitor, see the EBU User requirements for Video Monitors [www.ebu.ch]. SMPTE is working on a set of recommendations as well.

    I'm hoping that OLED displays will come to the rescue, but it will take a while for them to come up to needed sizes and maturity.
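For a rough feel of what "percent of Rec. 709" means, one can compare chromaticity triangle areas in CIE xy. The panel primaries below are made-up example numbers, and a rigorous coverage figure uses the intersection of the two gamuts (usually in a more uniform space such as u'v'), so treat this as a back-of-envelope proxy only.

```python
# Compare the area of a panel's chromaticity triangle with Rec. 709's.

def triangle_area(p1, p2, p3):
    """Area of a triangle given three (x, y) chromaticity points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B primaries
panel  = [(0.620, 0.340), (0.310, 0.570), (0.155, 0.070)]   # hypothetical panel

coverage = triangle_area(*panel) / triangle_area(*REC709)
print(f"area ratio vs Rec. 709: {coverage:.0%}")            # ~85% for these example numbers
```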
  • Re:Good as CRTs? (Score:3, Informative)

    by GleeBot ( 1301227 ) on Tuesday June 10, 2008 @11:59AM (#23727637)
    Better than CRT, actually. At least under certain conditions.

    Matrix-style displays have some big inherent advantages over scanning phosphor technology, such as crisp, precise, flicker-free display.

    Meanwhile, there have been "deep color" displays like this, capable of more than 24-bit color, for a while. LED backlights give them a much wider color gamut than phosphors are capable of.

    The main failings of current LCD technology fall into two categories:

    First, LCDs block light imperfectly, so you get potentially poorer black levels. (CRTs aren't as good at this as their boosters would like you to believe, though.)

    Second, you have the color shift problem, where the viewing angle distorts color accuracy. The degree depends on the technology, but it can never be completely eliminated.

    Under proper viewing conditions, LCDs can do a good job on both fronts; a major movie studio is certainly an example of an absolutely color-critical user. However, that quality comes at a big cost.

    The future is probably OLED, or maybe e-ink. Unlike LCD, OLED is a light-emissive technology, so it has absolute blacks and no color shift. However, who knows how long it will take OLED to reach broad commercialization, given the problem of blue OLED lifetimes; the closest things right now are a tiny 11" Sony TV that costs a small fortune and minuscule cell phone screens.
  • Re:Good as CRTs? (Score:3, Informative)

    by GleeBot ( 1301227 ) on Tuesday June 10, 2008 @12:04PM (#23727793)
    Incidentally, for those who don't understand the bit about the "wide color gamut" enabled by LEDs, color spaces (such as the Adobe RGB, sRGB, NTSC, and so on spaces mentioned in the summary/article) are defined by three primary colors. Nothing new there.

    The tricky bit is that the specifications define these three primary colors in terms of a precise frequency of light. The only light sources that come close are tuned lasers. Consequently, the LCD monitor sitting on your desk (or lap), probably backlit by a fluorescent lamp, can only reach something like 80-90% of the specified color gamut.

    LEDs are pretty close to lasers in terms of color purity, and monitors backlit by LEDs can often reach an astonishing 98% or more of the color gamut. This wide gamut often allows them to cover more than one color space adequately, as exemplified by the monitor mentioned in this article.
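As a sketch of how three primaries plus a white point pin down a colour space, here is the standard textbook construction of the RGB-to-XYZ matrix from chromaticity coordinates, shown with Rec. 709/sRGB primaries and a D65 white point.

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """primaries: [(xR, yR), (xG, yG), (xB, yB)]; white: (xW, yW)."""
    # Unscaled XYZ of each primary (with Y set to 1 for now).
    cols = [[x / y, 1.0, (1 - x - y) / y] for x, y in primaries]
    M = np.array(cols).T                      # columns are R, G, B
    xw, yw = white
    white_xyz = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    # Scale each primary so that R = G = B = 1 reproduces the white point.
    scale = np.linalg.solve(M, white_xyz)
    return M * scale                          # scales each column

REC709_PRIMARIES = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
D65_WHITE = (0.3127, 0.3290)
print(rgb_to_xyz_matrix(REC709_PRIMARIES, D65_WHITE).round(4))
```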
  • by Anonymous Coward on Tuesday June 10, 2008 @12:05PM (#23727797)
    You mean you don't want any colour. [wikipedia.org]
  • Of course it does (Score:5, Informative)

    by Sycraft-fu ( 314770 ) on Tuesday June 10, 2008 @12:19PM (#23728121)
    Because not doing so would be problematic for the computer controlling it. There's also the issue that what we REALLY see best is greys. If you have a different number of bits per channel, you run into the problem of not being able to do truly neutral greys (as was a problem in 16-bit 5-6-5 colour mode). Because of our grey perception, there have already been 10-bit black-and-white medical displays. Finally, it would be silly to artificially cripple the display.

    LCDs function by filtering light through red, blue and green filters, and then blocking part or all of the light at specific subpixels. So if you can have 1024 driving levels for one subpixel, you can have them for all of them. There is no reason to restrict the pixels that happen to have red and blue filters instead of green.

    So this display is 10 bits per primary colour channel, giving 1024 steps of grey and 1,073,741,824 possible colours.
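A quick sketch of the arithmetic in the comment above, plus the 5-6-5 grey issue it mentions.

```python
bits_per_channel = 10
levels = 2 ** bits_per_channel
print(f"{levels} driving levels per channel")        # 1024
print(f"{levels} neutral grey steps (R = G = B)")    # 1024
print(f"{levels ** 3:,} total colours")              # 1,073,741,824

# The 16-bit 5-6-5 problem mentioned above: green gets 64 steps while red and
# blue get only 32, so most points on a grey ramp cannot be made exactly
# neutral -- the channels quantise to different places.
print("5-6-5 steps per channel:", 2 ** 5, 2 ** 6, 2 ** 5)
```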
  • by Sycraft-fu ( 314770 ) on Tuesday June 10, 2008 @12:24PM (#23728231)
    You can get LCDs that have better colour, both in terms of gamut and in terms of quality, than a CRT today. Problem is, you don't get them in the bargain bin. The NEC LCD2690WUXi is quite superior to even professional CRTs in my opinion (and I happen to have a LaCie Electron22Blue IV to compare it to). The gamut is unquestionably superior (you can measure that), but the subjective colour quality is just great too. Thing is, it's over $1000.

    Cheapest you can probably find a "better than CRT" panel is about $700 for a Doublesight DS-263N. Same IPS panel as the NEC, just less advanced electronics backing it up. Still, better colour than any CRT out there.

    However cheap and good just don't go together. If you want a budget LCD, well you get the budget image.
  • Re:Hype (Score:5, Informative)

    by GleeBot ( 1301227 ) on Tuesday June 10, 2008 @12:53PM (#23729005)

    Why do so many people not care about having sharp eyesight?
    I was one of those people, so I'll try to answer this for you.

    Frankly, most daily tasks don't require good eyesight. I don't even bother wearing my glasses unless I'm reading signs or driving or something. And my level of eyesight actually requires correction; a lot of people have less-than-perfect eyesight that's still legal to drive with.

    When I go to the movie theater or watch a DVD on a big screen or something (if I'm watching on my laptop, I can already see every pixel at a comfortable viewing distance), I do put on my glasses so I can enjoy the sharpness (if it's that sort of movie; some movies are better without being pixel-perfect sharp).

    However, for everyday life, it provides marginal benefit. And corrective lenses inevitably introduce other kinds of distortion, which I find give me a headache. Certainly if I want to make sure something is straight and level, I take off my glasses, because I can't trust my lenses to match what my brain has been wired over the years to perceive as straight.
  • by DrYak ( 748999 ) on Tuesday June 10, 2008 @02:02PM (#23730629) Homepage

    There's also the issue that what we REALLY see the best is greys.
    Yes and no.
    We *DO* have very strong sensitivity to greys, but that mostly happens in our peripheral vision. Our foveola is richer in cones than rods, and thus has very good colour sensitivity but sucks at distinguishing very dark levels of grey.

    This is easy to see when looking at the sky at night, when there are no clouds and no light pollution from a nearby big city: you see a lot of stars when you take in the whole scene with all of your visual field, including peripheral vision, but if you try to look at some region in detail, some stars seem to disappear (you're looking at them with the high-resolution, high-colour, but poor-grey region of your retina), and then they become visible again once you stop looking directly at them.

    There's no such thing as a single resolution or a single sensitivity to colours/greys in the eye. Those parameters depend on the region of the retina considered.

    Because of our grey perception, there's already been 10-bit black and white medical displays out there.
    Well... not exactly. Those displays are grey simply because most of the pictures produced in radiology are, indeed, greyscale. Thankfully we happen to have good sensitivity to grey contrasts, so doctors in radiology can read them (with the help of monitors that have a wide enough dynamic range of light intensity, and enough steps in between, to mimic the quality of actual radiology films).

    On the other hand, you could imagine obtaining similar visibility of fine details by using pseudo-colours. The problem is that no doctor is used to analysing rainbow-coloured pictures (...I tend to be the only one who likes pseudo-colour scales...), and if you move the window around (the mapping of data to intensity of grey), the colours completely shift (a dark region may have been cyan with one window and orange with another), whereas with a grey scale, darker regions are always a darker grey than lighter regions.

    So the reasons are not only compatibility with our retina, but even more so practical considerations (it looks like the original medium, it is simpler to manipulate, etc.).

    Pseudo-colours, on the other hand, are very popular in engineering printouts because, well, once it's printed it's hard to play with a display window, so you had better find a way to cram in as much information as possible on a medium that doesn't offer such a big dynamic range of shades.

    Note that you then have scale problems, which are happily abused, for example, by charlatans trying to sell snake oil to lower the radiation of your cell phone: the picture with the snake oil applied looks much less red than the one without it. But that's because the pseudo-colour mapping is different between the two pictures, not because putting a sticker on the back of the phone suddenly stops it from frying your brain.
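A small sketch of the "window" DrYak describes: mapping a slice of the raw data range onto the display's grey levels. The centre/width values below are illustrative placeholders, not clinical settings, and the 1024 output levels match a 10-bit greyscale panel.

```python
import numpy as np

def apply_window(values, centre, width, out_levels=1024):
    """Map raw values onto 0..out_levels-1 grey codes around a chosen window."""
    values = np.asarray(values, dtype=float)
    lo, hi = centre - width / 2, centre + width / 2
    scaled = (values - lo) / (hi - lo)             # 0..1 inside the window
    return np.clip(np.round(scaled * (out_levels - 1)), 0, out_levels - 1).astype(int)

raw = [-1000, -200, 0, 40, 80, 400, 2000]          # example raw pixel values
print(apply_window(raw, centre=40, width=400))     # one illustrative window
print(apply_window(raw, centre=-600, width=1500))  # a wider, darker window

# Whatever window is chosen, darker raw values always map to darker greys,
# which is the property the comment contrasts with pseudo-colour scales.
```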
  • Re:Dithering (Score:2, Informative)

    by Anonymous Coward on Tuesday June 10, 2008 @02:32PM (#23731341)
    Except not. Dithering is NOT the same as having multiple discrete levels of each color. This new display only has the 3 primaries, but a thousand (1024) levels of each one. Dithering is having nearby pixels take different colors so that together they appear to be something they aren't. These displays are ACTUALLY the color they appear to be. Big difference.

    Posted AC since I've modded in this thread.

    - Pitabred
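A toy illustration of the distinction drawn above: a true 10-bit panel shows each code value directly, while an 8-bit panel with dithering approximates an in-between level by mixing neighbouring pixel values (spatial dithering only here; real panels often also dither over time).

```python
import numpy as np

rng = np.random.default_rng(0)

def dither_to_8bit(row_10bit):
    """Quantise 10-bit values to 8-bit with simple random dithering."""
    target = np.asarray(row_10bit) / 4.0            # ideal 8-bit value, fractional
    noise = rng.random(target.shape)                # per-pixel random offset
    return np.clip(np.floor(target + noise), 0, 255).astype(int)

row = np.full(8, 514)        # a 10-bit level with no exact 8-bit equivalent (514/4 = 128.5)
print("true 10-bit panel  :", row)
print("8-bit with dithering:", dither_to_8bit(row))  # a mix of 128s and 129s
```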
