HP Introduces First-Ever 30-bit, 1 Billion Color Display

justechn writes "I recently had the opportunity to see, first hand, HP's new 30-bit, 1 billion color LCD display. I have to say I am impressed. Not only is the HP Dreamcolor LP2480zx capable of displaying so much more than standard LCDs, but it is also considered a Color Critical display. This means if you work with videos or photos you can be guaranteed that what you see is what it is supposed to look like. With 6 built-in color spaces (NTSC, SMPTE, sRGB, Rec. 709, Adobe RGB and DCI), you can easily switch to the one that best suits your applications and process. At $3,499, it is too expensive to be a consumer-level LCD, but compared to other Color Critical displays (which can cost between $15,000 and $25,000) this is a real bargain. This display was a joint venture between HP and DreamWorks Animation. When I talked to the executives of DreamWorks, they were very excited about this display because it solved a huge problem for them."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • GIMMEH (Score:5, Insightful)

    by Aphoxema ( 1088507 ) on Tuesday June 10, 2008 @09:33AM (#23725719) Journal
    I WANT IT. I don't really know why, though...
    • Re:GIMMEH (Score:5, Funny)

      by couchslug ( 175151 ) on Tuesday June 10, 2008 @04:40PM (#23736393)
      "I WANT IT. I don't really know why, though..."

      I do. My collection of Roseanne Barr b3av3r shots!
  • anyone have a link that DOESN'T require a login?
    • by dkf ( 304284 ) <donal.k.fellows@manchester.ac.uk> on Tuesday June 10, 2008 @10:19AM (#23726633) Homepage
      Does it use the same number of pixels per channel? I hope not. Here's why: the human eye is not equally sensitive to each of the three primary colors; we can see quite a lot finer differences in green than in blue (red comes between the two extremes). To show this, create a simple monochromatic stepped gradient image in green and another in blue. Now eyeball them using a viewer that doesn't do fancy gamma correction; on a 24bpp display you should be able to see the steps on the green image (assuming normal color vision) but you'll have real problems doing that with the blue image.
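The stepped-gradient experiment described above is easy to try. A minimal sketch using Pillow (a third-party imaging library; the image size and step count here are arbitrary choices, not anything from the comment):

```python
# Generate stepped monochrome gradients to compare how visible the
# quantization steps are in green vs. blue, per the test described above.
from PIL import Image  # Pillow, a third-party library

def stepped_gradient(channel, steps=32, width=640, height=120):
    """Horizontal gradient quantized to `steps` levels in a single RGB channel."""
    img = Image.new("RGB", (width, height))
    px = img.load()
    for x in range(width):
        level = (x * steps // width) * (255 // (steps - 1))
        color = [0, 0, 0]
        color[channel] = min(level, 255)
        for y in range(height):
            px[x, y] = tuple(color)
    return img

# Channel indices: 0 = red, 1 = green, 2 = blue.
stepped_gradient(1).save("green_steps.png")
stepped_gradient(2).save("blue_steps.png")
```

Viewed side by side in a viewer that does no gamma correction, the steps in the green image should be easier to spot than those in the blue one.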
      • I'm assuming you mean bits per channel rather than pixels per channel? Why would you want to miss out some colours altogether for certain pixels?
      • Of course it does (Score:5, Informative)

        by Sycraft-fu ( 314770 ) on Tuesday June 10, 2008 @11:19AM (#23728121)
Because to not do so is problematic for the computer which is controlling it. There's also the issue that what we REALLY see the best is greys. If you have a different number of bits per channel, you'll run into the problem of not being able to do truly neutral greys (as was a problem in 16-bit 5-6-5 colour mode). Because of our grey perception, there's already been 10-bit black and white medical displays out there. Finally, it would be silly to artificially cripple the display.

        LCDs function by filtering light through red, blue and green filters, and then blocking part or all of the light to specific sub pixels. So if you can have 1024 driving levels for one sub pixel, you can have it for all of them. No reason to restrict the pixels that happen to have red and blue filters instead of green.

So this display is 10 bits per primary colour channel, giving 1024 steps of grey and 1,073,741,824 possible colours in total.
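The arithmetic above is easy to check:

```python
# Color counts for 8-bit vs. 10-bit-per-channel RGB panels.
levels_8bit = 2 ** 8      # 256 driving levels per subpixel
levels_10bit = 2 ** 10    # 1024 driving levels per subpixel

colors_24bit = levels_8bit ** 3    # "16.7 million colors"
colors_30bit = levels_10bit ** 3   # "1 billion colors"

# Neutral greys require R == G == B, so the number of grey steps
# equals the per-channel level count.
grey_steps_30bit = levels_10bit

print(colors_24bit, colors_30bit, grey_steps_30bit)
```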
        • by DrYak ( 748999 ) on Tuesday June 10, 2008 @01:02PM (#23730629) Homepage

          There's also the issue that what we REALLY see the best is greys.
          Yes and no.
We *DO* have very strong sensitivity to greys. But that mostly happens in our peripheral vision. Our foveola is richer in cones than in rods, and thus has very good colour sensitivity, but sucks at distinguishing very dark levels of grey.

This can easily be illustrated by looking at the sky at night, when there are no clouds and no light pollution from a nearby big city: you see a lot of stars when taking in the whole scene with your peripheral vision, but if you try to look at some region in detail, some stars seem to disappear (you're now looking at them with the high-resolution, high-colour, but grey-poor region of your retina), and they become visible again once you stop looking directly at them.

There's no such thing as a single resolution or a single sensitivity to colours/greys in the eye. Those parameters depend on the region of the retina considered.

          Because of our grey perception, there's already been 10-bit black and white medical displays out there.
Well... not exactly. Those displays are grey simply because most of the pictures produced in radiology are, indeed, greyscale. Thankfully we happen to have good sensitivity to grey contrasts, so doctors in radiology can read them (with the help of monitors that have a wide enough dynamic range of light intensity, and enough steps in between, to mimic the quality of actual radiology films).

On the other hand, you could imagine obtaining similar visibility of fine details by using pseudo colours. The problem is that no doctor is used to analysing rainbow-coloured pictures (...I tend to be the only one liking pseudo-colour scales...) and if you move the window around (the mapping of data to intensity of grey), the colours completely shift (a dark region may be cyan with one window and orange with another), whereas with a grey scale, darker regions are always darker grey than lighter regions.

So the reasons are not only compatibility with our retina, but even more so practical considerations (it looks like the original medium, is simpler to manipulate, etc...)

Pseudo colours, on the other hand, may be very popular in engineering printouts because, well, once it's printed, it's hard to play with a display window, so you had better find a way to cram as much information as possible into a medium that doesn't offer such a big dynamic range of shades.

Note that you then have scale problems, which are happily abused, for example, by charlatans trying to sell snake oil to lower the radiation of your cell phone: the picture with the snake oil looks much less red than the one without it. But that's because the pseudo-colour mapping is different between the two pictures. Not because putting a sticker on the back of the phone suddenly stops it from frying your brain.
      • by Ed Avis ( 5917 )
        Do you think that ten bits is inadequate for the green channel?
  • Meh (Score:5, Funny)

    by NotQuiteReal ( 608241 ) on Tuesday June 10, 2008 @09:39AM (#23725833) Journal
    I looked at the pictures.

    It doesn't look like anything special to me. I guess I don't need to upgrade my current monitor.

    • Re: (Score:2, Funny)

      Yeah. I couldn't see any extra colors. Go figure.
I want a LCD, SED, or whatever that has rich colors like old-fashioned CRTs. None of the affordable LCDs do that right now. :(
      • Re: (Score:3, Informative)

        by GleeBot ( 1301227 )
        Better than CRT, actually. At least under certain conditions.

        Matrix-style displays have some big inherent advantages over scanning phosphor technology, such as crisp, precise, flicker-free display.

Meanwhile, there have been "deep color" displays like this, capable of more than 24-bit color, for a while. Use of LED backlights gives them a much wider color gamut than phosphors are capable of.

        The main failings of current LCD technology fall into two categories:

        First, LCDs block light imperfectly, so you get pot
        • Re: (Score:3, Informative)

          by GleeBot ( 1301227 )
          Incidentally, for those who don't understand the bit about the "wide color gamut" enabled by LEDs, color spaces (such as the Adobe RGB, sRGB, NTSC, and so on spaces mentioned in the summary/article) are defined by three primary colors. Nothing new there.

The tricky bit is that the specifications define these three primary colors in terms of a precise frequency of light. The only light sources that come close are tuned lasers. Consequently, that LCD monitor sitting on your desk (or lap), probably backlit b
You can get LCDs that have better colour, both in terms of gamut and in terms of quality, than a CRT today. Problem is you don't get them in the bargain bin. The NEC LCD2690WUXi is quite superior to even professional CRTs in my opinion (and I happen to have a LaCie Electron22Blue IV to compare it to). The gamut is without question superior, you can measure that, but the subjective colour quality is just great too. Thing is, it's over $1000.

        Cheapest you can probably find a "better than CRT" panel is about $700 for a
        • by antdude ( 79039 )
Yeah, that's the problem. I used to be able to get decently priced CRTs that had better colors than these LCD monitors. I am waiting for prices to keep falling.
    • Re:Meh (Score:4, Funny)

      by Dachannien ( 617929 ) on Tuesday June 10, 2008 @12:52PM (#23730427)
      I dunno. I've seen one of these things in person, and it can actually display octarine.
  • Registration (Score:5, Insightful)

    by jefu ( 53450 ) on Tuesday June 10, 2008 @09:42AM (#23725885) Homepage Journal

    It might be better to avoid stories from people (justechn, roland p, etc) that just link to their websites. Especially those that require registration.

    Slashdot should not be giving these guys (and their like) the free publicity that they figure they deserve.

  • Dithering (Score:4, Insightful)

    by Thelasko ( 1196535 ) on Tuesday June 10, 2008 @09:43AM (#23725917) Journal
Did they determine those specs using the same calculations Mac used? [macworld.com]
  • by It doesn't come easy ( 695416 ) * on Tuesday June 10, 2008 @09:44AM (#23725935) Journal
Don't have time to find all of the references, but most of the human race cannot distinguish that many colors, except possibly the few who have an extra color cone in their eyes. Most of us cannot see more than about 1 million colors, I believe.

    Cool technology, though.
    • by Hijacked Public ( 999535 ) on Tuesday June 10, 2008 @09:57AM (#23726187)
      I don't particularly want 1 billion colors, I actually just want 1 new one: black.

      Not a very slightly gray-black, but silver-print-face-of-the-half-dome black.
Especially considering that most people buying these will be big tech geeks, which are mostly men. Most men aren't very good at differentiating lots of different colours. But who's to say you have to have people using them? Who knows, though. They could get a considerable market share of the mantis shrimp [softpedia.com] population.
      • by eh2o ( 471262 )
A non-color-anomalous man is just as good in color discrimination tests as a woman. However, cone opsin genes are carried on the X chromosome, which causes men to manifest color deficiency an order of magnitude more often than women (who have another X to fall back on). The rate is 5-8% or so.
    • by jcupitt65 ( 68879 ) on Tuesday June 10, 2008 @10:09AM (#23726421)

      That's not quite right.

CIELAB colour space codes colours as L (lightness) with a 0 - 100 range, and a/b (red-green / yellow-blue) each with about a +/- 100 range for physically realizable colours. A pair of colours which are just distinguishable are a unit apart, so we can distinguish very roughly 100 * 100 * 100 colours, or a million.

      However those are surface reflectances under a single illuminant. In a natural scene, your eye is adapting constantly as you look around. Your iris changes size, your retina changes sensitivity, and so on. The range of lightnesses in a natural scene is up to about 10 billion to 1 if you compare direct sunlight to deep shadow. You can distinguish a million colours at each of these points of adaptation.

      If you want a display that can show a full range of dark colours and a full range of light colours, you need more than a million to 1.
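The back-of-the-envelope estimate above can be written out (the step counts are the rough ones quoted in the comment, not precise figures):

```python
# Rough count of distinguishable surface colours in CIELAB, assuming
# colours about one unit apart (delta E ~= 1) are just distinguishable.
L_steps = 100   # lightness L*, range 0-100
a_steps = 100   # red-green axis a*, very roughly
b_steps = 100   # yellow-blue axis b*, very roughly

distinguishable = L_steps * a_steps * b_steps
print(distinguishable)  # very roughly a million
```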

      • Re: (Score:3, Informative)

        by mark-t ( 151149 )

        The range of lightnesses in a natural scene is up to about 10 billion to 1 if you compare direct sunlight to deep shadow. You can distinguish a million colours at each of these points of adaptation.

        While true, this overlooks the fact that there will be an absolutely HUGE number of hues at one level of illumination that do not produce different optical characteristics from different hues at different levels of illumination. This sort of thing _drastically_ reduces the color space required for a full set of

        • Oh sure, I'm just saying that you need more than a million for high dynamic range media (ie. media with a bigger range than you get from reflective materials).
      • Coming up next: nuclear-powered displays, for when those pesky LEDs just aren't bright enough....
    • by argent ( 18001 )
      If the display only generated the specific million colors that you can distinguish, you'd be right, but there's nothing like a 1:1 match between the RGB color map and what you can see.
    • by Firehed ( 942385 )
      It's different for each of the three primary additive colors (RGB). I think green tends to be the most sensitive, and red the least (the rough equivalent of 12-bit green, 8-bit red sensitivity, IIRC, with blue somewhere in the middle). However, for work where color accuracy is key such as photography and video work (especially with the 14-bit-per-channel sensors in many DSLRs today), you definitely want your eyes rather than your monitor to be the limiting factor. As a photographer I'd consider paying a
    • Most of us cannot see more than about 1 million colors,

      Bollocks. It all depends on the contrast and mapping curve.

A million combos from 3 channels is only 100 levels per channel, i.e. around 7 bits per channel.

      Sure, on a cheap monitor where the difference between how much light the lightest and the darkest pixel send toward your eye is not really that big, you don't need many steps in between either, since they will be very close together.

      But even then you will see clear banding in a smooth sweep with 1-bit inte
I don't think more color steps are necessary, except to make a broader gamut. 10-bit color means 1024 steps from white to black rather than just 256. Banding in brightness can be very apparent.
    • Re: (Score:3, Funny)

      by owlnation ( 858981 )
      My eyes... the monitors... ze do nothing...
    • by foobsr ( 693224 )
      Most of us cannot see more than about 1 million colors, I believe.

It seems that the "experts" also believe rather than know. Numbers range from 100,000 to 10 million here [hypertextbook.com].

      CC.
    • Re: (Score:2, Interesting)

      by GleeBot ( 1301227 )
      There are some interesting comments about whether or not the human eye can actually distinguish all these colors, but I think they miss the point about the true purpose of the extra bits.

      It's so you can throw them away.

      Achieving color accuracy requires a lot more than just having a lot of precision. If any given display can output 2^30 different shades, that still doesn't get you accuracy, because you want any given 3x8-bit color to map to a precise one of those 2^30 shades.

      The extra bits give you room to
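That headroom idea can be sketched with a toy example (the gamma tweak below is a hypothetical calibration curve, not anything HP ships): re-quantizing a calibrated 8-bit signal back to 8 bits merges levels, while quantizing the same curve to 10 bits keeps every input level distinct.

```python
# Apply a hypothetical calibration curve to all 256 8-bit input levels and
# count how many distinct output levels survive re-quantization.
def calibrate(value, out_levels, gamma=2.2 / 2.4):
    """Map an 8-bit value through a gamma tweak onto `out_levels` steps."""
    return round((value / 255) ** gamma * (out_levels - 1))

out_8bit = {calibrate(v, 256) for v in range(256)}    # 8-bit panel: levels merge
out_10bit = {calibrate(v, 1024) for v in range(256)}  # 10-bit panel: none merge

print(len(out_8bit), len(out_10bit))
```

With the 10-bit target, all 256 calibrated inputs land on distinct output codes, which is why the extra bits help even when the source material is only 24-bit.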
  • Oh, really? (Score:5, Informative)

    by Timothy Brownawell ( 627747 ) <tbrownaw@prjek.net> on Tuesday June 10, 2008 @09:46AM (#23725997) Homepage Journal

An LED-backlit 24-inch widescreen monitor, the DreamColor features 30-bit imaging with over a billion colors. That's 64 times the standard LCD color gamut
    No it isn't. Gamut is something like how far apart the most different colors it can show are, and depends on what colors the actual pixel elements are. The number of bits just determines how close together the most similar colors it can show are.
    • Yeah, but with this display they get both more contrast AND a better coverage of the XYZ horseshoe diagram (i.e. more saturated primaries). So they actually do increase the gamut, not just the bit depth. But you are right, the factor 64 was obviously computed using just bit depth.
    • Though having an LED backlight means it will have a very large gamut. The NEC LCD2180WG-LED has an exceedingly wide gamut, enough that it can completely cover the aRGB and original NTSC spaces and then some.

      So this display has a nice wide gamut AND precise steps of colour.
  • Hype (Score:3, Interesting)

    by mathimus1863 ( 1120437 ) on Tuesday June 10, 2008 @09:48AM (#23726035)
    This is really just hype more than anything. Remember that article about like 50% of people with HDTVs think they are viewing in HD but it turns out they're not (b/c of having wrong cables, etc)? It's the same with colors--the eyes just can't distinguish between a display with 10 million colors and a billion colors. Personally I think you're wasting your money buying this thing. But at the very least, maybe the price of "inferior" monitors will go down if this goes mainstream, so I shouldn't complain.
    • Re: (Score:3, Insightful)

      by vijayiyer ( 728590 )
      It's not the 1B colors that matter, but the gamut. Do you agree there are colors that most monitors can't show but do exist in real life? Think of neon greens, bright magentas, etc. This monitor, covering the Adobe RGB gamut, displays colors other monitors simply can't. That may not matter to you, but it does to photographers.
      • Re: (Score:2, Insightful)

        by wprowe ( 754923 )

        Exactly! People who work in color managed work flows need exact color representation and will want this. We need to know that what we see is what will be in a publication.

        I would say the entry price is a bit steep, except that pro photographers will spend twice as much on a camera body alone. They will keep that camera body for less time than they will keep this monitor.

I find that study quite surprising. SDTV on an HDTV looks worse than SDTV on an SDTV. Either these people are delusional, or they need their eyes checked. I don't doubt the study; it's probably quite indicative of what people actually think. But it's amazing how little people know about a TV that they spend hundreds or thousands of dollars on.
      • Re:Hype (Score:4, Insightful)

        by Firehed ( 942385 ) on Tuesday June 10, 2008 @10:28AM (#23726845) Homepage
        True. But stick most people watching American Idol in front of a 52" screen and they'll be too enthralled by the size and brightness to notice the image/video quality. If they're willing to put up with that kind of programming, you can't expect them to be overly picky about AV quality. It's not called the idiot box for nothing, even if it would be more aptly named the idiot panel these days.

Remember - "bigger is better" for most people. I can hardly watch typical HDTV due to how hard they stomp on the video for compression, as the macro blocking is too distracting to me (web content tends to be better, as most web producers actually CARE about that kind of thing). At least SDTV tends to be too soft a picture to have bad macro blocking, and they don't need to compress it as hard in the first place to send it down the tubes.
      • Re:Hype (Score:5, Interesting)

        by egomaniac ( 105476 ) on Tuesday June 10, 2008 @10:44AM (#23727263) Homepage
        I bought a 50" plasma some years ago, and was showing a few of my friends SDTV channels versus HDTV channels. Now, this was a very high-end plasma, properly calibrated, showing some of the prettiest content on Discovery HD, so we are talking a KICK YOU IN THE FACE improvement that anybody with half a brain should have been able to appreciate.

        One was suitably impressed. The second said that she could kind of see a difference, but didn't really care. The third said she couldn't even tell.

        I suspect these are the same people that buy a nice 24" LCD and then run it in 800x600 resolution. Sadly, I have seen this. After fixing it, I have then seen these same people maintain that aside from the aspect ratio change, they couldn't tell the difference.

        Evidently a lot of people desperately need glasses and have absolutely no idea how bad their vision is. The weird part is that even when this is pointed out to them -- "Wait, you seriously can't tell the difference between 800x600 and 1920x1200? Please, for the love of Zeus get your eyes checked!" -- they generally act completely nonplussed and never bother to see an optometrist. I just don't get it. Why do so many people not care about having sharp eyesight?
        • Re:Hype (Score:5, Informative)

          by GleeBot ( 1301227 ) on Tuesday June 10, 2008 @11:53AM (#23729005)

          Why do so many people not care about having sharp eyesight?
          I was one of those people, so I'll try to answer this for you.

          Frankly, most daily tasks don't require good eyesight. I don't even bother wearing my glasses unless I'm reading signs or driving or something. And my level of eyesight actually requires correction; a lot of people have less-than-perfect eyesight that's still legal to drive with.

          When I go to the movie theater or watch a DVD on a big screen or something (if I'm watching on my laptop, I can already see every pixel at a comfortable viewing distance), I do put on my glasses so I can enjoy the sharpness (if it's that sort of movie; some movies are better without being pixel-perfect sharp).

          However, for everyday life, it provides marginal benefit. And corrective lenses inevitably introduce other kinds of distortion, which I find give me a headache. Certainly if I want to make sure something is straight and level, I take off my glasses, because I can't trust my lenses to match what my brain has been wired over the years to perceive as straight.
    • Re:Hype (Score:5, Insightful)

      by Colonel Korn ( 1258968 ) on Tuesday June 10, 2008 @10:16AM (#23726565)

      It's the same with colors--the eyes just can't distinguish between a display with 10 million colors and a billion colors. Personally I think you're wasting your money buying this thing. But at the very least, maybe the price of "inferior" monitors will go down if this goes mainstream, so I shouldn't complain.
      I'm amazed at how uninformed you and most of the posters seem to be. You can prove that the eye can distinguish, VERY EASILY, between 16.7 million and 1 billion colors, and you can do it right now.

      1) Open photoshop.

      2) Make a gradient from 0-0-0 RGB to 255-0-0 RGB. This covers every possible variation of the red channel in a 16.7 million color space. Draw the gradient across your whole screen.

      3) Look at the color banding and say, "Oh, I guess I can see why 30 bit color would be noticeable."
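The geometry behind that test is simple arithmetic (a 1920-pixel-wide screen is an assumption here, not something from the post):

```python
# Width of each quantization band in a full-screen black-to-red gradient.
screen_width = 1920               # assumed horizontal resolution, in pixels

band_8bit = screen_width / 256    # pixels per red level on an 8-bit panel
band_10bit = screen_width / 1024  # pixels per red level on a 10-bit panel

print(band_8bit, band_10bit)      # 7.5 vs 1.875
```

Bands 7.5 pixels wide are wide enough to show up as visible stripes; sub-2-pixel bands are far harder to see.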
      • Re: (Score:3, Interesting)

        by egomaniac ( 105476 )
        Did you know? Many LCD monitors, even if they claim to, don't actually support 24-bit color!

        If you do this test and can see prominent color banding, then either you're using a crappy monitor or you have superhuman color vision. I performed this test on my Dell 2405FPW, and I see absolutely no color banding in red or blue and only the slightest, itty-bittiest hint of it in green.

        I don't believe for a second that the average person could see color banding in this test at all, let alone easily.
        • Re: (Score:2, Interesting)

          by GleeBot ( 1301227 )
          I'll back this up, in case anyone doesn't believe him. After I bought a colorimeter and calibrated my display, gradients have almost no stepping (even though the calibration process actually removes colors, because it maps to a subset of the available colors). And I don't even have particularly nice monitors, like the 2405FPW.

          I find it amusing how most people don't even realize how poorly calibrated their monitors are. If they don't come out poorly calibrated from the factory or the store, someone fiddle
          • Too true. For the record, said 2405FPW was calibrated using Mac OS X's calibration in advanced mode, but using my built-in colorimeters (i.e. eyeballs) rather than a store-bought colorimeter. It's also been about a month since I last calibrated it. So it's got good but not perfect calibration.
      • by glgraca ( 105308 )
        You are ignoring all the reds that have a bit of blue or green. Take a look at the CIE colour space diagram and see if you can see any bands, even with 16 bits: http://en.wikipedia.org/wiki/Image:CIExy1931.svg
      • by skeeto ( 1138903 )
I did this with Gimp. Just as some sibling posts said, I also do not see any bands. My screen is 20 inches across and 1920 pixels wide. It's some kind of Dell monitor at work, so I have no idea what exactly it is (no label).
Interestingly enough (if I understand correctly) there aren't enough pixels on the screen to do a 30-bit gradient with one step per pixel.
    • This is really just hype more than anything.
      Well, no. Really it's a niche thing. Film companies (like Dreamworks.. as TFS mentions) need monitors with a bigger color space and higher dynamic range. It's not a mainstream monitor, which should be fairly obvious from the ridiculous price tag. But it is useful for a particular industry.
  • HP's new 30-bit, 1 billion color LCD display.

    Or, put another way, yet another display that can show about 999 million more colors than most people can tell apart (or in my case, 999,999,000, aka "six-nines of wasted color").


    With 6 built-in color spaces [...snip...] you can easily switch to the one that best suits your applications and process.

    Translation - Users will always pick the wrong one, "guaranteeing" that they never see the right thing.
    • Re: (Score:3, Insightful)

      by Firehed ( 942385 )
Users spending thirty-five hundred dollars on a computer monitor will know what to use. Excepting the obnoxious rich guys, the target audience of this is primarily advertising businesses and high-end video/photography, where color space and bit depth are actually important.
  • by Andy_R ( 114137 ) on Tuesday June 10, 2008 @09:51AM (#23726083) Homepage Journal
Is "considered color critical" anything other than meaningless hype? Is there a graphics card that can feed it with more than 24 bits of color information, and any software that works with that combination? More importantly, what's the resolution of the display, how black is its black, and is its colour gamut any larger than a normal monitor's?

    I'd need a lot more information before I consider this to be a competitor to the SWOP certified 2560x1600 pixel screen I'm using now.
    • Re: (Score:3, Informative)

      by mikael ( 484 )
      The monitor is designed to be color calibrated with color printers and scanners.

      We had some art friends who used a system like this. One time, they discovered there was a market for their paintings as prints rather than as originals, so they decided to set up their own print shop.

      However, the problem was making sure the scanned input matched what was on the screen and what was printed out. So they bought a system calibrator which had a photosensor that attached to the screen. You basically scanned in a pre-
      • by Andy_R ( 114137 )
        All high end monitors are calibrated the way you explained, that's what the SWOP certification in my post refers to.

        Monitors can never really reproduce accurate Pantone ink colours, since they are lit from within, not by ambient light (and that's before considering the metallic pantone inks), hence my scepticism that "considered color critical" might be less meaningful than SWOP is.
    • by aibrahim ( 59031 )
      A lot of those color spaces are for video and film post production applications.

There are plenty of devices that output that data type, including HDMI 1.3 and some DVI display cards, but I expect that most of the people who buy this are going to want to use dual-link SDI connectors. Knowing the industry, that is probably an expensive add-on option.

Here is the Sony BVM L230 [sony.com], the type of device it is competing against.
      • by aibrahim ( 59031 )
        Just checked the Specs... there appears to be no SDI option, so this is going to limit how some people can use these monitors.

        The component inputs can be used in a pinch, but they really just don't cut it for many professional uses (particularly in HD broadcast or film post.)

        Of course near this price point the broadcast video market is filled with 24 bit 4:2:2 color displays, like the Sony LMD-2450 [abelcine.com]. (The 2450WHD model shown in the sidebar includes the SDI i/o option board.)

        This is probably most useful for 3
HDSDI supports up to 12 bits per channel, i.e. the equivalent of roughly 36-bit RGB ("roughly" because they use another color space, YPbPr, so you'd lose a bit in the conversion).

      http://en.wikipedia.org/wiki/Serial_Digital_Interface [wikipedia.org]
  • Confused... (Score:5, Insightful)

    by InvisblePinkUnicorn ( 1126837 ) on Tuesday June 10, 2008 @09:52AM (#23726099)
They make it sound like out-of-the-box you're going to get the best image possible. But that's not the case. The color profile for the monitor needs to be adjusted to match reality (using something like ColorVision's Spyder2) before you can make that claim. There's no point in having billions of colors if they're all wrong.
  • I recently had the opportunity to see, first hand, HP's new 30-bit, 1 billion color LCD display. I have to say I am impressed.

This doesn't get you much unless the display also has a wide dynamic range - and it doesn't; it's 1000:1, which is pretty average (ratios range from 700:1 to 2000:1 in Dell's lineup, for example). I.e., keep the same 'color resolution' (which is useless past the eye/brain recognition point) and make the gamut and DR larger.

    I especially don't see why the submitter is "impressed", s

    • Re: (Score:2, Informative)

      by justechn ( 821584 )
      Actually I did see it in person. I apologize about my website going down. It looks like I got slashdotted.
  • by Lucas123 ( 935744 ) on Tuesday June 10, 2008 @09:58AM (#23726207) Homepage
    And here my world has been limited to Crayola's 64-count box. Wow. Who'd have thunk it?
    • New colors!

      Slightly burnt umber, yellowish-greenlike, yellow ochre with a bit of light brown, sorta blue, rhododendron roseum purple, you know that color you get when you have an old bruise? yah, that color...
    • Clearly you've not been cross-hatching, stippling, or shading with the monochrome crayons in the set; give it a try and you will be able to draw shades of colors that are beyond difficult to achieve by altering applied pressure alone.
  • I tried to look at the stunning images of the new monitor. Besides being slashdotted, I gave up realizing I'd be looking at their stunning display on my MacBook, which only pretends to do millions of colors anyway.

    Nothing for me to see here. I'm moving along.
  • While this display costs 5 to 10 times less than its current competition, it probably won't attract anyone outside the special niche of professional video editing. Which is, definitely, a large niche, don't get me wrong, but for 99% of people the existing displays which cost 10 times less, provide the same quality and experience.

    Before you mod me as "troll" (believe me, I was modded as troll for much less), think about this: would you rather spend $3000 on a display, or $300 on a display of the same resolut
    • by Dunbal ( 464142 )
      I know what my choice would be.

            The display - amirite?
    • Re: (Score:2, Funny)

      by rilarios ( 1164755 )

      .....and the rest on occasional romantic dinners with your GF or wife,....
      you spend too much on occasional romantic dinners with your GF or wife.
  • by Aqua OS X ( 458522 ) on Tuesday June 10, 2008 @10:05AM (#23726329)
    Umm, how?

    Print reflects light, monitors emit light. You can get close-ish, but that's about it.

    All in all, if you still want accurate color, you'll still need to do a print/press check.
    • Re: (Score:3, Insightful)

      Well, first they are talking about movies here, so I'd think "print" means a film print for a theater release in this context.

      The problem you have with printing and especially film printing is that the color gamuts of various printing methods are different from and only partially overlapping with the gamuts of regular monitors. That is, the monitor can show colors that the print can't show, and vice versa.

      What they did with this display is build a device that has a very wide gamut, so it can cover the full
  • That's nice, but what OS / software combo actually supports 30 bit colour displays? (as TFA is already dead...)
    • by TheSync ( 5291 ) *
      what OS / software combo actually supports 30 bit colour displays?

      No idea, but you can carry 48-bit color over dual-link DVI-D. Analog component is another option, but let's see...

      Thermal noise, Vn = sqrt(4 * 1.38x10^-23 J/K * 300 K * 75 Ohms) = 1.1 nV/sqrt(Hz)

      HD goes up to about 1.485 GHz, so Vn = 46 uV. 1 of 30 bits (assuming 1V peak) is 0.9 nV. So I suspect the cleanest analog component video will top out at about 19 bits at room temperature due to thermal noise.

      Keep in mind that DCI gamut is 12-bit (4:4
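
A quick back-of-the-envelope version of the parent's thermal-noise arithmetic in Python (assuming, as above, a 75 Ohm load at 300 K, roughly 1.485 GHz of bandwidth, and a 1 V peak signal; this is a naive level count, so it won't match every SNR-per-bit convention):

```python
import math

# Johnson-Nyquist thermal noise for a 75-ohm analog video load at room temperature
k_B = 1.38e-23       # Boltzmann constant, J/K
T = 300.0            # temperature, K
R = 75.0             # termination resistance, ohms
bandwidth = 1.485e9  # approximate analog HD bandwidth, Hz

# Noise spectral density, V per sqrt(Hz)
v_density = math.sqrt(4 * k_B * T * R)

# Total RMS noise voltage over the full bandwidth
v_noise = v_density * math.sqrt(bandwidth)

# Naive count of distinguishable levels for a 1 V peak signal, in bits
bits = math.log2(1.0 / v_noise)
print(f"{v_density * 1e9:.2f} nV/sqrt(Hz), {v_noise * 1e6:.1f} uV, ~{bits:.1f} bits")
```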
  • Anyone know what the response time on this is? Did they relax the times to get the wider gamut? Color is great, but fast response times are what I really want. I could not find any information in the link(s), so it must not be that good or they'd be touting that, too. How can you do video editing on a display with slow response times?
  • by Doc Ruby ( 173196 ) on Tuesday June 10, 2008 @10:17AM (#23726579) Homepage Journal
    This display might work for reliable color matching, but not for the reasons supplied.

    The main problem with getting color on one object, say a display monitor, to look exactly the same as on another object, say a magazine page, is mostly gamma [wikipedia.org], a nonlinear contrast response at different light levels. And, of course, the differing illumination of the two objects in different places, which is the actual source of the possible range of colors that can be seen coming from the object.

    The human eye is very sensitive to different spectral content of light detected coming from objects. Sunlight starts out with different colors than the light shining on a display monitor or generated by the display. The magazine in the sunlight filters a range of colors through its ink, then reflecting off the paper (which is itself some color, even if that color is "close" to "white"), back through the ink, and to the eye. The display monitor's light starts out a different color from the sunlight, then is filtered through and reflected from very different materials than ink and paper. By the time the light reaches the eye from each object, they're very different. And each instance is a little different, owing to manufacturing quality variations.

    And then gamma has to be factored in, which tends to dominate the color content reaching the eye. The gamma is a kind of nonlinear "contrast" (as in a TV control) in different frequencies, varying as the intensity of the same illumination is increased. But even that illumination generally isn't just the same color at all intensities, because it's emitted from some manufactured material that has its own gamma (or emission equivalent) and "color temperature [wikipedia.org]" bias. Which is in turn different from sunlight, which is more stable in its source color range than most manufactured materials (except lasers, a completely different kind of illumination that looks completely different from sunlight).
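
For reference, the standard sRGB transfer function (a fixed piecewise approximation of the kind of gamma behavior described above, using the usual IEC 61966-2-1 constants) can be sketched in a few lines of Python; this is an illustration, not a substitute for real color management:

```python
def srgb_encode(linear: float) -> float:
    """Convert a linear-light value in [0, 1] to its sRGB-encoded form."""
    if linear <= 0.0031308:
        return 12.92 * linear  # linear segment near black
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded: float) -> float:
    """Invert the sRGB transfer function back to linear light."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4
```

Mid-gray shows how nonlinear the curve is: a linear-light value of about 0.214 encodes to roughly 0.5.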

    Color calibration works best when there's a feedback loop of the data passed between different output objects (like paper/ink and a display monitor), linked by a video sensor (that has its own color calibration problems). It's an extremely hard problem. When I was a member of the Joint Photographic Experts Group (JPEG, who created the image file format - I helped with the color spaces spec), we spent a lot of time getting it close enough for commercial use. But we knew enough to tell that "solving" the problem 100% was not going to work. And even now, almost two decades later, it's still not solved. But every few years new tech makes it affordable for industries to add another "9" to what was once 99.999% accurate. The 30 bit gamut [wikipedia.org] of this display monitor means that it doesn't constrain the range of colors as much as older technologies did. But the calibration requires sophisticated processes and software to automate them, as well as a method for comparing to actual outputs. And it still can't account for variances in manufacturing the target output media.

    For Hollywood, this problem might be close to solved, though. Because movies are moving to digital projection, which can be manufactured to high precision of consistency in materials and their interaction with light, and from the same parts as the production display monitors. If all the theaters used the same DLP chips, LEDs and image surfaces (or to the precisely same standard specs) for their projectors as the studios did for all their display monitors and as all people did for their home TVs, then colors would be pretty close to identical in all those environments (except for that variable ambient lighting). These display monitors might flexibly replicate a lot of different environments to match, but the matched objects are still highly variable. For $3500, they better deliver something good.
    • by Tanman ( 90298 )
      Calibration in a movie studio is not important for matching the theaters. It is important for maintaining consistency throughout the project, which may be made and edited across many sites all over the world. You want to make sure your explosions match and that the lighting is the same from cut-to-cut. Hollywood knows that the specific color will be lost in theaters -- but *consistency* is key.

  • Dr. Evil (Score:5, Funny)

    by Tribbin ( 565963 ) on Tuesday June 10, 2008 @10:37AM (#23727067) Homepage
    One... BILLion colours...!
  • Give me a call when they make one with a CMYK output. [wikipedia.org]
  • by TheSync ( 5291 ) * on Tuesday June 10, 2008 @10:48AM (#23727367) Journal
    To date, I have not seen any LCD or Plasma monitor that can perform as well as certain projection D-ILAs in terms of the combination of luminance ranges, good black levels, contrast ratios, gamma accuracy, viewing angle, and coverage of the Rec. 709 gamut. But don't take my word for it, here [plasmadisp...lition.org] the Plasma Display coalition admits they can only cover 80% of Rec. 709 with their best displays, with many more falling in the 75% department.

    From a digital television perspective I am much more interested in monitor gamut effectively covering the Rec. 709 color space, because that is all I can put on TV. Sure, it's OK to have extended gamut outside Rec. 709, but if you can't actually cover all of Rec. 709 gamut I don't care if you cover color outside that space. Similarly, I'm sure digital cinema people want the DCI gamut covered well first before having coverage outside that gamut.

    On the LCD side, the production lines are changing so rapidly that two versions of the same type of panel from different months will have different results. I have seen a $300 Dell LCD computer monitor perform better than some professional television LCD displays that are priced 10 times as much.

    My suggestion is to measure displays yourself, and ignore marketing literature. Of course, you need a good broadcast engineering lab to do that, not all networks have such a thing...

    If you want to know what you need in a good monitor, see the EBU User requirements for Video Monitors [www.ebu.ch]. SMPTE is working on a set of recommendations as well.

    I'm hoping that OLED displays will come to the rescue, but it will take a while for them to come up to needed sizes and maturity.
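
To make the "percent of Rec. 709" idea concrete: one rough (and admittedly simplified) metric compares chromaticity-triangle areas in CIE 1931 xy space. Here's a Python sketch using the real Rec. 709 primaries but made-up panel primaries; note a plain area ratio ignores any part of a panel's gamut that falls outside the target triangle:

```python
def triangle_area(p1, p2, p3):
    """Shoelace formula for a triangle's area in CIE xy chromaticity space."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Rec. 709 R, G, B primaries (CIE 1931 xy coordinates, per ITU-R BT.709)
rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]

# Hypothetical panel primaries, invented purely for illustration
panel = [(0.620, 0.340), (0.310, 0.560), (0.155, 0.070)]

coverage = triangle_area(*panel) / triangle_area(*rec709)
print(f"area ratio: {coverage:.0%}")
```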
  • I'm probably behind the times, so maybe someone could clear this up for me.

    The LCD display on my several-year-old Compaq laptop is quite unreliable in terms of viewing true color, for the simple reason that the contrast of different colors changes significantly depending on the vertical viewing angle. I can often make low-contrast or dark photos more visible by tilting the display away from me (to make them darker) or toward me (to make them lighter, but with some light colors fading to white and ironica

  • by trb ( 8509 ) on Tuesday June 10, 2008 @10:58AM (#23727623)
    If you take a photo of the sun and look at the image on this monitor, you can blind yourself.
  • Apple could easily buy such panels and market them at that price.

    I know I'd save up for and buy one, as I believe in quality over quantity, and it would be a good way to reaffirm to photography professionals that Apple still cares about them.
  • Having gotten into digital photography and high dynamic range imaging lately, I can see how this thing would be great for photographers / artists. But how do you drive it? Does your average video card have the capability to drive this? I thought most consumer hardware was pretty much limited to 24-bit colour. (Or what they call 32-bit but is really 24-bit plus an 8-bit alpha channel.)
    TZ
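
On the parent's question of what "30-bit" means at the framebuffer level: it is normally 10 bits per channel packed into a 32-bit word (as in OpenGL's GL_RGB10_A2 or Direct3D's R10G10B10A2 formats). A sketch of one such packing in Python; the channel order here is illustrative only, since each real format defines its own bit layout:

```python
def pack_rgb10(r: int, g: int, b: int) -> int:
    """Pack three 10-bit channel values (0..1023) into one word.

    Channel order (r high, b low) is illustrative; real formats
    such as GL_RGB10_A2 define their own layouts.
    """
    for channel in (r, g, b):
        if not 0 <= channel <= 1023:
            raise ValueError("channel value out of 10-bit range")
    return (r << 20) | (g << 10) | b

def unpack_rgb10(word: int) -> tuple:
    """Recover the three 10-bit channels from a packed word."""
    return ((word >> 20) & 0x3FF, (word >> 10) & 0x3FF, word & 0x3FF)
```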
  • I attend SIGGRAPH now and then and see some of these futuristic displays. We still have a bit to go. My criterion for a "perfect display" is that it should be indistinguishable from looking out of a window at the same scene. Some of the devices at SIGGRAPH are a lot better than current technology. And all I can say is that it's almost like looking at magic, to paraphrase Arthur Clarke.

    The biggest defect is contrast, or dynamic range of intensity. When you view the banks of TVs at Best Buy etc., it's the ones t
  • by Mister Whirly ( 964219 ) on Tuesday June 10, 2008 @03:04PM (#23733913) Homepage
    All this discussion about LCDs, CRTs and colors, bah! I use LSD and I can see billions of colors nobody else can....
  • Video card? (Score:3, Interesting)

    by jgoemat ( 565882 ) on Tuesday June 10, 2008 @03:25PM (#23734507)
    Are there any video cards that support the extra colors, or is there something else where the display can more accurately represent the color based on color space without actually changing the bits per channel sent from the video card? I think Matrox had a 30-bit video card at one point...

