A Billion-Color Display 206

The Future of Things covered last month's introduction of HP's DreamColor display, which offers 30 bits/pixel and was developed in conjunction with DreamWorks Animation. The display is aimed at the video production, animation, and graphic arts industries. HP promises blacker blacks and whiter whites — though TFoT quotes one source who notes that if they deliver this, it will be due to the backlighting and not to the number of bits/pixel. No word on the sizes of the displays that will actually be delivered, or on the price.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • To what end? (Score:2, Insightful)

    by Eudial ( 590661 )
    Is it really possible to improve screens further, in a way that's visible to the naked eye? It's the same with high-end audio systems: I sure can't tell the difference between a mid-priced audio system and a bleeding-edge $50,000 system.

    My point is that 24 bpp ought to be enough for anyone.
    • Re:To what end? (Score:5, Insightful)

      by gEvil (beta) ( 945888 ) on Saturday May 10, 2008 @04:41PM (#23364220)
      And yet that 24bpp can't reproduce the full range of colors that can be printed on a piece of paper. And the ink on that piece of paper can't reproduce the full range of colors visible to the naked eye. Yes, there's room for a whole lot of improvement. That's not to discount the progress we've already made (24bpp is pretty impressive), but there's still a long way to go.
      • Re:To what end? (Score:5, Informative)

        by moosesocks ( 264553 ) on Saturday May 10, 2008 @05:22PM (#23364518) Homepage
        Modern monitors use an additive method of color blending, while printers (by their very nature) must use subtractive blending.

        The range of colors that can be reproduced by a 24-bit RGB device is always going to be different from the range of colors that a 24-bit CMY device can reproduce.

        By the same note, a 24-bit RGB display can produce colors that the CMY printer cannot.

        One color space isn't bigger than the other; they're simply different. Once you increase the bit-depth far enough to encompass the full spectrum of visible light for both color spaces, the distinction can finally be dropped.
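
        For illustration, a minimal Python sketch of the idealized relationship between the two models; real inks and phosphors don't behave this neatly, so treat it as a toy model only:

          # Toy model: each subtractive primary (C, M, Y) is the complement of an
          # additive primary (R, G, B). Values are normalized to the 0.0-1.0 range.
          def rgb_to_cmy(r, g, b):
              return (1.0 - r, 1.0 - g, 1.0 - b)

          def cmy_to_rgb(c, m, y):
              return (1.0 - c, 1.0 - m, 1.0 - y)

          # Pure red light corresponds to "no cyan, full magenta, full yellow" ink.
          print(rgb_to_cmy(1.0, 0.0, 0.0))   # (0.0, 1.0, 1.0)
          # White (all light on) corresponds to no ink at all.
          print(rgb_to_cmy(1.0, 1.0, 1.0))   # (0.0, 0.0, 0.0)
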
        • by Animaether ( 411575 ) on Saturday May 10, 2008 @05:43PM (#23364648) Journal
          They're absolutely right that CMYK does not encompass RGB. The two gamuts overlap for the most part, with small areas that don't overlap (plus one larger area in the deep, vivid cyans).

          However, a larger bit depth doesn't do anything for the color space. It simply determines the granularity of that color space. With 16 bits you get 65,536 individual colors within the RGB gamut (typically with slightly higher granularity in the green channel), with 24 bits you get 16,777,216 possible colors within the RGB gamut, and with 30 bits (10 bits per channel; it's not new, really) you get 1,073,741,824 possible colors... but still within the RGB gamut (of the device at hand).
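
          A quick Python sketch of that arithmetic, for illustration only (the 16-bit case is simplified by ignoring the 5-6-5 channel split):

            # More bits per channel only shrinks the step between adjacent code
            # values; the endpoints - the device's own primaries, i.e. its gamut -
            # stay exactly where they were.
            for bits_per_channel in (5, 8, 10):      # ~16-bit, 24-bit, 30-bit color
                levels = 2 ** bits_per_channel       # code values per channel
                total = levels ** 3                  # rough total color count
                step = 1.0 / (levels - 1)            # spacing on a 0.0-1.0 scale
                print(f"{bits_per_channel:2d} bits/channel: {levels:4d} levels, "
                      f"~{total:,} colors, step {step:.5f} of full scale")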

          An HDR display (whether by using a very bright backlight, more localized control of LED backlights, etc.) also doesn't change the gamut of that device - it simply allows for much brighter values within it.

          Now, if they were to make an LCD panel that aside from the R,G,B pixel elements also had C M Y pixel elements, then you most certainly could increase the gamut. It would also be much more difficult to switch to than a simple bitdepth change.
          • Now, if they were to make an LCD panel that aside from the R,G,B pixel elements also had C M Y pixel elements, then you most certainly could increase the gamut. It would also be much more difficult to switch to than a simple bitdepth change.

            That would make no sense on an LCD display, given that CMY is a subtractive color model, whilst color is achieved on LCDs via additive blending.

            Although adding another "primary" color should increase your gamut, CMY might not be the best choice of colors to use in that case.

            Think of RGB mixing as analogous to shining three different-colored flashlights at a white target, the complete overlap [wikipedia.org] of which should also be white.

            CMY color mixing is analogous to taking three different colored sheets of glass, and l

          • With OLED displays, adding more colours should become a lot easier.
            • I would think it would be easiest with LCDs, since all you would need to change is the dye/filter layer that goes over the pixels. Probably would work best with CCFL backlights, since they have a fuller spectrum.
          • Re: (Score:2, Interesting)

            by blincoln ( 592401 )
            However, a larger bitdepth doesn't do anything for color space.

            Actually, it does.

            A higher bit depth means that the maximum contrast between channels is greater, *because* you have more resolution (or granularity, if you like) in each channel.

            For a very obvious example of this, take a 24-bit RGB colour image and downconvert it to 16-bit. The difference between 8 bits per channel and 5 (or 6 for green, depending on the type of 16-bit encoding) is quite dramatic. It's why older 3D games tend to look washed out
            • Hello, ignorant user who modded me "overrated". Maybe you should research the topic before you assume I'm incorrect.

              Higher bit depth increases the maximum difference in value between channels. This is simple math. This increases the colour space because it means that the brightest value for each channel can be set higher on the display system without making *everything* appear too bright/saturated.

              Think of it this way - if I have only four bits per channel, then I only get 16 steps in between black and full
        • Increasing the bit-depth will not be enough. The human eye sees way more than 3 colors, most are mapped to a 3-color space in the brain but a couple stand out. So we need a color-space with more than RGB to get closer to human vision. Most important are visible infrared and visible ultraviolet. These are needed for low light vision and glow effects, and without them you will never be able to see what is going on in Doom 4. Also to fix low light the bit-space should be made logarithmic, you can't see the dif
      • It takes HP 30 bits to show color? Ha! My old Apple II could do it in just 8 bits. HP has a lot of catching up to do.
        • It takes HP 30 bits to show color? Ha! My old Apple II could do it in just 8 bits. HP has a lot of catching up to do.

          I'm not impressed. My old IBM could do color with just 4 bits.
      • Re: (Score:3, Interesting)

        Afaik, the fact that a 24bpp display can't reproduce all visible colors has more to do with the display's pixels being made up of 3 monochromatic sub-pixels than with there being 8 bits of information for each of those sub-pixels. Just adding 2 extra bits for each of those 3 colors isn't going to do much in terms of spectrum coverage, iirc.

        I'd actually be interested in seeing research into displays that didn't use distinct pixels at all, and instead went with something like a bayer patter
        • by evanbd ( 210358 )

          Your eye only has 3 color sensors. Therefore you can reproduce any spectrum in a way that your eye will see as equivalent with only 3 color elements. That said, RGB doesn't do a perfect job of this -- there are some colors at the edges of the color space that your eye can see that RGB can't produce.

          Now, it's entirely possible that the easiest way to produce the full spectrum when it comes to actually building a display is with more than 3 different color elements, but 3 is sufficient if they're the righ

          • Re: (Score:3, Informative)

            You're almost right... which is to say, wrong.
            There are 3 types of cone receptors, and 3 numbers is sufficient to describe any color the human eye can perceive, but those 3 numbers can not represent actual physical colors.

            Your cones do not just detect one monochromatic color; each type has its own response curve across varying frequencies, and they're not even nice simple bell curves (one even has two peaks). To represent the entire visible color space with 3 numbers, as the CIE 1931 XYZ color space does [wikipedia.org],
            • FWIW, some women have 4 different receptors. One of the receptor pigments is encoded on the X chromosome so women can have two different types. See the Wikipedia [wikipedia.org] article as usual.
      • Actually, the color gamut for CMYK is generally smaller than RGB. A CMYK print out on its own looks perfectly fine, but place it beside a computer monitor or an actual photo and suddenly the colors look a bit muted and in some cases slightly off. Depending on the nature of a particular project there sometimes is a lot of work involved in getting CMYK-based colors to look right.

        One limitation with current displays in reproducing true to life color is that the image is being reproduced by a light source. And
    • Re: (Score:3, Informative)

      by smallfries ( 601545 )
      There are two main ways to improve over a standard system, and the summary sounds as if they've done both. The contrast range on a normal screen is on the order of 500:1. On a bright sunny day outdoors, our eyes pick up contrast ratios that are thousands of times larger. The claim about blacker blacks and whiter whites will be a reference to High Dynamic Range.

      Once you increase the range of colours that you are going to display it means the gaps between distinct colours become larger and so more bits are required
      • by ceoyoyo ( 59147 )
        Not in 30 bits they haven't. That's only 10 bits per channel (compared to 8 in a regular screen). There's a reason you don't see 10 bit floating point numbers much.
        • That's not true (that you don't see small floating point - I assume that you're correct about these guys). If you read some of the literature on tone mapping, they use small floating-point formats. The reason is that although the eye is good at picking out gradations in colour over small ranges, when it is over a larger range the smaller gradations are ignored. I.e., you get the same degradation in precision at larger magnitudes that floating point exhibits.

          The best working guesses for this behaviour are
    • by nobodyman ( 90587 ) on Saturday May 10, 2008 @05:10PM (#23364416) Homepage

      Is it really possible to improve screens further, in a way that's visible to the naked eye?
      I think so. As a quick example of why I think this, temporarily turn off anti-aliasing in your OS. The characters on the screen should look pretty crappy relative to a book or an illustration. So, I think we have a ways to go. I think the same is true for color depth; it's just hard to recognize because we have gotten used to 8 bits per channel.

      Most new displays have a resolution of 96 dpi, whereas low-end printers can easily pull off 300 dpi. The same goes for color depth: black-and-white screen images at 8 bits/pixel simply can't match the range of black-and-white print and film.

      When you think about it, techniques such as anti-aliasing are really just hacks to work around the limitations of today's monitors. If monitors could pull off 300dpi, you wouldn't need anti-aliasing.
      • Displays can already do a much higher DPI - some handhelds with 3" screens can do 800x600. That's about 2.4" along the width, for 800 dots / 2.4" ≈ 333 DPI.

        However, imagine a full-size 17" widescreen (16:10) at 300 DPI. A 17" diagonal is about 14.4" wide by 9" high. 14.4*300 = 4320, 9*300 = 2700. A 4320x2700 display? Crikey. I'm sure we'll get there eventually, but at the rate resolutions are currently improving - not for some time, aside from high-end displays.
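
        A rough Python sketch of that arithmetic, for anyone who wants to plug in other sizes (bezels ignored, numbers rounded):

          import math

          def panel_pixels(diagonal_in, aspect_w, aspect_h, dpi):
              # Physical width/height follow from the diagonal and the aspect ratio.
              diag_units = math.hypot(aspect_w, aspect_h)
              width_in = diagonal_in * aspect_w / diag_units
              height_in = diagonal_in * aspect_h / diag_units
              return round(width_in * dpi), round(height_in * dpi)

          print(panel_pixels(3, 4, 3, 333))     # small 4:3 handheld -> roughly 800x600
          print(panel_pixels(17, 16, 10, 300))  # 17" 16:10 at 300 DPI -> roughly 4320x2700
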
      • Re: (Score:3, Interesting)

        by Jeff DeMaagd ( 2015 )
        There are different forms of antialiasing. The ClearType used by Windows really gets me. Yes, it does make the shapes smoother, but it also turns the edges into rainbows. Instead of the right edge of a shape being one consistent color and the left edge being another consistent color, either edge could be any of three colors anywhere. But it is the sharpest form of antialiasing for text.
        • by phasm42 ( 588479 ) on Saturday May 10, 2008 @09:01PM (#23365932)

          The ClearType used by Windows really gets me. Yes, it does make the shapes smoother, but it also turns the edges into rainbows.
          This may be due to your monitor not being specified correctly. IIRC, there are two main types of LCD panels, RGB and BGR (different sub-pixel orders), and for ClearType to work correctly, it has to know which one you're using. I've noticed that if someone does a non-lossy screen capture of some ClearType text on a computer set up for the opposite sub-pixel color order from what I use, the text looks crappy and has that rainbow effect.
          • by cnettel ( 836611 )

            The ClearType used by Windows really gets me. Yes, it does make the shapes smoother, but it also turns the edges into rainbows.

            This may be due to your monitor not being specified correctly. IIRC, there are two main types of LCD panels, RGB and BGR (different sub-pixel orders), and for ClearType to work correctly, it has to know which one you're using. I've noticed that if someone does a non-lossy screen capture of some ClearType text on a computer set up for the opposite sub-pixel color order from what I use, the text looks crappy and has that rainbow effect.

            Other possible causes are screen pivoting (which rotates the sub-pixel order), an analog connection, and/or aggressive contrast enhancement in the GPU or monitor.
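
            A very stripped-down Python sketch of why the panel's sub-pixel order matters; real ClearType also filters across neighboring sub-pixels, which this ignores. A glyph edge is sampled at 3x horizontal resolution and each group of three samples drives one pixel's sub-pixels in panel order:

              def pack_pixels(samples, order="RGB"):
                  pixels = []
                  for i in range(0, len(samples) - 2, 3):
                      left, mid, right = samples[i], samples[i + 1], samples[i + 2]
                      if order == "RGB":     # leftmost sub-pixel is red
                          pixels.append((left, mid, right))
                      else:                  # BGR panel: leftmost sub-pixel is blue
                          pixels.append((right, mid, left))
                  return pixels

              # Coverage of a hard vertical glyph edge: 0 = background, 255 = ink.
              edge = [0, 0, 0, 0, 128, 255, 255, 255, 255]
              print(pack_pixels(edge, "RGB"))  # [(0, 0, 0), (0, 128, 255), (255, 255, 255)]
              print(pack_pixels(edge, "BGR"))  # [(0, 0, 0), (255, 128, 0), (255, 255, 255)]

            Render output intended for one order on a panel with the other and the edge fringe flips from bluish to orange - the "rainbow" effect described above.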

    • Re:To what end? (Score:5, Informative)

      by Divebus ( 860563 ) on Saturday May 10, 2008 @05:11PM (#23364428)

      Is it really possible to improve screens further, in a way that's visible to the naked eye?

      Just as in audio where quantizing becomes a problem only in very low level passages, fine greyscale, especially in the blackest image areas, will benefit from more bits/pixel.

      For example, an ordinary CD (16 bits) can sound rather gritty on quiet recordings such as the low level passages of classical music. That's because you're probably only using two or three bits of sample depth down there. To combat the issue, 24 bit audio will elevate the sample depth everywhere but will show itself best at low levels. Dither (essentially noise) is used to randomize and mask the problem, but that's a cheat.

      In video, fine greyscale performance comes from higher bit depth per pixel and is visible throughout the entire luminance range. The issue shows itself on flat (un-textured) areas with minute lighting changes across the surface, like a softly lit painted wall. You'll see annular rings on the surface as the bit values step through their range. Again, dither may be used to randomize the quantized transitions.

      24 bit video is really 8 bits per primary color - so it's not that good to start with. In professional applications, it's not unusual to work with 10-bit or even 16-bit [per channel] images, mostly to be friendlier to post production.

      Fortunately, analog humans are fairly blind to minute color changes. Unfortunately, our system of digital video happily shows you everything wrong with it.
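
      A back-of-the-envelope Python sketch of the audio side of that point (the exact numbers are illustrative): a full-scale 16-bit signal spans 65,536 code values, but a quiet passage only swings across a small fraction of them.

        import math

        def effective_levels(bits, peak_dbfs):
            full_scale = 2 ** bits
            amplitude = 10 ** (peak_dbfs / 20.0)   # linear amplitude, 1.0 = full scale
            levels = full_scale * amplitude        # code values actually exercised
            return levels, math.log2(max(levels, 1.0))

        for peak in (0, -40, -60):
            levels, eff_bits = effective_levels(16, peak)
            print(f"16-bit audio, peak {peak:4d} dBFS: ~{levels:7.0f} levels "
                  f"(~{eff_bits:.1f} effective bits)")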

    • Re: (Score:3, Interesting)

      by moosesocks ( 264553 )
      Although today's monitors are fairly good at color reproduction, they could easily benefit from extra dynamic range, which LCDs have never been particularly good at. Although the article lacks technical depth, it can be inferred that the extra 6 bits will be used as an alpha channel, to adjust the brightness of each pixel, which should comfortably solve the dynamic range problem once and for all if it works.

      Similarly, in the visual arts industry, it is absolutely necessary for an image on the screen to loo
    • Re: (Score:3, Insightful)

      by davolfman ( 1245316 )
      I'd just be happy if the manufacturers told me the panel technology in the specs so I could avoid 6-bit TN displays.

      As it is, 10 bit displays are nothing new. Photographers have been swearing by them for years as they allow for the response curve of the display to be corrected without dipping below 256 displayable tones per channel. Of course the real solution is just to get someone to manufacture CRTs again. For this kind of market an analog display technology has a serious advantage in that there ar
      • If you want a good CRT, you can buy a GDM-W900 or W900F on eBay or Craigslist - they're absolutely professional-grade displays and the image is a wonder to behold. I've plugged my laptop into mine's second input before; the difference in color saturation is stunning.

        Unfortunately, we all have the same problem regardless of our monitor technology: it can either have black be truly black and get its full dynamic range, or we can work in a room with a normal level of illumination.
    • by DAldredge ( 2353 )
      How many shades of gray can that 24 bit display show? Hint - the answer is 254.
      • by dabraun ( 626287 )
        You are assuming that it can actually display "black" and "white" and I assure you it can not.
    • by ceoyoyo ( 59147 )
      Not really. I've seen some 12-bit grayscale monitors. They used to buy them for the radiologists. Most of the radiologists now use regular (good quality, but still 24-bpp) LCDs. The contrast ratio on a monitor is MUCH more important. Contrast ratio expands the colour gamut while increased bit depth just lets you move through your existing gamut in smaller steps.
    • by mikael ( 484 )
      Higher dot pitches, more bits per pixel sample (even floating-point displays), higher contrast between black and white, larger framebuffer resolutions, larger monitor sizes, 3D focus-to-infinity/stereo views without the need for headsets.

      Maybe even laptops with twin LCD displays, that could be folded outwards, along with a keyboard with a foldout numeric keypad.
    • Couple of things (Score:3, Informative)

      by Sycraft-fu ( 314770 )
      The first is to improve grey scale. Your eyes are extremely sensitive to changes in luminance. As such we can see grey scale gradients with great precision. 256 levels (which is what 8 bits per channel gets you) just isn't enough. There are already grayscale medical displays out there that do 1024 greys (10-bit).

      Then of course there's the problem of wider gamut and wider dynamic range displays. Right now most displays show a fairly small subset of the total amount of colours humans can perceive, and also have
    • by colmore ( 56499 )
      Different people have different needs.

      I'm assuming that if DreamWorks is making this request, it is because their guys have a use for it. I'm at least curious.

      But no,

      "I have no need for this! It shouldn't exist!"

      that makes more sense.
    • Dead wrong.

      Load Gimp, Photoshop or any other program.

      Grab the gradient tool.

      Select the colors [0,0,0] and [10,10,10]. Now create a canvas as wide as your screen and create a gradient.

      It'll look like you have about 10 giant blocks. Granted, it still faces the same problem of delivering to 24 bpp displays for DVD, but most LCDs have problems even displaying 24 bpp correctly; 30 bits will give headroom to ensure that, at the very least, the 24 bpp is rendered correctly.
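
      For anyone without an image editor handy, roughly the same experiment as a Python sketch (a synthetic 1920-pixel-wide row stands in for the canvas):

        # A horizontal gradient from value 0 to value 10, quantized to 8-bit integers.
        width = 1920
        row = [round(10 * x / (width - 1)) for x in range(width)]

        distinct = sorted(set(row))
        print(f"{len(distinct)} distinct values across {width} pixels "
              f"-> bands roughly {width // len(distinct)} pixels wide")
        # The same 0..10/255 range on a 10-bit-per-channel display maps to about
        # 41 code values, so each band shrinks to well under 50 pixels.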

      Also the film and VFX company LARGELY works in 36
    • Is it possible, IS IT POSSIBLE?
      You're actually asking is it possible, really?

      This whole damned video display industry, from TV to computers to god knows what has gone 20 years backwards in display quality, ever since the LCD 'won' and the average consumer decided they preferred convenience and desk space over picture quality.

      At the slow rate we're going at now, we might just have black levels and the colour reproduction of a basic, cheap CRT in about 5 years, maybe in 10 years time we might finally have a f
  • Well, tech such as this will bring our holodeck dreams just that bit closer.

    Also I can see where tech such as this can be implemented in the medical field, as a for-instance.
  • I was hoping for something like scRGB support. I've always wanted two things out of displays: higher DPI, and higher gamut. Does this deliver either?

    Chris Chinnock, President of the research firm "Insight Media", is one of those who are skeptical about HP's claims. He says that while the 30-bit resolution will allow for better gradation between the color levels, the technology will not be able to increase the color gamut of a display.

    Guess not. Oh well.

  • Yes, but... (Score:4, Funny)

    by Bradmont ( 513167 ) on Saturday May 10, 2008 @04:57PM (#23364324) Homepage
    how am I supposed to see how good this display is if they don't show me a picture of it?
  • ...in which people are shown the a series of images on two of these displays, side by side... with copies of each image in the series being presented on each display, one rendered with a full 30 bits and the other with rendering reduced to 24 bits... and with the 30-bit image being randomly assigned to the left or right.

    I'd like to see whether people can actually identify the 30-bit image at a rate significantly greater than chance... or whether they're just doing it because they can.

    Like the "Eight-transis
    • (I meant to say... yes, I used Preview but I didn't look at it...) ...in which people are shown a series of images on a matched pair of these displays, placed side by side... with copies of each image in the series being presented on each display, one rendered with a full 30 bits and the other with rendering reduced to 24 bits... and with the 30-bit image being randomly assigned to the left or right.

      I'd like to see whether people can actually identify the 30-bit image at a rate significantly greater than ch
    • Re: (Score:3, Insightful)

      by icegreentea ( 974342 )
      Get a 1024-pixel high/wide image, and then make a perfect white-to-black gradient. You should be able to tell the difference between the two. As someone else pointed out, you only have 256 greys, so you end up with each grey forming a 4-pixel band (which is noticeable). The new display will have one grey per pixel... much harder to tell.
      • by evanbd ( 210358 )

        If you dithered that gradient properly it would be much harder to tell. You've essentially created a signal with a period of 4 pixels and overlaid it on top of the smooth gradient as a result of the quantization errors. It's that signal that's easy to see. If you dithered it properly, the noise would be shaped so that it didn't show up in one specific frequency band (i.e. 4 pixels and its harmonics), and it would be much less noticeable. Note that e.g. digital photography processes do this inherently to some de
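
        A minimal Python sketch of that idea, using the 1024-pixel black-to-white ramp from the comment above (plain random noise stands in for a proper dither pattern):

          import random

          random.seed(0)
          width = 1024
          ramp = [x / (width - 1) * 255 for x in range(width)]   # ideal grey values 0..255

          banded = [round(v) for v in ramp]                                  # no dither
          dithered = [round(v + random.uniform(-0.5, 0.5)) for v in ramp]    # 1 LSB of noise

          err_banded = [round(b - v, 2) for b, v in zip(banded, ramp)]
          err_dithered = [round(d - v, 2) for d, v in zip(dithered, ramp)]
          print("error, no dither  :", err_banded[:12])    # regular saw-tooth, ~4 px period
          print("error, with dither:", err_dithered[:12])  # similar magnitude, but noise-like

        The quantization error is the same size either way; dithering just spreads it across all spatial frequencies instead of concentrating it at one visible 4-pixel period.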

  • Normal RGB displays do not span the colorspace the eye can see. Just like a good printer needs more than 3 colors of ink to make good photographs, a good display needs more than red, green and blue dots to span the whole colorspace of the eye. No matter how many bits you put behind each color, you cannot change this fact.

    Brief explanation:
    RGB colors are designed to match the human eye's sensitivity for the three primary colors. Each color cone's spectral sensitivity partly overlaps the others'. The RGB display therefo
    • The receptor spectral overlap is one reason why the ultimate visual interface can only be a direct neural connection. Anyone who's tried psychedelic drugs will have seen colors that don't exist in reality, generated directly in the brain, bypassing the eyes. I also suspect the human brain would adapt well to at least five channels of color, given the existence of tetrachromat humans, and of animals that perceive even higher-dimensional color spaces. Humans only see a tiny portion of the electromagnetic spectrum, but
  • Besides regular light, I want my screen to radiate X-rays, gamma rays and infrared light, and also ordinary radio waves and even more kinds of waves.
    I want it to emit quarks, neutrons and positrons, and perhaps god particles.
    The contrast of today's screens is appalling; I want miniature black holes creating perfect black tones. I wouldn't know how to create perfect white tones though.

    Yes, I am serious!
    • by ceoyoyo ( 59147 )
      You can get a few of those if you just find a really old CRT. The ones that inspired your mother to warn you not to sit too close to the TV.
  • HP promises blacker blacks and whiter whites -- though TFoT quotes one source who notes that if they deliver this, it will be due to the back-lighting and not to the number of bits/pixel.

    Wow, the definition of dynamic range isn't based on the number of bits per pixel? Whodathunk? Then it must also be true that using a double variable instead of a float does not in fact make 3.0000000000000 > pi.

  • I propose a Turing Test for monitors. Have a monitor, and a window opening onto some chosen view, side by side. Through the window one could view a street with cars and people passing by, while on the monitor is a real time video of exactly the same scene. To be fair, maybe the person judging would have his head secured in some kind of harness to prevent head movement. It would be interesting to see when a monitor would pass such a test, where the majority of viewers couldn't tell the difference. Any predic

  • "HP promises blacker blacks and whiter whites - though TFoT quotes one source who notes that if they deliver this, it will be due to the back-lighting and not to the number of bits/pixel."

    Don't monitors use linear DACs? And doesn't this mean more or less linear light level scales? (I'll admit that I don't know much about how LCDs operate.)

  • The human eye can discern around 4.5 million colors. Anything more than that requires instrumentation to detect. You can use it to prove you have a monitor capable of a billion colors, but you'll never see them.
    • Nobody really knows exactly how many colors we can see. The estimates range from around 4 million to over 10 million, and it appears to vary widely between different people.

      I do tend to agree that billions of colors is a waste though.
  • "High Dynamic Range display technology" was presented at SIGGRAPH 2004 by Sunnybrook Technologies [siggraph.org]. If I remember correctly, they used 16 bits of luminance as opposed to the usual 8 per color, and the display combined traditional LCD pixels with LED backing light, which is just what TFA states the HP monitors are now using. Not only did it give a very high contrast ratio (40000:1), but the images it displayed were absolutely stunning to see -- it's the difference between reflected light and transmitted lig

  • while a billion colors is obviously ridiculous, there are people who can see 100x more colors than an average person

    scientists have recently identified a very small, very rare population of women who see in 4 colors, for a total of 100 million colors

    most humans see in 3 colors (red, green, and blue), about 1 million colors in total. a tetrachromat has an extra cone type between red and green, around orange. it's only women because the mutation requires two X chromosomes to work

    read all about it; they describe a woman who can look into a river and make out silting and depth levels a normal human can't. x-men mutant indeed:

    http://www.post-gazette.com/pg/06256/721190-114.stm [post-gazette.com]

    http://en.wikipedia.org/wiki/Tetrachromacy [wikipedia.org]

  • It goes to 111111111111111111111111111111.
  • #495173 +(1653)- [X]
    <microgal> and whiter than white
    <RobinHood> heh
    <Kronovohr> so...you're like #GGGGGG?
  • Black is the new grey.

    Lines are already forming at Apple stores worldwide
    for the revolutionary black operating system
    (which will of course cost more than MacOS, the white edition).

    Steve Jobs's wardrobe all makes sense now. He had it
    all planned from the start. He must be the one.
  • lol I can see it now....
    2011: New apps and games only come in 30 bit. So now we need to upgrade from 24 bit to 30 bit.
    The die-hard 24 bit-ters are holding out with their defunct copies of XP and DirectX9. Even Ubuntu's Zulu Zygote now has full 30 bit support.....

    Hey! There was nothing wrong with VGA was there?
    I mean Duke Nukem v 1.0 was playable.....
