Displays | Graphics | Software

A Billion-Color Display (206 comments)

The Future of Things covered the introduction last month of HP's DreamColor display, with 30 bits/pixel, developed in conjunction with DreamWorks Animation. The display is aimed at the video production, animation, and graphic arts industries. HP promises blacker blacks and whiter whites — though TFoT quotes one source who notes that if they deliver this, it will be due to the back-lighting and not to the number of bits/pixel. No word on the size of the displays that will actually be delivered, or on the price.
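
For reference, the arithmetic behind the headline: 30 bits per pixel allows 2^30 = 1,073,741,824 distinct values, versus 2^24 = 16,777,216 on a conventional 24-bit display. A minimal check in Python, assuming the 30 bits are split as 10 per channel (a detail the article does not spell out):

    # 10 bits per channel x 3 channels = 30 bits per pixel (assumed split)
    print(2 ** 30)  # 1073741824 -> "a billion colors"
    print(2 ** 24)  # 16777216   -> ~16.8 million colors at 24 bits per pixel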


Comments Filter:
  • Re:To what end? (Score:1, Interesting)

    by Anonymous Coward on Saturday May 10, 2008 @05:46PM (#23364262)
    Ummm, most medical imaging modalities digitize into 12 or 16 grey-scale bits. 30-bit RGB would get a whole lot closer than 24 to rendering w/o introducing down-sampling artifacts... 'course, if you want your MRI for neuro-surgical planning done cheaper, we can put the image on an 8-bit color-mapped display (say 10 grey levels, with 245 colors reserved for the web browser) or even 565 RGB. The only people I know of for whom eye strain is worse than it is for graphic artists are radiologists -- thousands and thousands of images a day, and it really matters if you end up missing something due to fatigue.
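
    A minimal sketch of the down-sampling issue described above, using a synthetic 12-bit grey ramp rather than real modality data (Python, purely illustrative): mapping 12 bits to 8 collapses 16 source levels into every display level, while 10 bits collapses only 4.

        import numpy as np

        ramp12 = np.arange(4096, dtype=np.uint16)         # synthetic 12-bit grey ramp
        to8  = (ramp12 >> 4).astype(np.uint8)             # 12 -> 8 bits
        to10 = (ramp12 >> 2)                              # 12 -> 10 bits
        print(len(np.unique(to8)), len(np.unique(to10)))  # 256 vs 1024 distinct display levels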
  • Re:To what end? (Score:3, Interesting)

    by moosesocks ( 264553 ) on Saturday May 10, 2008 @06:13PM (#23364452) Homepage
    Although today's monitors are fairly good at color reproduction, they could easily benefit from extra dynamic range, which LCDs have never been particularly good at. The article lacks technical depth, but it can be inferred that the extra 6 bits will be used as a kind of alpha channel to adjust the brightness of each pixel, which, if it works, should comfortably solve the dynamic range problem once and for all (a rough sketch of that idea is at the end of this comment).

    Similarly, in the visual arts industry, it is absolutely necessary for an image on the screen to look as close as possible to the final product in print or on film. It is also important for these colors to be consistent between systems, especially when multiple artists are working on the same project.

    It might be a niche market, but if HP is able to improve on the status quo, it should be able to sell more than a few. The hint that these improvements will be inexpensive to implement simply means everyday users benefit as well.

    Also, to get a sense of how much room screens still have to improve, take a look at the print in a phone book or the financial pages of a newspaper, then compare that to the smallest font you can comfortably read on your monitor.

    Even for boring business applications, there are many benefits to be had from higher-resolution displays.
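
    A rough sketch of the inference above, assuming (purely for illustration) that a pixel is stored as 8-bit RGB plus a 6-bit per-pixel brightness multiplier; this is the parent's speculation about the extra bits, not HP's documented design, and the peak luminance figure is made up:

        def decode(r8, g8, b8, bright6, peak_nits=1000.0):
            """Toy decode: 8-bit RGB scaled by a 6-bit per-pixel brightness factor."""
            scale = (bright6 / 63.0) * (peak_nits / 255.0)
            return (r8 * scale, g8 * scale, b8 * scale)   # linear light, in nits

        print(decode(255, 255, 255, 63))  # full white at full panel brightness
        print(decode(255, 255, 255, 1))   # same chromaticity at ~1/63 of the luminance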
  • Re:To what end? (Score:3, Interesting)

    by RalphBNumbers ( 655475 ) on Saturday May 10, 2008 @06:47PM (#23364688)
    Afaik, the fact that a 24bpp display can't reproduce all visible colors has more to do with the display's pixels being made up of 3 monochromatic sub-pixels than with there being only 8 bits of information for each of those sub-pixels. Just adding 2 extra bits to each of those 3 channels isn't going to do much in terms of spectrum coverage, iirc (a small illustration is at the end of this comment).

    I'd actually be interested in seeing research into displays that didn't use distinct pixels at all, and instead went with something like a Bayer pattern composed of monochromatic elements of more than 3 colors. The advantages of easy sub-pixel rendering and simple 1:1 display of computed pixels become less relevant with the high-DPI displays we can make these days, imho. It would be a good idea to look at more exotic layouts to make use of increasing pixel densities.
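
    A small illustration of the gamut point, using the standard sRGB-to-XYZ matrix: the most saturated red a display can produce has the same chromaticity whether it is coded as 255/255 or 1023/1023, so extra bits add finer steps inside the gamut triangle rather than widening it.

        # Linear RGB (0..1) -> CIE xy chromaticity, using the sRGB/D65 primaries
        def xy(r, g, b):
            X = 0.4124 * r + 0.3576 * g + 0.1805 * b
            Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
            Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
            s = X + Y + Z
            return (X / s, Y / s)

        print(xy(255 / 255, 0, 0))    # 8-bit "reddest red"  -> approx (0.64, 0.33)
        print(xy(1023 / 1023, 0, 0))  # 10-bit "reddest red" -> the very same point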
  • by Animaether ( 411575 ) on Saturday May 10, 2008 @06:53PM (#23364734) Journal
    Displays can already do a much higher DPI - some handhelds with 3" screens can do 800x600. That's 2.4" along the length, for 800 dots / 2.4" ≈ 333 DPI.

    However, imagine a full-size 17" widescreen (16:10) at 300 DPI. A 17" 16:10 panel is about 14.4" wide by 9" high, and 14.4 x 300 = 4320 while 9 x 300 = 2700. A 4320x2700 display? Crikey. I'm sure we'll get there eventually, but at the rate resolutions are currently improving, not for some time outside of high-end displays.
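
    The arithmetic above, generalized into a quick Python sketch: given a diagonal, an aspect ratio, and a target DPI, the physical size and pixel dimensions fall out directly.

        import math

        def width_height(diagonal_in, aspect_w, aspect_h):
            d = math.hypot(aspect_w, aspect_h)
            return diagonal_in * aspect_w / d, diagonal_in * aspect_h / d

        w, h = width_height(3, 4, 3)           # 3" 4:3 handheld -> 2.4" x 1.8"
        print(800 / w)                         # 800 px across 2.4" -> ~333 DPI

        w, h = width_height(17, 16, 10)        # 17" 16:10 panel -> ~14.4" x ~9.0"
        print(round(w * 300), round(h * 300))  # at 300 DPI: roughly 4320 x 2700 pixels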
  • by Malekin ( 1079147 ) on Saturday May 10, 2008 @08:29PM (#23365430)
    Human brightness sensitivity is not even close to constant across the total range of brightness we can perceive. It varies widely over the range of colours we can see, and from person to person. Scene composition affects it, too: the shape of an object in relation to nearby objects changes our perception of its brightness. You have to consider lateral inhibition, limited integration capability, the optical modulation function of the eye, and orientation and temporal filtering, not to mention the various forms of noise that affect all parts of the vision system. The human vision system is not a camera and trying to model it as one is extremely naïve.

    With all that warning out of the way: the greyscale just-noticeable-difference range for a monitor peaking at about 600 cd/m^2 works out to roughly 720 steps.

    For 1024 steps, the monitor would need a peak intensity of around 4000 cd/m^2 to keep each greyscale step matched to the statistically average human's just-noticeable difference.
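
    A deliberately crude version of that step counting, assuming a constant ~1% Weber fraction and an arbitrary 0.5 cd/m^2 black level (exactly the kind of simplification warned about above, so it only lands in the same ballpark as the quoted figures; the real JND curve, e.g. the DICOM greyscale display function, varies with luminance):

        import math

        def weber_steps(black_nits, peak_nits, fraction=0.01):
            """Count multiplicative ~1% luminance steps from black level to peak."""
            return math.log(peak_nits / black_nits) / math.log(1 + fraction)

        print(round(weber_steps(0.5, 600)))   # ~713 steps, the same order as the quoted 720
        print(round(weber_steps(0.5, 4000)))  # ~903 steps; a constant fraction undershoots up here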
  • by Jeff DeMaagd ( 2015 ) on Saturday May 10, 2008 @09:38PM (#23365804) Homepage Journal
    There are different forms of antialiasing. The ClearType used by Windows really gets me. Yes, it does make the shapes smoother, but it also turns the edges into rainbows. Instead of the right edge of a shape being one consistent color and the left edge being another, either edge can end up tinted any of three colors. It is, however, the sharpest form of antialiasing for text.
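
    A toy illustration of where the fringes come from, assuming an LCD whose sub-pixels run R, G, B from left to right (a common but not universal layout): render the coverage of an edge at 3x horizontal resolution and pack each triple of sub-pixel values into one RGB pixel, and the pixel straddling the edge comes out tinted rather than grey.

        # Coverage of a black-on-white vertical edge, sampled at 3x horizontal resolution
        # (1.0 = white background, 0.0 = black glyph); the edge falls mid-pixel.
        coverage = [1.0, 1.0, 1.0,   1.0, 1.0, 0.0,   0.0, 0.0, 0.0]

        pixels = [tuple(int(255 * c) for c in coverage[i:i + 3])
                  for i in range(0, len(coverage), 3)]
        print(pixels)  # [(255, 255, 255), (255, 255, 0), (0, 0, 0)]: the middle pixel is yellowish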
  • by blincoln ( 592401 ) on Saturday May 10, 2008 @10:17PM (#23366006) Homepage Journal
    "However, a larger bitdepth doesn't do anything for color space."

    Actually, it does.

    A higher bit depth means that the maximum contrast between channels is greater, *because* you have more resolution (or granularity, if you like) in each channel.

    For a very obvious example of this, take a 24-bit RGB colour image and downconvert it to 16-bit. The difference between 8 bits per channel and 5 (or 6 for green, depending on the type of 16-bit encoding) is quite dramatic. It's why older 3D games tend to look washed out by comparison to newer ones.
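
    A quick version of that experiment in Python: quantize 8-bit channels down to RGB565 and expand them back, and a smooth grey ramp turns into visible bands (and even picks up a slight green cast, since green keeps 64 levels while red and blue keep only 32).

        def to_rgb565_and_back(r, g, b):
            """Quantize 8-bit channels to 5/6/5 bits, then expand back to 8-bit."""
            r5, g6, b5 = r >> 3, g >> 2, b >> 3
            # expand by replicating the high bits into the low bits
            return (r5 << 3) | (r5 >> 2), (g6 << 2) | (g6 >> 4), (b5 << 3) | (b5 >> 2)

        for v in range(0, 17, 4):  # a short slice of a grey ramp
            print((v, v, v), '->', to_rgb565_and_back(v, v, v))
        # (0,0,0) and (4,4,4) collapse in red and blue but not in green,
        # so neighbouring greys merge into bands with a faint green tint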
  • side bar topic: (Score:4, Interesting)

    while a billion colors is obviously ridiculous, there are people who can see 100x more colors than an average person

    scientists have recently identified a very small, very rare population of women who see in 4 colors, for a total of around 100 million colors

    most humans see with 3 cone types (roughly red, green, and blue), for a total of about 1 million colors. a tetrachromat has an extra cone type between red and green, around orange. it's only women because the mutation requires two X chromosomes to work

    read all about it: they describe a woman who can look into a river and make out silting and depth levels a normal human can't. x-men mutant indeed:

    http://www.post-gazette.com/pg/06256/721190-114.stm [post-gazette.com]

    http://en.wikipedia.org/wiki/Tetrachromacy [wikipedia.org]
