Television Hardware Technology

Is the 4th Yellow Pixel of Sharp Quattron Hype? 511

Posted by timothy
from the hi-fi-jumprope dept.
Nom du Keyboard writes "Sharp Aquos brand televisions are making a big deal about their Quattron technology of adding a 4th yellow pixel to their RGB sets. While you can read a glowing review of it here, the engineer in me is skeptical because of how all the source material for this set is produced in 3-color RGB. I also know how just making a picture brighter and saturating the colors a bit can make it more appealing to many viewers over a more accurate rendition – so much for side-by-side comparisons. And I laugh at how you are supposed to see the advantages of 4-color technology in ads on your 3-color sets at home as you watch their commercials. It sounds more like hype to extract a higher profit margin than the next great advance in home television. So is it real?"
This discussion has been archived. No new comments can be posted.


  • It's like the "120 hz lcd display" stuff. The dvd they use to show you the difference in-store is bogus. If you want REALLY sharp, you'd buy a 600hz plasma. The whole screen changes from one image to the next in 1/600 of a second, with no interpolation (and interpolation algorithms are just "best guesses", so they're no better than an upscaler would be).

  • human eyes... (Score:1, Interesting)

    by Anonymous Coward on Saturday May 08, 2010 @04:47PM (#32141340)

    uhh, human eyes only have RGB cones. therefore, if there is a RGB technology out there that achieves a wide enough gamut, then it should be more than sufficient. if the extra Y pixels achieve a wider gamut then the difference should be clear. otherwise it's just clever marketing garbage.

  • Human retinas (Score:2, Interesting)

    by RightwingNutjob (1302813) on Saturday May 08, 2010 @04:48PM (#32141348)
    Puny human eyeballs only have three kinds of cones, one that peaks in response to red, one to green, and one to blue. While our superior alien overlords may be pleased with this new technology, physiologically, you can't tell the difference.
  • Re:Human retinas (Score:5, Interesting)

    by MRe_nl (306212) on Saturday May 08, 2010 @05:04PM (#32141478)
  • What's wrong? (Score:5, Interesting)

    by rm999 (775449) on Saturday May 08, 2010 @05:06PM (#32141498)

    Representing yellow with a mix of green and red is already a hack. What's wrong with software determining that the color of a pixel is yellow and actually lighting up a yellow light?

    Maybe a yellow light looks more convincing than a red and green light right next to each other. I'd want to see for myself before making blanket judgments.
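One plausible way a set could derive a dedicated yellow subpixel value from an RGB input (the actual Quattron mapping is Sharp's secret; this is only a sketch) is to peel the shared red/green energy off into a fourth channel:

```python
def rgb_to_rgby(r, g, b):
    """Naive RGB -> RGBY split (illustrative only): treat the overlap
    of the red and green channels as 'yellow' and move it into a
    dedicated fourth channel."""
    y = min(r, g)                    # yellow = shared red/green energy
    return r - y, g - y, b, y

print(rgb_to_rgby(255, 200, 0))      # a yellow-orange -> (55, 0, 0, 200)
```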

  • by Psyborgue (699890) on Saturday May 08, 2010 @05:22PM (#32141646) Homepage Journal
    120 Hz could provide some benefit for 24p material since it's evenly divisible.
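The arithmetic behind the divisibility point is simple enough to check (a toy calculation, nothing more):

```python
def whole_repeats(display_hz, source_fps):
    """How many times each source frame is shown, if the refresh rate
    divides evenly; None means uneven pulldown (judder) is required."""
    q, r = divmod(display_hz, source_fps)
    return q if r == 0 else None

print(whole_repeats(120, 24))  # 5 -> each film frame shown exactly 5 times
print(whole_repeats(60, 24))   # None -> needs 3:2 pulldown
```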
  • by jjoelc (1589361) on Saturday May 08, 2010 @05:35PM (#32141736)

    joking aside... some of the newer TVs with LED backlighting actually do something like this... Lighting up the picture with thousands(ish?) of independent LEDs (as opposed to a couple of souped-up fluorescent tubes) means they can selectively dim, or turn off entirely, sections of the backlighting. So when large parts of the scene are dark, large parts of the backlighting are dimmed as well, thus increasing the contrast. It also saves a bit of power, making it easier for them to meet Energy Star standards, etc...
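A minimal sketch of that local-dimming idea: each backlight zone is driven only as bright as the brightest pixel it must show. (The zone count and policy here are assumptions; real sets also smooth between zones to hide blooming.)

```python
import numpy as np

def zone_levels(luma, rows=8, cols=8):
    """Backlight level per zone = the brightest pixel in that zone
    (a simplified local-dimming policy)."""
    h, w = luma.shape
    zh, zw = h // rows, w // cols
    trimmed = luma[:zh * rows, :zw * cols]    # trim to whole zones
    return trimmed.reshape(rows, zh, cols, zw).max(axis=(1, 3))

frame = np.zeros((64, 64))
frame[40, 40] = 1.0             # one bright star in a dark scene
levels = zone_levels(frame)     # only the star's zone stays lit
```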

  • by paulsnx2 (453081) on Saturday May 08, 2010 @05:41PM (#32141782)

    If you look at the color spectrum and its wavelengths, you will notice the following:

    red -- 610 to 760 nm
    gap -- 570 to 610 nm
    green -- 500 to 570 nm
    blue -- 450 to 500 nm

    Now I couldn't find any actual explanation on the net for why Yellow would make a better picture. But if you look at the wavelengths above, you will notice that adding yellow DOES do something. It reduces the gap between Red and Green by half; Yellow is in that gap, and comprises the wavelengths from 570 to 590 nm.

    By this theory, maybe adding Orange (590 to 610 nm) would make an even more realistic picture?
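Taking the poster's band-coverage argument at face value, the tally works out like this (a toy calculation over the listed ranges, nothing physically rigorous):

```python
# Wavelength bands (nm) as listed above.
bands = {"blue": (450, 500), "green": (500, 570),
         "yellow": (570, 590), "orange": (590, 610), "red": (610, 760)}

def coverage_nm(primaries):
    """Total span of visible wavelengths the chosen primaries cover."""
    return sum(hi - lo for name, (lo, hi) in bands.items() if name in primaries)

print(coverage_nm({"red", "green", "blue"}))            # 270 nm
print(coverage_nm({"red", "green", "blue", "yellow"}))  # 290 nm
```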

     

  • Re:RGB (Score:2, Interesting)

    by Dekker3D (989692) on Saturday May 08, 2010 @05:55PM (#32141886)

    actually, xyz. hue, saturation and brightness. that's the best, simple representation we know of.

    to allow broadcast of every single colour, you want hue data. if it's not somewhere on the colour spectrum, it's not a colour, and we only have a limited area on that spectrum that we can actually see. then, you want to tone that up or down to the right brightness and saturation. in theory, you could send data about every single wavelength detected in a pixel, but we only have three receptors anyway. so, hue, saturation and brightness. or a variant of that.

    allocate enough bits for the spectrum and we can use it to encode infrared and ultraviolet too, although the only reason not to just map them to higher or lower frequencies is either communication with aliens/bugs or ultimate customization ("i prefer green as my infrared colour!")
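The hue/saturation/brightness representation the comment describes is a standard transform; Python's standard library even ships it:

```python
import colorsys

# colorsys works on floats in 0..1; hue comes back as a fraction of a turn.
h, s, v = colorsys.rgb_to_hsv(1.0, 1.0, 0.0)   # pure RGB yellow
print(h * 360, s, v)                           # 60.0 degrees, 1.0, 1.0

# and back again (round-trips to within floating-point error):
print(colorsys.hsv_to_rgb(h, s, v))            # ~ (1.0, 1.0, 0.0)
```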

  • Re:RGB (Score:5, Interesting)

    by Twinbee (767046) on Saturday May 08, 2010 @06:03PM (#32141942) Homepage

    Parent is correct. Any colours around green and cyan are usually terribly unsaturated on most monitors. In fact, even in 'real life', it isn't theoretically possible to experience true cyan/aqua because the nearest direct wavelength will stimulate the red eye cone to some extent creating colour pollution.

    There is a trick around this, which can be found by over-saturating the red cone. This weakens it temporarily, and then when shortly afterwards you see anything resembling cyan, it will appear as close to the true qualia as you could ever expect. The "Eclipse of Mars" illusion that follows in the below link demonstrates this for those who are curious:

    http://www.skytopia.com/project/illusion/2illusion.html [skytopia.com]

  • Re:RGB (Score:0, Interesting)

    by Anonymous Coward on Saturday May 08, 2010 @06:10PM (#32141976)

    http://en.wikipedia.org/wiki/YCbCr

    RGB is an approximation. The information is there, the tv processor just has to mathematically figure it out. Having an extra color is always going to make the approximations better.

    If you take a one pixel picture of an orange, your camera takes the analog information and captures it as three "grayscale" values, as seen through the RGB filters- r 0-255, g 0-255, b 0-255. Besides the intensity of each color, information exists in the difference between the three colors. By taking the intensity information along with the differences between them, I can reproduce the color more accurately.

    If someone has a ruler marked in thirds of inches and measures something that's 3/4 of an inch long, their measurement is going to be an approximation. They are going to say that as far as they can tell, the thing is about 2 and a quarter thirds. Now you take your ruler that's marked in quarter inches, you are going to convert 2.25 thirds into quarters and get pretty dead on at 3/4 of an inch.
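The encoding the comment links to is just a fixed linear transform of RGB; here is the full-range BT.601 version (coefficients from the standard):

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range BT.601 RGB -> YCbCr: one luma value plus two
    color-difference values, as in the linked article."""
    y  =  0.299 * r    + 0.587 * g    + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b      + 128
    cr =  0.5 * r      - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr

print(rgb_to_ycbcr(128, 128, 128))  # mid grey -> cb = cr = 128 (no color difference)
```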

  • by Kjella (173770) on Saturday May 08, 2010 @06:11PM (#32141990) Homepage

    The fact that your source material is provided as three quantities (YCbCr, not RGB) doesn't mean four phosphors won't help.

    Well there's two possibilities here:

    1) You have non-RGB information like for example in xvYCC [wikipedia.org], which still uses only three quantities but has a much wider gamut. Then four phosphors could definitely help you reproduce more colors. I guess it also proves the problem isn't that you have three quantities, but the way RGB works which doesn't match the eye.

    2) You have RGB or clipped-for-RGB YCbCr encoding, then you don't have more than RGB no matter what. In this case, it only makes sense to improve the gamut inside the RGB triangle because LCDs are imperfect things.

    Unfortunately BluRays don't support anything outside clipped-for-RGB YCbCr even though some players can play back AVCHD recorded with a xvYCC-capable camera over HDMI 1.3+ to xvYCC-capable TVs. You're talking about a very narrow segment here though.

    This is an old case of the chicken and the egg, if you don't have anything but RGB phosphors then only RGB signals make sense. Sony has been one of those pushing for wide gamut signals, wide gamut players (including the PS3) and wide gamut TVs. And also for increased color depth, which will give you more accurate colors but not more colors. Too bad they didn't come up with this before the BluRay spec was made, but then really how many people have noticed the missing colors (which were also missing on DVDs)?

  • Forgot one: (Score:1, Interesting)

    by Anonymous Coward on Saturday May 08, 2010 @06:26PM (#32142088)

    Adding an extra phosphor can extend your gamut, increase your dynamic range within your gamut, give you finer quantization within the gamut

    and increase the size of your penis.

  • by Anonymous Coward on Saturday May 08, 2010 @06:37PM (#32142144)

    Can't they make LCD with adjustable frequencies?
    CRT monitors could do it easily and give you a multiple of 24 or 25 or whatever the frame-rate of the movie is.

  • by Reziac (43301) * on Saturday May 08, 2010 @06:40PM (#32142164) Homepage Journal

    Back in the 1960s there was an ad that did some trick that caused a black-and-white television to display what the eye perceived as colour. There was an explanation as to how it was achieved but lo these many decades later I have no recollection what it was (nor what the ad was for, either). If I hadn't seen it myself, I'd not believe it could be done.

  • by Anonymous Coward on Saturday May 08, 2010 @06:41PM (#32142170)

    Unfortunately they are still at hundreds of LEDs due to cost. Hence negative effects on picture quality like "blooming" of bright areas into dark areas, and loss of local contrast.

  • by tommyhj (944468) on Saturday May 08, 2010 @06:42PM (#32142176)

    And don't forget the green phosphor trails that all plasmas suffer from, which have ruined every viewing experience I've had to endure on a plasma. Those trails (or green/yellow flashes) are the only reason I will always pick LCD over plasma.

    Very evident in this video: http://www.youtube.com/watch?v=KV_fXCW2rOM [youtube.com]
    But it's there in all plasma panels, making my head and eyes hurt...

  • by Anonymous Coward on Saturday May 08, 2010 @06:49PM (#32142236)

    Actually (in my understanding anyway), a 600Hz plasma shows 600 "subfields" per second, not 600 frames. If you saw a single subfield on its own, it would not look like the desired picture. Driving a plasma panel is a complex process that involves dithering and repeated discharge of the plasma cells in order to create just one frame. A 120Hz LCD shows 120 recognizable frames per second, even if most of them are interpolated.

  • by Anonymous Coward on Saturday May 08, 2010 @06:57PM (#32142312)

    If the human eye is most sensitive to yellow, green, and violet, in that order, why aren't we making displays and computers that output combinations of YGV? Matching the natural sensitivity of human eyes seems to me the best way to achieve realism.

  • LCD screens are lighter, thinner, and more efficient though.

    LCD screens are lighter - my 50" plasma might weigh a lot more, but it's on a swivel stand, so moving it takes one finger.
    LCD screens are thinner - not much thinner, and most people spend their time looking at the front, not the side.
    LCD screens are more efficient - depends. Hook it up to a playstation to watch a blu-ray, and you've more than lost any "efficiency" claim. The playstation burns 188 watts to decode blu-ray disks, while a stand-alone blu-ray player only uses 10 to 13 (and a lot less when on standby). Plasmas also don't have to produce more light than required, then selectively block it from each pixel (which is one reason why blacks are blacker on plasmas - no bleed from adjacent pixels, and no light leakage when you stand near them and look down). Turn the room lights off and put it in economy mode. Get a Wii (peak wattage is 18, as opposed to the 188 watts for the playstation).

    For the little TV viewing I do, we're talking pennies a month, and in the winter I recoup that from the heat generated (electricity from hydro power) - which reminds me, they're calling for snow tonight and tomorrow - so maybe I should invite people over to do some TV watching or play some pinball or snowboarding or air guitar and warm the place up - 6 people running around is almost 2,000 btus, throw in another 1,000 btus from waste heat from the plasma and sound system, and another couple hundred from the dogs and I might not have to turn the heating on :-)

  • by tepples (727027) <tepples@gmaiBLUEl.com minus berry> on Saturday May 08, 2010 @07:04PM (#32142358) Homepage Journal
    A lot of TV sets that use local dimming have a big problem showing starfields. The average color in a starfield is pretty dark, so the LED goes dim and not bright enough to show the stars. It really takes the punch out of Star Wars Special^n Edition if you can't see the stars.
  • by Anonymous Coward on Saturday May 08, 2010 @07:23PM (#32142508)

    This gamut extension is useful and here is why: the LCD in your TV doesn't really have 24 bit accuracy. In fact, it has six bit accuracy per channel (18 bits total). It creates a 24-bit approximation by dithering nearby pixels both spatially and temporally. This is true of almost all large format LCD panels made today. Adding a yellow pixel significantly increases the number of "true" values it can hit before dithering. I haven't done the math, but I would guess it puts you somewhere near having the equivalent of seven RGB bits. So, with the yellow pixel you get much closer to the real color per pixel and require much less dithering. This is a big deal!
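What that temporal dithering (frame-rate control) looks like in miniature — a sketch, not any vendor's actual FRC pattern:

```python
def frc_code(value8, frame):
    """Approximate an 8-bit value on a 6-bit panel by alternating
    between the two nearest 6-bit codes: the higher code is shown on
    'remainder' out of every 4 frames, so the time-average matches."""
    code6, remainder = divmod(value8, 4)
    return min(code6 + (frame % 4 < remainder), 63)

# value 130 averages out over 4 frames: (33 + 33 + 32 + 32) * 4 / 4 = 130
print([frc_code(130, t) for t in range(4)])  # [33, 33, 32, 32]
```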

  • Re:RGB (Score:1, Interesting)

    by Anonymous Coward on Saturday May 08, 2010 @07:40PM (#32142632)

    The point is that the yellow which results from mixing red and green is the same yellow which a direct yellow channel would encode, except for saturation. This is a matter of color profiles, not encoding. The color space including brightness is a three-dimensional space. RGB is one parameterization of that space. If you choose a color profile with (physically impossible) extreme values for the base colors, RGB can describe all possible colors. As more wide gamut displays become available, color management becomes more important, and with color management you can have actual data for additional color channels, without increasing the dimension of the data.

  • Re:RGB (Score:5, Interesting)

    by Twinbee (767046) on Saturday May 08, 2010 @08:01PM (#32142764) Homepage

    That's my point though. How can an approximately 515 nm wavelength be a fully saturated green if the L cone is also being activated to some degree? That would be the extra pollution I'm talking about. I believe a much purer green would result if you somehow disabled the L cone. Unless you think we might see a more cyan/blue-like hue here?

    To get a definitive answer, I would be interested to see what one would experience if you disabled two of the three S/M/L cones. I'm suspecting you would see pure red (disable S+M), green (disable S+L) and blue (M+L). Any research into that?

    That's interesting, but it isn't "green"

    What is it then?

  • Re:RGB (Score:2, Interesting)

    by Anonymous Coward on Saturday May 08, 2010 @09:20PM (#32143278)

    "What is it then?"

    It's an imaginary color. There's even a wikipedia article about it.

  • by Cl1mh4224rd (265427) on Saturday May 08, 2010 @09:48PM (#32143410)

    A lot of TV sets that use local dimming have a big problem showing starfields. The average color in a starfield is pretty dark, so the LED goes dim and not bright enough to show the stars. It really takes the punch out of Star Wars Special^n Edition if you can't see the stars.

    My brother-in-law bought an LED set not too long ago. I believe it's an edge-lit model, so that may contribute to the problem, but... the bright parts of dark scenes tend to be about half as bright as they should be.

    This includes loading screens on some video games as well as movie credits. There's one short scene in the recent Star Trek movie showing the Narada flying by that dims so much that you can't make out any details in the ship.

    Like I said, I believe his particular model is edge-lit, so I can't really comment on the traditional back-lit models, but... this seems like an unacceptable quality issue. It's really turned me off to the idea of LED TVs.

  • by mc6809e (214243) on Saturday May 08, 2010 @09:48PM (#32143412)

    With RGB pixels on an LCD, yellow is shown by allowing light to pass through neighboring red and green subpixels. For the red subpixel, blue and green are filtered out. For the green subpixel, blue and red are filtered out. Then the eye fuses the neighboring pixels together to get yellow from two sources that have already filtered out much of the spectrum. But with a single yellow subpixel, only blue light is filtered out and more light reaches the viewer. I'm sure the effect is to make certain colors more vivid.

    Additionally, these yellow subpixels also somewhat increase the effective resolution.
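A back-of-the-envelope version of that brightness argument, with made-up ideal transmission numbers (each filter is assumed to pass its own band of a white backlight and block the rest):

```python
# Idealized transmission fractions (assumed numbers, not measurements):
# each of R/G/B passes roughly a third of the spectrum; yellow blocks only blue.
T = {"red": 1/3, "green": 1/3, "blue": 1/3, "yellow": 2/3}

def panel_yellow(subpixels):
    """Average light output across the subpixels used to render yellow."""
    return sum(T[s] for s in subpixels) / len(subpixels)

print(panel_yellow(["red", "green"]))  # RGB panel: ~0.33 of the backlight
print(panel_yellow(["yellow"]))        # dedicated subpixel: ~0.67
```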

  • Re:RGB (Score:5, Interesting)

    by 6350' (936630) on Saturday May 08, 2010 @09:52PM (#32143426)
    and so it does.
    http://en.wikipedia.org/wiki/Imaginary_color [wikipedia.org]

    Neat snippet from the article:
    "At Walt Disney World, Kodak engineered Epcot's pavement to be a certain hue of pink so that the grass would look greener through the reverse of this effect."

    Sneaky!
  • Sure, it could be (Score:3, Interesting)

    by Craig Ringer (302899) on Saturday May 08, 2010 @10:08PM (#32143500) Homepage Journal

    In terms of color theory, nothing stops it from being real. If you expect to hook this up to some random source and get an improvement, though... good luck. It's not going to happen. With an appropriate 10-bit or 12-bit wide-gamut source, though, it's certainly capable of better results.

    The input may be 3-color (RGB), but if it's defined with a wide-gamut space like Adobe RGB, possibly with up to 16 bits of precision per colour channel, then it can represent a huge range of colours. It can do this by defining near-"perfect" primary colours and assuming perfect control over blending of those primaries.

    A regular TV, though also an RGB device, has a very different gamut. That's largely because the primary colours the TV uses aren't as bright/saturated or as "perfect" as those in the Adobe RGB space, but it also can't blend its colours as well. Most likely it only uses 8 bits per colour channel, so it has a much more limited range of gradations, further forcing the colour space to be narrowed to avoid banding due to imprecision.

    The regular TV must "scale" a wide-gamut input signal in a colour space like Adobe RGB to display it on its own more limited panel. It can do this by "chopping off" extreme colours, by scaling the whole lot evenly, or by several other methods that are out of scope here. The point is that they're both RGB devices, but they don't share the same colour space and must convert colours.

    So, if the yellow pixel (another primary) expands the gamut of this new TV, then yes, even though it too only takes an RGB signal, it's in theory better, because it can convert a wide-gamut RGB input to its own RGBY space for display with better fidelity than a TV with the same RGB primaries but no Y channel could achieve.

    Another device might still be plain RGB, but for each of the red, green, and blue primaries it might have much better (closer to "perfectly red" etc) colour. This device might have an overall wider gamut (ie better range of colours) than the RGBY device, though it's likely that the RGBY device's gamut would still be capable of better yellows. (If you're struggling to figure out what I mean, google for "CIE diagram RGB CMYK" to get a feel for it).

    Attaining better results through adding a channel and/or having better R,G,B primaries presumes properly colour-managed inputs to gain any benefit, though. In reality, video colour management is in a pathetic and dire state - inputs can be in any number of different colour spaces, there's no real device-to-device negotiation of colour spaces, and it's generally a mess. If you feed a "regular" narrow gamut source through to a TV that's expecting a wide gamut signal, you'll get a vile array of over-saturated over-bright disgusting colour, so this is important. Since this device would rely on wide-gamut RGB input to have any advantage, it'll need a 10-bit or 12-bit HDMI or DisplayPort input with a source that's capable of providing a wider gamut signal (say, BluRay) and is set up to actually do so rather than "scaling" the output video gamut to the expectations of most devices.

    The fact that most inputs only support 8 bits per channel (and thus aren't very useful for wide-gamut signals because they'll get banding/striping in smooth tones) really doesn't help.
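The wide-gamut-to-panel conversion described above boils down to a 3x3 matrix followed by some gamut-mapping policy. Here is a sketch with an illustrative matrix (not a real profile) and the crude "chop off" strategy:

```python
import numpy as np

# Hypothetical wide-gamut -> panel-primaries matrix; real values would come
# from the two colour profiles. These numbers are illustrative only.
WIDE_TO_PANEL = np.array([[ 1.20, -0.15, -0.05],
                          [-0.10,  1.15, -0.05],
                          [-0.02, -0.08,  1.10]])

def to_panel(rgb):
    """Convert, then 'chop off' anything outside the panel's gamut."""
    out = WIDE_TO_PANEL @ np.asarray(rgb, dtype=float)
    return np.clip(out, 0.0, 1.0)

print(to_panel([1.0, 0.0, 0.0]))  # a saturated wide-gamut red clips to [1, 0, 0]
```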

  • Re:Yellow... yawn (Score:3, Interesting)

    by Nazlfrag (1035012) on Saturday May 08, 2010 @11:03PM (#32143750) Journal

    And isn't all the content available in YUV, meaning it has red, green, blue and yellow colour information? All the source material might be made using RGB (is that even true, though?) but the transmission is done using luminance, R-Y and B-Y values, isn't it?

  • Re:Yellow... yawn (Score:5, Interesting)

    by eggnoglatte (1047660) on Saturday May 08, 2010 @11:44PM (#32144074)

    True, but the regular LCD color gamut is smaller than the sRGB/Rec 709 gamut that is encoded in the HDTV video standard.

    Basically, LCD panels use relatively wide-spectrum color filters so that they don't lose too much light to absorption. The result is a relatively small gamut - smaller than plasma or CRT.

  • Re:RGB (Score:5, Interesting)

    by jipn4 (1367823) on Saturday May 08, 2010 @11:59PM (#32144178)

    That's my point though. How can an approximately 515 nm wavelength be a fully saturated green if the L cone is also being activated to some degree?

    Because all light of a single wavelength is automatically "pure"; it doesn't matter what your cone responses are. The cone responses are just a code to transmit that information to your brain. Your cone responses are such that they overlap (for good reason), but that doesn't keep you from seeing pure colors.

    And actually, you perceive color contrast anyway, not absolute RGB values or wavelengths. So, even if you get a group of cones to produce a pure "green" response somehow, that will simply be processed as being part of a strong red/green contrast and result just in a vivid green percept.

  • by Prune (557140) on Sunday May 09, 2010 @01:20AM (#32144602)
    This issue was solved in Brightside's HDR display, the first to use LED modulation several years ago. LEDs are addressed individually, so the locality is very low. Also, the brightness achievable is much higher (though the unit drew 1500 watts for the large screen version); i.e. individual areas on the screen could be made as bright as looking at a light bulb. The reason locality wasn't an issue is that the few dozen pixel-sized areas lit by a given LED are small enough to provide higher dynamic range than the *local* dynamic range the human eye can muster. While the eye has a huge dynamic range, that is not the case over a small portion of the field of view. The downside of Brightside's stuff (later acquired by Dolby) is the energy usage needed to get really high brightness through an LCD display, where the best LCDs will at most let through 6% of the light. The LED array in the back had huge heatsinks and active cooling; they put out several times the luminance of a regular display's backlight. That, and the cost of the damn thing... In the end we won't see practical HDR displays for sale to average consumers until something like OLEDs become cheap enough for mass market.
  • by Anonymous Coward on Sunday May 09, 2010 @01:39AM (#32144704)

    Actually, although this has been marked as 'funny' the lack of an active black is a considerable impediment to faithful color production. Turn off your monitor and what do you see? Dark grey probably. As a traditional oil painter who has worked in digital medium for some time, I can attest to how 'thin' the dynamic range of a screen based image is compared to an oil painting.

  • Re:Yellow... yawn (Score:4, Interesting)

    by Teancum (67324) <robert_horning AT netzero DOT net> on Sunday May 09, 2010 @07:13AM (#32145826) Homepage Journal

    Admittedly one of the problems here is that adding a fourth channel would require a 4-dimensional color space to fully utilize that extra channel. To really utilize this sort of new feature would require a whole new image recording system.

    One of the problems facing would-be extensions of the color gamut like adding another color is in part an unlearning of what it means to make a color. In reality, a given color that you see from an object is made up of an entire spectrum from near infrared to ultraviolet (UV-A, to define a "color"), and is a wave function of all possible frequencies along that spectrum.

    Somewhere along the way some crude but generally effective simplifications of this philosophy have resulted in things like the YUV and RGB systems, but it should be noted those are 3-dimensional color spaces. Note that the word "dimension" is not in reference to lengths here, but rather representations of the color. Each dimension is merely one more piece of information to display that color.

    So whenever you use a 3-dimensional color space, you are reproducing that color spectrum wave function by only selecting three frequencies out of the whole spectrum with which to "broadcast" that information... and you are discarding quite a bit of additional information along the way. This is why color reproduction is quite difficult, and never really gets it "right" in most cases. Try as you might, no method can fully recreate a color once that information has been discarded. Heck, that is basic information theory here too.

    Some may counter that a human eye perceives only three colors anyway. Well, that isn't quite true, as there are people with sensitivity to more than three colors (tetrachromacy) and of course people who only perceive effectively two or even one color ("color blindness"). Even with all that, not all people perceive the same colors in the same way either, so what may look "good" to one person may look "awful" to somebody else. What it all boils down to is that to really do a proper representation of the color, it really is vitally important to completely and accurately reproduce that entire wave function which represents all possible frequencies.

    Think of it more this way, perhaps. Imagine if you were listening to some music, but the recording medium only reproduced the songs with three frequencies for playback. It would be some rather boring music. BTW, it is possible to "sample" light in the same manner that sound is sampled to give a more accurate reproduction of a color, but that would be an insane amount of data as the sampling frequency would have to be on the same order as the frequency of the light.... actually a higher rate of sampling to be precise.

    One of the really nice things about light from an incandescent light bulb is that it is spread out over nearly the entire frequency spectrum. Traditional film projectors take advantage of that fact and when color film is shown in front of that light bulb, the frequency spread of various color layers on the film tend to smooth out with each other and generate that continuous spectrum. It still isn't perfect and color film still has only three channels (usually) but at least an attempt to re-create that whole frequency spectrum is there. Also note that different film manufacturers have a frequency response that is sometimes different, which is why some film manufacturers are preferred over others and certainly impacts the film making process in some subtle but interesting ways.

    For LCD screens and worse yet for LED systems, the frequency curve isn't nearly so good. If you would look at it with a diffraction grating or prism that separates the colors out, you would see some sharp lines rather than a continuous spectrum. That is where the real problem lies with LCD panels and why color accuracy is not very good. Certainly adding a yellow line to that curve would generally help to smooth out that spectrum or at least add some more visual information, even if that color information isn't being directly recorded by the storage medium.
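The "discarding the spectrum" point can be made concrete: collapsing a full spectral curve to three numbers, here using crude Gaussian stand-ins for the cone sensitivities (illustrative shapes, not the real CIE curves):

```python
import numpy as np

wl = np.arange(400.0, 701.0, 10.0)          # visible wavelengths, nm

def bump(center, width):
    """A Gaussian spectral curve centered at 'center' nm."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Gaussian stand-ins for the S/M/L cone responses (assumed shapes).
S, M, L = bump(445, 25), bump(540, 35), bump(565, 40)

def tristimulus(spectrum):
    """Collapse an entire spectral power distribution to just three
    numbers; everything else about the curve is thrown away."""
    return tuple(float(np.sum(c * spectrum)) for c in (S, M, L))

# A narrow 540 nm spike mostly excites the M 'cone':
s, m, l = tristimulus(bump(540, 5))
print(m > l > s)  # True
```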

  • by negRo_slim (636783) <mils_oRgen@hotmail.com> on Sunday May 09, 2010 @01:28PM (#32148136)

    My brother-in-law bought an LED set not too long ago. I believe it's an edge-lit model, so that may contribute to the problem, but... the bright parts of dark scenes tend to be about half as bright as they should be.

    I tend to lose all visual fidelity in dark parts of a high contrast image.

