Television Hardware Technology

Is the 4th Yellow Pixel of Sharp Quattron Hype? 511

Posted by timothy
from the hi-fi-jumprope dept.
Nom du Keyboard writes "Sharp Aquos brand televisions are making a big deal about their Quattron technology of adding a 4th yellow pixel to their RGB sets. While you can read a glowing review of it here, the engineer in me is skeptical because of how all the source material for this set is produced in 3-color RGB. I also know how just making a picture brighter and saturating the colors a bit can make it more appealing to many viewers over a more accurate rendition – so much for side-by-side comparisons. And I laugh at how you are supposed to see the advantages of 4-color technology in ads on your 3-color sets at home as you watch their commercials. It sounds more like hype to extract a higher profit margin than the next great advance in home television. So is it real?"
This discussion has been archived. No new comments can be posted.

Is the 4th Yellow Pixel of Sharp Quattron Hype?

Comments Filter:
  • Yellow... yawn (Score:2, Insightful)

    by Anonymous Coward on Saturday May 08, 2010 @04:40PM (#32141274)
    I'd be much more interested if it were a colour that RGB couldn't produce.
  • by eldavojohn (898314) * <[moc.liamg] [ta] [nhojovadle]> on Saturday May 08, 2010 @04:46PM (#32141324) Journal

    And I laugh at how you are supposed to see the advantages of 4-color technology in ads on your 3-color sets at home as you watch their commercials.

    Well, I'm not sure if you're correct to laugh at this or not. But all televisions are approximations of something analogue that was captured and in that capturing process, some information was lost. To illustrate, entertain a scenario where I have N standard definition television sets that are displaying footage from standard definition video cameras. I daisy chain them together (each camera directed at the last screen) to record something. As I move from the 0th screen to the Nth screen, I will begin to see degradation as more information is lost and randomness comes into play. The same can be done with HD but since HD captures more information, it can safely be assumed that the sampling and resampling will retain more of the original image.

    If you played the Nth HD screen next to the Nth SD screen and piped that through an SD television, you'd still be able to see some difference (for reasonable non-astronomical numbers of N) even though you went through yet another SD television in the end.

    I don't know what the fourth color is supposed to buy; I'm unfamiliar with this technology. But the side-by-side comparison through an SD or HD TV might still be able to demonstrate that the fourth color adds some meaningful information to the image that -- when resampled to be viewed on your device -- suffers less information loss than the three-color implementation, thus successfully demonstrating some superiority. It wouldn't show you precisely what the final product is supposed to look like, but it would give you a relative sense of signal loss and noise.
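    The daisy-chain thought experiment above can be simulated numerically. A minimal sketch, assuming a smooth 1-D "image" and simple uniform quantization standing in for each capture/display generation (the level counts are illustrative, not real SD/HD bit depths):

    ```python
    import random

    def requantize(samples, levels):
        """Snap each sample in [0, 1] to one of `levels` values,
        standing in for one lossy capture/display generation."""
        step = 1.0 / (levels - 1)
        return [round(s / step) * step for s in samples]

    def generation_loss(samples, levels, generations):
        """Run N capture -> display cycles with a little sensor noise;
        return mean absolute error versus the original."""
        current = samples
        for _ in range(generations):
            noisy = [min(1.0, max(0.0, s + random.uniform(-0.001, 0.001)))
                     for s in current]
            current = requantize(noisy, levels)
        return sum(abs(a - b) for a, b in zip(samples, current)) / len(samples)

    random.seed(0)
    original = [i / 99 for i in range(100)]        # a smooth 1-D "image"
    sd_error = generation_loss(original, 32, 10)   # coarse, "SD"-like chain
    hd_error = generation_loss(original, 256, 10)  # fine, "HD"-like chain
    # The coarser chain ends up measurably further from the original.
    ```

    The coarse chain's error stays larger after the same number of generations, which is the parent's point: the richer representation degrades less under repeated resampling.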

    I also know how just making a picture brighter and saturating the colors a bit can make it more appealing to many viewers over a more accurate rendition

    Well, I know that there is a huge photography following that is totally enamored with HDR photography [wikipedia.org] and to many people it makes the images come to life ... I think it's overdone (like autotuning in modern music) but it definitely has a place. Perhaps similarly four color displays hope to widen the dynamic range they can display? I wish I could give you better answers about four color displays but this is the first I've heard of them. Perhaps your questions to a large engineer base are the most effective kind of marketing?

  • time to wait (Score:2, Insightful)

    by Anonymous Coward on Saturday May 08, 2010 @04:49PM (#32141356)
    Time to wait for all the /.ers who don't actually understand colour theory to pipe up with comments about how 3 colors is more than enough for everything, simply because it was a design choice made several decades ago.
  • Re:RGB (Score:5, Insightful)

    by Anonymous Coward on Saturday May 08, 2010 @04:56PM (#32141408)

    Only if the camera recording the picture recorded that same color.

    As it has been stated, adding a new color at the TV is literally the last place it needs to happen. (First the camera that films, then the storage medium (DVD?), then broadcast (HDMI?), then the TV.)

  • Open Mind (Score:3, Insightful)

    by gone.fishing (213219) on Saturday May 08, 2010 @05:09PM (#32141510) Journal

    At first blush it appears to be hype but I am trying to keep an open mind because of something that happened to me when I saw my first HD TV picture. I was of the opinion that HD couldn't be that much better than SD. Shortly after I saw my first HD images I was ready to admit that I was wrong. From the moment I laid eyes on HD I knew there was a whole new world out there! I am now a certifiable HD snob. I don't know what I did before but I do know I watched less TV.

    I haven't seen one of the new TVs yet, so I can't say whether I think it makes a difference or not. I will know, probably rather quickly, whether I believe it when I see it. The first place I will look is at white/black interfaces. That should tell me a lot.

    I really do hope it is hype. I think the 47" TV is a little too big to be moved into the bedroom.

  • by Sivar (316343) <charlesnburns[.]gmail@com> on Saturday May 08, 2010 @05:30PM (#32141706)

    Digital images are displayed in RGB, yes.

    But colors are printed in CMYK (Cyan Magenta Yellow Black), and you'll notice that the best photo inkjet printers have more than just those four color cartridges. They often have the four plus "photo cyan", "photo magenta", etc. and it does make a huge difference.

    As you know, some colors cannot be accurately expressed in CMYK, and some cannot in RGB (in theory any color is possible, but theory is not reality in this case).

    While the extra color may or may not make a big difference, there is at least precedent indicating that the idea is sound.
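    The RGB/CMYK relationship the parent describes can be made concrete. Below is the naive textbook conversion; real printer drivers use ICC profiles and per-device tuning, so treat this only as an illustration of why yellow gets its own channel in print:

    ```python
    def rgb_to_cmyk(r, g, b):
        """Naive RGB -> CMYK conversion, all channels in [0, 1].
        K absorbs the shared darkness; C/M/Y carry what's left."""
        k = 1.0 - max(r, g, b)
        if k == 1.0:               # pure black: avoid dividing by zero
            return 0.0, 0.0, 0.0, 1.0
        c = (1.0 - r - k) / (1.0 - k)
        m = (1.0 - g - k) / (1.0 - k)
        y = (1.0 - b - k) / (1.0 - k)
        return c, m, y, k

    # Pure screen yellow needs only yellow ink: no cyan, magenta, or black.
    print(rgb_to_cmyk(1.0, 1.0, 0.0))  # (0.0, 0.0, 1.0, 0.0)
    ```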

  • Submitter fail. (Score:5, Insightful)

    by blair1q (305137) on Saturday May 08, 2010 @05:35PM (#32141732) Journal

    "And I laugh at how you are supposed to see the advantages of 4-color technology in ads on your 3-color sets at home as you watch their commercials."

    But the script of the commercial is written almost entirely with deference to that fact.

    The estimable Mr. Takei tells you, while you're no doubt ogling his Adam's apple instead of listening, that he can't actually show you the difference itself, but, "I can show you this," whereupon he looks at the screen and gives his review in a single, somewhat gaudily overacted word.

    I'm not sure how anyone misses that, since his behavior is utterly bizarre without the concept of telling-not-showing being in play.

  • by BitZtream (692029) on Saturday May 08, 2010 @05:56PM (#32141894)

    Well, I'm not sure if you're correct to laugh at this or not. But all televisions are approximations of something analogue that was captured and in that capturing process, some information was lost. To illustrate, entertain a scenario where I have N standard definition television sets that are displaying footage from standard definition video cameras. I daisy chain them together (each camera directed at the last screen) to record something. As I move from the 0th screen to the Nth screen, I will begin to see degradation as more information is lost and randomness comes into play. The same can be done with HD but since HD captures more information, it can safely be assumed that the sampling and resampling will retain more of the original image.

    Yes, if you uncompress and recompress an MP3 multiple times it will turn into shit eventually too, but what you're doing is in no way like what happens in the real world.

    Nowadays the analog signal rapidly turns digital, more and more often right at the camera. Then in most cases it's moved around in a lossless format until it gets to the content distribution networks ... i.e. your cable company, who then compresses the ever living fuck out of it in order to fit more channels on their shitty 'digital' system.

    There is very little resampling of data until your local cable company or Dish/DirecTV get their grubby hands on it, at which point it rapidly turns to shit because they oversell so badly, cramming 15 different home shopping networks down your throat since they get paid to carry them rather than paying to carry them.

    Your example goes analog -> digital -> analog for each N, and that's not what happens in the real world of broadcasting.

    I don't know what the fourth color is supposed to buy, I'm unfamiliar with this technology. But the side by side comparison through an SD or HD TV might still be able to demonstrate that the fourth color adds some meaningful information to the image

    You can't 'add meaningful information' to the image; you can only remove it from the original. When you start 'adding' to it you no longer have the original image. It's much like all the shitty HDR and bloom effects in games. It doesn't make the game look more realistic, but many people find it more 'pleasing'. If that's what you want out of your television, then just watch animated/CGI stuff, since they can do whatever they want to it without regard to being authentic.

    Well, I know that there is a huge photography following that is totally enamored with HDR photography [wikipedia.org] and to many people it makes the images come to life

    Yes, many people think photoshopping makes an image 'better' too, but it doesn't, and those people are, for the most part, idiots following a marketing fad like lemmings.

    I'd buy the fourth color being useful if the signal were coming in as CMYK, but it's not, and they aren't switching to CMYK internally; it's RGBY, which is some completely different color model they've come up with. So they have to extrapolate or, more realistically, simply make up some value for the yellow pixel that is almost certainly wrong by any definition of the word other than a purely subjective test by someone with no clue how to properly analyze an image.

    If you buy into this one, you should also buy into thinking 120hz TVs are better and that people can tell the difference between a 10ms ping and 20ms ping in an online game, and that you can tell the difference between a 320kbps mp3 and a 192 (or 128 for most people) mp3 in anything other than a test specifically designed to illustrate the difference.

    In short, marketing bullshit is all it is. Sharp didn't come up with something that millions of other people, people far more concerned with proper color output than anyone watching television, somehow failed to come up with.
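    The extrapolation being complained about here can be sketched. Sharp's actual RGB-to-RGBY mapping is proprietary and unknown; the function below is a purely hypothetical illustration of the kind of split a set would have to invent, carving the energy shared by the red and green channels out into a fourth yellow value:

    ```python
    def rgb_to_rgby(r, g, b):
        """Hypothetical RGB -> RGBY split (NOT Sharp's algorithm):
        treat the overlap of red and green as yellow and subtract it."""
        y = min(r, g)            # guessed yellow content
        return r - y, g - y, b, y

    # Pure yellow moves entirely into the new channel...
    print(rgb_to_rgby(1.0, 1.0, 0.0))  # (0.0, 0.0, 0.0, 1.0)
    # ...but any such split is invented: the source never carried a Y channel,
    # so the set is guessing, which is exactly the parent's objection.
    ```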

  • The tone of this article isn't what the summary suggests. TFA doesn't portray the TV as some magical device; the article is actually somewhat critical of it.

    I think the thing that a lot of us don't realize, because we spend so much time looking at TV and computer screens, is that colored light isn't really a combination of red, green, and blue. The reality is that light gets its color from its wavelength, and we can get a very close approximation by combining light we perceive as red, green, and blue.

    The question is, can we get a more accurate picture by using light that's closer to the original wavelength? Clearly, the information isn't lost, as the original wavelength can be inferred by digitally processing the original RGB levels.

    Something to consider is that the original NTSC (American Color) TV standards didn't just include Red, Green, and Blue, but also included Yellow and Orange. These parts were essentially deprecated, but the concept of TVs displaying yellow isn't new.

  • by kc8apf (89233) <kc8apf.kc8apf@net> on Saturday May 08, 2010 @06:19PM (#32142046) Homepage

    If you've ever had a display calibrated, you'd know that even the existing RGB color space can't be completely recreated with existing RGB-based displays. The problem is the inability of LED, LCD, or plasma panels to produce light uniformly in the three color channels. If you can add a 4th channel that lets the RGB color space be more accurately reproduced by the display, then you will see an improvement. It won't make the source any better, but the output the display generates for that input will be better.

  • by M8e (1008767) on Saturday May 08, 2010 @06:38PM (#32142154)

    Not to forget that it works for both 50Hz and 60Hz.

    60hz*10=600hz
    50hz*12=600hz
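    The arithmetic above is just common multiples: 600 is divisible by both field rates because it is a multiple of their least common multiple.

    ```python
    from math import gcd

    def lcm(a, b):
        """Least common multiple via the gcd identity."""
        return a * b // gcd(a, b)

    base = lcm(50, 60)   # 300: the smallest rate that serves both
    panel = 2 * base     # 600, as in the panels under discussion
    print(base, panel, 600 % 50, 600 % 60)  # 300 600 0 0
    ```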

  • by matunos (1587263) on Saturday May 08, 2010 @07:04PM (#32142362)
    Why not just go to the store and look for yourself?
  • Re:RGB (Score:5, Insightful)

    by jipn4 (1367823) on Saturday May 08, 2010 @07:20PM (#32142472)

    That's misleading. A lack of a fully saturated green on a monitor is a limitation with the phosphors or dyes it uses. But monochromatic light of around 515 nm is pure, fully saturated green. Fully saturated green stimulates both your M and L cones ("G" and "R" cones); that's the way your eye works.

    You can achieve non-physical responses from your photoreceptors via oversaturation, drugs, or electrical stimulation. That's interesting, but it isn't "green" and it isn't a "true qualia". Thinking of that as "green" is simply because you think of the M cone as a "green" cone and the L cone as a "red" cone, but those are just arbitrary names.

  • Re:RGB (Score:4, Insightful)

    by iluvcapra (782887) on Saturday May 08, 2010 @07:28PM (#32142544)

    That 1931 color gamut is misleading because it overemphasizes greens. In fact, the original NTSC green primary was much closer to the peak, but as a result yellows were too muted, so they changed it. But you're right - a turquoise primary would increase the RGB gamut significantly.

    It would increase the gamut, but it wouldn't improve the rendition of skin tones (uh... the skin tones of most European/Native American/Asian/Middle Eastern/Mediterranean people. eep.) When people complain about the colors on their TV, it's generally because the skin tones don't look right.

  • by The Grim Reefer2 (1195989) on Saturday May 08, 2010 @10:24PM (#32143540)

    It really takes the punch out of Star Wars Special^n Edition if you can't see the stars.

    Didn't George Lucas do that already?

  • Re:What's wrong? (Score:2, Insightful)

    by vcgodinich (1172985) on Saturday May 08, 2010 @10:41PM (#32143620)
    This is like saying that purple is an "ugly hack" between red and blue.

    No, purple IS between red and blue, just like yellow is between green and red.

    Look at what the RGB gamut actually is, and tell me if yellow is the best place for a 4th point.

    http://en.wikipedia.org/wiki/File:Cie_Chart_with_sRGB_gamut_by_spigget.png
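    The question raised by the linked chart can be checked numerically with a point-in-triangle test against the sRGB primaries. The primary chromaticities are from the sRGB spec; the spectral-color coordinates are approximate readings off the CIE 1931 locus:

    ```python
    def side(p, a, b):
        """Signed-area test: which side of edge a->b point p falls on."""
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    def in_gamut(p, r, g, b):
        """True if chromaticity p lies inside the triangle r, g, b."""
        d = [side(p, r, g), side(p, g, b), side(p, b, r)]
        return not (min(d) < 0 < max(d))   # all same sign => inside

    # sRGB primaries in CIE 1931 (x, y)
    R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)

    print(in_gamut((0.3127, 0.3290), R, G, B))  # D65 white: inside
    print(in_gamut((0.512, 0.487), R, G, B))    # ~580 nm spectral yellow: outside
    print(in_gamut((0.045, 0.295), R, G, B))    # ~490 nm spectral cyan: outside
    ```

    Spectral yellow sits only slightly outside the red/green edge of the triangle, while spectral cyan is far outside it, which is the grandparent's point about where a fourth primary would buy the most extra gamut.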

  • by ogl_codemonkey (706920) on Sunday May 09, 2010 @06:57AM (#32145788)

    Except that the difference can be accurately modeled in software and corrected at the LCD pixel - the performance and effectiveness of the algorithms used for this process are a key difference in the resultant picture quality in the models currently available.

    The BrightSide demo models apparently had excellent correction, and I imagine this is where a lot of the company's IP investment went.

