LCoS Shoot-Out Results

mikemuch writes "DisplayMate founder Ray Soneira has revealed the results of his LCoS HDTV Shoot-Out. He puts five HDTVs through a slew of test-pattern measurements, and then lets 34 real people, including home-theater lay people and experts, conduct jury tests and make comments. There was one case where the experts gave low marks to a display that the lay people loved. From the article: 'We spent some time trying to understand why the consumer panelists rated the JVC Consumer unit so highly. It had the lowest objective on-screen resolution of all of the units, because of internal signal processing, but a number of consumer panelists commented on how sharp it looked. The copious artifacts and significant edge enhancement produced so much artificial texture in the image that some panelists interpreted it as superior sharpness. All of the Video Experts recognized this effect and gave the unit the lowest score.'"
  • by ExE122 ( 954104 ) * on Thursday February 16, 2006 @03:54PM (#14735936) Homepage Journal
    I actually think this result is just a matter of having a trained eye... just as a real musician would probably cringe at the sound of most pop songs on the radio, despite the fact that a large number of people actually enjoy that kind of "music". [flamebait warning]

    But seriously, I wouldn't expect a "lay person" to be able to understand the technology involved in these units and to be able to make any intelligent/educated distinctions about their quality. IMHO, there's a reason we call them experts, and they are the only ones we should really be paying attention to.
    • As they say,

      "Ignorance is bliss."

      I guess we can apply this to today's technologies.
    • by Red Flayer ( 890720 ) on Thursday February 16, 2006 @04:04PM (#14736057) Journal
      "But seriously, I wouldn't expect a "lay person" to be able to understand the technology involved in these units and to be able to make any intelligent\educated distinctions about their quality"

      You've got to define quality here -- it depends on your goal and what metrics you assign to measure achievement.

      Is your goal to maximize appreciation of the picture quality in your target market? If so, what's your target market -- video experts or typical consumer? What's the crossover between the two markets?

      If my customers are more satisfied with my product than the 'experts' say they should be, then good for me. The problem here is not that experts and customers disagree -- the problem is that they are using different metrics. And to the people actually buying my product, it's their metrics that really matter.
      • Many smokers are extremely satisfied with their cigarette product, far more than the expert Doctors suggest they should be based on the incidence of cancer.

        It may be their metrics that determine sales, but not necessarily their metrics that determine what good is, or even what is in fact good for them.
      • Subjective quality? (Score:2, Interesting)

        by VON-MAN ( 621853 )
        Well, I always thought "quality" to be an objective descriptor. And in the case of displays quality is synonymous with lack of distortion. Or how "natural" the image is. The JVC consumer model probably conforms to what the non-professional expects from a tv, but that doesn't make it a high quality picture.

        I'd like to see the units myself, actually, and see how "bad" this consumer model is. And I would *really* like to see the professional unit. I was thinking to myself when reading the description

    • by Anonymous Coward
      This effect has been observed even with the "trained" eye. Professional photographers have repeatedly chosen images produced by digital cameras that contain noise as being "sharper" than images that contain less or no noise. Human vision interprets edge contrast as an indication of "sharpness", so their initial viewing of the images led them to "believe" that the noisy images were actually sharper. When the researchers "came clean" and demonstrated to the photographers the true difference in the
    • by PFI_Optix ( 936301 ) on Thursday February 16, 2006 @04:10PM (#14736110) Journal
      There's something to be said for a "non-expert" opinion on the matter.

      To go back to your analogy of musicians: There is some "music" which is absolutely adored by the experts but sounds like utter crap to the lay person. Why? Because what the expert hears is technical achievement, innovation, something hard to play that's never been done before. What the lay person hears is an annoying cacophony of seemingly random blarings from an orchestra. I'm thinking of a specific orchestral piece I heard on NPR a few months back. The composer's name eludes me, but his work made a lasting impression... it was impressive to me as a musician that he could write it, but (at best) annoying to listen to.

      A monitor can have all the technical features and perfect picture in the world to impress the experts, but if another "inferior" TV somehow fools the average buyer into thinking they're looking at a better picture, which one do you think they'll buy? Last I checked, buyers far outnumber experts.

      This article raised an excellent point about the *difference* between what technical experts and average consumers see when they look at a TV. In the end, two things will influence a buyer more than anything: their wallet and their eyes.
      • Like Sonic Youth?? Daydream Nation is, in my mind, a great cd (not that I'm an expert musician nor songwriter, cause I most definitely am not!). Yet when Teenage Riot was playing on a jukebox at a college bar, a friend of mine was wondering who selected that "crap noise". And many of Sonic Youth's songs sound kinda chaotic, with weirdly tuned guitars, different arrangements, noises, sung by not-that-great vocalists, etc. Yet they can make a song work in spite of all this, and it's a great song too. Yet I
    • I don't think that the industry necessarily needs to pay attention so much to the experts. Not that they don't know what they're talking about, because they do, but they are not the main consumers. If JVC can produce a product for less money that most consumers will perceive as higher quality, then that's the bottom line.
    • Quote from the article: "We spent some time trying to understand why the consumer panelists rated the JVC Consumer unit so highly. It had the lowest objective on-screen resolution of all of the units, because of internal signal processing (see Fine Detail Artifacts, above), but a number of consumer panelists commented on how sharp it looked. It turns out that the copious artifacts and significant edge enhancement produced so much artificial texture in the image (more than any of the other units
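
  The edge-enhancement effect quoted above can be sketched in a few lines. This is a minimal illustration using a generic unsharp-mask filter; the JVC set's actual signal processing is not documented here, and the filter shape and gain values are assumptions chosen only to show how boosted edge contrast reads as extra "sharpness" while adding halo artifacts.

      # Minimal unsharp-mask sketch: boosting local edge contrast makes a soft
      # image "read" as sharper while adding overshoot (halos / artificial texture).
      # The radius and gain here are illustrative assumptions, not the set's values.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def unsharp_mask(image, radius=2.0, amount=2.0):
          """output = image + amount * (image - blurred)"""
          blurred = gaussian_filter(image, sigma=radius)
          return np.clip(image + amount * (image - blurred), 0.0, 1.0)

      # Demo: a soft ramp edge becomes a visibly steeper edge after enhancement.
      x = np.linspace(0.0, 1.0, 64)
      soft_edge = np.tile(np.clip((x - 0.5) * 4 + 0.5, 0.0, 1.0), (64, 1))
      enhanced = unsharp_mask(soft_edge)
      print("max edge step before:", np.abs(np.diff(soft_edge, axis=1)).max())
      print("max edge step after: ", np.abs(np.diff(enhanced, axis=1)).max())

  The steeper step is what the consumer panelists perceived as sharpness; the overshoot on either side of each edge is the "artificial texture" the experts marked down.
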
  • No surprise (Score:5, Interesting)

    by Tx ( 96709 ) on Thursday February 16, 2006 @03:57PM (#14735969) Journal
    I know people who watch 4:3 content stretched out to 16:9, and are apparently immune to the completely distorted aspect ratio, they just think whatever they're watching should fill the screen regardless. If a consumer panel contains people like that, I don't wanna know what they think.
    • Re:No surprise (Score:4, Informative)

      by engagebot ( 941678 ) on Thursday February 16, 2006 @04:01PM (#14736021)
      The reason is that depending on the type of screen you have, those 'black bars' on the left and right can cause burn-in lines.

      Yes, I know burn-in is not a huge problem with newer displays, but when 80% of your ungodly amount of TV time is standard-def viewing, then you still do have a problem.
      • Why not make those black bars grey? That should solve the uneven burn-in problem for CRTs and plasma. On LCoS, DLP, and LCD it's not an issue.
        • On Sony TVs, those black bars are grey.
        • Re:No surprise (Score:3, Informative)

          by engagebot ( 941678 )
          The actual color is not the problem. The problem is that it's a static image. Not even so much the static 'black bar' itself, but the edge where the bar meets the moving content. If the whole thing was black all the time, the average Joe probably wouldn't be able to tell there was any burn-in at all. It's just that there's that hard edge.
        • Why not make those black bars grey? That should solve the uneven burn-in problem for CRTs and plasma. On LCoS, DLP, and LCD it's not an issue.

          You'll still risk burning in the edges of the bars (because the interior 4:3 portion won't age as "averagely" as the gray bars, unless all you ever watch is a gray screen). Take it one step further and periodically move the position of the 4:3 window. My old Mitsubishi CRT RPTV did this with its gray bars, and I hear modern plasma screens do something similar. T

        • The bars on many TVs can be adjusted between grey and black, but grey is plain annoying, and it doesn't solve the problem in the first place. I don't really mind 4:3 stretched to 16:9 most of the time, but I'm not a hardcore HDTV person, most broadcasts in my area still aren't HDTV, and the bars annoy me. Big deal.
          • but I'm not a hardcore HDTV person

            hardcore HDTV might actually benefit from a little distortion... you know, an extra inch in the right places...

          • Most TVs allow you to zoom the 4:3 image to fill the screen on a 16:9 TV. The result is that you end up with a cropped image and the resolution sucks, but hey, it's NTSC anyway. I used to do this with my Hitachi RP CRT since it had the grey side bars (which are annoying), but since replacing it with a Sony SXRD, which has black side bars and no burn problems, I just watch 4:3 content in 4:3 mode. I simply stop noticing the sidebars after a few minutes.
            • True... However, I find the cropping to be even worse than the stretching. So, people look a little fatter and circles look like ovals. I suspect that when I upgrade to a DLP front/rear projection unit I'll be happier playing in 4:3 mode, but I'm pretty happy with my rear projection 42" TV... I don't think I'll be upgrading in the next 5 years unless something very revolutionary happens with TV technology (and prices)-or I move to a larger house that has a space that would work as a dedicated theatre...

              I
    • Re:No surprise (Score:4, Insightful)

      by Quasar1999 ( 520073 ) on Thursday February 16, 2006 @04:02PM (#14736034) Journal
      Same as people who buy a super sweet hi-def set, and watch crappy analog cable on it, and then tell you that they're watching hi-def.

      These are the same people that put premium gasoline in their 'optimized for 87 octane' car, and then claim they can feel the extra performance.

      Yup... but at the end of the day, the important thing is that the person who paid the money for the thing they got is happy with it. It doesn't matter if they don't actually know they're not getting what they thought; so long as they like it, who cares!
      • Yep, as long as cocaine buyers are happy with their baking soda cut, who cares?

        Seriously, there's a reason that we have truth-in-advertising laws. They're an attempt to keep people from getting stuck with a crappy product even if they don't know how to tell the difference. We all suffer when crappy TVs have good sales, because that reduces the manufacturer's incentive to produce better TVs.

    • Re:No surprise (Score:2, Interesting)

      by ksattic ( 803397 )
      Has anyone noticed the HBO black outline phenomenon? HBO will play shows like BBC's "Extras" with black bars on all four sides. They broadcast the 16:9 material in 4:3 (broadcasting black bars on the top and bottom to maintain the aspect ratio), and then when the 4:3 content is shown on my 16:9 TV, I also get black bars on the left and right! In 720p and 1080i, my TV's zoom function does not operate, so I have no choice but to watch with a large proportion of my TV area black.
      • Has anyone noticed the HBO black outline phenomenon? HBO will play shows like BBC's "Extras" with black bars on all four sides. They broadcast the 16:9 material in 4:3 (broadcasting black bars on the top and bottom to maintain the aspect ratio), and then when the 4:3 content is shown on my 16:9 TV, I also get black bars on the left and right! In 720p and 1080i, my TV's zoom function does not operate, so I have no choice but to watch with a large proportion of my TV area black.

        Are you sure you're watchin

        • Yep, I have watched shows such as Dead Like Me on Showtime's HD channel just fine, and I can watch the standard def HBO channel that is showing Extras and stretch to fill my screen. They are broadcasting Extras (and a few other shows) in 4:3 with black bars. :o(
      • That's normal, I guess. Even worse, the BBC actually airs programs that are 16:9 in 4:3 in 16:9 on BBC 1 and 2 (the analog versions, I guess (mainland Europe)), resulting in black borders all around on a 4:3 TV. This happens mostly with live sports (and sometimes foreign news feeds).
    • That's nothing. My DVD player image has these diagonal shaded areas that move slowly across the screen. Some kind of interference. I find it aesthetically pleasing, like watching waves on the ocean.
    • You know my parents!?
    • We have several broadcasters here in Europe that think those nifty black bars above and below the shows are to be used for subtitling and the displaying of annoying animated logos.

      It gets even more annoying if those subtitles are halfway in and halfway out of the bottom black bar.

      Personally I'd say that this sort of behaviour is a lot more annoying (to me at least) than people watching 4:3 content in 16:9.

      Or do you think 4:3 Ally McBeal content does not look disproportionate when viewed in the right aspect ratio?
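
  For reference, the distortion from stretching 4:3 to 16:9 that this thread complains about is easy to quantify, assuming a plain linear horizontal stretch rather than a nonlinear "panorama" zoom:

      # How much a plain linear stretch from 4:3 to 16:9 distorts geometry.
      source_ar = 4 / 3
      target_ar = 16 / 9
      stretch = target_ar / source_ar          # = (16/9) / (4/3) = 4/3
      print(f"horizontal stretch factor: {stretch:.2f}")
      print(f"circles become ovals roughly {100 * (stretch - 1):.0f}% wider than tall")

  A one-third horizontal stretch is why faces look noticeably fatter and circles turn into ovals, as noted above.
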
  • Brighter == Better (Score:5, Interesting)

    by engagebot ( 941678 ) on Thursday February 16, 2006 @03:59PM (#14735994)
    To the average Best Buy shopper, the brightest screen in the lineup wins. Doesn't matter if the red tones are blown out, doesn't matter about artifacting.

    Just turn the brightness control down a few notches on a particular TV in the lineup, and watch the Best Buy sales numbers change.

    Same thing with audio equipment. Room-shaking bass and razorblade-sharp piercing highs sell gear. Doesn't matter if it's a balanced sound, or if there's any separation between the elements in the mix. More bass? Check. Killer sharp highs? Check. Go to the checkout counter.
    • If you read the article, you would see the results:

      The JVC Professional unit consistently received the highest grades despite its being the smallest and dimmest of the units
      So, no, "Brighter == Better" did not occur in this test.
      • Granted, that wasn't the case in this test, but it happens a ton on the actual retail front.
      • One thing many manufacturers do is default the white balance on their TV sets to 9000K or higher, since it looks a lot brighter than a properly set TV at 6500K. However, when watching at the lower temperature, although the image will not look as bright, the colors will appear more vivid. I believe the SMPTE NTSC standard recommends 6500K or thereabouts. I don't know about ATSC, but I would imagine it's similar. Try it with a monitor. At first it will look yellowish until you adjust to it, though thi
      • What you mean, of course, is that panelists were told not to evaluate brightness. Whether they effectively ignored it or not is another question altogether, especially when evaluating nonidentical models simultaneously.
    • Funny you should mention this, because this is similar to what I came across a few weeks ago when I was shopping for an HDTV. The most expensive Sony in the lineup looked like it was tweaked, because the contrast, brightness, and color balance looked dead on. I don't think they turned the brightness or contrast up, because black definitely looked black and the picture did not look washed out. And the TV actually had a real HD signal run into the television, so the picture was at its premium quality (don't kno
    • by Kadin2048 ( 468275 ) <.ten.yxox. .ta. .nidak.todhsals.> on Thursday February 16, 2006 @04:57PM (#14736557) Homepage Journal
      I think in retail it's a combination of brightness and contrast/color saturation. If you look at the TVs people are drooling over at Best Buy, they're often the ones that have the contrast and saturation jacked up ridiculously high, also. Sometimes to the point where flesh tones start to look really distorted, everyone looks like they're wearing a lot of blush on their cheeks and stuff. It's pretty bad.

      But this same thing happens with photos. A few years ago there was a sort of "contrast war" between the makers of different high end digital minilab equipment (principally Agfa and Fuji). In order to create pictures that "look best," they each would come out with new software for the minilab system that would pre-process the digital image coming from the film scan before it went to the printer. Generally the "automatic" options (on either brand) would compress the dynamic range horribly, then proceed to drive the saturation up to almost unbelievable levels. But customers loved it because it made their vacation photos look like postcards, so what the hell. Nobody really cares about 'accuracy' in the real world -- or rather, not accuracy to the physical world or to the film, they want a product that's accurate to their memory of something, which often is nearly unrelated to reality. Give them that, and you'll get rich.

      Same thing with the "bass boosters" or "sound enhancers" on low end stereos. It mucks the music up, but people think it's better that way.

      The television thing is the same. People don't really want to see what the actual football field looks like, they want to see what they think the football field looks like, and that means the grass ought to be bright, hunter green, the white uniforms should be almost shiny, and the yellow lines should be just about ready to pop off the screen, walk across the room, and rip your eyeballs out. Being true to the video signal that's coming into them isn't a factor.

      This is why if you want accuracy, you generally have to pay for it or expend some effort. With a photo, you have to tell the lab operator to run it though without corrections. With audio, you have to get "nearfield monitors" instead of regular consumer stereo speakers, and with televisions, it's why there are video monitors that are actually made to display what they're being fed, instead of an idealized version.

      It's all about giving people what they think they want.
    • Same thing with audio equipment. Room-shaking bass and razorblade-sharp piercing highs sell gear. Doesn't matter if it's a balanced sound, or if there's any separation between the elements in the mix. More bass? Check. Killer sharp highs? Check. Go to the checkout counter.

      ::cough:: Bose ::cough, cough::
    • That reminds me of a pal of mine, who works as a color printing expert in the graphic arts industry. Whenever we go to a consumer electronics store, he compulsively starts adjusting all of the TVs. When he starts out, they are all way over on contrast and brightness, and all look a lot different. After a few minutes, he's gotten them all looking as good as they can, and all pretty much the same in color rendition. Then the sales guys notice and tell us to get out and never come back.
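
  A toy version of the "showroom" processing described in this thread (oversaturation plus a steep mid-tone contrast curve) looks roughly like the sketch below. The gain and curve values are invented for illustration; real sets use their own proprietary picture modes.

      # Toy "showroom/vivid" picture mode: push saturation and steepen mid-tone
      # contrast so colors pop on the sales floor. Values are illustrative only.
      import colorsys

      def showroom_mode(rgb, sat_gain=1.5, contrast=1.8):
          """rgb is an (r, g, b) triple in 0..1; returns an exaggerated version."""
          h, s, v = colorsys.rgb_to_hsv(*rgb)
          s = min(1.0, s * sat_gain)                            # oversaturate
          v = min(1.0, max(0.0, 0.5 + (v - 0.5) * contrast))    # crush shadows, blow highlights
          return colorsys.hsv_to_rgb(h, s, v)

      # A muted skin tone comes out brighter and noticeably redder (the "blush" effect).
      print(showroom_mode((0.80, 0.62, 0.55)))
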
  • "Experts" (Score:4, Insightful)

    by EraseEraseMe ( 167638 ) on Thursday February 16, 2006 @04:00PM (#14736001)
    Experts also go into these reviews with their own 'professional' bias against specific companies, models and brands while a lay-consumer, like myself, doesn't care if it's a Hitachi, RCA, Samsung or Sony.

    Regardless of HOW it gets a 'sharper picture', if it appears to be a sharper picture to my eyes, then of course it's going to get a higher score over something with possibly better technology that SHOULD create a sharper image but creates other problems in its 'excellency.'

    Do you buy a name-brand TV that has all of the gizmos and gadgets to make it perfect, or do you buy the Walmart-brand TV that looks good and sounds good (to your eyes anyway) until your TV expert friend comes in and poo-poos everything?
    • If I'm looking for value, I'll go to Walmart, with full knowledge that if I get a no-name brand, I'm almost always getting less quality.

      College student's budget FTW.
    • Except the highest-ranking and the lowest-ranking displays were both by the same company, so the brand bias issue can be ruled out in this case. And introducing fake detail to make an image look sharper is not good, no matter how many displays it might sell.
      • And introducing fake detail to make an image look sharper is not good, no matter how many displays it might sell.

        Au contraire! Taking a cheap-ass display, and using a clever signal processing trick to make it look sharper and higher quality is an engineering triumph! I'm not joking here. Who is the best judge of which display is "best"? The guy buying the display. If the "experts" want their non-edge-enhanced, non-blown-color, flawless paragon of video processing, great. They can pay extra for the pri
    • Re:"Experts" (Score:3, Informative)

      by LordSkippy ( 140884 )
      Regardless of HOW it gets a 'sharper picture', if it appears to be a sharper picture to my eyes

      But the picture isn't sharper; it's actually degraded with extra noise. I suggest you turn the "Sharpness" all the way down on your TV and leave it there for a month. At first the image will look "soft" and not as "crisp"; however, it will be free of the noise that is distorting the image. After your eyes adjust to watching TV without this added distortion, you'll realize that the "Sharpness" adjustment shoul

    • Re:"Experts" (Score:3, Interesting)

      Experts also go into these reviews with their own 'professional' bias against specific companies, models and brands

      If the tests were properly conducted as double-blind tests where the experts didn't know the specific company, model and brand of the TV set they were judging, a lot of that bias could be discounted.

      (Of course, this isn't entirely possible -- even if you put a piece of electrical tape over the insignia, a consumer electronics expert is going to be able to recognize the make from as little as th
    • You didn't read the article obviously. The 'expert' brand bias caused them to unanimously rate JVC both first and last, $44K pro vs. $4k consumer monitor. The rest of the post reads like little more than one long rant against the notions of 'excellency' or 'expert'.
      • "The JVC Consumer unit scored highest with the consumer panelists and lowest with the Video Expert panelists"

        Apparently there's a disconnect between the "Regular Joe's" and the "Experts". That was the point of the post, jackass. Try reading the article next time.
        • There was no need to convince me further of your lack of reading comprehension, but since you insist. For your benefit then the line:

          "The 'expert' brand bias caused them to unanimously rate JVC both first and last, $44K pro vs. $4k consumer monitor."

          was obvious to most in response to your statement:

          "Experts also go into these reviews with their own 'professional' bias against specific companies, models and brands while a lay-consumer, like myself, doesn't care if it's a Hitachi, RCA, Samsung or Sony."

          Th

          • Because you're apparently retarded, here's the jury panel evaluations:

            Group (panelists)     Brillian 720   JVC Consumer 720   Brillian 1080   eLCOS-JDSU   JVC Professional 1080
            Student (5)           B+             B+                 A-              A-           A-
            Non-Technical (6)     B              A-                 A-              A-           A
            Technical (6)         B+             A-                 A-              A-           A
            Home Theater (6)      A-             B+                 A-              A            A
            AV Professional (5)   B              B                  B+              B+           A
            Video Expert (6)      A-             B-                 A-              A-           A+

            Notice the difference in scores between the "Video Experts'" opinion of the high-end JVC and of the low-end JVC, compared to everyone else.
            • Line 1 of your first post claimed experts operate on brand bias. Yet this same bias caused the experts to place one JVC model at the top of the class and the other JVC model dead last, in perfect correlation with the measurements. Apparently we've dropped that approach and moved to cost. If you tip the tinfoil back far enough for a better view, every group placed the Pro model first, three as the stand-alone winner and two in conjunction with others. All placed the consumer JVC at or tied for bottom with the Brillian
  • Yeah, average people know jack shit about what they see and hear. I know some people (even musicians!) who are just fine with listening to 64kbps mp3s, and can tell the difference between that and 192kbps, but don't care enough to prefer 192. Meanwhile, I can still hear the difference between 128 and 192.
    • Re:Quality (Score:3, Informative)

      by engagebot ( 941678 )
      The problem is that a 64kb stream sounds the same (roughly) as 192kb when you're using $5 wal-mart headphones or the free bundled speakers that came with your $299 after-rebate-special PC.

      Mackie HR824s or some Sennheiser cans would still blow those people's minds. But then again, I've seen people who STILL can't tell the difference...
      • (...) when you're using $5 wal-mart headphones or the free bundled speakers that came with your $299 after-rebate-special PC. Mackie HR824s or some Sennheiser cans would still blow those people's minds. But then again, I've seen people who STILL can't tell the difference...

        So if people get maximum satisfaction out of $5 wal-mart headphones or the free bundled speakers, who is really worse off? Somehow I'm glad I don't need a chef from the Michelin Guide to enjoy a meal which by my standards is just perfect. YM
    • I know some people (even musicians!) who are just fine with listening to 64kbps mp3s, and can tell the difference between that and 192kbps, but don't care enough to prefer 192.

      My theory is that the musicians don't care about the fidelity that much because that's not what they're listening for. They're paying attention to the notes, not the frequencies.

      My competing theory is that since many musicians play in heavily amplified situations without hearing protection, their sensitivity to certain frequencies is
      • I'm a part-time classical musician. Listening to mp3s lower than 128kbps does not bother me because of just what you stated, except that my hearing is perfect and we never have amplification.
      • (i'm assuming you're also a musician, like me)

        Then again, have you been in a Guitar Center recently? A lot of self-proclaimed musicians don't know what 'sounds good' either. It's really just a toy store these days, with a lot of crazy gadgets (digital modeling amps, etc.) and fewer and fewer real items. Apparently with marketing, you can even make a guitarist think that a $25 9-volt-powered distortion pedal is just as good as a $1500 point-to-point-soldered analog tube amp.
        • You can get some cool sounds out of those Line 6 Pods and Variax guitars. I have both myself, and they're fun, but it doesn't even come close to my B-52 AT-212, my collection of BOSS pedals, my Cry Baby, and my Music Man guitar.

          I don't know why people buy cheap equipment and then say they get great sound from it. A cheap solid state guitar amp simply isn't going to sound anything like a fully analog tube amp. If you want that cheesy faux-metal Linkin Park sound, then go for it, but if you want a good mell
      • For me, it's the opposite. I'm a musician who can't stand to listen to MP3s at 128 or lower.

        I've done blind tests. Have someone play the same music at 192 or 128 and I can tell which is which. I just have to listen for a couple minutes and then decide if my ears are hurting or not. At 128 kbps or less, my ears start to hurt after 5 minutes or so.

        My theory is that my ears are straining to hear the harmonics that are lost with aggressive compression. Since I'm used to playing in acoustic settings (stringed in
    • I'm a semi-professional musician (I've played with a dozen professional artists and recorded with four, but it's not regular enough to be a career). I can hear the difference between 64, 128, 192, and 256. It's largely a frequency range thing, though somewhere between 128 and 64 you seem to lose a lot of the "fullness" of the sound.

      I've got no problem listening to most music at 64k. Most of mine I keep at 128k because my MP3 player is too small to fit many songs at higher bitrates. I recognize the loss in q
    • I know some people (even musicians!) who are just fine with listening to 64kbps mp3s, and can tell the difference between that and 192kbps

      Some people listen for the music, not the sound quality.

      That's why there are people who are content to listen to a recording of a great artist on a scratchy old 78. They can hear the quality of the art even if it's not being faithfully reproduced by the equipment or media.

      Similarly, there are people who obsess over the technical abilities of their equipment so much that
    • Another issue is that the mind is tricky about fooling you into hearing what you think it should sound like after a while. Kind of like how you tune out some obnoxious background noise after a little while of exposure.
    • I have one bad ear. Stereo is a waste of bandwidth.

      As for the vision, well, my eyes are starting to age too.
  • by gEvil (beta) ( 945888 ) on Thursday February 16, 2006 @04:01PM (#14736016)
    You mean to tell me that people unqualified to make a judgment call about something don't necessarily make the best decision?!?
    • Re:I'm shocked! (Score:3, Insightful)

      by timeOday ( 582209 )
      Who are you to say it's not the best decision? Buying a TV or stereo is not like a medical procedure where there are long-term unforeseen consequences. Whatever Joe Stupid likes best is the best... for him.

      Subjectivity is rampant among experts also, for instance many long-time photographers love film grain but can't stand pixelization or compression artifacts. Why? Conditioning.

      • Hehehe. I knew I'd get one of these. Yes, anything that is based upon personal opinion is subjective (hell, that's the very definition right there!). Unless you're comparing hard specs, there's going to be a certain amount of subjectivity involved. And, as others have already pointed out, the consumers and experts were very obviously judging these sets on two totally different sets of criteria.
    • Funny as your comment is, I think from a marketing standpoint, if a product can be brought to the consumer with, albeit subjectively, superior quality at a lower cost, it's like a gold mine to manufacturers.

      If artificial texture makes a consumer believe it has better resolution, and makes for a sale, I would wager that all middle- to lower-end HDTV products will pick up on that fact.
  • No Sony (Score:1, Insightful)

    by Anonymous Coward
    No Sony LCOS. Give me a break.
  • Think of how many tech products of obvious low quality to any expert are big sellers. This applies to consumer products in general as well... it shouldn't be a surprise to anyone!
  • it's about making consumers look stupid. Evil, yes, but aren't we all?
  • Oversharp (Score:4, Insightful)

    by LordMyren ( 15499 ) on Thursday February 16, 2006 @04:08PM (#14736088) Homepage
    Many consumer sets are tuned to be strongly oversharpened. I was at Circuit City and some guy doing consumer research for whatever big company he worked for asked me to compare some DLP and plasma units. Since I was doing that for myself anyway, I was happy to oblige him in some discussion.

    The JVC at first looked really eye-catching and noticeable compared to the rest, but staring at it for three minutes made me realize it was because they cranked the crap out of the sharpness filter. Everything looks sharp and bold for a couple of minutes, very eye-catching, but after three minutes it gets really exhausting and thoroughly artificial. I can't remember the other set that did this. Way too much post-processing, but it catches your eye.

    I told the guy this; he said I was definitely the first person to ever describe anything as "oversharp" to him. Surprising, considering how much filtering some of these units do.
    • Well, it caught my mother's eye, so I'll try it out at the coming visit, I suppose. She mainly compared it with a Samsung TFT, but she thought that the Samsung image was too sharp/too pronounced. And she preferred the design over that of the Samsung as well.

      Of course, it will beat the crap out of the 4:3, 15-year-old television screen no matter what happens (it finally broke down - now I will be able to watch more than 26 channels and zap up AND down).
      • 26 channels? What kind of OTA are you getting?

        My 'rents are paying like $14/mo to get cable that gives them the OTA stations, their reception is so bad. It still looks like garbage and they still only get like 12 channels. Coastal Maine though, YMMV.

        Honestly, some of the postprocessing is so overboard I would take the no-resolution 15-year-old TV we found in the trash (37-inch Zenith). It wasn't the sharpness, but some other filter made one of the sets look like everything was coated in three layers of
        • It's not the cable, but the TV, which is/was limited to 26 channels. And she never thought it wise to get a digital set top box, so there you go :)
  • What is Sony's deal? (Score:3, Informative)

    by MrPeavs ( 890124 ) on Thursday February 16, 2006 @04:09PM (#14736102)
    It is really too bad Sony wouldn't send out a unit. Their SXRD line-up, right now, is probably the best consumer grade TV out on the market.

    I have been in awe of LCoS since it came out, starting with Toshiba's failed attempt at releasing it. Toshiba had some major problems out of the gate, and I don't think it helped that their price tag was $8,000 for the 50-some-inch and $10,000 for the 60-some-inch. They did look great, though, despite the problems.

    Then JVC hit the market with one, renaming it HD-ILA. I'm not exactly sure why they renamed it; maybe to disassociate themselves from the failed Toshiba LCoS sets? They looked great when compared to DLP, LCD, and even plasma, though they were still on the pricier side. My only complaint is that they were JVC, a company that I would put in the middle of the road as far as quality. I also hate this new trend toward silver TVs, but those two were only minor issues, with one just being a personal preference.

    Then Sony came out with their renamed LCoS, the SXRD. Sites like AVSForum were all abuzz with these new sets. When I finally got to see one in person, it was a dream come true. LCoS overall is a better technology than DLP and especially LCD. DLP may be able to make a surge and take LCoS's crown once we see 3-chip DLP sets at "affordable" prices. I use affordable loosely, as $4,000 for 50" and ~$5,000 for 60" isn't exactly "affordable" for everyone, but for videophiles, it is.

    I have not heard of the other companies that they listed, and to my fault, I haven't been on AVSForum much recently. I would not trust them until I see some reviews; off-brands tend not to do well, especially startup companies like Brillian, which usually just don't have the funding or experience to make quality sets their first time around. The one company I would love to see make an LCoS set would be Mitsubishi. I am loyal to them, to a degree. They have been making big-screen TVs for many years now, actually almost three decades. They showed they know what is up when they truly entered the DLP market. I am not talking about their first sets when DLP was brand new and never took off, but rather about two years ago, when they and Toshiba challenged Samsung's DLP crown, which Samsung held only because they were the only ones making DLP sets. Mitsu did it right, beating out the Samsung sets hands down. The only downside was that you were paying a little more for a Mitsu DLP. Toshiba also did a great job at DLP; I would rank them Mitsu, Toshiba, and then Samsung in overall DLP quality, though the new pseudo DLP/LCD 3-driver 1080p Samsung set is pretty impressive.

    The sad thing is, I think LCoS is only going to have a short life as the technology to get. SED and OLED are on their way. SED is supposed to actually rival CRT picture quality for about the same price, without the size and weight of a CRT -- something plasma and flat-panel LCD are unable to do and probably never will be able to do. For the time being, though, LCoS is the way to go, and if you can't afford the Sony SXRD set, the JVCs are still great sets for much less. I think their ~50" is going for about $2,500 or maybe even less.
    • by chiph ( 523845 )
      I agree about the picture quality of the SXRD sets -- compared to any other projection set in its price range, they top them all. But still, when I bought my first HD set the other month, I got a tube display (34" XBR) for two main reasons:

      1. 50" (the smallest SXRD) is still far too large for my room. Maybe they'll come out with a 40" at some point.
      2. The tube picture quality is still better than even the SXRD. There's a reason why Best Buy et al. keep the tube sets far away from the projection models
  • small wonder... (Score:4, Insightful)

    by pulse2600 ( 625694 ) on Thursday February 16, 2006 @04:33PM (#14736339)
    One possible explanation for the consumer ratings is that JVC is simply giving consumers exactly what they think they want.

    This statement hits the nail on the head... JVC knew what they were doing when they made a technically crappy screen, just like Microsoft cares more about how much users like Clippy the Office Assistant than they do about a buffer overflow. They know what they need to do to sell their product; most other things are irrelevant. Why should JVC give a flying rat's if 100,000 geeks see artifacts when 1,000,000 non-geeks see "sharpness and texture"? They'll probably make more off the geeks by selling them some model they deem "higher-end" than the consumer version for 20% extra, because the geeks will perceive it as being so much better than the "inferior consumer" model. Someone at JVC knows how to play the consumer-perception card real well, and I bet this particular example comes with a manufacturing cost savings as well.
    • Re:small wonder... (Score:3, Interesting)

      by kettch ( 40676 )
      JVC isn't the only one who manipulates customers. I went into the local Sears to look at displays, because they have the best selection in our area. I quickly noticed that the high-priced, sexy, flat-panel (read: "high commission") models had crystal-clear feeds. The slightly older TVs and the CRT models had an obviously doped feed that was fuzzy and had a little bit of static.

      I pointed out the poor picture quality to the person I was with, and the nearest sales droid jumped in and informed us that tho
      • Re:small wonder... (Score:5, Interesting)

        by DiscoOnTheSide ( 544139 ) <ajfili&eden,rutgers,edu> on Thursday February 16, 2006 @06:56PM (#14737670) Homepage
        Having worked in a Sears in Toms River, NJ in the same department... it's not so much that they're doping the signal... it's that retards wire the displays. No TV shop is going to be able to show you the full quality of a TV using an HD signal. Sears had their feed from DirecTV... the non-HDTVs were put on the plain ol' Discovery Channel. Now, the splitters they use for the HD screens are pretty efficient, but there is still a quality loss at each split. The non-HDTVs were hooked up via run-of-the-mill coax going through (as far as I could count) 80 RF splitters... those splits are NOT as efficient. Also, for some reason, some TVs took the split signal fine, others wouldn't touch it, and a lot of the LCD TVs (~20 inches) put out the most god-awful picture ever. The DirecTV sat box pushing the HD signal was set to 480p... nicer than regular TV but nowhere near the level that the TVs could produce.

        I rewired a bit of the store (my manager didn't give a shit 'cause the better the TVs looked, the more likely we were to sell). All of the top-shelf TVs (particularly the Sony XBR LCoS line) were hooked up to Samsung or Sony upconversion DVD players via HDMI. Pretty much I could say, "This is exactly how DVDs will look on your TV, and full HD service will look even better." And customers ate it up. Eventually I swapped DirecTV boxes out of the break room and into the display, and lo and behold, 1080i went to all of the HDTVs.

        The difference was immediately noticeable and sales surged. I was then fired for not selling enough warranties, my 9.5% not up to their 10% "desired goal", regardless of the big increase in sales I brought in... If the way that store is run is any indication of how Sears as a whole operates... I give it a decade until they're all K-Marts. They'd shut off the AC on 100-degree days at 8 PM (closing time is 10) to save money. Older folks were about ready to have heat strokes, as was I, surrounded by CRT and plasma screens all day...
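
  A rough loss estimate for the splitter chain described above, assuming the rule-of-thumb figure of about 3.5 dB of insertion loss per 2-way split (actual splitters and any distribution amps will vary):

      # Back-of-the-envelope RF loss through cascaded 2-way splitters.
      # 3.5 dB per split is a typical spec, used here as an assumption.
      import math

      LOSS_PER_SPLIT_DB = 3.5

      def cascade_loss_db(outputs):
          """Approximate loss at each output of a balanced tree feeding `outputs` TVs."""
          splits_in_series = math.ceil(math.log2(outputs))
          return splits_in_series * LOSS_PER_SPLIT_DB

      for n in (2, 8, 80):
          print(f"{n:3d} outputs -> roughly {cascade_loss_db(n):.1f} dB down at each set")

  Feeding 80 sets takes about seven splits in series, roughly 24.5 dB of loss without amplification, which is consistent with the fuzzy, noisy picture on the non-HD sets described above.
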
  • A search of PC mag does not even come up with this article...
  • So perhaps the technically precise unit is the 'lossless' presentation, and the JVC unit gave up on the technically precise elements that the average consumer's eyes can't "hear", and so the 'lossy' presentation looked just as good as, or better than, the lossless one because it suited the analog eye instead of the digital microscope?
  • by mmell ( 832646 ) on Thursday February 16, 2006 @05:36PM (#14736917)
    (My dad was a TV repairman - I grew up looking at television damned near 24x7)

    Most consumers don't want a realistic-looking picture; they want the picture they've seen all of their lives. Even with televisions: many of my wife's family and friends, upon hearing about my background, asked me to look at their televisions. Most needed minor convergence/pincushion adjustment; all needed brightness/contrast/color/tint adjustment. I made them all look (IMHO) pretty good.

    Virtually every set I touched was changed within a week. The single control that was most nudged: color (think saturation). Everybody is used to the cartoon-level, LSD-induced superbright colors of a children's room. Real skin doesn't look like that! I could even hold my bare arm up next to a character on TV and show my relatives and friends that this is what the picture should look like (gee, flesh looks like flesh, grass looks like grass), and within ten minutes they'd be cranking up the color.

    I gave up. Nowadays, I tell people "I don't do Windows, and that includes televisions". Yes, I get some weird looks for it, but I also get bothered a lot less.

    Buy the television that matches your pocketbook and your expectation of picture quality. Most of you will never miss the extra quality that a 200-300% increase in price will bring; worse, you'll probably adjust the extra quality right out of the set in a quest to get the lurid color balance you want. By the way, on a new set you should have a pretty good picture if both brightness and contrast are set to mid-range. Cranking both of them to max may look like what you want, but you're just cutting the lifespan of your picture tube in half (this applies to CRTs only - I have no idea what the effect is on LCD/plasma displays).

    • My father didn't want me to fix the color levels on his TV, even though I could show him how badly overblown they were because the solid colors in his DVR menu were bleeding all over the place. He insisted there is nothing wrong with that and that he likes it that way.
  • In terms of expert versus consumer opinions, the buyer's opinion is the only one that matters. But experts often pick up things that consumers miss in their initial (brief) evaluation, and then discover much later with regret. To get a handle on this point just replace "HDTV" with "automobile" for example. A consumer needs to be happy with their own "test drive" but they couldn't possibly examine emergency handling, engine and transmission performance issues that only an expert can do. In the context of the
  • I think that all white wines taste like cat piss, yet there are huge price differences between different brands, and people think some of them taste divine, while others taste like... cat piss.

    It's the same reason why _really_ crappy surround-sound systems sell like hotcakes: they sound like crap, but people really don't know the difference, and probably haven't really heard what a good system is supposed to sound like. It was like when I finally got a friend to come with me to a THX theater, and he finally
