1080p, Human Vision, and Reality

An anonymous reader writes "'1080p provides the sharpest, most lifelike picture possible.' '1080p combines high resolution with a high frame rate, so you see more detail from second to second.' This marketing copy is largely accurate. 1080p can be significantly better than 1080i, 720p, 480p or 480i. But (there's always a "but") there are qualifications. The most obvious qualification: is this performance improvement manifest under real-world viewing conditions? After all, one can purchase 200mph speed-rated tires for a Toyota Prius®. Expectations of a real performance improvement based on such an investment will likely go unfulfilled, however! In the consumer electronics world we have to ask a similar question. I can buy 1080p gear, but will I see the difference? The answer to this question is a bit more ambiguous."
  • Article Summary (Score:5, Informative)

    by Jaguar777 ( 189036 ) * on Tuesday April 10, 2007 @09:02AM (#18674793) Journal
    If you do the math, you come to the conclusion that the human eye can't distinguish between 720p and 1080p when viewing a 50" screen from 8' away. However, 1080p can be very useful for much larger screen sizes, and is handy to have when viewing 1080i content.
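    The back-of-the-envelope math, as a rough Python sketch (the one-arcminute acuity figure, the 16:9 geometry, and the 50"/8' setup are illustrative assumptions, not figures from the article):

        import math

        def pixel_pitch_in(diagonal_in, horiz_px):
            # Width of one pixel, in inches, on a 16:9 panel.
            width_in = diagonal_in * 16 / math.hypot(16, 9)
            return width_in / horiz_px

        def eye_limit_in(distance_ft, arcmin=1.0):
            # Smallest detail (in inches) resolvable at this distance,
            # assuming ~1 arcminute of acuity (the usual 20/20 figure).
            return distance_ft * 12 * math.tan(math.radians(arcmin / 60))

        limit = eye_limit_in(8)  # ~0.028 in at 8 feet
        for name, px in [("720p", 1280), ("1080p", 1920)]:
            pitch = pixel_pitch_in(50, px)
            print(name, round(pitch, 3), "in/pixel; resolvable:", pitch > limit)
        # 720p's pitch (~0.034 in) sits just above the eye's limit; 1080p's
        # (~0.023 in) falls below it, so the extra pixels go unseen at 8'.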
    • by elrous0 ( 869638 ) * on Tuesday April 10, 2007 @09:11AM (#18674957)
      In other words, your mother was wrong. You're better off sitting CLOSER to the TV.
      • ...depending on how old you are. I think the concern was associated more with X-ray emissions from CRT televisions, and older ones at that (prior to the introduction of the Radiation Control for Health and Safety Act of 1968 [fda.gov]). I would venture to say that most of us on this site are too young to have been plopped in front of a TV that old for large amounts of time.

        • by Hoi Polloi ( 522990 ) on Tuesday April 10, 2007 @09:39AM (#18675399) Journal
          I assume this is why TV tubes are made with leaded glass, to absorb the soft x-rays being generated. This is also why tossing out a TV tube improperly is a pollution no-no.
          • by Radon360 ( 951529 ) on Tuesday April 10, 2007 @10:40AM (#18676471)

            You are correct about the lead [wikipedia.org]. According to this site [state.mn.us], a CRT can have 5-8 pounds of lead in it.

          • by Kadin2048 ( 468275 ) <slashdot...kadin@@@xoxy...net> on Tuesday April 10, 2007 @10:43AM (#18676541) Homepage Journal
            I think there are multiple techniques used to control X-ray production. Leaded glass might be one of them.

            What's interesting to note is that although you generally think about the picture tube being the source of problematic X-rays, in reality it was some of the other tubes -- particularly rectifier tubes -- back in the guts of older TVs that really had issues. Since modern televisions usually don't contain any tubes besides the one you look at, we don't think about the others very often, but they were at one point a major concern.

            This Q&A [hps.org] from the Health Physics Society describes the issue: "The three major sources of x rays from these sets were the picture tube, the vacuum tube rectifier, and the shunt regulator tube. The latter (designations 6EF4 and 6LC6) were a particular problem. Over a third of the 6EF4 tubes tested produced exposure rates above 50 mR/hr at a distance of nine inches, and exposure rates up to 8 R/hr were observed at seven inches with one defective tube!" Just to put that in perspective, 8 R/hr is like ~150 chest X-rays per hour, or like getting a whole-body CAT scan once an hour. Granted, you probably don't usually sit seven inches away from your TV's power supply, but it's still unhealthy. (And a lot of people's cats do...)

            So really, sitting next to the side of that old (1965-70) TV could be a lot more hazardous than sitting in front of it.
        • Re: (Score:3, Interesting)

          by ronanbear ( 924575 )
          It's not just that. Sitting too close to the TV (especially CRTs) strains your eyes. That's because the light itself is non-parallel (it comes from a single source), so your eyes have to adjust their focus to take in the whole picture. Your eyes can correct for it, but over prolonged periods you tend to get headaches and eye fatigue.
        • Re: (Score:3, Funny)

          by Gilmoure ( 18428 )
          Speak for yourself non-melted eyeballs boy! Some of us are old. We're out on our lawns, yelling at the kids don' cha' know.
    • by fyngyrz ( 762201 ) * on Tuesday April 10, 2007 @09:18AM (#18675041) Homepage Journal

      According to the linked text, the "average" person can see 2 pixels at about 2 minutes of arc, and has a field of view of 100 degrees. There are 30 sets of 2 minutes of arc in one degree, and one hundred of those in the field of view, so we get: 2 * 30 * 100, or about 6000 pixel acuity overall.

      1080p is 1920 horizontally and 1080 vertically at most. So horizontally, where the 100 degree figure is accurate, 1080p's 1920 pixels are only about a third of the detail your eye can resolve, and the answer to the question in the summary is, yes, it is worth it.

      Vertically, let's assume (though it isn't true) that only having one eye-width available cuts your vision's arc in half (it doesn't, but roll with me here.) That would mean that instead of 6000 pixel acuity, you're down to 3000. 1080p is 1080 pixels vertically. In this case, you'd again be at about 1/3 of your visual acuity, and again, the answer is yes, it is worth it. Coming back to reality, where your vertical field of view is actually greater than 50 degrees, your acuity is higher and it is even more worth it.

      Aside from these general numbers that TFA throws around (without making any conclusions), the human eye doesn't have uniform acuity across the field of view. You see more near the center of your cone of vision, and you perceive more there as well. Things out towards the edges are less well perceived. Doubt me? Put a hand up (or have a friend do it) at the edge of your vision - stare straight ahead, with the hand at the extreme edge of what you can see at the side. Try and count the number of fingers for a few tries. You'll likely find you can't (it can be done, but it takes some practice - in martial arts, my school trains with these same exercises for years so that we develop and maintain a bit more ability to figure out what is going on at the edges of our vision.) But the point is, at the edges, you certainly aren't seeing with the same acuity or perception that you are at the center focus of your vision.

      So the resolution across the screen isn't really benefiting your perception - the closer to the edge you go, the more degraded your perception is, though the pixel spacing remains constant. However - and I think this is the key - you can look anywhere, that is, place the center of your vision, anywhere on the display, and be rewarded with an image that is well within the ability of your eyes and mind to resolve well.

      There are some color-based caveats to this. Your eye sees better in brightness than it does in color. It sees some colors better than others (green is considerably better resolved than blue, for instance.) These differences in perception make TFA's blanket statement that your acuity is 2 pixels per two minutes of arc more than a little bit of hand-waving. Still, the finest detail in the HD signal (and normal video, for that matter) is carried in the brightness information, and that is indeed where your highest acuity is, so technically, we're still kind of talking about the same general ballpark — the color information is less dense, and that corresponds to your lesser acuity in color.

      There is a simple and relatively easy test that you can do yourself. Go find an LCD computer monitor in the 17 inch or larger range that has a native resolution of 1280x1024. That was pretty standard a few years back, so it should be easy to do. Verify that the computer attached to it is running at that resolution. This is about 2/3 of HD across, and nearly 1 HD vertically. Look at it. Any trouble seeing the finest details? Of course not. Now go find a computer monitor that is closer to HD, or exactly HD. You might have to go to a dealer, but you can find them. Again, make sure that the computer is set to use this resolution. Now we're talking about HD. Can you see the finest details? I can - and easily. I suspect you can too, because my visual acuity is nothing special. But do the test, if you doubt that HD offers detail that is useful to your perceptions.

      Finally, n

      • Re: (Score:3, Insightful)

        by maxume ( 22995 )
        So the op says "the human eye can't distinguish between 720p and 1080p when viewing a 50" screen from 8' away" and then you go on and on and on to come to the conclusion that it ends up mattering how big the screen is and how close you sit to it, essentially because the human eye is limited to hd resolution or so when a screen is taking up 1/3 of your field of view. Nice work.
        • by InsaneProcessor ( 869563 ) on Tuesday April 10, 2007 @10:03AM (#18675823)
          This is just another reason why I still use an old standard TV. Until the dust settles, I ain't spending one thin dime on HD.
        • Re: (Score:2, Funny)

          by fyngyrz ( 762201 ) *

          And your accusation of redundancy covers my bringing up acuity variations across the eye, the difference between color and luma acuity, the differences between horizontal and vertical acuity, and scanning the image as opposed to trying to catch it all in one gestalt, along the general theme that distance isn't the entire issue... exactly how?

          Oh. You didn't get all that. I'm sorry. I thought you might have been paying attention. My bad.

      • by Paladin128 ( 203968 ) <aaron@noSpam.traas.org> on Tuesday April 10, 2007 @09:35AM (#18675321) Homepage
        Though you are correct that human acuity degrades near the edges of the visual field, some people actually look around the screen a bit. I'm setting up my basement as a home theater, and I'll have a 6' wide screen, where I'll be sitting about 9' away. My eyes tend to wander around the screen, so the sharpness at the edges does matter.
        • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday April 10, 2007 @11:56AM (#18677797) Homepage Journal

          Though you are correct that human acuity degrades near the edges of the visual field, some people actually look around the screen a bit.

          Thanks to saccades [wikipedia.org], all people actually look around the screen a bit.

          Your brain does this automatically when it wants more information about something. It doesn't bother to inform you that it's done it because it would only be disorienting. So the eye movement is not represented in your mental map.

          Keep in mind that if you wear glasses that turn the image upside down for a few months, eventually you will learn to navigate and it will all make sense to you. It's very important to remember that you are not seeing what you think you are seeing. Your conscious mind is delivered a representation of what you are seeing, which has been heavily processed by your brain.

      • by jimicus ( 737525 ) on Tuesday April 10, 2007 @09:36AM (#18675347)
        You're commenting on something which it sounds like you might actually be qualified to comment on! What are you doing on /. ?
      • by badasscat ( 563442 ) <basscadet75NO@SPAMyahoo.com> on Tuesday April 10, 2007 @10:27AM (#18676239)
        I'm constantly arguing with people about whether or not you can see the difference between 1080p and 720p. As in most other things, many people want an absolute answer: yes you can, or no you can't. They read things containing ambiguity and conclude the answer must be "no you can't." But that's not what the word "ambiguity" means.

        As you point out, not everybody has the same visual acuity. My vision is corrected to better than 20/20, and even beyond that, to some extent I've been trained in visual acuity first by going to film school and then by working in the TV industry for some years. My job is to look for picture detail.

        I have a 42" 1080p LCD monitor, from which I sit about 6 feet away. I can easily distinguish the difference between 720p and 1080i programming (which gets properly deinterlaced by my TV set, so I'm seeing a full 1920x1080 image). Now, some of that is probably the scaling being done by my cable box, but almost surely not all - and anyway, scaling is never taken into account in the opposite argument (i.e. nobody stops to consider that "720p" TV sets are almost all 1366x768 or 1024x768 resolution, meaning they have to scale everything).

        I think the bottom line is some people see the difference, some don't, and yes, it depends on how close you sit and how big your TV is. It depends on a lot of things. There's no "yes, you can see the difference" or "no, you can't".

        One thing I would say is that with prices being what they are these days, I don't see any reason whatsoever not to buy a 1080p set. The price difference between 720p and 1080p in the same line of sets is usually only about 10-20%.
        • Re: (Score:3, Informative)

          by ShakaUVM ( 157947 )
          There are massive differences between the resolution conversion chips. Two more or less identical models, a 42" Sony and a Sharp, look exactly the same in 1080p format. But in 720, the Sharp looks *terrible*, whereas the Sony still looks pretty good. The upconverters used in a lot of set top boxes are the el cheapo ones, so what you may be seeing are simply artifacts from the conversion.

          That said, I have very sensitive eyes too, and have to return more than half of the monitors I buy due to defects of one k
      • Re: (Score:3, Interesting)

        by BakaHoushi ( 786009 )
        This is why I don't care much for HD. I just want to sit down and watch my God damn show or (more often) play my damn game. When you start talking about if the eye can even take in all that data and from what distance, what size screen... it's too much.

        It's still a fucking TV. Does it really matter just how blue X celebrity's jeans look? The reason I've stopped watching as much TV as I used to is because it's become so mindless... or at least I've come to realize how mindless it is. The picture is fine to m
    • Another factor is upscaler quality. I have a DVD player with a Faroudja upscaler, and DVDs played with it look pretty much indistinguishable from HDTV on my set. That is, a well encoded DVD movie looks about as good as (say) an HD episode of CSI.

      That's why I'm in no hurry to get Blu-ray or HD-DVD. I'll wait for one of 'em to win (and for someone to start selling region-free players if Blu-ray wins).
      • Yep, I'm holding off HD-DVD and Blu-ray for the same reason (amongst other, more political ones). I've only got a cheapo DVD player, but my TV is a nice Loewe box that upscales very well (though it has only 720p resolution — no-one was selling 1080any for home TVs at that stage). The first DVD I played on it was Revenge of the Sith, and I was pretty much blown away by the appearance of the opening battle scenes. I've seen some HD demos, and short of watching them side-by-side, I would be hard-pressed

      • Re:Article Summary (Score:5, Insightful)

        by nospmiS remoH ( 714998 ) on Tuesday April 10, 2007 @09:41AM (#18675439) Journal
        I think the bigger issue is that the majority of HD content out there sucks. Taking crappy video and re-encoding it to 1080p will not make it look better. Sure it's "full HD" now, but it can still look like crap. I have seen many 720p videos that look WAY better than some 1080p videos simply because the source content was recorded using better equipment and encoded well. TNT-HD is the worst network of all for this crap. Much of their HD simulcast stuff is the exact same show, just scaled up and oftentimes stretched with a terrible fish-eye effect. It is sad the amount of bandwidth being wasted for this "HD" crap (don't even get me started on DirecTV's 'HDLite'). [/rant]
        • Does anybody else have problems with their cable company over-compressing the digital cable? I pay a lot of money for cable, and in the last few years the quality has degraded while they try to stuff more SD channels, HD Channels, OnDemand Channels, VOIP, and Internet over that same line. I'm not going to upgrade to HDTV until I can be guaranteed that I'm actually getting really good looking television and that the quality won't degrade as they try to put more content on the tubes. Has anybody else notice
      • Re: (Score:3, Informative)

        by ivan256 ( 17499 )
        Do you own stock in Faroudja or something?

        Great. You can't tell. Here's a cookie.

        The rest of us can tell the difference between well encoded 1080i content and upscaled 480p content. I'm very sorry for you that you can't.

        (And I still think that your real problem is that your television does a crappy job of downscaling 1080i to 720p, and that's why you mistakenly believe your upscaled DVDs look just as good.)
      • by lokedhs ( 672255 )
        Well, that's because your HD broadcast is so heavily compressed that the advantage becomes minimal.

        Try connecting a BR player to your HD TV and watch a good quality BR movie. Then you'll see a huge difference.

    • by nsayer ( 86181 ) *
      So once again, the /. summary leaves out the most important bits - you can't talk about resolution without also mentioning screen size and distance from the eyes.

      We have a 50" 720p set 6 feet away from us, and for us, it's ideal. A larger set would overwhelm the room, and we wouldn't really want to move any closer or further away. But with that set-up, the difference between HD and SD programming is both obvious and striking. Even mundane stuff - like comparing WPT broadcasts to the NBC National Heads-up ch
      • IDK about that.
        I recently replaced a 720p monitor with a 1080p unit and there is a remarkable difference when working with HD content.
        Even on my paltry 22" monitor the difference was mind-blowing. I had been producing 1080p content for a while, but displaying it on a 720p output device (the RCs would be viewed on a 1080p plasma); switching to 1080p for everything was awesome.

        -nB
  • 1080p content (Score:5, Insightful)

    by Orange Crush ( 934731 ) * on Tuesday April 10, 2007 @09:03AM (#18674813)
    There's still not much available in the wild that does 1080p justice right now anyway. Horribly compressed 1080p looks every bit as awful as horribly compressed 1080i/720p.
    • I recently saw an article posted by Secrets of Home Theatre, very well known for their DVD benchmark process and articles.

      The article is here [hometheaterhifi.com].

      They show numerous examples of how the processing involved can indeed lead to a better image on 1080p sets. Mind you, it is not just the resolution: 480 material, when processed and scaled, can look better on a 1080p screen than on a 720p (or more likely 768p) screen. It is a very interesting read, although if you are already conversant in scaling and video processing some of it can be very basic. I count that as a feature though, as most non-technical people should be able to read it and come away with the information they are presenting.

      Definitely interesting as a counterpoint.

    • Try these trailers [blogspot.com]...
  • People Are Blind (Score:5, Insightful)

    by CheeseburgerBrown ( 553703 ) on Tuesday April 10, 2007 @09:05AM (#18674859) Homepage Journal
    Considering many people can't distinguish between a high definition picture and a standard definition picture warped to fit their HD screen, this question seems largely academic.

    • I can definitely tell the difference. I leave SD broadcasts with the side and top/bottom bars, otherwise it looks crappy. I also disabled my TiVo's upscale-to-1080i function and left it in the show's native format, otherwise it gets blocky... although right now we are doing renovations in the TV room, so the 56" giant 1080p Samsung is sitting 4 feet from the futon in my spare bedroom... that does not help the viewing either :P
    • by Luke ( 7869 ) on Tuesday April 10, 2007 @09:37AM (#18675355)
      Exactly. I wonder how many people claiming they can see a difference between 1080i and 1080p happily listen to compressed audio with earbud headphones.
      • by Random Destruction ( 866027 ) on Tuesday April 10, 2007 @10:33AM (#18676359)
        You're right. There's no way the people who think 128kbps MP3s sound good could hear the difference between 1080i and 1080p. Those of us who use FLAC and Sennheiser HD650s, however, can hear the distortion in a 1080i video signal from another room.
      • Re: (Score:3, Insightful)

        by Jeff DeMaagd ( 2015 )
        Exactly. I wonder how many people claiming they can see a difference between 1080i and 1080p happily listen to compressed audio with earbud headphones.

        I'm not sure how that is necessarily insightful.

        Video is compressed too. Compression by itself isn't bad; it's when it is poorly compressed that it becomes a problem.

        Your comparison involves different systems of the body. There are people with better ears than others, but worse eyes than others, and there are people with better eyes than others but worse e
    • Considering many people can't distinguish between a high definition picture and a standard definition picture warped to fit their HD screen, this question seems largely academic.

      That's because, given a good upscaler, you can't distinguish much difference between DVD quality (which is most people's benchmark of what their SD TV can do) and 720p (which is what most HDTVs show). If by "standard definition" you're talking about crappy, digitally compressed TV channels at lower resolutions, then sure, there's a

  • by bleh-of-the-huns ( 17740 ) on Tuesday April 10, 2007 @09:06AM (#18674867)
    Last I checked, other than HD/BR DVD players, and normal DVD players that upscale to 1080p, there are no sources from cable or satellite that broadcast in anything other than 720, so it's kind of a moot point. I have heard rumours Verizon FiOS TV will have a few 1080p channels in a few months, but nothing substantial... and last I checked, their boxes do not do 1080p (I could be wrong about the boxes statement though)

    I have a series3 tivo though, which only supports up to 1080i :(
    • Re: (Score:3, Insightful)

      by Cauchy ( 61097 )
      Seems to me, the PS3 is pushing 1080p-capable devices into millions of homes (sales issues aside). Many games that are being released are at 1080p. I just ordered my first Blu-ray disc (BBC's Planet Earth series). I think that is something worth seeing at 1080p.
      • Seems to me, the PS3 is pushing 1080p-capable devices into millions of homes (sales issues aside).

        Then you just need a TV that can display it. :-)

        Perhaps it's different in the US, but certainly here in the UK, 1080 is still exceptional, and almost all HDTVs are really just displaying 720p. Even the serious brands you get from speciality shops have only started supplying 1080-capable units very recently (months, not years) and they cost a fortune even by geek standards. So while I'm sure Planet Earth w

    • My wife and I were looking at TVs, and we walked past some gorgeous 52" LCDs that support 1080p, and I told her this is what I wanted.

      Then she walked past a smaller 32" LCD that only supported 720p/1080i and she said, "this picture looks so much better, and the TV is $1000 less! Why?"

      I casually explained that the expensive TV was tuned to a normal TV broadcast, while the cheaper TV was tuned to ESPNHD. She looked and realized that the most expensive TV getting a crappy signal isn't going to look all that great.

      I still want a nice LCD that supports 1080p, but I'm not pushing for it immediately until I can afford a PS3 and a nice stable of Blu-ray movies to go along with it.

      720p looks IMMENSELY better than 480i, or any crappy upscaled images my fancy DVD player and digital cable box can put out. I have yet to see a nice, natural 1080p image myself, but I'm willing to bet I will be able to tell the difference.

      If anyone recalls, there were people who insisted that you couldn't really tell the difference between a progressive and interlaced picture.
      • If anyone recalls, there were people who insisted that you couldn't really tell the difference between a progressive and interlaced picture.

        Well, provided that there is no temporal difference between both fields and they are combined properly, you can't.

        What you can see is the difference between progressive and interlaced video, and even that is not always true, depending on content and the exact display you are using.

        A 480i resolution picture is roughly 2 fields of 640x240, combined to 640x480.
        A 480p resol
    • by Zontar_Thing_From_Ve ( 949321 ) on Tuesday April 10, 2007 @09:19AM (#18675067)
      Last I checked, other than HD/BR DVD players, and normal DVD players that upscale to 1080p, there are no sources from cable or satellite that broadcast in anything other than 720, so it's kind of a moot point. I have heard rumours Verizon FiOS TV will have a few 1080p channels in a few months, but nothing substantial... and last I checked, their boxes do not do 1080p (I could be wrong about the boxes statement though)

      Wow, this is wrong. Since you mentioned Verizon, you must live in the USA. NBC and CBS both broadcast in 1080i right now. Discovery HD and Universal HD do too. Those come to mind fairly quickly. I'm sure there are others. By the way, I wouldn't hold my breath about 1080p TV broadcasts. The ATSC definition for high def TV used in the USA doesn't support it at this time because the bandwidth requirements to do this are enormous.
    • Sky broadcasts some movies in 1080p. They actually use H.264 within a transport stream, which looks pretty good.

      Here is an example of one of their movies I have: Domino [imageshack.us], an OAR broadcast, so its actual res is 1920x800 or so. It looks much better than DVD to me. I also have the DVD around here somewhere, but no caps from it to compare to.
  • IMAX (Score:4, Funny)

    by timster ( 32400 ) on Tuesday April 10, 2007 @09:07AM (#18674897)
    I, for one, will not be happy until I have an IMAX theater in my home. That requires way, WAY more resolution than 1080p. And you can see the difference for sure.
    • I used to get teased about using outdated technology by members of our local photo club who shoot crop-factor digitals and project digitally, until I brought in my 6x6 projector and put some images up on the screen.
      • by mstahl ( 701501 )

        6x6cm is 120/220 sized film. IMAX is actually 70x48.5mm [wikipedia.org]. So each frame is about as large as from my large-format 5x7 camera.

        Nitpicking aside, though, I've used the same trick to get digital advocates to stfu. One frame of 6x6 at, say, 100 ASA, if you consider each grain of silver halide to be a "pixel", works out to hundreds of megapixels.

    • Re:IMAX (Score:5, Funny)

      by Hoi Polloi ( 522990 ) on Tuesday April 10, 2007 @09:46AM (#18675527) Journal
      I'm trying out this thing called "outdoors". 3D video, extreme HD, millions of colors, and it is free! The reviews say it is incredibly lifelike.
  • If you lean into your honey for a kiss, she doesn't get all pixellated when you get close to her face.

    When you press your face up against your HDTV panel, you should be able to tell the difference between 1080p and reality.

    If you can't tell the difference between the two, then you might want to get your eyes checked.
  • Analogy (Score:5, Funny)

    by Chabo ( 880571 ) on Tuesday April 10, 2007 @09:12AM (#18674973) Homepage Journal
    After all, one can purchase 200mph speed-rated tires for a Toyota Prius®. Expectations of a real performance improvement based on such an investment will likely go unfulfilled, however!

    But it does mean that the performance of the car won't be limited by the tires... ;)
    • > But it does mean that the performance of the car won't be limited by the tires... ;)

      Depends how you define performance. High-speed tires tend to have harder rubber and/or shallower tread depth.

    • But it does mean that the performance of the car won't be limited by the tires... ;)

      Very true, but I believe there is an expectation that the delivery and display of signal(s) will continue to improve so that the capabilities of the new gear can be realized; we don't have the same expectation of the highway infrastructure, at least in the US. (We don't have enough physical or visionary room for wholesale upgrades.)

      The resolution of current televisions will eventually become a limitation. The Prius w
    • by smchris ( 464899 )
      Enough with the Prius bashing. We all know electric motors have great torque. I believe the Prius computer intentionally reins that in some for the boring commuter experience. I got my wife a "My Prius accelerates faster than your SUV" bumper sticker but she doesn't have what it takes to put it on. And they maneuver just fine in 70+ mph freeway traffic.
    • Re: (Score:3, Informative)

      by vought ( 160908 )
      After all, one can purchase 200mph speed-rated tires for a Toyota Prius®.

      No you can't. [tirerack.com]

      185/65R15 (the Prius' standard tire size) is only built to an H speed rating. That makes it a 130mph speed-rated tire. No manufacturer builds this tire to a Z speed rating - 186mph+.

      Car and technology analogies are mostly flawed, as are most generalizations.
  • by davidwr ( 791652 ) on Tuesday April 10, 2007 @09:13AM (#18674987) Homepage Journal
    My "real-world" conditions may be a 50" TV seen from 8' away.

    Another person may watch the same 50" set from 4' away.

    Your kids may watch it from 1' away just to annoy you.

    2 arc-minutes of angle covers a different amount of screen in each of these conditions.

    Don't forget: You may be watching it on a TV that has a zoom feature. You need all the pixels you can get when zooming in.
    • I just have to ask...why zoom? Did the director not get a close enough view of Amber's naughty bits for you?

      Seriously...I just don't see a lot of value in a TV that can zoom.
    • Re: (Score:3, Insightful)

      My opinion is that this hi-res frenzy that's been going on for years is pure marketing bullshit. The truth is:

      1- only a small minority of consumers have 50" TVs
      2- only a small subset of the aforementioned minority watches 50" TVs up close
      3- What do you watch on TV that requires high resolution? Most TV programs are crap, and if they display text in the image (the toughest kind of feature to display with a low resolution), it's big enough that it never matters anyway.

      High resolution is a solution in search of a
      • High resolution is a solution in search of a problem. The best proof is, nobody in the 25-or-so years I've been hearing about HDTV coming "real soon now" is really clamoring for a better image quality. Most people are happy enough with TV the way it is. That's the reality.

        Spoken like someone who hasn't watched sports in high def vs standard def. Believe me, people are clamoring for new tvs if only to watch football

        • by billdar ( 595311 ) *
          or hockey. The contrast of dark jerseys on white ice (let alone the puck!) moving fast across the screen is murder on any compression/upscaler.

      • Honestly, I don't know anyone (including myself) who has ever gotten an HD set and then later said "this was not worth the switch".

        You can't go around blasting your mouth off about stuff you have not tried. Until you have actually SEEN THE DAY TO DAY DIFFERENCE in shows like CSI and Lost when broadcast in HD vs. non-HD, you're just spouting bullshit.

        I won't even go into the difference it makes to have Dolby Digital 5.1 surround on these shows.

        It is a totally different viewing experience. It makes you barely
        • by Aladrin ( 926209 )
          I think this is a -very- key point. I went to the store with 'I'm going to buy an HDTV' in my head and got to looking at the sets. After about 30 minutes of comparing, I decided it was not worth the difference.

          About 6 months later, I decided I wanted the HD set even if it didn't look -that- much better, but this time because I wanted my console to display high def.

          I would -never- go back.

          TV and gaming are both -so- much better. My dad bought that CRT HDTV from me and I upgraded to an LCD. He now keeps a
      • Everyone I know who has seen a high def broadcast for the first time has been impressed, or at least been able to see the difference between HD and SD. You make it seem like people can't tell the difference or don't care... not true.

        1: About 2/3 of my friends right now have at least 1 HDTV in their house. Mine is a 48" and 2 friends have 36"-42" TV's, but they view them from a little closer than I do. I'm middle class, so I'm sure my friends tend to have more HDTV's than lower class people, but I'd say w
  • 1080p? (Score:4, Funny)

    by ObiWanStevobi ( 1030352 ) on Tuesday April 10, 2007 @09:16AM (#18675017) Journal
    You're still on that? I'm on 3240z, it's higher def than real life.
  • 1997 vintage RCA CRT TV.
    • by Malc ( 1751 )
      1997? Vintage? Good lord that's not old. TVs should last more than 20 years. Vintage would be something older than that.
      • by jandrese ( 485 )
        Yeah, the only reason I upgraded around 2001 or so was because I wanted component and s-video in on the TV instead of just composite (which was the norm back in the 90s). The biggest killer with those older TVs is that they often only support Coax input, which is terrible for anything that is not a TV antenna or a cable from your cable/satellite company.
    • by Alioth ( 221270 )
      1993 vintage Sony Trinitron TV here.

      The thing is the Trinitron TV still looks much better than any LCD or plasma standard def TV, or high def TV showing upscaled standard definition. HD signals on an HDTV look better, but most of the HDTV content isn't particularly interesting to me. There's simply no point me changing it until HD is ubiquitous. The picture on the Sony is as good as you get for standard def, the colours are all still vibrant. (I also don't watch enough TV to really warrant replacing it any
  • "'1080p provides the sharpest, most lifelike picture possible.' '1080p

    the use of "qualification" in the summary terms mean exception to the claim.

    I feel it neccassary to qualify the last word pretty strongly as the biggest "qualification" of that statement.
    http://www.google.com/search?hl=en&safe=off&q=QFHD [google.com] is pretty possible....
    and it does exceed 1080p
  • If you use your HDTV as a computer monitor, definitely.

    One of the nice things about the Mac Mini is that it will drive a 1080p signal right out of the box: just hook up a DVI cable or a DVI->HDMI cable to that shiny HDTV and go to town.
  • by StandardCell ( 589682 ) on Tuesday April 10, 2007 @09:32AM (#18675275)
    Having worked in the high-end DTV and image processing space, our rule of thumb was that the vast majority of people will not distinguish between 1080p and WXGA/720p at normal viewing distances for up to around a 37"-40" screen UNLESS you have native 1920x1080 computer output. It only costs about $50 more to add 1080p capability to the same size glass, but even that is too expensive for many people because of some of the other implications (i.e. more, and more expensive, SDRAM for the scaler/deinterlacer, especially for PiP; more expensive interfaces like 1080p-capable HDMI and 1080p-capable analog component ADCs; etc.). These few dollars are not just a few dollars in an industry where panel prices are dropping 30% per year. Designers of these "low-end" DTVs are looking to squeeze pennies out of every design. For this reason alone, it'll be quite a while before you see a "budget" 1080p panel in a 26"-40" screen size.

    At some point, panel prices will stabilize, but most people won't require this either way. And, as I mentioned, very few sources will output 1080p anyway. The ones I know of: Xbox 360/PS3, HD-DVD, Blu-ray and PCs. All broadcast infrastructure is capable of 10-bit 4:2:2 YCbCr color-sampled 1920x1080, but even that is overkill and does not go out over broadcast infrastructure (i.e. ATSC broadcasts are max 1080i today). The other thing to distinguish is the frame rate. When most people talk about 1080p, they are often implying 1080p at 60 frames per second. Most Hollywood movies are actually 1080p but at 24fps, which can be carried in 1080i bandwidth using pulldown. And you don't want to change the frame rate of these movies anyway, because it's a waste of bandwidth and, if you frame-rate convert it using motion-compensated techniques, you lose the suspension of disbelief that low frame rates give you. The TV's deinterlacer needs to know how to deal with pulldown (aka "film mode"), but most new DTVs can do this fairly well.
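    A toy illustration of the pulldown cadence mentioned above (plain Python with a hypothetical helper name, not any real decoder's API): 24fps film rides in a 60-fields-per-second 1080i stream by holding each frame for three fields, then two, alternately.

        def three_two_pulldown(frames):
            # Repeat frames in a 3,2,3,2,... field cadence; a film-mode
            # deinterlacer reverses exactly this pattern to recover the
            # original progressive frames.
            fields = []
            for i, frame in enumerate(frames):
                for _ in range(3 if i % 2 == 0 else 2):
                    parity = "top" if len(fields) % 2 == 0 else "bottom"
                    fields.append((frame, parity))
            return fields

        # Four film frames -> ten fields, i.e. 24 frames/s -> 60 fields/s.
        print(three_two_pulldown(["A", "B", "C", "D"]))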

    In other words, other than video games and the odd nature documentary you might have on a next-gen optical disc, viewed on a screen larger than 40" and with the best eyes, 1080p is mostly a waste of time. I'm glad the article pointed this stuff out.

    More important things to look for in a display: color bit depth (10-bit or greater) with full 10-bit processing throughout the pipeline, good motion adaptive deinterlacing tuned for both high-motion and low-motion scenes, good scaling with properly-selected coefficients, good color management, MPEG block and mosquito artifact reduction, and good off-axis viewing angle both horizontally and vertically. I'll gladly take a WXGA display with these features over the 1080p crap that's foisted on people without them.

    If you're out buying a DTV, get a hold of the Silicon Optix HQV DVD v1.4 or the Faroudja Sage DVDs and force the "salesperson" to play the DVD using component inputs to the DTV. They have material that we constantly used to benchmark quality, and that will help you filter out many of the issues people still have with their new displays.
  • Content (Score:5, Insightful)

    by BigDumbAnimal ( 532071 ) on Tuesday April 10, 2007 @09:34AM (#18675289)
    This has bugged me for a while.

    Many TV manufacturers have been pushing 1080p. They have even shown images of sports and TV shows to show off their TVs' great picture. However, the fact is that it is very unlikely that anyone will be watching any sports in 1080p in the near future in the US. Television content producers have spent millions upgrading to HD gear that supports 1080i at most, with 720p as the top progressive-scan resolution. They are not likely to change again and go from 1080i to 1080p to benefit the few folks with TVs and receivers that support 1080p. As others have pointed out, 1080p isn't even supported by the HD broadcast standard.

    The only sports you will see in 1080p will be some crappy sports movie on Blu-ray.
  • by Thaelon ( 250687 ) on Tuesday April 10, 2007 @09:37AM (#18675359)
    Here [carltonbale.com] is a viewing distance calculator (in Excel) you can use to figure out way more about home theater setups than you'll ever really need.

    It has viewing distances for user selectable monitor/TV/projector resolutions & sizes, seating distances, optimal viewing distances, seating heights(?!), THX viability(?!) etc. It's well researched and cited.

    No I'm not affiliated with it, I just found it and liked it.
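    The core of such a calculator fits in a few lines; here is a minimal Python stand-in (assuming a 16:9 screen and one arcminute of acuity; the linked spreadsheet covers far more than this):

        import math

        def max_useful_distance_ft(diagonal_in, vertical_px, arcmin=1.0):
            # Farthest seat at which one pixel still subtends `arcmin`;
            # sit beyond this and the extra resolution is wasted on the eye.
            height_in = diagonal_in * 9 / math.hypot(16, 9)
            pixel_in = height_in / vertical_px
            return pixel_in / math.tan(math.radians(arcmin / 60)) / 12

        for lines in (480, 720, 1080):
            print(lines, "lines on a 50-inch set:",
                  round(max_useful_distance_ft(50, lines), 1), "ft")
        # Roughly 14.6 ft for 480, 9.8 ft for 720, 6.5 ft for 1080.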
  • by redelm ( 54142 ) on Tuesday April 10, 2007 @09:41AM (#18675437) Homepage
    The photo standard for human visual acuity is 10 line-pairs per mm at normal still-picture viewing distance (about one meter): 0.1 mil. But 20/20 is only 0.3 mil (1 minute of arc). A 50" diagonal 16:9 screen is 24.5" tall. 1080 lines gives 0.58mm each. At 8' range this is 0.24 mil: within 20/20, but not within photo standards.
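    Those figures check out; here's the arithmetic restated in Python (taking "mil" as milliradian throughout):

        import math

        photo_mil = 0.1  # 0.1 mm per line pair seen at 1 m (1000 mm) -> 0.1 mil
        acuity_mil = math.radians(1 / 60) * 1000     # 1 arcminute ~ 0.29 mil
        line_mm = 24.5 * 25.4 / 1080                 # ~0.58 mm per line, 50" screen
        line_mil = line_mm / (8 * 12 * 25.4) * 1000  # ~0.24 mil at 8 feet
        print(round(acuity_mil, 2), round(line_mm, 2), round(line_mil, 2))
        # 0.24 mil per line is below the 20/20 limit (~0.29 mil) but well
        # above the 0.1 mil photographic standard, just as stated.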

    Of course, we are looking at moving pictures, which have different, more subjective requirements. A lot depends on content and "immersion". Many people watch those horribly small LCDs (portable and aircraft) with often only 240 lines. Judged for picture quality, they're extremely poor. Yet people still watch, so the content must be compelling enough to overlook the technical flaws. I personally sometimes experience the reverse effect at HiDef -- the details start to distract from the content!

  • If you want to see the "Wow" factor, download the Apple Quicktime Trailers [apple.com] in 1080p and 5.1. I can really tell the difference between my compressed clear-QAM 1080i recordings and these uncompressed 1080p trailers.
  • Ultimately, we all want affordable full-wall-sized VR so we can have breakfast on the veranda overlooking the scenic world landmark of our choice, don't we?

    But, yes, I quickly realized that large panels are for families, business, and people who entertain by showing movies. My wife and I are probably as well served by an inexpensive 22" 1680x1050 six feet from our heads on the sofa as we would be by an expensive 50" of lower resolution on the opposite wall.
     
  • Geek Cred (Score:2, Informative)

    by Laoping ( 398603 )
    What you are forgetting is geeks like the shiny top of the line. I mean really, if we can't brag about our gear, what can we brag about? Take that, buddy: your TV only does 720p, HA!

    Besides, you can always pair your 1080p with this (http://www.oppodigital.com/dv981hd/dv981hd_index.html).

    Ohhh Shiny!
  • by Craig Ringer ( 302899 ) on Tuesday April 10, 2007 @10:32AM (#18676347) Homepage Journal

    Sure, it's detailed. Too bad the colour is still a poor match to human vision.

    We see a huge dynamic range - we can see details in extremely dark areas and still perceive detail in very bright areas. What we see as bright or dark also depends on the surrounding lighting (and not just as your iris adapts, either, there are other effects at work). Even more importantly, our perception of colour intensity and brightness is not linear.

    To get truly amazing video, we'd need to switch to exponential-format colour that better matches how we actually see and can represent appropriately high dynamic ranges while still preserving detail. We'd also need to dynamically adapt the display to lighting conditions, so it matched our perceptual white point & black point. Of course, we'd need to do this _without_ being confused by the light from the display itself. And, of course, we'd need panel technology capable of properly reproducing the amazing range of intensities involved without banding or loss of detail.

    We're a very, very long way from video that's as good as it can get, as anyone in high quality desktop publishing, printing, photography or film can tell you. A few movie studios use production tools that go a long way in that direction and photographic tools are getting there too, but display tech is really rather inadequate, as are the colour formats in general use.

    I call marketing BS.

  • by pyite69 ( 463042 ) on Tuesday April 10, 2007 @10:40AM (#18676473)
    There are several problems:

    1) The ATSC specs don't provide a 60 frame 1080p mode - only 24p.
    2) There isn't a lot of content that can use 1080p - and it is likely to just be movies, which are 24p.

    There is one benefit of getting a 1080p display, though: MythTV does a good job of deinterlacing 1080i to 1080p. You will probably also want some equipment to remove the MPEG artifacts, which is not cheap.

    Mark
  • Interlaced must go (Score:3, Insightful)

    by AaronW ( 33736 ) on Tuesday April 10, 2007 @10:43AM (#18676531) Homepage
    Interlaced video has got to go. It made sense with analog transmission and CRTs, which are themselves interlaced displays and rely on persistence of vision. Virtually all non-CRT displays, however, are inherently progressive. Doing a good job of deinterlacing video is a very difficult problem, and the results will never be as good as video that is progressive to begin with (the exception being film, if the device is smart enough to detect that the source material is progressive, i.e. 3:2 pulldown). MPEG encoding is also far more efficient and easier if the video is progressive, since otherwise it's much more difficult to track image motion that shifts up or down an odd number of pixels (or less). Progressive video also uses less bandwidth: 1080p/30 compresses much better than 1080i/60.
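    For the curious, the two cheapest deinterlacing strategies look roughly like this (a simplified Python sketch with lists standing in for scanlines; real deinterlacers are motion-adaptive and far more involved):

        def weave(top_field, bottom_field):
            # Interleave two fields into one frame. Perfect for film-sourced
            # material (both fields came from the same instant), but moving
            # objects "comb" because broadcast fields are captured 1/60s apart.
            frame = []
            for t, b in zip(top_field, bottom_field):
                frame += [t, b]
            return frame

        def bob(field):
            # Line-double a single field: no combing artifacts, at the
            # cost of half the vertical resolution.
            frame = []
            for line in field:
                frame += [line, line]
            return frame

        top, bottom = ["t0", "t2"], ["b1", "b3"]
        print(weave(top, bottom))  # ['t0', 'b1', 't2', 'b3']
        print(bob(top))            # ['t0', 't0', 't2', 't2']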

    Good deinterlacers for TVs are expensive, and few TVs use good ones. It also introduces a lot of difficulty when trying to scale video since virtually all non-CRT sets also have some fixed native resolution.

    -Aaron
  • by clickclickdrone ( 964164 ) on Tuesday April 10, 2007 @10:59AM (#18676779)
    There was a news piece I read recently in which a BBC engineer was interviewed and said their experiments had shown that a faster framerate made a bigger difference to people's perception of an image's quality. They showed a well-set-up TV at standard res but a higher framerate and compared it to a 1080p screen, and the former looked better according to the writer. The BBC engineer noted that most of the 'HD is better' effect was smoke and mirrors anyway, because most people's exposure to a normal picture is via a compressed digital feed of some sort, and the apparent poor quality is a result of the compression, not the resolution.
    I certainly remember being very disappointed with both digital sat and cable images because of the poor colour gradations and sundry pixelation issues compared to my normal analogue signal, so I can well believe it.
  • Seeing the Grids (Score:5, Informative)

    by Doc Ruby ( 173196 ) on Tuesday April 10, 2007 @11:44AM (#18677583) Homepage Journal

    Keep in mind this article is written in general terms, so you scientists out there don't need to stand in line to file corrections!

    I was in the Joint Photographic Experts Group (JPEG) when we invented the popular image format, while working for a digital camera company on an 8Kx8K-pixel (40-bit color) scanner, and I studied both the physics of light and the brain's visual neurology in pre-med college. So I'll just jump that line of "scientists" to file this correction.

    It's safe to say, however, that increasing resolution and image refresh rate alone are not enough to provide a startlingly better viewing experience in a typical flat panel or rear projection residential installation.

    It's safe to say that only once you've dismissed the scientists who would correct you.

    The lockstep TV screen is a sitting duck for the real operation of the eyes & brain, which compensate for relatively low sampling rates with massively parallel async processing in 4D.

    Joseph Cornwall's mistake in his article is to treat viewers as a single stationary eye nailed at precisely 8' perpendicular to a 50" flat TV, sampling the picture in perfect sync with the TV's framerate. But the visual system is an oculomotor system: two moving eyes with continuous, asynchronous sampling. Each retinal cell signals at a base rate of about 40Hz. Adjacent neurons drift across different TV pixels coming through the eyes' lenses while independently and asynchronously modulating under the light, and those neurons are distributed in a stochastic pattern in the retina which will not coincide with any rectangular grid (or any regular, linear distribution).

    The visual cortex is composed of layered sheets of neurons which compare adjacent neurons for their own "difference" signal, as well as corresponding regions from each eye. The eyes dart, roll and twitch across the image; the head shakes and waves. So the brain winds up getting lots of subsamples of the image.

    The main artifact of the TV the eye sees is the grid itself, which used to be only a stack of lines (of nicely continuous color in each line, on analog raster TVs). When compared retinal neurons signal at around 40Hz but at slightly different phase offsets, the cortex sheets can detect that heterodyne at extremely high "beat" frequencies, passing a "buzz" to the rest of the brain that indicates a difference where there is none in the original object rendered into a grid on the TV. Plus all that neural apparatus is an excellent edge enhancer, both in space (the pixels) and in time (the regular screen refresh).

    Greater resolution gives the eyes more info to combine into the brain's image. The extra pixels make the grid turn from edges into more of a texture, with retinal cells resampling more pixels. The faster refresh rate means each retinal neuron has more chance to get light coordinated with its async neighbors, averaged by the retinal persistence into a single flow of frequency and amplitude modulation along the optic and other nerves.

    In fact, the faster refresh is the best part. That's why I got a 50" 1080p DLP: the micromirrors can flip thousands of times a second (LCD can't flip that fast, and plasma has its own different pros/cons). 1920x1080 is about 2.07Mpxl, which at 24-bit color is roughly 49.8Mb per image. A 30Hz refresh would be about 1.5Gbps. But the HDMI cable delivering the image to the DLP carries 10.2Gbps, so that's over 200FPS. I'm sure that we'll see better video for at least most of that range, if not all of it. What I'd really like to see is async DLP micromirrors, flipping mirrors off the "frame grid". At first probably just some displacement from the frame boundary, especially if the displacement changes unpredictably each flip. Later maybe a stochastic shift - all to make the image flow more continuously, rather than offering a steady beat the brain/eyes can detect. And also a stochastic di
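    Restating that bandwidth arithmetic (rough numbers; link overhead such as TMDS coding and blanking intervals on the HDMI cable is ignored):

        pixels = 1920 * 1080               # ~2.07 Mpixel per 1080p frame
        bits_per_frame = pixels * 24       # 24-bit colour -> ~49.8 Mb per frame
        print(bits_per_frame * 30 / 1e9)   # ~1.49 Gbps at a 30 Hz refresh
        print(10.2e9 / bits_per_frame)     # ~205 frames/s fit in 10.2 Gbps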

    • Re: (Score:3, Funny)

      by Anonymous Coward
      Yeah, what he said.
  • Other Factors (Score:5, Interesting)

    by tji ( 74570 ) on Tuesday April 10, 2007 @11:58AM (#18677827)
    There are other variables besides "How does 'The West Wing' look in HD when I'm sitting on my couch?" Such as:

    - 1080p provides a good display option for the most common HD broadcast format, 1080i. Since most new displays are based on natively progressive technologies (DLP, LCD, LCOS), you can't just do a 1080i output. So, 1080p allows them to just paint the two 1080i fields together into a progressive frame for high quality display.

    - 720p upscales to 1080p easily. Probably better than downscaling 1080i to 720p and losing information.

    - Computers attached to HDTVs are becoming more and more common (not just game consoles, true computers). Scaling or interlacing has nasty effects on computer displays and all those thin horizontal/vertical lines and detailed fonts. 1080p gives a great display performance for Home Theater PCs.

    - You are not always sitting 12-15' back from the TV. 1080p maintains the quality when you do venture closer to the set.

    - Front Projectors are increasingly common (and cheap), so the display size can be quite large (100-120"), allowing you to see more of the 1080p detail.

    All that said.. If I were buying a new display today, I would still stick with 720p, for two main reasons:

    - Price / Performance. 720p displays are a bargain today, 1080p is still priced at a premium.

    - Quality of available content. The majority of what I watch in HD is from broadcast TV. Many broadcasters are bit-starving their HD channel by broadcasting sub-channels ( e.g. an SD mirror of the main channel, a full-time weather/radar channel, or some new crap channel from the network in an effort to milk more advertising $$). So, the 1080i broadcasts do not live up to the format's capabilities. Watching The Masters last weekend proved that dramatically. My local broadcaster has the bandwidth divided up quite aggressively, so any scenes with fast movement quickly degrade into a mushy field of macroblocks. Utter garbage, and very disappointing.
  • Flawed Analogy (Score:3, Insightful)

    by SeaFox ( 739806 ) on Tuesday April 10, 2007 @10:16PM (#18685075)

    After all, one can purchase 200mph speed-rated tires for a Toyota Prius®. Expectations of a real performance improvement based on such an investment will likely go unfulfilled,

    One shouldn't have expectations that buying a high-speed-rated tire will improve the performance of the car itself. That makes no sense! The point of the speed rating is that the tire is designed to withstand driving at those speeds, whereas if you put an S-speed-rated tire on your exotic sports car and drive 200mph, your tire may very well "fail" in the same way the Firestone SUV tires of a few years ago did.

    Getting back on topic, a TV's resolution support will have a direct impact on what you can see. To reverse the bad car analogy here, the poster just said that one shouldn't buy a 1080p monitor and expect all their 1080i and 720p content to look better. No kidding.

    The reason for buying the 1080p monitor is so when 1080p content starts appearing, you have the monitor to view it already. Just like buying 200mph tires for a Prius would be worthwhile if you were going to be adding a jet engine [ronpatrickstuff.com] to your Prius next month.
