Sources Say ITU Has Approved Ultra-High Definition TV Standard 341
Qedward writes with this excerpt from Techworld: "A new television format that has 16 times the resolution of current High Definition TV has been approved by an international standards body, Japanese sources said earlier today. UHDTV, or Ultra High Definition Television, allows for programming and broadcasts at resolutions of up to 7680 by 4320, along with frame refresh rates of up to 120Hz, double that of most current HDTV broadcasts. The format also calls for a broader palette of colours that can be displayed on screen. The video format was approved earlier this month by member nations of the International Telecommunication Union, a standards and regulatory agency of the United Nations, according to an official at NHK, Japan's public broadcasting station, and another at the Ministry of Internal Affairs and Communications. Both spoke on condition of anonymity."
Great! (Score:5, Funny)
Same old shit in high resolution! =D
Re: (Score:3)
for twice the cost and comes with twice the DRM, along with limited availability! enjoy!
Re: (Score:3)
I suspect ultra-high resolution will fail like Super Audio CD and DVD-Audio failed. People have no desire to upgrade to a higher standard if they can't hear (or see) any difference. For 99% of the population an SACD or DVD-A sounds no better than a CD, or else the difference is trivial, so they ignore the new standard. I expect the same to happen with UHDTV.
Re:Great! (Score:4, Informative)
There's a good link I usually pass out when people start to talk about noticing the difference between 720p and 1080p.
http://hd.engadget.com/2006/12/09/1080p-charted-viewing-distance-to-screen-size/ [engadget.com]
Now I don't know where the line for 4320p would be since the article is old, but if you look at the line for 1080p at a viewing distance of 5 feet you need a TV around 38 inches. For 1440p at the same distance you need a TV around 51 inches, a difference of 13 inches.
1080p (1920x1080) is 2,073,600 pixels
1440p (2560x1440) is 3,686,400 pixels
4320p (7680x4320) is 33,177,600 pixels
1440 is 1.33... times bigger than 1080
3,686,400 is 1.78 times bigger than 2,073,600
4320 is 3 times bigger than 1440
33,177,600 is 9 times bigger than 3,686,400
Using simple linear approximation (screen size scales with line count at a fixed viewing distance):
Since 4320p has 3 times the lines of 1440p, at 5 feet you need 3 times the 51 inches, i.e. a TV around 153 inches, to get the full benefit of 4320p.
I don't know about you but sitting 5 feet away from 153 inches wouldn't work for me. Even half that is a huge TV to be only 5 feet away. I don't think you can follow all the action across the entire screen from that distance.
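As a sanity check on the numbers above, here is a small Python sketch of the same calculation, assuming ~1 arcminute (20/20) acuity and a 16:9 panel — the size at which individual pixels become just resolvable at a given distance:

```python
import math

def min_diagonal_inches(vertical_lines, distance_inches, arcmin=1.0):
    """Smallest 16:9 diagonal at which each pixel subtends `arcmin`
    minutes of arc at the given viewing distance -- i.e. the size you
    need before the extra lines become visible."""
    pixel = distance_inches * math.tan(math.radians(arcmin / 60.0))
    height = vertical_lines * pixel
    return height * math.hypot(16, 9) / 9  # 16:9 diagonal from height

for lines in (1080, 1440, 2160, 4320):
    print(f"{lines}p at 5 ft: ~{min_diagonal_inches(lines, 60):.0f} in")
```

This reproduces the ~38" and ~51" figures from the engadget chart, and puts 4320p at roughly 154 inches for a 5-foot viewing distance.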
screw that (Score:5, Funny)
I am going to wait for CSUHDTV
Crazy Super Ultra High Definition TV.
Re:screw that (Score:4, Funny)
Re:screw that (Score:5, Funny)
I don't understand why they skipped Super HDTV ... anyone that grew up in the 80s knows that Super is before Ultra.
Re:screw that (Score:4, Informative)
That's nominally 3840 x 2160, aka "4K". You get it, or something like it, at the better movie theaters these days. There are already camcorders shipping that do this, and televisions coming Real Soon Now (http://www.theverge.com/2012/8/22/3259613/lg-84-inch-4k-tv-korea-release-north-america-europe-latin-asia). YouTube already supports 4K video. HDMI 1.4 does, too, at least up to 24p.
So if it's already real, it's hopefully not the subject of work on new standards. And the 4K stuff is coming on fast enough that it's all based on logical extensions to what already exists. TVs are smart enough to adapt to the input and reformat lower resolution video. Disc delivery doesn't matter as much as it used to, but just like 3D, if 4K is important in the home, a new Blu-ray profile will cover it (if you really want more storage, the existing BD-XL format might get employed).
Starting out worrying about 8K video now, these guys will have the time to think about much larger changes in the video infrastructure.
Another piece of the puzzle. (Score:5, Informative)
We have Blu-ray that can pump out 40 Mbps, and a new High Efficiency Video Coding (HEVC) standard coming that supports 4K/60Hz video at around 40 Mbps.
We also have a few 4K displays just starting to appear.
And now a UHDTV 4K video standard (as well as 8K).
So looking good for the new gen with broadcast, storage, encoding and display standards all sorted out .. bring it on !!!
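To see why HEVC-class compression matters at these rates, here's a back-of-envelope sketch; the 3840x2160/60Hz, 10-bit, 4:2:0 parameters are my assumptions, not from the comment:

```python
def raw_bitrate_mbps(width, height, fps, bit_depth=10, samples_per_pixel=1.5):
    # 4:2:0 chroma subsampling carries 1.5 samples per pixel on average
    return width * height * samples_per_pixel * bit_depth * fps / 1e6

raw = raw_bitrate_mbps(3840, 2160, 60)  # uncompressed 4K/60
ratio = raw / 40                        # compression needed to fit a 40 Mbps channel
print(f"raw: {raw:.0f} Mbps, compression needed: ~{ratio:.0f}:1")
```

Roughly 7.5 Gbps uncompressed, so the encoder has to deliver on the order of 190:1 compression to hit a Blu-ray-class 40 Mbps.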
Re: (Score:2)
The sooner the better (Score:2)
Re: (Score:2)
Re: (Score:2)
Some shows are broadcast at 720p, but recorded on either 35 mm or 1080p/24 HDCAM.
For instance:
Bones [imdb.com]
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Perhaps it's your cable company that's to blame. Why have only one channel featuring crystal clear, pristine video when you can have five featuring an approximation of what was intended?
Re: (Score:2)
Jarring clarity (Score:5, Funny)
Re:Jarring clarity (Score:4, Funny)
Way more than 2x (Score:3)
Re: (Score:3)
Oh good... (Score:4, Funny)
A new international television standard. How long until we in the US invent our own entirely incompatible system just so it can depend on patents owned by American companies?
ATSC versus DVB-T, CDMA2000/EvDO vs. GSM/UMTS, etc.
Re: (Score:2)
Re: (Score:3)
NTSC: Never Twice the Same Color
SECAM: Something Essentially Contrary to the American Mode
PAL: not really
Re:Oh good... (Score:4, Insightful)
We've already got something like that with ATSC standards, at least in the RF modulation schemes -- broadcast used 8-VSB (time domain), and the alternatives considered were OFDM (frequency domain) and 256-QAM (phase domain). Well, the cable industry is using 256-QAM and broadcast is using 8-VSB, last I heard, and I think what edged 8-VSB ahead for broadcast was that it's not sensitive to the phase jitter in antique GEO satellite transponders. So with modulation, at least, yeah, we're already there. (The fortunate thing is that, unlike when NTSC rolled out, TV manufacturers aren't forced to design around just one demodulation standard, and it's not all that difficult to incorporate both 8-VSB and 256-QAM demodulation in modern receivers, even within a single demod chipset, so for the most part you never notice it.)
I suspect as standards get more and more complex, we'll start seeing a lot more of this kind of thing, and it will help rather than hurt, as the TV manufacturers design more and more agile multi-standard receivers that can handle anything the standards folks throw at them. Note that most if not all of them will also still display analog NTSC-M VSB-modulated signals just fine .. because there are still a lot of cable providers offering analog basic cable tiers ..
(<- still thinks the way NTSC-M avoided obsoleting the first-gen monochrome TV's was a cool hack, even if the chroma performance sucked most of the time)
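For a feel of what those modulation choices buy, a quick sketch of gross channel rates; the symbol rates below are the standard ATSC and US cable values as I recall them, so treat them as assumptions, and note the net payload is lower after FEC overhead:

```python
import math

def gross_mbps(msym_per_s, constellation_points):
    # bits/symbol = log2(constellation size); gross rate = symbols/s * bits/symbol
    return msym_per_s * math.log2(constellation_points)

print(f"8-VSB:   {gross_mbps(10.76, 8):.2f} Mbps gross")   # ~19.4 Mbps net payload
print(f"256-QAM: {gross_mbps(5.36, 256):.2f} Mbps gross")  # ~38.8 Mbps net payload
```

So cable's 256-QAM channel carries roughly twice the payload of a broadcast 8-VSB channel, at the cost of needing the cleaner signal environment a cable plant provides.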
Wow (Score:3)
Maybe cable companies will finally deliver full HD content to display on our Ultra HD TVs.
Another reason why cable companies need to be destroyed, because they don't even know how to provide state of the art, but feel inclined to comment on what the new standards should be.
About the only thing UltraHD is going to introduce is a new optical disk format because broadband and content providers are incapable of creating and delivering UltraHD content without massive compression and inferior audio.
Re: (Score:3)
Re: (Score:2)
About the only thing UltraHD is going to introduce is a new optical disk format
And a new format war! Oh, and new and "improved" DRM. I can't wait to see who cracks it first. I'll get the popcorn ready.
Comment removed (Score:5, Interesting)
Re: (Score:2)
This. What exactly is it that you want to see with this much resolution?
The thin wires that hold props and whatnot in place for movies? Look, it's supposed to be a suspension of disbelief. That's what's required for a work of fiction. Too high a resolution ruins the effect. Not to mention, who wants to see every pore of every actor's face?
Even for documentaries: What's the point? Do you require this much resolution IRL? I'd venture to guess if you had an ultra-high resolution view of your pillow (including d
Re: (Score:3)
Re: (Score:3)
Do you require this much resolution IRL? I'd venture to guess if you had an ultra-high resolution view of your pillow (including dust mites) you'd probably not be able to sleep.
God yes ... human visual acuity is in the 100-500MP range depending on eye movement, field-of-view and interpolation assumptions. Even 8K video is only around 30MP in a static pattern.
But also remember your taste in 'reality' may not be everyone else's taste - you (and I) have probably been trained that low-resolution, grainy film is part of the 'suspension of disbelief', just as in the 50s/60s the dodgy film colours (Technicolor?) were part of the suspension, or in the 30s/40s/50s with black-and-white doing that job, or in
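The parent's megapixel figures can be reproduced with a quick sketch; the field-of-view and acuity numbers here are illustrative assumptions, which is exactly why the estimates span such a wide range:

```python
def fov_megapixels(h_deg, v_deg, arcmin=1.0):
    # Pixel count needed to match `arcmin` angular resolution over a field of view
    return (h_deg * 60 / arcmin) * (v_deg * 60 / arcmin) / 1e6

print(f"8K frame: {7680 * 4320 / 1e6:.1f} MP")
print(f"~180x120 deg field at 1 arcmin: {fov_megapixels(180, 120):.0f} MP")
print(f"same field at 0.5 arcmin hyperacuity: {fov_megapixels(180, 120, 0.5):.0f} MP")
```

An 8K frame is ~33 MP, while matching the full visual field comes out between roughly 80 and 300+ MP depending on the acuity assumption — consistent with the 100-500 MP range quoted above.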
Re: (Score:2)
Re: (Score:3)
How decent is 720p? Well, both TVs appear to be about the same quality as, or often a little higher than, watching a friggin' movie at the cinema, if the source is decent and relatively free of artifacts.
Wow - you really need access to a better cinema then ... I can assure you 720p (from a good source) is miles behind a good digital cinema ...
I, for one, have a 85ft (110ft diagonal) 4K digital locally in addition to IMAX, film and 2K digital ... I was fortunate to see The Bourne Legacy recently in 4K digital and it was stunning compared to even 2K digital, let alone 1080p and 720p.
If i can have a high-tech $1,000 4K 80" screen in 5 years or high-value $500 1080p 50" screen in 5 years .. hmmm ... easy decisi
Re:Anyone seeing the point of this? (Score:4, Insightful)
I was lucky enough to see a demo of Ultra High Definition a few years back when NHK was developing it. I didn't think you could get much better than 1080p, but it was actually noticeably better. What people forget is that it isn't simply the resolution that is higher, the colour is better and the frame rate has been bumped to a native 120fps. Everything looked hyper realistic and natural. Not the same level of improvement going from SD to HD, but the frame rate alone was enough to really set it apart.
Having said that I was quite impressed by 4k and have not had an opportunity to compare 4k to Ultra HD. I'm kinda sceptical at how much improvement there would be over 4k/60p, but won't pass judgement until I have seen it. And of course it remains to be seen if 48 or 60 fps will take off for films.
What I really can't understand is people who say they can't see any difference between SD and HD. Even if their eyesight is bad and they can't see the extra resolution they should still be able to see that the colour is better. Well, unless they have set their TV to be deliberately really low contrast, and I know one guy who does. 720p to 1080p is going to be more subtle because it is just a resolution bump, so really it depends on the size of your TV and your distance from it. I used to think a 50" TV was ridiculous, this year I bought one for less than a good 32" CRT set cost a decade ago...
Very disappointing (Score:3, Insightful)
I'm sure it's very nice, but these types of things are simply diverting time and resources away from what the true goal should be: sexbots. Anime themed sexbots, porn star themed sexbots, weird fetish sexbots -- sexbots.
Japan, why have you gone astray?
And people are going to watch this... how? (Score:2)
Cable and Satellite can barely handle HD as it is right now due to bandwidth constraints. Unless this also comes with some miracle new encoding that can give us all this extra picture quality without increasing bitrates at all, it's not going to fly. Internet transfer caps make it totally unsuitable for streaming. Optical media isn't exactly the way of the future.
We're a *long* way off from this being available to home users in any kind of practical way.
Re: (Score:2)
I don't see the need for it myself... but the average consumer doesn't think about that, they just see "BIGGER NUMBERS ARE BETTER!". And so they buy these things. And it's too much for the current pipelines coming into the houses....
and voila! Hardcore, cannot-be-ignored (like it is now) market pressure to upgrade cable/telco infrastructure and deliver more bandwidth! So for that reason, I support it.
Re:And people are going to watch this... how? (Score:4, Informative)
There certainly is work in that direction ... http://en.wikipedia.org/wiki/High_Efficiency_Video_Coding [wikipedia.org] ... That gets you a 50% reduction in bit rates over MPEG-4/AVC - which is in turn a 50% reduction over the MPEG-2 used in most digital TV standards.
So that's a 2-4x increase in efficiency + modulation improvements that are bound to happen = plenty of scope for 4K digital TV
8K is a bit more of a stretch at the moment
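The generational halving argument, sketched in Python; the 15 Mbps MPEG-2 starting point is an assumed typical broadcast HD figure, not from the comment:

```python
# Each codec generation roughly halves the bit rate for the same quality
# (the 50% figures above). Starting rate is an assumed typical value.
rates = {"MPEG-2": 15.0}                       # Mbps, broadcast HD (assumed)
rates["MPEG-4/AVC"] = rates["MPEG-2"] / 2      # ~50% of MPEG-2
rates["HEVC"] = rates["MPEG-4/AVC"] / 2        # ~50% of AVC again

for codec, mbps in rates.items():
    print(f"{codec}: ~{mbps:.2f} Mbps for the same HD quality")
```

Two halvings compound to a 4x saving over MPEG-2, which is the headroom that makes 4K plausible in existing broadcast channels.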
OK everybody! (Score:2)
On the bright side (Score:2)
What about the computer screens? (Score:2)
Re: (Score:2)
Re: (Score:2)
Ok, Apple has launched their Retina displays, which do have really good resolution, but where's the rest of the industry?
Apple can get an actual economy of scale with a line of consumer laptops costing over 2 grand. Nobody else has been able to pull that off. Also, since Apple has direct control over their OS, they can customize it as needed for non-mainstream hardware like high dpi displays. Their competitors are stuck with whatever MS sells them.
Just what we need! (Score:2)
This is just what we need!
Instead of developing ultra-hd tv, they should be developing content that is actually worth watching.
FOUR times the resolution, not 16 (Score:2)
Experienced system in operation during Olympics (Score:4, Interesting)
The BBC and NHK collaborated to demonstrate this system during the Olympics, broadcasting to three sites in the UK, two in the US and two in Japan.
For further detail see http://www.bbc.co.uk/blogs/researchanddevelopment/2012/08/the-olympics-in-super-hi-visio.shtml [bbc.co.uk]
The opening and closing ceremonies were broadcast live, while during the rest of the fortnight an hour-long highlights package covering the opening ceremony and specific events was compiled and broadcast daily.
I was fortunate enough to experience the system at the Bradford Museum of the Moving Image on a 15-metre-square screen with a couple of megawatts of sound.
With reputedly only three cameras in the world, camera angles were somewhat limited. The opening ceremony coverage placed you in the heart of the stadium as if you were an audience member, showing off the wide field of vision on offer. I found the 22 channels of sound somewhat overwhelming in volume, which I judged to be a bit of a cheap trick to impress. As with the initial experience of HD, the enhanced resolution leads one to examine detail towards the edge of the field of vision, and I was slightly disappointed to find some blockiness at the edges. This may be due to focusing issues; focus is performed away from the camera.
All in all I found it quite comparable to the IMAX experience, except for the lack of 3D.
Why? Why ? Why? (Score:2)
Seriously, what is the point, when every "provider" compresses the signal to the point that you already have pixellated TV, with blocky regions of color in "HD".
Consider any scene on TV in a smoke-filled room, where the actors are sitting around a table, and the light shines down from above, a blue spotlight.
Now, back in the days of NTSC, there would be a smooth gradation from light to dark, a million shades of blue, making the whole thing appear smooth and natural.
Under HDTV and signal compressi
Great (Score:3)
Now throw out those old TVs in crappy old 3D HD and buy NEW TVs in UltraHD!! And now that cable companies have started giving out HD programming for free, rather than charge extra they get the option to charge for UltraHD content instead. Great for everyone!
Oh, and if you sit at home and look at your TV screen with a microscope, I guess you can see a little more detail now.
Re: (Score:2)
To be completely fair, there is no mention of computer monitors anywhere in that article. You sure you read it?
Re: (Score:2)
It's about image format; since people are moving away from dedicated TVs at a rapid rate, forcing people to watch their movies on a narrow strip of a screen means they'll either end up with a display unsuitable for anything else, or will complain about black strips.
Re:useless aspect ratio (Score:5, Insightful)
Ever since TV and computer displays became essentially the same thing, the consumer market has dominated.
Recall, if you will, all the build-up to the "Grand Alliance" that gave us today's ATSC (HDTV) standard. There was politicking on Analog vs. Digital (kind of a no-brainer), on RF modulation (we lost out on that one, here; 8VSB was selected due to Qualcomm lobbying and the fact it interfered less with existing NTSC broadcasts... now that those broadcasts are gone, we still have the problem that the signal isn't worth a damn indoors). And on display resolution.
Hollywood, Inc. wanted a 2:1 aspect ratio. The computer industry, savvy enough to understand the impact of millions of consumer displays at higher-than-existing PC resolutions, wanted something more boxy. 16:9 was the compromise widescreen aspect ratio.
The PC industry, naturally, went full steam ahead... at 16:10. Silly PC industry. This lasted for awhile, but ultimately, with all those consumer LCD panels out there, most cried "Uncle" and went 16:9. I have dual 1920x1200 16:10 monitors at home, though I see an upgrade to 2560x1440 in the very near future. At work, they've been 16:9 (dual) for my current and previous job. Hardly useless for real work (and that's more Electronics CAD than video these days, though I did EE-CAD, embedded software, web servers, photography, and video at my last job), and the difference is, if anything, more significant for video work (16:9 monitors don't leave any room for controls on the full-screen video panel, 'cept as an overlay) than "real" work like designing circuitry.
Re: (Score:2)
huh, wot? Academy ratio monitors are about out for good...
Re: (Score:2)
Re:useless aspect ratio (Score:5, Insightful)
The fact that real work is done with lots of text, and text goes from top to bottom far more frequently than it scrolls off endlessly to the right?
We have these stupidly huge 16:9 monitors today that can't even display one page of a PDF without scrolling and yet 2/3 of the screen is sitting empty. It's a terrible aspect ratio for computers.
Re:useless aspect ratio (Score:5, Insightful)
Not useless for everyone, just you.
Even so - there are monitors that pivot from portrait to landscape. 16x9 is great for office work if you rotate it 90 degrees.
Re:useless aspect ratio (Score:4, Insightful)
Ignoring the non-uniform brightness and viewing-angle issues, it's substantially more mouse movement with a pivoted screen. Yes, I suppose I could install some 3rd-party software, but most of my work is spent remoting into servers, and I can't set them up so they only work well in my environment.
tl;dr: Pivoted monitors sounds like a great idea, but not suitable for my usage pattern.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
16:9 is great for having windows side by side. And as someone else pointed out, if you prefer to see lots of text at once, why not get a HD display and rotate it 90 degrees? Then you basically have two and a half 1024x768 displays piled on top of each other. Basically a desktop publishing type setup.
Re: (Score:2)
The fact that real work is done with lots of text
That's funny. My Solidworks models have no text at all on them and the drawings fit perfectly on a widescreen monitor.
Re: (Score:3)
Re: (Score:3)
The fact that real work is done with lots of text, and text goes from top to bottom far more frequently than it scrolls off endlessly to the right?
When you get to a vaguely reasonable size of screen, text *doesn't* generally just go from top to bottom - it is usually arranged into many blocks, such as overlapping windows. Reading lines of text that stretch all the way from one side of a 21" monitor to the other is *hard*, even on a 4:3 screen, and this is exactly why broadsheet newspapers arrange text into columns.
Personally, for most of my work I find large wide screen monitors are nicer to work with than large 4:3 monitors.
With a widescreen monitor
Re: (Score:3)
They still make 4:3 (actually 5:4) monitors. It just costs more at the same display size.
Worse than that, it costs more for fewer pixels of width and the same or fewer pixels of height to get the traditional aspect ratio.
Looking at a computer parts supplier I use frequently:
1024x768: not for sale except on very expensive touchscreens.
1366x768: £61.39
1280x1024: £78.76
1920x1080: £64.28
1600x1200: £541.80
1920x1200: £203.28
This is why so many of us end up with 1920x1080 screens, it's not the nicest aspect ratio but it's FAR lower on a price per pixel scale than anythin
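Using the prices quoted above, a quick price-per-megapixel comparison bears this out:

```python
# Price-per-megapixel for the monitors listed above (GBP, prices as quoted)
monitors = {
    (1366, 768): 61.39,
    (1280, 1024): 78.76,
    (1920, 1080): 64.28,
    (1600, 1200): 541.80,
    (1920, 1200): 203.28,
}
per_mp = {res: price / (res[0] * res[1] / 1e6) for res, price in monitors.items()}
for (w, h), cost in sorted(per_mp.items(), key=lambda kv: kv[1]):
    print(f"{w}x{h}: {cost:7.2f} GBP per megapixel")
```

1920x1080 comes out around 31 GBP/MP, roughly half the cost of the next-cheapest option and nearly a tenth of the 1600x1200 panel — hard economics to argue with.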
Re: (Score:3)
Windows 8's "new style" apps don't. So it's easy to imagine Microsoft deciding for us desktop users that programs on a 7680x4320 panel should also be full screen, just to make sure we don't get confused about the difference between that view and that of a 3.5", 800x600 smartphone screen. That'll be Windows 9 that phases out the "classic" windowed Windows. And Windows 10 that's sold, fire-sale style, to Oracle or IBM or someone looking to get into the OS business, as Microsoft goes down in flames. Or not...
Re: (Score:3)
What makes a 4:3 ratio so much better than a 16:9 ratio for your monitor?
I think that 16:10 is nice, though something in between 4:3 and 16:10 would be ideal. Much of the work I do involves source documents and a working document. Since most of those are formatted for the 8.5" x 11" written page, a 16:10.3 monitor is the right size to hold two. Given some additional space for menus, a taskbar, etc., I think that the ideal ratio is about 16:10.5.
Re: (Score:2)
So, why don't they make 1:1 monitors?
Re: (Score:2)
If you only had one eye, a 1:1 monitor would make a bit more sense.
But most people have two. An aspect ratio where the horizontal dimension is larger than the vertical dimension makes sense.
Re:useless aspect ratio (Score:4, Interesting)
Once upon a time (ca. 1989-1990), they did. NCD (Network Computing Devices) [hack.org] made a series of X Terminals based on both the 68000 and some of the early MIPS CPUs. One model (the NCD16 [classiccmp.org]) featured a 16" square monochrome CRT [hack.org], at 1024x1024 resolution, and a 1:1 aspect ratio. The Computer History Museum also has an NCD16 [computerhistory.org] in its collection.
Re: (Score:2)
Re: (Score:2)
I don't agree. As long as you have sufficient vertical resolution (1080 isn't enough, 1200 is ok, 1440 is great for me at the moment), then horizontal resolution is fine at either 16:9 or 16:10. In fact, at (say) 1200 vertical, 16:9 would give you a more useful monitor (won't ever exist, of course).
1920x1080 is, of course, an abomination for work and I think this is where the hatred of 16:9 comes from. Whereas 2560x1440 looks great from where I'm sitting.
Re: (Score:3)
you may have a point.
Most people, myself included, complain because 1080 vertical lines of resolution is a regression from where we were headed in the mid-2000s. All of a sudden, circa 2009 or 2010, 1080 vertical lines was the maximum you could get, no matter what monitor size you purchased, unless you were willing to spend over $1000 on a monitor. It's like every panel manufacturer in existence decided to just quit. All of them were constantly increasing pixel density every few years and then
Re: (Score:2)
Agreed, people feel that 1080 vertical pixels isn't really enough but find it hard to justify spending three times as much to go up to 1200 pixels (admittedly they probably do get a higher-quality monitor for that) or three times as much again to go up to 1440 pixels.
Re: (Score:2)
Sorry, 7680x4320 means a 16x9 aspect ratio, and a monitor by that proportion is useless for any actual work.
I don't know about you, but I don't use my TV as a monitor, and I have a separate set of monitors for my computer work.
Re:useless aspect ratio (Score:5, Insightful)
I think the GP is referring to the fact that once we had a high-resolution TV spec, pretty much all panel manufacturers decided that "what's good for TVs is good for computers" and no longer make anything higher resolution than 1920x1080 unless you're willing to spend close to $1000 or more.
I see no reason to expect they'll do otherwise in the future, so any future TV resolution spec has immediate implications on future computer monitor resolutions.
Re: (Score:3)
This is also ignoring just how cheap those 16:9 screens are, compared with what you paid for a 4:3 CRT 15 years ago. They are able to be so cheap partly due to cheaper components, but also due to volume. If a monitor can work as both a TV and a computer display, that greatly expands its possible market, which means the manufacturer can sell more and defray more startup costs per unit. That translates into a lower price at the point of sale.
This is also why resolutions of more than 1080 lines cost more: ther
Re: (Score:2)
I do. Why would I not want to be able to use my nice big HDTV to look at webpages from the couch?
PROTIP: This also means you don't need hulu plus, nor the content owner to OK the item for viewing on a TV.
Re: (Score:2)
The 16:9 aspect ratio is better for multitasking since you can view two windows side by side.
Its usefulness is diminished when using a multi-monitor setup, but the majority of the market uses a single monitor.
Re: (Score:2)
Woot, an excuse to buy a new TV.
Re: (Score:2)
Useless, eh? My 2560x1440 (16:9) and 1920x1200 (16:10) monitors covered in 80x48 character vim and log windows beg leave to differ. Your "actual work" differs from mine, so let's keep the generalisations down, okay?
Incidentally, a 364x106 character terminal window has also occasionally been useful for long-lined log analysis.
TV is obsolete (Score:3)
Isn't that what old people used to watch CBS on before hulu/netflix/amazon/youtube/bittorrent/hbo to go/showtime anytime/Oprah24
By the time this standard is implemented, we'll all be streaming video directly into our eyeballs from our iPhone Vs while riding around in our self-driving google cars. Who cares what format the data is in when you're just going to slap it in one of a dozen windows and project it on the back of the seat in front of you?
Re: (Score:2)
Re: (Score:2)
I always get a bit of a kick over how much better the picture quality I get for free OTA is than the cable my friends pay for. Of course, they get far more channels, so there is a trade-off.
Re: (Score:2)
If anything, the release of zOMG Resolution++!!! new TV formats(and the accompanying marketing push by people who sell TVs) will probably make things worse.
Cable bandwidth isn't infinite, and upgrading it costs money that could be given to shareholders and management. The combination of bitrate and cleverness of encode/decode algorithm isn't something easily encapsulated into a marketing pitch. Resolution is.
If the demand is for 'better' resolution, they can just crank up the compression and deliver in the
Re: (Score:2)
Re: (Score:2)
That would be fine as long as something like CableCard existed for it so that you could bring your own device.
Re: (Score:2)
Re:I dont see the point, yet (Score:4, Insightful)
I dont see the point, yet
People buying the TVs subsidize the economy of scale, lowering the price of computer monitors of equal resolution. And incidentally releasing us from the purgatory of the 1920x1080 low-DPI crap that is spun as high-end by CE marketing departments everywhere.
Re: (Score:2)
And have you thought of the implications for porn???
You mean shaving rashes, bad teeth and stretch marks seen with even more clarity?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re:Already past what eye can resolve (Score:4, Informative)
I think that leaves out the Nyquist sampling theorem and the dynamic environment.
Even assuming the eye is a non-moving digital receiver, for the TV to exceed the eye's spatial frequency it has to provide 2X the spatial resolution in each direction.
But also, as was shown in the first 3D head-up display work at NASA Ames in the early 1990s, the eye's natural dithering combined with retinal and brain processing provides a virtual resolution that can be much higher - several times higher - than simple static pixels. Which is partly why 'nature' looks better. In the NASA experiment a pair of 128x128 pixel displays were built into a helmet that also had eye tracking. When the eye tracking and display were running at high enough rates (60 Hz+), the dithering of the eyes was picked up by the eye tracker and the 3D scene could be synthesized to match the new perspective. As a result a virtual resolution an order of magnitude greater than the raw 128x128 pixels was perceived.
The eye is constantly moving very slight amounts so that an edge between colors (for example) may be picked up by different cells (vertically and horizontally). Since cells are not aligned in vertical rows, this provides a virtual edge line that our brain extrapolates into our perception based on this constantly shifting view, resulting in perhaps (nobody knows AFAIK) five to ten times the apparent static resolution. It's the eye+brain's equivalent of subpixel rendering - call it subpixel perceiving.
Also the retinal cells are constantly switching on and off (firing and resting), shifting the view between adjacent retinal cells- anyone who has taken LSD has been aware of that as they see the 'squirming' of the image as it's picked up by different cells. Normally our brain filters that out but LSD turns off the filters, apparently.
So, bottom line, the Lechner Distance is not the final word. It assumes a static environment that does not exist, and ignores temporal characteristics in retina and brain processing of the image.
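The Nyquist point can be made concrete with a small sketch; the 60 cycles/degree peak-foveal-acuity figure is a common textbook value, used here as an assumption:

```python
# Nyquist check: to carry spatial detail up to the eye's cutoff of
# roughly 60 cycles/degree (assumed peak foveal figure), a static
# display needs at least 2x that, i.e. 120 pixels per degree.
ACUITY_CPD = 60
NYQUIST_PPD = 2 * ACUITY_CPD

for fov_deg in (30, 60, 90):  # horizontal angle the screen subtends
    ppd = 7680 / fov_deg
    verdict = "at/above Nyquist" if ppd >= NYQUIST_PPD else "under-sampled"
    print(f"8K filling {fov_deg:3d} deg: {ppd:5.1f} px/deg -> {verdict}")
```

Even a static 8K image dips below the Nyquist requirement once the screen fills a wide enough field of view — and that's before accounting for the dynamic effects described above.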
Re: (Score:3)
I think it depends greatly on usage. As we move more toward internet-based watching, resolution becomes more important - text requires better resolution than images of ripples on water, for example. It also depends on which viewers. Let's stipulate that the 'average' football fan doesn't really care whether the edges of the numbers on a jersey are crisp, as two 300 pound behemoths crash into each other at a combined speed of 30 miles per hour. But another viewer might very much like to see the details o
Re: (Score:2)
Re:Already past what eye can resolve (Score:4, Informative)
Real life is crisper because of the dynamic range of the intensities of light. All the technical details of photography---ISO range, aperture, neutral density filter, etc.---are just clever ways to clamp down the dynamic range to get a reasonable approximation of real life. Even high dynamic range (HDR) photography is an approximation. It still has to be presented through a low dynamic range display. It just means HDR is using a different clamping function.
Consider that there are also people who are tetrachromatic who can see a color between red and green. Surely all computer and TV displays, being RGB, are always lacking a color for them. Imagine seeing the world through a broken display where one of the colors isn't working.
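One concrete example of such a "clamping function" (my choice of Reinhard tone mapping as an illustration, not something from the comment):

```python
# Reinhard tone mapping: compresses unbounded scene luminance into the
# display's [0, 1) range while preserving relative detail in the shadows.
def reinhard(luminance):
    return luminance / (1.0 + luminance)

for scene in (0.01, 0.1, 1.0, 10.0, 1000.0):
    print(f"scene luminance {scene:8.2f} -> display value {reinhard(scene):.4f}")
```

Five orders of magnitude of scene luminance get squeezed into a single display range — which is exactly why even an HDR photo on a conventional panel is only an approximation of the real scene.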
Re:Already past what eye can resolve (Score:5, Informative)
What you're talking about is little to do with resolution so much as colour gamut, accurate reproduction and (yes) true 3D.
Also your eye is pretty bad unless it's looking directly at something. Then that thing comes into focus because you focus on it. That can't happen with a screen showing already-chosen focus on something else. So no matter how you squint, your eyes can't get the background trees into focus when they pass over them (and thus it's not "real") - and they probably pass over them several times a second while you're watching content you've never seen before.
What you're saying is that watching a flat box showing colour reproductions of pre-recorded 2D imagery isn't like "real-life". And it isn't. Because even the best colour elements in a TV can't replicate real-life (and some people can even perceive UV and not know it!), even the best 3D TV can't provide depth to the image sufficiently, even the best camera doesn't record everything in "focus-free" format so that you *CAN* focus on any part of the image you like, etc. etc. etc. In the same way that Stereo, 5.1, 7.2, or anything else you choose cannot accurately reproduce an arbitrary sound in an arbitrary location around your head.
The room for improvement is not in resolution. You honestly *cannot* resolve it at a decent distance with a pure datastream (companies badly compressing video? That's another issue entirely). Even though you *can* see the light of a candle in complete darkness from MILES away, you're not measuring the same things.
The best room for improvement would probably be proper "free-focus" imagery. Where you can put up an image and I can see EVERY pixel in pin-sharp detail whether it was one mile away from the camera or one inch (and not have to refocus my eyes, or to fool them sufficiently that they AUTOMATICALLY refocus themselves). Because that pixel element behind the actor's shoulder ISN'T REALLY six foot behind the one that represents his shoulder when it's displayed, so it will not look "real".
Until you have proper, full, 3D and such free-focus media, you won't get what you want. And we know how well 3D has gone down - just as well as it does every time it's "reinvented" for another generation.
Re: (Score:3)
I don't think we want light-field television (non-polarized light picture allowing you to focus on foreground or background) any more than we want 360 degree panoramic TV (except maybe for live events). The director chooses what's in the frame and what's in-focus and it's a storytelling tool. I'd like a higher color gamut and greater dynamic range, though.
Re: (Score:2)
Maybe you should go outside.
Re: (Score:2)
The re-encoding is handled by the individual affiliate's tower. If they try to squeeze two or three subchannels into their 20Mbps feed, then quality suffers. Networks tend to balance the bandwidth more to the primary sub-channel at prime-time, while making it more equal during the day. Some just have lousy encoders and waste bandwidth.