
Sources Say ITU Has Approved Ultra-High Definition TV Standard

Qedward writes with this excerpt from Techworld: "A new television format with 16 times the resolution of current High Definition TV has been approved by an international standards body, Japanese sources said earlier today. UHDTV, or Ultra High Definition Television, allows for programming and broadcasts at resolutions of up to 7680 by 4320, along with frame refresh rates of up to 120Hz, double that of most current HDTV broadcasts. The format also calls for a broader palette of colours that can be displayed on screen. The video format was approved earlier this month by member nations of the International Telecommunication Union, a standards and regulatory agency of the United Nations, according to an official at NHK, Japan's public broadcasting station, and another at the Ministry of Internal Affairs and Communications. Both spoke on condition of anonymity."
  • by mister2au ( 1707664 ) on Thursday August 23, 2012 @09:41AM (#41094253)

    We have Blu-ray, which can pump out 40 Mbps, and a new High Efficiency Video Coding (HEVC) standard coming that supports 4K/60Hz video at around 40 Mbps.

    We also have a few 4K displays just starting to appear.

    And now a UHDTV 4K video standard (as well as 8K).

    So it's looking good for the new generation, with broadcast, storage, encoding and display standards all sorted out ... bring it on!!!

  • by pikine ( 771084 ) on Thursday August 23, 2012 @10:06AM (#41094737) Journal

    Real life is crisper because of the dynamic range of the intensities of light. All the technical details of photography---ISO range, aperture, neutral density filter, etc.---are just clever ways to clamp down the dynamic range to get a reasonable approximation of real life. Even high dynamic range (HDR) photography is an approximation. It still has to be presented through a low dynamic range display. It just means HDR is using a different clamping function.

    Consider that there are also people who are tetrachromatic who can see a color between red and green. Surely all computer and TV displays, being RGB, are always lacking a color for them. Imagine seeing the world through a broken display where one of the colors isn't working.
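The "different clamping function" point can be made concrete. Here is a minimal sketch (my own illustration, not anything from the post) using the well-known Reinhard operator, one common choice of tone-mapping function that compresses an unbounded scene luminance into a displayable range:

```python
def reinhard_tonemap(luminance):
    """Map an unbounded scene luminance (>= 0) into [0, 1).

    A minimal example of the 'clamping function' idea: real-world
    dynamic range is compressed so a low-dynamic-range display can
    show it. The Reinhard operator L / (1 + L) is one such function.
    """
    return luminance / (1.0 + luminance)

# A dim candle and a sunlit wall differ by orders of magnitude in
# luminance, but both land inside the displayable range after mapping.
for L in (0.01, 1.0, 100.0, 10000.0):
    print(L, "->", round(reinhard_tonemap(L), 4))
```

Whatever function is chosen, the ordering of intensities survives but the ratios between them do not, which is exactly why even HDR output remains an approximation of the real scene.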

  • by ledow ( 319597 ) on Thursday August 23, 2012 @10:11AM (#41094849) Homepage

    What you're talking about has little to do with resolution and much more to do with colour gamut, accurate reproduction and (yes) true 3D.

    Also, your eye is pretty bad unless it's looking directly at something. Then that thing comes into focus because you focus on it. That can't happen with a screen showing already-chosen focus on something else. So no matter how you squint, your eyes can't get the background trees into focus when they pass over them (and thus it's not "real") - and they probably pass over them several times a second while you're watching content you've never seen before.

    What you're saying is that watching a flat box showing colour reproductions of pre-recorded 2D imagery isn't like "real-life". And it isn't. Because even the best colour elements in a TV can't replicate real-life (and some people can even perceive UV and not know it!), even the best 3D TV can't provide depth to the image sufficiently, even the best camera doesn't record everything in "focus-free" format so that you *CAN* focus on any part of the image you like, etc. etc. etc. In the same way that Stereo, 5.1, 7.2, or anything else you choose cannot accurately reproduce an arbitrary sound in an arbitrary location around your head.

    The room for improvement is not in resolution. You honestly *cannot* resolve it at a decent distance with a pure datastream (companies badly compressing video? That's another issue entirely). Even though you *can* see the light of a candle in complete darkness from MILES away, you're not measuring the same things.

    The best room for improvement would probably be proper "free-focus" imagery. Where you can put up an image and I can see EVERY pixel in pin-sharp detail whether it was one mile away from the camera or one inch (and not have to refocus my eyes, or to fool them sufficiently that they AUTOMATICALLY refocus themselves). Because that pixel element behind the actor's shoulder ISN'T REALLY six foot behind the one that represents his shoulder when it's displayed, so it will not look "real".

    Until you have proper, full, 3D and such free-focus media, you won't get what you want. And we know how well 3D has gone down - just as well as it does every time it's "reinvented" for another generation.

  • by mister2au ( 1707664 ) on Thursday August 23, 2012 @10:54AM (#41095631)

    There certainly is work in that direction ... http://en.wikipedia.org/wiki/High_Efficiency_Video_Coding [wikipedia.org] ... That gets you a 50% reduction in bit rate over MPEG-4/AVC - which in turn is a 50% reduction over the MPEG-2 used in most digital TV standards.

    So that's a 2-4x increase in efficiency + modulation improvements that are bound to happen = plenty of scope for 4K digital TV

    8K is a bit more of a stretch at the moment
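The 2-4x claim, and why 8K is still a stretch, can be checked with back-of-the-envelope arithmetic (the per-pixel baseline is a normalised illustrative assumption, not a figure from any spec):

```python
# Back-of-the-envelope check of the codec-efficiency chain described
# above: HEVC ~ 50% of AVC's bit rate, AVC ~ 50% of MPEG-2's, so HEVC
# needs roughly a quarter of MPEG-2's bits for the same quality.
mpeg2_bits_per_pixel = 1.0          # normalised baseline (illustrative)
avc_bits_per_pixel = mpeg2_bits_per_pixel * 0.5
hevc_bits_per_pixel = avc_bits_per_pixel * 0.5

print(mpeg2_bits_per_pixel / hevc_bits_per_pixel)   # -> 4.0

# Pixel counts: 4K is 4x 1080p, 8K is 16x. So under HEVC a 4K stream
# costs about the same bits as a 1080p MPEG-2 stream, while 8K still
# costs ~4x - which is why 8K remains more of a stretch.
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320
print(pixels_4k / pixels_1080p, pixels_8k / pixels_1080p)  # -> 4.0 16.0
```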

  • Re:screw that (Score:4, Informative)

    by hazydave ( 96747 ) on Thursday August 23, 2012 @11:58AM (#41096751)

    That's nominally 3840 x 2160, aka "4K". You get it, or something like it, at the better movie theaters these days. There are already camcorders shipping that do this, and televisions coming Real Soon Now (http://www.theverge.com/2012/8/22/3259613/lg-84-inch-4k-tv-korea-release-north-america-europe-latin-asia). YouTube already supports 4K video. HDMI 1.4 does, too, at least up to 24p.

    So if it's already real, it's hopefully not the subject of work on new standards. And the 4K stuff is coming on fast enough that it's all based on logical extensions to what already exists. TVs are smart enough to adapt to the input and reformat lower resolution video. Disc delivery doesn't matter as much as it used to, but just like 3D, if 4K is important in the home, a new Blu-ray profile will cover it (if you really want more storage, the existing BD-XL format might get employed).

    Starting out worrying about 8K video now, these guys will have the time to think about much larger changes in the video infrastructure.

  • Re:Great! (Score:4, Informative)

    by Durrik ( 80651 ) on Thursday August 23, 2012 @01:01PM (#41097657) Homepage

    There's a good link I usually pass out when people start to talk about noticing the difference between 720p and 1080p.

    http://hd.engadget.com/2006/12/09/1080p-charted-viewing-distance-to-screen-size/ [engadget.com]

    Now I don't know where the line for 4320p would be since the article is old, but if you look at the line for 1080p at a viewing distance of 5 feet you need a TV around 38 inches. For 1440p at the same distance you need a TV around 51 inches, a difference of 13 inches.

    1080p is 2,073,600 pixels
    1440p is 3,686,400 pixels
    4320p is 33,177,600 pixels

    1440 is 1.33... times bigger than 1080
    3,686,400 is 1.77... times bigger than 2,073,600
    4320 is 3 times bigger than 1440
    33,177,600 is 9 times bigger than 3,686,400

    Using simple linear approximation:
    Screen size at a fixed distance scales with the line count, not the pixel count. So going from 1440p to 4320p (3 times the lines) at 5 feet, you'd need roughly 3 x 51 = 153 inches to get the full benefit of 4320p.

    I don't know about you, but sitting 5 feet away from a 153-inch screen wouldn't work for me. Even half that is a huge TV to be only 5 feet away. I don't think you can follow all the action across the entire screen from that distance.
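The chart's numbers follow from a standard visual-acuity rule of thumb: the resolution pays off once each pixel spans roughly one arcminute of view. A quick sketch of that calculation (the 16:9 aspect ratio and the 1-arcminute figure are assumptions, not taken from the chart):

```python
import math

ARCMIN = math.radians(1.0 / 60.0)  # ~1 arcminute, a common acuity figure

def min_diagonal_inches(vertical_lines, distance_inches, aspect=16/9):
    """Smallest 16:9 diagonal at which each pixel spans ~1 arcminute
    at the given distance, i.e. where the resolution starts to pay off."""
    pixel_pitch = distance_inches * math.tan(ARCMIN)
    height = vertical_lines * pixel_pitch
    width = height * aspect
    return math.hypot(width, height)

# At 5 feet (60 inches): 1080p -> ~38", 1440p -> ~51", 4320p -> ~154",
# consistent with the chart and with scaling linearly in line count.
for lines in (1080, 1440, 4320):
    print(lines, round(min_diagonal_inches(lines, 60)))
```

Because height scales linearly with the line count at a fixed pixel pitch, the diagonal does too, which is why the 3x jump in lines from 1440p to 4320p triples the required screen size.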

  • I think that leaves out the Nyquist sampling theorem and the dynamic environment.

    Even assuming the eye is a non-moving digital receiver, for the TV to exceed the eye's spatial frequency it has to provide 2X the spatial resolution in each direction.

    But also, as was shown in the first 3D head-up display work at NASA Ames in the early 1990s, the eye's natural dithering combined with retinal and brain processing provides a virtual resolution that can be much higher - several times higher - than simple static pixels. Which is partly why 'nature' looks better. In the NASA experiment a pair of 128x128 pixel displays was built into a helmet that also had eye tracking. When the eye tracking and display were running at a high enough rate (60 Hz+), the dithering of the eyes was picked up by the eye tracker and the 3D scene could be re-synthesized to match the new perspective. As a result, a virtual resolution an order of magnitude greater than the raw 128 pixels was perceived.

    The eye is constantly moving very slight amounts so that an edge between colors (for example) may be picked up by different cells (vertically and horizontally). Since cells are not aligned in vertical rows, this provides a virtual edge line that our brain extrapolates into our perception based on this constantly shifting view, resulting in perhaps (nobody knows AFAIK) five to ten times the apparent static resolution. It's the eye+brain's equivalent of subpixel rendering - call it subpixel perceiving.

    Also, the retinal cells are constantly switching on and off (firing and resting), shifting the view between adjacent retinal cells - anyone who has taken LSD has been aware of that as they see the 'squirming' of the image as it's picked up by different cells. Normally our brain filters that out but LSD turns off the filters, apparently.

    So, bottom line, the Lechner Distance is not the final word. It assumes a static environment that does not exist, and ignores temporal characteristics in retina and brain processing of the image.
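The Nyquist point above can be illustrated numerically: detail finer than half the sampling rate doesn't vanish, it folds back and shows up at a false (alias) frequency. A minimal sketch (my own illustration, not from the comment):

```python
def alias_frequency(signal_freq, sample_rate):
    """Apparent frequency after sampling: detail above the Nyquist
    limit (sample_rate / 2) folds back to a lower frequency instead
    of being resolved - the same reason a display needs ~2x the eye's
    spatial frequency in each direction to avoid visible aliasing."""
    folded = signal_freq % sample_rate
    return min(folded, sample_rate - folded)

# 7 cycles sampled 10 times per unit exceeds the Nyquist limit (5):
# it shows up as a bogus 3-cycle pattern rather than the real 7.
print(alias_frequency(7, 10))   # -> 3
print(alias_frequency(4, 10))   # -> 4 (below Nyquist: reproduced faithfully)
```

The same folding happens spatially on a screen: a fine texture beyond what the pixel grid can represent turns into moiré patterns rather than sharp detail.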
