Hardware

New HDMI 2.1 Spec Includes Support For Dynamic HDR, 8K Resolution (techhive.com) 192

The HDMI Licensing Group has unveiled the HDMI 2.1 spec, adding support for dynamic HDR, 8K60, and 4K120. From a report on TechHive: To take full advantage of the new HDMI spec, you'll need a new 48-gigabit-per-second cable. That cable will also work with older HDMI 1.3 (10.2Gbps) and HDMI 2.0a (16Gbps) ports, but those ports don't support the new HDMI 2.1 features. [...] HDMI 2.1 adds support for the new object-oriented audio codecs -- such as Dolby Atmos and DTS X -- which can position audio events from movie soundtracks in 3D space.
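
A quick back-of-the-envelope sketch of why the new cable is rated at roughly 48 gigabits per second: multiply pixels by frame rate by bits per pixel. This is only the raw pixel payload -- it ignores blanking intervals, link-layer encoding overhead, chroma subsampling, and Display Stream Compression -- and the helper function below is just for illustration.

    # Rough uncompressed video payload, ignoring blanking and link overhead.
    def raw_bitrate_gbps(width, height, fps, bits_per_pixel=24):
        """Raw 8-bit RGB pixel data in gigabits per second."""
        return width * height * fps * bits_per_pixel / 1e9

    for name, (w, h, fps) in {
        "1080p60": (1920, 1080, 60),
        "4K120":   (3840, 2160, 120),
        "8K60":    (7680, 4320, 60),
    }.items():
        print(f"{name}: {raw_bitrate_gbps(w, h, fps):.1f} Gb/s of raw pixels")

That works out to about 3 Gb/s for 1080p60, 24 Gb/s for 4K120, and 48 Gb/s for 8K60, which lines up with the modes the new spec targets.
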
This discussion has been archived. No new comments can be posted.


  • by Joe_Dragon ( 2206452 ) on Wednesday January 04, 2017 @01:44PM (#53605465)

    Get your HDMI 2.1 Monster cable today, only $89.99, and ask a blue shirt about our install deals and our audio systems.

    • by SlashdotOgre ( 739181 ) on Wednesday January 04, 2017 @02:02PM (#53605615) Journal

      Don't forget your extended warranty, only $49.99 + 10% deductible.

    • by EvilSS ( 557649 )

      Get your HDMI 2.1 Monster cable today, only $89.99 ...

      It's on sale for 90% off!?

      • when you take our home install deal and buy at least 2 of them: 1 from your TV box to the sound system and 1 from the sound system to the TV.

  • [...] which can position audio events from movie soundtracks in 3D space

    I, for one, welcome this new THREE DIMENSIONAL space. I hate this crappy 2 dimensional world we have been living in our whole lives until now.

    • by Rei ( 128717 )

      Obviously what they're talking about is breaking individual sounds down into their own channels based on assignment to 3d spatial coordinates rather than having individual channels for some arbitrary "standard" set of speakers coming from certain "standardized" directions. Which is IMHO kind of neat. You could have as many speakers as you wanted and put them wherever you wanted. Or perhaps combine face recognition and ultrasonic directional speakers to give a very precise sound direction to each person w
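
A rough illustration of the object-based idea in the comment above: each sound carries a 3D position, and the renderer computes per-speaker gains for whatever speakers actually exist rather than mixing for a fixed layout. The inverse-distance weighting below is a toy of my own, not the actual Dolby Atmos or DTS:X rendering algorithm.

    import numpy as np

    def speaker_gains(obj_pos, speaker_positions, rolloff=2.0):
        """Toy object-audio panner: weight each speaker by inverse distance
        to the sound object, then normalize for roughly constant power."""
        obj = np.asarray(obj_pos, dtype=float)
        spk = np.asarray(speaker_positions, dtype=float)
        dist = np.linalg.norm(spk - obj, axis=1) + 1e-6   # avoid divide-by-zero
        w = 1.0 / dist ** rolloff
        return w / np.sqrt(np.sum(w ** 2))                # unit-power gain vector

    # Four arbitrarily placed speakers (coordinates in metres, listener at origin).
    speakers = [(-2, 2, 0), (2, 2, 0), (-2, -2, 0), (2, -2, 2)]
    print(speaker_gains(obj_pos=(1.5, 1.0, 1.0), speaker_positions=speakers))

Add as many speakers as you like, wherever you like, and the same object positions still render sensibly -- which is the appeal of the approach.
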

  • by Oswald McWeany ( 2428506 ) on Wednesday January 04, 2017 @01:47PM (#53605497)

    I'm sure there's a reason why someone might want 8K, but I've not even been convinced of the benefit of 4K yet.

    • Virtual Reality might be a reason. Although we'd need 8K90, not 8K60, and what we'd really want is 8K90 per eye, which comes to 16K90. So expect HDMI 2.1 to be outdated before it becomes available ;)

      • Actually, virtual reality needs higher resolutions, but it needs higher framerates even MORE. 60fps is enough to produce fluid video when passively watched (though synthetic video still needs some amount of added motion blur to be really convincing), but it's absolutely NOT fast enough for immersive video. When there's at least 1/60th of a second lag between your hand's motion and seeing your virtual hand move, you'll DEFINITELY notice the lag. 1/60th second lag feels, well... "laggy". 1/30th second lag fee
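
The latency point above is easy to put numbers on: the frame interval alone sets a floor on motion-to-photon lag, before tracking, rendering, and display scan-out add their own delays. A minimal sketch:

    # Frame interval at common refresh rates -- a floor on motion-to-photon lag.
    # Real VR latency also includes tracking, rendering, and display scan-out.
    for hz in (30, 60, 90, 120):
        print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms between frames")

That is roughly 16.7 ms at 60 Hz versus 11.1 ms at 90 Hz, which is part of why VR headsets tend to target 90 Hz or more.
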

      • by Rei ( 128717 )

        VR's an interesting case, if you can incorporate eye tracking. You only need that sort of resolution in the center of view; in the periphery you could update only one in every X pixels in each direction, and have the display just fill in their neighbors with the same colour. Or you could only send the low frequency components of a compressed data block for peripheral areas, dropping all of the high frequency data (aka fine detail). Eye tracking wouldn't reduce the number of physical pixels that the scre
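
A toy version of the foveation idea in the comment above, assuming an eye tracker already supplies a gaze point: keep full detail inside a foveal radius, and in the periphery keep only every Nth pixel and fill its neighbours with the same colour. The function and parameters are illustrative, not any shipping headset's pipeline.

    import numpy as np

    def foveate(frame, gaze_xy, fovea_radius=200, periphery_step=4):
        """Keep full detail near the gaze point; in the periphery, replicate
        every `periphery_step`-th pixel over its neighbours.
        frame: H x W x 3 uint8 array, gaze_xy: (x, y) in pixels."""
        h, w = frame.shape[:2]
        # Coarse version of the whole frame: sample a grid, then repeat it.
        coarse = frame[::periphery_step, ::periphery_step]
        low = np.repeat(np.repeat(coarse, periphery_step, axis=0),
                        periphery_step, axis=1)[:h, :w]
        # Boolean mask of pixels within the foveal radius of the gaze point.
        ys, xs = np.mgrid[0:h, 0:w]
        fovea = (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2 <= fovea_radius ** 2
        out = low.copy()
        out[fovea] = frame[fovea]        # full detail only where the eye is looking
        return out

    frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
    result = foveate(frame, gaze_xy=(960, 540))

With periphery_step=4 only about 1/16th of the peripheral pixels need to be sent, which is the kind of saving that makes 8K-per-eye links less scary.
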

    • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Wednesday January 04, 2017 @02:07PM (#53605651) Homepage

      For movies, not much. There's definitely a wow factor in some of them but you quickly forget about it and just enjoy the movie.

      But for coding and web browsing, I found 4K to be a surprising win.

      The extra clarity in text is absolutely wonderful. With low-res screens I often find myself wanting to zoom in on text despite being able to read the small text without straining my eyes. When I got my first 4K screen I noticed I was no longer tempted to do any zooming.

      • For movies, not much. There's definitely a wow factor in some of them but you quickly forget about it and just enjoy the movie.

        After getting used to higher resolution on my desktop and in games I occasionally get distracted by the resolution of 1080p TV and movies. I get used to it for a while then there will be a scene switch or something and I'll get bothered by it. Not as often as the jitter from low frame rate panning though.

      • For movies, not much. There's definitely a wow factor in some of them but you quickly forget about it and just enjoy the movie.

        But for coding and web browsing, I found 4K to be a surprising win.

        The extra clarity in text is absolutely wonderful. With low-res screens I often find myself wanting to zoom in on text despite being able to read the small text without straining my eyes. When I got my first 4K screen I noticed I was no longer tempted to do any zooming.

        I disagree. For movies there is a good bit of difference, especially for 4K content with HDR. Is it a huge difference, nope. Is it enough of a difference to make it more enjoyable and more lifelike, yep! Is it worth upgrading just for the new format? Not yet. Not unless you have to (i.e. broken TV). I was planning on waiting another couple of years to go to 4K, but my Plasma died and it was under warranty.

        Do you notice the difference? Yes, definitely. For content that is filmed in 4K and uses HDR.

      • Exactly. Your monitor goes from looking like a really high-res computer monitor, to looking like the brightly-illuminated output of a laser printer. My Nexus 6P has a ~480dpi screen, and I've gotten so spoiled by razor-crisp text that 200dpi text on a more average tablet now makes my eyes feel like they're bleeding.

        I actually bought a Chuwi Hi12 (2160x1440 12" display) JUST to use for ebook reading, because I couldn't stand to read ebooks on my older tablets anymore. It's a little too heavy to use as a "tablet"

      • by AmiMoJo ( 196126 )

        For cinema-size screens 4K is a bit crap. It looks pixelated to me, which is not surprising when you consider the low DPI.

    • Early adopter of 4K here. 8K is for the uselessly rich, bleeding-edge adopters. 4K is still fresh, but no longer bleeding edge or early now, in my opinion.

      Short story: there is already some benefit, but not much yet. So unless you are itching for new tech, hold off on springing for that new TV just yet.

      I currently have a Samsung 4K UN65KS8000.
      It was selected for its low input lag and because I could get it for $1500.
      I previously used a Vizio 4K D55-D2.
      Both do very well on input lag in PC/Game modes. The S

      • Everyone here saying 8K is useless was making the same arguments about 4K.

        I'm standing now, in front of a 55" 4K monitor, two feet from my face. It's freaking beautiful for playing Chivalry, or having multiple remote desktops running at once.

        AMD and nVidia have BOTH come out and said 8K and 16K will approach the limits of the human eye and still be useful. LG has just patented a process for their 11K TV because they say it "feels 3-D" without actually using any goggles.

        I just watched Jungle Book in 4K on and it wa

    • by Greyfox ( 87712 )
      I'm on a comparatively tiny 15" laptop screen that's 4K, and the difference between 4K and 1080p is astounding to me. It's significantly sharper, and I'm quite keen to upgrade to a large 4K screen on my desktop machine. Unfortunately, that would require upgrading the entire machine, which I don't want to pay for just yet, but I will at the first opportunity.

      I've also started shooting my skydiving videos with a 4K GoPro, and even when viewing the videos at 1080, the difference is pretty amazing. You can ac

    • by Kjella ( 173770 )

      I'm sure there's a reason why someone might want 8K, but I've not even been convinced of the benefit of 4K yet.

      Well on my UHD monitor I can see the difference between a 3840x2160 crop of a photo and the same photo resized to 1920x1080 and back, though it's not huge. If I use a really stupid upscaler to simulate a 1920x1080 screen it's even more obvious. But if I need it to watch TV... not so sure. But there's UHD the resolution and there's UHD the format with HDR, Rec.2020, 10 bit color etc. which all together is a pretty big improvement over BluRay. Going to 8K is probably going to be like 96KHz/24 bit audio, it mi
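
The comparison in the comment above is easy to reproduce. A small sketch using Pillow (the input file name is a placeholder -- substitute any native-UHD photo): downscale to 1080p, then upscale back once with a decent resampler and once with a deliberately crude nearest-neighbour one, and flip between the three results on a UHD screen.

    from PIL import Image

    src = Image.open("photo_3840x2160.jpg").convert("RGB")   # placeholder file name
    down = src.resize((1920, 1080), Image.LANCZOS)           # the 1080p version

    good_up = down.resize((3840, 2160), Image.LANCZOS)       # reasonable upscaler
    bad_up  = down.resize((3840, 2160), Image.NEAREST)       # "really stupid" upscaler

    src.save("native_uhd.png")
    good_up.save("upscaled_lanczos.png")
    bad_up.save("upscaled_nearest.png")
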

    • by EvilSS ( 557649 )

      I'm sure there's a reason why someone might want 8K, but I've not even been convinced of the benefit of 4K yet.

      Video walls. Like Barney Stinson had!

      Or projectors. Amazing how people forget those exist.

    • by Solandri ( 704621 ) on Wednesday January 04, 2017 @03:31PM (#53606279)
      1080p Blu-ray was encrypted by HDCP. Intel somehow lost the master key, allowing anyone to decode any past and future content encoded with HDCP 2.1 or earlier. The studios' response was to create an entirely new, not backwards-compatible HDCP 2.2 around the time the HDMI 4k video standard (HDMI 2.0) was released. Unfortunately they did it late, so there was about a 1-year gap when 4k equipment was sold with HDMI 2.0 capability, but not HDCP 2.2. This meant that these 4k TVs and Blu-ray players could not play commercial 4k Blu-rays. If you burned your own 4k movies to Blu-ray they would play, but not the stuff Hollywood released. I spent a lot of time warning people not to buy 4k equipment that first year, and warning them to be careful to check for "HDCP 2.2" in the specs that second year. Hollywood doesn't care if your TV/Blu-ray player doesn't work. They just want their crap protected.

      HDCP 2.2 was broken in late 2015 [wikipedia.org]. Not sure if it was cracked or someone just made a device using a legit HDCP 2.2 decryption key. But if it was cracked, we're probably going to go through all this again. Hollywood will insist on a new, non-backwards-compatible HDCP 2.3, and it will be used to encode all future Blu-rays starting with 8K releases.

      They also enjoy double- or triple-dipping: charging you full price for a license to the same movie in different formats. Same with the record studios, who had no qualms about charging you for the same song on vinyl, tape, and CD. The software industry gets this right - they let you upgrade at a discounted price if you own a previous version. This reflects the reality that you already purchased a license for the previous versions, and thus the new version is only giving you some new functionality instead of entirely new functionality. But Hollywood has deluded themselves into thinking that their product is a license when that's convenient for them, and a product when that's convenient for them. So they will charge you full price even if you've already purchased licenses for the movie three times at 360i (VHS), 525i (DVD), and 1080p.

      People need to stop putting up with this crap and demand lower-price upgrade licenses for content they've already paid for. IMHO a lot of piracy would disappear if the studios simply adopted pricing which better reflected reality. Most people want to pay content creators for their work, but not if they judge that the content creators are trying to rip them off. The whole fiasco with Windows XP support contracts is a great example. Microsoft pushed support contracts for XP hard and lots of companies signed up. Instead of buying XP, they were buying 3 years of Windows support, which would include XP and an upgrade to the next version of Windows (new versions normally come out about every 3 years). Unfortunately Vista got delayed and wasn't released until 5.5 years after XP - outside the support contract period these companies had paid for. There was hell to pay, with many companies believing Microsoft deliberately delayed Vista so they wouldn't have to fulfil that portion of the contract. Even though Microsoft eventually relented and gave these companies Vista, many of them will never buy a support contract or subscription software from Microsoft again. Because they judge it to be unfairly skewed in favor of the supplier.
      • A secret master key sounds pretty naive. EVERY decoding device needs to include that master key in some shape or form; eventually someone is going to reverse-engineer it and release it! Or more likely, like the failure of the Great Wall of China, somebody will just bribe one of the gatekeepers.
    • 4K is the limit of human visual perception, so 4K makes sense in that you can't see the individual pixels when the entire screen is in your field of view. Going higher than 4K for video entertainment is pointless; it's only a marketing numbers game to get more money out of naive consumers. Recently, I discovered many vendors make TOSlink cables with gold-plated connectors. That's right, OPTICAL cables with gold plating, which does absolutely nothing to improve the light-conducting properties of the fiber! C
  • Since most theaters only project at 2K and 4K barely has a foothold in the home market, why? It seems like a new standard this early is really a bad idea. Some better compression technology could come along that will make this standard obsolete before it goes into consumer production. The only reason I can think of is a non-existent 3D technology.
    • Standard theaters don't need higher resolution in part because nobody likes sitting close enough to the screen (first few rows) that the resolution is readily noticeable, and outside of the large IMAX theaters, the average screen-size-to-seating-distance ratio isn't big enough to justify the cost of higher-resolution projectors.

      Possible 8K uses:
      * Large, high-resolution desktop monitors for artists, as well as for developers that like densely packed screens. I'd gladly trade my current multi-display setup for somet

      • Again, human visual perception is limited to about 4000 pixels across the field of view, so having a screen with greater resolution is only useful if you are planning on looking at just a fraction of the screen at a time. I think most movies assume you're looking at the whole screen!
    • 8K buys you nothing when human visual perception is limited to 4K! "Duh, bigger numbers better!" appears to be the new marketing mantra.
      • 8K buys you nothing when human visual perception is limited to 4K! "Duh, bigger numbers better!" appears to be the new marketing mantra.

        But... From what I can find, some estimates are that the human eye can see up to 15 million pixels. 4K is only about 8.3 million pixels. So there is some benefit to going to 8K, at roughly 33 million pixels, as it finally exceeds what the eye can theoretically handle. If you cut that in half to provide for 3D, then you have two pictures of roughly 16.6 million pixels each, still enough that the eye cannot distinguish computer-generated imagery from reality.
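
For reference, the raw pixel counts behind the numbers being traded in this sub-thread (the ~15-million-pixel figure for the eye is itself only a rough estimate):

    # Pixel counts for the resolutions being discussed.
    resolutions = {"1080p": (1920, 1080), "4K UHD": (3840, 2160), "8K UHD": (7680, 4320)}
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h / 1e6:.1f} million pixels")

That puts 8K UHD at about 33 million pixels -- four times 4K, and still a bit above the 15-million estimate even after halving it for stereo 3D.
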

    • Since most theaters only project at 2K and 4K barely has a foothold in the home market, why? It seems like a new standard this early is really a bad idea. Some better compression technology could come along that will make this standard obsolete before it goes into consumer production. The only reason I can think of is a non-existent 3D technology.

      I am really surprised people are talking about cinemas with 2K projectors; that is some cheap shit, and not even an upgrade from the analog film that came before. All cinemas I frequent have either analog (for old stuff and indie) or 4K digital. Still, 8K digital would be nice, though I would prefer movies being shot at something faster than 24fps; that is a much better way to add visual detail movie-goers can reliably perceive.

  • But will the standard allow transmission of traditional closed captioning embedded in the video signal? Useful when the source doesn't provide open captioning or uses a crappy font.

    • But will the standard allow transmission of traditional closed captioning embedded in the video signal? Useful when the source doesn't provide open captioning or uses a crappy font.

      If the source doesn't provide them I don't see why it helps. Likewise if they're encoded in the video with a crappy font you're out of luck. But if they're proper closed captions, then you should be able to change the font in your media player.

  • Great, so my current receiver, TV, and console are out of date. Time to throw it all away and buy new stuff for 8K!!!!
    • by rnturn ( 11092 )

      They convinced us to do just that several times with our music collections. I'm sure the media companies are thinking "Why wouldn't they do it to watch movies at home?"

  • A group of veterinarians and ophthalmology surgeons announced plans for a new procedure to enable people to enjoy 8K resolution monitors. The details are unclear, but it looks like they are planning to harvest foveae from eagle retinas and transplant them into their volunteers. They feel the retinas and eye resolution of human beings do not do full justice to the 8K monitors. Mr Applef Anboy, spokesman for the American Association Of Consumers Of The Latest And Greatest Gadgets agreed. "We have reached the
    • That's exactly the point I keep trying to make: human visual perception is limited to approximately 4000 pixels across the field of view. Doubling that resolution is absolutely imperceptible to a human viewer!
  • by djinn6 ( 1868030 ) on Wednesday January 04, 2017 @02:48PM (#53605981)

    48-gigabit-per-second cable

    The problem HDMI solves is shuffling data from one device to another. We've had 100 Gbit/s Ethernet for years now, and it solves the exact same problem. USB and Thunderbolt also solve the same problem, but provide DC power on top of it. TVs are basically small computers at this point; there's absolutely no reason they need a specialized port just to receive data.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Really? 100Gbps ethernet? Maybe you should go look at the specs again. ONE Gbps is common at home, 10Gbps is common only in enterprise environments, 100Gbps is not common at all and usually requires fiber optics.

    • A 40Gbps IP-over-HDMI link sounds nice at consumer AV equipment prices.
      • A 40Gbps IP-over-HDMI link sounds nice at consumer AV equipment prices.

        You mean like how I can get IP-over-Thunderbolt at 40Gbps now? I don't have to wait 2 years for HDMI 2.1 to do what I've been doing for months already.

        I'm not impressed.

    • Great... so, how long will it take to download that 48Gbps video over my 100Mbps Comcast internet connection? Only 20 days for a 1 hour movie? Awesome! As I try to tell anyone buying a 4K TV: figure out where your 4K content is going to come from first! Comcast doesn't have 4K content, although they keep saying "Any day, real soon now!" And downloading 4K video from Netflix or Amazon Prime is probably going to put you over Comcast's new 1TByte/month data cap, costing you up to an additional $200/month. ($10
      • by rnturn ( 11092 )

        This.

        (Too bad I already commented and forfeited my moderation points.)

      • It's actually only an additional $50/month for unlimited bandwidth, or free if you have Comcast Gigabit PRO (2Gbps symmetrical).

        • by fnj ( 64210 )

          Comcast Gigabit PRO (2Gbps symmetrical).

          Funny; there's no mention of any offering anywhere near that where I am. I could get 150/10, but it adds so much cost and my need is so slight that I stick with the 25/5.

      • You underestimate the bandwidth of envelopes filled with Blu-ray discs sent by post.

        As I recall Netflix still rents out discs by post. If 4K and 8K become popular then services like this might become popular as well.
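
The arithmetic behind the "20 days" quip a few comments up, taking its premise at face value (downloading the full 48 Gb/s link rate rather than a compressed stream), plus the Blu-rays-by-post comparison; the disc capacity and delivery time are assumptions:

    link_gbps = 48          # HDMI 2.1 signalling rate, treated as the download size
    down_mbps = 100         # the quoted Comcast connection
    hours = 1

    total_gbits = link_gbps * hours * 3600
    seconds = total_gbits / (down_mbps / 1000)
    print(f"{total_gbits / 8 / 1000:.1f} TB to move")          # ~21.6 TB
    print(f"{seconds / 86400:.1f} days at {down_mbps} Mb/s")   # ~20 days

    discs = total_gbits / 8 / 100   # assuming 100 GB per triple-layer UHD Blu-ray
    print(f"or about {discs:.0f} discs in the post")           # a box, not an envelope

Real streams are compressed by a couple of orders of magnitude, but the point stands: a parcel of discs delivered in two days still beats 20 days of downloading.
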

    • by AmiMoJo ( 196126 )

      Don't underestimate just how impressive 48Gb/sec over a cheap $5 cable with cheap electronics is.

      100G Ethernet is fairly expensive, but at least manages 7m over copper. USB 3.1 only goes up to 10Gb/sec, with a max 5m cable. Thunderbolt 3 tops out at 40Gb/sec, its power delivery is less than 10W, and the maximum copper cable length is 3m. Optical can go further but is more expensive to implement, and consumers don't seem to like optical cables for some reason.

      As for why not just use an existing port, the answer is

  • by BlueCoder ( 223005 ) on Wednesday January 04, 2017 @02:55PM (#53606023)

    Can we please bite the bullet? We survived the transition to HD. Remember when a plain 1080i TV was 8 grand? People still pay $100 for digital Monster cables.

    Don't tell me lasers are that expensive, and yes, I do understand about the frequencies. But plain red lasers used to cost $200 and now you can get them at the 99 cent store.

    When are we going to transition over to optical? Why are the powers that be holding us back?

    • It's obvious if you look at the way people treat their cables. You can't expect them to obey something as esoteric as bend radius limits when their plain old copper wires hardly survive in one piece.

      Incidentally, S/PDIF isn't doing too great these days, which is a shame. One of my old laptops from 2005 had optical audio output, and it was awesome especially given the poor quality of its analog output. Since then, this feature has been missing from most laptops, and even with desktop mobos you have to be

      • Incidentally, S/PDIF isn't doing too great these days, which is a shame. One of my old laptops from 2005 had optical audio output, and it was awesome especially given the poor quality of its analog output. Since then, this feature has been missing from most laptops, and even with desktop mobos you have to be careful.

        Current systems can generally output S/PDIF digital audio through the line-out port; it's a standard feature, though somewhat hidden. You just need to connect an RCA adapter (use the right/red channel) and enable the S/PDIF output switch in the sound card settings. Audio quality is the same as Toslink (optical S/PDIF), though the signal may attenuate over very long coax links. There are devices like this one [a.co] available which convert from coax to Toslink.

        It seems since HDMI came out, you shouldn't need any other way of getting raw digital audio, which seems especially silly with something like 5.1 or better...

        Unfortunately, S/PDIF doesn't support multichannel PCM;

    • by AmiMoJo ( 196126 )

      Optical is more expensive than copper, both for the cable and for the transceiver on either end. It's less flexible too, doesn't like being bent too much, especially if it is cheap.

      Sure, cheap lasers are cheap, but they also suck. To get the kind of speed required (48Gb/sec) from fibre you need multiple parallel optical lanes, i.e. you need to send multiple optical signals down it. The diodes that generate and receive the signals simply can't switch fast enough to do it with one stream. So you need some expensive hardware

  • Not sure what the article is referring to; we've had DTS:X and Atmos-enabled Blu-rays and receivers for quite a while already...

    • by nwf ( 25607 )

      Not sure what the article is referring to; we've had DTS:X and Atmos-enabled Blu-rays and receivers for quite a while already...

      That's likely just a bit stream which has no meaning per the spec. I think the spec is allowing for a higher-level way of sending that sort of audio, such that you could decode Atmos into some number of PCM streams placed in 3D space vs an Atmos bit stream. Not really sure how that helps anyone.

      I know my AppleTV decodes surround sound into straight PCM surround which it sends to my receiver. I guess this would be similar, but with N spatial channels.

  • by Locke2005 ( 849178 ) on Wednesday January 04, 2017 @04:18PM (#53606581)
    8K is TWICE the resolution of the human eye, which can only distinguish about 4000 pixels across the field of vision. Higher resolution is only useful if you're sitting close enough to see only half the screen in your field of view! Point is, 4K is the point of diminishing returns in video resolution. At 4K, you cannot distinguish individual pixels when the entire screen is in your field of view. Higher resolution for a TV screen is pointless. Higher resolution for a camera only makes sense if you plan on blowing up the image or a section of it. Satellite cameras can still use all the resolution they can get.
    • For TV and movies, perhaps, but I'm still waiting for it for desktop use. And, by desktop, I mean like a Surface Studio with a 48-50" monitor for working on full size E architectural prints. I may not be able to see pixels on more than a portion of the screen, but there's no bigger productivity killer than having to constantly scroll around a print looking through a little "window" onto the page. Right now I use a pair of 42" 4k monitors which is good for a D size drawing at nearly 1:1. Even so, at my norm

    • by djbckr ( 673156 )
      I would suggest that the 8K support is mainly geared toward professional capture. Current [very] high-end cameras support 8K capture, but managing the output is proprietary per brand. The pros use the 8K to work with editing/zooming allowing them to keep at least a 4K image as final output. IMO, having the HDMI spec support 8K is a good move to standardize video transfer. I see it supports 60fps too. Another good move. I really can't imagine why consumer equipment would exceed 4K - but I'm sure somebody wil
      • IMO, having the HDMI spec support 8K is a good move to standardize video transfer.

        In my experience the HDMI standard is already dead or dying. I'd much rather see a more common connector be used if backward compatibility is required. I'd also rather see a better designed connector used than HDMI. HDMI is friction fit and heavy, meaning the connector can work itself loose under its own weight. DVI doesn't have this problem (screws), USB-C doesn't have this problem (small and light), and neither does DisplayPort (locking tab).

        I've seen SuperMHL announced a year ago, not that you'll fin

    • by Ramze ( 640788 )

      Not really. For 20/20 vision ("good" eyesight), the limit is closer to 5K, so most everyone will notice the difference from 4K to 8K because it will surpass the 5K barrier. But, that's not the limit of human eyesight. There are those of us with 20/10 vision and better that can discern up to 11K or better. Lots of pilots have "eagle eye" vision in the 20/10 or better range. One can also get better than 20/10 with laser eye surgery.

      You can read up on a decent article about it here:
      http://www.red.co [red.com]
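
One common way to put numbers on the "about 4000 pixels across the field of view" claim, and on the 20/20-versus-20/10 point above, is to assume roughly one arcminute of acuity (about 60 pixels per degree for 20/20 vision) and work out how many pixels span the screen at a given viewing distance. The screen width and distances below are just examples; doubling pixels_per_degree approximates 20/10 vision.

    import math

    def pixels_needed(screen_width_m, distance_m, pixels_per_degree=60):
        """Horizontal pixel count at which adjacent pixels hit the acuity limit.
        60 px/degree is roughly 1 arcminute (about 20/20 vision)."""
        fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
        return fov_deg * pixels_per_degree

    # A 65" 16:9 TV is about 1.43 m wide.
    for d in (1.5, 2.5, 4.0):
        print(f"{d} m away: ~{pixels_needed(1.43, d):.0f} pixels across")

For a 65" set that comes out to roughly 3,000 pixels across at 1.5 m and under 2,000 at 2.5 m, which is why "you can't see past 4K at normal distances" keeps coming up -- and why closer seating, bigger screens, or sharper-than-20/20 eyes change the answer.
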

  • ... there was actually content that needed 8K resolution. Is it possible that watching 'Two Broke Girls' or 'Kevin Can Wait' in 8K will actually make them enjoyable? Maybe having the laugh track accurately positioned in three dimensions will be the must-have feature that makes the new HDMI spec worth the extra money. (Too cynical?)

    • ... there was actually content that needed 8K resolution.

      Sports.

      Watching sports doesn't "need" 8K video. People have been listening to sports broadcasts on the radio for a very long time now, so they don't "need" any video at all. In fact there are numerous profitable sports-only radio channels right now. I suspect that this is largely due to a captive audience of drivers, but I also suspect these same people would like to enjoy their sports in 8K when they are not on the road. With sports there can be a lot happening over a large area. People watching might

  • Ever since HDMI came out I've heard people complain about the connector. It's too big for almost any portable device and has only friction to hold it in place, leading to a common problem of the connection coming loose from just the weight of the cable. I recall it being described by someone as how a computer scientist would solve an electrical engineering problem. Not to call computer scientists stupid or anything; it's just that the basic level of electronic theory required by a computer science degree is in
