New HDMI 2.1 Spec Includes Support For Dynamic HDR, 8K Resolution (techhive.com)
The HDMI Licensing Group has unveiled the HDMI 2.1 spec, adding support for dynamic HDR, 8K60, and 4K120. From a report on TechHive: To take full advantage of the new HDMI spec, you'll need a new 48-gigabit-per-second cable. That cable will also work with older HDMI 1.3 (10.2Gbps) and HDMI 2.0a (18Gbps) ports, but those ports don't support the new HDMI 2.1 features. [...] HDMI 2.1 adds support for the new object-based audio codecs -- such as Dolby Atmos and DTS:X -- which can position audio events from movie soundtracks in 3D space.
Get your HDMI 2.1 Monster cable today, only $89.99 (Score:3)
Get your HDMI 2.1 Monster cable today, only $89.99, and ask a blue shirt about our install deals and our audio systems.
Re:Get your HDMI 2.1 Monster cable today only $89 (Score:4, Funny)
Don't forget your extended warranty, only $49.99 + 10% deductible.
Re: (Score:2)
Get your HDMI 2.1 Monster cable today, only $89.99 ...
It's on sale for 90% off!?
Re: (Score:2)
When you take our home install deal and buy at least 2 of them: one from your TV box to the sound system and one from the sound system to the TV.
Re: (Score:2)
There was someone who tested this at some point and found that people couldn't tell the difference between gold-plated copper wire and a coat hanger in terms of sound quality.
Re: (Score:2)
What about the directional channels for electrons? I hear those help with maximizing picture quality and audio fidelity.
Re: (Score:3)
What about the directional channels for electrons? I hear those help with maximizing picture quality and audio fidelity.
Yep, they make the ones straighter and the zeros rounder. But the cables have to use oxygen-free copper, braided on the thighs of virgins from a third-world country for it to work right.
That work used to be done by little old Italian widows listening to Verdi but then they unionized and things just haven't been the same since.
Re: (Score:2)
https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
Re: (Score:2)
Not true, but that is the sales pitch our old manager used to push for the disc guard plan. With the disc guard plan you could upgrade your Madden to the next year's version under the plan (not true, but cashiers were pushing it hard).
Re: Get your HDMI 2.1 Monster cable today only $8 (Score:2)
Cables made by Monoprice also carry a lifetime warranty, plus free shipping when you make a claim.
Re: (Score:3)
Who needs future-proofing when the cable only costs $4? At the rate new cables come out, it would take Monster Cable basically 50 years to break even with Monoprice.
Re: (Score:3)
Re: (Score:3)
The main advantage of good cables is simple mechanical reliability. But you can get $5-10 cables from Dayton (or in some cases Amazon Basics) that look to be made in the same factory as Monster cables.
Even for digital cables, where shielding and impedance matching barely matter, acceptable quality can get expensive on long runs, since you need a larger wire gauge (or a repeater). Don't buy the cheapest 50-foot HDMI cable, or you'll get one that works most days.
3d space (Score:2)
[...] which can position audio events from movie soundtracks in 3D space
I, for one, welcome this new THREE DIMENSIONAL space. I hate this crappy 2 dimensional world we have been living in our whole lives until now.
Re: (Score:3)
Obviously what they're talking about is breaking individual sounds down into their own channels based on assignment to 3d spatial coordinates rather than having individual channels for some arbitrary "standard" set of speakers coming from certain "standardized" directions. Which is IMHO kind of neat. You could have as many speakers as you wanted and put them wherever you wanted. Or perhaps combine face recognition and ultrasonic directional speakers to give a very precise sound direction to each person w
I'm sure there's a reason... (Score:5, Insightful)
I'm sure there's a reason why someone might want 8K, but I've not even been convinced of the benefit of 4K yet.
Re: (Score:2)
Virtual reality might be a reason. Although we'd need 8K90, not 8K60, and what we'd really want is 8K90 per eye, which comes to 16K90. So expect HDMI 2.1 to be outdated before it becomes available ;)
Re: (Score:2)
Actually, virtual reality needs higher resolutions, but it needs higher framerates even MORE. 60fps is enough to produce fluid video when passively watched (though synthetic video still needs some amount of added motion blur to be really convincing), but it's absolutely NOT fast enough for immersive video. When there's at least 1/60th of a second lag between your hand's motion and seeing your virtual hand move, you'll DEFINITELY notice the lag. 1/60th second lag feels, well... "laggy". 1/30th second lag fee
Re: (Score:2)
VR's an interesting case, if you can incorporate eye tracking. You only need that sort of resolution in the center of view; in the periphery you could update only one in every X pixels in each direction, and have the display just fill in their neighbors with the same colour. Or you could only send the low-frequency components of a compressed data block for peripheral areas, dropping all of the high-frequency data (aka fine detail). Eye tracking wouldn't reduce the number of physical pixels that the scre
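A tiny sketch of the peripheral-subsampling idea described above. Everything here is illustrative -- the function name, the box-shaped fovea, the nearest-neighbour fill -- it's not how any actual headset or display link does it.

```python
import numpy as np

def foveated_fill(frame, gaze_xy, fovea_radius=200, step=4):
    """frame: HxWx3 image. Keep full resolution near the gaze point; rebuild
    the periphery from every `step`-th pixel by repeating it over its
    neighbours (a crude stand-in for 'fill in neighbors with the same colour')."""
    h, w, _ = frame.shape
    coarse = frame[::step, ::step]                                  # subsampled periphery
    out = np.repeat(np.repeat(coarse, step, axis=0), step, axis=1)[:h, :w]
    gx, gy = gaze_xy
    y0, y1 = max(0, gy - fovea_radius), min(h, gy + fovea_radius)
    x0, x1 = max(0, gx - fovea_radius), min(w, gx + fovea_radius)
    out[y0:y1, x0:x1] = frame[y0:y1, x0:x1]                         # full-res fovea
    return out
```

With step=4 the periphery only needs roughly 1/16th of the pixel data, which is the kind of saving that would make 8K-per-eye links less absurd.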
Re:I'm sure there's a reason... (Score:5, Insightful)
For movies, not much. There's definitely a wow factor in some of them but you quickly forget about it and just enjoy the movie.
But for coding and web browsing, I found 4K to be a surprising win.
The extra clarity in text is absolutely wonderful. With low-res screens I often find myself wanting to zoom in on text despite being able to read the small text without straining my eyes. When I got my first 4K screen I noticed I was no longer tempted to do any zooming.
Re: (Score:2)
For movies, not much. There's definitely a wow factor in some of them but you quickly forget about it and just enjoy the movie.
After getting used to higher resolution on my desktop and in games I occasionally get distracted by the resolution of 1080p TV and movies. I get used to it for a while then there will be a scene switch or something and I'll get bothered by it. Not as often as the jitter from low frame rate panning though.
Re: (Score:2)
For movies, not much. There's definitely a wow factor in some of them but you quickly forget about it and just enjoy the movie.
But for coding and web browsing, I found 4K to be a surprising win.
The extra clarity in text is absolutely wonderful. With low-res screens I often find myself wanting to zoom in on text despite being able to read the small text without straining my eyes. When I got my first 4K screen I noticed I was no longer tempted to do any zooming.
I disagree. For movies there is a good bit of difference, especially for 4K content with HDR. Is it a huge difference? Nope. Is it enough of a difference to make it more enjoyable and more lifelike? Yep! Is it worth upgrading just for the new format? Not yet. Not unless you have to (i.e. a broken TV). I was planning on waiting another couple of years to go to 4K, but my plasma died and it was under warranty.
Do you notice the difference? Yes, definitely. For content that is filmed in 4K and uses HDR.
Re: (Score:2)
Exactly. Your monitor goes from looking like a really high-res computer monitor, to looking like the brightly-illuminated output of a laser printer. My Nexus 6P has a ~480dpi screen, and I've gotten so spoiled by razor-crisp text, 200dpi text on a more average tablet now makes my eyes feel like they're bleeding.
I actually bought a Chuwi Hi12 (2160x1440 12" display) JUST to use for ebook reading, because I couldn't stand to read ebooks on my older tablets anymore. It's a little too heavy to use as a "tablet"
Re: (Score:2)
For cinema size screens 4k is a bit crap. Looks pixelated to me, which is not surprising when you consider the low DPI.
Re:I'm sure there's a reason... (Score:4, Informative)
That's funny since most printed text is printed at like 72dpi and nobody complains that printed text is pixelated or "unclear." The human eye isn't that good. What you are loving isn't resolution related -- it's the better backlight giving you better blacks than what you had on old 1080 monitors.
Found the guy who's never worked on a 4K monitor. Try it for a while. Going back to 1080p, it looks like the text was rendered with a circular saw. I have a very new, high-end 27" 144Hz gaming monitor. I love it for games, but I also have two 27" 4K monitors on the PC I use for work and the difference is extremely noticeable. Text and lines are razor sharp.
Not to mention all that extra screen real estate!
Re: (Score:2)
I used 4K on a work Lenovo ThinkPad P50. It drove me nuts when programs didn't fully support it! 4K is not quite ready for computing. :(
Re: (Score:2)
Bwahaha. Yeah, back when we were using MX-80 dot matrix printers that was the case. Most laser printers are 300-600 dpi, and some are higher. Inkjet is usually several hundred.
Re: (Score:2)
I'm looking for a reasonably priced 5K monitor. A 27-inch 5K screen seems to be the sweet spot. A standard 27" screen is 2560x1440, and 5K is exactly twice that in each dimension, so you can use 200% scaling.
The ideal screen size for 4k is 24", because a 1920x1080 24" monitor is close to the standard 96 DPI.
There were some okay deals on 5k last month, but nothing spectacular. Hopefully this year though.
Re: (Score:2)
That's funny since most printed text is printed at like 72dpi and nobody complains that printed text is pixelated or "unclear." The human eye isn't that good. What you are loving isn't resolution related -- it's the better backlight giving you better blacks than what you had on old 1080 monitors.
Uh, no, you're thinking of PPI on displays. Even cheap printers can usually handle at least 300x300 DPI. Most laser printers I've seen default to 600x600, and even ones marketed for home use are often capable of at least 1200 in at least one dimension. Many inkjets also are able to increase their DPI for "high quality" or photo printing.
Let's take an average computer monitor, however--say a 22" monitor with 1680x1050 resolution. This gives about 90 PPI (or DPI as it's more often called here, even though som
Re: (Score:3)
Early adopter of 4K here. 8K is for the uselessly rich, bleeding-edge adopters. 4K is still fresh, but no longer bleeding edge or early now, in my opinion.
Short story: there is already some benefit, but not much yet. So unless you are itching for new tech, hold off on springing for that new TV just yet.
I currently have a Samsung 4K UN65KS8000.
It was selected for the low input lag and the fact that I could get it for $1500.
I previously used a Vizio D55-D2 4K.
Both do very well on input lag in PC/Game modes. The S
Re: (Score:2)
Everyone here saying 8K is useless was making the same arguments about 4K.
I'm standing now, in front of a 55" 4K monitor, two feet from my face. It's freaking beautiful for playing Chivalry, or having multiple remote desktops running at once.
AMD and nVidia have BOTH come out and said 8K and 16K will approach the limits of the human eye and still be useful. LG has just patented a process for their 11K TV because they say it "feels 3-D" without actually using any goggles.
I just watched Jungle Book in 4K on and it wa
Re: (Score:2)
I've also started shooting my skydiving videos with a 4K GoPro, and even when viewing the videos at 1080, the difference is pretty amazing. You can ac
Re: (Score:2)
I'm sure there's a reason why someone might want 8K, but I've not even been convinced of the benefit of 4K yet.
Well on my UHD monitor I can see the difference between a 3840x2160 crop of a photo and the same photo resized to 1920x1080 and back, though it's not huge. If I use a really stupid upscaler to simulate a 1920x1080 screen it's even more obvious. But if I need it to watch TV... not so sure. But there's UHD the resolution and there's UHD the format with HDR, Rec.2020, 10 bit color etc. which all together is a pretty big improvement over BluRay. Going to 8K is probably going to be like 96KHz/24 bit audio, it mi
Re: (Score:2)
I'm sure there's a reason why someone might want 8K, but I've not even been convinced of the benefit of 4K yet.
Video walls. Like Barney Stinson had!
Or projectors. Amazing how people forget those exist.
Probably for the benefit of movie studios (Score:5, Interesting)
HDCP 2.2 was broken in late 2015 [wikipedia.org]. Not sure if it was cracked or someone just made a device using a legit HDCP 2.2 decryption key. But if it was cracked, we're probably going to go through all this again. Hollywood will insist on a new, non-backwards-compatible HDCP 2.3, and it will be used to encode all future Blu-rays starting with 8K releases.
They also enjoy double- or triple-dipping: charging you full price for a license to the same movie in different formats. Same with the record labels, who had no qualms about charging you for the same song on vinyl, tape, and CD. The software industry gets this right -- they let you upgrade at a discounted price if you own a previous version. This reflects the reality that you already purchased a license for the previous versions, and thus the new version is only giving you some new functionality instead of entirely new functionality. But Hollywood has deluded itself into thinking that its product is a license when it's convenient for it to be a license, and a product when it's convenient for it to be a product. So they will charge you full price even if you've already purchased licenses for the movie three times at 360i (VHS), 525i (DVD), and 1080p.
People need to stop putting up with this crap and demand lower-price upgrade licenses for content they've already paid for. IMHO a lot of piracy would disappear if the studios simply adopted pricing which better reflected reality. Most people want to pay content creators for their work, but not if they judge that the content creators are trying to rip them off. The whole fiasco with Windows XP support contracts is a great example. Microsoft pushed support contracts for XP hard and lots of companies signed up. Instead of buying XP, they were buying 3 years of Windows support, which would include XP and an upgrade to the next version of Windows (new versions normally come out about every 3 years). Unfortunately Vista got delayed and wasn't released until 5.5 years after XP - outside the support contract period these companies had paid for. There was hell to pay, with many companies believing Microsoft deliberately delayed Vista so they wouldn't have to fulfil that portion of the contract. Even though Microsoft eventually relented and gave these companies Vista, many of them will never buy a support contract or subscription software from Microsoft again. Because they judge it to be unfairly skewed in favor of the supplier.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The only reason I got the 4k is that it was damn near the same price as the 1080.
I recently got a 48" 4K HDTV for $350 at Costco during the holidays to replace a $200 26" CRT that I got 12 years ago. That's a big step up. The difference between 1080p and 4K, not so much.
Re: (Score:2)
With a 55" TV, the biggest benefit comes from having 4K **content** and a good scaler. By the time 2160p24 gets compressed down to something Netflix can handle, you MIGHT end up with the PQ of real, minimally-compressed 1080p24. Basically, 4K shaves away the shittiness that has gradually crept into mainstream "2k" HD video.
For an illustrative example, think about a non-HDTV with s-video input and either cable or satellite TV. You can feed it a "SD" channel, or you can feed it a downrezzed "HD" channel. The
Re: (Score:2)
You're blind and deaf too, apparently. Ignorance is bliss I guess. While the jump isn't as impressive as 240i to 1080p, it's still there. The bit about audio is certainly not true.
Re: (Score:2)
Re: (Score:3)
Like fuck there aren't. If you can stand in front of a 1080p and a 4K screen (55" or even smaller) with the same demo showing on both and not see a huge difference, then you are blind or have some sort of brain problem.
+1 for this. Every time someone says they can't tell the difference between 1080p and 4K, I think to myself "Just how bad are that mofo's eyes???" There's a huge difference!
They had an 8K TV set up at my local Best Buy on an 18-wheeler, and it actually fooled most of us into thinking it was a window to the outside of the trailer before they told us it was a TV. It literally looked like a piece of glass to the outside. They had it turned sideways and put a wooden border around it to enhance the effect.
Re: (Score:2)
Apparently you don't have binocular vision. If someone puts a high-resolution print of a tunnel on a wall, do you try to run into it roadrunner-style?
Re: (Score:3)
Binocular human vision is only useful for 3D up to about 6 meters away (roughly 20 ft). Objects farther than 6 meters appear too similar in position to both eyes to discern distance accurately, so the brain uses other cues for reference. The average distance between human eyes is only about 6 cm, and the parallax is such a small angular difference beyond 20 ft that the brain really can't tell the difference. That's partly why piers in the distance always seem so close, you could walk to them... and then 30 m
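As a rough sanity check on the 6 cm / 20 ft figures above, here's the parallax angle at a few distances. The 6 cm interpupillary distance is the only input; what angular difference people can actually perceive is a separate question and isn't modeled here.

```python
import math

IPD = 0.06  # assumed distance between the eyes, in metres
for d in (1, 2, 6, 20, 100):  # object distance in metres
    # Angular difference between the two eyes' lines of sight to the object
    parallax_arcmin = math.degrees(2 * math.atan((IPD / 2) / d)) * 60
    print(f"{d:>4} m: {parallax_arcmin:6.1f} arcminutes of parallax")
```

Past a few metres the disparity keeps shrinking roughly as 1/distance, which is why the brain has to fall back on those other depth cues.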
Re: (Score:2)
The resolution of the human eye [wolfcrow.com] is measured in arc seconds, not pixels. Someone with 20/20 vision cannot differentiate 1080p from a higher resolution 50" T.V. when seated 6 feet away. T
Re: (Score:2)
wolfcrow made a mistake in that blog, which was pointed out a few times in the comments. A 50" TV seated 6' away @2K only has a dpi of ~38dpi, well shy of the 100dpi required. A 65" TV seated 6' away @4K still only has 67dpi. A 65" TV seated 6' away @8K has 135dpi, which, based on his calculations done correctly, would be indistinguishable. Of course that still makes a number of assumptions, like that you have average eyesight, and you are sitting 6' away from your TV, eyesight towards the center of the e
Re: (Score:2)
Sorry, that fell short of the 120 dpi required, not 100. A 65" 8K TV at 6' is just about right (at 135dpi).
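For anyone who wants to redo the arithmetic in this sub-thread, here's the back-of-envelope version. It assumes a 16:9 panel, computes PPI from the diagonal pixel count, and reports how big one pixel looks from 6 feet; which angular threshold you then compare against (1 arcminute, 0.5, whatever) is a separate assumption.

```python
import math

def ppi(diag_in, px_w, px_h):
    # pixels per inch along the diagonal of a flat panel
    return math.hypot(px_w, px_h) / diag_in

def arcmin_per_pixel(diag_in, px_w, px_h, dist_ft):
    # angular size of a single pixel from the given viewing distance
    pixel_in = 1 / ppi(diag_in, px_w, px_h)
    return math.degrees(math.atan(pixel_in / (dist_ft * 12))) * 60

for label, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
    print(f'{label:>5}: {ppi(65, w, h):5.1f} ppi on a 65" panel, '
          f'{arcmin_per_pixel(65, w, h, 6):.2f} arcmin per pixel at 6 ft')
```

On a 65" panel at 6 feet this gives roughly 1.4 arcminutes per pixel for 1080p, 0.7 for 4K, and 0.35 for 8K.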
Re: (Score:2)
I've never stood by a 4K and 1080 side by side. Might I be able to tell the difference then? Maybe, without having experienced it, I don't know.
I do know if I look at a random TV, I couldn't tell you if it were 4K or 1080 (without looking at specs). If the difference isn't big enough to notice without having two sets next to each other, I don't care to pay the extra.
Granted, my eyesight actually isn't all that great!
Re: (Score:2)
+1 for this. Every time someone says they can't tell the difference between 1080p and 4K, I think to myself "Just how bad are that mofo's eyes???" There's a huge difference!
I can set up a showroom so that you'd totally see the difference and be willing to pay for it, even though both screens were secretly identical. Never trust what you see in a showroom.
For the home, it all comes down to viewing distance. With 4K you can see a difference, but you have to be closer than most people find comfortable. With my 65" screen I'd have to be within 8 feet. But that's a legitimate use case for plenty of people, especially for gaming and for PC monitors.
To see the difference between 4K and
Re: (Score:2)
HDR alone makes it worth it to get a 4k set.
While 4K with HDR provides a definite improvement over 1080p, 4K doesn't necessarily include HDR. If you want to argue that 4K with HDR is markedly better than 1080p, I'll heartily agree. If you want to make the generalization that 4K (no mention of HDR) is better for everyone than 1080p, as you did, I'll suggest you've overstated your side.
If you can stand in front of a 1080p and a 4K screen (55" or even smaller) with the same demo showing on both and not see a huge difference, then you are blind or have some sort of brain problem.
So, a few things:
A) Demo sets don't matter. Big box retailers know that when people are given the choice between two otherwise identical images, they'll choose a brighte
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Arguably correct.
Ludicrously wrong. Even 256 levels of greyscale give a painfully obvious stepping effect in a gradient. Even 16 million colors (256 R, 256 G, 256 B), while quite acceptable for most purposes, is readily detectable as deficient compared to 1 billion (1024 R, 1024 G, 1024 B).
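A quick way to see why the stepping effect shows up: spread a full-range grey gradient across one screen width and look at how wide each band of identical pixels has to be. The 3840-pixel width is just an example, and real content also gets dithering, which this ignores.

```python
width_px = 3840  # one UHD screen width, as an example
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit greyscale: {levels} levels -> bands about "
          f"{width_px / levels:.1f} pixels wide")
```

With 8 bits you get roughly 15-pixel-wide bands of identical grey, which is exactly the visible banding being described; 10 bits shrinks that to under 4 pixels.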
Re: (Score:2)
Why 8k? (Score:2)
Re: (Score:2)
Standard theaters don't need higher resolution, in part because nobody likes sitting close enough to the screen (the first few rows) that the resolution is readily noticeable, and outside of the large IMAX theaters, the average screen-size-to-seating-distance ratio isn't big enough to justify the cost of higher-resolution projectors.
Possible 8K Uses:
* Large, High Resolution Desktop monitors for artists, as well as for developers that like densely packed screens. I'd gladly trade my current multi-display setup for somet
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
8K buys you nothing when human visual perception is limited to 4K! "Duh, bigger numbers better!" appears to be the new marketing mantra.
But... From what I can find, some estimates are that the human eye can see up to 15 million pixels. 4K is only about 8.3 million pixels. So there is some benefit to going to 8K, at about 33 million pixels, as it finally exceeds what the eye can theoretically handle. If you cut that in half to provide for 3D, you still have two pictures of over 16 million pixels each, which is in the range where the eye supposedly can't distinguish computer-generated imagery from reality.
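For reference, the raw pixel counts (nothing assumed here beyond the standard frame sizes):

```python
# Pixel counts of the common broadcast/UHD resolutions
for name, w, h in [("1080p", 1920, 1080), ("4K UHD", 3840, 2160), ("8K UHD", 7680, 4320)]:
    print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} million pixels")
```

Each step is a clean 4x jump: about 2.1 million, 8.3 million, and 33.2 million pixels respectively.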
Re: (Score:2)
Since most theaters only project at 2K and 4K barely has a foothold in the home market, why? It seems like a new standard this early is really a bad idea. Some better compression technology could come along that will make this standard obsolete before it goes into consumer production. The only reason I can think of is a non-existent 3D technology.
I am really surprised people are talking about cinemas with 2K projectors; that is some cheap shit, not even an upgrade from the analog film that came before. All the cinemas I frequent have either analog (for old stuff and indie) or 4K digital. Still, 8K digital would be nice, though I would prefer movies being shot at something faster than 24fps; that is a much better way to add visual detail that movie-goers can reliably perceive.
Re: (Score:2)
You know what, fuck it... Let's just go to 1024GigaK
Re: (Score:2)
A MILLION 80 P !?!? https://www.youtube.com/watch?... [youtube.com]
What about closed captioning? (Score:2)
But will the standard allow transmission of traditional closed captioning embedded in the video signal? Useful when the source doesn't provide open captioning or uses a crappy font.
Re: (Score:2)
But will the standard allow transmission of traditional closed captioning embedded in the video signal? Useful when the source doesn't provide open captioning or uses a crappy font.
If the source doesn't provide them I don't see why it helps. Likewise if they're encoded in the video with a crappy font you're out of luck. But if they're proper closed captions, then you should be able to change the font in your media player.
All new stuff? (Score:2)
Re: (Score:2)
They convinced us to do just that several times with our music collections. I'm sure the media companies are thinking "Why wouldn't they do it to watch movies at home?"
To take full advantage of 8K ... (Score:2)
Re: (Score:2)
Yet another standard (Score:3)
48-gigabit-per-second cable
The problem HDMI solves is the problem of shuffling data from one device to another. We've had 100 Gbit/s Ethernet for years now, and that solves the exact same problem. USB and Thunderbolt also solve the same problem, but provide DC power on top of it. TVs are basically small computers at this point; there's absolutely no reason they need a specialized port just to receive data.
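For context on why the link needs 48Gbps at all, here's the raw (uncompressed, full-RGB) bandwidth of the headline video modes. This ignores blanking intervals and link-layer overhead, so it's a rough approximation rather than the spec's own arithmetic.

```python
def raw_gbps(width, height, fps, bits_per_channel=10, channels=3):
    # width*height pixels, `channels` colour components per pixel,
    # `bits_per_channel` bits each, `fps` frames per second
    return width * height * fps * bits_per_channel * channels / 1e9

print(f"4K120, 10-bit RGB: {raw_gbps(3840, 2160, 120):.1f} Gbps")
print(f"8K60,  10-bit RGB: {raw_gbps(7680, 4320, 60):.1f} Gbps")
```

4K120 lands around 30 Gbps, while 8K60 in full 10-bit RGB is already near 60 Gbps -- past the 48 Gbps link -- which is presumably why the higher 8K modes lean on 4:2:0 chroma subsampling or stream compression.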
Re: (Score:2, Insightful)
Really? 100Gbps ethernet? Maybe you should go look at the specs again. ONE Gbps is common at home, 10Gbps is common only in enterprise environments, 100Gbps is not common at all and usually requires fiber optics.
Re: (Score:2)
Re: (Score:2)
A 40Gbps IP-over-HDMI link sounds nice at consumer AV equipment prices.
You mean like how I can get IP-over-Thunderbolt at 40Gbps now? I don't have to wait 2 years for HDMI 2.1 to do what I've been doing for months already.
I'm not impressed.
Re: (Score:3)
Re: (Score:2)
This.
(Too bad I already commented and forfeited my moderation points.)
Re: (Score:2)
It's actually only an additional $50/month for unlimited bandwidth, or free if you have Comcast Gigabit Pro (2Gbps symmetrical).
Re: (Score:2)
Funny; there's no mention of any offering anywhere near that where I am. I could get 150/10, but it adds so much cost and my need is so slight that I stick with the 25/5.
Re: Yet another standard (Score:2)
For gigabit internet, http://xfinity.com/gig-offer if you live in a city they have rolled it out to, but as I said, you should still be able to get unlimited for an additional $50 on whatever plan you have. Check http://dataplan.xfinity.com/fa... [xfinity.com]
Re: (Score:2)
You underestimate the bandwidth of envelopes filled with Blu-ray discs sent by post.
As I recall Netflix still rents out discs by post. If 4K and 8K become popular then services like this might become popular as well.
Re: (Score:2)
Don't understate just how impressive 48Gb/sec over a cheap $5 cable with cheap electronics is.
100G Ethernet is fairly expensive, but at least it manages 7m over copper. USB 3.1 only goes up to 10Gb/sec, with a max 5m cable. Thunderbolt 3 tops out at 40Gb/sec, its power delivery is less than 10W, and its maximum copper cable length is 3m. Optical can go further but is more expensive to implement, and consumers don't seem to like optical cables for some reason.
As for why not just use an existing port, the answer is
Re: (Score:2)
The only thing still missing from HDMI is power to drive HDMI-attached media dongles. Maybe with HDMI 3.0 ...
It looks like MHL solved this problem by borrowing the HDMI connector, using the MHL protocol, and upping the current on some pins for power. The power isn't much, 5 watts is all, but enough to run a Chromecast or something like it.
A *consumer* 48Gbps cable is a pretty damn amazing creation, especially being backward / forward compatible with old ports / old cables.
You mean like how USB-C can do 40Gbps, provide 100W of power, and is backward/forward compatible with USB 2.0, MHL, DisplayPort, Thunderbolt, and (funnily enough) HDMI?
I'm not impressed. 48Gbps may be larger bandwidth than Thunderbolt, SuperMHL, or DisplayPort but not hugely s
When are we going optical? (Score:3)
Can we please bite the bullet? We survived the transition to HD. Remember when a plain 1080i TV was 8 grand? People still pay $100 for digital Monster cables.
Don't tell me lasers are that expensive, and yes, I do understand about the frequencies. But plain red lasers used to cost $200 and now you can get them at the 99 cent store.
When are we going to transition over to optical? Why are the powers that be holding us back?
Re: (Score:2)
It's obvious if you look at the way people treat their cables. You can't expect them to obey something as esoteric as bend radius limits when their plain old copper wires hardly survive in one piece.
Incidentally, S/PDIF isn't doing too great these days, which is a shame. One of my old laptops from 2005 had optical audio output, and it was awesome especially given the poor quality of its analog output. Since then, this feature has been missing from most laptops, and even with desktop mobos you have to be
Re: (Score:2)
Incidentally, S/PDIF isn't doing too great these days, which is a shame. One of my old laptops from 2005 had optical audio output, and it was awesome especially given the poor quality of its analog output. Since then, this feature has been missing from most laptops, and even with desktop mobos you have to be careful.
Current systems can generally output S/PDIF digital audio through the line-out port; it's a standard feature, though somewhat hidden. You just need to connect an RCA adapter (use the right/red channel) and enable the S/PDIF output switch in the sound card settings. Audio quality is the same as Toslink (optical S/PDIF), though the signal may attenuate over very long coax links. There are devices like this one [a.co] available which convert from coax to Toslink.
It seems since HDMI came out, you shouldn't need any other way of getting raw digital audio, which seems especially silly with something like 5.1 or better...
Unfortunately, S/PDIF doesn't support multichannel PCM;
Re: (Score:2)
Optical is more expensive than copper, both for the cable and for the transceiver on either end. It's less flexible too, doesn't like being bent too much, especially if it is cheap.
Sure, cheap lasers are cheap, but they also suck. To get the kind of speed required (48Gb/sec) from fibre it needs to be multi-mode, i.e. you need to send multiple optical signals down it. The diodes that generate and receive the signals simply can't switch fast enough to do it with one stream. So you need some expensive hardware
Re: (Score:2)
That TOSLINK standard must be very limited: it has enough transfer rate for only slightly above 1x CD speed -- it was invented for 16-bit 48kHz stereo as far as I know.
Since then there have been eleven thousand encoding standards for surround, 5.1, ultra surround, 7.1, and tons of standards with "H", "X" or "S" in them, because there's not enough bitrate for 8 channels of sound.
So it's more akin to infrared serial ports or weekend projects made by amateurs. It does not have the about 20,000x bigger bandwidth need
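The arithmetic behind that, roughly: plain PCM bit rates, ignoring S/PDIF framing overhead. The 7.1-channel 24-bit/96 kHz case is just an illustrative "modern surround" configuration, not any particular standard.

```python
def pcm_mbps(channels, bits, rate_hz):
    # uncompressed PCM bit rate in megabits per second
    return channels * bits * rate_hz / 1e6

print(f"2.0 ch, 16-bit, 48 kHz: {pcm_mbps(2, 16, 48_000):.2f} Mbps")
print(f"7.1 ch, 24-bit, 96 kHz: {pcm_mbps(8, 24, 96_000):.2f} Mbps")
```

So uncompressed stereo is about 1.5 Mbps, while eight channels of high-resolution PCM is over 18 Mbps -- which is why multichannel audio over S/PDIF has always meant a compressed bitstream rather than raw PCM.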
DTS:X and Atmos already out. (Score:2)
Not sure what the article is referring to, we've had DTS:X and Atmos enabled Blu-rays and receivers for quite a while already...
Re: (Score:2)
Not sure what the article is referring to, we've had DTS:X and Atmos enabled Blu-rays and receivers for quite a while already...
That's likely just a bit stream which has no meaning per the spec. I think the spec is allowing for a higher-level way of sending that sort of audio, such that you could decode Atmos into some number of PCM streams placed in 3D space vs an Atmos bit stream. Not really sure how that helps anyone.
I know my AppleTV decodes surround sound into straight PCM surround which it sends to my receiver. I guess this would be similar, but with N spatial channels.
Funny thing about 8K... (Score:5, Insightful)
8K would be fantastic at my office. (Score:2)
For TV and movies, perhaps, but I'm still waiting for it for desktop use. And, by desktop, I mean like a Surface Studio with a 48-50" monitor for working on full size E architectural prints. I may not be able to see pixels on more than a portion of the screen, but there's no bigger productivity killer than having to constantly scroll around a print looking through a little "window" onto the page. Right now I use a pair of 42" 4k monitors which is good for a D size drawing at nearly 1:1. Even so, at my norm
Re: (Score:2)
Re: (Score:3)
IMO, having the HDMI spec support 8K is a good move to standardize video transfer.
In my experience the HDMI standard is already dead or dying. I'd much rather see a more common connector used if backward compatibility is required. I'd also rather see a better-designed connector than HDMI. HDMI is friction-fit and heavy, meaning the connector can work itself loose under its own weight. DVI doesn't have this problem (screws), USB-C doesn't have this problem (small and light), and neither does DisplayPort (locking tab).
I've seen SuperMHL announced a year ago, not that you'll fin
Re: (Score:2)
Not really. For 20/20 vision ("good" eyesight), the limit is closer to 5K, so most everyone will notice the difference from 4K to 8K because it will surpass the 5K barrier. But, that's not the limit of human eyesight. There are those of us with 20/10 vision and better that can discern up to 11K or better. Lots of pilots have "eagle eye" vision in the 20/10 or better range. One can also get better than 20/10 with laser eye surgery.
You can read up on a decent article about it here:
http://www.red.co [red.com]
Now if only... (Score:2)
... there was actually content that actually needed 8K resolution. Is it possible that watching "Two Broke Girls" or "Kevin Can Wait" in 8K will make them enjoyable? Maybe having the laugh track accurately positioned in three dimensions will be the must-have feature that makes the new HDMI spec worth the extra money. (Too cynical?)
Re: (Score:2)
... there was actually content that actually needed 8K resolution.
Sports.
Watching sports doesn't "need" 8K video. People have been listening to sports broadcasts on the radio for a very long time now, so they don't "need" any video at all. In fact there are numerous profitable sports-only radio channels right now. I suspect that this is largely due to a captive audience of drivers, but I also suspect these same people would like to enjoy their sports in 8K when they are not on the road. With sports there can be a lot happening over a large area. People watching might
Why keep the crappy connector? (Score:2)
Ever since HDMI came out I've heard people complain about the connector. It's too big for almost any portable device and has only friction to hold it in place, leading to a common problem of the connection coming loose from just the weight of the cable. I recall someone describing it as how a computer scientist would solve an electrical engineering problem. Not to call computer scientists stupid or anything; it's just that the basic level of electronic theory required by a computer science degree is in
Re: (Score:2)
Unless you, y'know, disable overscanning...
Re: (Score:3)
Re: (Score:2)
I have a few guesses. First is that Ethernet is bidirectional; video doesn't need that. Sure, there is some uplink data, for things like telling the source device the supported resolutions, key exchange for DRM, remote control signals, and so on, but those are all very low bandwidth compared to the video; it doesn't take 100Gbps for that to work. Second is overhead. Ethernet frames carry data telling the destination device things it does not need to know to display video. Ethernet is a bus but thi