New HDMI 1.4 Spec Set To Confuse
thefickler writes "HDMI Licensing LLC, the company that determines the specifications of the HDMI standard, is set to release the HDMI 1.4 spec on 30 June. Unfortunately, it could very well be the most confusing thing to ever happen to setting up a home theater. When the new cables are released, you're going to need to read the packaging very carefully, because effectively there are now going to be five different versions of HDMI to choose from — HDMI Ethernet Channel, Audio Return Channel, 3D Over HDMI, 4K x 2K Resolution Support and a new Automotive HDMI. At least we can't complain about consumer choice."
HDMI Ethernet (Score:5, Insightful)
• HDMI Ethernet Channel
"The HDMI 1.4 specification will add a data channel to the HDMI cable and will enable high-speed bi-directional communication. Connected devices that include this feature will be able to send and receive data via 100 Mb/sec Ethernet, making them instantly"... OBSOLETE
Thanks for coming out.
Yah but (Score:5, Insightful)
Are they gold plated?
The TV manufacturers are simply screwing themselves over. They're dreaming. The new standard is going to be a computer screen attached to a PC streaming from YouTube or similar.
Re:what was wrong with DVI? (Score:1, Insightful)
DRM. DVI has no DRM.
I'm a geek, but... (Score:5, Insightful)
Now, as far as cabling goes, I suspect most of this is driven by a marketing department. If you look at computer display technology, which has been in rapid flux for at least 20 years, they've managed to standardize on TWO different connectors: one for analog and one for digital. Sure, there are some weirdo ones out there, like ADC and 13W3, but they never had any real relevance. But with TVs, which are ostensibly simpler than computer displays, we have this panoply of cables. Why?
Now, Cat5e -- that's an impressive technology. The data rates people have been able to squeeze out of plain ol' twisted pair! But seriously: we do everything in software now. Why does television insist on having cable after cable to do functions that we could do with a single one?
Great (Score:5, Insightful)
This is the 11th revision of the HDMI specification in the less-than-seven-year life of HDMI. Meanwhile, the 22-year-old VGA connection still works fine, at full HDTV resolution, and with none of the incompatibility or usage restrictions (DRM) that HDMI brings to the table. Um, progress?
Re:HDMI Ethernet (Score:5, Insightful)
Well, for starters, 1080p (keep in mind this involves "raw" devices, not sending an MPEG-4 stream down the line) uses just shy of 1.5 Gbps.
We can follow that up with "anyone not using wireless already upgraded to gig-E switches about five years ago".
We can then finish it off with one of my favorites (actually not, but in this case it really does serve the described need) - Any attached devices needing bidirectional communication can use plain ol' ubiquitous USB. And really, do my speakers actually need to talk back to my receiver under any even remotely plausible scenario that doesn't scream "DRM, mother fucker, do you speak it?"
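For the curious, that figure checks out as a back-of-the-envelope calculation if you assume 30 full frames per second at 24 bits per pixel (assumptions mine; at 60 fps the raw rate doubles to roughly 3 Gb/s, and the actual HDMI link rate is higher still once blanking and TMDS encoding overhead are included):

    # Raw (uncompressed) 1080p bandwidth, back-of-the-envelope
    width, height = 1920, 1080
    fps, bits_per_pixel = 30, 24
    raw_gbps = width * height * fps * bits_per_pixel / 1e9
    print(f"{raw_gbps:.2f} Gb/s")  # ~1.49 Gb/s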
Re:I'm a geek, but... (Score:3, Insightful)
Er, because a 42-inch computer flat panel will cost far more than a 1080p LCD of the same size?
Here in Oz you'll pay AU$1,000 for a 30-inch monitor. That cash will get you a 42-inch 1080p plasma. Heck, 1080p is fine for couch computing. Can you even get 42-inch monitors?
Also, the TV will most likely have tuners built in. You and I run a media center; most people do not.
And for your final question: do you really want average consumers wrestling with Ethernet and a TCP/IP stack just to get a signal from their 'video player' to their telly?
Simple example: what about networked players? Do you add a switch into the unit and put both the TV and the player on the same subnet/VLAN? Or do the player and TV each get their own subnet, with the unit acting as a router? Or should the TV? Which address space should it pick? What if it clashes with the existing network? What if there are duplex negotiation bugs/issues? Why the ---- would you want to deal with any of those potential issues rather than: cable goes in, select the channel, it works out of the box?
Please don't start talking about MythTV and streaming the same input to multiple outputs or muxing channels, etc. For most people, THAT'S COMPLETELY IRRELEVANT.
Sorry, it's a bit of a silly question IMHO.
Why not just use Ethernet? (Score:3, Insightful)
Forgive me for not having kept up with the progress of HDMI, but wouldn't it have made infinitely more sense to have simply used gigabit Ethernet for all this? The data is all digital anyway, and networking technology is quite mature, so why did these folks feel the need to reinvent the wheel? Right now, you have to worry about whether your new TV will have enough HDMI inputs for the devices you have or might get later, or you need to get an HDMI switcher. With Ethernet, you just connect everything to a switch or router, and you're all set. One connection per component is all you need, and, if you use a router, everything immediately gets connectivity to the home network or Internet. And if a new component comes out that needs to talk to another component in a different way or using more bandwidth, that can all be handled in the firmware. As long as you don't flood the local network with more data than it can handle, everything is fine, and the rest of the networked devices, including the router and cables, can stay exactly the same.
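(For what it's worth, here's a rough budget check on that last caveat; the bitrates below are ballpark figures I'm assuming, not measurements, but the point stands that compressed AV streams are small next to gigabit Ethernet:)

    # Ballpark check: several compressed HD streams fit easily in 1 Gb/s
    GIGE_MBPS = 1000
    streams_mbps = {
        "Blu-ray (max total AV bitrate)": 48,
        "HD broadcast (MPEG-2)": 19,
        "Internet HD stream (assumed)": 8,
    }
    total = sum(streams_mbps.values())
    print(f"{total} of {GIGE_MBPS} Mb/s used "
          f"({100 * total / GIGE_MBPS:.0f}% of one gigabit link)")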
Or did someone in the entertainment industry worry that using Ethernet for connecting entertainment devices would make it too easy for those evil hacker types to connect a computer to the setup and break their DRM? Or maybe that if this gear was too easily networked, we might...GASP!...use it to send video from our Internet-connected computers out into the living rooms, undermining traditional TV?
Why bother. Just use component video (Score:4, Insightful)
It amazes me how much the proles gobble this shit up when *gasp* analog component video is perfectly capable of handling high-bandwidth video without all the incremental upgrades to a poorly-thought-out spec. Remember, a VGA cable (not quite as good as separate coax) is able to carry higher resolutions and refresh rates than 1080p/60, and it could all be achieved on an early-to-mid-'90s PC with a high-end video card.
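A rough check on that claim, comparing active-pixel clocks (blanking intervals ignored, so real VESA/CEA timings run higher; 1080p60's full pixel clock is 148.5 MHz). The 1600x1200 at 75 Hz mode here stands in for a high-end '90s CRT setup:

    # Active pixel clock in MHz, ignoring blanking intervals
    def active_pixel_clock_mhz(w, h, hz):
        return w * h * hz / 1e6

    print(active_pixel_clock_mhz(1920, 1080, 60))  # 1080p60 -> ~124 MHz
    print(active_pixel_clock_mhz(1600, 1200, 75))  # '90s CRT mode -> ~144 MHz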
Re:Yah but (Score:4, Insightful)
Worst prediction ... ever ...
Re:Why not just use Ethernet? (Score:5, Insightful)
To sell more wheels.
Re:Why bother. Just use component video (Score:3, Insightful)
As I recall, early HD gear used just that, but the powers-that-be got worried that component video didn't do DRM, so those nasty evil pirates would have a far too easy time copying the video. So they decreed that any device that could output 1080 must do it only via HDMI, which supports HDCP. The side effect was that the early adopters who spent the really big bucks for HDTV sets got royally screwed, since no new gear that outputs video above 720p would connect to them. No 1080i/p cable or sat boxes, and no Blu-ray or HD-DVD. So you paid to get this tech first? Sucker!
Re:Why not just use Ethernet? (Score:2, Insightful)
BTW, I laugh whenever HDTVs are referred to as 'hi-def'. Lulz, our PCs have had better resolution for years and years.
Blu-ray is the new LaserDisc.
Re:Does anyone get that sinking feeling about HDMI? (Score:4, Insightful)
I thought six bucks for a fifteen-foot cable was quite reasonable. You're not paying the extortionate prices for cables in the retail store, are you?
As for wireless HDMI: no, no, no. That's just what we need, some huge bandwidth hog spewing unnecessary interference all over the little bit of spectrum we've got, just because you find plugging one end of a cable into your Blu-ray player and the other into your TV too confusing. Save the wireless for things that actually benefit from being wireless.
Re:Great (Score:1, Insightful)
It does? I don't see any mention of that in the cited article or in the Wikipedia article on HDMI. AFAIK, HDCP is still a sitting duck if anyone reverse-engineers 39 device keys. And it probably has a few unpublished holes besides.
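For background on why a few dozen device keys would sink it: HDCP 1.x key agreement is a Blom-style linear scheme. Here's a minimal sketch of just the shared-secret step (not the full protocol; real key values would come from a device's issued key set):

    # Each device holds 40 secret 56-bit keys plus a 40-bit KSV with
    # exactly 20 bits set. The shared secret is the sum, mod 2^56, of
    # your keys at the positions set in the peer's KSV. Because this is
    # linear, enough independent leaked key sets let an attacker solve
    # for the master secrets.
    def hdcp_shared_secret(my_device_keys, peer_ksv):
        total = 0
        for bit in range(40):
            if (peer_ksv >> bit) & 1:
                total += my_device_keys[bit]
        return total % (2 ** 56)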
Because it doesn't look as good (Score:2, Insightful)
Modern displays are all digital; LCD and DLP are both digital technologies. The image source is, of course, digital as well, whether it's compressed MPEG-4 off a Blu-ray or a generated image off a 3D card. So with analog connections you go from digital to analog and then back to digital, and that loses quality, especially at high resolutions and colour depths. It is hard to build an ADC that does a really good job on a 100 MHz, 10-bit signal.
Try it with a good LCD (as in a high end one, not a $200 cheapie) sometime. Hook up DVI and VGA to the computer and switch back and forth. You'll get a better picture with DVI.
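If you want to see the effect in miniature, here's a toy model of that digital-analog-digital round trip; the noise level standing in for cable loss and ADC jitter is invented purely for illustration:

    import numpy as np

    bits = 8
    levels = 2 ** bits
    signal = np.linspace(0.0, 1.0, 1920)  # one scanline of intensities
    dac_out = np.round(signal * (levels - 1)) / (levels - 1)      # "DAC" quantization
    adc_in = np.clip(dac_out + np.random.normal(0, 0.002, dac_out.shape), 0, 1)
    redigitized = np.round(adc_in * (levels - 1)) / (levels - 1)  # "ADC" re-quantization
    changed = np.count_nonzero(redigitized != dac_out)
    print(f"{changed} of {signal.size} samples corrupted by the round trip")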
Re:I'm a geek, but... (Score:5, Insightful)
Simple example: what about networked players?
* Do you add a switch into the unit and put both the TV and the player on the same subnet/VLAN? Or do the player and TV each get their own subnet, with the unit acting as a router?
If we assume that these devices are going to be on Joe Average's network, then we do nothing fancy... they behave just like any PC attached to the network would.
[Should the player be a router o]r should the TV?
Neither. Unless one really *wanted* to make them a router, I would make them a switch. I suppose you could advertise your network-enabled media device as a handy-dandy router, for those folks who can't be arsed to buy a $30 router + WAP, but then you'd need to add a full IP stack to the devices in question.
Which address space should it pick? What if it clashes with the existing network?
Use DHCP to figure this out. If there's no DHCP server, fall back to the link-local procedure outlined in RFC 3927 (a quick sketch below).
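That fallback is simple enough to sketch. This is a minimal illustration of just the address-selection step; a real implementation would seed the choice from something stable like the MAC address and ARP-probe the candidate before claiming it:

    import random

    def pick_link_local_candidate(seed=None):
        # RFC 3927 reserves 169.254/16 but forbids the first and last /24,
        # so candidates run from 169.254.1.0 through 169.254.254.255.
        rng = random.Random(seed)
        return f"169.254.{rng.randint(1, 254)}.{rng.randint(0, 255)}"

    print(pick_link_local_candidate())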
What if there are duplex negotiation bugs/issues?
I imagine that duplex/rate negotiation is a solved problem by now. Have you seen issues caused by non-broken hardware that would not negotiate rate or duplex settings?
Re:Ethernet (Score:2, Insightful)
Why Ethernet? Can you think of any devices that connect to your home theater (game console, DVR, etc.) that have video, audio, and Ethernet? Here's a hint: I just named two. Can you look forward and see how more devices in the future will have network connectivity? I sure hope so.
And what is the point of having a network channel between these devices and a *display*? As the GP asked, why do I need Ethernet on my display device?
Re:Ethernet (Score:3, Insightful)
HDMI is also HDCP-encumbered for HD signals. If you want HD content, it has to be HDCP-protected while traveling from source to display. That's why Linux and Apple don't have Blu-ray players for HD content (legal ones, at least).
As for Ethernet, other posters have covered that. I am more worried about HDMI Ethernet also getting forced into some layer of DRM, just because it was easier to DRM everything and sort it out later.
Re:Ethernet (Score:3, Insightful)
But how would Ethernet between the display and the device accessing the network help with that? If the device is a computer, or an Xbox, or whatever, then it can view YouTube and display the video over HDMI. Having IP between the display and the device using the display doesn't seem to serve any purpose that I can think of but one*.
* And I don't like that purpose because it is a back-channel for DRM authentication.
Re:I'm a geek, but... (Score:5, Insightful)
People buying a 40" monitor would expect much better resolution than 1080p offers... Look at the resolutions supported by the high-end displays from Apple, for instance.