New HDMI 1.4 Spec Set To Confuse

thefickler writes "HDMI Licensing LLC, the company that determines the specifications of the HDMI standard, is set to release the HDMI 1.4 spec on 30 June. Unfortunately, it could very well be the most confusing thing ever to happen to home theater setup. When the new cables are released, you're going to need to read the packaging very carefully, because there will effectively be five different versions of HDMI to choose from — HDMI Ethernet Channel, Audio Return Channel, 3D Over HDMI, 4K x 2K Resolution Support, and a new Automotive HDMI. At least we can't complain about consumer choice."
  • by YesIAmAScript ( 886271 ) on Saturday May 30, 2009 @11:17PM (#28155079)

    There are 5 cables in the spec, but the descriptions are incorrect.

    There are 4 cables, covering the 4 possible combinations of low-bandwidth (often referred to as HDMI 1.1) or high-bandwidth (capable of 1080p/60, deep color, etc., often referred to as HDMI 1.3), with or without support for 100mbit ethernet in the cable.

    So there are:
    low-bandwidth no ethernet (effectively an HDMI 1.1 cable)
    high-bandwidth no ethernet (effectively an HDMI 1.3 cable)
    low-bandwidth with 100mbit ethernet
    high-bandwidth with 100mbit ethernet

    Now, in reality, it's already difficult to buy an HDMI 1.1 cable, and few are likely to make a low-bandwidth cable with ethernet added, since low-bandwidth cables aren't popular as it is.

    So that leaves two of these cables to decide between:
    HDMI 1.3 cable
    high-bandwidth with 100mbit ethernet (perhaps to gain the popular name HDMI 1.4 cable?)

    and then there is one final cable, the wildcard, the automotive HDMI cable.

    So 3 cables to choose from, one of which is a weirdo cable (automotive).

    I don't think this will cause much of a problem.

    The options listed in the article (audio return channel, etc.) are all features added to the spec that an HDMI 1.4 device can support without needing a specialized cable.
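
    The combination logic above is simple enough to sanity-check mechanically. A minimal sketch in Python (the category names are this comment's shorthand, not official spec labels):

        # Sketch of the cable matrix: 2 bandwidth classes x 2 ethernet
        # options, plus the automotive wildcard.
        bandwidths = ["low-bandwidth", "high-bandwidth"]
        ethernet_options = ["no ethernet", "with 100mbit ethernet"]

        cables = ["%s, %s" % (b, e) for b in bandwidths for e in ethernet_options]
        cables.append("automotive")

        for cable in cables:
            print(cable)
        print("total:", len(cables))  # -> 5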

  • by Tiger4 ( 840741 ) on Saturday May 30, 2009 @11:22PM (#28155119)

    I know technology never really stops, but the salesdroids/scammers will milk this mercilessly to generate sales: "You only have 1.3 devices on each end, but if you don't have some flavor of 1.4 cable, it'll never work. And it's only $10 per foot." Scumsuckers preying on the ignorant.

  • Re:HDMI Ethernet (Score:3, Informative)

    by Tiger4 ( 840741 ) on Saturday May 30, 2009 @11:27PM (#28155165)

    OBSOLETE

    Thanks for coming out.

    100Mb/s bandwidth for a 40Mb/s signal. What is the problem?

  • by Kizeh ( 71312 ) on Saturday May 30, 2009 @11:44PM (#28155261)

    The reason I always found, when griping that my plasma couldn't pass audio through, or even output SPDIF, was that it was a DRM restriction imposed on the manufacturers. No clue if that was true, or what might have changed.

  • Re:Set fail... (Score:2, Informative)

    by im_thatoneguy ( 819432 ) on Saturday May 30, 2009 @11:49PM (#28155297)

    Actually set stupid article to fail.

    They aren't releasing 72 different cables.

    They're releasing 3:

    1.4 (ethernet, 4k, etc)
    1.4 Mini. (Won't be used in a home theater. This will come with your ZuneHD, Sony HD camcorder or cell phone.)
    1.4 Automotive. (When would you ever buy that thinking it would work in your home theater system?)

    So in reality they're releasing 1 new cable that customers will ever encounter. And it'll make things MUCH less confusing for the customers. Buy a new home theater. Plug an HDMI cable from your receiver to your Xbox 720, BluRay player and TV. Done! No ethernet cable into your Xbox 720, BluRay player and TV. No need to run an audio out cable from your TV to your receiver. Just one easy cable between every system, and all the features should work.

  • Re:HDMI Ethernet (Score:3, Informative)

    by wintermute000 ( 928348 ) <{ua.moc.sserpxetenalp} {ta} {redneb}> on Saturday May 30, 2009 @11:51PM (#28155317)

    Gigabit is a lot more fussy about a lot of things. All four pairs are used and the standard (twist spacing etc.) is designed so all the crosstalk etc. cancels out nicely.

    The classic is a gigabit cable that someone cuts down into a short 1m cable, which (sometimes) borks it. This is because over that short length all 4 pairs can end up on the same section of twist, which means the interference cancellation doesn't work. You will notice Cat6 cables specify where you should cut them, and the segment lengths are clearly defined. If you buy a premade short gigabit cable, it's manufactured to make sure it works properly over that short length.

    Basically it's fussier, with finer tolerances, and you want to KISS for the non-techie home consumers. That's how I would look at it.

    Of course, I also agree that it's stupid.

  • by upuv ( 1201447 ) on Saturday May 30, 2009 @11:54PM (#28155333) Journal

    The most successful products all have the same qualities.

    1. Simple
    2. Ubiquity
    3. Affordable

    HDMI is not simple.
    Ubiquity? Well, I give it points here. It really was the first popular spec to finally put video and audio on one cable.
    Affordable? Not a chance. Ridiculous prices for cables and accessories.

    1 out of 3 is not good enough to survive. HDMI is setting itself up to be toppled off its lofty perch.

    Wireless HDMI would rock.
    1. It would be simplish. (Some marketing guy would F&*K this up with some screwed-up "we must know what you are broadcasting so we can tap your wallet" scheme.)
    2. Ubiquity. No real restriction here on what is on the channel. So basically everything should work with everything else.
    3. NO HYPER-EXPENSIVE CABLES. So that has to help the bottom line.

    Oh wait. The wireless HDMI spec is already here. Can anyone say Wireless USB 3?

  • Re:Set fail... (Score:5, Informative)

    by im_thatoneguy ( 819432 ) on Saturday May 30, 2009 @11:59PM (#28155367)

    My mistake. They are in fact releasing 5 + mini plug:

    o Standard HDMI Cable - supports data rates up to 1080i/60;
    o High Speed HDMI Cable - supports data rates beyond 1080p, including Deep Color and all 3D formats of the new 1.4 specification;
    o Standard HDMI Cable with Ethernet - includes Ethernet connectivity;
    o High Speed HDMI Cable with Ethernet - includes Ethernet connectivity;
    o Automotive HDMI Cable - allows the connection of external HDMI-enabled devices to an in-vehicle HDMI device.

    But: Standard HDMI cable == HDMI 1.1 cable, and I don't even see those for sale anywhere. I assume it's pin compatible. So really the only new cables that people will encounter are:

    1.4 Highspeed (1080p -> 4k, 3D, Deep color etc)
    1.4 Highspeed + Ethernet.

    Automotive will be built into your car, hidden away from view. So unless you work at Crutchfield you can ignore it.

    Mini will be the same cables just with a differently sized plug.

  • Re:Great (Score:1, Informative)

    by Anonymous Coward on Sunday May 31, 2009 @12:01AM (#28155393)

    It also features all-new and improved DRM.

  • by NJRoadfan ( 1254248 ) on Sunday May 31, 2009 @12:01AM (#28155395)
    Actually, thanks to the computer folks who felt HDMI didn't meet their "needs", we have yet another NIH standard: DisplayPort. It of course is not compatible with HDMI/DVI monitors, but hey, it's royalty-free.
  • by LoRdTAW ( 99712 ) on Sunday May 31, 2009 @12:56AM (#28155673)

    That's the way to go. When I bought my HDTV I used it solely as an HTPC monitor and console monitor (I have cable now, so it's a TV too).

    The biggest offense of HDMI is the simple fact that multiple HDMI inputs on a TV/monitor are useless. If you want to use a home theater receiver for surround sound, you have to upgrade to one that has HDMI inputs. Why? Well, Hollywood decided that HDMI cannot have its digital sound passed through the optical or coaxial output of your TV. So if you are thinking of using that shiny new HDTV with four HDMI inputs as your AV switch box, you're out of luck. Even though that TV has an optical/coaxial output, it will be disabled for HDMI sources; only the analog outputs will work. You need to buy an expensive HDMI receiver for that setup to work.

    My friend learned the hard way after purchasing a 47 inch HDTV with four HDMI inputs. He connected his Xbox 360, PC and cable box. After digging through menus and testing his receiver, he emailed the manufacturer of the TV and found out that his perfectly working Dolby Digital receiver was now useless. He wound up getting an optical switch box to switch between inputs, but oh wait, the Xbox 360's optical port is blocked when using the HDMI port. Fuck them for screwing us like that. HDCP and all the other copyright protection is a fucking sham.

  • by Xtravar ( 725372 ) on Sunday May 31, 2009 @01:50AM (#28155931) Homepage Journal

    OH wait the XBox 360's optical port is blocked when using the HDMI port.

    I agree with your general sentiment, but that statement is untrue. I'm using 360->HDMI->TV and 360->optical->receiver and it works fine. HTH!

  • by cgenman ( 325138 ) on Sunday May 31, 2009 @02:11AM (#28156013) Homepage

    HDMI for the most part is a pretty nice, simple, it-just-works standard. It transmits both audio (up to 7.1, I think) and video digitally in that one cable. The connector is nicely keyed so that it can't go in upside-down (BOO, USB!), in a way that is really obvious even when looking at it from a distance. There are no pins that bend or break. It's pretty easy to shove in while reaching around behind a TV without looking, and things don't seem to break when you hot plug it. While I think network over HDMI is a solution in search of a problem (does your TV need network access from your Xbox? Does your Xbox need display info from FiOS?), it is still an interesting simplification.

    This is why a lot of computers now come with HDMI ports, and a lot of displays take HDMI in. It's not some panacea high-end thingie. It's a cheap cable that does everything... or at least everything we will soon be doing wirelessly. And thanks to the digital nature, the cheapest HDMI cables work basically as well as the most expensive ones.

    I connect laptops and computers to 5' plasma TVs via HDMI all the time. It's not something designed to keep these two worlds apart, but a simplification that helps them play together better.

  • by evilviper ( 135110 ) on Sunday May 31, 2009 @02:32AM (#28156125) Journal

    Forgive me for not having kept up with the progress of HDMI,

    ...nor having the most basic knowledge of the topic at hand, correct?

    wouldn't it have made infinitely more sense to have simply used gigabit Ethernet for all this?

    Oh, my, yes. When transferring 16 Gigabits/sec of uncompressed HD video at high frame rates from your DVD player to your TV, what you really want is a 1 Gigabit/sec standard, designed for unreliable communications over 100 meter distances, using a shared channel, with LOTS of overhead and very high computational requirements...

    Your insight is... stunning.
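
    Whatever the exact peak figure, the arithmetic behind the parent's point is easy to check. A rough sketch in Python, counting raw pixel data only (blanking intervals and TMDS coding push the real wire rate higher still):

        # Rough uncompressed video bitrates, active pixels only.
        def raw_gbps(width, height, fps, bits_per_pixel=24):
            return width * height * fps * bits_per_pixel / 1e9

        print(raw_gbps(1920, 1080, 60))      # ~2.99 Gb/s (1080p60, 24-bit color)
        print(raw_gbps(1920, 1080, 60, 48))  # ~5.97 Gb/s (48-bit deep color)
        print(raw_gbps(4096, 2160, 24))      # ~5.10 Gb/s (4K x 2K at 24 fps)

        # Gigabit Ethernet moves at most 1 Gb/s before protocol overhead, so
        # even plain 1080p60 needs roughly 3x more than the pipe can carry.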

  • by TopSpin ( 753 ) * on Sunday May 31, 2009 @03:23AM (#28156355) Journal

    Or did someone in the entertainment industry worry that using Ethernet for connecting entertainment devices would make it too easy for those evil hacker types to connect a computer to the setup and break their DRM?

    Forgive me; I'm going to offer something other than 0MG iT'5 Ev1L DEE-ARE-EM zors.

    Ethernet is slow. 10Gbit Ethernet is still exotic and costly. Gigabit Ethernet is much too slow for uncompressed digital video, and gigabit PHYs cost more than what high-volume TV manufacturers will tolerate. An HDMI PHY manufactured in 2003 sources or sinks 4.9Gbit/s; the 1.3 revision doubled this to just over 10Gbit/s. Basically, HDMI provides an order of magnitude more bandwidth than the sort of common Ethernet you have in mind. Uncompressed digital video and audio (what HDMI does) require immense bandwidth.

    Ethernet is designed for the LAN use case. Consider the magic 100m reach copper Ethernet is built around. This distance is desirable because it covers a large percentage of facilities where LANs exist without additional infrastructure. Among other things, signal frequency and (cheap) copper cable construction are both bound by this. HDMI has no such requirement and thus does not incur the cost of achieving it.

    HDMI clearly distinguishes between sources and sinks and has different expectations of each. Your digital TV will never suddenly begin transmitting Gbits of data that someone will wish to render; it is exclusively a sink. Ethernet makes no provision for this sort of asymmetry, which means both ends are peers and both suffer a certain minimum amount of complexity (read: cost) because of it.

    Ethernet is overly robust for digital TV. There are no packet collisions between your cable box and your TV. While HDMI does provide for error detection and correction, the remedy is radically different from what occurs on a LAN (usually retransmission). The bad data is simply dropped: the moment has passed, and whatever pixels or audio samples were corrupted are replaced by new bits before you perceive it (hopefully). Here is some language from HDMI 1.3, section 7.7:

    The behavior of the Sink after detecting an error is implementation-dependent. However, Sinks should be designed to prevent loud spurious noises from being generated due to errors. Sample repetition and interpolation are well known concealment techniques and are recommended.
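
    Sample repetition really is that simple. A toy sketch of the idea (my illustration, not code from the spec):

        # Sample-repetition concealment: when a sample arrives flagged as
        # corrupted, repeat the last known-good one rather than emit a
        # spurious (possibly loud) value.
        def conceal(samples, corrupted):
            out, last_good = [], 0
            for sample, bad in zip(samples, corrupted):
                if bad:
                    out.append(last_good)  # repeat previous good sample
                else:
                    out.append(sample)
                    last_good = sample
            return out

        print(conceal([10, 12, 32767, 14], [False, False, True, False]))
        # -> [10, 12, 12, 14]; the corrupted 32767 never reaches the speaker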

    You wouldn't need to read many IEEE 802.whatever documents to see just how far computer networking is from the design of HDMI. It is an entirely distinct use case.

    Finally, HDMI provides timing guarantees that are totally absent in Ethernet. Devices are made cheaper through accurate timing (your TV doesn't need a large high-speed buffer, for instance). Recently, so-called "Data Center Ethernet" has emerged to address this so that Ethernet can be used in latency-sensitive applications. HDMI had this baked in on day one.

    Some people are convinced that DRM is the only conceivable reason for creating HDMI and that all other claims offered are a smokescreen. That's the fashion around Slashdot, anyhow. Don't believe it. Those folks don't know what digital TV is.

  • by Anonymous Coward on Sunday May 31, 2009 @04:14AM (#28156503)

    Umm, HDCP (DRM) works fine with DVI or HDMI. Or do you mean that the DVI device *might* not support HDCP (meaning, if I've read stuff correctly, that the sending device might decide not to send the signal)?

    In fact, you can get adapters (which just rewire the connector) for HDMI to/from DVI. What you gain with HDMI over DVI is audio and video over one cable, rather than just video. (Looks like you might also get remote control features.)
    Wiki Page [wikipedia.org].
    Personal experience: my brother's monitor with DVI says it's HDCP compatible, and it works fine with a friend's PS3 over an HDMI-to-DVI adapter. The biggest problem they had was setting things up so audio went out through the analog plugs.

  • Re:Ethernet (Score:3, Informative)

    by phulegart ( 997083 ) on Sunday May 31, 2009 @04:41AM (#28156573)

    Hello?

    Um... [tap tap tap]... hello?

    HDMI stands for High Definition Multimedia Interface. If it were just a monitor cable, there would be no audio, just video. In other words, it would not be HDMI. It would just be another DVI cable.

    The point behind HDMI was to reduce the number of cables necessary to hook up a multimedia device into an entertainment center.

    Plus, I don't know of any brand of monitor that comes with an HDMI input. I know plenty of televisions that have them, and those televisions can be used as monitors... but they are NOT monitors. I've seen cables that convert from DVI to HDMI, but that carries only video. There is no audio portion to that cable.

    Why Ethernet? Can you think of any devices that connect to your home theater (game console, DVR, etc.) that have video, audio, and ethernet? Here is a hint: I just named two. Can you look forward and see how more devices in the future will have network connectivity? I sure hope so.

  • by Nirvelli ( 851945 ) on Sunday May 31, 2009 @05:32AM (#28156775)
    I don't know what your friend's doing wrong, but I'm using HDMI and optical audio from an Xbox360 and it works just fine...
  • Re:Ethernet (Score:3, Informative)

    by EyelessFade ( 618151 ) on Sunday May 31, 2009 @05:38AM (#28156791) Homepage
    > Plus, I don't know of any brand of monitor that comes with a HDMI input
    Lots of them; my Dell 2408WFP, for one.
  • by swilver ( 617741 ) on Sunday May 31, 2009 @06:10AM (#28156917)

    Oh come now, I use VGA to display 1600x1200 at 100 Hz (on a CRT). I noticed that at this speed the picture can become slightly blurry (only slightly, mind you) when using an old, cheap analog VGA cable. When run at 80 Hz the picture was crisp and clear again, and indistinguishable from the digital equivalent.

    Do I need to explain that 1600x1200 @ 80 Hz is way more bandwidth than 1920x1080 @ 25 Hz?
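
    For the record, the arithmetic is easy (raw pixel rates, blanking ignored):

        # Raw pixel throughput of the two modes mentioned above.
        crt_mode = 1600 * 1200 * 80  # 153,600,000 pixels/s
        hd_mode = 1920 * 1080 * 25   # 51,840,000 pixels/s
        print(crt_mode / hd_mode)    # ~2.96x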

  • by Sycraft-fu ( 314770 ) on Sunday May 31, 2009 @07:18AM (#28157171)

    No, you don't. What you do need to do is understand that you are talking about CRT vs LCD/DLP. One is not the other. A CRT is an inherently analogue device, so a VGA cable works perfectly fine. An LCD is not, so it makes sense to transmit information to it digitally. This is the reason for DVI on computers.

    D-A-D conversion is not a lossless process. Converters aren't perfect, and that goes double when manufacturers don't wish to spend tons of money on them. And there is no reason to spend tons of money on them when you've got digital devices on both ends; manufacturers will often cheap out, since the analog path is for legacy compatibility only.

    As an example, take my monitor, a nice NEC display. When connected digitally, it has numerous controls, but all relating to things such as brightness, colour, scaling and so on. They are all for the user to configure how they'd like the image displayed. None of them deal with image problems or correct for transmission errors.

    In analogue mode, all those same controls remain but there are a number of additional ones:

    --H. Position, V. Position, H. Size, V. Size, and Fine: These all control the position and sizing of the picture on the display, since it is no longer receiving absolute information.

    --R-H Position, G-H Position, B-H Position, R-Fine, G-Fine, and B-Fine. These control the signal timing of individual colour channels, since they can vary.

    --R-Sharpness, G-Sharpness, B-Sharpness. These attempt to control the amount of bleed between adjacent pixels.

    On top of all that, they have software called cable comp to deal with colour timing differences from long analogue cables.

    All that, just to use a VGA connection and attempt to get the maximum signal quality out of it. Or you can use a DVI connection, and then there's no problem. All those controls are locked out since they aren't needed. There are no problems with positioning or sizing or timing. The pixels are sent digitally to the display.

    When you have digital devices, you want to keep transmission digital. It just makes more sense. All you do by using an analogue cable in the middle is increase your problems and/or costs.

    If you mean why not use a VGA cable to transmit digital data, the reason was to go with balanced signaling. VGA uses 5 unbalanced connections for video data (RGBHV). The problem with that is that to get longer runs or more noise resistance you need thick coax cable, which won't fit. DVI (and by extension HDMI) instead uses balanced TMDS signaling over 4 twisted pairs.
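
    For a sense of the rates involved, the single-link DVI/HDMI numbers at 1080p60 work out as follows (standard timing figures; a back-of-envelope sketch only):

        # TMDS wire rate for single-link DVI/HDMI at 1080p60.
        pixel_clock = 148.5e6  # Hz, blanking intervals included
        bits_per_symbol = 10   # TMDS expands each 8-bit value to 10 bits
        data_pairs = 3         # one pair each for R, G, B; a 4th carries the clock

        per_pair = pixel_clock * bits_per_symbol  # 1.485 Gb/s per pair
        total = per_pair * data_pairs             # ~4.46 Gb/s aggregate
        print(per_pair / 1e9, total / 1e9)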

  • Re:Ethernet (Score:3, Informative)

    by Bert64 ( 520050 ) <bert@[ ]shdot.fi ... m ['sla' in gap]> on Sunday May 31, 2009 @08:13AM (#28157377) Homepage

    You can do audio over DVI too. I have a satellite receiver called a DM800 which does exactly that... DVI on the DM800 end, HDMI on the TV end, and audio works over it.

    Some video cards do that too, I believe.

  • Re:Set fail... (Score:3, Informative)

    by d3ac0n ( 715594 ) on Sunday May 31, 2009 @09:10AM (#28157635)

    Crutchfield [crutchfield.com].

    Premier automotive audiophile / general audiophile source in N.A. They've been around since the paper catalog mail order days, and have transitioned well to the internet.

    Obviously, if you are a /. reader from outside N.A. you might not have heard of them.

  • Re:Ethernet (Score:3, Informative)

    by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Sunday May 31, 2009 @09:45AM (#28157795) Journal

    HDMI is also HDCopyProtection encumbered, for HD signals.

    *facepalm*

    No, it is not. HDMI, at least the kind we use today, is DVI plus Audio. That's it.

    HDCP works just as well over DVI as it does over HDMI. Similarly, I currently use an HDMI cable to hook my laptop up to a second monitor -- and that does not use HDCP, as far as I can tell. (It'd be strange if it did, as I am running Linux.)

    If you want HD content it has to be HDCP while traveling from source to display.

    That depends entirely on where you're getting it from. If I get mine from BitTorrent, or from a video game, no, it doesn't have to be HDCP'd. And I don't have a Blu-Ray drive.

  • by TheGratefulNet ( 143330 ) on Sunday May 31, 2009 @10:54AM (#28158261)

    plus, there is no 'back channel' in component analog video. therefore, no drm! backchannel = evil.

    hdmi has that handshake crap going on. it causes me (and a lot of others) to have to reboot our tvs (really). if you power off the tv and then power it on together with a stereo (a switched ac outlet is the use-case), the tv loses 'link' with its hdmi peer (in my case a popcorn hour hdmi sender) and needs bouncing before you get any hdmi signal!

    you get NONE of these problems with analog comp-out wiring. you do have to worry about analog wire quality and length of cable run - but at least no DRM anymore. that may cause me to go 'back' to comp instead of hdmi.

    also, comp is easier to switch (circuit level) than hdmi. no license needed to build comp circuits (I build stuff for analog and digital audio but NOT digital video cause - well - they stop me). analog is still more DIY friendly.

    finally, keeping audio OUT of the video cable bundle is a good thing. it really is. regular spdif and the raw variants (dd5.1 and dts) over non-hdmi cabling are also just fine for us end consumers. the 'true hd' stuff is fluff and massive overkill for end-user consumer use (rough numbers below).
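
    Ballpark peak bitrates from the public format specs back that up (approximate figures; assumptions noted in the comments):

        # What fits down an S/PDIF link: IEC 61937 packs compressed audio
        # into the 16-bit stereo PCM payload at 48 kHz.
        spdif_capacity = 2 * 16 * 48000  # ~1.536 Mb/s
        formats = {
            "dd 5.1 (peak)": 640e3,
            "dts core (typical peak)": 1.509e6,
            "dolby truehd (peak)": 18e6,  # this is why truehd needs hdmi
        }
        for name, rate in formats.items():
            verdict = "fits" if rate <= spdif_capacity else "does not fit"
            print(name, "->", verdict, "over spdif")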

    it's good that some people still see the evil that hdmi did and are ok with 'going back' to comp video and spdif digital audio (sep wires, of course).

    hdmi muxed audio and video. that was a fatal flaw for us end users. great for equip makers but fatal for us.

    finally, there may be some hope of capturing comp video at HD resolutions - for use in a myth-tv system. but if your pay tv is ONLY in hdmi format, it's very hard (if not impossible) to use that stream and record it via your myth box. analog is the only real hope, I think, to integrate with myth at HD resolutions. (other than free OTA or clear-qam; but I'm talking about wanting to myth your HBO/SHO etc streams, too).

  • Re:Set fail... (Score:1, Informative)

    by Anonymous Coward on Sunday May 31, 2009 @11:07AM (#28158347)

    I wonder if the new spec will support closed captioning.

    Despite the awesome visual and aural quality offered by this spec, the question remains whether those who are hearing impaired will be considered or left in the analog dust.

    Someone fouled up when the "digital age" started rolling in. Just google "closed captioning hdmi" or "hdmi closed captioning" or take a look at http://en.wikipedia.org/wiki/Closed_captioning#HDTV_interoperability_issues

    And by the way, be careful not to confuse subtitles with closed captioning. They are different.

  • by LoRdTAW ( 99712 ) on Sunday May 31, 2009 @01:00PM (#28159181)

    On his Xbox 360, the optical port is only accessible by plugging in the analog cable. BUT the HDMI port is right on top of the analog port, so when the analog cable is plugged in, the HDMI port is partially blocked. It's the version that came right after the first model, which did not come with HDMI. So on his model you cannot use HDMI and optical audio at the same time.

  • Re:Ethernet (Score:2, Informative)

    by Supergibbs ( 786716 ) on Sunday May 31, 2009 @06:20PM (#28161667) Homepage
    Um... [tap tap tap]... hello? Ever tried this thing called Google? Amazing what you can find...

    Computer monitor with HDMI input [viewsonic.com]

    DVI/HD Audio to HDMI with audio converter [svideo.com]

    You can argue that they aren't TVs, but there are devices advertised as TVs without tuners, designed for use with cable/satellite. Here is one (notice the category it's under) [sonystyle.com]
