
New HDMI 1.4 Spec Set To Confuse

Posted by timothy
from the avoid-confusion-stick-with-coax dept.
thefickler writes "HDMI Licensing LLC, the company that determines the specifications of the HDMI standard, is set to release the HDMI 1.4 spec on 30 June. Unfortunately it could very well be the most confusing thing to ever happen to setting up a home theater. When the new cables are released, you're going to need to read the packaging very carefully because effectively there are now going to be five different versions of HDMI to choose from — HDMI Ethernet Channel, Audio Return Channel, 3D Over HDMI, 4K x 2K Resolution Support and a new Automotive HDMI. At least we can't complain about consumer choice."
This discussion has been archived. No new comments can be posted.


  • Set fail... (Score:4, Funny)

    by Shikaku (1129753) on Saturday May 30, 2009 @11:12PM (#28155059)

    For HD.

    • by Jeremiah Cornelius (137) on Saturday May 30, 2009 @11:13PM (#28155065) Homepage Journal

      There will be 12 different MONSTER cables.

      I look forward to the Audiophile gold ends.

      • by bmo (77928) on Saturday May 30, 2009 @11:20PM (#28155099)

        Oxygen free "magnetically aligned" copper, hand twisted, and manually rubbed between the breasts of virgins for extra "lustre."

        They just won't tell you that the virgins look like Rush Limbaugh.

        --
        BMO

        • by Qubit (100461) on Saturday May 30, 2009 @11:30PM (#28155183) Homepage Journal

          They just won't tell you that the virgins look like Rush Limbaugh.

          Well they are Monster cables....

        • by timeOday (582209) on Saturday May 30, 2009 @11:30PM (#28155187)

          They just won't tell you that the virgins look like Rush Limbaugh.

          Rush Limbaugh has been married (and divorced) three times - but he has no children (cite [about.com])... so he may in fact be the virgin in question.

        • Re: (Score:3, Funny)

          Now if you let people pay TO rub them between the breasts of virgins I don't think any of us would be mocking them quite so much.

        • Re: (Score:3, Funny)

          by mrsteveman1 (1010381)

          Oxygen free "magnetically aligned" copper, hand twisted, and manually rubbed between the breasts of virgins for extra "lustre."

          That's required if you want to watch porn over the cable, it's sorta like how you have to magnetize a screwdriver with a magnet if you want screws to stick to it.

    • Re: (Score:2, Insightful)

      by Reikk (534266)
      The great thing about standards is that there are so many to choose from
    • Re: (Score:2, Informative)

      by im_thatoneguy (819432)

      Actually set stupid article to fail.

      They aren't releasing 72 different cables.

      They're releasing 3:

      1.4 (ethernet, 4k, etc)
      1.4 Mini. (Won't be used in a home theater. This will come with your ZuneHD, Sony HD camcorder or cell phone.)
      1.4 Automotive. (When would you ever buy that thinking it would work in your home theater system?)

      So in reality they're releasing 1 new cable that customers will ever encounter. And it'll make things MUCH less confusing for the customers. Buy a new home theater. Plug an HDMI

      • by 3vi1 (544505) on Saturday May 30, 2009 @11:53PM (#28155331) Homepage Journal

        Yes, much less confusing.

        Hey, if I want to use a ZuneHD in my car which cable do I need?

      • Re:Set fail... (Score:5, Informative)

        by im_thatoneguy (819432) on Saturday May 30, 2009 @11:59PM (#28155367)

        My mistake. They are in fact releasing 5 + mini plug:

        o Standard HDMI Cable - supports data rates up to 1080i/60;
        o High Speed HDMI Cable - supports data rates beyond 1080p, including Deep Color and all 3D formats of the new 1.4 specification;
        o Standard HDMI Cable with Ethernet - includes Ethernet connectivity;
        o High Speed HDMI Cable with Ethernet - includes Ethernet connectivity;
        o Automotive HDMI Cable - allows the connection of external HDMI-enabled devices to an in-vehicle HDMI device.

        But. Standard HDMI cable == HDMI 1.1 cable and I don't even see those for sale anywhere. I assume it's pin compatible. So really the only new cables that people will encounter are:

        1.4 Highspeed (1080p -> 4k, 3D, Deep color etc)
        1.4 Highspeed + Ethernet.

        Automotive will be built into your car hidden away from view. So unless you work at crutchfield you can ignore it.

        Mini will be the same cables just with a differently sized plug.

        • What's the ethernet connectivity in the monitor for? I suppose it could be well-intentioned, for cable or IPTV say, but I'm concerned it might be for validating DRM against public key servers.

          • Re: (Score:3, Informative)

            by phulegart (997083)

            Hello?

            Um... [tap tap tap]... hello?

            HDMI stands for High Definition Multimedia Interface. If it was just a monitor cable, there would be no audio either. It would just be video. In other words, it would not be HDMI. It would just be another DVI cable.

            The point behind HDMI was to reduce the number of cables necessary to hook up a multimedia device into an entertainment center.

            Plus, I don't know of any brand of monitor that comes with a HDMI input. I know plenty of televisions that have them, and those te

            • Re: (Score:3, Informative)

              by EyelessFade (618151)
              > Plus, I don't know of any brand of monitor that comes with a HDMI input
Lots of them, my Dell 2408WFP for one
            • Re: (Score:3, Insightful)

              by peragrin (659227)

              HDMI is also HDCopyProtection encumbered, For HD signals. If you want HD content it has to be HDCP while traveling from source to display. It is why Linux and Apple don't have blue ray players for HD content(legal ones at least)

As for ethernet, other posters have covered that. I am more worried about HDMI ethernet also getting forced into some layer of DRM just because it was easier to DRM everything and sort it out later.

              • Re: (Score:3, Informative)

                HDMI is also HDCopyProtection encumbered, For HD signals.

                *facepalm*

                No, it is not. HDMI, at least the kind we use today, is DVI plus Audio. That's it.

                HDCP works just as well over DVI as it does over HDMI. Similarly, I currently use an HDMI cable to hook my laptop up to a second monitor -- and that does not use HDCP, as far as I can tell. (It'd be strange if it did, as I am running Linux.)

                If you want HD content it has to be HDCP while traveling from source to display.

                That depends entirely on where you're getting it from. If I get mine from BitTorrent, or from a video game, no, it doesn't have to be HDCP'd. And I don't have a Blu-Ray drive.

            • Re: (Score:3, Informative)

              by Bert64 (520050)

You can do audio over DVI too; I have a satellite receiver called a DM800 which does exactly that... DVI on the DM800 end, HDMI on the TV end, and audio works over it.

Some video cards do that too, I believe.

  • by YesIAmAScript (886271) on Saturday May 30, 2009 @11:17PM (#28155079)

    There are 5 cables in the spec, but the descriptions are incorrect.

There are 4 cables, which are the 4 possible combinations of low-bandwidth (often referred to as HDMI 1.1) and high-bandwidth (capable of 1080p/60, deep color, etc., often referred to as HDMI 1.3), with or without support for ethernet in the cable (100mbit).

    So there are:
    low-bandwidth no ethernet (effectively an HDMI 1.1 cable)
    high-bandwidth no ethernet (effectively an HDMI 1.3 cable)
    low-bandwidth with 100mbit ethernet
    high-bandwidth with 100mbit ethernet

Now, in reality, it's already difficult to buy an HDMI 1.1 cable, and likely few are going to make a low-bandwidth cable with ethernet added, since low-bandwidth cables aren't popular as it is.

    So that leaves two of these cables to decide between:
    HDMI 1.3 cable
    high-bandwidth with 100mbit ethernet (perhaps to gain the popular name HDMI 1.4 cable?)

    and then there is one final cable, the wildcard, the automotive HDMI cable.

    So 3 cables to choose from, one of which is a weirdo cable (automotive).

    I don't think this will cause much of a problem.

    The options listed in the article, return channel, etc, are all things added to the spec that can be there for an HDMI 1.4 device but without needing a specialized cable.

    • I'm a geek, but... (Score:5, Insightful)

      by raddan (519638) on Saturday May 30, 2009 @11:32PM (#28155197)
      Ugh. Maybe you can explain why I'd want to buy an HDTV with all of the accoutrements rather than buy a vastly cheaper flat panel display, and use it with my far more flexible computer. In my opinion, TVs and computers are converging, and new revisions of HDMI are a way to keep them differentiated. Is there really an advantage to an HDTV? This is the thing that has stopped me from buying an HDTV.

Now, as far as cabling goes, I suspect most of this is driven by a marketing department. If you look at computer display technology, which has been in rapid flux for at least 20 years, they've managed to standardize on TWO different connectors: one for analog and one for digital. Sure, there are some weirdo ones out there, like ADC and 13W3, but they never had any real relevance. But with TVs, which are ostensibly simpler than computer displays, we have this panoply of cables. Why?

      Now, Cat5e-- that's an impressive technology. The data rates people have been able to squeeze out of plain ol' twisted pair! But seriously; we do everything in software now. Why does television insist on having cable after cable to do functions that we could do with a single one?
      • Re: (Score:3, Insightful)

        by wintermute000 (928348)

        er, because a 42 inch computer flat panel will cost far more than a 1080p LCD of the same size??

Here in Oz you will be paying 1000AUD for a 30-inch monitor. That cash will get you a 42" 1080p plasma. Heck, 1080p is fine for couch computing. Can you even get 42-inch monitors?
Also the TV will most likely have tuners built in. You and I run a media center. Most people do not.

        And for your final question: do you really want average consumers to wrestle with ethernet and a TCP/IP stack just to get a signal from thei

        • Re: (Score:2, Troll)

          by daver00 (1336845)

          Sorry but, fail. Pretty much every large flat panel I've seen comes with a DVI input, and many with good ole analogue and coax inputs. You don't buy a goddamn 42" computer screen, you just go buy a 42" TV and plug your computer into it. You can even get a DVI -> HDMI adapter so on the off and bizarre circumstance that your new TV doesn't have a DVI input you can directly connect your computer output into HDMI anyway.

        • by ion.simon.c (1183967) on Sunday May 31, 2009 @03:29AM (#28156363)

          Simple example: what about networked players?
* Do you add a switch into the unit and have both the TV and the player on the same subnet/VLAN? Or do the player and TV have their own subnet, with the unit acting as a router?

          If we assume that these devices are going to be on Joe Average's network, then we do nothing fancy... behave just like any PC attached to the network would behave.

          [Should the player be a router o]r should the TV?

Neither; unless one really *wanted* to make them a router, I would make them a switch. I suppose that you could advertise your network-enabled media device as a handy-dandy router, for those folks who can't be arsed to buy a $30 router + WAP. But then, you'd need to add an IP stack to the devices in question.

          Which address space should it pick? What if it clashes with the existing network?

          Use DHCP to figure this out. If there are no DHCP servers, turn to the procedures outlined in RFC 3927.

          What if there are duplex negotiation bugs/issues?

          I imagine that duplex/rate negotiation is a solved problem by now. Have you seen issues caused by non-broken hardware that would not negotiate rate or duplex settings?
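For anyone curious what the RFC 3927 fallback mentioned above looks like in practice, here's a rough Python sketch of just the address-selection step (the function name is made up for illustration, and the RFC's mandatory ARP conflict probe is left out):

```python
import random

def linklocal_candidate(seed):
    """Pick a candidate IPv4 link-local address per RFC 3927.

    The RFC reserves 169.254/16 but excludes the first and last /24
    (169.254.0.x and 169.254.255.x), so candidates are drawn from
    169.254.1.0 through 169.254.254.255. A real implementation would
    ARP-probe the candidate and retry on conflict; that is omitted here.
    """
    rng = random.Random(seed)  # the RFC suggests seeding from the MAC address
    return "169.254.%d.%d" % (rng.randint(1, 254), rng.randint(0, 255))

# Hypothetical flow: DHCP timed out, so the device self-assigns an address.
print(linklocal_candidate(seed=0x001122334455))
```

That's the whole trick: no config, no clash with RFC 1918 home networks, and every box on the segment lands in the same /16.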

          • Re: (Score:3, Interesting)

            by Bert64 (520050)

            Streaming media capabilities built in to the TV would be quite useful... Unfortunately, they're more likely to use the functionality to implement DRM.

      • Re: (Score:2, Informative)

        by NJRoadfan (1254248)
Actually, thanks to the computer folks who felt HDMI didn't meet their "needs", we have yet another NIH standard: DisplayPort. It of course is not compatible with HDMI/DVI monitors, but hey, it's royalty-free.
        • by Winckle (870180)

          There is a direct converter you can get for DVI to DisplayPort.

          • by YesIAmAScript (886271) on Sunday May 31, 2009 @03:01AM (#28156285)

            No there isn't.

            There is no adapter that will let you hook a DVI output to a DisplayPort-only monitor.

            There are physical adapters that let you get DVI output from some DisplayPort ports. But it just ties a line on the connector that tells the sending device to not actually send DisplayPort signaling, but send DVI instead. This has a couple problems. First is that you are still paying the DVI licensing fees, including HDCP fees. Second is that if the source device doesn't have this alternate mode, the adapter doesn't work, because it can't convert it itself, it can only tell the sending device to send DVI instead.

Apple's adoption of DisplayPort seems like a disaster so far. If you have a more than 3-month-old MacPro or iMac, Apple doesn't have a 24" display they can offer you. If you want to put an Apple 30" display on your MacPro, Mac Mini, MacBook or iMac, you need a $99 adapter that is large, takes up a USB port and doesn't even work right on some displays. And if you want to be able to give a presentation from your MacBook/MacBook Pro, you had better have brought a gaggle of adapters with you, since there isn't a projector on the planet that accepts DisplayPort. Not that you would have a DisplayPort cable to connect to the projector anyway; Apple doesn't even sell one! And even if the projector had a DisplayPort cable already attached, you couldn't use that either, because Apple used mini DisplayPort, so it's adapter time again, except Apple doesn't sell that adapter either.

            • by gittela (248158) on Sunday May 31, 2009 @07:29AM (#28157215)

              Now, as I work most of my time as an AV-tech, I'd have to say that this is truth with modifications.
              Projectors at conferences are usually vga only. I've never encountered a DVI cable in static systems at conferences. Sure, when we set it up ourselves and go for high quality HD projectors @ 10k ansilumens we will use our nice fiberoptic dvi cables or hd-sdi, but most of the time it is vga/rgb-hv.

              That means one(1) adapter, if you bring your own laptop. Even peecees come with DVI these days.
              For us techs, it means 3 adapters, one dvi-vga, one minidvi-vga and one DisplayPort-vga. This will not make much of a difference in our flight cases...

              Apart from that, I agree. Apple pulled a bit of a stunt with the DisplayPort. While I like the new port, I think it's way too arrogant to assume that people will ditch a 6 months old machine just like that. :-)
              H

          • Re: (Score:3, Interesting)

            by TheRaven64 (641858)
            The DisplayPort specification allows you to run DVI signals through the DisplayPort connector. Apple sells two DP to DVI adaptors. The single-link one just puts the laptop's display hardware in DVI mode and is a trivial connector. This will not work on all DP systems, because it requires the graphics hardware to support encoding both DVI and DP signals. The dual-link one is much more complex. There are not enough pins in DP for this to work, so it needs to decode each frame and re-encode it as a DL-DVI
      • Re: (Score:3, Interesting)

        by im_thatoneguy (819432)

        I bought a 52" 1080p LCD *AS* my Computer Monitor.

        Computer DVI -> HDMI.

        Works great. I have a little 19" LCD off to the side that isn't even plugged in. Just in case.

For BluRay and HD-DVD I have a combo drive which cost $90 in my computer. Everything is run through a single Harmony remote and wireless keyboard/mouse.

        I also have a laptop in case I want to browse the web while watching TV. Just about the perfect setup if you ask me. Perhaps not for work when I want dual displays but more than adequate f

        • I also have a laptop in case I want to browse the web while watching TV

          I assume trying both on the 52" via split-screen is a no-go?

        • I bought a 52" 1080p LCD *AS* my Computer Monitor.

          That's gotta be sore on the neck.

        • by Khyber (864651)

          I'm still using 15-pin D-SUB on my 32" 1080p LCDTV. What's this DVI you talk about? ;)

      • by ceoyoyo (59147)

        LCD monitors are not cheaper than HD TVs.

As for computer displays, the cables currently in use are, what, three different flavours of DVI (analog, digital, dual link), plus DisplayPort and VGA. HDMI basically has the cheap low-bandwidth cables that are pretty hard to find now, and the ones everyone uses that can handle 1080p. Apparently the new ones may or may not have ethernet support built in, which seems kind of dumb to me, but whatever.

        The cable confusion seems about equal to me.

      • by LoRdTAW (99712) on Sunday May 31, 2009 @12:56AM (#28155673)

That's the way to go. When I bought my HDTV I used it solely as a HTPC monitor and console monitor (I have cable now, so it's a TV too).

The biggest offense of HDMI is the simple fact that multiple HDMI inputs on a TV/monitor are useless. If you want to use a home theater receiver for surround sound then you have to upgrade to one that has HDMI inputs. Why? Well, Hollywood decided that HDMI cannot have its digital sound passed through the optical or coaxial output of your TV. So if you are thinking of using that shiny new HDTV with four HDMI inputs as your AV switch box, then you're out of luck. Even though that TV has an optical/coaxial output, it will be disabled for HDMI; only analog outputs will work. You need to buy an expensive HDMI receiver for that setup to work.

        My friend learned the hard way after purchasing a 47 inch HDTV with four HDMI inputs. He connected his XBox 360, PC and cable box. After digging through menus and testing his receiver he emailed the manufacturer of the TV and found out that his perfectly working Dolby digital receiver was now useless. He wound up getting an optical switch box to switch between inputs but OH wait the XBox 360's optical port is blocked when using the HDMI port. Fuck them for screwing us like that. HDCP and all the other copyright protection is a fucking sham.

        • by Xtravar (725372) on Sunday May 31, 2009 @01:50AM (#28155931) Homepage Journal

          OH wait the XBox 360's optical port is blocked when using the HDMI port.

          I agree with your general sentiment, but that statement is untrue. I'm using 360->HDMI->TV and 360->optical->receiver and it works fine. HTH!

          • by LoRdTAW (99712) on Sunday May 31, 2009 @01:00PM (#28159181)

On his Xbox 360 the optical port is only accessible by plugging in the analog cable. BUT the HDMI port is right on top of the analog port, so when the analog cable is plugged in, the HDMI port is partially blocked. It's the version that came right after the first model, which did not come with HDMI. So for his model you cannot use HDMI and optical audio at the same time.

        • Re: (Score:3, Interesting)

          by spire3661 (1038968)

There is an adapter made by MS that allows HDMI with optical (or RCA stereo) output. Full Dolby Digital through optical. In this configuration I get full HDCP-linked DVD upscaling. PS3, same thing: HDMI out straight to TV, optical to a receiver. I get Dolby Digital, DTS, the works.

        • Re: (Score:3, Informative)

          by Nirvelli (851945)
          I don't know what your friend's doing wrong, but I'm using HDMI and optical audio from an Xbox360 and it works just fine...
      • by cgenman (325138) on Sunday May 31, 2009 @02:11AM (#28156013) Homepage

HDMI for the most part is a pretty nice, simple, it-just-works standard. It transmits both audio (up to 7.1 I think) and video digitally in that one cable. It is nicely vertically shaped, so that it can't go in upside-down (BOO USB!), and in such a way that it's really obvious when looking at it from a distance. There are no pins that bend or break. It's pretty easy to shove in while reaching around behind a TV without looking, and things don't seem to break when you hot plug it. While I think network over HDMI is a solution in search of a problem (Does your TV need network access from your Xbox? Does your Xbox need display info from Fios?), it is still an interesting simplification.

        This is why a lot of computers now come with HDMI ports, and a lot of displays take HDMI in. It's not some panacea high-end thingie. It's a cheap cable that does everything... or at least everything we will soon be doing wirelessly. And thanks to the digital nature, the cheapest HDMI cables work basically as well as the most expensive ones.

        I connect laptops and computers to 5' Plasma TV's via HDMI all the time. It's not something designed to keep these two worlds apart, but a simplification that helps make them play better.

        • Re: (Score:3, Interesting)

          by dangitman (862676)

It's a cheap cable that does everything... or at least everything we will soon be doing wirelessly.

          What the hell? HDMI is one of the most overpriced cables out there, for what it is.

          And thanks to the digital nature, the cheapest HDMI cables work basically as well as the most expensive ones

          No, you need expensive, high-quality HDMI cables if you want to do anything other than a short cable run. The "digital nature" doesn't help with this. Even though it's digital, the frequencies it operates at cause failure if you go beyond the rated distances. And the rated distances aren't very long at all.

      • by YesIAmAScript (886271) on Sunday May 31, 2009 @02:45AM (#28156201)

        HDTVs are cheaper than flat panel displays in equivalent sizes. But as to your question, unless you want a big panel, don't buy a big panel, it's a waste of money.

Computer displays have not been in rapid flux compared to TVs. TVs have gone from having no input except an antenna input, to composite inputs, then s-video, then component, then HDMI, in a little bit under 30 years. In the same time, computer monitors used 3 connectors: the original DB-9 (three flavors: RGB+I, RrGgBb, then analog RGB), then the HDI-15 (VGA connector), then DVI, in a slightly shorter timeframe.

        In the case of TVs, every change of connector/signaling was due to needing increased resolution except for HDMI. HDMI was to simplify connections, and indeed, a single HDMI connector is far simpler (and cheaper) than 3 RCAs for video + 2 RCAs or a single optical cable for audio.

        TVs are by FAR not simpler than computer displays. HDMI allows 36 bit color, it allows more than 3 (RGB) channel color. It also brings audio, including multichannel audio. It also brings control signaling through CEC and now ethernet. With HDMI, turning on your TV can automatically turn on your amp. That doesn't happen with computers unless you use a 2nd cable, a USB cable.

        I can't say I'm thrilled that the HDMI group can't understand that changing the spec less often will help make sure it is successful. But I do like the idea of return audio on HDMI 1.4. I do like the audio sync that was brought with HDMI 1.3. The multichannel audio in HDMI 1.1a was a fantastic idea, solving the problem computers never solved properly (which is why computers have 3 1/8" jacks for audio on the back instead of a single audio connector).

  • HDMI Ethernet (Score:5, Insightful)

    by NFN_NLN (633283) on Saturday May 30, 2009 @11:20PM (#28155107)

• HDMI Ethernet Channel
    "The HDMI 1.4 specification will add a data channel to the HDMI cable and will enable high-speed bi-directional communication. Connected devices that include this feature will be able to send and receive data via 100 Mb/sec Ethernet, making them instantly"... OBSOLETE

    Thanks for coming out.

    • Re: (Score:3, Informative)

      by Tiger4 (840741)

      OBSOLETE

      Thanks for coming out.

      100Mb/s bandwidth for a 40Mb/s signal. What is the problem?

      • Re:HDMI Ethernet (Score:5, Insightful)

        by pla (258480) on Saturday May 30, 2009 @11:43PM (#28155257) Journal
        100Mb/s bandwidth for a 40Mb/s signal. What is the problem?

        Well, for starters, 1080p (keep in mind this involves "raw" devices, not sending an MPEG4 down the line) uses just shy of 1.5 Gbps.

        We can follow that up with "anyone not using wireless already upgraded to gig-E switches about five years ago".

        We can then finish it off with one of my favorites (actually not, but in this case it really does serve the described need) - Any attached devices needing bidirectional communication can use plain ol' ubiquitous USB. And really, do my speakers actually need to talk back to my receiver under any even remotely plausible scenario that doesn't scream "DRM, mother fucker, do you speak it?"
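The "just shy of 1.5 Gbps" figure above works out for 1080p at 30 frames/s with 24-bit color; a quick back-of-the-envelope check (the function name is just for illustration):

```python
def raw_bitrate_gbps(width, height, bits_per_pixel, fps):
    """Uncompressed ("raw") video bandwidth in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# 1080p, 24-bit color, 30 frames/s: ~1.49 Gb/s ("just shy of 1.5")
print(raw_bitrate_gbps(1920, 1080, 24, 30))
# At 60 frames/s it doubles to ~2.99 Gb/s
print(raw_bitrate_gbps(1920, 1080, 24, 60))
```

Either way, it dwarfs the 100 Mb/s ethernet channel, which is only meant for data alongside the video, not for carrying it.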
        • Re: (Score:3, Interesting)

          by FrostDust (1009075)

          And really, do my speakers actually need to talk back to my receiver under any even remotely plausible scenario that doesn't scream "DRM, mother fucker, do you speak it?"

Off the top of my head, I imagine you could have a mic in each speaker so the system can dynamically adjust for ambient noise, or something silly like a surround sound system pinging itself during setup to concentrate the "surround" effect on the desired area.

          I also guess you could work this into some type of karaoke/Rock Band game.

        • by nabsltd (1313397)

          Then, too, wouldn't it make a lot more sense to connect your XBox, PS3, etc., all directly to the network to stream videos from a computer, so that the data moves:
          computer==>XBox==>receiver
          rather than
          computer==>receiver==>XBox==>receiver

It's not like Cat6 cables are heavy or expensive, and an 8-port switch is less than $50.

  • by Tiger4 (840741) on Saturday May 30, 2009 @11:22PM (#28155119)

    I know technology never really stops, but the salesdroids/scammers will milk this mercilessly to generate sales. You only have 1.3 devices on each end, but if you don't have some flavor of 1.4 cable, it'll never work. And only $10 per foot. Scumsuckers preying on the ignorant.

  • by kithrup (778358) on Saturday May 30, 2009 @11:22PM (#28155123)

    But the main article is fairly wrong. The Audio Return channel doesn't require a different cable, and the higher resolutions and 3D will both work over the high-bandwidth version. The ethernet options will be different cables, as will the automotive, so there will be quite a few new cables, but I don't think that's particularly confusing. (That's normal HDMI; HDMI plus ethernet; high-speed HDMI; high-speed HDMI plus ethernet; and automotive HDMI.)

    dvice.com [dvice.com] has some analysis and the press release.

    The Audio Return thing will allow your display to send audio to your receiver, instead of using a second audio (e.g. optical or coaxial) cable. Why that wasn't there from the beginning is beyond me, since the connection was already bidirectional (to negotiate DRM).

    • Re: (Score:3, Informative)

      by Kizeh (71312)

      The reason I always found when griping that my plasma couldn't send audio, or even output SPDIF, was that it was a DRM restriction imposed on the manufacturers. No clue if that was true, and what might have changed.

      • by kithrup (778358)

        The "Audio Return Channel" should allow that -- the normal HDCP negotiation can go on. Hopefully, this will let you plug HDMI devices into your TV, and have the receiver be able to handle the audio.

        That'll require a new TV and a new receiver of course. sigh

  • Yah but (Score:5, Insightful)

    by Colin Smith (2679) on Saturday May 30, 2009 @11:23PM (#28155127)

    Are they gold plated?

    The TV manufacturers are simply screwing themselves over. They're dreaming. The new standard is going to be a computer screen attached to a PC streaming from youtube or similar.

     

  • ...If those 5 connector types came in 7 different versions...

    One that can only be used on Mondays, One for Tuesdays only.. etc... etc...

    HDMI 1.4 spec = fail...

  • Meh (Score:3, Interesting)

    by PPH (736903) on Saturday May 30, 2009 @11:25PM (#28155143)
    The electronics shop down the road will just come out with a new rev of their HDMI-whatever to DVI converter.
  • Great (Score:5, Insightful)

    by macemoneta (154740) on Saturday May 30, 2009 @11:33PM (#28155205) Homepage

This is the 11th revision of the HDMI specification in the less-than-seven-year life of HDMI. Meanwhile, the 22-year-old VGA connection still works fine, at full HDTV resolution, and with none of the incompatibility or usage restrictions (DRM) that HDMI brings to the table. Um, progress?

  • by upuv (1201447) on Saturday May 30, 2009 @11:54PM (#28155333) Journal

    The most successful products all have the same qualities.

    1. Simple
    2. Ubiquity
    3. Affordable

HDMI is not simple.
Ubiquity? Well, I give it points here. It really was the first popular spec to finally include video and audio on one cable.
Affordable? Not a chance. Ridiculous prices for cables and accessories.

1 out of 3 is not good enough to survive. HDMI is setting itself up to be toppled off its lofty perch.

    Wireless HDMI would rock.
1. It would be simplish. (Some marketing guy would F&*K this up with some screwed-up "we must know what you are broadcasting so we can tap your wallet" scheme.)
    2. Ubiquity. No real restriction here on what is on the channel. So basically everything should work with everything else.
    3. NO HYPER EXPENSIVE CABLES. So that has to help the bottom line.

Oh wait. The wireless HDMI spec is already here. Can anyone say Wireless USB 3?

    • by MrMista_B (891430)

Well, that's just not true. Something like a Ferrari, iPhone, Xbox, or such is just about the opposite of simple, ubiquitous, or affordable, yet each of those items is tremendously successful. That's 0 out of 3, and all of those are surviving quite well while ignoring your thesis.

    • HDMI really wasn't complicated. This time there are silly variants that shouldn't be there.

      HDMI doesn't have to be expensive, it's not a fault of the standard, it's just marked up a lot by retailers. It just takes a quick Google search to find far cheaper alternatives that do the job just as well. I get them for something like $5 a piece on monoprice (Google hit #1). The Monoprice cables work just fine. Amazon has cheap HDMI cables that are reputable too. The problem there is that B&M retailers ar

    • by ceoyoyo (59147) on Sunday May 31, 2009 @12:48AM (#28155639)

      I thought six bucks for a fifteen foot cable was quite reasonable. You're not paying the extortionate prices for cables in the retail store, are you?

      As for wireless HDMI, no, no, no. That's just what we need. Some huge bandwidth hog spewing unnecessary interference all over the little bit of spectrum we've got just because you find plugging one end of a cable into your blu-ray player and the other into your TV too confusing. Save the wireless for things that actually benefit from being wireless.

    • Oh wait. The wireless HDMI spec is already here. Can anyone say Wireless USB 3?

      Please. HDMI 1.3 (single link) is 10.2 Gb/s.

      USB 2.0 and Wireless USB 1.0 offer just 480 Mb/s.
      Wireless USB 1.1 expands this to 1 Gb/s.
      USB 3.0 is faster at 4.8 Gb/s, but that is still wired, and still less than half of HDMI 1.3's bandwidth.

      NO HYPER EXPENSIVE CABLES. So that has to help the bottom line.

      You don't have to buy from monster cable.

  • by ISurfTooMuch (1010305) on Sunday May 31, 2009 @12:00AM (#28155381)

    Forgive me for not having kept up with the progress of HDMI, but wouldn't it have made infinitely more sense to have simply used gigabit Ethernet for all this? The data is all digital anyway, and networking technology is quite mature, so why did these folks feel the need to reinvent the wheel? Right now, you have to worry about whether your new TV will have enough HDMI inputs for the devices you have or might get later, or you need to get an HDMI switcher. With Ethernet, you just connect everything to a switch or router, and you're all set. One connection per component is all you need, and, if you use a router, everything immediately gets connectivity to the home network or Internet. And if a new component comes out that needs to talk to another component in a different way or using more bandwidth, that can all be handled in the firmware. As long as you don't flood the local network with more data than it can handle, everything is fine, and the rest of the networked devices, including the router and cables, can stay exactly the same.

    Or did someone in the entertainment industry worry that using Ethernet for connecting entertainment devices would make it too easy for those evil hacker types to connect a computer to the setup and break their DRM? Or maybe that if this gear was too easily networked, we might...GASP!...use it to send video from our Internet-connected computers out into the living rooms, undermining traditional TV?

    • by Zero__Kelvin (151819) on Sunday May 31, 2009 @12:15AM (#28155481) Homepage

      " ... so why did these folks feel the need to reinvent the wheel?"

      To sell more wheels.

    • With A/V applications there are latency issues to be cognizant of. The big issue is keeping audio and video in sync. The best-case scenario is to stream the audio and video together to one destination; if the audio and video go to two locations, latency becomes an issue. Even then you have to deal with interactive latency for things like gaming consoles. With an Ethernet-based implementation there is nothing to prevent somebody from routing separate data streams through disparate switches a

      • Er, that's why they add in GPS, weather stations, and some kind of visible spectrum laser radar. All of the problems you speak of can be rendered entirely obsolete. Each device will know where it is in relation to every other, each will know the atmospheric pressure and temperature so it can calculate the speed of sound and adjust to your position instantly (embedded locator beacon surgery sold separately), and it'll be able to compensate for any distortions in the speed of light as an added bonus.

    • Re: (Score:2, Insightful)

      by johncandale (1430587)
      It's all about the DRM: they had to make something that would fool the industry players into thinking the DRM worked. The only special thing about HDMI is that, to be HDMI compliant, a device has to send and receive a signal confirming the line is not being tapped. It's all completely silly; it's quite easy to order non-compliant HDMI recorders that only pretend to play along, because God forbid you buy a movie and make a backup. To get Hollywood to release their movies on Blu-ray, the Blu-ray makers
    • by ceoyoyo (59147)

      Since HDMI bandwidth starts at around 4 Gbit/s and goes up from there, gigabit Ethernet might fall slightly short.
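      A quick back-of-the-envelope check bears this out. A rough sketch in Python (the numbers are the usual 1080p60, 24-bit figures; the helper name is made up):

```python
# Does uncompressed 1080p60 video fit in gigabit Ethernet?
# Raw pixel rate only -- a real HDMI link also carries blanking intervals,
# audio, and coding overhead, so the true figure is even higher.

def video_bandwidth_bps(width, height, fps, bits_per_pixel):
    """Raw uncompressed video bandwidth in bits per second."""
    return width * height * fps * bits_per_pixel

raw_1080p60 = video_bandwidth_bps(1920, 1080, 60, 24)  # 24-bit RGB
gigabit_ethernet = 1_000_000_000  # line rate, before framing overhead

print(f"1080p60 raw video: {raw_1080p60 / 1e9:.2f} Gbit/s")  # 2.99 Gbit/s
print("fits in GigE:", raw_1080p60 < gigabit_ethernet)       # False
```

      Roughly 3 Gbit/s of raw pixels against a 1 Gbit/s pipe, before any overhead: gigabit Ethernet falls short of even the baseline HDMI format.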

    • Or did someone in the entertainment industry worry that using Ethernet for connecting entertainment devices would make it too easy for those evil hacker types to connect a computer to the setup and break their DRM? Or maybe that if this gear was too easily networked, we might...GASP!...use it to send video from our Internet-connected computers out into the living rooms, undermining traditional TV?

      Bingo. HDMI is a giant sham.

    • by evilviper (135110) on Sunday May 31, 2009 @02:32AM (#28156125) Journal

      Forgive me for not having kept up with the progress of HDMI,

      ...nor having the most basic knowledge of the topic at hand, correct?

      wouldn't it have made infinitely more sense to have simply used gigabit Ethernet for all this?

      Oh, my, yes. When transferring up to 10.2 Gigabits/sec of uncompressed HD video at high frame rates from your Blu-ray player to your TV, what you really want is a 1 Gigabit/sec standard designed for unreliable communications over 100-meter distances, using a shared channel, with LOTS of overhead and very high computational requirements...

      Your insight is... stunning.

    • by TopSpin (753) * on Sunday May 31, 2009 @03:23AM (#28156355) Journal

      Or did someone in the entertainment industry worry that using Ethernet for connecting entertainment devices would make it too easy for those evil hacker types to connect a computer to the setup and break their DRM?

      Forgive me; I'm going to offer something other than 0MG iT'5 Ev1L DEE-ARE-EM zors.

      Ethernet is slow. 10Gbit Ethernet is still exotic and costly. Gigabit Ethernet is much too slow for uncompressed digital video, and gigabit phys cost more than high-volume TV manufacturers will tolerate. An HDMI phy manufactured in 2003 sources or sinks 4.9 Gbit/s, and HDMI 1.3 doubled that to 10.2 Gbit/s. Basically, HDMI provides an order of magnitude more bandwidth than the sort of common Ethernet you have in mind. Uncompressed digital video and audio (what HDMI does) requires immense bandwidth.

      Ethernet is designed for the LAN use case. Consider the 100m reach copper Ethernet is built around. That distance is desirable because it covers a large percentage of facilities where LANs exist without additional infrastructure. Among other things, signal frequency and copper (read: cheap) cable construction are both bound by it. HDMI has no such requirement and thus does not incur the cost of achieving it.

      HDMI clearly distinguishes between sources and sinks and has different expectations of each. Your digital TV will never suddenly begin transmitting Gbits of data someone will wish to render. It is exclusively a sink. Ethernet doesn't make provision for this sort of asymmetry which means both ends are peers and both suffer a certain minimum amount of complexity (read cost) because of it.

      Ethernet is overly robust for digital TV. There are no packet collisions between your cable box and your TV. While HDMI does provide for error detection and correction, the remedy is radically different from what occurs on a LAN (usually retransmission). The bad data is simply dropped: the moment has passed, and whatever pixels or audio samples were corrupted are replaced by new bits before you perceive it (hopefully). Here is some language from HDMI 1.3, section 7.7:

      The behavior of the Sink after detecting an error is implementation-dependent. However, Sinks
      should be designed to prevent loud spurious noises from being generated due to errors. Sample
      repetition and interpolation are well known concealment techniques and are recommended.
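      The sample-repetition concealment the spec recommends is simple to sketch. A hypothetical Python version, assuming corrupted audio samples arrive flagged as None (both the function name and the flagging convention are made up for illustration):

```python
# "Sample repetition" concealment: instead of retransmitting (the LAN
# remedy), repeat the last good sample so no audible glitch escapes.

def conceal_by_repetition(samples):
    """Replace corrupted (None) samples with the previous good one."""
    out, last_good = [], 0  # assume silence before the first sample
    for s in samples:
        if s is None:              # corrupted sample detected
            out.append(last_good)  # repeat previous sample, no retransmit
        else:
            last_good = s
            out.append(s)
    return out

print(conceal_by_repetition([3, 5, None, None, 7]))  # [3, 5, 5, 5, 7]
```

      The point is that the sink repairs the stream locally and in real time; nothing ever asks the source to send the bits again.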

      You wouldn't need to read many IEEE 802.whatever documents to see just how far computer networking is from the design of HDMI. It is an entirely distinct use case.

      Finally, HDMI provides timing guarantees that are totally absent in Ethernet. Devices are made cheaper through accurate timing (your TV doesn't need a large high-speed buffer, for instance). Recently, so-called "Data Center Ethernet" has emerged to address this, so that Ethernet can be used in latency-sensitive applications. HDMI had this baked in on day one.
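      To see why timing guarantees make hardware cheaper, compare the buffer a sink needs to absorb delivery jitter. The jitter figures below are illustrative assumptions, not numbers from either spec:

```python
# Buffer (in bytes) a sink needs to ride out worst-case delivery jitter:
# link rate (bits/s) * jitter (s), converted to bytes.

def buffer_bytes(link_bps, jitter_seconds):
    return link_bps * jitter_seconds / 8

# Isochronous HDMI-style link: microsecond-scale jitter -> a few hundred bytes
print(buffer_bytes(4_900_000_000, 1e-6))   # 612.5
# Best-effort Ethernet: tens of milliseconds of jitter -> megabytes
print(buffer_bytes(1_000_000_000, 20e-3))  # 2500000.0
```

      A few hundred bytes of SRAM versus megabytes of buffer memory is exactly the kind of cost difference a high-volume TV design cares about.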

      Some people are convinced that DRM is the only conceivable reason for creating HDMI and that all other claims offered are a smokescreen. That's the fashion around Slashdot, anyhow. Don't believe it. Those folks don't know what digital TV is.

      • Re: (Score:3, Insightful)

        by markov23 (1187885)
        Agreed on why this isn't Ethernet -- not agreed on why it exists. HDMI is really all about copy protection. I have a custom A/V business, and we commonly ran long component runs with great results -- now we have HDMI, a monitor spec with copy protection. The problem is that it's really the old DVI spec -- which never expected the monitor to be more than a few feet away -- so when we do long runs (100 ft, 200 ft, etc.) we are now spending > $500 for a piece of cable, we need to put in repeaters and
  • by wiredlogic (135348) on Sunday May 31, 2009 @12:04AM (#28155407)

    It amazes me how much the proles gobble this shit up when *gasp* analog component video is perfectly capable of handling high-bandwidth video without all the incremental upgrades to a poorly-thought-out spec. Remember, a VGA cable (not quite as good as separate coax) can carry higher resolutions and refresh rates than 1080p/60, and it could all be achieved on an early-to-mid-'90s PC with a high-end video card.

    • Re: (Score:3, Insightful)

      by ISurfTooMuch (1010305)

      As I recall, early HD gear used just that, but the powers-that-be got worried that component video didn't do DRM, so those nasty evil pirates would have a far too easy time copying the video, so they decreed that any device that could output 1080 must do it only via HDMI, which supports HDCP. The side effect of this was that the early adopters who spent the really big bucks for HDTV sets got royally screwed, since no new gear that output video over 720p would connect to them. No 1080i/p cable or sat boxes

    • It amazes me how much the proles gobble this shit up when *gasp* analog component video is perfectly capable of handling a high bandwith video without all the incremental upgrades to a poorly thought out spec.

      The "proles" gobble up the fact that one HDMI wire is neater and simpler than red-green-blue component plus optical, digital coax, or analog audio. Especially with the last choice, you have five plugs all the same shape going into five sockets right next to each other, and two of them are the exact same color.

      Just because you think something sucks doesn't mean it's utterly without merit.

      • by jedidiah (1196)

        He's not talking about a bunch of color-coded RCA cables; he's talking about the ancient VGA cabling spec, which is perfectly suitable for the end-user consumer rube.

    • Re: (Score:3, Informative)

      plus, there is no 'back channel' in component analog video. therefore, no drm! backchannel = evil.

      hdmi has that handshake crap going on. it causes me (and a lot of others) to have to reboot our tvs (really). if you power off the tv and then power it back on together with a stereo (a switched AC outlet is the use case), the tv loses 'link' with the hdmi peer (in my case a popcorn hour hdmi sender) and the tv needs bouncing before you get any hdmi signal!

      you get NONE of these problems with analog comp-out wiring. yo

  • How many TVs, cable/sat boxes, sound amps, DVD/Blu-ray players, game systems, and PCs/video cards will even support all of this, and will you have to check what the box can do as well as the cable?

When the weight of the paperwork equals the weight of the plane, the plane will fly. -- Donald Douglas
