Television | Graphics | Hardware

3D HDMI Specification Is Set Free

Posted by kdawson
from the seeing-double dept.
An anonymous reader writes "The licensor of the HDMI specification has announced the intent to 'secure the application of 3D' by making the 3D portion of the HDMI 1.4 Specification available for public download, as well as extracts from the upcoming HDMI 1.4a. While the spec includes a 3D component, apparently not everyone has decided to sign up to adopt it. Given the developments happening in DisplayPort v1.2, the next year in displays looks like it will be an interesting one."
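For a sense of the numbers in the newly public 3D portion, here is a back-of-envelope pixel-clock calculation for HDMI 1.4's mandatory 1080p24 "frame packing" 3D format, assuming the standard CEA-861 timing totals (2750 x 1125 per frame, blanking included); the spec itself defines the exact layout:

```python
# Pixel clock needed for HDMI 1.4 "frame packing" 3D at 1080p24, assuming
# the standard CEA-861 totals (2750 x 1125 per frame, blanking included).
# Frame packing stacks the left- and right-eye images vertically, which
# doubles the effective vertical total.
H_TOTAL = 2750  # total pixels per line, incl. blanking
V_TOTAL = 1125  # total lines per frame, incl. blanking
FPS = 24

clock_2d = H_TOTAL * V_TOTAL * FPS        # plain 2D 1080p24
clock_3d = H_TOTAL * (V_TOTAL * 2) * FPS  # L + R stacked vertically

print(f"2D 1080p24: {clock_2d / 1e6:.2f} MHz")  # 74.25 MHz
print(f"3D packed:  {clock_3d / 1e6:.2f} MHz")  # 148.50 MHz
```

The doubled total lands on the same 148.5 MHz TMDS clock as ordinary 1080p60, which is comfortably within what existing HDMI links already carry.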
This discussion has been archived. No new comments can be posted.


  • by iCantSpell (1162581) on Monday February 08, 2010 @06:14AM (#31059218)
    People are just now using HDMI. No one is moving anytime soon.
    • No cable or sat box has DisplayPort anyway.

    • by ultranova (717540)

      People are just now using HDMI. No one is moving anytime soon.

      Maybe. Then again, adding an extra connector to a graphic card isn't exactly expensive. Mine has DVI, HDMI and Displayport.

      Anyway, what I'd really like to see in a future standard would be variable refresh rate. That is, instead of updating the image 60 or 75 or 100 times per second, send the next image when it's ready. That way, when you're watching a PAL movie, your display shows 25 frames per second; when an NTSC one, 30; and when you're playing a game, frames go out as fast as they're rendered.
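A quick sketch of the judder that motivates this idea, using nothing beyond the standard 60 Hz refresh and 25 fps PAL rates (exact arithmetic via Fraction to avoid floating-point edge cases):

```python
# On a fixed 60 Hz display, each 25 fps source frame is held until the next
# refresh tick, so on-screen hold times alternate between 2 and 3 refresh
# intervals instead of being a steady 40 ms.
from fractions import Fraction

REFRESH = Fraction(1, 60)                             # one refresh interval, seconds
frame_times = [Fraction(i, 25) for i in range(26)]    # ideal 25 fps schedule
present = [-(-t // REFRESH) * REFRESH for t in frame_times]  # ceil to next tick
holds_ms = [round(float((b - a) * 1000), 2)
            for a, b in zip(present, present[1:])]

print(sorted(set(holds_ms)))  # [33.33, 50.0] -- never the ideal 40 ms
```

A variable-refresh link would simply present each frame at its ideal time, making every hold exactly 40 ms.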

      • Maybe. Then again, adding an extra connector to a graphic card

        Perhaps not, but adding one to your TV is a more costly venture.
    • DisplayPort is more compatible and extensible. Check the specs. I was actually wondering whether you can extend its micro-packet protocol to carry HyperTransport data flows. Anybody know the specs well enough to answer?
  • Truly (Score:4, Funny)

    by Lorkki (863577) on Monday February 08, 2010 @06:16AM (#31059228)

    2011 will be the year of DisplayPort on the desktop!

    ...what?

    • Re:Truly (Score:5, Insightful)

      by fuzzyfuzzyfungus (1223518) on Monday February 08, 2010 @07:51AM (#31059546) Journal
      Actually, it has already been the year of DisplayPort on the desktop for a year or two now. The basic cheap Dell business boxes we buy as bulk word processing/web/email/light-other-stuff machines all come with it (soldered right onto the motherboard, not some option card).

      Curiously, the year of DisplayPort on the Monitor does not seem to have arrived yet. The monitors that come paired with these machines are VGA/DVI only, and you have to go a fair ways higher up the line before you get a DisplayPort option. I don't know if Dell just refreshes their desktops much more often than their monitors, or whether they have some dream that everybody is going to start buying 22" widescreens for their office drones...
      • Re: (Score:3, Insightful)

        by Ost99 (101831)

        22" widescreens are practically free.
        No matter the size of your company, if you're buying new displays smaller than 22" for people you pay to work for you, you're wasting money. Big time.

        The difference between a Dell 17" and a 23" wide screen is about $20 (not US prices). Increased productivity will pay for that $20 in the first hour/week/month (depending on office-drone wages).
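The payback arithmetic is easy to run; the wage and productivity figures below are illustrative guesses, not numbers from the comment:

```python
# Hours until a $20 monitor upgrade pays for itself, given a fully loaded
# hourly labour cost and an assumed productivity gain. Both inputs are
# illustrative guesses.
price_delta_usd = 20      # extra cost of the 23" over the 17"
wage_per_hour_usd = 20    # fully loaded office-drone cost per hour
gain_pct = 2              # assume the bigger screen buys back 2% of time

hours_to_payback = 100 * price_delta_usd / (wage_per_hour_usd * gain_pct)
print(f"pays for itself in {hours_to_payback:.0f} working hours")  # 50
```

At those guesses, roughly a work-week, which is consistent with the "hour/week/month depending on wages" range.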

        • Re: (Score:3, Interesting)

          by Anonymous Coward

          Don't forget that the 23" widescreen will consume significantly more power than the 17" monitor.

          I worked at one place where they somehow got a deal on converting 4000 or so 15" LCD monitors to 20" LCD monitors. So a bunch of us spent a long weekend switching everyone to the new 20" monitors. A month later, the company's power bill had fucking skyrocketed. They had us transition back some users to the old 15" screens, to save money...

          The difference in energy consumption is probably quite minimal for a typical setup, though.
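For scale, a rough version of that power arithmetic; the per-monitor wattage delta and tariff are guesses for illustration, not figures from the anecdote:

```python
# Extra monthly energy from a per-monitor wattage increase across 4000 seats.
# The 15 W delta and $0.12/kWh tariff are assumed, not measured.
monitors = 4000
extra_watts_each = 15       # assumed extra draw of a 20" over a 15" LCD
hours_per_month = 10 * 22   # 10 h/day, 22 working days
usd_per_kwh = 0.12

extra_kwh = monitors * extra_watts_each * hours_per_month / 1000
extra_cost = extra_kwh * usd_per_kwh
print(f"{extra_kwh:.0f} kWh/month extra, roughly ${extra_cost:.0f}/month")
```

At those guesses it's on the order of $1,600 a month across 4,000 seats, i.e. pennies per monitor per day; noticeable on a bulk bill, but hardly "skyrocketing".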

          • Re: (Score:3, Informative)

            by L4t3r4lu5 (1216702)
            Nothing to do with the fact that your staff could have two windows side by side, thereby probably doubling CPU load?

            The only time I've felt that my 24" home monitor could be used effectively as a work tool (and not just for games, at least without fiddling with window sizes all the time) is with Win7. Drag the title bar to the side of the screen and it automatically maximises to one half the horizontal width.

            That's just begging for widescreen TFTs in the workplace.
            • Hey! I didn't know that trick! Thanks!

              • You can thank me for that -- Windows 7 was my idea.
              • Win+LeftArrow and Win+RightArrow will do the same... Win+UpArrow == fullscreen, Win+DownArrow == minimize. My wife will do the bills with Quicken on the left and the browser on the right, same when she balances/reconciles the accounts. I find it less useful for me, preferring dual monitors for work.
            • by Mr. DOS (1276020)

              I got dual 21.5" monitors (1920x1080) for my workstation a few months ago. While it had Windows XP on it at the time, I moved to Windows 7 within a day of getting the new monitors just to take advantage of the superior window management. Without good window management capabilities, multiple and high-resolution monitors can cause more frustration than they're worth.

              FWIW, the Windows 7 window management “killer feature” for me was the ability to drag maximized windows between monitors without first restoring them.

            • As a mac user, I've found it a significantly useful and innovative feature (a friend with Win7 showed me)... so I looked, and there is an app for Mac OSX called Cinch [irradiatedsoftware.com] that replicates the feature pretty elegantly.

              I guess this is proof that Win7 is a bit more than Vista SP2.

        • by cb95amc (99589)

          What happens when half your staff set the resolution to 800x600?

        • Re: (Score:1, Insightful)

          by Anonymous Coward

          The difference between a Dell 17" and a 23" wide screen is about $20 (not US prices).

          Okay, I'll bite..... what prices are they then? Zimbabwe dollars?

          Seriously, you've specified a currency value, and then said that it isn't US dollars. What good is that to anyone?

      • by iivel (918436)
        I have dual 30" monitors side-by-side running at 2560x1600px each (don't ask about the video card cost) at the office ... and would personally have a hard time going back to my old quad 17" setup.
  • Sure, you can download part of the spec. But it has a restrictive license (and it's only valid for one year; after that it will self-destruct), and it is actually quite useless without the rest of the HDMI spec.

    • Re: (Score:3, Funny)

      by mb1 (966747)

      and it is actually quite useless without the rest of the HDMI spec.

      Unless you order the Professional Edition Deluxe Diamond Tipped Laser Nanotube 9000XT HDMI Spec Reader Cable available from Monster.

      Then you're golden.

    • Exactly. The headline is bogus. I didn't see anything about public domain in there. This is just an attempt to set the hooks even deeper. A feeble attempt to make them look like one of the good guys. Oh well, the market has spoken.

  • by l2718 (514756) on Monday February 08, 2010 @06:35AM (#31059296)

    Very nice of them to allow us to read the spec. Now what about the patents? And the rest of the HDMI spec on which this piece depends?

    If you can't implement the standard, what good will it do you to be able to read it?

    • Re: (Score:2, Insightful)

      by Sehnsucht (17643)

      Not only that, but what they need to do is drive consumer interest. Consumers don't care about openness, they want shiny stuff cheap. It might appeal to some of the more tech-savvy of us, but it's not like you can go and implement your own HDMI gear to tinker with it.

    • by Joce640k (829181) on Monday February 08, 2010 @06:59AM (#31059378) Homepage

      You'll be able to have a massive nerdgasm imagining owning your very own Blu-ray copy of Avatar as you read it.

      They won't release the 3D version right away of course. Oh no... First it will be the 2D theatrical version, then the 2D extended version, then the 3D theatrical version, then the 3D extended version ...the 3D director's cut where Jake and Neytiri plug their hair together in the love scene {ooooh!} and *finally* the R-rated 3D extended director's cut with topless Na'vi. All versions will also be sold as boxed sets with collectible blue plastic dolls.

      • by timmarhy (659436)
        You missed the part where all those versions end up on BitTorrent, and the MPAA complains that the billions it made out of the Avatar franchise isn't enough to feed the starving actors.
        • by speedlaw (878924)
          Well yes, but apart from the big names who got up-front payments, the actors will starve. Read about the studios vs. the unions at some point.
      • Huh? When did Lucas acquire the HDMI patents?

      • by lennier (44736)

        the 3D director's cut where Jake and Neytiri plug their hair together in the love scene {ooooh!}

        We jacked, straight across. ....
        Ordinarily I get the raw material in a studio situation, filtered through several million dollars' worth of baffles, and I don't even have to see the artist. The stuff we get out to the consumer, you see, has been structured, balanced, turned into art. There are still people naive enough to assume that they'll actually enjoy jacking straight across with someone they love. I think most teenagers try it, once. Certainly it's easy enough to do; Radio Shack will sell you the box

    • by Twinbee (767046) on Monday February 08, 2010 @07:11AM (#31059442) Homepage

      Although patents can be bad, isn't adding an HDMI port to a device very cheap for manufacturers to do?

    • by tlhIngan (30335)

      Very nice of them to allow us to read the spec. Now what about the patents? And the rest of the HDMI spec on which this piece depends?

      If you can't implement the standard, what good will it do you to be able to read it?

      You're supposed to join the HDMI consortium if you want to implement HDMI. It's a non-free group to join, like SD Card Association, Blu-Ray Association, DVD Forum, etc. There are free groups you can join, like the USB Association, PCI (note: specs cost $$$), and many others. Of course, some groups

  • Set free (Score:3, Interesting)

    by PCM2 (4486) on Monday February 08, 2010 @06:53AM (#31059352) Homepage

    Slashdot editing is so inconsistent. Is that set free as in "turned loose"? Or set free as in "nobody owns one"?

  • Stereoscopic, (Score:2, Informative)

    by Anonymous Coward

    not 3D. You insensitive clods.

    • Re: (Score:1, Funny)

      by Anonymous Coward

      It's 4D!

      Two 2D pictures = one 4D picture.

  • HDMI mess (Score:5, Insightful)

    by timmarhy (659436) on Monday February 08, 2010 @07:01AM (#31059388)
    Yes, because the consumer is going to know the difference between HDMI 1.1, 1.2, 1.3 and 1.4.

    Between DLNA, HDMI and the 3D craze that's coming, I'm predicting lots of ripped-off people. Consumer electronics in 2010 is going to be a minefield.

    • by vegiVamp (518171)

      More of a minefield than it usually is, with "gold-plated cables for better digital image quality" at half an arm per running metre?

      • Re:HDMI mess (Score:4, Insightful)

        by timmarhy (659436) on Monday February 08, 2010 @07:20AM (#31059458)
        Yes. Have you tried anything to do with DLNA? At first look it seems to be file sharing reinvented in a crappy way, but then you realise it's actually slow-ass file sharing which doesn't work very well, and a lot of the time you can't FF or RW during streaming.

        I've got 2 DLNA devices; both of them behave in totally different ways and neither one is what I would call satisfactory. The only saving feature of DLNA is transcoding, but since that is hit and miss, it doesn't save it.

    • Re:HDMI mess (Score:4, Insightful)

      by Opportunist (166417) on Monday February 08, 2010 @07:39AM (#31059514)

      Someone hand that guy an insightful mod.

      2010 will be the year of a lot of new buzzwords and add-on words to various standards. Watch out for "enhanced", "empowered", "true", "extended" and whatever other buzzwords the marketdroids come up with to sell their outdated, past-standard, not-quite-compatible-anymore junk. I'm still struggling with "HD ready" and "full HD" and HD-whatever the buzzword of the day is.

      *whimper* I just wanna watch TV!

      Waitasec... why? There's nothing on worth my time.

      • Re: (Score:3, Insightful)

        by L4t3r4lu5 (1216702)
        I don't have a console or a TV. My PC plays DVDs, games, and TV shows just fine on a 24" TFT.

        FWIW, I've had higher-than-HD resolution in almost all of my games for around two years, for significantly less money than an HDTV and a console.
        • by BESTouff (531293)
          Same thing here. I'd love to buy one of those 65" TV sets, but without the tuner and multimedia crap; just a biiig monitor with one DVI or HDMI or DisplayPort plug. Please!
        • by PitaBred (632671)

          But it's also only 24". As long as you're the only person using it, sure, that's fine. But what about those of us with families and friends IRL? Watching the 24" computer screen on your desk isn't quite as nice as being able to sit on a couch, not to mention multiplayer games on the same screen, which the PC doesn't do very well at all. If it does, it's just by running a console emulator.

          • But what about those of us with families and friends IRL?

            My mate is a bank manager, and he has the 52" TV. Someone brings their 360, someone a PS3, I provide the beer, someone else brings the pizza. The weekend becomes a haze.

            He has the biggest living room and is centrally located to all of his mates. Thinking like a geek, it's a logical progression.

            • by PitaBred (632671)

              The point is that SOMEONE has the HDTV ;) The 24" is great if you're a single guy in a small place, especially when you can go elsewhere for the group gaming. It's not as acceptable for a family, friends, etc.

        • My computer is in my office, not my living room. I don't want to sit in an office chair and watch movies.

    • Re:HDMI mess (Score:5, Insightful)

      by ledow (319597) on Monday February 08, 2010 @08:40AM (#31059680) Homepage

      I have a hard enough time convincing people they need to re-buy cables (and peripherals) for their new TV as it is.

      "You need an HDMI cable"
      "But I already have this SCART thing and this composite thing"
      "Yeah, but you only have HDMI on your new TV"
      "Is that because it's HD?"
      "Well, no, you can send an HD signal over SCART or composite just the same, but they just don't want to let you. They want you to buy HDMI leads and TVs and equipment with HDMI."
      "Who's they?"
      "The people who license the HDMI technology."
      "Er... so I have to throw away my DVD player unless I pay extra to get legacy ports too?"
      "Or buy a Blu-Ray with HDMI or a newer player with HDMI. The new ones upscale the DVD so it *looks* like HD but isn't really."
      "Mmm..."

      And then add an hour of conversation as you explain the various *revisions* of HDMI and everything else, and why they can't just buy a £10 signal-splitter or cable-switcher without it potentially interfering with their recording of HD programmes, or why some models just won't negotiate a HD signal with some other models, or why the cheap, shit imported versions of DVD players and Blu-Ray let you just use a composite output, or why all this was to stop pirates when you can find and download HD-anything online in the same time as you used to be able to download SD content.

      Call me when consumers get bored of this crap. Then I might have a look and see if there's a *standard* (i.e. unchanging, common, open, useful) cable set I can use to watch TV and record the stuff I want. To be honest, there already is - it's called "ADSL over a phone line from a widescreen laptop".

      • Re: (Score:3, Insightful)

        by Arcady13 (656165)
        I have a hard enough time convincing people they need to re-buy cables (and peripherals) for their new TV as it is.

        It wouldn't be hard to convince them to buy the HDMI cable if stores would stop charging $50-$100 for something that nobody tells them costs $8 online.
        • by hazydave (96747)

          Yeah, that's a crime... and if you're paying $8.00 online, you're getting a pretty long cable, or you're overpaying. Try Monoprice.com... they even do volume pricing.

          There's a simple reason for the $100+ cables at Best Buy -- they have determined customers are smart on some issues, stupid on others. They will price shop to get the best deal on the HDTV. But once they're getting that, they go slightly insane, and spend hundreds on accessories, like $100 cables (actually, Best Buy had one for over $130 last time I checked).

      • by PitaBred (632671)

        Ugh. At least tell them that HDMI is much easier to use than that SCART crap. Seriously... that thing is a behemoth relic. I do agree that the splitter stuff is nonsense, but that's not the fault of the cable per se. It's the fault of the studios. Tell your friends/customers that the reason they can't get a cheap HDMI splitter or switcher is because the movie studios think they're criminals. Lay the blame where the blame is due.

      • I have a hard enough time convincing people they need to re-buy cables (and peripherals) for their new TV as it is.

        Well, given a good bit of the dialogue you posted, I can see why. Most of the dialogue past the fourth line sounds suspiciously like you're either:

        a) intentionally trying to confuse them;

        b) trying to impress them with your knowledge (which doesn't preclude "a)"; or

        c) hoping to proselytize them into your way of looking at things, but doing a piss-poor job of it.

        I mean, c'mon! How much of the following is actually necessary?

        "You need an HDMI cable"
        "But I already have this SCART thing and this composite thing"
        "Yeah, but you only have HDMI on your new TV"
        "Is that because it's HD?"
        "Well, no, you can send an HD signal over SCART or composite just the same, but they just don't want to let you. They want you to buy HDMI leads and TVs and equipment with HDMI."
        "Who's they?"
        "The people who license the HDMI technology."
        "Er... so I have to throw away my DVD player unless I pay extra to get legacy ports too?"
        "Or buy a Blu-Ray with HDMI or a newer player with HDMI. The new ones upscale the DVD so it *looks* like HD but isn't really."
        "Mmm..."

      • Re: (Score:3, Funny)

        by Ant P. (974313)

        Last year I found a case where someone had already been "convinced" they needed to buy a £60 cable for their new HD setup. It was a SCART cable. They also had an HDMI cable plugged in which had never been used, until I asked why they had two A/V connections between the same devices.

      • Am I the only geek on the planet who checks for multiple inputs before buying a TV? While I realize that video- and audio- philes will probably disagree with me, I'd say that the TV is the best device to act as the central hub for 99% of consumers out there. It's simple and straightforward for them to figure out where to keep plugging stuff into. Why would anyone ever buy a TV with only one (type of) input?

        For example, I recently replaced my 10 year old Sony WEGA with a 55" Samsung LCD (half price off a

        • by anss123 (985305)
          All TVs I've seen have multiple inputs. We recently bought a cheap one and it had 4 HDMI, 2 SCART, RGB, S-Video, USB, a couple of ports I don't know the purpose of, and - of course - the port we actually use: cable in.
        • One of the first things I do is try to save money by buying a TV with fewer input jacks because I expect my receiver to switch audio and video for me. People who actually use their TV speakers for sound can ignore this comment of course.

        • by hazydave (96747)

          I'm an A/V wizard (audiophile has such "another one born every minute" connotations) and I totally agree... for the average consumer, the TV makes a great hub. And while the manufacturers do seem to be cutting down on the support for ports in monitors, I have seen no such evidence of this happening in full televisions.

          The one thing I do miss is an audio chain output from the TV. Ok, my current Samsung has a Sony/Philips optical output, but that's just silly on a TV with HDMI inputs -- I need PCMx8 audio chain

        • by ageoffri (723674)
          When I was shopping for a new TV a couple of years ago I didn't care how many inputs it had. I made sure it had an HDMI input. Then I spent time making sure my AV receiver had all the inputs I needed and then some extras. A TV should not be the central control point in a decent home theater setup.
      • by hazydave (96747)

        Oh, settle down... televisions all have analog inputs still: CVBS, Y/C, YPrPb, probably even VGA. Sure, in Europe, you pronounce that "SCART" and just hope your favorite of the many SCART interfaces is implemented on your new piece of kit (don't start with me, I designed video devices in Germany back in the late 1990s... SCART was a mess, even then, even if it did occasionally work well). Hell, I have all of these, as well as the HDMI's, on my dual 24" monitors. At least in the USA, the demand in new TVs is

    • Have you seen how much Best Buy charges for an HDMI cable? Never mind the nonsense of selling Monster Cable HDMI cables to get the "best picture" (never mind that a $5 cable from Monoprice has the same picture). I can't imagine how much more ripping off there's going to be.
      • The $5 monoprice cable has other advantages as well. For instance, they have a variety of lengths to choose from, so you can cut down on the cable clutter by getting one that's just-right. And if you have multiple devices, you can get multiple colors as well, to help with the organization.

  • "Free"? (Score:5, Insightful)

    by perrin (891) on Monday February 08, 2010 @07:02AM (#31059398)

    The document has an EULA. While that is bad enough on its own, in it you find this gem: "The term of this Agreement is one year. Agent in its sole discretion may terminate or extend this Agreement at any time and without prior notice. Upon expiration or termination of this Agreement, You shall immediately destroy and cease all use of the Specification Portion and all materials and information related to the Specification Portion." To add insult to injury, they also slap an indemnification clause to the document's EULA.

    So, you agree to not distribute it and to destroy the document after one year. If they are sued for whatever reason, and they can blame it on you, you agree to cover all their expenses. Yay for openness!

  • by G3ckoG33k (647276) on Monday February 08, 2010 @07:42AM (#31059526)
    Despite the success of Avatar, this may be more important for gaming than Hollywood, no? Didn't gaming surpass Hollywood in turnover some time ago?
    • Re: (Score:2, Interesting)

      by aflag (941367)

      Let's see how this "3D hype" goes. I remember when VR was the hype. It was the future of gaming! Heck, the future, period. I'm in the future, where's my virtual world where I can live a better life along with my virtual friends without ever leaving home? Where's that world of wonder where everything is possible and I'm a superman? Where's -- oh, wait, nevermind.

      • by delinear (991444) on Monday February 08, 2010 @09:20AM (#31059824)
        You know, what you're describing as "virtual reality" is really just the interface. It's like calling your monitor a first person shooter. The actual virtual reality is the programmed world, and sure we were promised virtual reality immersive headgear which didn't really transpire (augmented reality is the new virtual reality, it seems), but the promise of an interactive world certainly came to pass in any number of online multiplayer games.
        • by gaelfx (1111115)
          Correct me if I'm wrong, but wouldn't that make any kind of internet based application a form of virtual reality? Are we not participating in such a phenomenon already? I mean, I sign on to Skype or AIM or IRC or whatever and I'm participating in a world that is, for most intents and purposes, a programmed world that is reasonably separated from what is traditionally considered reality? I don't have to be the person I am in reality when I sign on to the internet, I am capable of participating in a community
          • by Dunbal (464142) *

            but wouldn't that make any kind of internet based application a form of virtual reality?

            The problem with THIS "virtual reality" is that it seems intent on selling you viagra or enlarging your penis.

          • by dkf (304284)

            Correct me if I'm wrong, but wouldn't that make any kind of internet based application a form of virtual reality? Are we not participating in such a phenomenon already?

            Yes. Prosaic, isn't it?

      • by Nidi62 (1525137)

        where's my virtual world where I can live a better life along with my virtual friends without ever leaving home? Where's that world of wonder where everything is possible and I'm a superman?

        It's called LSD. They tried it back in the 60s. Didn't really work out so well.

      • by Ash-Fox (726320)

        Where's that world of wonder where everything is possible and I'm a superman?

        Provided you have a decent computer that can handle content that is not prerendered (due to the dynamic nature of such a virtual world, it's not really possible to prerender), you could try Second Life [secondlife.com].

    • by MrNemesis (587188)

      I sincerely fucking hope not; I'm one of those annoying people that can't see 3D without concentrating, with the result that I get a splitting headache from pretty much any enforced stereoscopy after a minute or two; similarly, I can't see magic eye images. Last I read, something like 10% of the western population have this defect. It's sometimes correctable, but I was advised against it; I was told I'd likely lose my 25/20 and 20/20 vision.

      I've not been able to see Avatar anywhere since there are no cinemas nearby.

      • Where we're going we don't need roads^H^H^H^H^Hgoggles. Lenticular 3D displays are currently available, mostly for promo use. They are power hungry, have a small sweet spot, and need a beowulf cluster to drive them. They may get better with time.

      • Magic eye images depend on being able to relax your eye muscles. Stereo input depends on your brain not trying to change your focal plane to match your depth perception.

        They're two different issues, and while I empathize, I can't help but wonder where we'd be if all video games had to cater to all vision deficiencies, like the colour-blind (and they make up a good percentage of the population).

  • by HockeyPuck (141947) on Monday February 08, 2010 @11:18AM (#31060672)

    I just want DisplayPort implementations in laptops to carry audio as well. Even the current Lenovo W500 laptop, which comes with a DisplayPort, does not transfer audio over it, even if you buy the Blu-ray option for the laptop. It just pisses me off that there's no way to get high quality audio out of this laptop. Even Lenovo admits [lenovo.com] this.

    I'm just sick of these ever evolving home theater standards. There was a time when most home receivers didn't support DTS and dolby digital and so if you had a DVD which was only encoded in one, and your receiver didn't support it, you were out of luck.

    • Many PC laptops, and all recent MacBooks (Pros), have TOSLINK optical outputs - I'm surprised the W500 is lacking this feature. Consumer TOSLINK/SPDIF does not have enough bandwidth for the uncompressed surround available on some movies, but it is sufficient for DD5.1, DTS, etc.

      It is unfortunate that you bought a laptop that didn't meet your needs.
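The bandwidth point is easy to check: S/PDIF (electrical or TOSLINK optical) is sized for a single stereo PCM stream, so anything beyond two channels has to arrive pre-compressed. A sketch with the standard rates:

```python
# S/PDIF's usable payload is roughly one 2-channel PCM stream. Compressed
# bitstreams (DD 5.1 at 640 kbps, DTS at 1.5 Mbps) fit inside that payload;
# raw multichannel PCM does not.
def pcm_bits_per_second(channels, sample_rate_hz, bits_per_sample):
    return channels * sample_rate_hz * bits_per_sample

stereo_capacity = pcm_bits_per_second(2, 48_000, 24)  # what the link is built for
raw_51_pcm = pcm_bits_per_second(6, 48_000, 24)       # uncompressed 5.1

print(f"stereo PCM payload: {stereo_capacity / 1e6:.2f} Mbit/s")  # 2.30
print(f"uncompressed 5.1:   {raw_51_pcm / 1e6:.2f} Mbit/s")       # 6.91
```

Three times the payload for raw 5.1 is why the lossless/uncompressed formats on Blu-ray need HDMI rather than TOSLINK.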

    • I'm just sick of these ever evolving home theater standards.

      Just don't buy into the BS then. There's nothing wrong with component video or VGA and either Toslink or coax for audio. Make sure you buy hardware that supports them and you won't have any troubles connecting things together. There will be no silly resolution restrictions, or DRM handshake dropouts. Everything that's been foisted on us since then has been an attempt to lock us out of handling our media in a convenient way and doesn't add anything new to the user experience.

  • How are they going to display the 3D data?

    Or is it a typo, and they meant stereo 2D?

    • by TheSync (5291)

      Or is it a typo, and they meant stereo 2D?

      It is stereoscopic 3D. There is a format developed by Philips called Declipse [inist.fr] that provides 2D + depth + occlusion + transparency data, which could drive a useful multi-view, non-glasses-based 3D display.

    • by TheSync (5291)

      Update: I just downloaded the spec, and it does contain the ability to transmit "L+Depth", which I believe is a 2D+depth representation that could be used by non-glasses-based multiview displays, but it will probably not look too good on such a display without occlusion or transparency data.
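A minimal sketch of how a display might use such an "L + Depth" signal: depth-image-based rendering with a z-buffer, shifting nearer pixels further. The function name, array shapes and disparity scale are illustrative, not from the spec; without occlusion data, disoccluded areas simply become holes:

```python
# Synthesize a second view from "L + Depth" by shifting each pixel
# horizontally by a disparity proportional to its depth. A z-buffer makes
# nearer pixels win; unfilled target pixels are left as holes (zeros).
import numpy as np

def synthesize_right_view(left, depth, max_disparity=8):
    """left: (H, W) grayscale; depth: (H, W) in [0, 1], 1.0 = nearest."""
    h, w = left.shape
    right = np.zeros_like(left)   # holes stay 0 where nothing lands
    z = np.full((h, w), -1.0)     # z-buffer for the synthesized view
    for y in range(h):
        for x in range(w):
            nx = x - int(depth[y, x] * max_disparity)  # nearer -> bigger shift
            if 0 <= nx < w and depth[y, x] > z[y, nx]:
                right[y, nx] = left[y, x]              # nearer pixel wins
                z[y, nx] = depth[y, x]
    return right

# Tiny demo: a 4x16 gradient whose right half is "near" the viewer.
left = np.tile(np.arange(16, dtype=np.uint8), (4, 1))
depth = np.zeros((4, 16))
depth[:, 8:] = 1.0
print(synthesize_right_view(left, depth)[0])
# The near half slides left over the far half; the vacated columns are holes.
```

The occlusion and transparency layers TheSync mentions exist precisely to fill those holes with real image data instead of guesses.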

  • Why can't everyone from the computer camp, the home entertainment camp and elsewhere come together and create one unified set of next-generation display standards for everything, that both the CE guys and the computer guys could support?

    What makes DVI and Display Port and other "computer" technologies better than HDMI for computer use?

    • by hazydave (96747)

      DVI was better than HDMI in some computer uses, simply because it was a transitional form... it can carry analog and digital signals. HDMI is digital-only. In fact, HDMI is signal compatible with DVI... HDMI 1.3 can run to pixel rates beyond those of DVI, but any single-link DVI output can run into an HDMI input (there's a dual link HDMI cable, but no one uses it).

      DisplayPort is better because, well, it's a computer industry standard. Most of the computer world has an uncomfortable relationship with HDMI...
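The pixel-rate point can be sanity-checked against the commonly cited TMDS caps (165 MHz for single-link DVI, 340 MHz for HDMI 1.3); the timing totals below are standard CEA/CVT-RB figures:

```python
# Required pixel clock = total pixels per line x total lines x refresh rate.
# Single-link DVI tops out at 165 MHz; HDMI 1.3 allows up to 340 MHz.
DVI_SINGLE_LINK_MHZ = 165
HDMI_1_3_MHZ = 340

def required_clock_mhz(h_total, v_total, hz):
    return h_total * v_total * hz / 1e6

modes = {
    "1080p60 (CEA-861)":     (2200, 1125, 60),  # 148.5 MHz
    "1920x1200@60 (CVT-RB)": (2080, 1235, 60),  # ~154 MHz
    "2560x1600@60 (CVT-RB)": (2720, 1646, 60),  # ~269 MHz
}
for name, timing in modes.items():
    clk = required_clock_mhz(*timing)
    print(f"{name}: {clk:.1f} MHz"
          f" (single-link DVI: {'ok' if clk <= DVI_SINGLE_LINK_MHZ else 'no'},"
          f" HDMI 1.3: {'ok' if clk <= HDMI_1_3_MHZ else 'no'})")
```

So a 30" 2560x1600 panel needs dual-link DVI or HDMI 1.3+, while everything up to 1920x1200 squeezes into a single DVI link with reduced blanking.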
