VGA and DVI Ports To Be Phased Out Over Next 5 Years 704
angry tapir writes "Legacy VGA and DVI display ports are likely to be phased out in PCs over the next five years, according to a study by NPD In-Stat. Intel and Advanced Micro Devices are ending chipset support for VGA by 2015. The VGA interface was originally introduced in 1986 and DVI was introduced in 1999."
why phase out DVI? (Score:5, Insightful)
Re:why phase out DVI? (Score:5, Insightful)
Trying to close the analog hole, I guess. "Smart" HDMI can more easily be used with DRM. Couple that with a machine you cannot choose the OS of, and you might have quite annoying copy protection schemes.
Re:why phase out DVI? (Score:5, Insightful)
The "Analogue Hole" is unaffected by digital restrictions
It's the illegitimate* analogue re-capturing of a legitimately decoded digital stream
Think TV-capture card
* From "their" POV
Re:why phase out DVI? (Score:5, Informative)
So does this phase-out mean I won't be able to use the 4 VGA CRTs and 1 DVI LCD I have accumulated over the years?
What a waste of perfectly functional equipment.
Re:why phase out DVI? (Score:5, Informative)
No, you'll be able to use adapters... according to the article.
Otherwise, yeah, that would be a waste.
Re:why phase out DVI? (Score:4, Insightful)
Adapters? I wuv adapters!
How much do they cost? I know a bunch of clients and businesses that will be utterly delighted that their investment in hundreds of LCD monitors will be rendered useless unless they also buy hundreds of adapters to work with the new computers they purchase. It's not like they are going to spend the money on all-new monitors.
Business does not upgrade unless it absolutely has to (in my experience), and will attempt to retain the investment in every single piece of hardware it has. Take a guess why XP is still being used damn near everywhere in so many businesses: no reason to upgrade justifies the cost of the licensing and retraining. I have a ton of LCD monitors that support DVI but are connected with VGA, simply because the thin/thick clients don't have DVI connectors.
If we have not even switched over to DVI completely in business yet, what makes them think they can switch us to HDMI/DisplayPort? There have to be millions of perfectly good LCD monitors out there with DVI connectors, capable of high resolutions, that can stay in service for at least another 5-10 years from today.
VGA is understandable, but why on Earth get rid of DVI just yet?
I just hope they are not dicks about it, and that there is a $100-$200 DisplayPort monitor out there when they do. It's not like those monitors are plentiful on the market today.
Re:why phase out DVI? (Score:5, Informative)
Re:why phase out DVI? (Score:4, Insightful)
Re: (Score:3)
From your nic, I presume that you're not much up on the leading edge of computer hardware. In fact, I'll bet you have computers that have DB-25 ports.
Perhaps you could hook up those monitors to your Hercules card [wikipedia.org]?
Re:why phase out DVI? (Score:5, Funny)
From your nic, I presume that you're not much up on the leading edge of computer hardware.
You mean he has an ancient ISA one with 10Base2 BNC connectors or something like that?
Re:why phase out DVI? (Score:5, Informative)
Huh, I thought your karma was so bad you weren't allowed to post anymore.
No, you won't have to throw it away, these ports simply won't appear on new equipment. Being able to connect to VGA is still useful for old projectors, but it's no longer sufficiently important to waste board space on it. I bought a mini DisplayPort to VGA adaptor for £5 including delivery. It contains a set of three 10-bit DACs to generate the VGA signal and works well. I take it with me when I'm going to give presentations, but the rest of the time my laptop is quite happy without VGA.
I suppose that if your existing computer dies and you can afford a new computer, but can't afford a £5 adaptor then you may have to throw them away...
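A back-of-envelope sketch of what the three DACs in such an adapter have to do. The timing figures below are standard CEA numbers I'm assuming, not anything from the post:

```python
# Rough numbers for a DisplayPort-to-VGA adapter's DACs (assumed CEA timings).
# For 1080p60 the full frame including blanking is 2200 x 1125 pixels,
# so the pixel clock is:
pixel_clock_hz = 2200 * 1125 * 60          # 148,500,000 (148.5 MHz)

# Each of the three DACs (R, G, B) must emit one sample per pixel clock:
dac_rate_msps = pixel_clock_hz / 1e6        # 148.5 Msample/s per channel

# A 10-bit DAC over VGA's nominal 0.7 V video swing gives step sizes of:
step_mv = 0.7 / (2**10 - 1) * 1000          # roughly 0.68 mV per code
```

So a £5 adapter is quietly running three ~150 Msample/s DACs, which is why it works well.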
Re:why phase out DVI? (Score:5, Informative)
HDCP supported DVI before it supported HDMI, and has been available on graphics cards for years. This won't be closing any holes.
Re:why phase out DVI? (Score:4, Insightful)
Ah yes, but neither DVI graphics cards nor DVI monitors required HDCP, so playback would always be downgraded, or protected content refused outright. HDMI has always had HDCP; it is required. So they are getting rid of the last unencrypted connections. Of course, HDCP is broken, but still: now you will no longer get a picture on an unlicensed device without being a criminal under the DMCA.
Re:why phase out DVI? (Score:5, Informative)
Neither HDMI graphics cards nor HDMI monitors require HDCP. HDCP is not required on HDMI.
I ran HDMI from my DirecTV receiver to my Dell display (DVI input) for years. No HDCP required nor used (and the display didn't support it!).
There is nothing in the system that requires HDCP except the signal transmitting device. After the HDMI connection is set up, the transmitter knows whether it has active HDCP or not. The transmitter may then refuse to transmit video if the video it is to send is marked as not transportable over digital connections that don't use HDCP. For example an Xbox 360 will play games but not media content over a non-HDCP HDMI connection. A PS3 won't show anything at all over HDMI if there is no HDCP.
There is absolutely nothing enforced by the monitor vis-a-vis HDCP. If the sender sends video and monitor understands the format and encryption it displays it. It is completely up to the sender to decide what should and should not be displayed.
The rules for sending content over DVI are exactly the same as those over HDMI. If the content is marked as not showable over non-encrypted digital connections it cannot be shown over any non-encrypted digital connections, whether HDMI, DVI, MiniDP, etc.
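That sender-side rule can be sketched in a few lines (the function and field names here are illustrative only, not any real API):

```python
# Sketch of the transmitter-side decision described above: it is the
# *source*, not the monitor, that enforces HDCP.

def should_transmit(content_requires_hdcp: bool, link_has_active_hdcp: bool) -> bool:
    """A source refuses to send only when protected content meets an
    unprotected digital link; everything else goes through."""
    if content_requires_hdcp and not link_has_active_hdcp:
        return False
    return True

# Matches the examples in the comment:
assert should_transmit(False, False)      # Xbox 360 game over non-HDCP HDMI: plays
assert not should_transmit(True, False)   # protected media over non-HDCP link: refused
assert should_transmit(True, True)        # protected media over HDCP link: plays
```

(The PS3 behavior described above is stricter policy in the same spot: that source simply treats all HDMI output as requiring HDCP.)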
Would it be too big an imposition to become informed about the facts before projecting hate?
Re:why phase out DVI? (Score:5, Funny)
Trying to close the analog hole, I guess. "Smart" HDMI can more easily be used with DRM. Couple that with a machine you cannot choose the OS of, and you might have quite annoying copy protection schemes.
Yep. Hollywood and Big Media will be pushing for a monitor standard which detects uncertified video, blocks it, reports you and sets your house on fire.
Re:why phase out DVI? (Score:4, Funny)
Yep. Hollywood and Big Media will be pushing for a monitor standard which detects uncertified video, blocks it, reports you and sets your house on fire.
Yes, but it will only work on x86 CPUs with the HCF instruction set extension.
Re:why phase out DVI? (Score:5, Funny)
They started a joint venture with Ubisoft?
Re:why phase out DVI? (Score:5, Interesting)
Yep. Hollywood and Big Media will be pushing for a monitor standard which detects uncertified video, blocks it, reports you and sets your house on fire.
That's pretty overkill, and probably only represents *AA's starting position in the bargaining. I'm sure they'd be happy with just detonating the explosive slave collar [wikia.com], if they can get the mandate for that piece of content-protection hardware added in the next round of copyright protection legislation.
Seems I read a story in the news yestiddy - MegaUpload cost them $500 Million in losses due to piracy. Really? How did they come to that figure? Wild guess? Actual accounting? Is anyone here just blindly accepting that figure? Don't talk to me about overkill when it comes to Big Media and Hollywood (Hollywood especially - legendary for Hollywood Accounting and such phrases as, "Yes, the picture did gross $784 Million, but after production, distribution, marketing, promotional activities, etc., etc., etc., we lost money on the picture, which is why you are not getting your 5% of Net, because there is none. By the way, are you available for the sequel? It'll be colossal!") $500 Million... sure. And I'm an Asthmahound Chihuahua named 'Stimpy'.
Re: (Score:3)
And I'm an Asthmahound Chihuahua named 'Stimpy'.
I'm skeptical of this claim. You didn't call me an "eeeediot".
Yeah, "overkill" isn't strictly the right word. The *AA takes an Orkish perspective [tvtropes.org] on overkill.
I'm just saying that burning down a house may imperil the ability of other well-tamed media consumers in that household to properly and subserviently pay for and consume media. So the slave collar is a more nuanced and somewhat less damaging approach. The only thing it lacks is the unthinking terror tha
Re:why phase out DVI? (Score:5, Insightful)
[DVI] gives me crystal-clear digital connection to my monitor, and unlike HDMI, it works every time without fail.
Trying to close the analog hole, I guess. "Smart" HDMI can more easily be used with DRM. Couple that with a machine you cannot choose the OS of, and you might have quite annoying copy protection schemes.
Nevermind that HDMI is electrically equivalent (adapters are under $3 [monoprice.com]).
Nevermind that DRM operates at different layer than the physical interface, which itself is different from the electrical interface.
Nevermind that HDMI and DVI, by virtue of the above, support the same HDCP content protection. Note that this is independent of whether a particular display does. [wikipedia.org]
No, no, forget all that nonsense, the real question I have for your post is how you think anyone can try to close the analog hole by deprecating a digital interface?!
Re:why phase out DVI? (Score:5, Informative)
Because DVI is also an analog interface? Or are you forgetting the VGA-compatible (analog) C1-C5 signals? Which are, amazingly, not at all present in an HDMI connection.
The digital portion of DVI is HDMI-equivalent. The analog portion of DVI is VGA-equivalent. The intent is to demolish VGA, including its equivalents. Hence, DVI has to be banished too.
QED.
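For reference, here is the commonly documented mapping of those analog contacts to VGA signals, as data. This is the pinout as I recall it being published; verify against the DDWG DVI spec before relying on it:

```python
# DVI-I/DVI-A analog contacts and the VGA signals they carry
# (per the commonly published DVI pinout; double-check against the spec).
DVI_ANALOG_PINS = {
    "C1": "analog red",
    "C2": "analog green",
    "C3": "analog blue",
    "C4": "analog horizontal sync",
    "C5": "analog ground (return for R/G/B)",
}
# Vertical sync rides on a pin in the main grid, which is why a passive
# DVI-to-VGA adapter is just wires: every VGA signal is already present.
```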
Re:why phase out DVI? (Score:5, Informative)
DisplayPort is not just an industry standard, it is a royalty-free standard, but HDMI seems to be winning: the only device I've seen with DisplayPort is my 2+-year-old HP laptop, while I have about 18 devices with HDMI in my household (heck, even our cellphones have it).
Re: (Score:3)
HDCP has been broken once and for all. The master key is out there.
Re:why phase out DVI? (Score:5, Funny)
That is why it is being killed off.
Re:why phase out DVI? (Score:4, Interesting)
That is why it is being killed off.
Puts me in mind of the wonderful move to SATA connectors... you know, those damn things which come loose, so you have to shut down, open the cabinet and push them back into place? Honestly, what a horrible connector. HDMI impresses me as another connector which is weak. The next standard will probably have a built-in spring for pushing it out at various intervals (usually while you are in the middle of that big presentation, like I was on Wednesday when the video cable to the projector kept falling out).
Re: (Score:3)
As others have said, all my SATA connectors lock into place. That you bought yours from Jose at the corner for 5 cents a piece is not a problem with SATA but with you.
Re:why phase out DVI? (Score:5, Informative)
Actually, it was a legitimate problem with early SATA connectors. Manufacturers have since redesigned the connectors so that they don't suffer from this problem; these days, you can feel the SATA connector snapping into place when you plug it in. That wasn't true when they first came out, regardless of price.
Re: (Score:3)
Because futurists are moronic and don't understand that DVI = HDMI in terms of quality.
Re:why phase out DVI? (Score:5, Insightful)
DVI was confusing to non-geeks.
You had, what..
DVI-D, DVI-A, and DVI-I... plus "single link" and "dual link" thrown in for good measure, different cables supporting subsets of those, adapters, and a variety of "this works with that, but not this other thing."
HDMI is HDMI... you plug it in and don't worry about whether you are using the right mode / cable for your setup.
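The combinations the parent is complaining about, laid out as data. This is the variant matrix as I remember the spec describing it; treat it as a sketch, not gospel:

```python
# DVI connector variants and what they can carry (a rough summary;
# single vs. dual link additionally doubles the digital TMDS pairs).
DVI_VARIANTS = {
    "DVI-D": {"digital": True,  "analog": False},  # digital only
    "DVI-A": {"digital": False, "analog": True},   # analog (VGA) only
    "DVI-I": {"digital": True,  "analog": True},   # integrated: both
}

def can_connect(cable: str, needs_analog: bool) -> bool:
    """E.g. a DVI-D cable cannot feed a VGA (analog) monitor."""
    caps = DVI_VARIANTS[cable]
    return caps["analog"] if needs_analog else caps["digital"]

assert not can_connect("DVI-D", needs_analog=True)   # the classic gotcha
assert can_connect("DVI-I", needs_analog=True)
```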
Re: (Score:3, Insightful)
Re:why phase out DVI? (Score:5, Insightful)
Then again there's HDMI 1.0, 1.2, 1.3, 1.4, then for cables there's Standard, Standard with Ethernet, High Speed, plus converter cables to/from DVI, DisplayPort, VGA, and then of course there's HDCP...
...It's always going to be confusing to 90% of people no matter what.
Re: (Score:3)
Close...From Wikipedia:
Standard HDMI Cable – up to 1080i and 720p
Standard HDMI Cable with Ethernet
Automotive HDMI Cable
High Speed HDMI Cable – 1080p, 4K, 3D and Deep Color
High Speed HDMI Cable with Ethernet
Re: (Score:3)
Sigh.. good to see there were some good lessons learnt :S
Re:why phase out DVI? (Score:5, Insightful)
As best I can tell the problem is that they created a "smart" spec for the cable then didn't force manufacturers to only put them on "smart" devices, or didn't make a certain degree of smartness and sensible fallbacks part of the spec. Consequently, we've got a bunch of idiot devices that think they're smart and do all kinds of dumb things that a "dumb" connection like VGA wouldn't allow.
Instead of "just work" we've got "just work IF your devices like each other and IF you turn them on in the correct order (note, not always the best or most intuitive order) and IF you have a shaman do the HDMI dance first." My guess is a bit tighter spec and better testing requirements tied to using the HDMI name/logo would have reduced these problems from nearly universal to occasional, at least.
If nothing else the devices all ought to have a "stop trying to be smart and FUCKING DO EXACTLY WHAT I TELL YOU TO DO" mode. You think your source isn't 5.1? BULLSHIT, yes it is. You think you ought to defer to another device for audio out? NO, you're the goddamn audio receiver and I want you to NEVER do that. You went to sleep, woke back up, and now you think there's no capable audio device connected to your HDMI port and you'll continue to think that until I restart you? NO, just send the goddamn bits, because you're wrong.
Actually, that's what the override mode should be called: "Just send the goddamn bits"
Re:why phase out DVI? (Score:5, Informative)
the port connector's huge. Not to mention Dual Link DVI is a pain in the ass.
Display Port/Mini Display Port is tiny and free.
Re:why phase out DVI? (Score:5, Interesting)
The port connector might be 'huge' by your standards, but at least it won't come loose by itself from either the computer or the monitor. I'm using three 30" monitors at 2560x1600, and the image is always perfect and switches on immediately too.
On a TV gaming setup in the basement with HDMI, when there's way too much bass and the TV is vibrating with the sound (an older rear-projection TV with a lot of air inside), there are times when the security signal is lost and we lose the image for a few seconds until it synchronizes back. That doesn't happen with the DVI connector, which is a big plus for it.
Anyway, what was your point about dual-link DVI being a PITA?
Re: (Score:3, Interesting)
This could be solved without actually modifying the HDMI "connector" itself - just some body work around it.
No technical reason you couldn't put securing screws around an HDMI connector, is what I mean.
Re:why phase out DVI? (Score:5, Informative)
All full-size DisplayPort adapters I've seen feature a couple of little hooks to prevent the cable from getting loose.
You then press some kind of button to release the plug and extract it, a la RJ45.
Re: (Score:3)
Don't you mean lack of dual link DVI is a pain in the ass? What problems are there with dual link, other than cheap graphics cards from 5 years ago not having had it?
Re: (Score:3, Informative)
It lacks any "copy protection". Don't worry, the (MP|RI)AA thought police will be around shortly to help correct your faulty logic. If this fails, then they will work with their friends in the government to put you someplace safe and quiet where you cannot be a threat to others with your silly notions.
Re:why phase out DVI? (Score:5, Informative)
DVI-D has copy protection just as good as HDMI. It supports HDCP.
Re:why phase out DVI? (Score:5, Informative)
It is in fact electrically compatible with HDMI (Score:3)
You can cross-connect them without issue. Granted, they each support features the other doesn't: HDMI can do audio, DVI can't; DVI can do analogue, HDMI can't (of course the ports can be made to work either way). But the video signal is the same electrically.
The real difference is just connector size. Also normal HDMI connectors don't do dual link, but that isn't such an issue these days as we can just use higher frequencies to get higher bandwidth.
Re: (Score:3, Interesting)
I just had a funny thought last night...
Say you're a consumer electronics manufacturer, like, oh, Sony, and you make audio, then later video recording devices "for the masses," you sell them in quantity, get them in the hands of Joe Average, and let him record whatever he wants, which you know is about 98.9% copies of commercial works and 1% his garage band (playing cover tunes without a license) or 2 year old reciting the cereal commercial from T.V., and 0.1% actual, legitimate use.
So, what's your next mov
Re:why phase out DVI? (Score:5, Informative)
DVI can support the same HDCP protection as HDMI because it's the same fucking thing with a different connector shape. The anti-HDMI FUD here is idiotic.
Making HDMI ports requires a license/royalty (whereas DisplayPort is an open VESA standard and requires no royalty payments).
Re: (Score:3)
I've heard that DisplayPort is technically superior, but at this point it's not going to catch up, at least not in the mainstream. Reminds me a bit of FireWire: it was in some ways superior, but outside of niche uses, how many people even have a port anymore?
Re:why phase out DVI? (Score:5, Informative)
No, you can also play Blu-ray movies with an HDCP-compliant monitor, video card and DVI cable. Or you can do it with VGA. Or you can do it on an integrated display (like a laptop). You just can't do it with non-HDCP digital video out, whether HDMI or DVI (well, not at full res you can't; it must be downsampled to 960x540).
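That "downsampled" figure is simply quarter resolution: half the pixels in each dimension, as a quick arithmetic check shows (this is just the math, not any real player API):

```python
# The constrained output mentioned above is full HD halved in each
# dimension, i.e. one quarter of the pixels.
full_w, full_h = 1920, 1080
constrained = (full_w // 2, full_h // 2)      # (960, 540)

pixel_ratio = (full_w * full_h) / (constrained[0] * constrained[1])
assert pixel_ratio == 4.0                     # quarter resolution
```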
Re:why phase out DVI? (Score:5, Insightful)
Using an unlicensed decryption program to decrypt a Blu-ray disc that you own, rented or borrowed in order to watch it on a non-HDCP-compliant system is not piracy, no matter what the MPAA tells you. It may or may not violate the DMCA, but it is absolutely not piracy.
Re: (Score:3)
it gives me crystal-clear digital connection to my monitor, and unlike HDMI, it works every time without fail.
I suspect this is the problem. There are better options that are Defective by Design compliant; I don't see why we should settle for anything less in 2012.
Re:why phase out DVI? (Score:5, Informative)
Single-link DVI and HDMI are the same signal! They have the exact same TMDS pins! The ones and zeroes are identical! It's the same thing!
Re:why phase out DVI? (Score:4, Insightful)
How exactly do you send sound over DVI? You can with HDMI, so how are they identical?
Re:why phase out DVI? (Score:4, Informative)
Audio data is superimposed in a vertical blanking interval of video data. It's hilarious really.
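A quick budget shows why that works: a 1080p60 stream's blanking intervals have far more spare capacity than even 8-channel audio needs. This is back-of-envelope arithmetic with assumed standard CEA timings, not the actual HDMI data-island encoding:

```python
# Why audio fits in the blanking intervals (rough upper bound only).
# 1080p60 CEA timing: 2200 x 1125 total pixels, 1920 x 1080 active.
total_px  = 2200 * 1125 * 60        # pixel slots per second
active_px = 1920 * 1080 * 60        # slots carrying video
blank_px  = total_px - active_px    # slots available for data islands

# Data islands use TERC4 coding: 4 bits per TMDS channel per clock,
# 3 channels, so an (optimistic) upper bound on island bandwidth is:
island_bps = blank_px * 12          # ~289 Mbit/s

# ...while 8-channel, 24-bit, 192 kHz audio needs only:
audio_bps = 8 * 24 * 192_000        # ~36.9 Mbit/s
```

Guard bands, control periods and packet overhead eat into that upper bound, but the headroom is still enormous.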
Re: (Score:3)
Because mini-displayport is more expensive, I mean "better".
In what sense is it more expensive? Apple made the license free, and it's been submitted and accepted as an official part of the VESA DisplayPort 1.2 spec. And unlike DVI, you can also put audio over it. HDMI also supports audio, but it costs $0.15/unit in licensing fees plus an annual $10,000 fee.
Re:why phase out DVI? (Score:5, Informative)
Not sure where you get that. The data stream for DisplayPort is not that much more complicated than HDMI. The only real difference is that you have to have a little more advanced logic to check the packet type before you shove the data into the monitor's frame buffer. And ideally, you should do something with some of the other packet types, like providing extra ports, but that's entirely optional. It certainly is not the case that the monitor is doing anything that could be done in the GPU. In all cases, the monitor has to decode the protocol and buffer it, then read that buffer back as it paints the screen. Digital video is not like analog video to a CRT where you could basically let the signal drive the tube....
The DVI-A to VGA adapters cost nothing because they're nothing more than a handful of wires. Of course any adapter that contains electronics is going to cost more than a wire. If you need an HDMI to VGA adapter, that's going to cost you a lot more than a cable, too (about $40, which is $10 more than a DP to VGA adapter, BTW). It has nothing to do with DP being too complex and everything to do with the fact that active electronics are required to do the job. That, and the fact that there are not enough purchasers to drive prices down through economies of scale.
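The receiver-side dispatch described above is only slightly more logic than "shove bits into the frame buffer." A purely illustrative sketch (packet names invented; DisplayPort's real framing is far more involved):

```python
# Toy version of the packet-type check a DisplayPort sink performs
# before writing into its frame buffer. Names are illustrative only.
def handle_packet(pkt_type, payload, framebuffer, aux_handlers):
    if pkt_type == "video":
        framebuffer.extend(payload)        # pixels go straight to the buffer
    elif pkt_type in aux_handlers:         # audio, MST sub-streams, etc.
        aux_handlers[pkt_type](payload)    # the optional extras
    # unknown packet types are simply dropped

# Usage: video fills the buffer; an "audio" handler collects audio payloads.
fb, audio = [], []
handle_packet("video", [10, 20, 30], fb, {"audio": audio.append})
handle_packet("audio", [99], fb, {"audio": audio.append})
```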
Interesting (Score:5, Interesting)
The one that was introduced 13 years later is being phased out at the same time as the one that was introduced thirty-six years ago? How odd.
Re: (Score:3)
I guess DVI will stay, but DVI-A and DVI-I, which both carry the VGA signal in the DVI connector, are going to be a thing of the past. DVI without analogue is just HDMI without audio. Do you see HDMI going anytime soon?
EGA was introduced in 1984, VGA in 1987. I guess VGA held up pretty well, but it's time for it to go. It's pushed to its limits, and if you connect a high-resolution monitor over VGA you will notice it.
Study math much? (Score:3)
Re: (Score:3)
Re:Interesting (Score:5, Funny)
I guess 25 is almost 35.
Especially when you're 16.
Re: (Score:3)
I, quite literally, own t-shirts older than you.
Just sayin'
DG
All about HDCP (Score:5, Insightful)
I suspect the driving force toward HDMI-only is anti piracy efforts in the form of mandatory HDCP on any new display hardware.
Re:All about HDCP (Score:5, Interesting)
*is* there usb over hdmi? I know they hacked it to support ethernet.
as for audio, here's some semi-hidden truth: audio is multiplexed in (not literally but close enough) to the digital stream. ie, they take digital video (dvi is fine for that and *always* has been) and then they do a data stream *merge* and create a new binary stream that is a+v. audio comes from 2-channel spdif or multichannel dd/dts.
here's the part that annoys the hell out of me (and should annoy you, too): those two streams did NOT have to be mixed. video could have gone down the cable in one wire set (like dvi) and audio could have been run in a single coax cable or fiber (spdif or toslink, which all modern home stereos support as audio inputs).
this means that you ONCE did have control over your a and v. you could patch in or out, anything, at any time. no fancy equipment needed. running audio as audio wiring and video as video wiring is just two cables and it was never a big deal to run those 2 in the back of your stereo or tv console. the rgb multi bnc cable madness of analog video *was* nuts and did need cleaning up, but once you go digital, 2 small wires or 1 is not a big deal.
but the 'content guys' saw an opportunity to swoop in and fuck things up. they mixed the 2 physical streams into a new one and encoded it. 'hdmi-audio' is then born. and it sucks.
you can currently get 24bit audio with 192k and higher bitrates on stereo. I suspect you can do full multichannel on a single coax cable, too; and perhaps on a higher grade fiber (not toslink) as well. there is still no reason to have to weave in audio to video and encrypt the whole thing as a 'secure bundle'. it really messes up higher end audio systems that use outboard DACs. getting audio out of hdmi is expensive and that is totally uncalled for.
dvi is perfectly fine for video but if they are 'coming' for dvi, I bet they are going to try to cut out or cut back on digital audio (as regular pure audio, or as audio that accompanies a video stream). I actually care more about audio than video and so I really do insist on my audio stream staying separate. and its easy, today; you run a HTPC with an spdif-out sound card and you point your app's 'sound card' at the spdif card. if they are shooting for 'no more hdmi' I bet they are not long for 'no more spdif'. spdif is (can be) bit-perfect and so I realize its a continual annoyance to 'content creators'. I hope I'm wrong about a future attempt to grab digital audio as its own wiring (sans video).
Re: (Score:3)
important typo: 192k and higher sample-rate, not bitrates. 192k sample rate as opposed to 96k (which is considered very high res in music) and 44.1k which is redbook cd audio (still very good and even audiophile if done right).
you can fit 24bit audio at that pro-audio level 192k samplerate on a single cheap toslink cable or a regular old 75ohm video grade cable (ie, spdif stuff). there was never a need to 'upgrade' audio pipes on our systems - that's my point.
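The arithmetic behind that claim is easy to check. The S/PDIF framing numbers below are the commonly documented ones (32-bit subframes, 24 usable audio bits); the biphase-mark line coding roughly doubles the raw rate on the wire:

```python
# Why a plain S/PDIF link has always been enough for high-res stereo.
sample_rate   = 192_000   # pro-audio sample rate from the comment above
channels      = 2
audio_bits    = 24        # usable audio bits per S/PDIF subframe
subframe_bits = 32        # total bits per subframe, including status/parity

payload_bps = sample_rate * audio_bits * channels     # 9,216,000 bit/s of audio
framed_bps  = sample_rate * subframe_bits * channels  # 12,288,000 bit/s framed
```

Call it ~12 Mbit/s framed: trivial for a 75-ohm coax or Toslink run, which is the poster's point that the audio pipe never needed "upgrading."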
Ain't happening (Score:5, Insightful)
We've still got serial ports. There are still motherboards with a parallel port, for goodness sake. VGA ain't going away anytime soon.
Re: (Score:3)
Sure it is. When the source of your video signal no longer includes a DAC to generate the VGA signal, there's no point in including the connector on the mainboard. If Intel and AMD are dropping VGA support for their integrated GPUs, then your only option will be an external GPU for VGA. And even then the significantly reduced usefulness of a VGA port means it'll be rapidly dropped from new monitors.
I haven't s
Re:Ain't happening (Score:5, Informative)
Deep breath everyone. DVI==HDMI (Score:5, Insightful)
While I like DVI and have a monitor that uses it, going HDMI only is not a big deal. HDMI is just DVI plus a little extra, for audio, and the cost of that "little extra" is already negligible.
This means that a DVI-DVI, HDMI-HDMI, and DVI-HDMI cable are the same price. I spent $5 on one a few years back.
No difference! Unbunch your panties
easy solution, HDMI/DVI adapter (Score:5, Interesting)
Since DVI-D is just HDMI minus the audio, you can get cheap passive HDMI-to-DVI adapters. They work fine for connecting DVI monitors to HDMI video sources, or vice versa. No quality loss.
Re: (Score:3)
I don't stretch my cords to the limit. That's a great way to make connectors fail, regardless of screws.
30 Years of VGA (Score:5, Insightful)
Let's hope that whatever follows has the same longevity as VGA. In a world where we've invented USB three times (USB, mini-USB and micro-USB) with non-compatible connectors in just 11 years, the future does not look as good.
Re:30 Years of VGA (Score:4, Informative)
All three of the connectors you listed are electrically compatible. You can buy cables with a different one of those three connector sizes on either end.
The reason there are three different size USB connectors is for devices with different form factors. A mouse is fine for full size USB, a mobile phone will want micro-USB and something like a PS3 controller uses mini-USB.
Re: (Score:3)
Dumb question: why don't they all use micro?
Re:30 Years of VGA (Score:5, Informative)
Actually Mini-USB was the fragile one and Micro was introduced because it's more robust.
Re:30 Years of VGA (Score:5, Interesting)
USB and the mini and micro variants are a success story if anything. The plugs are electrically identical, just physically smaller, and they're industry-standard. Meanwhile Apple was coming up with all kinds of crazy stupid proprietary shit (going so far as the infamous Charger Resistor Trick) and many other manufacturers were making proprietary connectors for individual device models.
HDMI fasteners? (Score:4, Interesting)
Re:HDMI fasteners? (Score:5, Informative)
In case anyone was wondering (Score:3)
Ten years ago, there were enough other companies in the game that your chances of finding one supporting "legacy" interfaces was a lot better.
It doesn't really matter. (Score:3)
VGA has been dead for some time; even the cheapest monitors are starting to use DVI, so in 5 years I can see it totally dying out. Sure, some people will still be using it with older machines and older monitors, but not on new ones.
As for DVI? Losing the ports is not a big loss. Even if they start putting HDMI and DisplayPort everywhere, it takes a simple cable to go from HDMI to DVI or vice versa. My monitors currently weigh in at: one with 1xDVI, 1xVGA; one with 1xDVI, 1xVGA, 1xS-Video, 1xComposite, 1xComponent; and one with 1xVGA, 2xDVI, 1xDP, 1xHDMI, 1xComponent, 1xComposite.
I think 5 years sounds like a reasonable timespan for the newer ports to become big. That said, I see a lot of HDMI adoption, but most graphics cards still carry DVI and HDMI; the only machine I have with DisplayPort out is my laptop. 5 years is a lot of new graphics cards, however.
As for the replacements, I'm not going to complain. HDMI and DisplayPort are much nicer to plug and unplug than DVI cables, and there's no need to worry about dual-link or not. As for VGA, I haven't used it in a long time. Given the HDMI/DVI compatibility, I don't see this causing much hurt to anyone either.
Re:It doesn't really matter. (Score:4, Informative)
HDMI sucks.
Why? Not because of HDMI... because of the worthless HDCP that is designed to make life miserable.
HDCP keys, handshakes, etc. all make HDMI distribution expensive. The low-grade dog-food stuff doesn't do key caching and management.
And God help you if you want to do an HDMI matrix. The ONLY company that has one that works is Crestron; their DM switchers are the ONLY choice for 16x16 or larger HDMI switching that works.
Re:It doesn't really matter. (Score:4, Informative)
It's not that HDMI or DVI sucks... it's the HDCP that sucks.
Remove HDCP and digital video becomes a dream.
In fact, that is the best thing to do: buy and install HDCP strippers at every source point, and all switching and routing issues disappear.
But I can't do that; it's illegal in the USA. Soon you'll get to spend life in Gitmo for even telling someone that such a device exists.
Projectors (Score:4, Interesting)
I travel to give the occasional presentation and I think I've only seen one or two projectors in the past 5 years that had something other than a VGA input. This is probably why many business laptops still have VGA outputs at the expense of providing others like DisplayPort, DVI, or HDMI.
The other problem is that monitors and projectors long outlive their PC contemporaries. I've got a 20" Dell LCD that I purchased in 2003 that's still going strong today. It has VGA and DVI inputs, since only in the past few years have HDMI and DisplayPort become standard on monitors.
I'm rather partial to DisplayPort and Thunderbolt since the connectors are smaller and don't have pins that are easily bent, but these outputs aren't too common in laptops, unless you have a Mac.
Re:Projectors (Score:4, Interesting)
Corporate projectors will often have a lot of different inputs, but as a general rule of thumb, the only cable connected will be VGA. For corporate presentations it's still VGA all the way. (Have seen Mac users learn this the hard way.)
Business use laptops and projectors (Score:4, Insightful)
It may be that many of you in the home market won't miss VGA, but in most corporate offices, VGA is the only common connection supported by the projectors in most conference rooms. While an adapter is an option, I suspect that laptops marketed to businesses will have VGA adapters for longer than the next five years as the refresh cycle for projectors is generally much longer than the refresh cycle for laptops.
Re:Business use laptops and projectors (Score:5, Interesting)
There is little support for HDMI in the seminar-room equipment available; it's practically all VGA-only. Only very high-end kit has HDMI support.
Add to that that most, if not all, "presenter" units (that is, the back-lit camera systems people can use to show objects or hand-written notes) are VGA output only, and the only real solution is analogue video, even though it doesn't travel long distances well (though this can be worked around with video senders).
The reason I know this is that only a couple of years ago I was on a committee running the kitting out of some lecture theatres and seminar rooms. None of the tendering A/V companies could supply a complete system using DVI, HDMI or any other digital video technology even though we asked them to look into it.
VGA is *THE* de-facto lowest common denominator computer video format, it's likely to stay that way for a *VERY* long time.
Re:Business use laptops and projectors (Score:4, Funny)
Had a heck of a time finding a DVI cable (Score:3)
Looks like the phase-out already started. I set up a computer for my parents over the holidays and we had to drive all over to find one. Only one I found was an overpriced gold-plated Radio Shack model.
Awesome (Score:5, Funny)
Now I can pull video cable up the back of a workstation without it catching on every god damned cable, wire, footstool and purse in the remote vicinity.
Raspberry Pi (Score:5, Funny)
And people complain that the Raspberry Pi (which is not even out the door yet) doesn't support VGA... sheesh.
There are DisplayPort-to-* converters (Score:5, Informative)
DisplayPort can be converted to HDMI or single-link DVI with a cheap, passive adapter.
You can also convert it to VGA or dual-link DVI using active adapters (they show up to the computer as DisplayPort devices).
MPAA (Score:3, Insightful)
Re: (Score:3)
And with modern monitor shapes, it might even have as much as 600 vertical pixels!
Try high-speed HDMI cables (Score:3)
The blight of HDMI (inconsistent throughput for even the PALTRY 1080p in most cables)
A "standard" HDMI cable is guaranteed only up to about 1 Mpx per frame, or 720p. A "high-speed" HDMI cable can do four times that: 1080p 3D or 1440p.
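The pixel-clock arithmetic behind those cable categories, using the standard CEA total-frame timings (the 74.25 MHz / 340 MHz test frequencies are the commonly cited category figures; verify against the HDMI spec):

```python
# Pixel-clock arithmetic behind the HDMI cable categories (CEA timings).
def pixel_clock_mhz(total_w, total_h, fps):
    """Pixel clock from the full frame size, blanking included."""
    return total_w * total_h * fps / 1e6

clk_720p60  = pixel_clock_mhz(1650, 750, 60)    # 74.25 MHz
clk_1080p60 = pixel_clock_mhz(2200, 1125, 60)   # 148.5 MHz

# "Standard" (Category 1) cables are tested at 74.25 MHz, enough for
# 720p/1080i; "High Speed" (Category 2) cables are tested at 340 MHz,
# covering 1080p, 1440p, 3D, and deep color.
assert clk_720p60 == 74.25
assert clk_1080p60 == 148.5
```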
Re: (Score:3)
Electrically, HDMI and DVI are equivalent, but HDMI has a smaller and more robust connector.
They don't have all the same pins. HDMI can carry audio while DVI can't, and DVI can carry analog video in addition to digital video while HDMI can only carry digital.
Re:DVI is HDMI without sound and video cards (Score:5, Insightful)
DVI is HDMI without sound, and video cards are not the best for sound anyway; PC displays don't have more than 2 speakers in any case.
Does any PC display with HDMI have some kind of DD pass-through, or 5.1-or-more analog out?
Video cards are as good for digital sound as anything. All they do is take the digital signal from your applications and send them digitally over HDMI. Barring driver bugs, it's just the same as any digital output on anything.
I think for DD pass-through a device has to support DD. I have my 360 connected to my TV and my TV connected to my surround sound and DD5.1 works fine. My TV doesn't support DTS though, so I have to connect my PS3 directly to my surround sound in order for that to work.
Re: (Score:3)
Re:And switch to HDMI? (Score:5, Informative)
The sync of an HDMI cable isn't fast -- it's slow. So if you swap to a HDCP protected stream and then off of it, the monitor will flicker or sometimes, not come back at all. Then you need to reboot.
Just basically, it sucks. Read about HDMI handshake issues and you'll see what I mean.
Re: (Score:3)
That was a Windows issue, not a DisplayPort issue. It works perfectly with an up-to-date OS.