The State of Video Connections
mikemuch writes "Joel Durham provides a nice rundown on what's happening in video interfaces as we leave VGA behind and move through the DVI flavors, visit HDMI along the way, and look forward to UDI and DisplayPort."
I agree and the content is worthless as well. (Score:5, Informative)
DVI (and thus HDMI) is limited to very short cables, SDI does several hundred meters.
There is no DRM on SDI.
The cable for SDI is simple coax; it doesn't get much cheaper or more robust than this.
The connectors are BNC, also robust and cheap.
The transmitters and receivers are also relatively simple and reliable.
SDI is what is currently used for digital transmission of video in professional environments, so it's not like it's completely unknown.
I'm pretty sure SDI is what we would be using if the MPAA didn't get to write standards.
Re: (Score:3, Informative)
i agree that it's "nice" that it uses coax connectors... but that is also its downfall. with its limited data rate it's far from future-proofed. hell, 1080p is the max it can keep up with. want a screen bigger than that? no can do.
article text to avoid annoying 6 pages (Score:5, Informative)
The analog VGA was the standard for such a long time, some of us just got used to it. Today, I don't remember the last time I got a performance-grade graphics card with a VGA port on the back of it; I have a small cadre of DVI-to-VGA adapters that I use to plug in my monitors.
DVI as a standard features a number of sub-standards, some analog, some digital. Now DVI is already seeing the writing on the wall due to its limited bandwidth, just as the world grows accustomed to it. HDMI is crossing from the TV set to the computer, UDI is creeping into the market, and DisplayPort is riding over the horizon and hoping to take over the world.
What if you just want to play Supreme Commander or do your taxes? Can't you just poke a monitor cable plug into a display adapter and be done with it? Sure you can, if you know what to expect when you face the next generation of graphics-to-display connections.
VGA
Sure it's old, but it still works. Video Graphics Array (VGA) has been around since 1987, a few years after which it became the standard connection between the PC and its monitor and stayed that way for more than a decade. If you happen to purchase an analog CRT monitor, even one made today, it's likely to require a VGA connection to a computer.
The term VGA has come to mean a number of things. In one sense, it's used to refer to the actual port found on a graphics card or the corresponding plug (a 15-pin mini D-sub male) on a monitor cable. VGA is also sometimes used to describe the outdated and rarely used screen resolution of 640x480 pixels, which was once considered sharp and sexy.
VGA Connector
VGA graphics cards date back to the days of ISA expansion ports. Such cards were typically capable of addressing only 256K of local memory and displaying 256 colors at 640x480 at a 70Hz refresh rate. As demand grew for higher resolutions and more robust graphics support, the original VGA spec became outmoded but the connection port was preserved.
VGA is analog. Graphics cards with VGA compatibility employ RAMDAC (random access memory digital to analog converter) chips to pipe digital graphics signals through the analog display cable. Of course, with digital displays like flat-panel monitors being all the rage, it would be even cooler to have a direct digital-to-digital connection from PC to display, wouldn't it? That's where DVI came to the rescue.
DVI
DVI stands for Digital Visual Interface. As digital flat-panel monitors started to become the rage at the tail end of the last century, the analog VGA connector quickly became inadequate for the needs of such displays. The DVI port is quite different from that of VGA: It's made up of up to 24 pins (most of which are for TMDS) and an additional five pins for analog compatibility. TMDS stands for Transition Minimized Differential Signaling; it's a high-speed serial interface used by the DVI and HDMI display standards.
DVI comes in three flavors:
* DVI-A, in which the A stands for analog. This type of DVI connection only transmits analog signals and is intended for use with CRT monitors. You almost never see DVI-A.
* DVI-D, the D meaning digital. This is purely digital, without any analog compatibility at all.
* DVI-I, with the I standing for integrated. This connection carries both analog and digital signals and can be used with either analog or digital displays. This is the most common DVI connector found on graphics cards.
To further complicate matters, DVI-D and D
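For the digital flavors, the practical ceiling is the TMDS pixel clock. As a rough sketch (the 165 MHz single-link limit and the ~20% blanking overhead are commonly cited DVI/CVT figures, not numbers from the article), you can check whether a given mode fits single-link DVI:

```python
# Back-of-the-envelope DVI bandwidth check (a sketch, not a spec-accurate
# timing calculator; figures are the commonly cited single-link limits).

MAX_PIXEL_CLOCK_HZ = 165e6   # single-link TMDS limit
BITS_PER_PIXEL = 24          # 8 bits per TMDS channel, 3 channels

def fits_single_link(width, height, refresh_hz, blanking_overhead=0.20):
    """Rough test: does a display mode fit in single-link DVI?

    blanking_overhead approximates the extra pixel clocks spent on
    horizontal/vertical blanking (assumed ~20% here).
    """
    pixel_clock = width * height * refresh_hz * (1 + blanking_overhead)
    return pixel_clock <= MAX_PIXEL_CLOCK_HZ

print(fits_single_link(1600, 1200, 60))   # classic UXGA fits
print(fits_single_link(2560, 1600, 60))   # this needs dual-link
```

Dual-link DVI adds a second set of TMDS pairs and roughly doubles the ceiling, which is why 30-inch 2560x1600 panels require it.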
Huh? (Score:5, Interesting)
My VGA card had 256k of RAM, and it did 640x480 at 16 colors. I wonder why...
640*480=307200
256k=262144 bytes
That's also why most early "VGA" games ran at 320x200x256. I understand that 640x480 is sometimes referred to as VGA regardless of color depth, but that doesn't seem to be what he's doing here.
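The parent's arithmetic, written out (a sketch; the 256 KiB figure is the standard VGA board memory the parent cites):

```python
# Framebuffer arithmetic behind the parent's point.

VGA_MEMORY = 256 * 1024  # 262,144 bytes of video RAM

def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed for one uncompressed frame."""
    return width * height * bits_per_pixel // 8

# 640x480 at 8 bpp (256 colors) needs more memory than VGA has...
print(framebuffer_bytes(640, 480, 8))   # 307,200 bytes > 262,144
# ...but at 4 bpp (16 colors) it fits:
print(framebuffer_bytes(640, 480, 4))   # 153,600 bytes
# 320x200 at 8 bpp also fits, hence the classic 256-color game mode:
print(framebuffer_bytes(320, 200, 8))   # 64,000 bytes
```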
Re: (Score:2)
The reason games ran at 320x200x256 was because that was what the low-end PS/2 machines came with standard, and it was the PS/2 that pretty much defined where IBM wanted "beyond EGA" to go.
Mal-2
VESA (Score:3, Informative)
That's why VESA came up with its BIOS extension, exposing much more functionality via int 0x10, either as a TSR driver or, later, as part of the video BIOS.
1.0 gave additional video modes, a standard mechanism for bank-switching, and other similar facilities (including saving the screen and mouse state).
2.0 gave l
Re:article text to avoid annoying 6 pages (Score:5, Funny)
And you get a warmer picture than the rest of us, right?
Re: (Score:2)
Yup. Another neat feature is that the projector is the only significant heat source up there, and if I leave it on during the day, it will get up to 60°F on the third floor. As it is, I think the snow is the only significant insulation up there.
Re: (Score:2)
And kill the (expensive) light bulb in it while you're at it. It's probably cheaper to heat the room with your furnace, you know.
Plus.... (Score:5, Interesting)
If you move into TV-land you also have coaxial, composite, s-video, component, and HDMI, as well as 1/8 and 1/4" phone jacks, RCA, digital-coax, and digital-optical for audio.
My personal theory [putting on tinfoil hat] is that it's all a vast conspiracy by the cable and connector manufacturers. Every new connector requires new cables, adapters, and, in the end, replacing "obsolete" equipment that can no longer talk to one another.
And why does an optical or HDMI cable of sufficient length end up costing more than most DVD players? It's a CABLE for Pete's sake.
cable prices (Score:5, Informative)
Because that's where the big electronics stores make their profit. Ask a BestBuy employee how much that $100 monster cable costs him under the employee discount program. It'll be significantly closer to the $0 side of the range than the sticker price...
That said, there are some good companies out there that will sell perfectly good HDMI (and other) cables at reasonable prices. http://www.monoprice.com/ [monoprice.com] is one I've ordered from multiple times and had great results with. My last purchase was 10' of HDMI - I think I paid $10 shipped.
I actually was surprised to see that Target had 6' of HDMI for $15. A lot better than the $60/6' that was the best I found when I was looking for a quick cable at BestBuy...
Re: (Score:2)
There's your problem.
"Those stores" need to make money somewhere, and if you HAVE to have it NOW, then you can help pay the bills that make it available to you, NOW.
We all know you can get a better deal, if you can shop and wait.
Re: (Score:2)
Sure, component is slightly better than S-video. Or was it the other way around? Either way, it's time for a single next standard. That's why they call it a standard. The cabling really isn't the limiting factor in image quality, but it seems to cause the most annoyance.
Re: (Score:3, Interesting)
HDMI is an all-digital Transition Minimized Differential Signaling (TMDS) method that also includes audio data. HDMI beats component on today's "digital" LCD and plasma screens when used with digital sour
D to A to D (Score:2)
I keep hearing this and wondering if it is really better, and if so, in what way better.
IOW, what difference will I see on my screen? What should I look for to recognize signs of degradation?
I can't help but feeling that "it's better because it's -all- digital" is just BS, kind of like "but it goes all the way to 11!".
we don't, we need a *device* connector (Score:3, Interesting)
Evolution (Score:3, Insightful)
I would hate to see the day when I use one display device for Linux and need an entirely different device to be compatible with proprietary DRM/TC/HD output or have to buy a third party descrambler type box--because we all know what a racket those were. It'd be like early 80s cable TV wars all over again.
Piss off! (Score:5, Insightful)
Please stop this crap! Just give us simple digital connectors and let the boxes talk to each other. How about something plain and simple, like 10Gb Ethernet?
Re: (Score:3, Insightful)
How about just analog RGB and quit pretending we need digital connections at all?
You want high bandwidth? Analog RGB can do it. You want deep color? Analog RGB can do it. You want to avoid DRM? Analog RGB is perfect for that. You want easy to record? Analog RGB -> Analog recording media *or* digital(ized) media. You want easy to connect? Analog RGB. You want easy to switch between signal sources? Analog RGB. You want easy to buffer and redistribute? Analog RGB. You want auto-mode detection? We fool e
Re: (Score:2)
Conversion to digital is trivial. So is conversion back. End of problem. Or more to the point, there never *was* a problem.
Yes, I've noticed the sun bouncing bits off me towards the camera quite often. And when the sun isn't out, what would we do without our digital floodlights, merrily em
Re: (Score:3, Insightful)
I hate to break it to you, but newer displays (i.e., LCD and everything else that's not CRT) are inherently digital. So yes, we are talking about DAD conversion.
Re: (Score:2)
Really? So you maintain that the liquid crystal in an LCD cell responds in digital - discrete - steps of brightness. The crystal is standing in the cell on a 64- or 256-step ratchet, waiting to pivot, driven by six or eight bits of control, is it? This must be that "new nano physics" I've heard about. :)
No, the fact is that LCD display cells are purely analog in mechanism; apply an analog voltage or current, and
Re: (Score:2)
Really? You mean pixels on an LCD aren't signaled in a (more or less) similar way as cells in a RAM chip? I'm no electrical engineer, but I would have assumed...
Re: (Score:3, Interesting)
There are two issues that relate directly to your concern. One is addressing; in order to pick a ram location or a screen pixel, a signal needs to be sent to the particular location that says "hey you, and not any other." Displays and RAM can be similar in this regard, though it is more likely that a simpler scheme of sequential counters is use
Re:Piss off! (Score:4, Interesting)
How about we stop pretending that analog RGB looks good? Ever try screwing with the contrast setting on an LCD? That's analog technology at work.
DVI lets me see the image outputted by my graphics card - pixel and value precise. Neither my monitor nor my graphics card supports HDCP, so DRM isn't a problem.
As a public service, let me remind you that high-bandwidth analog signals are problematic. It doesn't take much for noise, crosstalk, or other issues to show up on an analog monitor at high resolutions.
Try connecting your monitor to your desktop with a 20 foot DVI cable - then try doing the same thing with an analog RGB cable.
Try using a crappy KVM. Most screw up resolutions greater than 1600x1200.
Analog is the reason my cable signal looks like shit. It's the reason why broadcast TV looks crappy. It's the reason why AMPS cellphones have static.
So, hell, why shouldn't we take a nice clean digital signal, run it through a DAC, throw it through a cable, and try to reconstruct it into a digital signal with an ADC at the other end. Extra components, extra complexity, and more chances for interference. What a great idea.
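The parent's D→A→D objection can be made concrete with a toy model (a sketch, not a model of any real DAC/ADC): quantize an 8-bit value, add some line noise in the analog domain, re-quantize at the receiver, and count how many samples come back changed.

```python
# Toy digital -> analog -> digital round trip with additive cable noise.
import random

random.seed(42)

def dad_roundtrip(samples, noise_lsb):
    """Return how many of `samples` (values 0-255) come back CHANGED after
    D->A->D, when the analog leg adds Gaussian noise of `noise_lsb` LSBs."""
    errors = 0
    for s in samples:
        analog = s + random.gauss(0, noise_lsb)          # DAC output + noise
        received = min(255, max(0, round(analog)))        # ADC re-quantization
        if received != s:
            errors += 1
    return errors

samples = [random.randrange(256) for _ in range(10_000)]
print(dad_roundtrip(samples, noise_lsb=0.1))  # quiet line: few or no errors
print(dad_roundtrip(samples, noise_lsb=1.0))  # noisy line: thousands of errors
```

A fully digital link, by contrast, only has to distinguish two levels per symbol, so the same amount of noise typically causes no bit errors at all. That is the asymmetry both sides of this thread are arguing over.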
Re: (Score:2)
The problem 'tho isn't the digital nature of the connection, it's the bloody DRM that everybody is trying to ram down our throats on any new standard. In 5-10 years, the cascade of failures and incompatibilities arising from DRM, coupled with its complete failure to protect content, will make it the Edsel of computing.
Re: (Score:2)
Try finding a "crappy" KVM that supports DVI. (Hint: they don't exist -- or at least they'd better not be crappy, for that price!)
Other than that, I agree with you.
Re:Piss off! (Score:5, Informative)
How about we stop pretending it doesn't? Especially, as in your case, when there is no basis for such an assertion. I have full HD over component. My system looks beautiful. Ergo, analog doesn't give you a poor image, there's nothing inherent in it that prevents a good picture.
Please. My cables hang slack in the basement, hooked over projecting screws, run about 30 feet, and they are fine. Why? Because it doesn't take much (as in, proper termination, decent coax, low-loss connectors) to run high bandwidth analog just about any distance you like. Claims to the contrary are nonsense. Can you screw up such a run? Sure. Just try it using audio cables. But for that matter, try running a multi GB/s digital signal through an audio cable and watch what happens. I mean, aside from hosing every RF receiver in your home. Yes, we're in a zone where the cables need to be right. This is no different from a digital copper run. Optical is something else entirely. But of course, you can run analog optically as well. :)
Oh, please. Such marketing-inspired tripe. You picked the wrong person to try and push over what you thought was a hypothetical.
I have a 17 foot (204 inch) display driven exclusively by component from the receiver, though I also feed it analog from a Mac via a VGA input - that's the media librarian using Delicious Library. It looks absolutely fabulous either way. You can see every glorious pixel in HD, up close. The projector has about 30 feet of cable on it, not counting the various lengths of cable the component HD input sources (Xbox 360, HD DVD, Blu-ray, PS3, satellite) feed to the receiver and the switches; there are no problems with ringing or artifacts whatsoever. The cables go down through the floor, along for quite a distance, and back up at, and through, the projector's pedestal. Of course I don't use Radio Shack RCA cables to do this, I use a triple run of coax and I have it properly terminated, but this is no big deal and the technology can be built into any simple cable without adding significant cost as compared to, for instance, a many-pinned multi-pin connector.
The manufacturers have been feeding you bullshit so long you think it is true. Well, it's not, and I can prove it.
Are there advantages or unique uses to/for digital transport? Certainly. But is digital transport in any way required to view, for instance, full HD at 1920x1080 at 60fps in high quality? No. Absolutely, resoundingly, factually, no.
No, shitty equipment and/or shitty standards and/or shitty service is why your cable looks like shit. Cable can look butter smooth. The fact that yours doesn't isn't a reflection on technology, it is a reflection on what consumers will put up with because they're badly misinformed about what is reasonable and possible.
Listen to yourself. "Try using a crappy..." Why would I do that? Really, why? When I need one, I use one that is adequate to my needs. Nothing screws up at all. I switch between linux servers using a KVM and the results are pixel-perfect. It's 100% analog. Using crappy equipment will certainly get you crappy results, but why would you think this has any bearing whatsoever upon the inherent capabilities or limitations of the underlying technology? Talk about backwards reasoning!
Re: (Score:3, Insightful)
Well, you sure told us with your lone anecdotal data point.
Computer display data starts out in the digital domain. An LCD panel requires digital signals to generate an image. There's NO GOOD REASON to convert that signal from digital to analog to digital in between -- there WILL be degradation, however slight.
Re: (Score:3, Informative)
It proves the point; such systems are workable. That's all it is there for, to solidly discredit the ridiculous claims that are appearing in this thread about noise, resolution and so forth at HD. However, presuming mine is the only such system is really kind of dim. I bought everything off the shelf. You can reasonably assume I am not the only consumer to have done so. Or do you think I'm really the only guy with a high end component system?
Re: (Score:2)
Other than that I agree with you for the most part.
HDCP: it still sucks (Score:5, Interesting)
Unfortunately, HDCP implementation sucks. Standard procedure for the problems almost everyone has with HDCP-enabled cable boxes is to *reboot the box*. Apparently, in the exchange of encryption keys a handshake sometimes gets dropped, and nobody has a firmware solution.
Of course, even if it worked right, HDCP would still suck.
Re: (Score:3, Insightful)
No, the implementations of HDCP TOTALLY RAWK!
This way, people who would normally never care enough about DRM and copy prevention to even notice are getting a big steaming cup of wake-up. Anyone who has to put up with HDCP handshake failures
Re:HDCP: it still sucks (Score:4, Interesting)
It's still a very real issue. I returned a Philips DVD player because it would consistently fail to renegotiate HDCP when I switched the TV's inputs off of the DVD player. Now, maybe this was just a timing problem and Philips failed to implement the specs correctly. Of course that's not HDCP's fault. It's entirely Philips's fault.
Still, no matter where the blame lies, the fact is it wouldn't be an issue AT ALL if HDCP didn't exist. It's because of HDCP that there are specs to implement incorrectly. I still don't buy the idea that HDCP is going to have any meaningful impact on piracy. It's just yet another inconvenience because the consortiums assume everyone is planning to do the wrong thing. It's insulting.
What's wrong with DVI? (Score:2)
First, I don't want to go back to VGA. I'm quite happy with my DVI, thanks. And DVI does not require HDCP -- I'll likely never see HDCP as long as I'm running Linux.
And second, I haven't looked since last year, but I suspect you're just plain wrong. My video card comes with two DVI ports, but it also comes with DVI->VGA connectors for them. My monitor comes with a DVI port, a VGA
What's happening... (Score:5, Insightful)
Re: (Score:2)
But the real problem is with my dual monitor setup. There are very few 4 port KVM switches with dual DVI, and even less that also have USB, and they are VERY expensive. Add the cable set
Re:What's happening... (Score:4, Insightful)
Honestly, engineers and marketing guys talk all day long about how good X or Y is, and it all comes down to "how can we shove our DRM into the new standard and fool customers into buying it."
My friend thought I was nuts buying a pair of 21" LCDs that had only VGA on them. They look fantastic, play FPSes great, and work just fine with my 7300GT card.
VGA will disappear as soon as RS-232 disappears, which, by what I see in the integration market, is many, many years from now IF electronics makers get off their asses, which is highly unlikely from what I have seen.
Re: (Score:2)
Right?
OK, so maybe you don't need those because your servers run command-line friendly operating systems, which will have an IPMIv2 daughterboard on the motherboard IPMI interface to enable remote power control, serial-console-over-LAN, etc. Right?
Right?
*sigh*. I'll go get my keyboard and VGA cable
Fibre: The other video connector (Score:2)
Me? I fly with proprietary fibre [engadget.com] solutions! Well, I would if I were dirt rich.
Having your graphics display remote from the consoles they are attached to is absolutely amazing. I wish we could wire our entire office with decent thin clients.
Important: Intel Opinion Center (Score:4, Interesting)
Anyway, Intel posted a number of press releases and got a few comments here and there. But sometime last week they decided to get out of the deal. There is nothing wrong with that, but they DELETED all the previous stories and posted some lame excuse. Not that this means anything, but the comments on Intel's previous stories could still be viewed if you knew the exact url. In other words only the stories were deleted; the comments were not. This action generated a number of negative comments on the whole Intel "Opinion Center" idea. Today I went back to check on it and lo and behold they have DELETED ALL THE COMMENTS and marked the story as READ ONLY. While Slashdot claims that they can't or won't delete comments, I think it is pretty clear that things can be done if the price is right. Although I suppose we all already knew this from previous incidents, this time in particular it caught me by surprise. While a few of the comments were trolls, most of them voiced honest but negative opinions of the "Opinion Center". If you want to call it an "Opinion Center", then you should be ready to hear opinions. Otherwise just call a spade a spade: Intel "Marketing Center".
I never liked the idea in the first place, but deleting all the previous stories AND comments is really weak and speaks a lot about the integrity of both Intel and Slashdot. If you think Intel and Slashdot did the wrong thing here, please mod this post up.
Re: (Score:2)
Though as other people have suggested, you can get a DVI-D cable. This should physically fit both ends, and the missing pins on the monitor side are not needed anyway. Other solution is
Re: (Score:2)
To answer your question: at that period, Amiga users were mocking the PC for its lame CGA resolution in games and were proud of the standard Amiga 320x256 (yep, it is 5:4). It was not on the graph, though.
Wireless Video? (Score:2)
I have a wireless keyboard, wireless mouse, wireless hea
Re:Wireless Video? (Score:5, Informative)
Do the math yourself: 1280 x 1024 x 24 x 60 = 1.887 Gbps
This doesn't even begin to take into account any protocol overhead, sync signals, or other useful data such as audio.
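The parent's arithmetic, generalized (a sketch; raw pixel data only, with no blanking, sync, audio, or protocol overhead, exactly as the parent notes):

```python
# Uncompressed video bandwidth: width x height x bits-per-pixel x refresh.

def uncompressed_gbps(width, height, bits_per_pixel, refresh_hz):
    """Raw pixel-data rate in gigabits per second."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

print(uncompressed_gbps(1280, 1024, 24, 60))  # ~1.887 Gbps, as the parent says
print(uncompressed_gbps(1920, 1080, 24, 60))  # ~2.99 Gbps for 1080p60
```

Either figure is far beyond what 802.11-class wireless links of the day could carry, which is the parent's point.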
compress it! (Score:2)
comment |= (joke|irony);
Re: (Score:2)
You think so? Do the math. A resolution of 1024 by 768 pixels, with 24 bits color, contains about 19 megabits of information. With a 50 Mbps link, even assuming you could use the full capacity of that link, that woul
Re: (Score:2)
As others have pointed out, there is the bandwidth issue because your PC is sending out an uncompressed signal. In that same vein, the streams you're getting over wireless 802.11 (assuming you're talking about A/V streams) are being t
Just one simple question.... (Score:3, Funny)
2 cents,
QueenB
What's the straight dope on HDMI? (Score:3, Interesting)
They ended up with the comment that the video quality wasn't up there with component.
So, were they blowing sunshine up my skirt, or is HDMI really the tarpit they describe?
I want long, lossless video cables-where are they? (Score:2)
The ideal cable for me would be one that I could pre-network the house with, so that I could choose to display the output from any of several computers in my house. That way, I could get
So what do you need for longer runs? (Score:3)
So what would it take to make a cable that accommodates long runs and can transfer over 6 Gb/s? Would long runs require an even higher bandwidth because they would need a higher overhead? And how would you accommodate that with a cable? More signal-carrying wires? And how about fiber? The line loss on that wo
Re: (Score:2)
As Overzeetop said, what you need is gigabit ethernet to carry compressed signals and a computer at the other end to decode them. Trying to pipe uncompressed high-resolution video all over your house is impractical and, frankly, stupid.
Re:I want long, lossless video cables-where are th (Score:2, Informative)
For longer distances you'll have to rely on extenders.
Stop the madness! (Score:4, Insightful)
By the time I got DVI on my DVD player and HTPC, I found my TV had HDMI. Now, I'm told "...it's unlikely that HDMI will become more than a footnote in the epic story of PC display technology..." Well that's just great. Yet another adapter that costs $50 at my local outlet for
Many devices today still don't support the existing connections properly, so I have little faith that new connections will improve things. Many TVs have DVI inputs but still overscan. DVDs are still encoded with interlacing. HDCP has connectivity issues like the PS3 debacle. I know people who still tell me that their s-video connection is state of the art. And while most new TVs are using component cables, that is STILL analog and YUV-based instead of digital. The industry is not ready for new connectors.
For an example of connectivity done right, look at USB 2. USB 1, USB 1.1, and USB 2 all use the same connector. The devices negotiate the appropriate speed. Ethernet does this too. Unless there is a very, very good reason, please don't change the physical connections. Increase the bandwidth in a backward-compatible way if necessary.
laser-7 output (Score:2, Funny)
Maintenance Guy: Uhh, what kind of output does this have? This is some ancient Super-VHS output or somethin'. I can't connect it to your float screen.
Cartman: There's gotta be some way to hook it up! It's the freakin' future!
Maintenance Guy: It may be the future for you, but I can't hook up anything to a float screen without at least a laser-7 output.
All I wanna do is play Nintendo Wii!
Is UDI HDCP mandatory? (Score:2)
Is there something somewhere that states UDI HDCP is mandatory for implementors?
Re:DRM/HDCP must be optional. Remember Chinese fac (Score:3, Funny)
Indeed, let's also include the graphics card with the monitor instead of the computer and run an X server on the monitor and connect it through ethernet. If we in addition connect the keyboard and mouse directly to that monitor, we could even put it remote from the actual computer if we wish to. We just need a nice name for that monitor/keyboard/mouse combination running X. Well, wha
Re: (Score:2)
If you RTFA it's in there.