
The State of Video Connections

mikemuch writes "Joel Durham provides a nice rundown on what's happening in video interfaces as we leave VGA behind and move through the DVI flavors, visit HDMI along the way, and look forward to UDI and DisplayPort."
This discussion has been archived. No new comments can be posted.

  • Print Version (Score:5, Informative)

    by Shimdaddy ( 898354 ) on Tuesday February 13, 2007 @07:03PM (#18004794) Homepage
    Spare your eyeballs with the ad-free, one-page print version [extremetech.com].
  • by Anonymous Coward on Tuesday February 13, 2007 @07:06PM (#18004842)
    Most of my monitors are 19-inch CRT monsters. They do what I need them to do, they deliver a pretty image, but they're old. I still have a ViewSonic Optiquest V95 in service that dates back to around 1999. It's a VGA monitor, as are all of my displays. I shudder at the idea of updating them, not because of some sentimental attachment, but because connecting displays to computers has become so darned complicated.

    Analog VGA was the standard for so long that some of us just got used to it. Today, I don't remember the last time I got a performance-grade graphics card with a VGA port on the back of it; I have a small cadre of DVI-to-VGA adapters that I use to plug in my monitors.

    DVI as a standard features a number of sub-standards, some analog, some digital. Now DVI is already seeing the writing on the wall due to its limited bandwidth, just as the world grows accustomed to it. HDMI is crossing from the TV set to the computer, UDI is creeping into the market, and DisplayPort is riding over the horizon and hoping to take over the world.

    What if you just want to play Supreme Commander or do your taxes? Can't you just poke a monitor cable plug into a display adapter and be done with it? Sure you can, if you know what to expect when you face the next generation of graphics-to-display connections.

    VGA

    Sure it's old, but it still works. Video Graphics Array (VGA) has been around since 1987, a few years after which it became the standard connection between the PC and its monitor and stayed that way for more than a decade. If you happen to purchase an analog CRT monitor, even one made today, it's likely to require a VGA connection to a computer.

    The term VGA has come to mean a number of things. In one sense, it's used to refer to the actual port found on a graphics card or the corresponding plug (a 15-pin mini D-sub male) on a monitor cable. VGA is also sometimes used to describe the outdated and rarely used screen resolution of 640x480 pixels, which was once considered sharp and sexy.
    [Figure: VGA connector]

    VGA graphics cards date back to the days of ISA expansion ports. Such cards were typically capable of addressing only 256K of local memory and displaying 256 colors at 640x480 at a 70Hz refresh rate. As demand grew for higher resolutions and more robust graphics support, the original VGA spec became outmoded but the connection port was preserved.

    VGA is analog. Graphics cards with VGA compatibility employ RAMDAC (random access memory digital to analog converter) chips to pipe digital graphics signals through the analog display cable. Of course, with digital displays like flat-panel monitors being all the rage, it would be even cooler to have a direct digital-to-digital connection from PC to display, wouldn't it? That's where DVI came to the rescue.
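
    (As a rough, illustrative sketch only: if we assume the nominal 0.7 V per-channel analog swing of a VGA signal, the RAMDAC step boils down to looking up each digital color value and emitting a proportional voltage. The constants and function name below are made up for illustration.)

    # Toy illustration of the RAMDAC idea (not a hardware model): map an 8-bit
    # digital channel value to an analog voltage. The 0.7 V full-scale swing is
    # the commonly quoted nominal VGA video level, assumed here for illustration.

    FULL_SCALE_V = 0.7

    def dac(value_8bit):
        """Map an 8-bit digital value (0-255) to an analog voltage (0-0.7 V)."""
        return (value_8bit / 255) * FULL_SCALE_V

    # A mid-gray pixel, (R, G, B) = (128, 128, 128), becomes three ~0.35 V levels,
    # one per color gun:
    print(tuple(round(dac(v), 3) for v in (128, 128, 128)))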

    DVI

    DVI stands for Digital Visual Interface. As digital flat-panel monitors started to become the rage at the tail end of the last century, the analog VGA connector quickly became inadequate for the needs of such displays. The DVI port is quite different from that of VGA: It's made up of up to 24 pins (most of which are for TMDS) and an additional five pins for analog compatibility. TMDS stands for Transition Minimized Differential Signaling; it's a high-speed serial interface used by the DVI and HDMI display standards.
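
    (For a sense of scale, here's a back-of-the-envelope sketch of single-link DVI bandwidth, assuming the commonly cited 165 MHz maximum pixel clock, three TMDS data channels, and the 8-to-10-bit expansion TMDS applies on the wire; treat the figures as approximate.)

    # Back-of-the-envelope single-link DVI numbers (assumed figures: 165 MHz max
    # pixel clock, 3 TMDS data channels, 8 data bits per channel expanded to 10
    # bits on the wire by TMDS encoding).

    PIXEL_CLOCK_HZ = 165e6
    CHANNELS = 3

    payload_bps = PIXEL_CLOCK_HZ * CHANNELS * 8    # usable pixel data
    wire_bps = PIXEL_CLOCK_HZ * CHANNELS * 10      # raw bit rate on the cable

    print(f"payload {payload_bps / 1e9:.2f} Gbit/s, on the wire {wire_bps / 1e9:.2f} Gbit/s")
    # payload 3.96 Gbit/s, on the wire 4.95 Gbit/s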

    DVI comes in three flavors:

    * DVI-A, in which the A stands for analog. This type of DVI connection only transmits analog signals and is intended for use with CRT monitors. You almost never see DVI-A.
    * DVI-D, the D meaning digital. This is purely digital, without any analog compatibility at all.
    * DVI-I, with the I standing for integrated. This connection carries both analog and digital signals and can be used with either analog or digital displays. This is the most common DVI connector found on graphics cards.

    To further complicate matters, DVI-D and D
    • Huh? (Score:5, Interesting)

      by JMZero ( 449047 ) on Tuesday February 13, 2007 @07:17PM (#18005000) Homepage
      Such cards were typically capable of addressing only 256K of local memory and displaying 256 colors at 640x480

      My VGA card had 256k of RAM, and it did 640x480 at 16 colors. I wonder why...

      640*480=307200
      256k=262144 bytes

      That's also why most early "VGA" games ran at 320x200x256. I understand that 640x480 is sometimes referred to as VGA regardless of color depth, but that doesn't seem to be what he's doing here.
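
      A quick sanity check of which classic modes actually fit in 256K (a throwaway Python sketch, nothing authoritative):

      # Which classic VGA-era modes fit in 256 KB of video memory?
      VRAM_BYTES = 256 * 1024  # 262,144 bytes

      modes = [
          (640, 480, 4),    # standard VGA 16-color mode: 4 bits/pixel
          (640, 480, 8),    # 256 colors: needs more than 256 KB
          (640, 400, 8),    # the extended 256-color mode discussed below
          (320, 200, 8),    # "mode 13h", the classic 256-color game mode
      ]

      for w, h, bpp in modes:
          needed = w * h * bpp // 8
          verdict = "fits" if needed <= VRAM_BYTES else "does NOT fit"
          print(f"{w}x{h} at {2**bpp} colors: {needed:,} bytes -> {verdict}")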
      • by Mal-2 ( 675116 )
        This is why some old extended-VGA cards (most based on Paradise chipsets) supported 640x400 at 256 colors -- it fits nicely in a 256k memory space. If you ran 640x480x256 on cards that supported (but did not have) 512k, the bottom of the screen was garbage.

        The reason games ran at 320x200x256 was because that was what the low-end PS/2 machines came with standard, and it was the PS/2 that pretty much defined where IBM wanted "beyond EGA" to go.

        Mal-2
    • Re: (Score:3, Interesting)

      Yup. Analog here. Many tubes, and an analog NEC projector. HDMI can go screw itself. I'll be analog with coax until anything else is far far far cheaper (or better).
      • by value_added ( 719364 ) on Tuesday February 13, 2007 @08:16PM (#18005722)
        Analog here. Many tubes, and an analog ...

        And you get a warmer picture than the rest of us, right? ;-)
        • And you get a warmer picture than the rest of us, right? ;-)

          Yup. Another neat feature is that the projector is the only significant heat source up there, and if I leave it on during the day, it will get up to 60°F on the third floor. As it is, I think the snow is the only significant insulation up there.
          • ...if I leave [the projector] on during the day, it will get up to 60°F on the third floor.

            And kill the (expensive) light bulb in it while you're at it. It's probably cheaper to heat the room with your furnace, you know.

    • Plus.... (Score:5, Interesting)

      by shmlco ( 594907 ) on Tuesday February 13, 2007 @08:19PM (#18005758) Homepage
      "Why do we need another display connector?"

      If you move into TV-land you also have coaxial, composite, s-video, component, and HDMI, as well as 1/8 and 1/4" phone jacks, RCA, digital-coax, and digital-optical for audio.

      My personal theory [putting on tinfoil hat] is that it's all a vast conspiracy by the cable and connector manufacturers. Every new connector requires new cables, adapters, and, in the end, replacing "obsolete" equipment that can no longer talk to one another.

      And why does an optical or HDMI cable of sufficient length end up costing more than most DVD players? It's a CABLE for Pete's sake.
      • cable prices (Score:5, Informative)

        by Matthew Bafford ( 43849 ) on Tuesday February 13, 2007 @10:45PM (#18007106) Homepage

        And why does an optical or HDMI cable of sufficient length end up costing more than most DVD players? It's a CABLE for Pete's sake.

        Because that's where the big electronics stores make their profit. Ask a BestBuy employee how much that $100 monster cable costs him under the employee discount program. It'll be significantly closer to the $0 side of the range than the sticker price...

        That said, there are some good companies out there that will sell perfectly good HDMI (and other) cables at reasonable prices. http://www.monoprice.com/ [monoprice.com] is one I've ordered from multiple times and had great results with. My last purchase was 10' of HDMI - I think I paid $10 shipped.

        I actually was surprised to see that Target had 6' of HDMI for $15. A lot better than the $60/6' that was the best I found when I was looking for a quick cable at BestBuy...
        • ...when I was looking for a quick cable ...

          There's your problem.

          "Those stores" need to make money somewhere, and if you HAVE to have it NOW, then you can help pay the bills that make it available to you, NOW.

          We all know you can get a better deal, if you can shop and wait.

        • The same reason that Future Shop will sell you a $50 printer, then try to sell you a $30 USB cable to go with it. (Dollar stores are a great place for dirt cheap, high quality cables for non-bleeding-edge uses. You won't find HD cables there, but for gold plated brand name [GE] USB and others, you can't beat the price.)
      • by cgenman ( 325138 )
        I won't pick up a piece of home electronic gear unless it handles RCA, for this very reason.

        Sure, component is slightly better than S-video. Or was it the other way around? Either way, it's time for a single next standard. That's why they call it a standard. The cabling really isn't the limiting factor in image quality, but it seems to cause the most annoyance.

        • Re: (Score:3, Interesting)

          by shmlco ( 594907 )
          S-video carries the video data as two separate signals (brightness and color), unlike composite video which carries the entire set of signals in one signal line. S-video is better. Component typically breaks them out to three different RGB or Y'PbPr signals. Component beats s-video hands down. All are analog.
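
          (A minimal sketch of that matrix step, using the standard-definition BT.601 luma coefficients on normalized 0.0-1.0 values; the exact constants differ for HD standards, so treat these as illustrative only.)

          # Derive Y'PbPr component signals from R'G'B' (BT.601-style coefficients,
          # normalized 0.0-1.0 values; illustrative only).
          def rgb_to_ypbpr(r, g, b):
              y  = 0.299 * r + 0.587 * g + 0.114 * b   # luma: the "Y" cable
              pb = 0.564 * (b - y)                     # blue color-difference cable
              pr = 0.713 * (r - y)                     # red color-difference cable
              return y, pb, pr

          # Pure red: modest luma, strongly positive Pr, slightly negative Pb.
          print(tuple(round(c, 3) for c in rgb_to_ypbpr(1.0, 0.0, 0.0)))
          # (0.299, -0.169, 0.5)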

          HDMI is an all-digital Transition Minimized Differential Signaling (TMDS) method that also includes audio data. HDMI beats component on today's "digital" LCD and plasma screens when used with digital sour
            HDMI beats component on today's "digital" LCD and plasma screens when used with digital sources like cable or DVDs, as you're not converting from digital to analog back to a digital matrix.

            I keep hearing this and wondering if it is really better, and if so, in what way better.

            IOW, what difference will I see on my screen? What should I look for to recognize signs of degradation?

            I can't help but feel that "it's better because it's -all- digital" is just BS, kind of like "but it goes all the way to 11!".

      • ...what I want is for my satellite box, DVD, amplifier, projector, etc. to all hook up intelligently to each other: no multiple remotes, no daisy-chaining SCART cables, no having to switch the TV to AV1 and the amp to DVD each time - I want it to just work. I want an interconnect that can do video, audio AND control devices. I want my DVR to change my satellite box to the right channel at the right time without messing around with IR blasters and the like. No-one wants complexity, yet look at the average
    • With over-the-air DTV, there is quite a lag between the analog broadcast and the digital one. I presume in the over-the-air TV standards there is some encoding time lag plus the decoding and display lag. Often when switching between analog and digital on the TV, I can get a repeat of an entire sentence or two. It is not limited to just a single broadcaster, but is common to all the networks. I imagine some content is originally HD, and is then decoded in the studio and cropped for NTSC aspect
  • Evolution (Score:3, Insightful)

    by HomelessInLaJolla ( 1026842 ) <sab93badger@yahoo.com> on Tuesday February 13, 2007 @07:06PM (#18004854) Homepage Journal
    On one side, updating the video connector may be a necessary advancement to accommodate higher-bandwidth video modes. On the other side, we can only hope that system vendors don't begin bundling their desktops with their monitors and inhibiting cross-pollination by strictly enforcing IP on their video adapter designs.

    I would hate to see the day when I use one display device for Linux and need an entirely different device to be compatible with proprietary DRM/TC/HD output or have to buy a third party descrambler type box--because we all know what a racket those were. It'd be like early 80s cable TV wars all over again.
  • Piss off! (Score:5, Insightful)

    by Anonymous Coward on Tuesday February 13, 2007 @07:08PM (#18004890)
    What's with these never ending fscking changes? Obsolescence built in, incompatible formats, changing far too frequently. Bullshit DRM "features" in each new revision.

    Please stop this crap! Just give us simple digital connectors and let the boxes talk to each other. How about something plain and simple, like 10Gb Ethernet?
    • Re: (Score:3, Insightful)

      by fyngyrz ( 762201 ) *

      How about just analog RGB and quit pretending we need digital connections at all?

      You want high bandwidth? Analog RGB can do it. You want deep color? Analog RGB can do it. You want to avoid DRM? Analog RGB is perfect for that. You want easy to record? Analog RGB -> Analog recording media *or* digital(ized) media. You want easy to connect? Analog RGB. You want easy to switch between signal sources? Analog RGB. You want easy to buffer and redistribute? Analog RGB. You want auto-mode detection? We fool e

      • by jZnat ( 793348 ) *
        In the software (or computer science) side of the story, it's far easier to deal with digital (i.e. discrete) signals than analogue ones. Besides, all the visual data is digital in the first place in these situations (unless you're using VHS or analogue OTA broadcasting), so all this DAD conversion nonsense is just that: nonsense.
        • by fyngyrz ( 762201 ) *

          In the software (or computer science) side of the story, it's far easier to deal with digital (i.e. discrete) signals than analogue

          Conversion to digital is trivial. So is conversion back. End of problem. Or more to the point, there never *was* a problem.

          Besides, all the visual data is digital in the first place in these situations

          Yes, I've noticed the sun bouncing bits off me towards the camera quite often. And when the sun isn't out, what would we do without our digital floodlights, merrily em

          • Re: (Score:3, Insightful)

            by mrchaotica ( 681592 ) *

            Yes, especially since we're talking about ADA, not DAD.

            I hate to break it to you, but newer displays (i.e., LCD and everything else that's not CRT) are inherently digital. So yes, we are talking about DAD conversion.

            • by fyngyrz ( 762201 ) *

              ...newer displays (i.e., LCD and everything else that's not CRT) are inherently digital.

              Really? So you maintain that the liquid crystal in an LCD cell responds in digital - discrete - steps of brightness. The crystal is standing in the cell on a 64- or 256-step ratchet, waiting to pivot, driven by six or eight bits of control, is it? This must be that "new nano physics" I've heard about. :)

              No, the fact is that LCD display cells are purely analog in mechanism; apply an analog voltage or current, and

              • They are rigid in X:Y location but getting the signal to the right place at the right time doesn't in any way require the conversion of the signal from analog to digital at any point.

                Really? You mean pixels on an LCD aren't signaled in a (more or less) similar way as cells in a RAM chip? I'm no electrical engineer, but I would have assumed...

                • Re: (Score:3, Interesting)

                  by fyngyrz ( 762201 ) *

                  Really? You mean pixels on an LCD aren't signaled in a (more or less) similar way as cells in a RAM chip? I'm no electrical engineer, but I would have assumed...

                  There are two issues that relate directly to your concern. One is addressing; in order to pick a ram location or a screen pixel, a signal needs to be sent to the particular location that says "hey you, and not any other." Displays and RAM can be similar in this regard, though it is more likely that a simpler scheme of sequential counters is use

      • Re:Piss off! (Score:4, Interesting)

        by RzUpAnmsCwrds ( 262647 ) on Tuesday February 13, 2007 @09:12PM (#18006334)

        How about just analog RGB and quit pretending we need digital connections at all?


        How about we stop pretending that analog RGB looks good? Ever try screwing with the contrast setting on an LCD? That's analog technology at work.

        DVI lets me see the image outputted by my graphics card - pixel and value precise. Neither my monitor nor my graphics card supports HDCP, so DRM isn't a problem.

        As a public service, let me remind you that high-bandwidth analog signals are problematic. It doesn't take much for noise, crosstalk, or other issues to show up on an analog monitor at high resolutions.

        Try connecting your monitor to your desktop with a 20 foot DVI cable - then try doing the same thing with an analog RGB cable.

        Try using a crappy KVM. Most screw up resolutions greater than 1600x1200.

        Analog is the reason my cable signal looks like shit. It's the reason why broadcast TV looks crappy. It's the reason why AMPS cellphones have static.

        So, hell, why shouldn't we take a nice clean digital signal, run it through a DAC, throw it through a cable, and try to reconstruct it into a digital signal with an ADC at the other end. Extra components, extra complexity, and more chances for interference. What a great idea.
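
        (To make that last point concrete, here is a toy simulation, with a made-up noise level, of what a digital-to-analog-to-digital round trip does to exact pixel values; it's only a sketch, not a model of any real DAC/ADC.)

        # Toy D->A->D round trip: exact 8-bit values, a bit of pretend cable noise,
        # then re-quantization. The 0.6-LSB noise figure is invented for illustration.
        import random

        random.seed(1)

        def round_trip(value, noise_lsb=0.6):
            analog = value + random.gauss(0.0, noise_lsb)   # DAC output plus cable noise
            return min(255, max(0, round(analog)))          # ADC re-quantization

        samples = [random.randrange(256) for _ in range(100_000)]
        errors = sum(1 for s in samples if round_trip(s) != s)
        print(f"{errors / len(samples):.1%} of samples changed value")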
        • by monopole ( 44023 )
          Dead on.
          The problem 'tho isn't the digital nature of the connection, it's the bloody DRM that everybody is trying to ram down our throats on any new standard. In 5-10 years, the cascade of failures and incompatibilities arising from DRM, coupled with its complete failure to protect content, will make it the Edsel of computing.
        • Re: (Score:2, Insightful)

          by Anonymous Coward
          Digital is the reason I LOSE THE ENTIRE PICTURE (whole screen jams/tears/artifacts) when watching cable, rather than a slight glitch or artifact.
        • Try using a crappy KVM. Most screw up resolutions greater than 1600x1200.

          Try finding a "crappy" KVM that supports DVI. (Hint: they don't exist -- or at least they'd better not be crappy, for that price!)

          Other than that, I agree with you.

        • Re:Piss off! (Score:5, Informative)

          by fyngyrz ( 762201 ) * on Wednesday February 14, 2007 @02:23AM (#18008492) Homepage Journal
          How about we stop pretending that analog RGB looks good

          How about we stop pretending it doesn't? Especially, as in your case, when there is no basis for such an assertion. I have full HD over component. My system looks beautiful. Ergo, analog doesn't give you a poor image, there's nothing inherent in it that prevents a good picture.

          As a public service, let me remind you that high-bandwidth analog signals are problematic. It doesn't take much for noise, crosstalk, or other issues to show up on an analog monitor at high resolutions.

          Please. My cables hang slack in the basement, hooked over projecting screws, run about 30 feet, and they are fine. Why? Because it doesn't take much (as in, proper termination, decent coax, low-loss connectors) to run high bandwidth analog just about any distance you like. Claims to the contrary are nonsense. Can you screw up such a run? Sure. Just try it using audio cables. But for that matter, try running a multi GB/s digital signal through an audio cable and watch what happens. I mean, aside from hosing every RF receiver in your home. Yes, we're in a zone where the cables need to be right. This is no different from a digital copper run. Optical is something else entirely. But of course, you can run analog optically as well. :)

          Try connecting your monitor to your desktop with a 20 foot DVI cable - then try doing the same thing with an analog RGB cable.

          Oh, please. Such marketing-inspired tripe. You picked the wrong person to try and push over what you thought was a hypothetical.

          I have a 17 foot (204 inch) display driven exclusively by component from the receiver, though I also feed it analog from a Mac via a VGA input - that's the media librarian using Delicious Library. It looks absolutely fabulous either way. You can see every glorious pixel in HD, up close. The projector has about 30 feet of cable on it, not counting the various lengths of cable the component HD input sources (Xbox 360, HD DVD, Blu-ray, PS3, satellite) feed to the receiver and the switches; there are no problems with ringing or artifacts whatsoever. The cables go down through the floor, along for quite a distance, and back up at, and through, the projector's pedestal. Of course I don't use Radio Shack RCA cables to do this; I use a triple run of coax and I have it properly terminated, but this is no big deal, and the technology can be built into any simple cable without adding significant cost as compared to, for instance, a many-pinned multi-pin connector.

          The manufacturers have been feeding you bullshit so long you think it is true. Well, it's not, and I can prove it.

          Are there advantages or unique uses to/for digital transport? Certainly. But is digital transport in any way required to view for instance, full HD at 1080x1920 at 60fps in high quality? No. Absolutely, resoundingly, factually, no.

          Analog is the reason my cable signal looks like shit.

          No, shitty equipment and/or shitty standards and/or shitty service is why your cable looks like shit. Cable can look butter smooth. The fact that yours doesn't isn't a reflection on technology, it is a reflection on what consumers will put up with because they're badly misinformed about what is reasonable and possible.

          Try using a crappy KVM. Most screw up resolutions greater than 1600x1200.

          Listen to yourself. "Try using a crappy..." Why would I do that? Really, why? When I need one, I use one that is adequate to my needs. Nothing screws up at all. I switch between linux servers using a KVM and the results are pixel-perfect. It's 100% analog. Using crappy equipment will certainly get you crappy results, but why would you think this has any bearing whatsoever upon the inherent capabilities or limitations of the underlying technology? Talk about backwards reasoning!

          • Re: (Score:3, Insightful)

            I have full HD over component. My system looks beautiful. Ergo, analog doesn't give you a poor image, there's nothing inherent in it that prevents a good picture.

            Well, you sure told us with your lone anecdotal data point.

            Computer display data starts out in the digital domain. An LCD panel requires digital signals to generate an image. There's NO GOOD REASON to convert that signal from digital to analog to digital in between -- there WILL be degradation, however slight.
            • Re: (Score:3, Informative)

              by fyngyrz ( 762201 ) *

              Well, you sure told us with your lone anecdotal data point.

              It proves the point; such systems are workable. That's all it is there for, to solidly discredit the ridiculous claims that are appearing in this thread about noise, resolution and so forth at HD. However, presuming mine is the only such system is really kind of dim. I bought everything off the shelf. You can reasonably assume I am not the only consumer to have done so. Or do you think I'm really the only guy with a high end component system?

      • Want to use a display at up to 2560 x 1600 resolution [apple.com]? Oh wait, you can't use an analog RGB signal.
        • Re: (Score:3, Informative)

          by toddestan ( 632714 )
          There's no reason analog RGB won't carry 2560 x 1600 resolution - it's just that Apple doesn't support it.
      • by Apotsy ( 84148 )
        Trouble is, the analog output quality of video hardware varies greatly. Older nVidia graphics cards with VGA ports are a good example. They have extremely poor output quality compared with most cards. There was a trick you could do to solder an extra bit of electronics onto the board and improve things (nVidia was using cheap parts), but how many people are willing to do that?

        Other than that I agree with you for the most part.

  • HDCP: it still sucks (Score:5, Interesting)

    by schwaang ( 667808 ) on Tuesday February 13, 2007 @07:10PM (#18004906)
    This article pimps UDI, which uses an HDMI-backwards compatible plug and can do higher bandwidth (10.8Gbps) and HDCP (copy protection enforcement).

    Unfortunately, HDCP implementation sucks. Standard procedure for the problems almost everyone has with HDCP-enabled cable boxes is to *reboot the box*. Apparently, in the exchange of encryption keys a handshake sometimes gets dropped, and nobody has a firmware solution.

    Of course, even if it worked right, HDCP would still suck.
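
    (A toy model, not the real HDCP algorithm, of why a dropped handshake leaves you staring at a blank screen until something restarts the exchange; the retry count and drop probability are invented for illustration.)

    # Toy model of an HDCP-style link: source and sink must complete a key
    # handshake before protected video flows; in this sketch a lost message
    # simply fails the attempt, and repeated failures leave the screen blank.
    import random

    random.seed(7)

    def handshake(drop_probability=0.2):
        """Pretend key exchange; a dropped message means the check values never match."""
        return random.random() >= drop_probability

    def switch_input(retries=3):
        for attempt in range(1, retries + 1):
            if handshake():
                return f"video after attempt {attempt}"
        return "blank screen -- power-cycle the box and try again"

    print(switch_input())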
    • Standard procedure for the problems almost everyone has with HDCP-enabled cable boxes is to *reboot the box*. Apparently, in the exchange of encryption keys a handshake sometimes gets dropped, and nobody has a firmware solution.
      Heck, even the hardware devices won't cooperate when they're forced to use HDCP!
    • Re: (Score:3, Insightful)

      Unfortunately, HDCP implementation sucks. Standard procedure for the problems almost everyone has with HDCP-enabled cable boxes is to *reboot the box*. Apparently, in the exchange of encryption keys a handshake sometimes gets dropped, and nobody has a firmware solution.

      No, the implementations of HDCP TOTALLY RAWK!

      This way, people who would normally never care enough about DRM and copy prevention to even notice are getting a big steaming cup of wake-up. Anyone who has to put up with HDCP handshake failures
  • by Anonymous Coward on Tuesday February 13, 2007 @07:13PM (#18004946)
    VGA isn't going anywhere until we replace all our KVM rack switches and who needs HD for a TTY?

    • by Nik13 ( 837926 )
      Even for home setups! I've looked at replacing my current monitors (nothing wrong with them) with some nice Dell Ultrasharp LCDs. But 4 new monitors, a new 4 port KVM switch with DVI and preferably USB too (some PC mfgs are dropping PS/2 plugs), 4 new video cards... That's already quite the expense just to have flat screens.

      But the real problem is with my dual monitor setup. There are very few 4 port KVM switches with dual DVI, and even fewer that also have USB, and they are VERY expensive. Add the cable set
      • by stuuf ( 587464 )
        What the hell kind of setup do you have at your "home," and what do you use it for? From reading the shopping list you posted, you seem to have four computers, each with dual-head video, two monitors connected to a KVM switch, and two more monitors not doing anything. And you're complaining that buying more video hardware at once than most home users buy in three years isn't cheap.
    • by Lumpy ( 12016 ) on Tuesday February 13, 2007 @09:38PM (#18006592) Homepage
      Yup. And guess what: good old VGA does HDTV just fine. I do 1280x1024 all day long on a VGA connection, so that is far higher than HDTV 720p. I am sure I can do 1080p over VGA; I just haven't found a video card or LCD that can handle it yet that did not cost 3 arms, 2 legs, and a kidney.

      Honestly engineers and marketing guys talk all day long about how good X or Y is, and it all comes down to "how can we shove our DRM into the new standard and fool customers into buying it."

      My friend thought I was nuts buying a pair of 21" LCDs that had only VGA on them. They look fantastic, play FPS games great, and work just fine with my 7300GT card.

      VGA will disappear as soon as RS232 disappears, which, from what I see in the integration market, is many, many years from now, and only IF electronics makers get off their asses, which is highly unlikely from what I have seen.
      • Guess what? VGA == RGB (almost). Now component video is Y+(B-Y)+(R-Y), or just the demodulated but un-dematrixed version of S-video. S-video is Y + ((B-Y) and (R-Y) modulated onto a subcarrier). S-video simply transmits the color subcarrier on its own wire so you don't need a comb filter to pull it out of the Y signal. This also removes any limits on the bandwidth of the Y signal. With both component video and S-video the monitor must still go through the de-matrix process to generate good old RGB to display the image. S
    • Of course, all your new servers come with management cards that provide a VGA BIOS and a video card driver but send the frame buffer (and take keyboard/mouse I/O) over a management LAN anyway using RFB/ICA, right?

      Right?

      OK, so maybe you don't need those because your servers run command-line friendly operating systems, which will have an IPMIv2 daughterboard on the motherboard IPMI interface to enable remote power control, serial-console-over-LAN, etc. Right?

      Right?

      *sigh*. I'll go get my keyboard and VGA cable
  • In the meanwhile, The Inquirer [theinquirer.net] continues its series of articles about external video card connections.

    Me? I fly with proprietary fibre [engadget.com] solutions! Well, I would if I were dirt rich.

    Having your graphics display remote from the consoles they are attached to is absolutely amazing. I wish we could wire our entire office with decent thin clients.
  • by Anonymous Coward on Tuesday February 13, 2007 @07:16PM (#18004982)
    As I'm sure many of you have noticed, Intel and OSTG went into some kind of marketing deal with the Intel "Opinion Center" [slashdot.org] on Slashdot. There is nothing inherently wrong with that as all of the "stories" (rehashed press releases) were posted in Intel's own section; none of them were on the front page or in any of the other sections. AMD had a similar deal [slashdot.org] a while ago, but that appears to have been over for a while now. The strange thing about Intel's deal is that the link on the front page is in a somewhat prominent position and has a different color scheme in order to make it stand out. But what is more interesting is that the link IS NOT A DIRECT LINK. Instead it redirects through DoubleClick for some reason. I am not trying to make this sound sinister, but I found that a little odd.

    Anyway, Intel posted a number of press releases and got a few comments here and there. But sometime last week they decided to get out of the deal. There is nothing wrong with that, but they DELETED all the previous stories and posted some lame excuse. Not that this means anything, but the comments on Intel's previous stories could still be viewed if you knew the exact url. In other words only the stories were deleted; the comments were not. This action generated a number of negative comments on the whole Intel "Opinion Center" idea. Today I went back to check on it and lo and behold they have DELETED ALL THE COMMENTS and marked the story as READ ONLY. While Slashdot claims that they can't or won't delete comments, I think it is pretty clear that things can be done if the price is right. Although I suppose we all already knew this from previous incidents, this time in particular it caught me by surprise. While a few of the comments were trolls, most of them voiced honest but negative opinions of the "Opinion Center". If you want to call it an "Opinion Center", then you should be ready to hear opinions. Otherwise just call a spade a spade: Intel "Marketing Center".

    I never liked the idea in the first place, but deleting all the previous stories AND comments is really weak and speaks a lot about the integrity of both Intel and Slashdot. If you think Intel and Slashdot did the wrong thing here, please mod this post up.
    • Re: (Score:3, Informative)

      by Matt Perry ( 793115 )
      I've found that it works better for me to get rid of the left bar. It's just eating up space anyway. You can do the same thing by putting the following in your userContent.css file:

      /* Remove the left column */
      #links {
          display: none !important;
      }
      #contents {
          margin-left: 0px !important;
      }
      Now I never have to worry about "Opinion Centers".
    • by Saeger ( 456549 )
      Guess it should have been called "Intel Positive-Opinion Center Only". I had no idea such corporate lameness was going on behind that odd-colored intel ad-link (and the doubleclick redirect doesn't even work when using privoxy).
    • Re: (Score:2, Informative)

      by !eopard ( 981784 )
      I had no idea what this post was about, then I actually looked at the left hand navigation frame. I had never noticed this Opinion Center, with the highlighted 'Intel' under it. How long has it been there? Guess I have been reading Slashdot so long now* that I don't bother looking at those things. At home my RSS feed means I never even see the front page. Now to get back to ignoring those links. *early 2000, though I didn't register until recently.
  • No mention of Wireless HDMI? [engadget.com]
  • I wish they'd hurry up and standardise the damn things. I just bought a Chimei LCD and the cable supplied is a DVI-I to DVI-I but my video card (Xpertvision Geforce 6600GT) has a DVI-D port, and for the life of me I can't find a shop here in Australia that sells a DVI-I to DVI-D cable! I can see why so many people don't like computers. Standards like SATA (small cables!) and USB (plug just about anything in) are going the right way. Hey, why couldn't we use USB2, wouldn't ~400mbits be enough?
    • by sharkey ( 16670 )
      You can plug a DVI-D cable into a DVI-I jack, IIRC.
    • Get a DVI-D to DVI-D cable; it will work fine. DVI-I is just set up to carry both analogue and digital signals, and your monitor doesn't need both. A -D cable works fine in an -I port.
    • Hey, why couldn't we use USB2, wouldn't ~400mbits be enough?
      Nope. 1080p at 30fps uses more than triple that.
    • by repvik ( 96666 )
      400mbit/sec? Considering I'm running my PC display at 1600x1200, 32-bit, at 100Hz, that's a whopping 6,144,000,000 bits per second. USB2 isn't nearly enough even for HDTV (720p clocking in at 1.4Gb/sec IIRC).
    • Re: (Score:3, Funny)

      by crabpeople ( 720852 )
      Sata keyboards ftw. It make you type bettar!

  • There are too many lame names [wikipedia.org] for all the "standards"! They aren't self-explanatory at all.
  • Why is it that I can get wireless 50mbps streams over wireless (well, when things are working), in a generic, 802-wireless way, over a hundred yards or so, but I can't get video from my computer to my monitor, over one foot, a fraction of the bandwidth, to a wireless monitor. The time is way overdue for a ubiquitous wireless monitor spec... I'm actually surprised Apple hasn't innovated on this front (although their iMacs are an elegant alternative).

    I have a wireless keyboard, wireless mouse, wireless hea
    • Re:Wireless Video? (Score:5, Informative)

      by dpokorny ( 241008 ) on Tuesday February 13, 2007 @07:53PM (#18005450)
      Perhaps this is because even a modest resolution (by today's standards) needs nearly 2Gbps of bandwidth?

      Do the math yourself: 1280 x 1024 x 24 x 60 = 1.887Gbps

      This doesn't even begin to take into account any protocol overhead, sync signals, or other useful data such as audio.
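
      The same arithmetic for a few other common modes, as a short sketch (raw, uncompressed pixel data only; no blanking, sync, or audio):

      # Uncompressed bandwidth = width x height x bits-per-pixel x refresh rate.
      modes = [
          ("1280x1024 @ 60 Hz", 1280, 1024, 24, 60),
          ("1280x720  @ 60 Hz", 1280, 720, 24, 60),
          ("1920x1080 @ 60 Hz", 1920, 1080, 24, 60),
          ("1920x1080 @ 120 Hz", 1920, 1080, 24, 120),
      ]
      for name, w, h, bpp, hz in modes:
          print(f"{name}: {w * h * bpp * hz / 1e9:.2f} Gbit/s")
      # 1.89, 1.33, 2.99, and 5.97 Gbit/s respectively -- all far beyond a
      # 50 Mbit/s wireless link.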

      • Obviously there needs to be a chip at each end to compress/decompress the signal to/from MPEG-2; that'd sort out the bandwidth!

        comment |= (joke|irony);
    • ``Why is it that I can get wireless 50mbps streams over wireless (well, when things are working), in a generic, 802-wireless way, over a hundred yards or so, but I can't get video from my computer to my monitor, over one foot, a fraction of the bandwidth, to a wireless monitor.'' (emphasis mine)

      You think so? Do the math. A resolution of 1024 by 768 pixels, with 24 bits color, contains about 19 megabits of information. With a 50 Mbps link, even assuming you could use the full capacity of that link, that woul
    • Why is it that I can get wireless 50mbps streams over wireless (well, when things are working), in a generic, 802-wireless way, over a hundred yards or so, but I can't get video from my computer to my monitor, over one foot, a fraction of the bandwidth, to a wireless monitor.

      As others have pointed out, there is the bandwidth issue because your PC is sending out an uncompressed signal. In that same vein, the streams you're getting over wireless 802.11 (assuming you're talking about A/V streams) are being t

  • by queenb**ch ( 446380 ) on Tuesday February 13, 2007 @08:00PM (#18005524) Homepage Journal
    does this mean better p0rn?

    2 cents,

    QueenB
  • I've heard from the home theater folks that HDMI was a seriously broken implementation. v1.1 wasn't necessarily compatible from device to device, v1.2 only carried stereo, and at the time I was in the market, only the PS3 used v1.3... and they weren't necessarily backwards compatible.

    They ended up with the comment that the video quality wasn't up there with component.

    So, were they blowing sunshine up my skirt, or is HDMI really the tarpit they describe?
  • I learned nothing in that article. Here are some issues that interest me: Which of these standards can support very long cables with perfect digital reproduction at 1080p/120Hz? Because I'm looking forward to ceiling-mounted projectors capable of this and I'll need a long cable from the display source, probably a computer.

    The ideal cable for me would be one that I could pre-network the house with, so that I could choose to display the output from any of several computers in my house. That way, I could get

    • You're going to need just over 6Gb/s of data transfer, presuming 24 bits and 10% overhead. That doesn't bode well for long runs. I presume you'll be using a computer for this (as no consumer grade video does 1080/120), so you may as well put your computer/scaler near where the projector is and plan on a network connection that will handle the compressed traffic.
      • Well, I was pulling those numbers from thin air a bit. This isn't a plan yet, this is more of a wish for the future. There is no projector (that I can afford anyway) which can meet the specs I outlined.

        So what would it take to make a cable that accommodates long runs and can transfer over 6 Gb/s? Would long runs require an even higher bandwidth because they would need a higher overhead? And how would you accommodate that with a cable? More signal-carrying wires? And how about fiber? The line loss on that wo

        • As Overzeetop said, what you need is gigabit ethernet to carry compressed signals and a computer at the other end to decode them. Trying to pipe uncompressed high-resolution video all over your house is impractical and, frankly, stupid.

    • by Anonymous Coward
      DisplayPort: 15m with 2 wires at 1080p. Demonstrated with 2 crappy wires in one of the earliest demonstrations.
      For longer distances you'll have to rely on extenders.
  • by antdude ( 79039 ) on Tuesday February 13, 2007 @09:47PM (#18006668) Homepage Journal
    What bugs me about DVI is that KVMs with it are still expensive. I am still using VGA with my old Belkin OmniCube KVM switches that I bought back in 2001.
  • Stop the madness! (Score:4, Insightful)

    by MobyDisk ( 75490 ) on Wednesday February 14, 2007 @12:04AM (#18007628) Homepage
    UDI? If another connection comes out, the back of my TV set will look like the interior of a Borg Cube.

    By the time I got DVI on my DVD player and HTPC, I found my TV had HDMI. Now, I'm told "...it's unlikely that HDMI will become more than a footnote in the epic story of PC display technology..." Well that's just great. Yet another adapter that costs $50 at my local outlet or 45 cents plus shipping on eBay. And the excuse that this is "just for PCs" doesn't help, since my PCs hook to my TVs (and I'm not alone anymore; this is happening more often).

    Many devices today still don't support the existing connections properly, so I have little faith that new connections will improve things. Many TVs have DVI inputs but still overscan. DVDs are still encoded with interlacing. HDCP has connectivity issues like the PS3 debacle. I know people who still tell me that their s-video connection is state of the art. And while most new TVs are using composite cables, that is STILL analog and YUV based instead of digital. The industry is not ready for new connectors.

    For an example of connectivity done right, look at USB 2. USB 1, USB 1.1, and USB 2 all use the same connection. The devices negotiate the appropriate speed. Ethernet does this too. Unless there is very very good reason, please don't change the physical connections. Increase the bandwidth in a backward-compatible way if necessary.
  • by josh2a ( 577818 )
    Cartman: Come on! Come on! Dude, what is taking so long! I wanna play!

    Maintenance Guy: Uhh, what kind of output does this have? This is some ancient Super-VHS output or somethin'. I can't connect it to your float screen.

    Cartman: There's gotta be some way to hook it up! It's the freakin' future!

    Maintenance Guy: It may be the future for you, but I can't hook up anything to a float screen without at least a laser-7 output.

    All I wanna do is play Nintendo Wii!
