Goodbye, VGA
jones_supa writes "Leading PC companies have expressed their intention to finally start phasing out legacy display interfaces. Intel plans to end support for LVDS in 2013 and VGA in 2015 in its PC client processors and chipsets. While the large installed base of existing VGA monitors and projectors will likely keep VGA on PC back panels beyond 2015, PC and display panel makers strongly support this transition. The DisplayPort connector provides backwards and forwards compatibility by supporting VGA and DVI output via certified adapters, while also providing new capabilities such as single-connector multi-monitor support."
  • by Corporate Troll ( 537873 ) on Thursday December 09, 2010 @08:44AM (#34499718) Homepage Journal

    Oh, I wouldn't say goodbye just yet.... 2015 is still a long way off. Recently, the monitor at my parents' place failed (a 2 or 3 year old 1280x1024 LCD panel... all the CRTs before it lasted way longer. This LCD craze does have its downsides). Their computer has an old GeForce4 MX 440 or so with only a VGA port. I went to a local electronics shop and found a 23" Full HD LCD panel for an incredible 149€. I bought it, but then I got worried: the box didn't mention VGA at all, only DVI. I was a bit scared I'd have to upgrade to DVI. Not that it matters much; I have tons of older video cards with DVI, so it would just have been a bit of extra work.

    It turned out, when I opened the box, that only a VGA cable was included. A DVI connector was there, and I'm pretty sure it would work. For me it was ideal, but someone planning to connect it to a DVI-only machine would probably have had to go back and buy a cable.

    Also keep in mind that a lot of laptops only have VGA. As far as I know there are no VGA-to-DVI adapters (DVI-to-VGA does exist). Since these days computers five years old and older fulfill the needs of most users, don't expect VGA monitors to disappear soon. Companies will cater to the needs of those "left behind".

    DisplayPort? I haven't even seen a computer with that by default... Macs perhaps? I don't know, we only have an iMac, and since the monitor is built-in I didn't bother looking for display connectors.

    No, wait... I think my father's new Alienware laptop has a DisplayPort. Totally forgot about that. It's less than a year old, though.

    • by jimicus ( 737525 ) on Thursday December 09, 2010 @08:50AM (#34499772)

      Macs have DisplayPort connectors, and have done for some time.

      Though I wouldn't be too surprised to see this continue for some time - hell, you can still buy a PC with PS/2 connectors, FFS.

      • by jawtheshark ( 198669 ) * <slashdot.jawtheshark@com> on Thursday December 09, 2010 @08:55AM (#34499824) Homepage Journal

        What's wrong with PS/2 connectors? I prefer them, unlike USB they don't require polling as they are interrupt driven. When I can choose, I take PS/2 over USB for keyboards and mice. Saves USB ports too for other duties.

        • by Pojut ( 1027544 )

          Agreed! I just wish I didn't have fifty billion PS/2 adapters in my desk drawers -_-;;

          Note: I'm totally kidding. It's more like sixty billion.

        • by corbettw ( 214229 ) on Thursday December 09, 2010 @09:07AM (#34499912) Journal

          Or you could, I dunno, get a USB keyboard that has two or four USB ports on it. Try doing that with PS/2.

          • Oddly enough I have one of those at work... I never think of using them... I always connect stuff directly to the laptop or the docking station. In my mind a keyboard is still something standalone... Heck, even my external monitor has USB connectors. I never use those either. I simply don't think of them as USB hubs.

          • Great plan. As long as you're not using too much power on your USB devices, that is...

          • by Lumpy ( 12016 ) on Thursday December 09, 2010 @09:19AM (#34500068) Homepage

            And slow your stuff down to USB 1.1 spec. Oh, and there's almost no power there. I have a USB flash stick that will not work off a keyboard USB port; not enough power.

            Extra USB ports on your keyboard are like stick-on air vents for a car... there for show only.

            • My Mac extended keyboard has two USB ports on it. My USB 2.0 3.5" hard drive is connected to one, an ancient Logitech mouse to the other, and the HDD speeds are perfectly in line with a USB 2.0 device.

              USB 2.0 thumb drives work fine off it too. The only thing it can't do is power an external 2.5" drive or my iPhone; I have to connect those directly to the computer's USB.

              So, hardly for show only.

              Conclusion: it's not USB keyboards that are the problem, it's poor-quality ones.

        • What are you doing with your mouse and keyboard that the protocol makes a practical difference? I'm legitimately curious, not sniping. I use a PS/2 keyboard to save ports myself.

          • Only one thing I can think of... the IBM Model M. It tends to have issues with PS/2 to USB adapters. I use PS/2 mice and keyboards too (and ADB on all my Macs, since they are old). PS/2 keyboards always seem to work correctly at boot for BIOS setup; I've had problems with USB keyboards on some machines. PS/2 mice don't have the polling problems that another poster mentioned.
            • by Bigbutt ( 65939 )

              What sort of issues? I have two: one is connected to my laptop via DIN-to-PS/2 and PS/2-to-USB adapters, the other via a DIN-to-PS/2 adapter to my main computer.

              [John]

            • Only one thing I can think of... the IBM Model M. It tends to have issues with PS/2 to USB adapters

              Best thing I've found for this:

              http://pckeyboards.stores.yahoo.net/customizer.html [yahoo.net]

              It's essentially a marginally updated clone of the IBM Model M. Available in black, with a 104-key layout instead of 101, a USB interface option, and a straight cord rather than the annoying coiled one from the original Model M. I've got the Unicomp spacesaver version (same layout but less border plastic around the edges) as well as a real IBM Model M (as well as 2 other mech-switch keyboards of different brands), and the Unicom

          • In my organization, all computers run full-disk encryption with a pre-boot screen that pops up to enter a password. We use both Guardian Edge and WinMagic products for this purpose. In one fairly common failure mode we've seen with Guardian Edge Hard Disk encryption, when we need to type in an admin account name and password to unlock a machine, the machine simply doesn't recognize USB devices. Plug in a PS/2 keyboard, reboot, and then we can log on and fix 'em.

            I'm pretty clueless a

            • by Guspaz ( 556486 )

              There's an option in most BIOSes for whether they should handle USB keyboard support themselves until the OS takes over. You probably need to flip that setting.

            • Afaict it's a BIOS issue. During the early boot phase the BIOS is doing the job of providing basic input from the keyboard and output to the screen (hence the name BIOS ;) ).

              Unfortunately there was a long time lag between the introduction of USB and proper BIOS support for USB keyboards (especially for keyboards behind hubs, such as the ones in most USB KVM switches). And even when the BIOS does support it, it's not always turned on.

              After windows loads there is another phase of annoyance if devices have move

          • by AdamHaun ( 43173 )

            The USB keyboard protocol polls the keyboard for changes at regular intervals. If two keys change state very close together (i.e., if you're a fast typist), the changes will be sent in the same data packet. The problem is that the protocol doesn't care about the order of the keypresses and just handles the changes in QWERTY order, so I get typos in my text whenever I type in the "wrong" order. The $100 Das Keyboard is particularly bad about this due to its n-key rollover feature, but others do it too.

            Modern c

            • by Ark42 ( 522144 )

              I've seen that on cheap PS/2 keyboards too. It's really annoying when you can't even type a 3 letter word like TWO without it coming out as WTO every single time. At first I thought I was crazy, but it's very reproducible if you type fast.
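The ordering problem described above follows from how USB HID "boot protocol" keyboard reports work: each 8-byte report is a snapshot of the keys currently held (modifier byte, reserved byte, then up to six keycodes), not a timestamped event stream, so two keys that both arrive between polls land in one report with no record of which was struck first. A minimal Python sketch of host-side decoding; the usage-ID subset and helper names are illustrative, not taken from any particular driver:

```python
# HID keyboard usage IDs for a few letters (0x17='t', 0x1a='w', 0x12='o').
USAGE_TO_CHAR = {0x17: "t", 0x1a: "w", 0x12: "o"}

def new_presses(prev_report: bytes, curr_report: bytes) -> list:
    """Diff two 8-byte boot-protocol reports and return newly pressed keys.

    Bytes 2..7 hold the keycode array; 0 means an empty slot. The order
    returned is simply the array order inside the report -- press order
    within a single polling interval is unrecoverable.
    """
    prev_keys = set(prev_report[2:8]) - {0}
    curr_keys = [k for k in curr_report[2:8] if k != 0 and k not in prev_keys]
    return [USAGE_TO_CHAR.get(k, "?") for k in curr_keys]

# A fast typist hits 't' then 'w' between two polls, so one report
# carries both; the host sees only the array order.
idle = bytes(8)
both = bytes([0, 0, 0x17, 0x1a, 0, 0, 0, 0])
print(new_presses(idle, both))  # -> ['t', 'w']
```

Whether "tw" or "wt" was typed, the host receives the same report, which is exactly the reordering the posters above are describing.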

        • What's wrong with PS/2 connectors? I prefer them, unlike USB they don't require polling as they are interrupt driven. When I can choose, I take PS/2 over USB for keyboards and mice. Saves USB ports too for other duties.

          Gosh, back when USB wasn't so common, there was an article here talking about the relative work Linux needed to do to handle both. The PS/2 port needed to be sampled at 200Hz, while USB was doing hardware offloading of the work, so the computer could have more interrupts for other (server-ty

      • by xded ( 1046894 )
        USB is actually inferior to PS/2 for keyboards (see n-key rollover [geekhack.org]).
    • New Dell Precision workstation we got recently only has either DisplayPort or mini-HDMI connectors on the graphics card. There was an adapter included to convert to DVI output so I just used that.

    • VGA, HDMI, and DVI on mine.
    • by Moryath ( 553296 ) on Thursday December 09, 2010 @09:08AM (#34499928)

      The phrase "certified adapter" means "video quality degraded to crap and DRM added."

      Just FYI.

      • Re: (Score:3, Informative)

        by Guspaz ( 556486 )

        More like the adapters are defined in the DisplayPort specs rather than just being after-market addons like a DVI to component adapter would be. You can't add DRM to VGA (although you can degrade it, as you pointed out).

    • VGA-DVI adapters (actually converters, as you need to do analog-to-digital) exist, they're just rather expensive.

      http://www.networktechinc.com/vga-dvi.html [networktechinc.com]

    • As far as I know there are no VGA-DVI adapters (DVI-VGA does exist)
      The adaptors you speak of are just wiring adaptors. They (along with DVI-I sockets) let a computer or monitor manufacturer offer both analog and digital on the same port but the analog output hardware still has to be present in the computer. Afaict if the monitor supports it you can use them at the monitor end as well.

      There are adaptors that actually convert between digital and analog but they don't come cheap.

    • by arivanov ( 12034 )

      You forgot to mention - does it work properly after that over the VGA cable?

      Most VGA cables cannot carry the frequencies required to transmit an HD signal cleanly, so you get pretty nasty ghosting. The same is true of a lot of recent video cards which have VGA as an afterthought, on a cable hanging off a header on the side.

      On the negative side, this is likely to reinstate the whole debacle about resolutions, DRM and the other "digital may allow people to steal stuff" that kind'a went away from the PC and got c

      • Ghosting isn't really about frequency response. Lack of high-frequency response would cause blurring.

        If you really have ghosting, it is likely a result of impedance mismatches: because you are attempting to use a passive splitter, because the characteristic impedance of the cable is wrong, or because the termination in the devices sucks.

        Personally I've had pretty good luck with VGA EXCEPT when trying to drive HDTVs. My conclusion is that the VGA inputs on those things just suck.

        "FULL HD" isn't really tha

      • by jedidiah ( 1196 )

        You must be joking...

        Some of us have been happily using VGA cables for "HD" signals since long before any HD standard was defined.

        This has to be the dumbest thing yet that anyone has come up with on this thread.
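For scale in this exchange: the analog bandwidth a VGA cable must carry can be estimated from the pixel clock, which is the total raster size (including blanking) times the refresh rate. The standard 1080p60 timing uses a 2200x1125 total raster. A quick back-of-the-envelope calculation (the helper function is just illustrative):

```python
def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: int) -> float:
    """Pixel clock in MHz: total pixels per frame (incl. blanking) x refresh."""
    return h_total * v_total * refresh_hz / 1e6

# Standard 1080p60 timing: 2200x1125 total raster at 60 Hz.
print(pixel_clock_mhz(2200, 1125, 60))  # -> 148.5
```

So a "Full HD" VGA signal swings at roughly 150 megapixels per second, which is why cheap cables that were fine at 1024x768 can start to smear at 1080p, while well-made cables handle it without visible trouble.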

    • by dasunt ( 249686 ) on Thursday December 09, 2010 @09:17AM (#34500038)

      Recently, the monitor at my parents failed (a 2 or 3 year old 1280x1024 LCD panel... All CRTs before that lasted way longer. This LCD craze does have its downsides).

      Often, if an LCD goes after just a few years, it's due to a bad capacitor or two on the motherboard.

      If you do a bit of research and find out what the requirements for the capacitors are (usually low-ESR, etc), the cost for each capacitor is under $1, and anyone with basic desoldering skills can replace them.

      • by Sycraft-fu ( 314770 ) on Thursday December 09, 2010 @11:13AM (#34501730)

        LCDs can last a damn long time. We've got some at work going on 9 years now, still working fine, still good image quality. I get a little tired of the "All old stuff was better and lasted longer, new stuff sucks." No. Wrong. This is just more looking at the past with rose coloured glasses.

        For one, you only see examples today of the stuff that lasted, not the stuff that broke. The stuff that broke was thrown away. So sure, if you find a CRT in service now, it lasted a long time. However, that doesn't mean there aren't a thousand more in a landfill that broke.

        Also, for brand new stuff you cannot very well demand to know its lifetime and failure rate as it is new, it hasn't been tested. I can't tell you if a specific device will last 20 years until 20 years have gone by.

        In the case of monitors, LCDs are actually far more reliable in the long run. As you note, much of what can go wrong is cheap to fix, and fixable by a consumer. Caps aside (which are rarer to break these days), the main thing to go is the backlight. It will usually go out somewhere in the 8-12 year range, though it could be longer for less-used devices. The good news is that it isn't expensive to replace. Get a new one and things work again.

        What's more, other than lower brightness due to the backlight fading, LCDs don't lose image quality with time. Replace a backlight in a 10 year old LCD and it looks as good as it ever did. Not as good as current LCDs, the tech has progressed, but the image will still be stable, with perfect focus and geometry. CRTs start to suck as they get old. They fade too, but they also lose focus, geometry control, image stability and so on. They can be pretty poor looking after a decade.

        Look past personal examples to the general trend and you find LCDs are nice and reliable. Some break, but then so did some CRTs. The tech overall is very reliable, and much easier to repair minor flaws.

        • by Rudeboy777 ( 214749 ) on Thursday December 09, 2010 @02:47PM (#34505384)
          Those LCDs that have been running for 9 years probably cost over $1000. Look how the Wal-Mart mentality has driven down the price of LCDs today. I can get a 19" LCD for UNDER $100!!! but, importantly, with only a 1-year warranty. There is no way the components in the throwaway units on the shelf today could be of comparable quality.
    • by DrSkwid ( 118965 )

      http://www.ramblers.org.uk/ [ramblers.org.uk]

    • Almost all new corporate laptops now have DisplayPort (Lenovo, HP, Dell). All AMD-based corporate desktops now have DisplayPort as the second-monitor output on the motherboard (e.g. HP Compaq 6005).

      However, it is ironic that "display panel" makers are "anxious", because if you look at most display makers' LCD offerings, maybe 1-2 models out of 15 or so will actually have a DisplayPort port, and you'll be paying a hefty premium for that. We have to buy adapters with every computer we buy for users who

      • by TheRaven64 ( 641858 ) on Thursday December 09, 2010 @09:28AM (#34500210) Journal

        Many still are VGA-only to save on costs

        That doesn't make sense. Driving a TFT from VGA requires a lot more circuitry than driving it from DVI-D. That's why the Apple monitors only had DVI-D input; it was cheaper to produce. In reality, the cheap TFT monitors are VGA-only for differentiation: they've convinced people that it's worth paying a premium to be able to drive your digital display from a digital signal, and so people do.

        DisplayPort should be even cheaper. It's designed to be easy to use to drive a typical TFT and, unlike DVI and HDMI, doesn't require you to pay a royalty to use. Monitors that are DisplayPort-only are going to be cheaper to produce than any of the other options. Of course, that doesn't mean that they'll cost less to consumers...

    • This LCD craze does have its downsides

      See, this is where my conspiracy theories kick in. It's actually a good business model to make a monitor that only lasts 2-3 years as opposed to one that lasts decades. TV/monitor manufacturers may very well skimp in several areas, knowing full well you will be replacing your device much sooner than before.

      I don't think the entire world is evil, I just think all corporations are.

      • See, this is where my conspiracy theories kick in. It's actually a good business model if you make a monitor that only lasts 2-3 years opposed to one that lasts decades.

        I have a decade-old LCD and recently got rid of my last decade-old CRT monitor (I still have a CRT TV that old). Neither one remains a quality display after a decade of use; the CRT gets dimmer, and turning it up to compensate makes the image lousier, and the CCFL-backlit LCD's backlight gets dimmer as well. I'd expect an LED-backlit LCD to last

        • Yea I hear ya. 2-3 years is actually a really short timespan. If anything broke within that time frame (barring someone smashing it or dropping it), I doubt I would buy from that manufacturer again.

          But at the same time, you can't discredit the 'big business' logic above. They only want it to last long enough so that you feel compelled to buy another one from them when it does eventually die.
    • As far as I know there are no VGA-DVI adapters (DVI-VGA does exist).

      DVI-A to VGA adapters exist, but the video card must explicitly support DVI-A (Analog) - it's simply a way to map VGA analog video and digital video into a single DVI connector. If the video card is DVI-D (Digital) only, then it can't be converted to VGA.

      You could get/make an adapter from VGA to DVI-A, but most monitors' DVI ports support DVI-D only; if they support VGA, they will have a discrete VGA port.

      Wikipedia article on DVI [wikipedia.org]
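The rule of thumb running through this subthread can be summed up: a passive DVI-to-VGA adapter only rewires pins, so it works only when the source actually drives DVI's analog pins. A tiny illustrative sketch; the table is a simplification of the three DVI connector variants, not an exhaustive spec:

```python
# Which signals each DVI connector variant carries (common DVI naming).
DVI_VARIANTS = {
    "DVI-A": {"analog": True,  "digital": False},  # analog pins only
    "DVI-D": {"analog": False, "digital": True},   # digital pins only
    "DVI-I": {"analog": True,  "digital": True},   # integrated: both
}

def passive_vga_adapter_works(variant: str) -> bool:
    """A passive DVI->VGA adapter just remaps pins to the VGA connector,
    so it only works if the source drives the analog pins at all."""
    return DVI_VARIANTS[variant]["analog"]

print(passive_vga_adapter_works("DVI-I"))  # -> True
print(passive_vga_adapter_works("DVI-D"))  # -> False
```

This is why the cheap adapters bundled with graphics cards work (those cards expose DVI-I), while a DVI-D-only output needs an active analog converter, the expensive kind mentioned above.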

  • Conference rooms (Score:5, Interesting)

    by dimer0 ( 461593 ) on Thursday December 09, 2010 @08:45AM (#34499724)

    Only place I use VGA anymore (and have used in the past 4-5 years) is for overhead projectors in conference rooms.

    • Re:Conference rooms (Score:5, Interesting)

      by snookerhog ( 1835110 ) on Thursday December 09, 2010 @08:49AM (#34499764)
      I am assuming that shitty old VGA projectors will continue to cause problems for my presentations well beyond 2015, but I will be happy to be proven wrong.
    • by ledow ( 319597 ) on Thursday December 09, 2010 @09:21AM (#34500100) Homepage

      Every school I've ever worked in has VGA equipment as the default. PCs, laptop connectors, projectors, monitors, video distribution systems and digital signage all have/had VGA connectors. They might have HDMI / DVI *as well*, but they all operate on a VGA basis primarily. I can see why - it's simple, and it's compatible back to their oldest available machines without having to spend extra money on adapters and convertors (which half the time break, or just plain don't work because they bought a DVI adaptor with the wrong pins and expect it to work). The rest of the advantages have no bearing - they can get whatever resolution they like over VGA, through meters of inexpensive 15-core cable that's been in the walls for years, to their projectors and *not* notice any performance degradation. The only places that I would argue NEED better connectors are those that are specialist anyway - CAD, video and huge display signs.

      In general use, what advantages does anyone with significant investment in VGA really see from a DVI / HDMI conversion? Hell, I ran a 1024x768 VGA signal over a 75m CAT5e cable with an adaptor (the Cat5e was actually already there, by luck, so wasn't installed with that usage in mind) and that's STILL running a school's main entrance signage on a HUGE TV and nobody cries about the signal quality (the TV also has HDMI, SCART, S-Video, Component, Composite, RF, etc. in and works fine for them all but why bother when the lowest common denominator just works for everyone?).

      If something works, that's good enough. Especially if it works on ALL machines you can get (up until now, obviously). If you chose DVI-only, then it wouldn't work on older machines without adapters. If you chose HDMI-only, it would work on even fewer. The transition is now starting in the other direction, but it'll be another 3-4 years before schools and large businesses have to go to special efforts (e.g. special order, pick up particular models, or use adaptors) to get VGA inputs/outputs on their devices.

      This isn't a shock, like getting rid of PS/2 ports isn't a shock, because there are several alternatives already existing. The problem is that it's an enforced obsolescence of something for not-very-convincing reasons. Give it three years and there'll still be places with VGA convertors everywhere until hardware replacement time is due. VGA isn't a chore to use, or a problem to configure (hell, teachers can manage it - it's just a matter of Fn-Whatever and plugging a cable in). I have *just* been given my first work laptop that had something more than a VGA or S-Video port, and that's because it's really a gaming laptop in order to meet my minimum spec.

      Computers will come with VGA. People will buy adaptors for a few years until they buy a non-VGA device on both ends. Then the world will carry on as normal. It wasn't a "disaster" that actually needed to be fixed in the first place - I still have no use for DVI or HDMI devices myself - and thus there are probably millions of people that will have to do something in the future, but they would have had to eventually anyway, and it'll be absorbed into their ordinary replacement costs. All it means is that I don't budget quite so much for VGA cables next year, and I have to convince my employers that all those perfectly-working interactive whiteboards and projectors really do need, at minimum, a new cable run, a new socket or a new adaptor unless they want me to overhaul the entire place. Big deal - I have to have that conversation about once every six months about *something*.

      The sky isn't falling. It wasn't even cracked to start with. We just have this new, brighter, hi-def sky that apparently needs a pair of sunglasses to view properly.

    • Only place I use VGA anymore (and have used in the past 4-5 years) is for overhead projectors in conference rooms.

      And the only people who can fail at projecting their slides worse than a laptop with a 1999 Linux distro are those with a Mac and no VGA adapter... I've seen this happen plenty of times...

  • Damn... (Score:4, Funny)

    by splerdu ( 187709 ) on Thursday December 09, 2010 @08:55AM (#34499820)

    I thought we would finally be rid of Spike's Video Game Awards.

  • On both of my HDTVs (different brands, a cheap-o from 3-4 years ago, and a high-quality one from this year) I'm using a VGA cable from the DVI out on my computers. Why? Because whenever I use a DVI-to-HDMI cable, it results in horrendous overscan instead of displaying at the native screen resolution. This means everything is scaled up, even though the monitor resolution is reported correctly to Windows and OS X, leading to horrible image quality. You can somewhat correct this with system display settings, but th

    • by Pojut ( 1027544 )

      I'm still trying to figure out why more TV manufacturers don't include an actual DVI port on their products...

      • by NJRoadfan ( 1254248 ) on Thursday December 09, 2010 @09:26AM (#34500180)
        Because HDMI is electrically the same. What they should be doing is giving a menu option to turn off rescaling/overscanning of signals at the display's native resolution.
        • My first HDTV had that option. By default (and oddly) it would overscan but not rescale, which led to a frankly worthless black box around the whole screen. It was easily turned off in the menu, though.

          My current TV does not. It does the whole rescaling/overscanning thing, and it makes using a DVI/HDMI hookup for my computer worthless. Luckily, it has a VGA port, even though it doesn't maintain the aspect ratio when scaling non-16:9 resolutions (another downgrade from the earlier set).

          I miss my old TV so m

        • From my experience there usually is, but they don't document it very well.

          For example, I have a Samsung TV with a PC hooked up by HDMI. To turn off overscan and rescaling, I have to go into the menu to rename the input and rename that HDMI port "DVI/PC". Everything in the UI suggests that's just the name I'll see on the input menu, and for every other combination of input type and possible name I've tried, that's all it is. The manufacturer's docs say I should do this when connecting a PC but don't say any

      • by Lumpy ( 12016 )

        They do, it's called HDMI. I hook DVI into an HDMI port all the time.

      • by horatio ( 127595 )

        I'm not 100% sure, but the HDTV set I bought just a few months ago came w/ a VGA port, but no DVI ports. I think this is because HDMI and DVI are somehow compatible without conversion?

        From wikipedia

        Because HDMI is electrically compatible with the signals used by Digital Visual Interface (DVI), no signal conversion is necessary, nor is there a loss of video quality when a DVI-to-HDMI adapter is used.[3]

    • by daid303 ( 843777 )

      Funny, it's the other way around for me on my TV. It has a VGA port, but refuses to go beyond 1024x768, which looks horrible on a 42" TV. HDMI detects the right native resolution and works instantly.

  • by Ngarrang ( 1023425 ) on Thursday December 09, 2010 @09:17AM (#34500036) Journal

    Intel will drop VGA from their chipsets, and this will be a boon for video card makers. Video card makers already cater to those who need better video, or different ports, or more ports, or whatever. As long as monitors include a VGA port, card makers will, too. Intel has the luxury of being able to drop it. It will save them money. They also know that no one is being left behind, thanks to card makers. It is a win for both sides.

    • DVI can carry a VGA signal via an adaptor. You don't need a separate VGA port on there. I've got a card from as long ago as 2005 that has two DVI ports and came with a matching pair of adaptors.

  • by retech ( 1228598 ) on Thursday December 09, 2010 @09:18AM (#34500052)
    Seriously, all this fast paced change and incredibly quick adoption of new technology makes my head spin. I just got through building the recommended case out of plywood for my Apple motherboard. Now I find out that I will have to use some fancy new type of video doohickey. Gees Louise!
  • I'm still using CGA [wikipedia.org] you insensitive clod
    • by Yvan256 ( 722131 )

      I don't need your low resolution and puke-inducing color palettes, I'd rather use Hercules [wikipedia.org]!

      • I actually had a computer with a Hercules graphics adapter and an amber monochrome monitor. I had the manual to my dot-matrix printer and wrote a program in BASIC to put it in graphics mode and print monochrome graphics on it. IIRC it could BLOAD video memory dumps to print, and I included BSAVE hotkey functions in a few of my other programs to save screenshots that I could print out.

        Good times...

  • Even though that's a few years off, it's still an announced end to the VGA video interface. VGA has been dead to me for a few years now, but it's crazy how fast time has flown.
    • I never know if my system will be a desktop or end up as a server (spare system). in my 'server room' (ha!) I have a bunch of systems that I run mostly headless, but occasionally I have to see some console message or control the boot process or something. maybe the network is not up (stupid persistent.rules.linux, doh!). but I'll need console access and for pc, console != rs232. console = vga and keyboard.

      for that reason, I've been buying mostly boards that have at least an onboard vga connector. this

  • I still have in storage a backup Sun monitor and cables with 5 coax connectors. Seems the scanning electron microscope controller output only provides that type of connectivity. Anyone have a Display Port adapter for that type of equipment?
    • by Amouth ( 879122 )

      I have a VGA to 5-coax cable. It even handles sync-on-green if needed (although that requires a second adapter...)

  • by TheGratefulNet ( 143330 ) on Thursday December 09, 2010 @09:45AM (#34500452)

    analog video is video you can't 'control': no DRM (or none that is hard to break).

    it's not at all surprising that the interested parties want to kill it.

    they are convincing people to abandon S/PDIF for audio, too. the new kids who were brought up with HDMI think there's nothing wrong with it. in fact, the way they mixed audio and video made the whole combo stream DRMed. we once had mostly free and clear S/PDIF (SCMS ignored, since it was easily defeatable), then they upped the bitrate so that S/PDIF toslink and copper paths would not easily (or at all) carry the new digital audio formats (blu-ray audio and so on). the new codecs use bitstream audio for all channels, which is HUGE overkill for movie soundtracks, but it's a middle finger from the entertainment industry saying 'at least we get to fill up your disks with more bits than we needed'. effectively a DoS attack from them on you, stealing your disk space when you do direct BD rips or keep BD copies around.

    hdmi audio is now in the so-called 'protected path' and that's never a good thing for consumers. spdif audio was never in any protected path and that's why they are trying to kill it.

    vga video is also not in a protected path and so they also want to kill it.

    it really is all about 'migrating the user away' from the open formats and onto closed, controlled ones.

    • they are convincing people to abandon spdif, for audio, too. the new kids who are brought up with hdmi think there's nothing wrong with it. in fact, the way they mixed audio and video made the whole combo stream all DRMed. we once had mostly free and clear spdif (scms ignored since it was defeatable easily) and then they upped the bitrate so that spdif toslink and copper paths would not easily (or at all) carry the new digital audio formats (blu ray audio and so on). the new codecs are using bitstream audio for all channels which is HUGE overkill for sound tracks on movies, but its a middle finger from the entertainment industry saying 'at least we get to fill up your disks with more bits than we needed'. effectively a DOS attack from them to you, stealing your disk space when you do direct BD rips or keep BD copies around.

      Wow, a conspiracy to add too much quality to the media we buy so we are discouraged from making copies. So eeeeeeevil. They even put a mandatory scratch-resistant layer on BD discs to make them EEeeeeviillly last longer.
      Personally, I think if you had to buy them, they aren't "rights". You are buying permission and it comes with conditions. If you don't like it, go make movies or something. Yah, the government intervened and decided what a fair amount of permission is, but that does not give it equ

  • Great. A VGA Cable costs $5. A DVI cable costs $25, and that's if you order from a really cheap vendor, and you have to pay shipping on that shit. If you go to Best Buy, they have their $50 gold plated one. If that's in stock at all. Usually it's just the $100+ Monster DVI cable...Fucking wonderful.
    • A VGA Cable costs $5. A DVI cable costs $25, and that's if you order from a really cheap vendor, and you have to pay shipping on that shit.

      You have to pay shipping on the $5 VGA cable too; the VGA cables I saw at Best Buy were far more expensive than $5. So you might as well go to Monoprice and order some $5 HDMI cables and some $5 network cables to be shipped in the same box, and then sell them to friends and family at a reasonable markup. Do you see the business opportunity yet?

      But another problem is with standard-definition TVs and DVD recorders. I predict that used CRT SDTVs will still be sitting on the shelves of thrift stores come 2015.

    • by IICV ( 652597 )

      Umm? [amazon.com] Or how about this? [newegg.com]

      Maybe you should stop buying cables at brick and mortar stores? They always rip you off on cables, no exceptions - the theory being, I imagine, that you need to have the cable right now or else you wouldn't be buying it from them.

    • by jimicus ( 737525 )

      I know it's UK prices, but you are getting seriously ripped off [ebuyer.com].

  • by Clomer ( 644284 ) on Thursday December 09, 2010 @11:18AM (#34501798)
    I work as a student employee at my university. Over the last summer, we replaced about 500 computers across campus (most of our student lab machines). The new machines only have Display Port as their graphics interface, and we have had lots of problems with it. Lots of various software glitches, and even some significant hardware issues as well.

    One issue is that the physical connector is not very sturdy. One good whap (which is not uncommon in an academic environment) and the connector gets destroyed, sometimes taking the graphics card with it. We've had to replace several graphics cards because of this. This was not a problem with our previous batch of machines, which used *gasp* VGA. There are other issues as well, to the point that there was actually some serious discussion at upper levels of management about the possibility of returning the whole lot of computers (remember, about 500) and demanding that the replacements use either VGA or DVI. In the end, they decided that this would be more trouble than it was worth, and that we'd just deal with Display Port issues as they arise. Which they continue to do.

    As for myself, I have no intention of ever using Display Port as my primary display interface on my personal machines unless there is literally no other option. In my opinion, DVI is superior in every respect that matters, and even VGA is preferable.
  • by antdude ( 79039 ) on Thursday December 09, 2010 @12:00PM (#34502598) Homepage Journal

    Aww, that means I have to buy a new expensive KVM with DVI or something.

  • by Marrow ( 195242 ) on Thursday December 09, 2010 @12:58PM (#34503730)

    Not everything is better digital. Analog is a good format for long cable runs, like running a display over CAT-5. I don't like the change to DisplayPort. It forces you to waste money if you want to change formats from DVI to VGA, because the DP->DVI adapters won't convert to VGA. So you need a DVI adapter AND a VGA adapter. At 25 bucks a pop.

    DVI and DisplayPort are both more expensive in most situations. The monitors (as mentioned above) do not come with DVI cables.

    All in All, I see this as a Loss for the consumer.

    The big advantage for DisplayPort is to drive screens that don't even exist yet. Resolutions that DVI cannot handle. But what needs those 1080p+ resolutions yet? Desktop monitors do not. Bigscreens do not. What then is the point?

    • Ok, here is a price check from Monoprice.

      3' cables: VGA: $2.47 - DVI: $3.64 - DispP: $4.05

      Those are obviously economy cables - but they still work fine. DisplayPort is the most expensive, by an entire $1.58. As DisplayPort becomes more popular the price will come down. In addition, because DisplayPort uses fewer conductors than DVI, the cost of longer cables should be less than DVI. VGA has even fewer conductors but, because of the analog signals, has to use cables with better shielding.

      The big advantage for DisplayPort is to drive screens that don't even exist yet. Resolutions that DVI cannot handle. But what needs those 1080p+ resolutions yet?

      Well, my
