Goodbye, VGA

jones_supa writes "Leading PC companies have expressed their intention to finally start phasing out legacy display interfaces. Intel plans to end support for LVDS in 2013 and VGA in 2015 in its PC client processors and chipsets. While the large installed base of existing VGA monitors and projectors will likely keep VGA on PC back panels beyond 2015, PC and display panel makers strongly support this transition. The DisplayPort connector provides backwards and forwards compatibility by supporting VGA and DVI output via certified adapters, while also providing new capabilities such as single-connector multi-monitor support."

Comments Filter:
  • by Opportunist ( 166417 ) on Thursday December 09, 2010 @09:50AM (#34499776)

    it likely will end up that if users want to watch new movies, they have to upgrade the computer, video card, and monitor to support the copy protection.

    Probably some will. Most will just figure out that it's way cheaper to head for TPB or the like and get movies in a format their hardware supports, one that's also more flexible about which storage medium it can live on.

    Sometimes I wonder what the advantage of those "copy protected" devices I keep hearing about is supposed to be. I can't see a single good thing in them.

  • What's wrong with PS/2 connectors? I prefer them: unlike USB, they don't require polling, since they're interrupt-driven. When I can choose, I take PS/2 over USB for keyboards and mice; it also saves USB ports for other duties. (A rough sketch of the difference follows this comment.)
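
    A minimal sketch of the difference being described here, written in C. It is illustrative only: the function names are hypothetical and the timing is schematic rather than exact. The idea is that a USB HID keyboard is serviced by the host on a fixed polling schedule whether or not a key was pressed, while a PS/2 keyboard raises a hardware interrupt (IRQ1 on a PC) only when a scancode is actually waiting.

    /* Illustrative sketch, not real driver code. */
    #include <stdbool.h>
    #include <stdio.h>

    /* USB-style: the host asks the device at every polling interval,
     * even when there is nothing to report. */
    static void usb_polling_model(void)
    {
        for (int interval = 0; interval < 3; interval++) {
            bool has_data = (interval == 2);   /* pretend a key arrives late */
            printf("poll #%d: %s\n", interval,
                   has_data ? "got a report" : "NAK, nothing to send");
        }
    }

    /* PS/2-style: nothing runs until the keyboard controller raises IRQ1;
     * the handler then reads the waiting scancode. */
    static void on_ps2_irq(unsigned char scancode)
    {
        printf("IRQ1 fired: scancode 0x%02X\n", scancode);
    }

    int main(void)
    {
        usb_polling_model();
        on_ps2_irq(0x1C);   /* 0x1C: set-1 make code for Enter */
        return 0;
    }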

  • by Talderas ( 1212466 ) on Thursday December 09, 2010 @10:00AM (#34499862)

    You're highly optimistic regarding the average consumer.

    What drugs are you consuming?

  • by corbettw ( 214229 ) on Thursday December 09, 2010 @10:07AM (#34499912) Journal

    Or you could, I dunno, get a USB keyboard that has two or four USB ports built into it. Try doing that with PS/2.

  • by petermgreen ( 876956 ) <plugwash.p10link@net> on Thursday December 09, 2010 @10:10AM (#34499948) Homepage

    As far as I know there are no VGA-to-DVI adapters (DVI-to-VGA does exist).
    The adapters you speak of are just passive wiring adapters. They (along with DVI-I sockets) let a computer or monitor manufacturer offer both analog and digital signals on the same port, but the analog output hardware still has to be present in the computer. As far as I can tell, you can use them at the monitor end as well, if the monitor supports it.

    There are adapters that actually convert between digital and analog, but they don't come cheap.

  • by Ngarrang ( 1023425 ) on Thursday December 09, 2010 @10:17AM (#34500036) Journal

    Intel will drop VGA from their chipsets, and this will be a boon for video card makers. Video card makers already cater to those who need better video, different ports, more ports, or whatever. As long as monitors include a VGA port, card makers will too. Intel has the luxury of being able to drop it; it will save them money, and they know that no one is being left behind, thanks to the card makers. It is a win for both sides.

  • by dasunt ( 249686 ) on Thursday December 09, 2010 @10:17AM (#34500038)

    Recently, the monitor at my parents' place failed (a two- or three-year-old 1280x1024 LCD panel... all the CRTs before it lasted way longer. This LCD craze does have its downsides).

    Often, if an LCD goes after just a few years, it's due to a bad capacitor or two on the motherboard.

    If you do a bit of research to find out what the requirements for the capacitors are (usually low-ESR types), each capacitor costs under $1, and anyone with basic desoldering skills can replace them. (A quick sanity check of those selection rules is sketched below.)
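
    A small C sketch of the replacement rules of thumb mentioned above. The struct fields and sample values are made up for illustration; the usual guidance is to keep the same capacitance, use an equal or higher voltage rating, and choose a low-ESR series whose ESR is no higher than the original part's.

    /* Illustrative only: sample values are invented. */
    #include <stdbool.h>
    #include <stdio.h>

    struct cap {
        const char *name;
        double uF;        /* capacitance */
        double volts;     /* rated voltage */
        double esr_ohms;  /* equivalent series resistance */
    };

    /* Same capacitance, equal-or-higher voltage, equal-or-lower ESR. */
    static bool ok_replacement(struct cap orig, struct cap cand)
    {
        return cand.uF == orig.uF
            && cand.volts >= orig.volts
            && cand.esr_ohms <= orig.esr_ohms;
    }

    int main(void)
    {
        struct cap original  = { "failed cap",  1000, 16, 0.05 };
        struct cap candidate = { "replacement", 1000, 25, 0.03 };

        printf("%s is %s substitute for the %s\n",
               candidate.name,
               ok_replacement(original, candidate) ? "an acceptable" : "not a suitable",
               original.name);
        return 0;
    }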

  • by retech ( 1228598 ) on Thursday December 09, 2010 @10:18AM (#34500052)
    Seriously, all this fast-paced change and incredibly quick adoption of new technology makes my head spin. I just got through building the recommended case out of plywood for my Apple motherboard. Now I find out that I will have to use some fancy new type of video doohickey. Geez Louise!
  • by ledow ( 319597 ) on Thursday December 09, 2010 @10:21AM (#34500100) Homepage

    Every school I've ever worked in has VGA equipment as the default. PCs, laptop connectors, projectors, monitors, video distribution systems and digital signage all have (or had) VGA connectors. They might have HDMI / DVI *as well*, but they primarily operate on a VGA basis. I can see why - it's simple, and it's compatible back to their oldest machines without having to spend extra money on adapters and convertors (which half the time break, or just plain don't work because someone bought a DVI adaptor with the wrong pins and expected it to work). The other advantages have no bearing - they can push whatever resolution they like over VGA, through meters of inexpensive 15-core cable that's been in the walls for years, to their projectors and *not* notice any degradation. The only places that I would argue NEED better connectors are the specialist ones anyway - CAD, video and huge display signs.

    In general use, what advantage does anyone with a significant investment in VGA really see in a DVI / HDMI conversion? Hell, I ran a 1024x768 VGA signal over a 75m Cat5e cable with an adaptor (the Cat5e was already there by luck, so it wasn't installed with that use in mind), and that is STILL running a school's main entrance signage on a HUGE TV, and nobody cries about the signal quality. (The TV also has HDMI, SCART, S-Video, Component, Composite, RF, etc. inputs and works fine with them all, but why bother when the lowest common denominator just works for everyone?) Some back-of-the-envelope numbers on that cable run follow this comment.

    If something works, that's good enough - especially if it works on ALL the machines you can get (up until now, obviously). If you chose DVI-only, it wouldn't work on older machines without adapters. If you chose HDMI-only, it would work on even fewer. The transition is only now starting to go the other way, and it'll be another 3-4 years before schools and large businesses have to go to special efforts (special orders, particular models, or adaptors) to get VGA inputs/outputs on their devices.

    This isn't a shock, just as getting rid of PS/2 ports isn't a shock, because several alternatives already exist. The problem is that it's enforced obsolescence for not-very-convincing reasons. Give it three years and there'll still be places with VGA convertors everywhere until hardware replacement time comes around. VGA isn't a chore to use or a problem to configure (hell, teachers can manage it - it's just a matter of Fn-whatever and plugging a cable in). I have *just* been given my first work laptop with something more than a VGA or S-Video port, and that's only because it's really a gaming laptop bought to meet my minimum spec.

    Computers will come with VGA. People will buy adaptors for a few years until they buy a non-VGA device on both ends. Then the world will carry on as normal. It wasn't a "disaster" that actually needed fixing in the first place - I still have no use for DVI or HDMI devices myself - so there are probably millions of people who will have to do something about it eventually, but they would have had to anyway, and it'll be absorbed into their ordinary replacement costs. All it means is that I don't budget quite so much for VGA cables next year, and I have to convince my employers that all those perfectly-working interactive whiteboards and projectors really do need, at minimum, a new cable run, a new socket or a new adaptor, unless they want me to overhaul the entire place. Big deal - I have that conversation about once every six months about *something*.

    The sky isn't falling. It wasn't even cracked to start with. We just have this new, brighter, hi-def sky that apparently needs a pair of sunglasses to view properly.
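
    A back-of-the-envelope look, in C, at why a 1024x768 VGA run like the one described above can survive 75m of Cat5e. The timing values are the standard VESA figures for 1024x768 at 60 Hz; the "half the pixel clock" bandwidth estimate and the 100 MHz Cat5e characterisation are rough rules of thumb, not a cable certification.

    /* Rough arithmetic only; not a signal-integrity analysis. */
    #include <stdio.h>

    int main(void)
    {
        const double h_total = 1344;   /* total pixels per line, incl. blanking */
        const double v_total = 806;    /* total lines per frame, incl. blanking */
        const double refresh = 60.0;   /* Hz */

        double pixel_clock = h_total * v_total * refresh;   /* ~65 MHz */
        double analog_bw   = pixel_clock / 2.0;             /* alternating-pixel worst case */

        printf("Pixel clock                   : %.1f MHz\n", pixel_clock / 1e6);
        printf("Approx. bandwidth per channel : %.1f MHz\n", analog_bw / 1e6);
        printf("Cat5e is characterised to 100 MHz per pair, so an R/G/B-per-pair\n"
               "balun has headroom at this resolution.\n");
        return 0;
    }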

  • by TheGratefulNet ( 143330 ) on Thursday December 09, 2010 @10:45AM (#34500452)

    Analog video is video you can't 'control' - no DRM, or none that's hard to defeat.

    It's not at all surprising that the interested parties want to kill it.

    They are convincing people to abandon S/PDIF for audio, too. The new kids who were brought up on HDMI think there's nothing wrong with it; in fact, the way audio and video were mixed into one stream put the whole combination under DRM. We once had a mostly free-and-clear S/PDIF path (SCMS could be ignored, since it was easily defeated), and then they pushed the bitrates up so that the S/PDIF TOSLINK and copper paths could not easily (or at all) carry the new digital audio formats (Blu-ray audio and so on). The new codecs use bitstream audio for all channels, which is HUGE overkill for movie soundtracks, but it's a middle finger from the entertainment industry saying 'at least we get to fill up your disks with more bits than we needed' - effectively a DoS attack from them to you, stealing your disk space when you do direct BD rips or keep BD copies around. (Some rough numbers follow this comment.)

    HDMI audio now sits in the so-called 'protected path', and that's never a good thing for consumers. S/PDIF audio was never in any protected path, and that's why they are trying to kill it.

    VGA video is likewise outside any protected path, so they want to kill it as well.

    It really is all about 'migrating the user away' from open formats and onto closed, controlled ones.
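
    Some rough numbers behind the bitrate point above, sketched in C. The peak rates in the output (Dolby TrueHD around 18 Mbit/s and DTS-HD MA around 24.5 Mbit/s, versus the 640 kbit/s Dolby Digital and 1.5 Mbit/s DTS streams that S/PDIF traditionally carried) are approximate figures rather than spec citations.

    /* Back-of-the-envelope arithmetic; figures are approximate. */
    #include <stdio.h>

    int main(void)
    {
        /* Uncompressed 7.1-channel PCM at 96 kHz / 24-bit */
        double pcm_71 = 8 * 96000.0 * 24;        /* ~18.4 Mbit/s */

        /* S/PDIF audio payload: 2 channels x 24 bits x 48 kHz frames */
        double spdif_payload = 2 * 48000.0 * 24; /* ~2.3 Mbit/s */

        printf("7.1 x 96 kHz x 24-bit PCM  : %5.1f Mbit/s\n", pcm_71 / 1e6);
        printf("S/PDIF 2ch payload (48 kHz): %5.1f Mbit/s\n", spdif_payload / 1e6);
        printf("Approximate peaks: Dolby TrueHD ~18 Mbit/s, DTS-HD MA ~24.5 Mbit/s,\n"
               "versus Dolby Digital (640 kbit/s) and DTS (1.5 Mbit/s) over S/PDIF.\n");
        return 0;
    }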

  • by Sycraft-fu ( 314770 ) on Thursday December 09, 2010 @12:13PM (#34501730)

    LCDs can last a damn long time. We've got some at work going on 9 years now, still working fine, still with good image quality. I get a little tired of the "all old stuff was better and lasted longer, new stuff sucks" line. No. Wrong. This is just more looking at the past through rose-coloured glasses.

    For one, you only see examples today of the stuff that lasted, not the stuff that broke. The stuff that broke was thrown away. So sure, if you find a CRT in service now, it lasted a long time. However, that doesn't mean there aren't a thousand more in a landfill that broke.

    Also, for brand-new stuff you cannot very well demand to know its lifetime and failure rate, because it is new and hasn't been tested over time. I can't tell you whether a specific device will last 20 years until 20 years have gone by.

    In the case of monitors, LCDs are actually far more reliable in the long run. As you note, much of what can go wrong is cheap to fix, and fixable by a consumer. Caps aside (and those fail more rarely these days), the main thing to go is the backlight. It will usually go out somewhere in the 8-12 year range, though it can take longer in lightly used devices. The good news is that it isn't expensive to replace: get a new one and things work again.

    What's more, other than lower brightness from the backlight fading, LCDs don't lose image quality with time. Replace the backlight in a 10-year-old LCD and it looks as good as it ever did - not as good as current LCDs, since the tech has progressed, but the image will still be stable, with perfect focus and geometry. CRTs start to suck as they get old. They fade too, but they also lose focus, geometry control, image stability and so on. They can look pretty poor after a decade.

    Look past personal examples to the general trend and you'll find LCDs are nice and reliable. Some break, but then so did some CRTs. The tech overall is very reliable, and minor flaws are much easier to repair.
