Goodbye, VGA
jones_supa writes "Leading PC companies have announced plans to finally phase out legacy display interfaces. Intel plans to end support for LVDS in 2013 and VGA in 2015 in its PC client processors and chipsets. While the large installed base of existing VGA monitors and projectors will likely keep VGA on PC back panels beyond 2015, PC and display panel makers strongly support this transition. The DisplayPort connector provides backwards and forwards compatibility by supporting VGA and DVI output via certified adapters, while also providing new capabilities such as single-connector multi-monitor support."
That's one heck of a "long goodbye" (Score:4, Informative)
Oh, I wouldn't say goodbye just yet... 2015 is still a long way off. Recently, the monitor at my parents' place failed (a 2- or 3-year-old 1280x1024 LCD panel... all the CRTs before it lasted way longer; this LCD craze does have its downsides). Their computer has an old GeForce4 MX 440 or so with only a VGA port. I went to a local electronics shop and found a 23" Full HD LCD panel for an incredible 149€. I bought it, but then I got worried: wait, the box doesn't mention VGA at all, only DVI. I was a bit scared I'd have to upgrade to DVI. Not that it matters much; I have tons of older video cards with DVI, so it would just have been a bit of extra work.
It turned out that when I opened the box, only a VGA cable was included. The DVI connector was there, and I'm pretty sure it would work. For me that was ideal; someone planning to connect it to a DVI-only machine would probably have had to go back and buy a cable.
Also keep in mind that a lot of laptops only have VGA. As far as I know there are no VGA-to-DVI adapters (DVI-to-VGA does exist). Since a five-year-old or older computer fulfills the needs of most computer users these days, don't expect VGA monitors to disappear soon. Companies will cater to the needs of those "left behind".
DisplayPort? I haven't even seen a computer that has one by default... Macs perhaps? I don't know; we only have an iMac, and since the monitor is built in, I didn't bother looking for display connectors.
No, wait... I think my father's new Alienware laptop has a DisplayPort. Totally forgot about that. It's less than a year old, though.
Re:That's one heck of a "long goodbye" (Score:5, Interesting)
Macs have DisplayPort connectors, and have done for some time.
Though I wouldn't be too surprised to see this continue for some time - hell, you can still buy a PC with PS/2 connectors, FFS.
Re:That's one heck of a "long goodbye" (Score:5, Insightful)
What's wrong with PS/2 connectors? I prefer them; unlike USB they don't require polling, as they are interrupt-driven. When I can choose, I take PS/2 over USB for keyboards and mice. It also saves USB ports for other duties.
Re: (Score:2)
Agreed! I just wish I didn't have fifty billion PS/2 adapters in my desk drawers -_-;;
Note: I'm totally kidding. It's more like sixty billion.
Re:That's one heck of a "long goodbye" (Score:5, Insightful)
Or you could, I dunno, get a USB keyboard that has two or four USB ports on it, itself. Try doing that with PS2.
Re: (Score:2)
Oddly enough I have one of those at work... I never think of using them... I always connect stuff directly to the laptop or the docking station. In my mind a keyboard is still something standalone... Heck, even my external monitor has USB connectors. I never use those either. I simply don't think of them as USB hubs.
Re: (Score:2)
Great plan. As long as you're not using too much power on your USB devices, that is...
Re:That's one heck of a "long goodbye" (Score:5, Informative)
And slow your stuff down to the USB 1.1 spec. Oh, and there's next to zero power there. I have a USB flash stick that will not work off a keyboard USB port. Not enough power.
Extra USB ports on your keyboard are like stick-on air vents for a car... there for show only.
Re: (Score:3)
My Mac extended keyboard has two USB ports on it. My USB 2.0 3.5" hard drive is connected to one, an ancient Logitech mouse to the other, and the HDD speeds are perfectly in line with a USB 2.0 device.
USB 2.0 thumb drives work fine off it too. The only thing it can't do is power an external 2.5" drive or my iPhone; I have to connect those directly to the computer's USB ports.
So, hardly for show only.
Conclusion: it's not USB keyboards that are the problem, it's poor-quality ones.
Re: (Score:2)
I plug my mouse into the keyboard; that way I only have one cable going under the desk. How is that not simpler? It also means I can move the keyboard and mouse together, as opposed to having one of the cables too short.
Simplify your stuff. The fewer cable runs you have, the neater and more organized they will be.
Re: (Score:2)
What are you doing with your mouse and keyboard that the protocol makes a practical difference? I'm legitimately curious, not sniping. I use a PS/2 keyboard to save ports myself.
Re: (Score:2)
What sort of issues? I have two; one is connected to my laptop via DIN-to-PS/2 and PS/2-to-USB adapters, the other via a DIN-to-PS/2 adapter to my main computer.
[John]
Re: (Score:2)
Only one thing I can think of... the IBM Model M. It tends to have issues with PS/2-to-USB adapters.
Best thing I've found for this:
http://pckeyboards.stores.yahoo.net/customizer.html [yahoo.net]
It's essentially a marginally updated clone of the IBM Model M. It's available in black, has a 104-key layout instead of 101, offers a USB interface, and has a straight cord rather than the annoying coiled one from the original Model M. I've got the Unicomp Spacesaver version (same layout but less border plastic around the edges) as well as a real IBM Model M (plus two other mechanical-switch keyboards from different brands), and the Unicom
When PS2 is better - one example (Score:2)
In my organization, all computers run full-disk encryption with a pre-boot screen that pops up to enter a password. We use both Guardian Edge and WinMagic products for this purpose. We've found that in one fairly common failure mode seen with Guardian Edge Hard Disk encryption, when we need to type in an admin account name and password to unlock machines, the machines simply don't recognize USB devices. Plug in a PS/2 keyboard, reboot, and then we can log on and fix 'em.
I'm pretty clueless a
Re: (Score:2)
There's an option in most BIOSes for whether they should handle USB keyboard support themselves until the OS takes over. You probably need to flip that setting.
Re: (Score:2)
AFAICT it's a BIOS issue. During the early boot phase, the BIOS does the job of providing basic input from the keyboard and output to the screen (hence the name BIOS ;) ).
Unfortunately there was a long time lag between the introduction of USB and proper BIOS support for USB keyboards (especially for keyboards behind hubs, such as the ones in most USB KVM switches). And even when the BIOS does support it, it's not always turned on.
After Windows loads there is another phase of annoyance if devices have move
Re: (Score:3)
The USB keyboard protocol polls the keyboard for changes at regular intervals. If two keys change state very close together (i.e., if you're a fast typist), the changes will be sent in the same data packet. The problem is that the protocol doesn't care about the order of the keypresses and just handles the changes in QWERTY order, so I get typos in my text whenever I type in the "wrong" order. The $100 Das Keyboard is particularly bad about this due to its N-key rollover feature, but others do it too.
Modern c
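To make that concrete, here's a rough Python sketch (my own illustration, not any actual driver's code) of how a host ends up diffing consecutive boot-protocol keyboard reports; the keycodes are just the HID usage IDs for 't' and 'w':

    # Rough illustration: a USB HID boot-protocol keyboard report only lists the
    # keys currently held down. If two keys go down between two polls, both show
    # up as "new" in the same report and the host has no timing information to
    # order them, so they come out in whatever slot/keycode order it processes.

    def new_keys(prev_report, curr_report):
        """Keycodes present in curr_report but not in prev_report (order undefined)."""
        return [k for k in curr_report if k and k not in prev_report]

    prev = [0, 0, 0, 0, 0, 0]          # nothing held at the previous poll
    curr = [0x17, 0x1A, 0, 0, 0, 0]    # 't' (0x17) and 'w' (0x1A) both held now

    print(new_keys(prev, curr))        # [0x17, 0x1A] even if 'w' was pressed first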
Re: (Score:3)
I've seen that on cheap PS/2 keyboards too. It's really annoying when you can't even type a 3 letter word like TWO without it coming out as WTO every single time. At first I thought I was crazy, but it's very reproducible if you type fast.
Re: (Score:2)
If your computer is under such heavy load that a USB mouse/keyboard is being dropped, I think it's time to upgrade from that 286!
That is about the lamest and most artificial response to a question I've seen in a long time. As another poster put it, if your computer is under that much load you have far bigger problems than what type of interface you are using for your keyboard. There is absolutely no technical reason to prefer USB keyboards over PS/2 unless you have a machine that lacks a PS/2 port. Personally I don't care either way; I use what I have on the given machine. Some of my boxen have PS/2 keyboards and mice, some have US
Re: (Score:2)
Hmmm I hit reply to the GP post yet it posted it as a reply to yours. I wasn't criticizing you, I was going after the GP :)
I hate the new /. interface. It's utter crap.
Re: (Score:3)
For existing computers, sure. Although PS/2's main problem is that it's not hot-pluggable, so if you don't have a keyboard plugged in at boot, you'll need to reboot the machine to get it detected (at least with Windows).
More to the point, modern motherboards tend not to have PS/2 ports, and I don't believe any laptops have had them for, what, at least half a decade?
Re:That's one heck of a "long goodbye" (Score:5, Informative)
The problem isn't the OS, it's the port: http://en.wikipedia.org/wiki/PS/2_connector [wikipedia.org] (see the "hotplugging" section).
You *must* power down the machine before plugging in a PS/2 device, or risk blowing up the port controller/fuse. I have killed at least one motherboard this way (PS/2 devices no longer work on it).
Re:That's one heck of a "long goodbye" (Score:5, Funny)
Even with the proper USB HID driver, USB is still limited to 6 keypresses at once, while PS/2 will handle as many keys as you can press.
Judging from the issues you're having, I'm guessing that you must be a house cat.
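For what it's worth, the 6-key figure comes from the 8-byte boot-protocol keyboard report (one modifier byte, one reserved byte, six keycode slots). A quick Python sketch of that layout, with made-up sample bytes:

    # The USB HID boot-protocol keyboard report is 8 bytes: modifier bitmask,
    # a reserved byte, then six keycode slots -- hence at most six non-modifier
    # keys reported at once in this mode. Sample bytes below are made up.

    def parse_boot_report(report: bytes):
        assert len(report) == 8
        modifiers = report[0]                    # Ctrl/Shift/Alt/GUI bits
        keys = [b for b in report[2:] if b]      # up to six held keys
        return modifiers, keys

    mods, keys = parse_boot_report(bytes([0x02, 0x00, 0x04, 0x05, 0x06, 0x07, 0x08, 0x09]))
    print(bin(mods), [hex(k) for k in keys])     # left Shift plus six keys held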
Re: (Score:3, Informative)
This has nothing to do with USB vs PS/2.
This has to do with masking.
http://www.dribin.org/dave/keyboard/one_html/ [dribin.org]
The bottom line is that keyboards don't have a dedicated circuit for each key - they use a bunch of small grids and detect key presses at the ends of rows/columns in the grid.
Multiple key presses within a grid can cause masking - where a keypress simply isn't recognized at a physical level. Ghosting can also happen, where a keypress is recognized when there isn't one.
How the grids are laid out p
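A tiny simulation makes the geometry easier to see. This is a generic sketch of a diode-less 3x3 switch matrix (a made-up layout, not any real keyboard's wiring): press three corners of a rectangle and the scan also sees the fourth corner as down.

    # Diode-less switch matrix: each pressed key connects its row wire to its
    # column wire, so when one row is driven, current can sneak through other
    # pressed keys and make extra columns read as active ("ghosting").

    def active_cols(drive_row, pressed):
        rows, cols = {drive_row}, set()
        changed = True
        while changed:
            changed = False
            for r, c in pressed:
                if r in rows and c not in cols:
                    cols.add(c); changed = True
                if c in cols and r not in rows:
                    rows.add(r); changed = True
        return cols

    pressed = {(0, 0), (0, 1), (1, 0)}                       # three real presses
    seen = {(r, c) for r in range(3) for c in active_cols(r, pressed)}
    print(sorted(seen - pressed))                            # [(1, 1)] -- the ghost

Firmware that detects that ambiguous pattern and refuses to report the column is where the masking (a real press getting dropped) comes from.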
Re: (Score:3)
What's wrong with PS/2 connectors? I prefer them, unlike USB they don't require polling as they are interrupt driven. When I can choose, I take PS/2 over USB for keyboards and mice. Saves USB ports too for other duties.
Gosh, back when USB wasn't so common, there was an article here talking about the relative work Linux needed to do to handle both. The PS/2 port needed to be sampled at 200Hz, while USB was doing hardware offloading of the work, so the computer could have more interrupts for other (server-ty
Re: (Score:2)
New Dell Precision workstation we got recently only has either DisplayPort or mini-HDMI connectors on the graphics card. There was an adapter included to convert to DVI output so I just used that.
Re:That's one heck of a "long goodbye" (Score:5, Informative)
The phrase "certified adapter" means "video quality degraded to crap and DRM added."
Just FYI.
Re: (Score:3, Informative)
More like the adapters are defined in the DisplayPort specs rather than just being after-market add-ons like a DVI-to-component adapter would be. You can't add DRM to VGA (although you can degrade it, as you pointed out).
Re: (Score:2)
VGA-to-DVI adapters (actually converters, as you need to do analog-to-digital conversion) exist; they're just rather expensive.
http://www.networktechinc.com/vga-dvi.html [networktechinc.com]
Re: (Score:3)
You don't need to convert to digital; DVI has an analogue variant.
The problem is that I don't know of any displays with a DVI-analogue socket.
Re: (Score:2)
Or you could just get one of those [google.com] for $4.
Re:That's one heck of a "long goodbye" (Score:4, Insightful)
As far as I know there are no VGA-DVI adapters (DVI-VGA does exist)
The adaptors you speak of are just wiring adaptors. They (along with DVI-I sockets) let a computer or monitor manufacturer offer both analog and digital on the same port, but the analog output hardware still has to be present in the computer. AFAICT, if the monitor supports it, you can use them at the monitor end as well.
There are adaptors that actually convert between digital and analog but they don't come cheap.
Re: (Score:2)
You forgot to mention - does it work properly after that over the VGA cable?
Most VGA cables cannot take the frequencies required to transmit an HD signal cleanly, so you get pretty nasty ghosting. The same goes for a lot of recent video cards which have VGA as an afterthought, on a cable hanging off a header on the side.
On the negative side, this is likely to reinstate the whole debacle about resolutions, DRM and the other "digital may allow people to steal stuff" arguments that kinda went away from the PC and got c
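For a rough sense of what "the frequencies required" means here (my own back-of-the-envelope numbers, using the commonly published 1080p60 timings):

    # Approximate pixel clock for 1920x1080 @ 60 Hz using the standard CEA-861
    # total frame size of 2200 x 1125 (active pixels plus blanking). An analog
    # VGA cable and the card's DAC have to pass roughly this bandwidth cleanly.

    h_total, v_total, refresh_hz = 2200, 1125, 60
    pixel_clock_hz = h_total * v_total * refresh_hz
    print(f"{pixel_clock_hz / 1e6:.1f} MHz")   # 148.5 MHz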
Re: (Score:3)
Ghosting isn't really about frequency response. Lack of high-frequency response would cause blurring.
If you really have ghosting, it is likely a result of impedance mismatches: either because you are attempting to use a passive splitter, because the characteristic impedance of the cable is wrong, or because the termination in the devices sucks.
Personally I've had pretty good luck with VGA EXCEPT when trying to drive HDTVs. My conclusion is that the VGA inputs on those things just suck.
"FULL HD" isn't really tha
Re: (Score:2)
You must be joking...
Some of us have been happily using VGA cables for "HD" signals since long before any HD standard was defined.
This has to be the dumbest thing yet that anyone has come up with on this thread.
Re: (Score:3)
I really miss the vertical screen estate
I prefer 4:3 monitors, but you just need to select your monitors more carefully, and buy ones with stands that allow them to rotate.
A 1920x1080 monitor rotated to 1080x1920 is pretty good for vertical real estate. You may have to mess with ClearType if you're running Windows, but it's well worth it. Of course I rotate my 4:3 1600x1200 monitors (at work) too; at home I found an irresistible set of 24" 1920x1200 IPS panels on eBay for $150 to replace my old 4:3's a
Re:That's one heck of a "long goodbye" (Score:5, Insightful)
Often, if an LCD goes after just a few years, it's due to a bad capacitor or two on the motherboard.
If you do a bit of research and find out what the requirements for the capacitors are (usually low-ESR, etc), the cost for each capacitor is under $1, and anyone with basic desoldering skills can replace them.
And just to further that (Score:5, Insightful)
LCDs can last a damn long time. We've got some at work going on 9 years now, still working fine, still good image quality. I get a little tired of the "all old stuff was better and lasted longer, new stuff sucks" attitude. No. Wrong. This is just more looking at the past with rose-coloured glasses.
For one, you only see examples today of the stuff that lasted, not the stuff that broke. The stuff that broke was thrown away. So sure, if you find a CRT in service now, it lasted a long time. However, that doesn't mean there aren't a thousand more in a landfill that broke.
Also, for brand new stuff you cannot very well demand to know its lifetime and failure rate as it is new, it hasn't been tested. I can't tell you if a specific device will last 20 years until 20 years have gone by.
In the case of monitors, LCDs are actually far more reliable in the long run. As you note, much of what can go wrong is cheap to fix, and fixable by a consumer. Caps aside (which fail more rarely these days), the main thing to go is the backlight. It will usually go out somewhere in the 8-12 year range, though it could be longer for less-used devices. The good news is that it isn't expensive to replace. Get a new one and things work again.
What's more, other than lower brightness due to the backlight fading, LCDs don't lose image quality with time. Replace a backlight in a 10 year old LCD and it looks as good as it ever did. Not as good as current LCDs, the tech has progressed, but the image will still be stable, with perfect focus and geometry. CRTs start to suck as they get old. They fade too, but they also lose focus, geometry control, image stability and so on. They can be pretty poor looking after a decade.
Look past personal examples to the general trend and you find LCDs are nice and reliable. Some break, but then so did some CRTs. The tech overall is very reliable, and much easier to repair minor flaws.
Re: (Score:2)
http://www.ramblers.org.uk/ [ramblers.org.uk]
Re: (Score:2)
Almost all new corporate laptops now have DisplayPort (Lenovo, HP, Dell). All AMD-based corporate desktops now have DisplayPort as the second monitor output on the onboard graphics (e.g. the HP Compaq 6005).
However, it is ironic that "display panel" makers are "anxious", because if you look at most display makers' LCD offerings, maybe 1-2 models out of 15 or so will actually have a DisplayPort port, and you'll be paying a hefty premium for that. We have to buy adapters with every computer we buy for users who
Re:That's one heck of a "long goodbye" (Score:4, Interesting)
Many still are VGA-only to save on costs
That doesn't make sense. Driving a TFT from VGA requires a lot more circuitry than driving it from DVI-D. That's why the Apple monitors only had DVI-D input; it was cheaper to produce. In reality, the cheap TFT monitors are VGA-only for differentiation: they've convinced people that it's worth paying a premium to be able to drive your digital display from a digital signal, and so people do.
DisplayPort should be even cheaper. It's designed to be easy to use to drive a typical TFT and, unlike DVI and HDMI, doesn't require you to pay a royalty to use. Monitors that are DisplayPort-only are going to be cheaper to produce than any of the other options. Of course, that doesn't mean that they'll cost less to consumers...
DPCP/HDCP (Score:5, Interesting)
DisplayPort [...] doesn't require you to pay a royalty to use.
Until the Hollywood-endorsed operating system used on the majority of home and office PCs fails to recognize DisplayPort monitors that fail to implement Hollywood-endorsed display encryption. DPCP and HDCP both have a hefty royalty.
Re: (Score:3)
See, this is where my conspiracy theories kick in. It's actually a good business model to make a monitor that only lasts 2-3 years as opposed to one that lasts decades. TV/monitor manufacturers may very well skimp in several areas, knowing full well you will be replacing your device much sooner than before.
I don't think the entire world is evil, I just think all corporations are.
Re: (Score:2)
I have a decade-old LCD and recently got rid of my last decade-old CRT monitor (still have a CRT TV that old). Neither one remains a quality display after a decade of use; the CRT gets dimmer and turning it up to compensate makes the image lousier, and the CCFL-backlit LCD's backlight gets dimmer as well. I'd expect an LED-backlit LCD to last
Re: (Score:2)
But at the same time, you can't discredit the 'big business' logic above. They only want it to last long enough so that you feel compelled to buy another one from them when it does eventually die.
Re: (Score:2)
As far as I know there are no VGA-DVI adapters (DVI-VGA does exist).
DVI-A to VGA adapters exist, but the video card must explicitly support DVI-A (Analog) - it's simply a way to map VGA analog video and digital video into a single DVI connector. If the video card is DVI-D (Digital) only, then it can't be converted to VGA.
You could get/make an adapter from VGA to DVI-A, but most monitors' DVI ports support DVI-D only; if they support VGA they will have a separate VGA port.
Wikipedia article on DVI [wikipedia.org]
Re: (Score:2)
(...) but you obviously can't send an analog signal into a digital interface.
This is why DVI-D to VGA (and conversely, VGA to DVI-D) adapters don't exist. That said, DVI-I can carry both digital and analog signals (and DVI-A carries the analog ones, though I have yet to see one of those), so I would call that an adapter.
Re: (Score:2)
You obviously haven't bought a graphics card recently :)
Most graphics cards sold in the past 5-10 years ship with at least one of 'em. Nowadays, they all do, since videocards rarely have VGA ports anymore. They normally have some combination of DVI-I, DisplayPort, and HDMI, and provide an adapter to use the DVI-I port as VGA.
Heck, my Radeon 9700 Pro shipped with two such adapters, since it only had two DVI-I ports ;)
Conference rooms (Score:5, Interesting)
Only place I use VGA anymore (and have used in the past 4-5 years) is for overhead projectors in conference rooms.
Re: (Score:2)
That is because of bad device drivers. Most free drivers for Linux work ok, but the closed ones aren't in any way reliable.
Re: (Score:2)
I know it's not the same scenario, but hooking up my old laptop (it sports a GeForce Go 6200) to a CRT TV wasn't that hard to do. So... what about them beamers?
Re: (Score:2)
I've had problems with projectors from my Mac. The problem is that Apple doesn't provide a way of overriding the 'detect displays' feature. In a lot of places, people run really long VGA cables (often chained together) from the projector all around the edge of the room, so the cable is way out of spec and can't carry the DDC signal back from the projector. This means the laptop can't detect the display. With Windows and *NIX, you can just provide it with a resolution and refresh rate. With the Mac, y
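On the Windows/*NIX side, here's roughly what "just provide it with a resolution and refresh" looks like on a Linux laptop when DDC/EDID never makes it back from the projector. This is a sketch under assumptions: the connector name "VGA1" and the 1024x768 mode are placeholders, so check what xrandr reports on the actual machine first.

    # Sketch: force a 1024x768 @ 60 Hz mode on a VGA output whose EDID can't be
    # read (e.g. long, chained projector cables). Assumes the standard cvt and
    # xrandr tools; "VGA1" is a placeholder connector name.

    import subprocess

    output = "VGA1"
    cvt = subprocess.run(["cvt", "1024", "768", "60"],
                         capture_output=True, text=True, check=True).stdout
    fields = cvt.splitlines()[-1].split()          # 'Modeline "1024x768_60.00" 63.50 ...'
    name, params = fields[1].strip('"'), fields[2:]

    subprocess.run(["xrandr", "--newmode", name, *params], check=True)
    subprocess.run(["xrandr", "--addmode", output, name], check=True)
    subprocess.run(["xrandr", "--output", output, "--mode", name], check=True)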
Re: (Score:2)
Sometimes it's bad drivers and software: most manufacturers think that Windows' controls are not good enough, so they provide their own, unique to each card. Apparently trying to explain how to mirror displays over the phone is hard enough as it is without manufacturers including their own crapware.
Sometimes it's just bad design. If, for example, you get a brand new Dell with only a VGA port on it, for some reason it doesn't detect any external display, so you have to manually enable it, after which the main display
Re:Conference rooms (Score:4, Insightful)
Every school I've ever worked in has VGA equipment as the default. PCs, laptop connectors, projectors, monitors, the video distribution system and digital signage all have/had VGA connectors. They might have HDMI / DVI *as well*, but they all operate on a VGA basis primarily. I can see why - it's simple, and it's compatible back to their oldest available machines without having to spend extra money on adaptors and convertors (which half the time break or just plain don't work because they bought a DVI adaptor with the wrong pins and expect it to work). The rest of the advantages don't have any bearing - they can get whatever resolution they like going through meters of inexpensive 15-core cable that's been in the walls for years to their projectors over VGA and *not* notice any performance degradation. The only places that I would argue NEED better connectors are those that are specialist anyway - CAD, video and huge display signs.
In general use, what advantages does anyone with significant investment in VGA really see from a DVI / HDMI conversion? Hell, I ran a 1024x768 VGA signal over a 75m CAT5e cable with an adaptor (the Cat5e was actually already there, by luck, so wasn't installed with that usage in mind) and that's STILL running a school's main entrance signage on a HUGE TV and nobody cries about the signal quality (the TV also has HDMI, SCART, S-Video, Component, Composite, RF, etc. in and works fine for them all but why bother when the lowest common denominator just works for everyone?).
If something works, that's good enough. Especially if it works on ALL machines you can get (up until now, obviously). If you chose DVI-only, it wouldn't work on older machines without adapters. If you chose HDMI-only, it would work on even fewer. The transition is now starting to go the other way, but it'll be another 3-4 years before schools and large businesses have to go to special efforts (e.g. special order, pick up particular models, or use adaptors) to get VGA inputs/outputs on their devices.
This isn't a shock, just as getting rid of PS/2 ports isn't a shock, because several alternatives already exist. The problem is that it's an enforced obsolescence of something for not-very-convincing reasons. Give it three years and there'll still be places with VGA convertors everywhere until hardware replacement time is due. VGA isn't a chore to use, or a problem to configure (hell, teachers can manage it - it's just a matter of Fn-Whatever and plugging a cable in). I have *just* been given my first work laptop that has something more than a VGA or S-Video port, and that's because it's really a gaming laptop in order to meet my minimum spec.
Computers will keep coming with VGA. People will buy adaptors for a few years until they buy a non-VGA device on both ends. Then the world will carry on as normal. It wasn't a "disaster" that actually needed to be fixed in the first place - I still have no use for DVI or HDMI devices myself - and thus there are probably millions of people who will have to do something in the future, but they would have had to eventually anyway, and it'll be absorbed into their ordinary replacement costs. All it means is that I don't budget quite so much for VGA cables next year, and I have to convince my employers that all those perfectly-working interactive whiteboards and projectors really do need, at minimum, a new cable run, a new socket or a new adaptor, unless they want me to overhaul the entire place. Big deal; I have to have that conversation about once every six months about *something*.
The sky isn't falling. It wasn't even cracked to start with. We just have this new, brighter, hi-def sky that apparently needs a pair of sunglasses to view properly.
Re: (Score:2)
Only place I use VGA anymore (and have used in the past 4-5 years) is for overhead projectors in conference rooms.
And the only people who can fail at projecting their slides worse than a laptop with a 1999 Linux distro are those with a Mac and no VGA adapter... I've seen this happen plenty of times...
Damn... (Score:4, Funny)
I thought we would finally be rid of Spike's Video Game Awards.
Tell me they won't get rid of RF TV jack out too?? (Score:2)
Oh, wait...
I'm sticking with VGA (Score:2)
On both of my HDTVs (different brands: a cheap-o from 3-4 years ago, and a high-quality one from this year) I'm using a VGA cable from the DVI out on my computers. Why? Because whenever I use a DVI-to-HDMI cable, it results in horrendous overscan instead of displaying at the native screen resolution. This means everything is scaled up, even though the monitor resolution is reported correctly to Windows and OS X, leading to horrible image quality. You can somewhat correct this with system display settings, but th
Re: (Score:2)
I'm still trying to figure out why more TV manufacturers don't include an actual DVI port on their products...
Re: (Score:2)
My first HDTV had that option. By default (and oddly) it would overscan but not rescale, which led to a frankly worthless black border around the whole screen. It was easily turned off in the menu, though.
My current TV does not. It does the whole rescaling/overscanning thing, and it makes using a DVI/HDMI hookup for my computer worthless. Luckily, it has a VGA port, even though it doesn't maintain the aspect ratio when scaling non-16:9 resolutions (another downgrade from the earlier set).
I miss my old TV so m
Re: (Score:2)
From my experience there usually is, but they don't document it very well.
For example, I have a Samsung TV with a PC hooked up over HDMI. To turn off overscan and rescaling, I have to go into the menu and rename that HDMI input to "DVI/PC". Everything in the UI suggests that's just the name I'll see on the input menu, and for every other combination of input type and possible name I've tried, that's all it is. The manufacturer's docs say I should do this when connecting a PC but don't say any
Re: (Score:2)
They do; it's called HDMI. I hook DVI into an HDMI port all the time.
Re: (Score:2)
I'm not 100% sure, but the HDTV set I bought just a few months ago came w/ a VGA port, but no DVI ports. I think this is because HDMI and DVI are somehow compatible without conversion?
From wikipedia
Re: (Score:2)
Funny, it's the other way around for me on my TV. It has a VGA port, but refuses to go beyond 1024x768, which looks horrible on a 42" TV. HDMI detects the right native resolution and works instantly.
Video Cards Will Continue It On (Score:5, Insightful)
Intel will drop VGA from their chipsets, and this will be a boon for video card makers. Video card makers already cater to those who need better video, or different ports, or more ports, or whatever. As long as monitors include a VGA port, card makers will, too. Intel has the luxury of being able to drop it. It will save them money. They also know that no one is being left behind, thanks to card makers. It is a win for both sides.
Re: (Score:2)
DVI can carry a VGA signal via an adaptor. You don't need a separate VGA port on there. I've got a card from as long ago as 2005 that has two DVI ports and came with a matching pair of adaptors.
What next, floppy drives? (Score:4, Insightful)
I'm still using CGA you insensitive clod (Score:2)
Re: (Score:2)
I don't need your low resolution and puke-inducing color palettes, I'd rather use Hercules [wikipedia.org]!
Re: (Score:2)
I actually had a computer with a Hercules graphics adapter and an amber monochrome monitor. I had the manual to my dot-matrix printer and wrote a program in BASIC to put it in graphics mode and print monochrome graphics on it. IIRC it could BLOAD video memory dumps to print, and I included BSAVE hotkey functions in a few of my other programs to save screenshots that I could print out.
Good times...
Bye VGA (Score:2)
Re: (Score:2)
I never know if my system will be a desktop or end up as a server (spare system). in my 'server room' (ha!) I have a bunch of systems that I run mostly headless, but occasionally I have to see some console message or control the boot process or something. maybe the network is not up (stupid persistent.rules.linux, doh!). but I'll need console access and for pc, console != rs232. console = vga and keyboard.
for that reason, I've been buying mostly boards that have at least an onboard vga connector. this
Display port to 5 connector coax adapter? (Score:2)
Re: (Score:2)
i have a VGA to 5 coax cable. it even handles the sync on green if needed (although that is a second adapter..)
its about DRM and control (Score:5, Insightful)
analog video is video you can't 'control'. no DRM (or none that is hard).
its not at all surprising people of interest want to kill it.
they are convincing people to abandon spdif, for audio, too. the new kids who are brought up with hdmi think there's nothing wrong with it. in fact, the way they mixed audio and video made the whole combo stream all DRMed. we once had mostly free and clear spdif (scms ignored since it was defeatable easily) and then they upped the bitrate so that spdif toslink and copper paths would not easily (or at all) carry the new digital audio formats (blu ray audio and so on). the new codecs are using bitstream audio for all channels which is HUGE overkill for sound tracks on movies, but its a middle finger from the entertainment industry saying 'at least we get to fill up your disks with more bits than we needed'. effectively a DOS attack from them to you, stealing your disk space when you do direct BD rips or keep BD copies around.
hdmi audio is now in the so-called 'protected path' and that's never a good thing for consumers. spdif audio was never in any protected path and that's why they are trying to kill it.
vga video is also not in a protected path and so they also want to kill it.
it really is all about 'migrating the user away' from the open formats and onto closed, controlled ones.
Re: (Score:3)
they are convincing people to abandon spdif, for audio, too. the new kids who are brought up with hdmi think there's nothing wrong with it. in fact, the way they mixed audio and video made the whole combo stream all DRMed. we once had mostly free and clear spdif (scms ignored since it was defeatable easily) and then they upped the bitrate so that spdif toslink and copper paths would not easily (or at all) carry the new digital audio formats (blu ray audio and so on). the new codecs are using bitstream audio for all channels which is HUGE overkill for sound tracks on movies, but its a middle finger from the entertainment industry saying 'at least we get to fill up your disks with more bits than we needed'. effectively a DOS attack from them to you, stealing your disk space when you do direct BD rips or keep BD copies around.
Wow, a conspiracy to add too much quality to the media we buy so we are discouraged from making copies of it. So eeeeeeevil. They even put a mandatory scratch-resistant layer on BD discs to make them EEeeeeviillly last longer.
Personally, I think if you had to buy them, they aren't "rights". You are buying permission and it comes with conditions. If you don't like it, go make movies or something. Yah, the government intervened and decided what a fair amount of permission is, but that does not give it equ
UGH (Score:2)
Go to Monoprice (Score:2)
A VGA Cable costs $5. A DVI cable costs $25, and that's if you order from a really cheap vendor, and you have to pay shipping on that shit.
You have to pay shipping on the $5 VGA cable too; the VGA cables I saw at Best Buy were far more expensive than $5. So you might as well go to Monoprice and order some $5 HDMI cables and some $5 network cables to be shipped in the same box, and then sell them to friends and family at a reasonable markup. Do you see the business opportunity yet?
But another problem is with standard-definition TVs and DVD recorders. I predict that used CRT SDTVs will still sit on the shelves of thrift stores come 2015
Re: (Score:2)
Umm? [amazon.com] Or how about this? [newegg.com]
Maybe you should stop buying cables at brick-and-mortar stores? They always rip you off on cables, no exceptions - the theory being, I imagine, that you need the cable right now or else you wouldn't be buying it from them.
Re: (Score:2)
I know it's UK prices, but you are getting seriously ripped off [ebuyer.com].
Display Port? No thank you. (Score:5, Interesting)
One issue is that the physical connector is not very sturdy. One good whap (which is not uncommon in an academic environment) and the connector gets destroyed, sometimes taking the graphics card with it. We've had to replace several graphics cards because of this. This was not a problem with our previous batch of machines, which used *gasp* VGA. There are other issues as well, to the point that there was actually some serious discussion at upper levels of management about the possibility of returning the whole lot of computers (remember, about 500) and demanding that the replacements use either VGA or DVI. In the end, they decided that this would be more trouble than it was worth, and that we'd just deal with DisplayPort issues as they arise. Which they continue to do.
As for myself, I have no intention of ever using DisplayPort as my primary display interface on my personal machines unless there is literally no other option. In my opinion, DVI is superior in every respect that matters, and even VGA is preferable.
New KVM then. :( (Score:3)
Aww, that means I have to buy a new expensive KVM with DVI or something.
Splitters/Extenders work better on VGA (Score:4, Interesting)
Not everything is better digital. Analog is a good format for long cable runs, like running a display over CAT-5. I don't like the change to DisplayPort. It makes you waste money if you want to change formats from DVI to VGA, because the DP-to-DVI adapters won't convert further to VGA. So you need a DVI converter AND a VGA converter. At 25 bucks a pop.
DVI and DisplayPort are both more expensive in most situations. The monitors (as mentioned above) do not come with DVI cables.
All in all, I see this as a loss for the consumer.
The big advantage of DisplayPort is to drive screens that don't even exist yet - resolutions that DVI cannot handle. But what needs those 1080p+ resolutions yet? Desktop monitors do not. Big screens do not. What, then, is the point?
Re: (Score:3)
Ok, here is a price check from Monoprice.
3' cables: VGA: $2.47 - DVI: $3.64 - DispP: $4.05
Those are obviously economy cables - but they still work fine. DisplayPort is the most expensive, by an entire $1.58. As DisplayPort becomes more popular the price will come down. In addition, because DisplayPort uses fewer* conductors than DVI, the cost of longer cables should be less than for DVI. VGA has even fewer conductors but, because of the analog signals, has to use cables with better shielding.
The big advantage for DisplayPort is to drive screens that dont even exist yet. Resolutions that DVI cannot handle. But what needs those 1080p+ resolutions yet?
Well, my
Re:Paving the way for HDCP 2.0 (Score:5, Insightful)
it likely will end up that if users want to watch new movies, they have to upgrade the computer, video card, and monitor to support the copy protection.
Probably some will. Most will just figure out that it's way cheaper to head for TPB or the like and get movies in a format their hardware supports, one that's also more flexible when it comes to the storage medium it can reside on.
Sometimes I wonder what the advantage of those "copy protected" devices I hear about is supposed to be. I can't see a single good thing in them.
Re: (Score:2, Insightful)
You're highly optimistic regarding the average consumer.
What drugs are you consuming?
Re: (Score:2)
I fear that even DisplayPort monitors likely wouldn't work unless they have the latest (HDCP 2012 or whatever they will call it) standard.
DPCP [wikipedia.org], actually. I'm sure they'll want to push everyone there now that HDMI is utterly broken.
Re: (Score:2)
That is easily fixed with an HDMI-to-VGA adapter that also scrubs the useless HDCP out. I have two myself, and they work great for hooking an HDMI Blu-ray player to an analog-only plasma display.
Search for HDFury3 to get your own.
Re: (Score:2)
Correct. They should’ve been more specific.
Re:YOU CAN'T SPELL VAGINA WITHOUT VGA !! (Score:5, Informative)
YOU CAN'T SPELL VAGINA WITHOUT VGA !!
C-U-N-T.
(If this gets modded as 'Informative' I'm putting it on my resume.)