NVIDIA Shows Off "Optimus" Switchable Graphics For Notebooks 102
Vigile writes "Transformers jokes aside, NVIDIA's newest technology offering hopes to radically change the way notebook computers are built and how customers use them. The promise of both extended battery life and high-performance mobile computing has seemed like a pipe dream, and even the most recent updates to 'switchable graphics' left much to be desired in terms of the user experience. Having both an integrated and a discrete graphics chip in your notebook does little good if you never switch between the two. Optimus allows the system to seamlessly and instantly switch between IGP and discrete NVIDIA GPUs based on the task being run, including games, GPU encoding or Flash video playback. Using new software and hardware technology, notebooks using Optimus can power on the GPU and pass control to it in roughly 300 ms, and power both the GPU and its PCIe lanes completely off when not in use. This can be done without being forced to reboot or even close out your applications, making it a hands-free solution for the customer."
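The routing the summary describes — pick the discrete GPU only for workloads that need it, and power it (and its PCIe lanes) off otherwise — can be sketched as a toy model. Everything here (the class name, the workload list, the 300 ms figure taken from the summary) is illustrative, not NVIDIA's actual driver logic:

```python
import time

# Workloads the summary says justify waking the discrete GPU
# (games, GPU encoding, Flash video). Illustrative set only.
DGPU_WORKLOADS = {"game", "gpu_encode", "flash_video"}

class HybridGraphics:
    """Toy model of Optimus-style routing between IGP and discrete GPU."""

    def __init__(self):
        self.dgpu_powered = False

    def route(self, workload):
        if workload in DGPU_WORKLOADS:
            if not self.dgpu_powered:
                time.sleep(0.3)  # summary claims power-on/handoff in ~300 ms
                self.dgpu_powered = True
            return "discrete"
        # No reboot and no app restart: render on the IGP and cut
        # power to the discrete GPU and its PCIe lanes.
        self.dgpu_powered = False
        return "igp"

gfx = HybridGraphics()
print(gfx.route("web_browsing"))  # igp
print(gfx.route("game"))          # discrete
```

The key point of the design is that the decision happens per-task while everything keeps running, rather than at boot or login time.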
VOODOO (Score:5, Funny)
I knew if I just held off upgrading my Orchid Righteous 3D (Voodoo 1) card long enough, discrete 3D cards would become relevant again. You guys with your fancy Banshee cards can suck it.
Re: (Score:2)
oh happy memories.
I'm sure you're finding the massive 640x480 resolution just as awesome as I do.
Re:VOODOO (Score:4, Funny)
oh happy memories.
I'm sure you're finding the massive 640x480 resolution just as awesome as I do.
Well, you're clearly not aware of the nature of the sham that pervades high-resolution graphics.
For instance, graphics hardware manufacturers will happily tell you that a resolution like 1920 x 1080 has nearly seven times as many pixels as 640 x 480. But what they don't tell you is that all of these pixels are a whole hell of a lot smaller than the ones on your good old VGA monitor! With my monitor, I may not have a lot of pixels, but I'm damn sure I'm getting my money's worth out of every single one!
Re: (Score:2)
Exactly, that's why I only have a resolution of 9x9 on my 25 inch screen.
Re: (Score:2)
Exactly, that's why I only have a resolution of 9x9 on my 25 inch screen.
Really, that's all you need to play a rousing game of "dot" [youtube.com]...
Re: (Score:2)
Yeah, I was getting that, too. Really annoying...
Re: (Score:2)
Pfft. Everyone knows the Monster3D was where it was at! [wikipedia.org]
Re: (Score:1)
Same card (both used the reference 3DFX Voodoo 1 chipset), but the Orchid card came out first ;)
Re: (Score:2)
Ah yes, I forgot about that one! That was from Canopus, right?
Boring (Score:1)
Please wake me when a company has brought the GPU on die.
Re: (Score:3, Insightful)
Why? So when my GPU fucks itself I have to buy a whole new cpu/GPU combo?
No thanks, I'll stick with discrete individual parts - makes repairs and upgrades so much easier.
Re: (Score:2)
Re: (Score:2)
Well, Intel can put graphics and processor in the same package without putting them on the same die. Indeed, they have already done so with their latest dual-core chips (the current-gen quad-core chips don't have any support for shared-memory graphics at all).
The thing is, Intel has failed to make decent graphics solutions (they are getting better, but are not yet up to the standard of even the integrated graphics in NVIDIA chipsets, let alone dedicated graphics cards), and NVIDIA hasn't even tried to make x86
So where is your discrete IDE controller? (Score:2)
On your MB? What about your discrete USB ports?
Re: (Score:2)
Crap (Score:1, Funny)
Transformers jokes aside
This article is ruined for me :(
Re: (Score:2)
In the pcper.com video on the first page there is at least some homage to the Transformers that starts right around the 2:30 mark...
http://www.pcper.com/article.php?aid=868 [pcper.com]
I like my desktop. (Score:2)
I guess when they have dual CPU notebooks with full size keyboards and 21" displays, I might be more interested in them. But I'd also want solid state hard drives and hdmi cables to wire them to the TV...
these guys are close...
http://hothardware.com/News/Eurocom_launches_QuadCore_XEON_Based_Notebook_/ [hothardware.com]
But oddly, I would like to have an SSI EEB desktop case, that lies flat, like old PCs used to...
Re: (Score:2)
Re:I like my desktop. (Score:4, Insightful)
What all the cool kids are doing is dropping cases altogether. That's right, nothing looks more badass than your motherboard lying on the desk with silicon chips sticking up in the air, with a giant fan overhead to help keep things cool and circulated. Your friends will be so jealous of all the blinking lights.
As for Optimus, I think it's a great idea. This change could come to desktops just as it has to notebooks, if there is enough demand for such a product.
Think about it: you had to factor in the power supply when you bought that new graphics card, so imagine how much power it's actually eating up. Now imagine if your desktop didn't have to draw that much power when it didn't need to.
Re: (Score:1)
That technique is also helpful for troubleshooting and verifying laptops before putting them back together, because it's much more of a hassle to do so. And you don't even need a big fan as long as all the motherboard fans are attached, just make sure everything is laid flat on an ESD mat or other protective surface.
Re: (Score:3, Funny)
Re: (Score:2)
What with nVidia's unceremonious exit from the IGP market thanks to Intel's licensing, and the introduction of every-Intel-chip-comes-with-a-GPU, tech like this is a shrewd, and pretty essential, move by nVidia in order to remain relevant in the middle tiers. If people, whether on a laptop or desktop, can get the power savings of an Intel IGP with the ability to fall back on to a decent GPU, they'll claw back a good deal of marketshare from "prosumers" and the like. Conversely, ATI has made incredible impro
Re: (Score:2)
A twelve pound notebook? Sounds like a niche product.
Re: (Score:2)
By my estimation, GBP12 = NGN3,000 - I hope it's as good as OLPC!
Re: (Score:2)
Re: (Score:2)
Yes, but that computer wasn't competing with lighter devices. If you wanted an IBM PC that could be easily moved, the Portable PC was (or was close to) your only option. Now, we have laptops, netbooks, tablets, pdas and so on. If you were a field engineer/scientist, and had, say, a ADC PCI card or a General purpose GPU that you needed to use, there are options. Niche options
But the lighter a computer is, the more often it will be carried around. Even eight pounds can be a burden, presenting the user with a
Re:I like my desktop. (Score:5, Funny)
I guess when they have dual CPU notebooks with full size keyboards and 21" displays, I might be more interested in them. But I'd also want solid state hard drives and hdmi cables to wire them to the TV...
But... But ... But ... Marketing told me you guys wanted postage stamp size touch sensitive screens, batteries that last two hours, and 3 second e-ink refresh rates. And its gotta use a cloud, whatever that is. And an app store, gotta have an app store. I guess you must be wrong.
Re: (Score:2)
I guess you must be wrong.
Here's the crazy part. I don't even care about battery life or even having a battery. I just want something I can plug in wherever.
Re: (Score:3, Funny)
Here's the crazy part. I don't even care about battery life or even having a battery. I just want something I can plug in wherever.
Sounds good, as long as we don't let the folks in the adult novelty department get word of it.
Re: (Score:1)
You can plug keyboards and displays into a notebook.
HDMI is currently only somewhat available, and SSDs are a tough trade-off if you are concerned about the amount of drive space (without an external drive).
Dual CPUs no, but multiple cores yes.
And they cost more.
Still, the number of people with needs that are not met by an $800 laptop is shrinking pretty fast.
Re: (Score:2)
I like my desktop too. But I can't carry it on the plane with me, it's a pain in the ass to haul to a friend's house when we want to do some LAN play, and I can't bring it other places I go so I still have a place to offload photos and such.
Desktops are great as long as you never leave your house, or never need or want a computer when you do so.
no transformers jokes? (Score:4, Funny)
Re:no transformers jokes? (Score:4, Funny)
Re:no transformers jokes? Bah! (Score:2)
Yes, we can talk about hardware without making a bunch of stupid jokes about its name*.
One of the great features of the Optimus chipset is its pipelining architecture, called the "Convoy". With this system a number of pending GPU tasks can be stored into containers, and the GPU hardware will process them quickly, moving the data to its destination, transforming it as necessary, etc. But the hardware apparently kept dying on them during the demonstration: they were able to get it up and running again each
What a relief (Score:5, Funny)
That's good. I'm tired of finishing before my video player can render the first frame.
Re: (Score:1)
"Optimus can transform and roll out"
too
Re: (Score:3, Informative)
Getting older will help your stamina too.
MacBook Pros (Score:2, Informative)
I believe the latest model MacBook Pros have been doing this for at least a year.
something like it on linux (Score:5, Informative)
Re: (Score:1)
Re: (Score:1)
"For a long time, things like highlighting text in firefox and then dragging it led to flickering of the screen, "
I had a monitor that would flicker whenever I opened up certain windows only while compiz was enabled. It didn't seem to flicker if the window was too small, or for anything else other than large windows with compiz enabled, and seemed to be due to the "beam-up" animation that was displayed whenever new windows were opened. After it first flickered when opening the window, it would not flicker w
Re: (Score:2)
Weird. I used an embedded nVidia chip (7050PV) last year, and I never had those problems. Did you forget the following options in your xorg.conf?
Option "AddARGBGLXVisuals" "true"
Option "UseEvents" "false" # This option must be either undeclared or false, in order to avoid periodic short-term freezes on beryl and other OpenGL intensive programs
Re:MacBook Pros (Score:4, Informative)
Nope, not really. I have one of those, and the video in the PCPer article shows the process on a MacBook Pro. You have to change a setting in the control panel and then log out of the system to change GPU modes.
Re: (Score:2)
I have one of those and the video on the PCPer article shows the process on a MacBook Pro.
I read the PCPer article, but I don't recall any videos showing OS X. Are you referring to the process of switching graphics modes under Windows or under OS X?
Re: (Score:3, Informative)
It starts at time stamp about 3:00
http://www.pcper.com/article.php?aid=868&type=expert&pid=1 [pcper.com]
Re: (Score:2)
Ah, well the MBP solution is not nearly as cool as I thought it was then.
Re: (Score:2)
That's such an awkward solution that I was always amazed it was allowed to appear in an Apple product. Or any other product for that matter.
Re: (Score:2)
Nope, they cannot, just an effect of OSX's security permissions and driver model. Rebooting is required.
Besides, what's the point of having dual GPUs if you can't use both simultaneously for really heavy data processing?
Oh, that's right - Intel IGP, nVidia Discrete - you couldn't SLI it anyways without some serious hardware and software workarounds.
Re: (Score:2, Interesting)
The new thing seems to be that you can actually switch between the onboard and 'real' GPU on the fly and fast while everything is running.
The previous laptops with switchable graphics, such as my Sony Vaio which had a GeForce and an Intel chip, did have to at least restart the graphics system (on OS X) or reboot the whole computer (Windows) in order to go into power-saving mode.
In my experience, I usually was too lazy / didn't want to close my work and kept using the good GPU all the time. The only times
Hey if it extends battery life... (Score:3, Interesting)
Re: (Score:1)
It is actually pretty nice to have the long battery life during the work/meeting day, and then plug it in and boost the graphics in the hotel room to participate in the guild raid that night.
Re: (Score:2)
Ah but you can use hardware acceleration in your desktop environment, but you might not always want it on. Playing video, running something like photoshop - theres a bunch of stuff that uses the video card that isn't a video game. Just FYI. So if you are sitting there browsing slashdot for an hour, it can switch to the Integrated low power one, but as soon as you boot up Media Player or something, it can switch to your full blown power monster.
Re: (Score:1)
...I'm all for it. But by how much will it extend the battery life? And when they say it will "Drastically" change the notebook market I doubt that; netbooks folks won't care about 3D and Desktop Replacement folks don't care if their machine is plugged in. Mabye in a smaller segment of mobile gamers this will make a difference.
I'm one of the "netbooks folks", and the prospect of being able to play video, or even basic accelerated games without running out of juice in less than half the regular time sounds great to me.
Re: (Score:1)
Re:HybridSLI? (Score:4, Informative)
Read the article at pcper.com - it talks about the current versions of switchable graphics and how the new Optimus differs.
It's not a cosmetic change.
Can't they make a 'smarter' GPU? (Score:3, Interesting)
I would have thought that, instead of switching between a 'low power' video chip and a 'high power' GPU, they would have concentrated on just making the NVIDIA graphics cards use less power when not doing things like rendering 3D graphics or decoding video. I mean, mobile CPUs have some smarts built in to let them vary how much power they consume; can't they do that with GPUs?
Re: (Score:1)
They do but nVidia doesn't seem to be paying much attention to it. I had to enable this myself on my 8800GTS, using video memory as a measure. [slashdot.org]
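A hypothetical sketch of what the parent describes — choosing a GPU performance level from how much video memory is in use, on the theory that heavy 3D apps allocate a lot of VRAM. The thresholds and level names are invented for the example; this is not NVIDIA's actual PowerMizer logic:

```python
# Invented thresholds: map VRAM in use (MB) to a performance level.
def perf_level(vram_used_mb):
    if vram_used_mb > 256:
        return "max"  # big 3D workload: full clocks
    if vram_used_mb > 96:
        return "mid"  # light acceleration: intermediate clocks
    return "min"      # desktop idle: lowest clocks

print(perf_level(32))   # min
print(perf_level(400))  # max
```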
Re: (Score:2)
Not entirely. While you can do that, the chip in a laptop still has some real limits. It can't give off more than X watts of heat, because the laptop just can't dissipate it.
But if the GPU used for high-intensity activities (such as games) is external to the laptop, it can give off 150 watts of heat, because the external enclosure can provide the necessary cooling capacity.
I'd love something like this. I have my MacBook Pro which I really like, but don't do much in the way of 3D. I'd love to be able to plug in a goo
Re: (Score:2)
Sounds like a new style of docking station with a high-end GPU built-in would be a good direction to go in.
Re: (Score:2)
And it's a piece of shit, too.
"providing 4 GBytes/s bandwidth to support ATI Radeon graphics cards to enable you to run the most demanding graphics applications."
Sorry, my crap onboard 8600GS tears that apart at 22.4GB/s
Needs moar lanes.
Re: (Score:2)
That 22.4 GB/s is the bandwidth between your GPU and your video card's RAM. It's not the bandwidth between your system and your video card. That bandwidth is a 16 lane PCIe bus. Which, and this might be a surprise to you, is 4 GB/s (250 MB/s per lane * 16 lanes.)
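A quick sanity check of the arithmetic in this subthread: aggregate PCIe throughput is just the per-lane rate times the lane count (250 MB/s per lane for PCIe 1.x, 500 MB/s for 2.0, after 8b/10b encoding):

```python
def pcie_bandwidth_gb(lanes, per_lane_mb):
    """Aggregate one-direction PCIe bandwidth in GB/s."""
    return lanes * per_lane_mb / 1000

print(pcie_bandwidth_gb(16, 250))  # 4.0 -> PCIe 1.x x16
print(pcie_bandwidth_gb(16, 500))  # 8.0 -> PCIe 2.0 x16, per direction
```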
Re: (Score:2)
Incorrect
PCI Express 2.0 is what my card runs on - 16 lanes, 500MB/s.
That's 16GB/s for me. And it's still bottlenecking my GPU.
Re: (Score:2)
So you're on a 2.0 bus. And no, the 8 GB/s, full duplex, still has nothing to do with "bottlenecking your GPU." Your GPU memory interface is completely separate from its host interface. They have nothing to do with each other.
Re: (Score:2)
I have a new PCI-E 3.0 test desktop board as well, with the same GPU onboard (8600GS,) and half the memory (512 on my laptop versus 256 on the desktop.) Same games run better on the desktop board.
That's the same chip, same fab process, same clock speed and power consumption. Different bus, the one on the faster bus has half the memory, and the memory otherwise is the same (GDDR3.)
I suggest rethinking your statement.
Re: (Score:3, Interesting)
The problem with GPU throttling is it's far more visible (pun intended). If your CPU is rapidly switching between 3.0 GHz and, say, 1.2 GHz, you probably won't notice at all, but if your game or video app has uneven framerates or the dreaded micro-stutter, you will feel the overwhelming urge to smash your laptop against the nearest brick wall.
GPUs typically have two power modes: power-saving (idle), and full-blast (gaming). Your device drivers kick it into high-power mode whenever you launch a 3D app, so th
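The two-mode policy described above reduces to a few lines — idle clocks by default, full clocks whenever a 3D context is active. The clock numbers here are made up for illustration:

```python
IDLE = {"core_mhz": 300, "mem_mhz": 100}  # power-saving (desktop idle)
FULL = {"core_mhz": 650, "mem_mhz": 900}  # full-blast (3D app running)

def select_power_state(active_3d_contexts):
    # The driver kicks into high-power mode as soon as any 3D app launches.
    return FULL if active_3d_contexts > 0 else IDLE

print(select_power_state(0)["core_mhz"])  # 300
print(select_power_state(2)["core_mhz"])  # 650
```

With only two states and an all-or-nothing trigger, there is no intermediate clocking to cause the uneven framerates the parent worries about — which is exactly why drivers keep it this coarse.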
Re: (Score:2)
Great (Score:1)
Re: (Score:2)
From what I am told that is coming sooner than you might think. Expect to see something by April!
Linux hybrid graphics (Score:2, Informative)
The current progress of Linux hybrid graphics. [blogspot.com]
There has been a lot of progress in this area in the past few weeks. Wonder if this will let NVIDIA switch GPUs without restarting X.
Linux support (Score:2)
Re: (Score:2)
Uhh....yah. It does use a profiling system.
That is detailed in the article. :)
correct me if im wrong.. (Score:1)
Re: (Score:2)
About time, if it works as advertised. (Score:2, Informative)
I have suffered from one of the multiple-display-device solutions, in the form of an Alienware M15X, so Optimus sounds like a huge step forward.
While in theory it was nice to have both a battery-friendly Intel GMA and a reasonably powerful Nvidia GeForce card in one (relatively) portable package, in reality it was lousy. As suggested by TFA, you had to reboot to switch between them, whether running Windows XP or Vista. That would have been bad enough, but wait, there's more!
This effectively meant that I c
Re: (Score:2)
But you really shouldn't have to reboot to switch devices.
Video devices have not previously needed to be "hot swappable," so unlike many other things, the driver model for graphics hardware probably doesn't allow for devices to be turned on and off. In the case of your Intel/nVidia combo, I'm guessing that the BIOS enabled one or the other at boot and relied on Windows to detect this and switch drivers on the next start. While Windows can work with two distinct graphics cards of different types, they both need to be running from boot until shutdown
Re: (Score:2)
The real solution to this problem is to reduce the base power consumption of the GeForce. Dual-GPU switching is a kludge, nothing more. A crutch for an inefficient GPU.
Re: (Score:2)
The thing I find amusing is that I've never had any problem with Powermizer. Even my "unsupported" GTS240 does it fine. I haven't used it under Windows because XP won't install on my Gigabyte motherboard (their response is that it works for them) but I've had no problems under Linux. Previously, I had other cards with powermizer, including laptops with 3700FX and before that 1500FX Quadro chips, and it worked fine on them, too. So I got a low-power video card (the GTS 240) and I'm rocking out with a 460W po
How fitting (Score:2)
what happened to good hardware design? (Score:1)
I am a ThinkPad T500 owner with switchable Intel/ATI graphics, and it is a nice feature even though I need to reboot and change the mode in the BIOS to use one chipset or the other on Linux (I have not tried the recent X server restart experiments). I use the Intel IGP more than 95% of the time, but I still consider this software switching a horrible hack. Why not design efficient chips (ATI/NVIDIA) that can power down parts of themselves when their advanced features aren't in use?
This is like putting two processor like the most powe
Re: (Score:2)
This is like putting in two processors, like the most power-hungry Intel chip and an Intel Atom, and building software to switch between them when needed...
Shh, don't give them any new ideas.
I read articles about e-pcie years ago. (Score:2)
Where is it? An external PCI Express slot on the laptop - some kind of high-end, many-pin plug going out to an external, powered '3D brick'. Nothing ever eventuated :/
A Michael Bay associated brand? Must get. (Score:1)
5870s drop to 27-35W when not gaming already (Score:2)
Re: (Score:2)
27W. Wow. That's only about four times as much as my whole laptop (with Intel graphics). Definitely very low-power! :-P
I sure hope they have a mobile version of that chip...
Re: (Score:1)
this is good in principle (Score:1)