Graphics-Enabled CPUs To Take Off In 2011 172
angry tapir writes "Half the notebook computers and a growing number of desktops shipped in 2011 will run on graphics-enabled microprocessors as designers Intel and Advanced Micro Devices (AMD) increase competition for the units that raise multimedia speeds without add-ons. The processors with built-in graphics capabilities will be installed this year on 115 million notebooks, half of total shipments, and 63 million desktop PCs, or 45 percent of the total, according to analysts."
Supercomputing (Score:4, Interesting)
Re: (Score:2)
The point of a GPU supercomputer is to have a lot of cores working at a slow speed; most GPUs in the hybrids only have a small number of cores (mine has 80). The point of the hybrids is to include low-power graphics without the need for extra hardware, thus reducing cost.
GPU clusters, or just standalone GPUs, want to have as many cores as possible relative to the rest of the machine. To achieve this effectively you want to buy a somewhat bare-bones system and stick some cost-effective high-end GP
Re: (Score:2)
Re: (Score:2)
The biggest problem is memory bandwidth. GPUs are fast because of their high throughput; the problem is CPUs won't ever have enough memory on die to keep up, despite faster communication links. It's a trade-off. I remember Mark Rein of Epic Games saying on-die GPUs would kill video cards, but they never did, because most people don't understand that performance is about trade-offs.
Re: (Score:2)
The GPU in Intel's i5 sucks. As in, even with the reduced CPU/GPU latency, it provides performance equivalent to a mid-range, 3 year old graphics card. Fine for desktop applications, web browsing, and email. Not useful for video processing and definitely none of the computational advantages of putting a high-speed, massively parallel GPU on a PCI-express bus.
Not to mention that the current on-die GPUs offered by Intel don't have access to DX11, PhysX, or OpenGL 4.0, so they can't be used as computational hardware anyway (AMD's existing and upcoming Fusion products can all use DX 11 and OpenGL 4, though PhysX is still kept tightly under lock by Nvidia). Intel's part does, however, have specialized video transcoding hardware, which I guess is more important right now as few things use DX 11/OGL 4's compute shaders, but may be a liability going forward depending on
A GPU by any other name would render as slowly (Score:1)
Re: (Score:3)
Other than for a few hardcore gamers and graphics artists, discrete graphics cards are a total waste of money for most people.
Re: (Score:3)
Re: (Score:1)
Personally I despise GPU-accelerated desktops; all I want is a link I can click that launches an app that performs the functions I require. Anyone who needs more should pay a premium for the added eye candy.
Re: (Score:3)
Not everything in a GPU-accelerated desktop is 'fluff', such as Expose/Compiz Scale/KDE Present Windows (particularly the latter, with window title search). When I have many windows open, it's a vastly superior way to find what I need than anything else. It could have been done without graphics acceleration, but this is the easiest way to get previews of your search results that are as large as possible.
Re: (Score:2)
GPGPU, no, but then most users don't do anything computationally intense, CPU or GPU. Integrated chipsets handle simple desktop effects quite fine; your FUD is out of date. Problems with higher resolutions? What is this, the '90s?
Re:A GPU by any other name would render as slowly (Score:4, Insightful)
Also, this bullshit that users don't do computationally intense stuff is, well, bullshit. Full HD video, 3D movies, and photo processing are computationally intensive even if they are not particularly serious uses of computing power. Don't confuse "important work" with "computationally intensive work".
Re: (Score:2)
Re: (Score:2)
Full HD video --> Works just fine on my 2008 era Intel laptop with integrated video... and I'm using it to drive a 1080p display to show ripped Blu-Rays at full definition on top of a composited KDE desktop.
Re: (Score:1)
You don't need a discrete GPU for any of that. They're not even computationally intensive.
Computationally intensive = 4 hours for a simulation.
Re: (Score:2)
Re: (Score:2)
You know damned well that it's usually I/O throughput, not processor power, that prevents things from loading or displaying instantaneously. "computationally intensive" means it'll peg one or more cores, not that you'll wait a couple of seconds while your operating system accesses the information it needs from a hard drive.
For casual use like surfing the web, writing up a document in a word processor, or playing some stupid flash game on Facebook, it's quite possible to get by with a 1.6GHz netbook without
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Just make sure you set the quality to low.
Unacceptable. If it cannot handle mid to high settings, do not want.
Re: (Score:2)
No, full HD video is not particularly computationally intense with dedicated hardware. The Intel Core i5-2500K decodes 5 simultaneous 1080p streams [anandtech.com] according to AnandTech. Hell, it even has HDMI 1.4a and 3D support if you're into that; this is "integrated" performance in 2011. I don't know how intensive Photoshop with a thousand layers can get, but simple touchups of photos certainly do fine without a discrete GPU.
Re: (Score:2)
I haven't used Photoshop since about 2005-2006, but I don't recall it ever using the GPU. Has this changed?
Re: (Score:2)
The FUD would be the anecdotal evidence that many have already provided that you're completely and utterly wrong.
And to add to the anecdotal evidence: I have a 2-year old netbook which is able to handle 1920x1080p through its VGA out port, with a composited desktop and full motion video at that resolution. While I don't use it for intensive gaming, I have played Civ4 and WoW on the netbook at that resolution. While it's not exactly an ideal configuration, both games are playable. When I'm typing up a docume
Re: (Score:2)
Full HD video isn't computationally intensive if you have specialized hardware, and isn't even practical if you do it on a general-purpose GPU. The same applies to 3D movies (displaying them, not rendering).
Now, photo processing... That was quite heavy work for the hardware available in the late '90s. Today we do it on portable devices.
Yet lots of people do computationally intensive things. Mainly gaming.
Re: (Score:2)
Try connecting a full HD monitor to an integrated Intel GPU and you'll see what I meant.
I have a machine set up that way right now, with a pathetic 945 chipset driving a 1920 x 1600 display at full resolution. It's amazing how much 3D framerate I can coax out of it, provided I stick to what works nicely in its fixed function pipeline. That was my 3D development box for years. We are talking Pentium M here. A Zacate chip consuming significantly less power will blow that right out of the water.
Re: (Score:2)
Apparently you have not used any recent integrated GPUs from AMD. My integrated Radeon 4250 can play Left 4 Dead 2 at a fine framerate with the settings adjusted appropriately. Accelerated desktops are light-duty work. I don't think it'll do OpenCL right now, but AMD is serious about making their integrated graphics better than the barely usable stuff they've been pushing out before; I'm sure future iterations will do all of these things even better than current ones do.
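If you're curious whether a given integrated part actually shows up as a compute device, a quick OpenCL device query answers that. This is just a hedged sketch in C (it assumes an OpenCL SDK and a vendor ICD are installed, and keeps error handling minimal), not a claim about any specific chip:

#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint nplat = 0;

    /* Enumerate installed OpenCL platforms (i.e. vendor drivers). */
    if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
        printf("No OpenCL platforms found\n");
        return 1;
    }

    for (cl_uint p = 0; p < nplat; p++) {
        cl_device_id devs[8];
        cl_uint ndev = 0;

        /* Ask each platform whether it exposes any GPU devices. */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devs, &ndev) != CL_SUCCESS)
            continue;

        for (cl_uint d = 0; d < ndev; d++) {
            char name[256] = {0};
            cl_uint cus = 0;
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            clGetDeviceInfo(devs[d], CL_DEVICE_MAX_COMPUTE_UNITS, sizeof(cus), &cus, NULL);
            printf("GPU: %s (%u compute units)\n", name, cus);
        }
    }
    return 0;
}

If an integrated GPU doesn't appear in that listing, the vendor simply hasn't shipped an OpenCL driver for it yet, whatever the silicon itself might be capable of.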
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I realize that integrated graphics are sub-par, but to say they're useless for games unless you're a masochist (the "self" is redundant...) is a bit of an overstatement. Many of us non-gamers do like to play a game from time to time, but we don't want to spend ourselves into bankruptcy. Guess what? This means we buy older games (cheaper!), and from my experience today's integrated graphics (also cheaper!) handle older games perfectly fine.
Re:A GPU by any other name would render as slowly (Score:5, Insightful)
I can actually chime in on this on the "this is not true" side. My father isn't a gamer by any stretch; the only games he likes to play are various Arkanoid derivatives. Which meant that his work laptop served him just fine.
Then came Shatter, and he all but killed me with his "why won't my laptop run this?" questions. Try explaining to someone running the crappy Intel 945GM, which always ran the old 2D Arkanoid clones, that Shatter just won't work on it.
So now I'm probably giving them my current gaming computer as I upgrade, and I'm pretty sure he'll be telling tech support at work that his next laptop had better include 3D acceleration or else (he's in a position to be able to tell them that). So the old saying applies here: you'll be satisfied with integrated until along comes one killer application that it won't run, and then you aren't. Problem is, with so much software requiring decent 3D graphics on board (even Aero does!), you're still best served by a half-decent dedicated graphics card that powers itself down when 3D features aren't used or are used sparingly.
Finally there's the issue of quality, and that goes beyond 3D. Most integrated chipsets have clear problems displaying higher resolutions, which is why high-resolution laptops generally have a dedicated chipset rather than an integrated solution.
Re: (Score:2)
Ehm...
I just checked. Shatter was released in 2010(!) for Windows. The integrated graphics you mention were released in January 2006. Go and read my comment again: I said older games on modern-day integrated graphics. I'm pretty sure Shatter will work perfectly fine on my wife's ATI Radeon HD 5750 (iMac bought in fall 2010)... which is the integrated graphics sold these days. Will Shatter work on my 2007 laptop? Can't say, because I can't find system requirements for Shatter. However, the ATI Radeon
Re: (Score:2)
Shatter's system requirements were on the level of a 2005 computer at best, which is my point. Specifically, my older computer bought in 2005 ran it fine on max settings with a barely passable graphics card.
As for high resolution, 1280x1024 hasn't been "high" for a decade at least. High resolutions nowadays start at around 1900x1200 and go up from there. My 1680x1050 is average at best nowadays, and in games it tends to be the lowest benchmarked resolution.
Re: (Score:2)
My father had a pre-2000 laptop with a native resolution of 1600x1200. From work. With a (IIRC) 14" or smaller screen. From Dell's business line, of all places. You must be looking at very cheap low-end consumer crap that goes for the lowest common denominator.
And on the topic of Shatter, no. I spent several hours trying to make the damn thing work, down to trying a couple of hacks. Nothing worked. The 945GM's implementation of shaders is simply so horrendously bad that it doesn't work. Google was filled with
Re: (Score:2)
I think the argument has gone a bit off track here. I was talking about laptop IGPs being bad for even basic modern programs that use shaders, low quality when you hook up a desktop "high resolution" monitor, and barely tolerable on a modern laptop's own LCD screen (the RAMDAC typically handles high resolutions over the VGA output awfully). I concede that you're most likely correct about the reasonable resolutions you've mentioned.
Notably the main reason why laptop's own native resolutions haven't really gon
Re: (Score:2)
You're wrong.
Speaking as someone who was checking the specs on Apple products last year, the only Apple products that used integrated graphics then were the Mac Mini, Macbook, and Macbook Pro.
The iMac and Mac Pro all use discrete AMD/nVidia cards/processors.
The current iMac models [apple.com] have a choice of ATI Radeon HD 4670 256MB, ATI Radeon HD 5670 512MB, or ATI Radeon HD 5750 1GB.
Re: (Score:2)
Sadly, no. The graphics cards are a separate card, but not a st
Re: (Score:2)
I didn't see it before writing my reply. Slashdot unhelpfully doesn't show posts at the same level as the one you're currently viewing, only its children.
Re: (Score:3)
Re: (Score:2)
All these new HTML5 demos run at above 50fps. More than enough.
The HDMI port outputs full 1080p to my plasma TV.
A sibling poster is correct - no one really cares about how many FLOPS a GFX card can handle.
Re: (Score:2)
What the hell are you talking about? The Xbox 360 uses DirectX just the same as Windows.
If you could change graphics settings on consoles the same as PCs, you'd probably notice the difference, but I'd assume that playing games on a console is often the equivalent of using "medium" settings on a PC. I say this as someone with both a PS3 and 360, not trying to say that the consoles are inferior in terms of gameplay, just that obviously a modern day PC is going to kick their ass. That's how things work. Consol
Re: (Score:2)
What the hell are you talking about? The Xbox 360 uses DirectX just the same as Windows.
It doesn't use Windows the same as Windows, though; the Xbox 360 OS is based on the Xbox OS, which is based on Windows 2000. But it has almost none of the OS present... which is why you need a quad-core to play Grand Theft Auto when the Xbox 360 has only three cores. OS overhead.
Re: (Score:2)
Still, it's a far cry from banging directly on the hardware.. they could be doing even more on the Xbox if they were allowed to do that.
Re: (Score:2)
Still, it's a far cry from banging directly on the hardware.. they could be doing even more on the Xbox if they were allowed to do that.
DirectX is frankly close enough. The version of DirectX was bumped for both systems to permit DirectX developers to take better advantage of the hardware.
Re: (Score:1)
Re: (Score:1)
I still prefer them to e.g. the latest Need for Speed, where content and playability have been sacrificed on the altar of cartoon realism and HD/HDR graphics.
Let me tell you, there are way more people addicted to WoW than to NetHack and Dwarf Fortress. I'm not saying one is better than the other for gameplay... but I am saying that gameplay is not the only reason to play. You need to reach a minimum threshold of performance. This APU just upped that threshold.
Re: (Score:2)
The target for on-die GPUs isn't Crysis and Call of Duty; it's Aero and Quartz, which Sandy Bridge handles very well.
IGPs are sufficient for most games (Score:2)
IGPs are sufficient for most games. Yes, you read that right. IGPs with good drivers are sufficient for playing the games that most people play. These include Flash games (Farmville) and the "demo" games that come with a typical OS installation (Solitaire).
I hate how supposed "gamers" dominate any discussion that has even remotely anything to do with computer graphics. Not everybody wants to play Crysis (and I don't even know what that is without a quick peek at Wikipedia).
Re: (Score:2)
IIRC, AMD's Zacate Fusion chips have 80 stream processors, the same as the 4350 I normally develop for (because it is fanless and quiet). The GPU is connected to the CPU by a bus with (I think this is right) 2 ns latency; put that in your pipe and smoke it.
The writing is very clearly on the wall for integrated vs discrete.
Re: (Score:3, Insightful)
My back-of-the-envelope calculations tell me that the 9W version is at least as powerful as a low-end Nvidia 400-series or ATI 5000-series
My back-of-the-envelope memory tells me that all low-end 400-series and low-end 5000-series "graphics" are actually IGPs as well...
Round and round it goes. (Score:3)
Reduces the load on the motherboard (Score:2)
One of the good things about having the GPU integrated in the processor chip itself is you don't have to go through the bus, so this reduces latency and leaves more bandwidth for everything else.
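To put a rough number on what crossing the bus costs, here's a hedged little sketch in C using OpenCL (error checking and cleanup omitted; the buffer size and iteration count are made up for illustration) that times many tiny host-to-device round trips. On a discrete, bus-attached card this is dominated by per-transfer overhead rather than bandwidth:

#include <stdio.h>
#include <time.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    char data[64] = {0};  /* tiny payload: we're measuring latency, not bandwidth */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE, sizeof(data), NULL, NULL);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < 1000; i++) {
        /* Blocking write then blocking read = one full host<->device round trip. */
        clEnqueueWriteBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double total_ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
    printf("avg round trip: %.1f microseconds\n", total_ns / 1000.0 / 1000.0);
    return 0;
}

Run the same sketch against an integrated GPU and the gap between the two results gives a feel for how much of the cost is the bus versus driver overhead.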
Re: (Score:2)
And this affects grandma checking her email how? Now that computers are mainstream and smartphones nearly so, what difference does it make to the average user if his email loads in 2 seconds or 3?
Overheating already... (Score:3, Interesting)
...the CPU handling the graphics in laptops is already causing overheating issues.
Two cases in point: a Toshiba laptop with AMD and a 13" MacBook Pro with Intel. The fans run annoyingly at high speed and the bottoms are hot enough to fry eggs on. That's just sitting with one web page open. How long can one expect a machine like that to last? A year? Two, maybe?
Are web pages going to suddenly tone down their act and quit using video, animation, and Flash? Text and pictures only? If they do that, then what? Hardware makers only start making laptops that can handle web text?
Dedicated graphics is the way to go: CPU and graphics on separate dies away from each other, separating the heat sources.
I can just imagine the scene where a bunch of power-hungry types have just made the decision to move towards integrated graphics, and a highly intelligent engineer stomps out of the boardroom in protest.
Re: (Score:1)
Heh... Don't you think you just put your finger on the whole point? Computers are strong enough (most people really overestimate their needs and think they really do need that i7, when a Core 2 Duo would already have been overkill), and they have been for, I dare say, the last 6 years. I still use a machine I bought in January 2007, and it was one on sale, to get rid of it before the Vista release. So it was already bottom of the line back then.
Re: (Score:2)
I admire that you continue to use older hardware... if it works, then so much the better for the environment...
But I've noticed that I really appreciate the extra horsepower. Between work, non-work and play, having a fast processor, gobs of memory and great graphics is really nice. For work, I run a VMware virtual machine so that when I connect to the VPN I don't lose my local connectivity. Within that VM I need to run compiles, Lotus Bloats, Word, etc. For my non-work, I run some pretty hefty graphics and finance
Re: (Score:2)
Two cases in point, a Toshiba laptop with AMD and a 13" MacBook Pro with Intel, the fans run annoyingly at high speed, the bottoms are hot enough to fry eggs on. That's just sitting with one web page open.
This tells me a few things:
I don't think discrete graphics chips are going to solve any of these problems.
Re: (Score:2)
"Two cases in point, a Toshiba laptop with AMD and a 13" MacBook Pro with Intel, the fans run annoyingly at high speed, the bottoms are hot enough to fry eggs on. That's just sitting with one web page open. How long can one expect machine like that to last? A year? two maybe?"
This is an exaggeration... my 13" MacBook Pro doesn't get hot or have the fans turn on with a single web page open, nor does this happen while browsing the web or watching YouTube.
Re: (Score:2)
Re: (Score:2)
Toshiba AC100 Nvidia Tegra smartbook (Score:2)
This is on-topic-ish, isn't it? And it's already out there. (It probably hasn't "taken off", depending on the definition of that, though...)
I'd love to have a Toshiba AC100 smartbook with an Nvidia Tegra ARM CPU. Capable of HD output, with a 9-hour battery (on lighter usage, IIRC). About 800 grams. Runs Android, but an Ubuntu port is progressing, from what I can tell.
Re: (Score:3)
I have one. The main problem with it is that in Android there is a certain lack of applications I need (I can't seem to find a decent text editor / word processor, for one).
Under Linux, you get all the software (Pidgin, proper text editors with undo and stuff, GIMP and so on) and it's handy for playing with ARM ports of software, but the battery life is only about 3.5 hours. If you want to keep Android there, it has to run off the SD card, which is very slow, even with a class 10 card. I might try install
Re: (Score:1)
The main problem with it is that in Android there is a certain lack of applications I need (I can't seem to find a decent text editor / word processor, for one).
Good thing you can compile one yourself!
Re: (Score:2)
Good thing you can compile one yourself!
The NDK doesn't really allow access to the UI, and writing one from scratch in Java was not a prospect I really fancied. It was easier to just stick Linux on it.
I hadn't considered the possibility of Qt on Android, though; an Android build of Kate would be ideal. It wouldn't help with GIMP or Pidgin, mind.
Toshiba AC100 review in theregister.co.uk: 1/10 (Score:2)
http://www.reghardware.com/2010/11/03/review_netbook_toshiba_ac100/ [reghardware.com]
Verdict
The beautifully designed and executed hardware is very close to my ideal netbook, and it's hardly an exaggeration to say that I'm heart-broken by Toshiba's cocked-up Android implementation. The best one can hope for is a firmware rescue from the open source community, although I wonder if the product will stay around long enough in these tablet-obsessed times for that to happen.
The more things change... (Score:5, Insightful)
Way back near the dawn of time, Intel created the 8086, and its slightly less capable little brother, the 8088. And they were reasonable processors ... but although they were good at arithmetic, it was within tight constraints. Fractions were just too hard. Trigonometry sent the poor little souls into a spin. And so on.
And thus, the 8087 was born. It was able to carry the burden of floating point mathematical functions, thereby making things nice and fast for those few who were willing to pony up the cash for the chip.
Then out came the 80286 (let's forget about the 80186, it's not really all that relevant here). It was better at arithmetic than the 8086, but still couldn't handle floating point - so it had a friend, the 80287, that filled the same purpose for the 80286 as the 8087 did for the 8086 and 8088. (We'll blithely ignore Weitek's offerings here. They existed. They're not really germane to the discussion.)
Then the 80386. Much, much better at arithmetic than the 80286, but floating point was still an Achilles heel - so the 80387 came along for the ride.
And finally, the i486. By this stage, transistors had become small enough that Intel could integrate the FPU on die - so there was no i487. At least, not until they came out with the i486SX, which I'll blithely ignore. And so, an accelerator chip that was once hideously expensive and used only by a few who really needed it was integrated onto chips that everybody would buy.
Funnily enough, it was around the time that the i486 appeared that graphics accelerators came onto the scene - first for 2D (who remembers the Tseng Labs W32p?), and then for 3D. Expensive, used only by a few who could justify the cost ... is this starting to sound familiar to you?
So another cycle is beginning to complete, and more functionality that used to be discrete is now to be folded onto the CPU. I can't help but wonder ... what will be next?
Re: (Score:2)
Whatever needs to be (Score:2)
The whole reason graphics was separated out is it is so damn math intensive. In particular, it calls for a certain kind of parallel math that CPUs aren't, or perhaps more accurately weren't, very good at. Building a real general purpose CPU which could do good 3D was just not possible. Slowly it is becoming more possible. We are still a long way off, but approaching it.
Ultimately everything on one CPU is what we want. However it isn't possible for high end 3D, hence it gets put off to a separate combination
Re: (Score:2)
So another cycle is beginning to complete, and more functionality that used to be discrete is now to be folded onto the CPU. I can't help but wonder ... what will be next?
Next is the JSPU, a dedicated chip to speed up javascript performance. It's the future.
Re: (Score:2)
Whereas your post was the height of eloquence and supremely succinct. I actually prefer the OP. At least it's *worth* reading.
Re: (Score:2)
Re: (Score:2)
More importantly - who cares?
If a PC already has integrated motherboard sound, integrated motherboard Ethernet, integrated motherboard USB, integrated motherboard RAID, etc. then for the most part you won't *care* how the layout is arranged.
When you buy a laptop, you care about battery life, right? And you care about performance, right?
Double Obsolescence (Score:1)
As long as they cost less than half of what I'll spend to replace them
OR
Re: (Score:2)
To be fair, if you don't upgrade one or the other on a regular basis you'll likely end up getting both at the same time regardless. I'm not about to use a 2011 cpu with a 2005 gpu or vice versa.
Ivy Bridge will also have 1GB of graphics memory (Score:2)
It is very nice to see that competition is pushing the market to get better and better :)
Dedicated Sound Card (Score:2)
At one point most gamers had dedicated sound cards. Eventually the technology caught up and almost every gamer now uses integrated sound.
Graphics will eventually get that way. It won't be this year but that is the trend.
Re: (Score:2)
I'm excluding audiophiles. They'll pay extra because they think digital bits going over gold sound better than digital bits going over copper.
Re: (Score:1)
Nope, that was 1911.
Re: (Score:3)
It may not be the Year of Linux on the Desktop, but it WILL be the Year of Linux on your Mobile.
Re: (Score:2)
They also said 2011 would be the year of Linux on the desktop!
This is the year of Linux on my desktop, as was 2010 and every other year back to 1998. If it is not the year of Linux on the desktop for you then I am sorry for you, I really am.
Re: (Score:2)
Well, on the bright side, 2011 is going to be the year of Duke Nukem Forever, so there's hope for Linux on the Desktop! ;-)
Indeed, though many people may know it first as "Android on the Desktop".
Re: (Score:2)
They also said 2011 would be the year of Linux on the desktop!
They also said that Linux is still for faggots.
While your statement is true, it is a very narrow subset of who Linux is for. Linux is also for midgets, Asians, Christians, elves, drug traffickers, airline pilots, geeks, dentists, evil geniuses, Pixar, etc... I think you are trying to be inflammatory...
Re: (Score:2)
Optimus, or some other form of switchable graphics.
Re: (Score:2)
It's a problem, yes. Switchable graphics are just a workaround...
Re: (Score:2)
Yep. I often need a machine with powerful CPU but don't care about the graphics. It seems like NOBODY makes one. Laptops are a bummer because you can't build your own.
Re: (Score:2)
Yep. I often need a machine with powerful CPU but don't care about the graphics. It seems like NOBODY makes one. Laptops are a bummer because you can't build your own.
Really? Because you just described the entire apple PC lineup.
Also, if you can't find it, it sounds like you're not looking, or only looking at Best Buy and Future Shop. MSI has many models, Sony and HP do configure-to-order (CTO), and Asus has so many options I'm surprised they don't confuse customers out of a sale.
It really sounds like you are surprised to find that budget laptops don't come with premium features, like full HD 1080p screens. So hit up Dell and configure one.
Re: (Score:2)
The point being ... we want to pay less. Also, some of us want to run Windows.
"Configure to order" usually isn't as flexible as you might imagine. Go to those websites and try to get a machine with a good CPU and 'bad' graphics... or a machine with 8GB of RAM and 'bad' graphics (which I tried to do a couple of months ago). I don't need graphics; I've got a pile of graphics cards here and don't need to pay for another one. None of the sites I tried could do that, despite offering "configure it any way you want!"
Macs using integrated and discrete GPUs ... (Score:2)
"Energy-efficient graphics.
Thanks to the new microarchitecture, the graphics processor is on the same chip as the central processor and has direct access to L3 cache. That proximity translates into performance. The graphics processor also automatically increases clock speeds for higher workloads. An integrated vi
Re: (Score:2)
Something like the MacBook Pro where there is basic graphics integrated into the CPU
The parent said "would make it cheaper"
Re: (Score:2)
The Dell XPS 15" has a 1920x1080 option and decent graphics capabilities, with nVidia Optimus which apparently switches between low power and full graphics mode depending on your usage.
Not quite 1200, but for a 15" widescreen, I think any res over 1680x1050 is going to be equivalent, since you'll have to increase font sizes anyway (unless you have some very good eyesight or hunch really close to the screen).
Re: (Score:2)
nVidia Optimus
Doesn't work with Linux...
Re: (Score:2)
It is that OS that people use when they want something more than a toy or text editing.
Re: (Score:2)
Re: (Score:3)
So there will be more computers with crappy integrated graphics. Hopefully, it will still be possible to upgrade them with a decent graphics card.
Yes, and yes.
Oh, and btw, wasn't the plan until recently to basically replace the CPU with the GPU? I'm confused...
No. Graphics Processing Units make very poor Central Processing Units. GPUs work nicely to augment CPUs when doing specialised calculations (encryption, video encoding, physics, etc.) that would take the CPU a long time to do on its own, but there are no plans to replace CPUs with GPUs.
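To make "augment the CPU for specialised calculations" concrete, here's a deliberately trivial OpenCL vector-add sketch in C. It's illustrative only (the kernel name, sizes, and the lack of error checking are my own shortcuts), but it shows the one-work-item-per-element pattern that maps well to a GPU and badly to a handful of CPU cores:

#include <stdio.h>
#include <stdlib.h>
#include <CL/cl.h>

/* Kernel source: each work-item adds exactly one pair of elements. */
static const char *kSrc =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    const size_t n = 1 << 20;
    float *a = malloc(n * sizeof(float));
    float *b = malloc(n * sizeof(float));
    float *c = malloc(n * sizeof(float));
    for (size_t i = 0; i < n; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Build the kernel and copy the input arrays to device buffers. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, n * sizeof(float), a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, n * sizeof(float), b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, n * sizeof(float), NULL, NULL);

    /* Launch one work-item per element, then read the result back. */
    clSetKernelArg(k, 0, sizeof(cl_mem), &da);
    clSetKernelArg(k, 1, sizeof(cl_mem), &db);
    clSetKernelArg(k, 2, sizeof(cl_mem), &dc);
    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, n * sizeof(float), c, 0, NULL, NULL);

    printf("c[42] = %f\n", c[42]);  /* expect 126.0 */
    return 0;
}

The CPU still orchestrates everything here; the GPU just chews through the embarrassingly parallel part, which is exactly the division of labour described above.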
Re: (Score:2)
Better yet, if they followed their original plan, you'll be able to use both your crappy integrated GPU and your good plugged-in GPU at the same time, for solving the same problem.
Re: (Score:3, Informative)
And the advantage is...?
The advantage of shared-memory graphics is reduced cost and power consumption.
The advantage of integrating the memory controller into the CPU is that it allows the CPU faster access to memory.
The advantage of reducing the number of high-speed chips is reduced cost and power consumption.
So with that in mind, let's consider the options for a CPU with an integrated memory controller.
Putting the shared-memory graphics on a separate chip would require a link to the CPU that offered high-speed, high-priority RAM access by
Re: (Score:2)
It's just those integrated graphics are in the CPU rather than the northbridge.
You say this like it's such a tiny thing. Being that 'close' to the MMU and RAM, and CPU, has got to help things.
Even if the output still looks like crap, at least it will output said crap more efficiently.
Re: (Score:2)
Having a low power graphics chip generates more heat??
Re: (Score:1)
Having a low power graphics chip generates more heat??
It's only low power when compared to other discrete graphics solutions, not when compared to a bare CPU lacking any integrated GPU function. Yes, they are low power, but only in relative terms, not in absolute ones.
Re: (Score:2)
And I have an Atom N270-based netbook that, probably 90% of the time, is on passive cooling. It *has* a fan, but that fan is almost never on when all I'm doing is surfing the web, writing a document, or working in a spreadsheet.
Yes, they're going to increase power consumption a little, but you're forgetting how low the power and heat requirements are for these devices in the first place. If the fan has to run 20% of the time instead of 10%, it may shorten the battery life by 30 minutes total, over the cours