Graphics Hardware

Graphics-Enabled CPUs To Take Off In 2011

Posted by timothy
from the in-my-day-we-had-integrated-graphics dept.
angry tapir writes "Half the notebook computers and a growing number of desktops shipped in 2011 will run on graphics-enabled microprocessors, as chip designers Intel and Advanced Micro Devices (AMD) step up competition over parts that speed up multimedia without add-on cards. Processors with built-in graphics will be installed this year in 115 million notebooks - half of total shipments - and 63 million desktop PCs, or 45 percent of the total, according to analysts."

  • by Anonymous Coward on Friday March 18, 2011 @06:21AM (#35527688)

Way back near the dawn of time, Intel created the 8086, and its slightly less capable little brother, the 8088. And they were reasonable processors ... but although they were good at arithmetic, it was only within tight constraints. Fractions were just too hard. Trigonometry sent the poor little souls into a spin. And so on.

    And thus, the 8087 was born. It was able to carry the burden of floating point mathematical functions, thereby making things nice and fast for those few who were willing to pony up the cash for the chip.

    Then out came the 80286 (let's forget about the 80186, it's not really all that relevant here). It was better at arithmetic than the 8086, but still couldn't handle floating point - so it had a friend, the 80287, that filled the same purpose for the 80286 as the 8087 did for the 8086 and 8088. (We'll blithely ignore Weitek's offerings here. They existed. They're not really germane to the discussion.)

Then the 80386. Much, much better at arithmetic than the 80286, but floating point was still an Achilles' heel - so the 80387 came along for the ride.

    And finally, the i486. By this stage, transistors had become small enough that Intel could integrate the FPU on die - so there was no i487. At least, not until they came out with the i486SX, which I'll blithely ignore. And so, an accelerator chip that was once hideously expensive and used only by a few who really needed it was integrated onto chips that everybody would buy.

    Funnily enough, it was around the time that the i486 appeared that graphics accelerators came onto the scene - first for 2D (who remembers the Tseng Labs W32p?), and then for 3D. Expensive, used only by a few who could justify the cost ... is this starting to sound familiar to you?

    So another cycle is beginning to complete, and more functionality that used to be discrete is now to be folded onto the CPU. I can't help but wonder ... what will be next?
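
    To make the "folded onto the CPU" point concrete, here's a minimal C sketch - assuming a GCC or Clang toolchain on x86 - that asks the CPU itself whether the FPU made it on-die. CPUID leaf 1 reports an on-die x87 FPU in EDX bit 0, a flag that stopped being interesting precisely because of the i486:

        #include <stdio.h>
        #include <cpuid.h>   /* GCC/Clang wrapper around the x86 CPUID instruction */

        int main(void) {
            unsigned int eax, ebx, ecx, edx;

            /* CPUID leaf 1 returns feature flags; EDX bit 0 means "x87 FPU on-die". */
            if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
                puts("CPUID leaf 1 not supported");
                return 1;
            }
            puts((edx & 1) ? "x87 FPU is on-die - no 8087 socket required"
                           : "no on-die FPU - better shop for a coprocessor");
            return 0;
        }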

  • by Anonymous Coward on Friday March 18, 2011 @06:59AM (#35527866)

    When I read your post, I used the voice of Zapp Brannigan inside my head. You are a tedious blowhard who likes the sound of his own voice.

GPUs being integrated into CPUs mirrors the situation of FPUs being integrated into CPUs in the 90s.

    There you go, 17 words, point made.

    ABLAH BLAH BLAH what I assume is esoteric knowledge that makes me look clever ABLAH BLAH BLAH things that everyone who reads slashdot already knows ABLAH BLAH BLAH overuse of the phrase "blithely ignore" just to make sure everyone sees that I know some other irrelevant facts ABLAH BLAH BLAH trite observation that is the first thing to occur to anyone when hearing that GPUs are being integrated into CPUs BLAH BLAH FUCKING BLAH.

  • by Luckyo (1726890) on Friday March 18, 2011 @07:18AM (#35527940)

I can actually chip in on this on the "this is not true" side. My father isn't a gamer by any stretch - the only games he likes to play are various Arkanoid derivatives. Which meant his work laptop served him just fine.
Then came Shatter, and he all but killed me with his "why won't my laptop run this?" questions. Try explaining to someone whose crappy Intel 945GM always ran the old 2D Arkanoid clones that Shatter simply won't run on it.

So now I'm probably giving him my current gaming computer as I upgrade, and I'm pretty sure he'll be telling tech support at work that his next laptop had better include 3D acceleration or else (he's in a position to be able to tell them that). The old saying applies here - you'll be satisfied with integrated graphics until along comes one killer application it won't run, and then you aren't. Problem is, with so much software requiring decent 3D graphics (even Aero does!), you're still best served by a half-decent dedicated graphics card that powers itself down when 3D features aren't used, or are used sparingly.

Finally, there's the issue of quality, which goes beyond 3D. Most integrated chipsets have clear problems driving higher resolutions, which is why high-resolution laptops generally ship with a dedicated graphics chip rather than an integrated solution.

First of all, where would this be FUD? Try connecting a full-HD monitor to an integrated Intel GPU and you'll see what I meant.

    Also, this bullshit that users don't do computationally intensive stuff is, well, bullshit. Full-HD video, 3D movies, and photo processing are computationally intensive even if they aren't particularly serious uses of computing power. Don't confuse "important work" with "computationally intensive work".
  • by Auroch (1403671) on Friday March 18, 2011 @08:14AM (#35528230)

My back-of-the-envelope calculations tell me that the 9W version is at least as powerful as a low-end Nvidia 400-series or ATI 5000-series

My back-of-the-envelope memory tells me that all low-end 400-series and low-end 5000-series "graphics" are actually IGPs as well...

"More software projects have gone awry for lack of calendar time than for all other causes combined." -- Fred Brooks, Jr., _The Mythical Man Month_

Working...