
AMD's Fusion Processor Combines CPU and GPU

ElectricSteve writes "At Computex 2010 AMD gave the first public demonstration of its Fusion processor, which combines the Central Processing Unit (CPU) and Graphics Processing Unit (GPU) on a single chip. The AMD Fusion family of Accelerated Processing Units not only adds another acronym to the computer lexicon, but ushers in what AMD says is a significant shift in processor architecture and capabilities. Many of the improvements stem from eliminating the chip-to-chip linkage that adds latency to memory operations and consumes power; moving electrons across a chip takes less energy than moving these same electrons between two chips. The co-location of all key elements on one chip also allows a holistic approach to power management of the APU. Various parts of the chip can be powered up or down depending on workloads."
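For a rough sense of the scale involved (illustrative numbers, not AMD's figures): the dynamic energy spent toggling a signal line goes as the line's capacitance times the square of the voltage swing, and an off-package PCB trace presents far more capacitance than an on-die wire.

\begin{align*}
E_{\text{cycle}} &\approx C\,V^2 \quad\text{(energy drawn per full charge/discharge of the line)} \\
E_{\text{off-chip}} &\approx (10\,\mathrm{pF})(1.5\,\mathrm{V})^2 \approx 22\,\mathrm{pJ} \\
E_{\text{on-die}} &\approx (50\,\mathrm{fF})(1.0\,\mathrm{V})^2 = 50\,\mathrm{fJ}
\end{align*}

That is a difference of a few hundred times per bit toggled, which is where the power (and latency) savings of dropping the chip-to-chip link come from.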

  • Moving electrons (Score:4, Informative)

    by jibjibjib ( 889679 ) on Friday June 04, 2010 @05:26AM (#32455992) Journal
    "Moving electrons between two chips" isn't entirely accurate. What moves is a wave of electric potential; the electrons themselves don't actually move very far.
  • by odie_q ( 130040 ) on Friday June 04, 2010 @05:54AM (#32456140)

    The technical difference is that while your Core i3 has its GPU as a separate die in the same package, AMD's Fusion has the GPU(s) on the same die as the CPU(s). The Intel approach makes for shorter and faster interconnects; the AMD approach removes the interconnects entirely. The main advantage is probably (as the summary alludes to) related to power consumption.

  • Re:vs Larrabee (Score:4, Informative)

    by Cyberax ( 705495 ) on Friday June 04, 2010 @06:48AM (#32456368)

    How so?

    AMD's offering is real, and it uses a genuinely performant GPU, not a GMA joke. Larrabee is still vapourware, and it will be for a long time.

  • by TheThiefMaster ( 992038 ) on Friday June 04, 2010 @07:17AM (#32456476)

    Well sure YOU DO, but your Gran still has a 5200 with "Turbo memory" (actually that's only 3 years old, she probably has worse).

    What year are you living in?
    1: TurboCache didn't exist until the 6100.
    2: The 5200 is seven years old.
    3: You can apparently still buy them: eBuyer Link [ebuyer.com]

  • Re:vs Larrabee (Score:5, Informative)

    by purpledinoz ( 573045 ) on Friday June 04, 2010 @07:33AM (#32456538)
    Also, think of what this means for laptops. First, you save a huge amount of space by not needing a separate GPU chip on the board. Have you seen how crammed the mainboard is in a MacBook? And with the significant improvements in power consumption, it's a win-win for the laptop market.
  • Re:vs Larrabee (Score:2, Informative)

    by hattig ( 47930 ) on Friday June 04, 2010 @07:52AM (#32456630) Journal

    Intel's on-chip GPU is just that - a GPU, and a primitive one at that. It can't even do OpenCL. It's certainly not a competitor to anything that AMD will release. Never mind Intel's appalling graphics drivers (and consistent history of poor driver releases) and benchmark cheating (so that they look competitive in reviews).
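    For the curious: whether a GPU is exposed as an OpenCL device at all is easy to check from the host side. A minimal sketch using the standard OpenCL 1.x C API (nothing vendor-specific; link against the OpenCL ICD loader, e.g. -lOpenCL):

    /* opencl_probe.c: list any GPU OpenCL devices on each installed platform. */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint nplat = 0;
        if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
            printf("No OpenCL platforms found.\n");
            return 1;
        }
        for (cl_uint p = 0; p < nplat; ++p) {
            cl_device_id devs[8];
            cl_uint ndev = 0;
            /* Ask specifically for GPU devices; an error here usually just means "none". */
            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devs, &ndev) != CL_SUCCESS)
                continue;
            for (cl_uint d = 0; d < ndev; ++d) {
                char name[256];
                clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
                printf("GPU OpenCL device: %s\n", name);
            }
        }
        return 0;
    }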

  • Re:Relevant? (Score:5, Informative)

    by Joce640k ( 829181 ) on Friday June 04, 2010 @07:53AM (#32456644) Homepage

    AMD designed and implemented the 64-bit instruction set that will be running our desktop PCs for decades to come.

    Intel was the one scrambling to catch up [wikipedia.org] on that.

  • Re:heat (Score:2, Informative)

    by hattig ( 47930 ) on Friday June 04, 2010 @07:55AM (#32456658) Journal

    This demo was of Ontario, AMD's low-power solution for netbooks and low-end notebooks. It will use the low-power Bobcat cores and probably something similar to an HD5450 on the graphics side.

    I seriously doubt heat is going to be an issue.

  • Re:vs Larrabee (Score:3, Informative)

    by Soul-Burn666 ( 574119 ) on Friday June 04, 2010 @08:30AM (#32456898) Journal

    It should be plenty good for space and power consumption. Just look at Intel's US15W chipset, which includes the GMA500 IGP.
    It's tiny and consumes 2W, compared to the previous-gen chipset + GPU setup (GMA950) that consumes 15W, lengthening battery life by a huge margin (a rough estimate of what that buys is sketched below).
    The chip itself has good performance, hindered only by terrible outsourced drivers (Tungsten, I'm looking at you), which are currently only optimized for video decoding (how about two smooth 1080p streams at less than 100% CPU usage using EVR in MPC?).

    Combining the CPU and GPU can probably give a comparable reduction in power consumption and size, with a properly supported AMD/ATI graphics core instead of a PowerVR core saddled with terrible Tungsten drivers.
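    Rough arithmetic on what that 13W saving buys (illustrative figures: a ~40Wh netbook battery and ~8W for the rest of the platform; real numbers vary):

    \begin{align*}
    t_{\text{GMA950 setup}} &\approx \frac{40\,\mathrm{Wh}}{8\,\mathrm{W} + 15\,\mathrm{W}} \approx 1.7\,\mathrm{h} \\
    t_{\text{US15W}} &\approx \frac{40\,\mathrm{Wh}}{8\,\mathrm{W} + 2\,\mathrm{W}} = 4.0\,\mathrm{h}
    \end{align*}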

  • by Skowronek ( 795408 ) <skylarkNO@SPAMunaligned.org> on Friday June 04, 2010 @08:36AM (#32456944) Homepage

    The documentation needed to write 3D graphics drivers has been consistently released by ATI/AMD since the R5xx. In fact, yesterday I was setting up a new system with an RV730 graphics card, which was both correctly detected and correctly used by the open source drivers. Ever since AMD started supporting the open source DRI project with money, specifications, and access to hardware developers, things have improved vastly. I know some of the developers personally; they are smart, and I believe that given this support they will produce an excellent driver.

    It's sad to see that with Poulsbo Intel did quite an about-face, and stopped supporting open source drivers altogether. The less said about nVidia the better.

    In conclusion, seeing who is making this Fusion chip, I would have high hopes for open source on it.

  • Re:Meh. (Score:5, Informative)

    by CAIMLAS ( 41445 ) on Friday June 04, 2010 @10:33AM (#32458064)

    People who don't know better seem to skimp on the power supplies more than anything else.

    I can understand cheap boards; they'll (usually) last the useful life of the system provided they're not really crappy. But the power supply is essential: it's the heart of the system.

    If it doesn't deliver your electricity properly (at the correct voltages and currents), the system's brain and various peripherals will die a slow death. Sometimes it is not so slow.

    Invest in a decent power supply: it's worth it. It's probably the only part of a typical user's computer I'd consider an investment, too, because it is an insurance policy (of sorts) on the other parts. Buying a cheap power supply so you can get a UPS is backwards: your components are still going to be getting crap power if the PSU is crap.

    I've had a total of one power supply failure, two disk failures, and zero peripheral/RAM/CPU/motherboard failures in the 12 years I've been buying my own parts to build systems.

    The current PSU I've got in my main home computer is a Seasonic something-or-other (they and Antec, I've found, are very good). I'm amazed at how good this converter is: yes, it's got PFC and all those bells, which certainly help, but it delivers amazingly consistent power, evening out the voltage nicely. Hell, we had the power go out for long enough to stop the motor in the washing machine, send my wife's laptop to battery, kill the lights, and make my LCD lose power, and the computer didn't turn off (and no, I'm not currently using a UPS). This little power supply stores enough energy for a full second or so of operation while playing a CPU- and graphics-intensive game (a back-of-the-envelope estimate is at the end of this comment).

    So yeah, paying $70 or more for a PSU does not seem unreasonable in the least. With PSUs you're paying more for quality than for advertised performance, so throw down the cash.
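    On the "full second or so" of ride-through above, a quick sanity check (illustrative values; real bulk capacitance, the usable voltage window, and the load all vary, and the ATX spec only requires on the order of 17 ms of hold-up at full load):

    \begin{align*}
    E_{\text{usable}} &= \tfrac{1}{2}\,C\,(V_1^2 - V_2^2)
        = \tfrac{1}{2}(470\,\mu\mathrm{F})\left[(400\,\mathrm{V})^2 - (300\,\mathrm{V})^2\right] \approx 16\,\mathrm{J} \\
    t_{\text{hold-up}} &\approx \frac{E_{\text{usable}}}{P_{\text{load}}} = \frac{16\,\mathrm{J}}{150\,\mathrm{W}} \approx 0.11\,\mathrm{s}
    \end{align*}

    So a second of ride-through at gaming load implies either a lighter load at that moment than it looked, or an unusually large bulk capacitor bank.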

  • Re:Moving electrons (Score:3, Informative)

    by stewbee ( 1019450 ) on Friday June 04, 2010 @10:56AM (#32458432)
    Correct, I was referring to power consumption. At the die level you can make a buffer from a simple transistor/FET, so the added time delay would be pretty small. The noise I was referring to in getting from A to B is mostly an EMI sort of issue. A trace that runs from chip to chip, depending on how long it is, is susceptible to picking up EM radiation from other sources, to the point where the EMI can corrupt the received signal and give the wrong value (a p(0|1) or p(1|0) condition). Various other mechanisms cause this kind of error as well, such as crosstalk, ground bounce, and mismatched line impedance. To reduce some of these effects you can increase the line voltage, which increases the noise margin at the receiver (see the sketch below). As an unwanted side effect, giving the line a larger voltage swing (and usually current swing) makes it a radiator in its own right and adds to the noise environment.

    I hope I understood what you were asking.
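    To put numbers on the noise-margin point (standard 3.3V LVTTL thresholds, as an illustration):

    \begin{align*}
    NM_H &= V_{OH(\min)} - V_{IH(\min)} = 2.4\,\mathrm{V} - 2.0\,\mathrm{V} = 0.4\,\mathrm{V} \\
    NM_L &= V_{IL(\max)} - V_{OL(\max)} = 0.8\,\mathrm{V} - 0.4\,\mathrm{V} = 0.4\,\mathrm{V}
    \end{align*}

    Coupled noise smaller than these margins cannot flip a received bit; a larger voltage swing widens them, at the cost of more switching energy and more radiated emissions, which is exactly the trade-off described above.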
  • Re:Meh. (Score:2, Informative)

    by tippe ( 1136385 ) on Friday June 04, 2010 @11:06AM (#32458602)

    Taken as a whole, GPU+CPU is simpler and more robust than two separate components connected via an external bus. It does away with connectors, bus drivers (you need something to drive those signals across connectors and inches of trace), level shifters (external buses don't operate at the same voltage as core silicon), bridges (external buses are often shared by multiple devices) and all of the complexity, signal integrity issues and points of failure that these things introduce. GPU+CPU on one die means that P&R, timing closure, functional sims, gate sims, power sims, validation, QA, etc. were all performed on them together, making the overall system much more robust. In the separate GPU/CPU case, each device is designed, built and validated separately (possibly by different companies) and then "wired together" (motherboard PCB + graphics card PCB) by one or more other companies (which always seem to be trying to find ways to undercut each other and to make their products cheaper and more fragile). As I see it, GPU+CPU on a single monolithic die, operating in a single voltage domain, will be a lot more robust than separate components could ever be (after any initial "kinks" in design and manufacturing are sorted out).
