
AMD's Fusion CPU + GPU Will Ship This Year

mr_sifter writes "Intel might have beaten AMD to the punch with a CPU featuring a built-in GPU, but it relied on a relatively crude process of simply packaging two separate dies together. AMD's long-discussed Fusion product integrates the two key components into one die, and the company is confident it will be out this year — earlier than had been expected."
  • by Dragoniz3r ( 992309 ) on Saturday May 15, 2010 @11:48PM (#32224476)
    It doesn't really matter, any more than AMD's "proper" quad core mattered more than Intel pasting two dual-core dies together. This is really just AMD getting beaten to the punch again, and having to try to spin it in some positive way. It's great news that it will be out earlier than expected, but I think they would have been better off taking the less "beautiful" route and just throwing discrete dies into a single package, particularly as it has yet to be seen how big the market for this sort of thing is. More exciting to me is that AMD is ahead of schedule with this, so hopefully they'll be similarly ahead with their next architecture. I'm yearning for the day when AMD is back to being competitive on a clock-for-clock basis with Intel.
  • by Anonymous Coward on Saturday May 15, 2010 @11:56PM (#32224534)

    Sure Intel got there first and sure Intel has been beating AMD on the CPU side, but...

    Intel graphics are shit. Absolute shit. AMD graphics are top notch on a discrete card and still much better than Intel on the low end.

    Maybe you should compare the component being integrated instead of the one that already gives most users more than they need.

  • I would say it will matter, or at least it might; you can't really write it off until you've seen it in the wild. AMD's more elegant initial dual-core solution was infinitely better than Intel's "let's slap two space heaters together and hope for the best" approach.
  • by WrongSizeGlass ( 838941 ) on Sunday May 16, 2010 @12:03AM (#32224590)

    Even faster than current generation discrete GPUs? I think not.

    They'll move data inside the chip instead of having to send it off over an external bus, they'll have access to L2 cache (and maybe even L1 cache), they'll be running in lock-step with the CPU, etc., etc. These are distinct advantages over discrete video cards.

  • by rastoboy29 ( 807168 ) on Sunday May 16, 2010 @12:28AM (#32224716) Homepage
    I hope so; Intel is far too dominant right now.
  • by cyssero ( 1554429 ) on Sunday May 16, 2010 @12:39AM (#32224776)
    Should it be any accomplishment that a game released in November 2004 works on a latest-gen system? For that matter, my Radeon 9100 IGP (integrated) ran HL-2 'fine' back in 2004.
  • by TubeSteak ( 669689 ) on Sunday May 16, 2010 @01:28AM (#32225064) Journal

    Intel graphics are only shit for gamers who want maximum settings for recent games.

    Having the "best" integrated graphics is like having the "best" lame horse.
    Yeah, it's an achievement, but you still have a lame horse and everyone else has a car.

  • by evilviper ( 135110 ) on Sunday May 16, 2010 @01:31AM (#32225072) Journal

    This is really just AMD getting beaten to the punch again, and having to try to spin it in some positive way.

    I'll have to call you an idiot for falling for Intel's marketing and believing that, just because they can legally call it by the same name, it remotely resembles what AMD is doing.

  • by BiggerIsBetter ( 682164 ) on Sunday May 16, 2010 @02:02AM (#32225208)

    Call me when they can fit 9 inches of graphics card into one of these CPUs.

    Size isn't everything!

  • by sznupi ( 719324 ) on Sunday May 16, 2010 @02:43AM (#32225418) Homepage

    What are you talking about? On good current integrated graphics, many recent games work quite well; it's mostly the "flagship", bling-oriented titles that have issues.

    "Lean car -> SUV" probably rings closer to home...

  • by FishTankX ( 1539069 ) on Sunday May 16, 2010 @02:56AM (#32225462)

    The graphics core will likely be small, add an inconsequential number of transistors, be disable-able, and/or run in CrossFire with the main graphics card.

    However, the place I see this getting HUGE gains is if the on-board GPU is capable of doing physics calculations. Having a basic physics coprocessor on every AMD CPU flooding out of the gates will do massive good for the implementation of physics in games, and it could probably offload a lot of other calculations in the OS. On-board video encode acceleration, anyone?

    Just having a dedicated, super-wide, parallel-optimized floating point monster on the die for relatively little price penalty seems like an excellent idea to me (the kind of kernel involved is sketched below).
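
    A minimal sketch of the kind of kernel this implies, using CUDA syntax purely for illustration (Fusion itself would be programmed through AMD's Stream/OpenCL stack, and every name below is hypothetical):

        // Toy physics integration step: a couple of multiply-adds per
        // element, trivially parallel across thousands of threads --
        // exactly the wide floating-point work an on-die GPU could soak up.
        #include <cuda_runtime.h>

        __global__ void integrate(float* pos, float* vel, const float* acc,
                                  float dt, int n)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) {
                vel[i] += acc[i] * dt;  // advance velocity by one time step
                pos[i] += vel[i] * dt;  // advance position by one time step
            }
        }

        // Launch enough 256-thread blocks to cover all n particles.
        void step(float* d_pos, float* d_vel, const float* d_acc,
                  float dt, int n)
        {
            int blocks = (n + 255) / 256;
            integrate<<<blocks, 256>>>(d_pos, d_vel, d_acc, dt, n);
        }

    Per element the math is nearly free; the open question the parent raises is whether feeding such a unit from the CPU side gets cheap enough for games and the OS to bother.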

  • by BikeHelmet ( 1437881 ) on Sunday May 16, 2010 @03:14AM (#32225522) Journal

    Lower power consumption, making AMD chips more competitive in notebooks - perhaps even netbooks.

  • by bemymonkey ( 1244086 ) on Sunday May 16, 2010 @05:36AM (#32226136)

    Why are Intel graphics shit? They run cool, use very little power and have sufficient grunt for anything a typical non-gamer (maybe CAD and GPU-accelerated Photoshop aside) will throw at them...

    Not being able to run games does not make an integrated GPU shit...

  • by MemoryDragon ( 544441 ) on Sunday May 16, 2010 @05:49AM (#32226178)

    Except that Intel has yet to deliver an integrated graphics solution that deserves the name. AMD has the advantage that they can bundle an ATI core into their CPUs, which finally means decent integrated graphics.

  • by Joce640k ( 829181 ) on Sunday May 16, 2010 @08:23AM (#32226714) Homepage

    How come Intel sells more GPUs than ATI and NVIDIA combined?

    Because they sell them to people who've moved out of their parents' basement...

  • I’m sorry, but I still support everyone who does things properly instead of “quick and dirty”.
    What Intel did, is the hardware equivalent of spaghetti coding.
    They might be “first”, but it will bite them in the ass later.
    Reminds one of those “FIRST” trolls, doesn’t it?

  • The Diff (Score:4, Insightful)

    by fast turtle ( 1118037 ) on Sunday May 16, 2010 @09:27AM (#32227008) Journal

    There are two sides to this coin, and Intel's is pretty neat. By not having the GPU integrated into the CPU die, Intel can improve the CPU or GPU without having to redesign the entire chip. For example, any power-management improvement can be moved into the design as soon as it's ready. Another advantage for them is that the CPU and GPU dies are actually independent and can be manufactured using whatever process makes the most sense for each.

    AMD's design offers a major boost to overall CPU performance simply because the integration is far deeper than Intel's. From what I've read, Fusion ties the stream processors (FPU) directly to a CPU core and should offer a major boost in all math ops, and I expect it will finally compete with Intel's latest CPUs in regard to FPU operations.

  • by alvinrod ( 889928 ) on Sunday May 16, 2010 @09:40AM (#32227090)
    And if Moore's law continues to hold, within the next four years it won't be an issue to put both of those chips on the same die. Hell, that may even be the budget option.
  • by haruchai ( 17472 ) on Sunday May 16, 2010 @10:01AM (#32227182)

    and, therefore, can't afford anything better than the graphics equivalent of Mac'n'Cheese now
    that Mom and Dad are no longer paying the bills.

  • by Skaven04 ( 449705 ) on Sunday May 16, 2010 @10:48AM (#32227446) Homepage

    You've got to stop thinking of it as a GPU and think of it more like a co-processor.

    First of all, AMD isn't going to force you to buy a built-in GPU on all of their processors. Obviously the enthusiast market is going to want huge 300W discrete graphics rather than the 10-15W integrated ones. There will continue to be discrete CPUs, just like there will always continue to be discrete GPUs.

    But this is a brilliant move on AMD's part. They start with a chunk of the market that is already willing to accept this: system builders, motherboard makers and OEMs will be thrilled to be able to build even smaller, simpler, more power efficient systems for the low end. This technology will make laptops and netbooks more powerful and have better battery life by using less energy for the graphics component.

    Now look further ahead, when AMD begins removing some of the barriers that currently make programming the GPU for general-purpose operations (GPGPU) such a pain. For example, right now you have to go through a driver in the OS and copy input data over the PCI bus into the frame buffer, do the processing on the GPU, then copy the results back over the PCI bus into RAM. For a lot of things, this is simply too much overhead for the GPU to be much help (the copy-in/compute/copy-out round trip is sketched in code at the end of this comment).

    But AMD can change that by establishing a standard for incorporating a GPU into the CPU. Eventually, imagine an AMD CPU that has the GPU integrated so tightly with the CPU that the CPU and GPU share a cache-coherent view of the main system memory, and even share a massive L3 cache. What if the GPU can use the same x86 virtual addresses that the CPU does? Then...all we have to have is a compiler option that enables the use of the GPU, and even tiny operations can be accelerated by the built-in GPU.

    In this future world, there's still a place for discrete graphics -- that's not going away for your gaming rig. But imagine the potential of having a TFLOP-scale coprocessor as a fundamental part of a future sub-50W CPU. Your laptop would be able to do things like real-time video stabilization, transcoding, physics modeling, and image processing, all without breaking the bank (or the power budget).

    But before we can get to this place, AMD has to start somewhere. The first step is proving that a GPU can coexist with a CPU on the same silicon, and that such an arrangement can be built and sold at a profit. The rest is just evolution.
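
    To make that overhead concrete, here is a hedged sketch of both models, using CUDA only as a stand-in (the OpenCL path on AMD hardware is analogous, and the managed-memory call below merely illustrates the shared-address-space idea, not anything AMD has announced for Fusion):

        #include <cuda_runtime.h>

        __global__ void scale(float* data, float factor, int n)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) data[i] *= factor;
        }

        // Today's discrete model: three explicit hops over the PCI bus.
        void scale_discrete(float* host, int n)
        {
            float* dev = 0;
            size_t bytes = n * sizeof(float);
            cudaMalloc((void**)&dev, bytes);
            cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice); // copy in
            scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);        // compute
            cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost); // copy out
            cudaFree(dev);
        }

        // The integrated future: one allocation both processors can touch,
        // no explicit copies. cudaMallocManaged here is only a stand-in for
        // a CPU and GPU sharing cache-coherent virtual memory on one die.
        void scale_shared(int n)
        {
            float* data = 0;
            cudaMallocManaged(&data, n * sizeof(float));
            for (int i = 0; i < n; ++i) data[i] = (float)i; // CPU writes
            scale<<<(n + 255) / 256, 256>>>(data, 2.0f, n); // GPU scales
            cudaDeviceSynchronize(); // wait before the CPU reads results
            cudaFree(data);
        }

    In the first function the two cudaMemcpy calls are pure overhead; in the second they simply disappear, which is the whole argument for putting the GPU on the die.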

  • by Thagg ( 9904 ) <thadbeier@gmail.com> on Sunday May 16, 2010 @11:47AM (#32227786) Journal

    If AMD puts a competitive GPU onto the CPU die (comparable to their current high-end graphics boards), then this is a really big deal. Perhaps the biggest issue with GPGPU programming is that the graphics unit sits at the end of a fairly narrow pipe with limited memory, and getting data to the board and back is both a performance bottleneck and a pain in the butt for a programmer (some rough arithmetic on that bottleneck follows below).

    Putting the GPU on the die could mean massive bandwidth from the CPU to the hundreds of streaming processors on the GPU. It also strongly implies that the GPU will have access directly to the same memory as the CPU. Finally, it would mean that if you have a Fusion-based renderfarm then you have GPUs on the renderfarm.

    This is exciting!
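
    Some rough, hedged arithmetic behind that narrow pipe, using ballpark 2010 figures (roughly 8 GB/s effective over PCIe and roughly 1 TFLOP/s peak single precision on a high-end GPU; both numbers are assumptions, not specs):

        #include <cstdio>

        int main()
        {
            const double bus_bytes_per_s = 8e9;      // assumed PCIe throughput
            const double gpu_flops       = 1e12;     // assumed GPU peak
            const int    n               = 1 << 24;  // 16M floats
            const double bytes           = 2.0 * n * 4;  // copy in + copy out
            const double transfer_s      = bytes / bus_bytes_per_s;

            const int work[] = {2, 20, 200, 2000};   // flops per element
            for (int f : work) {
                double compute_s = (double)n * f / gpu_flops;
                printf("%4d flops/elem: transfer %7.3f ms, compute %7.3f ms\n",
                       f, transfer_s * 1e3, compute_s * 1e3);
            }
            return 0;
        }

    With those assumptions, the round trip over the bus costs about 17 ms, and the compute time doesn't catch up until you do on the order of a thousand operations per float moved; for anything less arithmetic-heavy, the pipe dominates. On-die integration attacks exactly that term.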

  • Re:Sup dawg (Score:1, Insightful)

    by Anonymous Coward on Sunday May 16, 2010 @01:35PM (#32228464)

    Don't worry, external GPUs will be all the rage in 2020. A decade from now, we'll be drooling over how much better a dedicated graphics chip could perform, having completely forgotten that we've played this hokey-pokey about a half-dozen times before.
