AMD Graphics Hardware

AMD Demos Llano Fusion APU, Radeon 6800 Series

Posted by timothy
from the onward-ever-onward dept.
MojoKid writes "At a press event for the impending launch of AMD's new Radeon HD 6870 and HD 6850 series graphics cards, the company took the opportunity to provide an early look at the first fully functional samples of their upcoming 'Llano' processor, or APU (Accelerated Processing Unit). For those unfamiliar with Llano, it's a 32nm 'Fusion' product that integrates CPU, GPU, and Northbridge functions on a single die. The chip is a low-power derivative of the company's current Phenom II architecture fused with a GPU, and it will target a wide range of operating environments at speeds of 3GHz or higher. Test systems showed the integrated GPU had no trouble running Aliens vs. Predator at a moderate resolution with DirectX 11 features enabled. In terms of the Radeon 6800 series, board shots have been unveiled today, as well as scenes from AMD's upcoming tech demo, Mecha Warrior, showcasing the new graphics technology and advanced effects from the open source Bullet Physics library."


  • by hairyfeet (841228) <bassbeast1968&gmail,com> on Tuesday October 19, 2010 @08:53PM (#33955614) Journal

    I have to agree. I frankly loved how easy it was to tell what was what with AMD's naming conventions. With Intel it is hard to tell which chips had virtualization support and which didn't, and Nvidia had so many overlapping cards with numbers all over the place that I frankly can't tell you whether a 9600 GT beats a GT 210 or the other way around, but with AMD it was easy. On the CPU side it was Sempron (almost pointless now), followed by Athlon, Phenom, and Phenom II. Those were followed by an x and the number of cores, and of course faster is better, easy peasy. On the GPU side you had the x3xx for bargain basement and integrated, the x5xx for those who only cared about HTPC use or video acceleration (low mid), the x6xx for mid to high mid, and the x7xx and x8xx for the low high end to high end. Everyone had a niche, everyone had a price point, easy peasy.

    Hopefully by the time February rolls around they will have this straightened out, as I'll be replacing my HD4650 when I add a liquid cooler for my CPU, and I'd really hate to play "guess which card is right" again. Meh, I figure I'll get one at the $100 price point anyway, but it would be nice to tell whether the 5xxx series or the 6xxx series would be better at that price. While I like to play FPSes, my screen is only 1600x900 and I'm not into the Crysis ePeen graphics; as long as I have a good framerate I'm happy. Any suggestions? Oh, and please don't say Nvidia, as I don't buy their stuff after bumpgate and the way they turn my apt into a space heater. I'm also not happy with their disabling PhysX on machines with any AMD GPU; since mine is integrated, I sure as hell ain't paying for crippleware. So which would be the better buy in the $110 price range, the 5xxx or the 6xxx?

  • by PopeRatzo (965947) * on Tuesday October 19, 2010 @09:24PM (#33955854) Homepage Journal

    Hopefully by the time February rolls around they will have this straightened out, as I'll be replacing my HD4650

    I'm about ready to replace my 4650, too. I've got a new HD monitor coming and figure that's as good a time as any to up the graphic power, though I won't be going water-cooled.

    My problem with the numbering system is always the second digit. For example, is a 5830 better than a 5770 or 4870? Do I add up the 4 digits and compare the sums? Is the first digit the most important, or the second, or the third?
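For what it's worth, the common reading of those four-digit Radeon numbers is: first digit is the generation, second digit is the performance tier within that generation. A minimal sketch of that decoding, purely illustrative and not any official AMD rule (`decode` and `roughly_faster` are made-up helper names):

```python
def decode(model):
    """Split a 4-digit Radeon model number into (generation, tier).

    Assumes the first digit is the generation and the second digit
    is the performance tier, e.g. 5830 -> generation 5, tier 8.
    """
    s = str(model)
    return int(s[0]), int(s[1])

def roughly_faster(a, b):
    """Rule-of-thumb comparison: compare the tier digit first, then
    use the generation as a tie-breaker. A heuristic only; actual
    benchmarks can and do disagree."""
    gen_a, tier_a = decode(a)
    gen_b, tier_b = decode(b)
    return (tier_a, gen_a) > (tier_b, gen_b)

print(roughly_faster(5830, 5770))  # True: tier 8 beats tier 7
print(roughly_faster(5830, 4870))  # True: same tier, newer generation
```

So by this rule of thumb a 5830 edges out both a 5770 and a 4870, though only real benchmarks settle it for sure.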

    The way I usually end up deciding is by sorting all the cards at Newegg by price and seeing what card in the second-to-last series is best in the $100-130 range. Then, I go to the recommended requirements for the game I want to play (again, I wait until the prices drop on Steam, so I'm just now buying games that came out last Christmas) and see if the new video card meets the requirements.

    I might consider buying an nVidia card but the business with PhysX and their even more confusing model numbers puts me off.

  • by Ephemeriis (315124) on Tuesday October 19, 2010 @09:37PM (#33955930)

    I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be, since I stopped caring once any modern hardware could play Nethack or Solitaire.

    AvP is a relatively modern game. Came out in the last year or so. It isn't mind-shatteringly amazing, but it looks pretty decent.

    Traditionally, integrated graphics have done a lousy job with serious gaming on PCs. Basically any FPS has required a discrete 3D card.

    If Joe Sixpack can go out and buy an off-the-shelf machine at Best Buy and play a game on it without having to upgrade the hardware, it'll be a huge step in the right direction.

    But this chip doesn't look like it'll be replacing 3D cards for serious gamers anytime soon.

    Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?

    It's a desktop chip, so I can't imagine it'll do anything without a fan. Although the integrated graphics means that you wouldn't need a separate graphics card with its own fan. So it should be at least a little quieter.

  • by fast turtle (1118037) on Tuesday October 19, 2010 @09:55PM (#33956064) Journal

    What I suspect AMD has done is add tessellation units to the chip. This will be evident when running the Heaven benchmark with tessellation enabled. Keep in mind that tessellation is one of the key changes between DX10.1 and DX11, and as you stated, this is forward-looking. Sure, the chip may be a bit slower currently, but I suspect that when running something that depends heavily on tessellation, there won't be any slowdowns.

    The reason I'm aware of this is my Radeon 5650. It's a DX11 card with 512MB onboard, and when running the Heaven test there's a big visual improvement with tessellation on, even though the card struggles and drops to between 4-12 frames per second. With tessellation off, the card easily handles the test at a playable 45-60 frames per second.
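For anyone wondering what tessellation actually does: the basic operation is subdividing each triangle into smaller ones so the hardware can add geometric detail. A minimal sketch of one uniform subdivision step in plain Python (purely illustrative; GPU tessellators operate on patches in fixed-function hardware, not like this):

```python
def midpoint(p, q):
    """Midpoint of two 3D points."""
    return tuple((a + b) / 2.0 for a, b in zip(p, q))

def tessellate(tri):
    """One level of uniform tessellation: split a triangle into
    four by connecting the midpoints of its edges."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

tri = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
tris = [tri]
for _ in range(3):  # three subdivision levels
    tris = [t for parent in tris for t in tessellate(parent)]
print(len(tris))  # 64 triangles from 1
```

Triangle counts grow by 4x per level, which is why enabling tessellation can drop a mid-range card from 45-60 fps to single digits.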

  • by tyrione (134248) on Tuesday October 19, 2010 @11:24PM (#33956630) Homepage

    I doubt it. Switching to AMD (especially for only part of their line) seems like it would have a lot of ancillary costs such as the R&D help I know Intel has given Apple. Apple stuck by Intel for years through their abysmal "GPUs" (I've got one, along with an nVidia, in my MacBook Pro). Intel's latest round of integrated GPUs is actually supposed to be pretty good, to the point that on lower end computers (like MacBooks) it may not be necessary to include even a low-end GPU.

    Also, don't forget that right now AMD has the Phenom, which is a good chip, and Intel has their current Core line, which is an amazing line of chips. Going to AMD means sacrificing performance per watt on the CPU side.

    Two years ago maybe it would have mattered. Today? Too little too late.

    Being a former NeXT and Apple engineer, I can tell you unequivocally that your claim is bullshit. Intel gave NeXT practically zero information for the NeXTStep port to Intel. Apple designs around Intel specs, and Intel helps as it would any other OEM. No special treatment.

  • by Klinky (636952) on Wednesday October 20, 2010 @02:58AM (#33957806)

    First, this can save money: integrating the GPU into the CPU gives an OEM a lower-cost part than using two chips in separate packages. Second, this is a fusion of x86 and GPGPU/OpenCL. Once a critical mass of CPUs have an integrated GPU, you will probably see GPGPU tech really start to get integrated into programs that can take advantage of it. Suddenly your low-end budget-box CPU can encode and decode multiple HD streams from your camera or apply special effects in real time, and your games can use the integrated GPU for physics, or at least get a playable framerate compared to some of the other crap IGPs. Image/video/audio encoding, decoding, and editing, gaming, compression, and encryption can all benefit from GPGPU. This is basically the start of setting a floor for GPU capabilities. The question is, will Intel and nVidia play along and implement quality OpenCL on their GPUs? I think nVidia will probably have to at some point, but Intel might be a holdout; anything not x86 that performs general-purpose computation probably looks like a threat to them.

  • by tibman (623933) on Wednesday October 20, 2010 @10:16AM (#33960652) Homepage

    I bought and use that exact water cooler on an AMD 965 (Phenom II X4 3.4GHz Black Edition). It works great and I highly recommend it. My only advice for anyone is to make sure your side panel doesn't have fans or protrusions in the back near your 120mm exhaust port. My case has a 180mm side fan that prevented the radiator (sandwiched between two 120mm fans) from being mounted inside the case. I Dremeled out a slot so the coolant tubes could pass through the back (it's a closed coolant system, so you can't just Dremel holes and thread the tubes through). Right now there is a 120mm fan inside, then the case wall, then the radiator outside, then another 120mm fan. It's extremely quiet and I really enjoy it.

    My case, if anyone is interested: http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=4034179&CatId=32 [tigerdirect.com]

  • by PopeRatzo (965947) * on Wednesday October 20, 2010 @05:38PM (#33966836) Homepage Journal

    I have a stupid problem with that very case. I used it with a GA55-UD3P motherboard and the connector to the audio jacks was on a wire that was about 1.5 inches too short to connect to the onboard audio.

    Do you know if you can buy extension cords for those little connectors? I'd hate to not be able to use the headphone jack because the wire inside the case is too short. (Note: I am not competent with a soldering iron)
