AMD Demos Llano Fusion APU, Radeon 6800 Series

MojoKid writes "At a press event for the impending launch of AMD's new Radeon HD 6870 and HD 6850 series graphics cards, the company took the opportunity to provide an early look at the first fully functional samples of their upcoming 'Llano' processor, or APU (Applications Processor Unit). For those unfamiliar with Llano, it's a 32nm 'Fusion' product that integrates CPU, GPU, and Northbridge functions on a single die. The chip is a low-power derivative of the company's current Phenom II architecture, fused with a GPU, that will target a wide range of operating environments at speeds of 3GHz or higher. Test systems showed the integrated GPU had no trouble running Alien vs. Predator at a moderate resolution with DirectX 11 features enabled. As for the Radeon 6800 series, board shots were unveiled today, as well as scenes from AMD's upcoming tech demo, Mecha Warrior, showcasing the new graphics technology and advanced effects from the open source Bullet Physics library."
  • Deceiving naming... (Score:5, Informative)

    by TheKidWho ( 705796 ) on Tuesday October 19, 2010 @07:55PM (#33955106)

    The 6870 actually has less performance than the 5870... Same goes for the 6850/5850... I don't really understand why they named them the way they did... Either way, a 6970 is supposed to be released in the near future to surpass the GTX480/5870.

    • by AHuxley ( 892839 )
      New 6xxx price points. They make the numbers flow from within the 6xxx range. The 5870 is now history; it's all about the 6xxx range and how its FPS graphs and costs stack up.
      If the 5870 is still better in price, frame rate, or power use, I am sure it will be noted.
      The main thrust seems to be that the new mid-range (in price) 6xxx should be a bump towards the 5870's stats.
      As for the top end, it will be fun :) http://vr-zone.com/articles/-rumour-ati-radeon-hd-6000-series-release-schedule-first-iteration-in-october/9688.html [vr-zone.com] has so
      • I didn't understand any of what you said. This new scheme doesn't make much sense. Why didn't they just name these new cards 6770 and 6750 if that's the price range they're targeting? This will just confuse consumers, and it's something I would expect from nVidia or Intel; AMD are usually sensible in their naming conventions.

        • by AK Marc ( 707885 )
          They have the first digit as the generation and the next three as the location in that generation. Evidently, they decided that they needed to trim costs more than increase FPS, and adjusted the last three digits accordingly. It might not be as good a plan when comparing against the previous generation, but it's a more accurate numbering looking forward. Not that I know what they have planned or why they really did it, but since you know they did it and assume they had some reason, then you h
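
          A toy Python sketch of the scheme described above (first digit as generation, the remaining digits as position within it); the split is purely illustrative, not anything AMD documents:

```python
def decode_radeon(model: int) -> tuple[int, int, int]:
    """Split a 4-digit Radeon model number into (generation, family, variant),
    e.g. 5830 -> (5, 8, 30). Purely illustrative of the naming scheme."""
    gen, rest = divmod(model, 1000)
    family, variant = divmod(rest, 100)
    return gen, family, variant

# Within one generation, (family, variant) orders the lineup; across
# generations the digits alone say nothing about relative performance.
for m in (5770, 5830, 6850, 6870):
    print(m, decode_radeon(m))
```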
          • Re: (Score:3, Insightful)

            by PopeRatzo ( 965947 ) *

            Evidently, they decided that they needed to trim costs more than increase FPS

            That's a nice way of saying "give the consumer less".

            • by gman003 ( 1693318 ) on Tuesday October 19, 2010 @10:37PM (#33956302)
              Giving the consumer less, but also charging them less. Since very, very few people actually needed the top-of-the-line model of recent cards, it makes some amount of sense.
          • by Rockoon ( 1252108 ) on Tuesday October 19, 2010 @09:23PM (#33955846)
            There is also the problem that the poster is measuring performance based on some single metric (presumably FPS in some game), which doesn't necessarily mean much.

            Many years ago I upgraded from a Voodoo 3 to a GeForce 4 Ti 4600, and for more than a few games that GF4 was at first slower in FPS than the Voodoo (but still more than fast enough for gaming).

            This was at a time when games were almost strictly simple textured-polygon throwers, which was the Voodoo 3's only strength. As the use of multi-texturing became more prevalent (heavily used in terrain splatting [google.com]), the advantages of the GF4 over the Voodoo became apparent, as more scene detail became essentially free, whereas the Voodoo required many rendering passes to accomplish the same thing.
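
            As an aside, a toy numpy sketch of that splatting idea: several tiled detail textures blended per-pixel by weight maps, which multi-texture hardware collapses into one pass (all data here is made up):

```python
import numpy as np

rng = np.random.default_rng(0)
h, w = 64, 64
layers = rng.random((3, h, w, 3))      # three RGB detail textures (made-up data)
weights = rng.random((3, h, w))
weights /= weights.sum(axis=0)         # per-pixel blend weights summing to 1

# One blended pass: what multi-texture hardware does in a single draw;
# a single-texture part would need one render pass per layer instead.
terrain = (weights[..., None] * layers).sum(axis=0)
print(terrain.shape)                   # (64, 64, 3)
```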

            Now I'm not saying I know that this generation of AMD GPUs will experience the same sort of future benefits as that GeForce 4 did, especially since DX10/DX11 really isn't seeing rapid uptake, but there could easily be design choices here that favor DX11 features that just aren't being heavily used yet.

            The question is not 'is the 6870 slower than the 5870?' in some specific benchmark. The question is which of these cards will provide a solid gaming platform for the most games. As with my experience, that Voodoo performed better than the GF4 for a while, but for the newest games the GF4 kept providing a good experience whereas the Voodoo became completely unacceptable.
            • by fast turtle ( 1118037 ) on Tuesday October 19, 2010 @09:55PM (#33956064) Journal

              What I suspect AMD has done is add tessellation units to the chip. This will be evident when running the Heaven benchmark with tessellation enabled. Keep in mind that tessellation is one of the key changes between DX10.1 and DX11 and, as you stated, this is forward-looking. Sure, the chip may be a bit slower currently, but I suspect that when running something that depends heavily on tessellation, there won't be any slowdowns.

              The reason I'm aware of this is my Radeon 5650. It's a DX11 card with 512MB onboard, and when running the Heaven test there's lots of visual improvement with tessellation on, even though the card struggles and drops to between 4-12 frames per second. With tessellation off, the card easily handles the test at a playable 45-60 frames per second.

            • I am wondering how your GeForce4 Ti 4600 got outclassed by a Voodoo3. The Voodoo3 was equaled or outclassed by the original GeForce 256. Maybe your memory is fuzzy, but there would be some major issues if your Voodoo3 was faster than the GeForce4. Also, multi-texturing was a big deal around the TNT2 & Voodoo2/3 days; the GeForce3/4 were way past that stage with the introduction of shaders. By the time the GeForce4 came around, 3dfx had already been dead for two years after their Voodoo5 failed miserably against

              • flagship = NASCAR: what wins on Sunday sells on Monday. The flagship is benchmarked and touted on all the tech sites as a Ferrari. Then they sell the average consumer Kias that are slower than last year's model. Sure, it's cheaper, but it's a hunk of shit, and going in the wrong direction, as everything else incrementally gets more computing-intensive... no apologies for the bad car analogy.
            • Yeah, good points. The issue is that gaming graphics are now undeniably console-driven, and consoles don't do DX11/tessellation yet, so I suspect that featureset will be a bit niche. For every STALKER/Crysis type of envelope-pushing, gfx-whore-heaven FPS (which I do oh so love) there will be 4-5 multiplatform releases that don't even push a 5770, let alone higher.

              I'm considering an upgrade from my 5850 purely from a noise POV; I have the HIS iCooler V variant (stock, not OCed) and reviews say it's quiet, but under load

              • by Creepy ( 93888 )

                That and only the Xbox 360 does DX9 (specifically 9.0c). The PS3 uses an OpenGL ES 2.0 library for the most part, with lots of hacks. If they drive the market, don't expect DX11 or later to be adopted until the next gen of consoles comes out.

                I'm really excited about physics integration, as most of the stuff I've worked on recently has required passing physics information between hardware and software (in textures). For instance, cloth and hair both need physics to behave realistically.
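
                For a flavor of what that involves, a minimal Verlet-integration sketch for a pinned chain of particles, in Python; real cloth solvers run far more particles, constraints, and iterations, often on the GPU, and every parameter here is made up:

```python
import numpy as np

n, dt, rest = 8, 1.0 / 60.0, 0.1
pos = np.outer(np.arange(1, n + 1), [rest, 0.0, 0.0])  # particles along +x
prev = pos.copy()                                      # zero initial velocity
anchor = pos[0].copy()
gravity = np.array([0.0, -9.81, 0.0])

for _ in range(120):
    pos, prev = 2 * pos - prev + gravity * dt**2, pos  # Verlet step
    pos[0] = anchor                                    # pin the first particle
    # One relaxation pass: nudge each neighboring pair toward rest length.
    d = pos[1:] - pos[:-1]
    dist = np.linalg.norm(d, axis=1, keepdims=True)
    corr = 0.5 * (dist - rest) * d / np.maximum(dist, 1e-9)
    pos[1:] -= corr
    pos[:-1] += corr
    pos[0] = anchor

print(pos[-1])  # the free end has swung down under gravity
```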

        • Re: (Score:3, Interesting)

          Comment removed based on user account deletion
          • Re: (Score:3, Interesting)

            by PopeRatzo ( 965947 ) *

            Hopefully by the time February rolls around they will have this straightened out, as I'll be replacing my HD4650

            I'm about ready to replace my 4650, too. I've got a new HD monitor coming and figure that's as good a time as any to up the graphic power, though I won't be going water-cooled.

            My problem with the numbering system is always the second digit. For example, is a 5830 better than a 5770 or 4870? Do I add up the 4 digits and compare the sums? Is the first digit the most important, or the second, or

            • Re: (Score:3, Informative)

              Comment removed based on user account deletion
              • Re: (Score:3, Interesting)

                by tibman ( 623933 )

                I bought and use that exact water cooler on an AMD 965 (Phenom II X4 3.4GHz Black Edition). It works great and I highly recommend it. My only advice for anyone is to make sure your side panel doesn't have fans or protrusions in the back near your 120mm exhaust port. My case has a 180mm side fan that prevented the radiator (sandwiched between two 120mm fans) from being mounted inside the case. I dremeled out a slot so the coolant tubes could pass through the back (it's a closed coolant system, so you can't ju

                • Re: (Score:3, Interesting)

                  by PopeRatzo ( 965947 ) *

                  I have a stupid problem with that very case. I used it with a GA55-UD3P motherboard and the connector to the audio jacks was on a wire that was about 1.5 inches too short to connect to the onboard audio.

                  Do you know if you can buy extension cords for those little connectors? I'd hate to not be able to use the headphone jack because the wire inside the case is too short. (Note: I am not competent with a soldering iron)

                  • Re: (Score:3, Informative)

                    by tibman ( 623933 )

                    Is this your mobo, and is that green spot (F_AUDIO) on the left by the audio jacks where you need to plug in? http://www.orangeit.com.au/catalog/images/prodimg/img1338.jpg [orangeit.com.au]

                    I don't remember where mine went and can't check until later tonight, but I think mine was bottom left. Is it possible the cable is wrapped around something behind the other side panel? I don't know if anyone sells an extension cord for those, but I found some stuff that may work for you.

                    http://www.sparkfun.com/commerce/product_info.php?p [sparkfun.com]

                    • Brother, that's really nice of you to offer, but I'm sure you have better things to do with your time.

                      I'll have to pop the cover on my case again to look, but I seem to recall that the connector on the mobo was one of those 2 or 3 bare pins sticking up and the wire from the case was one of those thin wires with the tiny black connector on the end that you slip over the pins. I may well be wrong. When I built the system, I was in a panic to get it up and running. All I remember is that I had trouble makin

                • Comment removed based on user account deletion
                  • by tibman ( 623933 )

                    105-110F usually and rarely does it go over 120F. I don't have a before and after comparison though, sorry. The only bad part about the thing is you have to remove the mobo from the case to install a custom bracket. That won't be a problem for you though if your friend is going to dremel the case anyways.

              • Any ideas on passive systems for low-power iMac-type computers?
                On an unrelated note, what's the best distro choice for a 600MHz Transmeta CPU laptop?
            • by aliquis ( 678370 )

              For example, is a 5830 better than a 5770 or 4870?

              Probably.

              Stupid guesses:
              58xx > 48xx from generation alone.
              57xx is probably a more limited chip or something else (different memory?) than 58xx.
              xx30 is lower-end than xx70 of the same chip.

              Or something; Wikipedia will most likely tell.

              Facts:
              HD4870: 750/900 clock, 800:40:16 (unified shaders : texture mapping units : render output units), 256-bit GDDR5.
              HD5770: 850/1200 clock, 800:40:16, 128-bit GDDR5.
              HD5830: 800/1000 clock, 1120:56:16, 256-bit GDDR5.

              X7XX on both seems to be 128-bit memory.
              X8XX: 256-bit.
              48XX X2: 2x256.
              X9XX: 2x256.
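
              A quick back-of-the-envelope check of those memory figures in Python, assuming the second number is the GDDR5 memory clock with an effective transfer rate of 4x per clock (treat the results as rough estimates, not vendor numbers):

```python
cards = {                 # memory clock (MHz), bus width (bits)
    "HD4870": (900, 256),
    "HD5770": (1200, 128),
    "HD5830": (1000, 256),
}
for name, (mem_mhz, bus_bits) in cards.items():
    gb_per_s = mem_mhz * 4 * (bus_bits // 8) / 1000  # GDDR5: ~4 transfers/clock
    print(f"{name}: {gb_per_s:.1f} GB/s")
# HD4870: 115.2, HD5770: 76.8, HD5830: 128.0 -- the x7xx's 128-bit bus is the
# clearest difference, despite its higher clock.
```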

              • I think the 5830 is actually just a little better than the 5770 in real-world performance, but uses significantly more power. I think the 4870 is also pretty close to the 5770 in performance, and might actually be faster. But the 5770 is based on the 5800-series core I think, just a little cut down.
                • To add to what you said, the 5800 series used more power since it was a much larger GPU. The 5830, 5850, and 5870 shared the same chip (Cypress). For the lower-end parts, shader units, ROPs, and texture units were fused off to improve yields and fill in the large performance gap between the 5850 and 5770. Likewise, the 5770 GPU (Juniper) started with a smaller core (with less than half as many transistors as Cypress), which subsequently had sections disabled for the 5750.
          • by aliquis ( 678370 )

            I frankly can't tell you if a 9600 GT beats a GT 210 or the other way around

            I assume it does. What about an HD 4250 vs an HD 3650?

            Hard to see the difference =P. Though yeah, LE, GT, GTX, GTS, ... may seem weird; 30, 50, and 70 are rather obvious.

    • by Renraku ( 518261 )

      Because shit dude the 6870 is 1000 better than the 5870! And it's going for a lot less! It's a GREAT deal.

    • I'd love to see this thing in an Android tablet, or a next-gen game console. I'm sure the pixels-per-watt are fabulous. I hope it shows up in the Wii-2, or in some super-powered tablet that will make PCs obsolete.
      • PCs will be obsolete when I can have a chip implanted in my brain.
        • Which one? (Score:1, Offtopic)

          PCs will be obsolete when I can have a chip implanted in my brain.

          Will you go for Ponch, or Jon?

          Cheers,

        • by aliquis ( 678370 )

          My brain will be obsolete when one can implant a chip in it.

          Who's right? Just poking in an M68k or "implanting" a 5.56mm NATO round in there with no connections doesn't count :D. Though the latter is sure to make my brain obsolete :D

    • Deceiving? (Score:4, Insightful)

      by mykos ( 1627575 ) on Tuesday October 19, 2010 @08:12PM (#33955272)
      I have a feeling that people who buy expensive pieces of hardware have a tendency to do at least one web search or pop at least one question off at an internet forum about products before they buy. It's not like AMD is putting the or anything... [overclock.net]
      • Re:Deceiving? (Score:4, Insightful)

        by mykos ( 1627575 ) on Tuesday October 19, 2010 @08:13PM (#33955288)
        Wow, HTML fail on my part... what I meant to say is "It's not like AMD is putting the same chip with the same everything into four generations of parts or anything".
      • by aliquis ( 678370 )

        From the comments it looks like they said 5 cards, not 5 generations? 2 generations?

        And they did the same (or for three?) with the low-end GF4s, didn't they?

        That didn't mean ATI got better cards than the Tis, though.

        • by AvitarX ( 172628 )

          I think in the GF3-4 range there were a lot of GF4 MX cards.

          These were essentially low-to-middle-end GF3 cards, and for some things my GF2 Ti outperformed it (with less CPU to boot).

          • by 0123456 ( 636235 )

            I think in the GF3-4 range there were a lot of GF4 MX cards.

            These were essentially low-to-middle-end GF3 cards, and for some things my GF2 Ti outperformed it (with less CPU to boot).

            The 'GeForce4 MX' was a GeForce2 in drag, wasn't it?

    • I think AMD is going to have the 5770 and 5750 live on as their lower-midrange cards while these will become the upper-midrange cards. The top of the line (single die) cards will then be named 69XX. Not sure what they are going to do with the dual die cards (perhaps bring back the X2 naming convention).

      It makes sense for AMD to hold off on updating their lower end offerings since consumers are less demanding at this price point.
    • This is because in the current generation, the top end ATI GPUs are the 58xx. With this coming generation they wanted the top end to be x9xx, so they have just incremented the second digit. 6870 is intended as a successor to the 5770.
    • Low power but better manufacturing; it remains to be seen whether they can be overclocked better than the 5x series. TBD until people get some hands-on time, of course.

    • by Ecuador ( 740021 )

      The general idea is that the 5850/5870 will be phased out as these cheaper new cards are introduced, so the AMD mid-to-upper-range lineup will be 5750/5770, 6850/6870, 6950/6970 (next month, + 6990 later). So that does make sense, since the 6850 and 5750 will be out at the same time and the former is much faster (indicated by its x8xx vs x7xx), and also has the new features (like HDMI 1.4 and 3D, indicated by its 6xxx vs 5xxx). But the 5850/5870 will not disappear overnight, so for a w

  • I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be, since I stopped caring once any modern hardware could play Nethack or Solitaire.

    Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?

    • I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be, since I stopped caring once any modern hardware could play Nethack or Solitaire.

      Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?

      By current standards, Low-power would be just under the amount of energy required to power the sun to run a dual-card setup.

    • by Sycraft-fu ( 314770 ) on Tuesday October 19, 2010 @08:57PM (#33955640)

      Without any fan? No, probably not. It is a desktop processor. This isn't an ultra-low-power component; it isn't an Atom. The idea AMD is going for here, and I think there's some merit to it, is a low-range desktop type of system: people who want something cheap but still want to play games. Intel's integrated chips don't work that well for that (though they've improved), so this is to try and fill that market.

      If you want 1080p with no fan, just get a Blu-ray player. There's plenty of them that'll play media off the network and Internet (LG has good ones). But don't bitch that some people might want a computer that can play a game a little better than Nethack.

      • by Chris Burke ( 6130 ) on Wednesday October 20, 2010 @12:01AM (#33956894) Homepage

        The idea AMD is going for here, and I think there's some merit to it, is a low range desktop type of system. People who want something cheap, but still want to play games. Intel's integrated chips don't work that well for that (though they've improved) so this is to try and fill the market.

        Think more mid-to-high-end laptops.

        As mentioned in the summary, this is a low-power version of the Phenom II. Not an ultra-low power for consumer electronics or netbooks like Atom or AMD's Bobcat, but still solidly aimed at the mobile market. It provides all the power and cost advantages of a UMA solution plus gets rid of one of the system buses for more savings, while providing good-for-a-laptop graphics without having to significantly re-engineer the motherboard or cooling solution. This is still in theory; demonstrations of engineering samples are nice, but it'll be interesting once the reviewers get their hands on some.

        Of course you're also right, since cost and power usage are relevant for the desktop. Just not as much, since you're not dealing with battery life or the form factor that makes it difficult to work with discrete graphics. A single line of UMA-based motherboards with an optional PCIe graphics card can serve multiple markets with one design and acceptable margins.

        • by Khyber ( 864651 )

          "Think more mid-to-high-end laptops."

          No, those will come with their own MXM expansion slot for dedicated GPU.

          • Most mid-range laptops don't have MXM because it's a waste. The price differential of adding the slot is almost certainly more of a burden to the user than a replacement, and many MXM cards have heat sinks in nonstandard locations, or come in nonstandard sizes that only fit one range of laptops. This is supposedly less common in MXM 3, but people were still using MXM 1 when MXM 3 came out.

            Low- to mid-range laptops will get these chips, netbooks will continue struggling along with the slowest CPUs, an

            • by Khyber ( 864651 )

              "Which suggests that MXM is a big fucking waste of time and money in all cases."

              Yea, you go ahead and say that to my nx9240, which has had four video upgrades and still runs like a champion.

              • The plural of anecdote is not data, and a single anecdote is not even that. I was talking about a waste to the manufacturer, who is more important than you, especially when deciding whether the system shall have an MXM slot. Statistically nobody ever bought a laptop because they could upgrade it. You are noise.

          • The very high-end, sure. For the rest, the cost and battery life advantages will steer it towards Fusion. People these days want a high-end laptop that can play games, but that can also function as a useful portable computer in the absence of a power outlet.

      • by 0123456 ( 636235 )

        If you want 1080p with no fan, just get a Blu-ray player. There's plenty of them that'll play media off the network and Internet (LG has good ones).

        Ha... my LG Blu-ray player has been in three times for warranty repair, and now we're waiting for a replacement unit because they can't fix it, while my $90 unknown-Chinese-brand Blu-ray player from Walmart works perfectly and is multi-region out of the box.

        And our Ion MythTV frontend does have a fan to keep it cool when playing 1080p, but it's inaudible from the sofa.

      • by qmaqdk ( 522323 )

        If you want 1080p with no fan, just get a Blu-ray player. There's plenty of them that'll play media off the network and Internet (LG has good ones). But don't bitch that some people might want a computer that can play a game a little better than Nethack.

        And if you want a car that doesn't use gas, get a bicycle.

    • by Ephemeriis ( 315124 ) on Tuesday October 19, 2010 @09:37PM (#33955930)

      I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be, since I stopped caring once any modern hardware could play Nethack or Solitaire.

      AvP is a relatively modern game. Came out in the last year or so. It isn't mind-shatteringly amazing, but it looks pretty decent.

      Traditionally, integrated graphics have done a lousy job with serious gaming on PCs. Basically any FPS has required a discrete 3D card.

      If Joe Sixpack can go out and buy an off-the-shelf machine at Best Buy and play a game on it without having to upgrade the hardware, it'll be a huge step in the right direction.

      But this chip doesn't look like it'll be replacing 3D cards for serious gamers anytime soon.

      Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?

      It's a desktop chip, so I can't imagine it'll do anything without a fan. Although the integrated graphics means that you wouldn't need a separate graphics card with its own fan. So it should be at least a little quieter.

      • by Osgeld ( 1900440 )

        If Joe Sixpack can go out and buy an off-the-shelf machine at Best Buy and play a game on it without having to upgrade the hardware, it'll be a huge step in the right direction.

        Well, they always could, but they are too ill-informed to make a good choice for their needs. They see a half dozen Intel stickers on the front of some demo box and instantly start rationalizing that Intel makes the best computer, just like they play their Nintendo tapes and wear their Nike Airs while blowing their nose in a Kleenex and using Clorox.

        Personally, I think there ought to be a survey for the people who just don't know, can't understand, or are overwhelmed by a metric fuckton of TOTALLY meaningless number thi

        • Well, they always could, but they are too ill-informed to make a good choice for their needs

          The problem hasn't really been one of information.

          Until fairly recently, your average off-the-shelf computer shipped with very crappy graphics. If you just ordered whatever was on Dell's website or grabbed something from Best Buy, it would have enough integrated graphics to run Windows and not much else.

          Sure, you can generally customize them with a video card of your choice... At least if you're ordering on-line... But even then the offerings weren't terribly impressive.

          And there really is a limit to how

          • by Osgeld ( 1900440 )

            And there really is a limit to how much self-education you can expect your average consumer to do. Do you go out and research what kind of spark plugs are factory installed in your new car? I sure as hell don't.

            No, just as I said earlier about the little survey: I look in the book hanging off the shelf to quickly decide what I need for a spark plug.

            It's not about Joe Sixpack educating himself; it's about providing reasonable information in an easy-to-digest format.

            Why do most computers come with a crap video card? Because most of them say "great for gaming" on the front and are pushed out en masse as cake, and people eat it up, not having a clue either way.

            (and yes, the cake is a lie)

            • And there really is a limit to how much self-education you can expect your average consumer to do. Do you go out and research what kind of spark plugs are factory installed in your new car? I sure as hell don't.

              No, just as I said earlier about the little survey: I look in the book hanging off the shelf to quickly decide what I need for a spark plug.

              The book hanging off the shelf... At the car dealership?

              I'm not talking about replacing the spark plugs in a vehicle you already own. I'm talking about going out to the dealership and purchasing a car. I don't think most people check to see what type of spark plugs are factory installed when they go looking for a new car. Either they know enough to care, and they'll just replace them with their preferred type when they get the vehicle home... Or they don't know enough to care, and they'll just use what

              • by Osgeld ( 1900440 )

                Yea, OK, I misread the example. But if my Kia Rio had a big red sticker on it that said "great for off-roading" when in fact it's not, there would be some heck to pay.

                Why it's okay to falsely advertise or exaggerate features in a computer, when it's not in any other industry, is what I am trying to get at here.

                It's become so confusing I can't even keep up with it. There are more meaningless, random, tell-you-nothing numbers on computers than ever, and that needs to stop for everyone's mental stability.

              • Those who are interested in fast cars will usually make sure to buy the biggest engine (GPU/CPU) they can afford.
                Your average ricer kids (gaming nerds) are also likely to obsess about technical details and be at least somewhat well-informed.
                They might also decorate the car (PC) with lots of spoilers (LED-illuminated fans).
                And then they go drag racing (comparing benchmarks ;-))

      • Re: (Score:3, Funny)

        by mjwx ( 966435 )

        AvP is a relatively modern game. Came out in the last year or so.

        It worked, I have travelled back to the year 2000.

        • AvP is a relatively modern game. Came out in the last year or so.

          It worked, I have travelled back to the year 2000.

          Hmmm...

          Well, I assumed they were talking about the 2010 AvP game [wikipedia.org]. As that would make more sense (seeing as DX11 didn't even exist in 2000).

          But I suppose it could be the 2000 AvP game [wikipedia.org]. In which case I'm less impressed.

      • by eknagy ( 1056622 )

        AvP is a relatively modern game. Came out in the last year or so.

        Actually, there are more than a dozen AvP games according to Wikipedia:
        http://en.wikipedia.org/wiki/List_of_Alien_and_Predator_games [wikipedia.org]

        I liked the 1999 AvP a lot, and at first glance I did not understand why they would showcase an 11-year-old game.

        Now I know I will have to buy AvP 2010 with my next AMD laptop ;)

        • Yeah. I just assumed they were referring to the 2010 version, as the earlier ones probably didn't feature DX11 graphics. But just saying "AvP" doesn't really clarify things much at this point.

          The 2010 game is a mixed bag.

          The marine campaign is a ton of fun. The predator campaign is fun, but doesn't make a whole lot of sense. The alien campaign was a disappointment.

          The multiplayer can be fun, or frustrating, depending on the map and who you're playing as/against. Some of the maps seriously favor one rac

    • by AvitarX ( 172628 )

      You want Ontario or Zacate (Bobcat-based APUs).

      Both offer h.264-accelerated playback and are 9W or 18W.

      I am seeing mixed info, actually, on what is available dual vs. single core at what wattage.

      The 9W is definitely single core, faster than Atom, with a decentish GPU (accelerated video); the Zacate I think is 18W with a single core.

      I imagine they will not quite double when going dual core (as the graphics part will not increase).

      And they are supposed to have tech to completely shut down parts of the chip that ar

    • Hell, the crappy Intel integrated GPUs can handle video pretty well. Even a low-end card can do 1080p.

      As for low-power, if recent experience is any judge, the power usage will be low only in comparison to quadruple Pentium IVs. Some cards last gen were 300+ watts TDP.
    • by aliquis ( 678370 )

      Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?

      Shouldn't anything new?

    • by stms ( 1132653 )

      I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be

      Agreed. How many IoCs (Instances of Crysis) is that? Like 0.25?

    • by h4rm0ny ( 722443 )

      Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?

      1080p Blu-ray isn't that hard. If that's your primary interest, you don't need to worry about getting a fancy card that scores well in gaming tests; the two things (mostly) don't correlate. With AMD cards, you're looking for something that has UVD2 support. That's the clue that indicates a card can give you proper Blu-ray playback. And yes, that's plenty possible without a fan. For example, this oldish 4350 [awd-it.co.uk] should be fine for Blu-ray (it has UVD 2.2, the latest) and is fanless. But I'm certain you could find thi

  • "...showcasing the new graphics technology and advanced effects from the open source Bullet Physics library"

    Nvidia has their PhysX engine, and Intel advised they were to acquire Havok. Bullet is exciting for me. It was used in Grand Theft Auto 4, and in the movie Hancock.

    So for me, reading AMD, ATI, Bullet in the same sentence is the interesting part.
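
    Since Bullet keeps coming up, a minimal rigid-body sketch using pybullet (the Bullet Python binding, which postdates this discussion and is assumed installed): a sphere dropped onto a plane, stepped at Bullet's default 240Hz:

```python
import pybullet as p

p.connect(p.DIRECT)                        # headless physics server
p.setGravity(0, 0, -9.81)

plane = p.createCollisionShape(p.GEOM_PLANE)
p.createMultiBody(0, plane)                # mass 0 -> static ground

sphere = p.createCollisionShape(p.GEOM_SPHERE, radius=0.5)
ball = p.createMultiBody(baseMass=1.0,
                         baseCollisionShapeIndex=sphere,
                         basePosition=[0, 0, 5])

for _ in range(480):                       # two simulated seconds
    p.stepSimulation()

print(p.getBasePositionAndOrientation(ball)[0])  # resting near z = 0.5
```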
  • Will Apple use this new CPU with the GPU built in?

    They like thin and small, and Intel's new chip with a built-in GPU does not fit with Apple's tech, and Apple does not like putting add-on video chips in their low-end systems.

    • by MBCook ( 132727 )

      I doubt it. Switching to AMD (especially for only part of their line) seems like it would have a lot of ancillary costs such as the R&D help I know Intel has given Apple. Apple stuck by Intel for years through their abysmal "GPUs" (I've got one, along with an nVidia, in my MacBook Pro). Intel's latest round of integrated GPUs is actually supposed to be pretty good, to the point that on lower end computers (like MacBooks) it may not be necessary to include even a low-end GPU.

      Also, don't forget the right

      • But the lack of OpenCL in Intel's chip is bad, and do you want a $700 desktop with Intel video?

        A $1200 laptop? A $1500 laptop?

        Also, the 320M they have now is better than what Intel's new chip can do.

      • by tyrione ( 134248 ) on Tuesday October 19, 2010 @11:24PM (#33956630) Homepage

        I doubt it. Switching to AMD (especially for only part of their line) seems like it would have a lot of ancillary costs such as the R&D help I know Intel has given Apple. Apple stuck by Intel for years through their abysmal "GPUs" (I've got one, along with an nVidia, in my MacBook Pro). Intel's latest round of integrated GPUs is actually supposed to be pretty good, to the point that on lower end computers (like MacBooks) it may not be necessary to include even a low-end GPU.

        Also, don't forget that right now AMD has the Phenom, which is a good chip, and Intel has their current Core line, which is an amazing line of chips. To go to AMD means sacrificing performance/watt on the CPU side.

        Two years ago maybe it would have mattered. Today? Too little too late.

        Being a former NeXT and Apple engineer, I can tell you unequivocally your thought is bullshit. Intel gave NeXT practically zero information for the NeXTSTEP port to Intel. Apple designs around Intel specs, and Intel helps them as it would any other OEM. No special treatment.

      • Intel's latest round of integrated GPUs is actually supposed to be pretty good, to the point that on lower end computers (like MacBooks) it may not be necessary to include even a low-end GPU.

        I've been hearing that Intel's latest graphics are finally pretty good for over a decade. At this point, they could release a graphics chip so amazing that each polygon gives me twenty dollars and a blowjob, and I'd still make a careful point of never using Intel graphics, no matter the cost. It's like the boy who cried wolf.

  • by Suiggy ( 1544213 ) on Tuesday October 19, 2010 @10:28PM (#33956260)

    APU doesn't stand for Applications Processing Unit; it's an acronym for Accelerated Processing Unit.

    http://sites.amd.com/us/fusion/apu/Pages/apu.aspx [amd.com]

    "The GPU, with its massively parallel computing architecture, is increasingly being leveraged by applications to assist in these tasks. AMD software partners today are taking advantage of the GPU to deliver better experiences to across an ever-wider set of content, paving the way to breakthrough experiences with the upcoming AMD Fusion Family of Accelerated Processing Units (APU)."

  • . . . maybe it's because this is AMD's answer to ION? Or maybe not, I dunno, just a thought.
