AMD Portables Hardware Technology

AMD Reveals New Mobile Technologies

MojoKid writes "AMD disclosed a few details today regarding their upcoming mobile platform technologies, codenamed 'Griffin' and 'Puma'. According to AMD, Griffin will be manufactured at 65nm and will feature a new mobile-optimized on-die Northbridge with a power-optimized DDR2 memory controller, HyperTransport 3 connectivity, and larger L2 caches than current designs. The new memory controller should also extend battery life thanks to new power-saving features that allow the controller to operate on a separate power plane and at a lower voltage than the execution cores."
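The battery-life claim rests on a simple relationship: dynamic CMOS power scales roughly as P ≈ C·V²·f, so moving the memory controller onto its own lower-voltage power plane cuts its switching power quadratically with the voltage drop. A minimal Python sketch, using invented voltages and capacitances purely for illustration (AMD has not published Griffin's actual figures):

    # Relative dynamic power, P ~ C * V^2 * f (arbitrary units).
    def dynamic_power(cap, volts, mhz):
        return cap * volts ** 2 * mhz

    # Hypothetical numbers, for illustration only:
    ctrl_shared = dynamic_power(0.2, 1.2, 800)  # controller sharing the 1.2 V core plane
    ctrl_own    = dynamic_power(0.2, 0.9, 800)  # controller on its own 0.9 V plane

    print(f"power saved by the separate plane: {1 - ctrl_own / ctrl_shared:.0%}")
    # -> 44%, purely from the voltage drop (1 - 0.9^2 / 1.2^2)
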
  • hmm (Score:3, Informative)

    by Anonymous Coward on Friday May 18, 2007 @05:14AM (#19176177)
    larger L2 caches that current designs? Syntax error, line 4.
    • Re:hmm (Score:5, Funny)

      by Briareos ( 21163 ) on Friday May 18, 2007 @05:36AM (#19176271)

      larger L2 caches that current designs? Syntax error, line 4.

      "Design by current":

      1. Put a block of silicon somewhere
      2. Apply enough current to make it melt *in an interesting way* (sparks flying etc)
      3. Test its input/output behaviour.
      4. ???
      5. New x86-compatible CPU

      Lather, rinse, repeat steps 1-4 until you get to step 5.
      • by asliarun ( 636603 ) on Friday May 18, 2007 @06:32AM (#19176499)
        AMD badly needed to compete with Intel in the mobile computing market, and the Puma platform should get them some design wins, which would hopefully fuel a price war between the Puma and Santa Rosa platforms. At the risk of sounding clichéd, the ultimate winner would be the customer. Unfortunately, despite AMD's efforts, I still think that AMD will be a marginal player in the mobile segment for some time to come, and will mainly be competing with Intel C2Ds on price. I have yet to see AMD make compelling chips and platform designs for low-power and ultra-low-power laptops, for example. I would love to see it happen though, as this is one area where Intel is plainly getting away with overpricing their chips and platform solutions.

        What surprises me is why AMD is not putting more effort into making better mobile chips and platforms, when this is the one segment that is truly growing at a compounded rate. Heck, Centrino (and the P-M) was the one and only reason Intel managed to make a profit in the inglorious P4 days. One clear use case that I see in corporations transitioning from desktops to laptops is simply "work-life balance". With the crazy hours that people are working nowadays, and the fact that broadband has become affordable, this will be the one carrot that more and more companies will dangle to keep their employees reasonably happy. Furthermore, as computers become commodities, people will increasingly look at differentiators such as mobility, ease of use, and connectivity instead of flexibility, which was the desktop's forte. The only way this can truly happen, though, is if laptops start matching desktops in price and, to some extent, in performance. In fact, performance is increasingly becoming irrelevant, as most dual-cores and quad-cores are overkill for most users, even your so-called "power" users. Except for some niche areas like CAD or image processing, I have yet to see users complain because they are bottlenecked by their processor. Most users do get bottlenecked by their RAM or battery life (in the case of laptops), though.
        • by PopeRatzo ( 965947 ) * on Friday May 18, 2007 @06:55AM (#19176597) Journal

          the ultimate winner would be the customer.

          You must be new to this planet.
          • I see your point, but the customer DOES benefit sometimes. Take the current ongoing price war between AMD and Intel on desktop CPUs. Man, you wouldn't have dreamt of getting a good quad-core for under $300 a few months ago. My argument is that Intel has not yet dropped its Centrino platform prices because it does not see a credible challenge from AMD. The day it gets a whiff that AMD has a better or even somewhat equal solution, it will start getting paranoid and will try to maintain market share at any cost,
            • Of course, prices often drop, and to a certain extent, that helps consumers. But it's interesting how the average price of the "computer I want" seems to stay about the same as a decade or more ago.

              I have all this new processing power and memory, and it comes on a computer with Vista, which uses more processing power and memory to do about the same thing. What have I gained?

              However, when I removed Vista and put my old trusty WinXP Pro SP2 on that same computer, I saw a real gain in performance. I would lo
              • by AvitarX ( 172628 )
                I remember the "computer I want" being a lot more than $1200.00 a decade ago. Especially laptops.
              • Part of the objective in a new OS release is to take advantage of new technologies and capabilities of modern hardware. How efficiently this is done by any particular vendor is a whole other subject. Apple has long had the inherent advantage of lording over the hardware, thus assuring that, while you might need a new system for the next major version change, EVERYONE'S new system would be optimized for the software. If you're looking for feature-rich operating system power with a decreased draw on system
                • Part of the objective in a new OS release is to take advantage of new technologies

                  And what exactly are the "new technologies" that Vista is taking advantage of that can't be used in WinXP? And don't give me "DirectX10" because I don't believe for a second that the "new technologies" in DirectX10 couldn't be used somehow in WinXP.

                  And does it really count when the "new technology" that your latest, greatest operating system is taking advantage of is a "new technology" that the developer of the OS has release

                  • I completely agree when it comes to Vista. I have been challenging every Vista fan I come across to convince me that there's something really new, useful and innovative in Vista that can't be easily done in Windows XP, or, more importantly, hasn't already been done by Linux and Mac OS X. The bloat from the DRM, RIAA-pleasing crap alone makes me retch.
                    • How much space & CPU does a simple decryption algorithm really take, anyhow? Especially during the brief periods when it's not being used. If there's bloat, there's gotta be a reason other than DRM. Unless it's exceedingly poorly implemented DRM, in which case, there are probably some advantages there for the user...
                    • Re: (Score:2, Interesting)

                      by ronadams ( 987516 )
                      Keep in mind that this DRM includes not only the encryption algos but also software to monitor files and attempt to determine if they require licensing. Also, there's a fair bit of behind-the-scenes crap in Vista, spying on media files. Windows = Bloat. Sorry MSFT fans; it's just true.
        • It seems the obvious place to go for consumer sales. But the sensible analysis at ars technica on Barcelona [arstechnica.com] architecture is that AMD are gunning for the server ecosystem, with massively-parallel processing of data loads and high-bandwidth system connections. If they can tease a highly-efficient notebook system out of that, I'll be impressed.
        • The reason AMD has had a hard time competing is because they're a bit late in fabrication processes. Note that the above article mentioned the new chipsets will be made with 65nm fabrication. The Core 2 Duo is also 65nm, whereas older AMD processors were (I believe) 90nm.

          At 65nm, transistors can be placed much closer together, with shorter interconnects and less capacitance to charge on every switch, so the chip can run reliably at lower power.

          The reason AMD's behind is because each fabrication plant costs a ton to make.
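          To see why the process node matters, here is a rough sketch of ideal constant-field (Dennard) scaling; real 90nm-to-65nm transitions did not scale this cleanly, so treat the ratios as illustrative only:

              # Ideal constant-field scaling from 90nm to 65nm (a sketch).
              def shrink(old_nm, new_nm):
                  k = new_nm / old_nm                  # linear scale factor, 65/90 ~ 0.72
                  return {
                      "area per transistor": k ** 2,   # ~2x the density
                      "capacitance": k,                # shorter wires, smaller gates
                      "supply voltage": k,             # idealized assumption
                      "switching energy": k ** 3,      # C * V^2 falls fastest
                  }

              for quantity, ratio in shrink(90, 65).items():
                  print(f"{quantity}: x{ratio:.2f}")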
        • I have yet to see AMD make compelling chips and platform designs for low-power and ultra-low-power laptops, for example.

          I don't have time to replicate the research (there is a quite-antiquated comment here on slashdot that I will probably never see again in which I did this with sources) but at one point the non-mobile Opteron plus the chipset had a lower idle power consumption and about the same peak power consumption as a top-speed Pentium M with the chipset.

          AMD figured out the low-power thing before in

        • by tacocat ( 527354 )

          Yay!!! AMD has a new product line for the mobile PC market.

          Only problem is, they have so frickin many platforms that you spend forever trying to figure out WTF you want in a CPU. Sorry AMD, but I think you jumped the shark tank on this one and long before this too.

          Would have been a lot easier if you had stuck to three basic cores:

          1. CPU Core for computational freaks who need it all.
          2. CPU Core for grandma's PC that doesn't need the high performance. (Even this is a stretch)
          3. CPU Core for mobile PCs

          It coul

        • by tknd ( 979052 )

          What surprises me is why AMD is not putting more effort into making better mobile chips and platforms, when this is the one segment that is truly growing at a compounded rate.

          I'm not sure that it's because they don't want to compete in the mobile platform, but that in the past they simply didn't have the resources and enough industry influence to put out something like Centrino. Remember, the CPU is only one factor in the entire power scheme for a computer. Intel had/has the resources to build its own

  • SFF PCs? (Score:3, Interesting)

    by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Friday May 18, 2007 @05:55AM (#19176359) Homepage
    How viable are these for sticking into an SFF PC, to be used as a small media center capable of playing H.264? That's what I want to know :)
    • Re: (Score:2, Informative)

      by norkakn ( 102380 )
      At what resolution? 1080p might still be a problem. Last time I checked, non-Apple H.264 playback blew pretty hard (I'm looking at you, VLC (and others)), so why not just get a mini or an Apple TV?
      • Re: (Score:3, Informative)

        by TheRaven64 ( 641858 )

        Last time I checked, non-Apple H.264 playback blew pretty hard (I'm looking at you, VLC (and others)),

        Really? On my old G4 PowerBook (1.5GHz), VLC could play back 1080p H.264 from the Apple trailers web site (with CPU usage at over 90%), but QuickTime dropped frames. On my new Core 2 machine both can play 1080p back just fine with under 50% CPU utilisation. It's the main thing that makes me realise just how much faster the new machine is than the old; pretty much everything else is disk-speed limited these days.

      • Because the Apple TV doesn't have a TV tuner and therefore can't be used to record TV shows. I don't see what use a media center PC is without a TV tuner. Sure, if you want to download all your content over BitTorrent or iTunes, then it might work OK, but I think that TV tuners are the way to go.
        • by norkakn ( 102380 )
          Makes sense. I don't have a TV, but I'm thinking about getting a media server, so I forget about some of the "normal" features. This still doesn't discount the mini or a hacked aTV with an external converter, but that is less than elegant.

          I concluded that there aren't any products out there that do what I want, so I'm not going to buy any, which is fine by me.

          Another route that you could take would be to get either a mini, or a slim linux box (if VLC's quality doesn't bother you) as your set top box, and
    • by Targon ( 17348 )
      Since these are intended for laptops, availability may be limited in the channel. There are a number of GPUs on the way that may do the job for SFF PCs, and the new processors should also give a nice bump in performance over current generation processors.
    • Assuming that the new "Griffin" processor is anything like the old Turions, there should be nothing stopping you from building a media PC around the new technology. I'm currently in the process of building a media PC myself around a Turion MT-37 (2GHz Athlon 64 with 1MB cache) that used to live in an old laptop.

  • by Raven737 ( 1084619 ) on Friday May 18, 2007 @05:58AM (#19176365)
    I wonder how long before AMD makes a PC-on-a-chip, like VIA did. [linuxdevices.com]
    Now with ATI they should have all the required components for that (good graphics controller etc).

    I am thinking ultra ultra portable =)
    • by Eukariote ( 881204 ) on Friday May 18, 2007 @07:27AM (#19176777)
      Having merged with ATI, AMD now has all the IP it needs to do such a device. AMD in fact has a so-called "Fusion" development program http://en.wikipedia.org/wiki/AMD_Fusion [wikipedia.org] to do just that. The Silicon on Insulator (SOI) process technology that AMD (unlike Intel) uses for CPUs has low leakage and is well suited for low-power devices. So in the short term, while the GPU and CPU are still separate, good reductions in power consumption can already be had by switching the production of the GPU/chipset of mobile devices to SOI.
      • AMD would need to expand their cleanroom space a lot to be able to fab GPUs on SOI as well as CPUs. If they had the space, this would be a good investment as it would let them make chipsets and GPUs using the older 90 nm 200 mm wafer equipment that no longer makes CPUs. Intel currently does this and it saves them money and gives them more control than using a third-party foundry like TSMC.

        However, I doubt that AMD would make SOI chipsets and GPUs in-house for some time because they can't swing the investmen
    • by Targon ( 17348 )
      Fusion will be the FIRST generation of CPU and GPU on the same chip, due out in 2008 I believe. PC on a chip implies memory, sound, and the rest though, so I wouldn't expect a "True PC-on-a-chip" until 2010 at the earliest. One chip with one or two gigs of memory, video, and sound on it would get pretty hot, so cooling it would be a bit of a problem.

  • by Morgaine ( 4316 ) on Friday May 18, 2007 @06:19AM (#19176445)
    The CPU manufacturers seem to be focussed quite well on keeping their CPUs and motherboard chipsets within reasonable power limits, and this latest announcement by AMD is very promising. The situation is not quite so rosy in the 3D graphics chipset arena, as the review of the Radeon HD 2900 XT a few days ago highlighted ... The Tech Report had to upgrade their PC's PSU to 750W to achieve stability.

    That's "not good", to put it mildly. If you extrapolate the power consumptions of graphics cards over the last decade or more into the future, it rapidly takes us into the realms of impossibility, except for those interested in Freon cooling and running their own power station in the back garden.

    Something's got to change, and it has to be rather fundamental. Just decreasing die feature sizes has held back the rate of power consumption increases considerably, but that regular improvement is already factored into this very bad upward curve we're on. We need something as dramatic as the change from MOS to CMOS was back in the day, which dropped consumption by orders of magnitude. If something like that doesn't happen, we're in big trouble.

    AMD's work on decreasing power consumption is great (and so is Intel's), but please focus your ex-ATI team's efforts on reducing the power guzzling on *graphics*. That's where the major problem for the future lies at the moment.
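    To put numbers on the extrapolation above, here is a back-of-the-envelope sketch; the board-power samples are rough figures from memory (a Voodoo-era card, a GeForce4-era card, and the HD 2900 XT) and should be read as assumptions, not measurements:

        # Rough high-end board-power samples (watts) -- assumptions, not data.
        samples = {1997: 8, 2002: 40, 2007: 215}

        first, last = min(samples), max(samples)
        growth = (samples[last] / samples[first]) ** (1 / (last - first))
        print(f"implied growth: ~{growth - 1:.0%} per year")         # ~39%/year

        for year in (2012, 2017):
            watts = samples[last] * growth ** (year - last)
            print(f"{year}: ~{watts:.0f} W if the trend continued")  # ~1100 W, ~5800 W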
    • by clickclickdrone ( 964164 ) on Friday May 18, 2007 @06:21AM (#19176465)
      >Tech Report had to upgrade their PC's PSU to 750W
      Daddy? Why did all the energy get used up and the planet die?
      Well son, we umm, liked to play a lot of games.
      • 400W (Score:1, Offtopic)

        by thegnu ( 557446 )
        I've currently set my threshold for a PC power supply at 400W (and one of those 80% efficiency rated ones, at that), which is still ridiculously high, IMO. I love quiet computers, and I love low-power stuff. I waited to upgrade from an underclocked Barton to a Venice-core AMD because I wanted a cool, quiet PC.

        Of course, I play guitar through a MM 210 Sixty Five, and my buddy Jim plugs his bass into a big fat half-stack. And let's not forget A/C so my equipment doesn't get damaged. And my car isn't runni
    • When demand for power drives its price up, smart investors will look into cheaper alternatives to power-hungry CPUs. Of course, that will never happen until full deregulation is allowed to bring a free market to life.
      • Re: (Score:3, Interesting)

        by TheRaven64 ( 641858 )
        Power has been a driving force for a lot of big consumers for a while now. A former Intel chief architect told me that their first wake-up call came when a company in NYC told them that they couldn't upgrade their machines from i486 to Pentium because their building's power distribution system couldn't handle the extra load. It took them a few years for this to filter all the way to the chip design teams (the P6 was already under development then, and NetBurst was in the very early design phase), but this
        • The current Core line of chips is rather efficient per core, but the trend of sticking tons of cores in a package makes for some pretty high TDPs. The Core 2 Quads are pretty warm, with the Q6600 at 105 watts and the QX6700 and QX6800 at 130 watts. That is as bad as the hottest of the Pentium Ds and hotter than the Pentium 4s ever were. The average C2Ds are much cooler than the average P4s and PDs, but I'd definitely say that "fast" is not out of style at either company.
          • Making chips fast will always sell, but it's no longer the main driving force. The two biggest growth areas are mobile, and high-density server chips. Both of these want fast, but they want low power as well. Speed-per-watt is the most important thing, with raw speed being far less important. Multi-core in mobile environments gives the potential for shutting down all except one core while mobile, and running all of them when on mains power, which gives less of a performance hit than throttling back the
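            A toy comparison makes the core-gating point concrete: at roughly equal package power (modeled as P ≈ n·V²·f, with invented voltages and clocks), parking three of four cores preserves full single-thread speed, while throttling all four halves it:

                # Toy model: package power P ~ n * V^2 * f, arbitrary units.
                def package_power(cores, volts, ghz):
                    return cores * volts ** 2 * ghz

                gated     = dict(cores=1, volts=1.20, ghz=2.0)  # one core awake, full speed
                throttled = dict(cores=4, volts=0.85, ghz=1.0)  # all cores, reduced V/f

                for name, cfg in (("core-gated", gated), ("throttled", throttled)):
                    print(f"{name}: power={package_power(**cfg):.2f}, "
                          f"single-thread speed={cfg['ghz']} GHz")
                # Both land near the same power (~2.9), but the gated chip keeps
                # a full 2.0 GHz core available for single-threaded work.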
    • by Targon ( 17348 )
      I suspect that the R600 cards reviewed so far are based on the 80nm fab process versions. There will be a 65nm version that will help reduce the power demand in this regard. AMD is already working on 45nm, so once AMD is ready on that front, we should see BOTH CPUs and GPUs using a 45nm process technology.

      A big question is how much power the GPU takes, but also how much power it takes to drive 512 megs and 1 gig of memory on the video cards. Would it save power if we plugged the GPU i
      • AMD doesn't fab their GPUs. They use TSMC for that. AMD has enough problems keeping up with demand on the CPU side without taking on GPUs as well.

        Tom
      • If it had its own dedicated memory bank, wouldn't it need to power that somehow? I'm not a chipset designer (So there may be a valid way to do this) but I thought that no matter where you put the memory it'd need roughly the same amount of power, and you'd also need to come up with a reliable and fast way of managing memory through the interface.

        All that said, an external memory bank for the card would make upgrades a lot nicer. Need another 256MB of graphics memory? Just slot it in!
        • >> an external memory bank to the card would make upgrades a lot nicer.
          It's funny, I've said that exact same thing. What's more interesting is that it turns out they actually did make graphics cards with their own external memory banks at one point, but they discontinued the practice because it was much cheaper and faster to stick with onboard memory. Plus, the cynic in me would be willing to bet that the graphics card makers prefer charging much more money for the few extra dollars' worth of memory.
      • I want to see video cards on the HyperTransport bus; with that you could cut down on the number of PCI-E lanes needed, and maybe even have a low-end chipset with only a few PCI-E lanes and SATA/IDE ports.
    • The situation is not quite so rosy in the 3D graphics chipset arena, as the review of the Radeon HD 2900 XT a few days ago highlighted ... The Tech Report had to upgrade their PC's PSU to 750W to achieve stability.

      High end graphics cards have never been terribly friendly to power supplies. I see no reason to think the 2600 or 2400 cards - which have significantly fewer stream processors and are fabbed at 65nm instead of 80nm - will have unreasonable power draw.

      The much more important issue for DAAMIT at t

      • That's the beautiful thing, they don't need an excuse. They just have to not do it. Maybe if they believed it were sensible for them to give stuff away for free, they would. Why not present them with a business case?
        • Maybe if they believed it were sensible for them to give stuff away for free, they would.

          Documenting their hardware has nothing to do with "giving stuff away for free". Their video cards cost quite a bit of money, and AMD is losing sales every day to Intel because they won't release the docs to let people use their hardware. That's both video card *and* processor sales, because Intel graphics only come with Intel processors.

          Releasing programming documentation for hardware is pretty normal. In fact, high e

          • Sure, documenting their hardware means they give operational details away for free. That's not something they do now, so apparently it means something to them. You can argue that they are wrongheaded all you want, but realistically, they don't see that opening things up would make them more money. Your best bet is to attempt to convince them otherwise. Assuming that your principles are so clearly correct when they obviously don't agree isn't really getting you what you want, is it?
            • Sure, documenting their hardware means they give operational details away for free. That's not something they do now, so apparently it means something to them. You can argue that they are wrongheaded all you want, but realistically, they don't see that opening things up would make them more money.

              AMD doesn't provide clean avenues for contact on this sort of thing. There are only three types of action I could take to get them to listen:

              • I could try to get in a position where I was a direct customer, and th
    • by *weasel ( 174362 )
      CPUs and motherboards were doing the same thing. But since volume customers actually buy CPUs and motherboards, there wasn't as much incentive to develop power-hungry chips just for the sake of having the 'fastest'.

      GPUs are still going insane because volume customers don't buy them. And the people who do, frequently go for 'fastest' above all other considerations.

      You'll start to see reasonable GPUs when the GPU/CPU combined cores start shipping, the installed base for GPUs starts to grow past enthusiasts,
    • by Ruie ( 30480 )
      Two words: Anaconda cables [generalcable.com]

      From a practical point of view, 1 kW of power consumption costs about $70/month if left on 24x7 (assuming a high rate of $0.10/kWh). This is about the price of DSL. In energy units, this is the same as 20 gallons of gasoline (if you decide to provide it with a portable generator) or 100kg of coal. Though, of course, the environmentally conscious way is to use nuclear, in which case this is about 100g of uranium (assuming 10% efficiency).
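      A quick script reproduces that arithmetic, using the same assumed electricity rate and typical thermal-energy figures for gasoline and coal:

          KWH_PRICE = 0.10         # $/kWh, the high rate assumed above
          GASOLINE_KWH = 33.7      # approx. thermal kWh per US gallon
          COAL_KWH_PER_KG = 8.0    # approx. thermal kWh per kg

          kwh = 1.0 * 24 * 30      # 1 kW running 24x7 for a month
          print(f"cost: ${kwh * KWH_PRICE:.0f}/month")          # ~$72
          print(f"gasoline: {kwh / GASOLINE_KWH:.0f} gallons")  # ~21
          print(f"coal: {kwh / COAL_KWH_PER_KG:.0f} kg")        # ~90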

    • by ponos ( 122721 )

      The situation is not quite so rosy in the 3D graphics chipset arena, as the review of the Radeon HD 2900 XT a few days ago highlighted ... The Tech Report had to upgrade their PC's PSU to 750W to achieve stability.

      High end cards will always draw much power because pro gamerz will tolerate this sort of thing (remember Voodoo, with external power supplies?). Most consumers will want to stay in the mid to high-end and upgrade frequently. Today this suffices for decent game performance (OK, you don't really

  • by Mish ( 50810 ) on Friday May 18, 2007 @07:04AM (#19176645)
    "Griffin"? "Puma"? Sounds like someone has a taste for mythological animals.

    Sarge: May I introduce, our new light reconnaissance vehicle. It has four inch armor plating, maaag buffer suspension, a mounted machine gunner position, and total seating for three. Gentlemen, this is the M12-LRV! I like to call it the Warthog.
    Simmons: Why 'Warthog' sir?
    Sarge: Because M12-LRV is too hard to say in conversation, son.
    Grif: No, but... why 'Warthog'? I mean, it doesn't really look like a pig...
    Sarge: Say that again?
    Grif: I think it looks more like a puma.
    Sarge: What in sam hell is a puma?
    Simmons: Uh... you mean like the shoe company?
    Grif: No, like a puma. It's a big cat. Like a lion.
    Sarge: You're making that up.
    Grif: I'm telling you, it's a real animal!
    Sarge: Simmons, I want you to poison Grif's next meal.
    Simmons: Yes sir!
    Sarge: Look, see these two tow hooks? They look like tusks. And what kind of animal has tusks?
    Grif: A walrus.
    Sarge: Didn't I just tell you to stop making up animals?
    Sarge: So unless anybody else has any more mythical creatures to suggest as a name for the new vehicle, we're gonna stick with 'the Warthog'. How about it Grif?
    Grif: No sir, no more suggestions.
    Sarge: Are you sure? How 'bout Bigfoot?
    Grif: That's okay.
    Sarge: Unicorn?
    Grif: No really, I'm... I'm cool.
    Sarge: Sasquatch?
    Simmons: Leprechaun?
    Grif: Hey, he doesn't need any help man...
    Sarge: Phoenix!
    Grif: Huh... Christ.
    Sarge: Hey Simmons, what's the name of that Mexican lizard, eats all the goats?
    Simmons: Uh, that would be the Chupacabra, sir!
    Sarge: Hey Grif! Chupathingie, how 'bout that? I like it! Got a ring to it...
    • "Griffin"? "Puma"? Sounds like someone has a taste for mythological animals.

      Uhh ... a puma [wikipedia.org] is a real animal.
      • Re: (Score:3, Funny)

        by Anonymous Coward

        --------> joke

           o
          /|\  <---- you
          / \
        Did you even read the rest of his post? Obviously not.
      • Hmm, rather tempted to edit the Wikipedia entry to talk about the origins of the puma myth... Probably work the Incas into it somewhere. Now shall I make them protectors, or antagonistic creatures? *think think think*
    • by Targon ( 17348 )
      Well, we are talking about a platform, which is very much like a combination of different parts put together by magic. Chimera, Sphinx, Manticore, the list goes on for good code names for a project that involves multiple elements.

      I would like to see AMD work on audio technology as something to add. I still miss the old days of NVIDIA's Soundstorm.
    • "Griffin"? "Puma"? Sounds like someone has a taste for mythological animals.
      Or someone watched too much Family Guy.
      Buy AMD laptops! How dare you disobey me? You will bow to me!
  • by owlstead ( 636356 ) on Friday May 18, 2007 @07:42AM (#19176849)
    As long as the advertisements don't mention weight (and adapter weight, if possible) and battery life, it might not be worth going for the ultra-portable arena. Here in the Netherlands they even run commercials for ultra-thins without noting the weight or the battery life, even when these figures are more than decent. The difference seems to be made by an 80 or 100 GB HDD, which I don't care about one bit.
  • There are a couple of additional technologies in development at this time as well. Code-named "Pea" and "Tear", they are set to be released at the same time as "Griffin".
  • ...as long as Puma Man [wikipedia.org] doesn't fall into the picture in any way, I'm happy.
  • and yet Intel is going to put their 65nm chips on the market in a few months. Does anybody else see the problem here?
    • by trimbo ( 127919 )
      I think you mean 45nm for Intel.

      Yes, AMD is far behind Intel in manufacturing prowess... as is just about everyone.
    • This is a product announcement for something that will launch at 65nm; it is not the first 65nm product. You're probably thinking of 45nm, which, yes, Intel is well ahead of AMD on. It's always been this way, with AMD lagging 6-12 months behind in process technology. So no, no big problem that I see.
