AMD Hardware

Inside AMD's Phenom Architecture (191 comments)

An anonymous reader writes "InformationWeek has uncovered documentation that adds some detail to today's hype over AMD's announcement of its upcoming Phenom quad-core (previously code-named Agena). AMD's 10h architecture will be used in both the desktop Phenom and the Barcelona (Opteron) quads. The architecture supports wider floating-point units, can fully retire three long instructions per cycle, and has virtual-machine optimizations. While the design is solid, Intel will still be first to market with 45nm quads (the first AMD quads will be 65nm). Do you think this architecture will help AMD regain the lead in its multicore battle with Intel?"
This discussion has been archived. No new comments can be posted.

  • Sorry what? (Score:5, Insightful)

    by tomstdenis ( 446163 ) <tomstdenis@gmail.com> on Monday May 14, 2007 @11:45AM (#19114959) Homepage
    I had a 2P dual-core Opteron 2.6GHz box as my workstation for several months. To be honest, I couldn't really find a legitimate use for it, and that was while running Gentoo and doing a lot of my own OSS development [re: builds].

    While I think quad-cores are important for server rooms, I just don't see the business case for personal use. It'll just be more wasted energy. Now, if you could fully shut off cores [not just gate them off] when they're idle, then yeah, bring it on. But so long as they sit there wasting 20W per core or whatever at idle, it's just wasted power.

    To get an idea of it, imagine turning on a CF lamp [in addition to the lighting you already have] and leaving it on 24/7. Doesn't that seem silly? Well, that's what an idling core looks like: it sits on top of the processing power you're already using and just wastes watts.
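
    A rough back-of-the-envelope in Python, assuming that 20W idle figure and an illustrative $0.10/kWh electricity rate (both round numbers, not measurements):

        # Rough yearly cost of one idling core, assuming a 20 W draw 24/7
        # and $0.10/kWh (both figures illustrative, not measured).
        idle_watts = 20
        kwh_per_year = idle_watts * 24 * 365 / 1000      # ~175 kWh
        cost_per_year = kwh_per_year * 0.10              # ~$17.50
        print(f"{kwh_per_year:.0f} kWh/year, about ${cost_per_year:.2f}/year per idle core")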

    Tom
  • by Smidge204 ( 605297 ) on Monday May 14, 2007 @11:46AM (#19114991) Journal
    Ultimately, it's performance that makes a successful product, not gigahertz or nanometers.

    Sure, the 45nm process has great potential for better performance and higher efficiency, just like faster clock speeds had great potential - until AMD made a better architecture and achieved better performance at a lower clock speed than Intel's offerings at the time.

    Let's wait and see how it really performs before passing judgement.
    =Smidge=
  • Re:Sorry what? (Score:5, Insightful)

    by LurkerXXX ( 667952 ) on Monday May 14, 2007 @11:50AM (#19115043)
    Certain apps get a big boost from quad cores; lots of others don't. Some of those apps aren't for servers. For example, if you happen to do a ton of video editing, a quad core might be a good choice. I'll agree that for most of us it's silly on the desktop right now. That won't necessarily be true in a few years, when a lot more apps are written to take advantage of multithreading.
  • Re:Support? (Score:2, Insightful)

    by EvanED ( 569694 ) <evaned@gmail.com> on Monday May 14, 2007 @11:56AM (#19115173)
    make -j6.

    Mmmmmmmm....

    (-j6 instead of -j4 in an effort to counter I/O latencies... Actually, that'd be an interesting benchmark: figure out what the optimum level of parallelism is. Too little and processors sit idle; too much and context switches become an issue.)
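
    One way to actually run that experiment, as a rough sketch (it assumes the current directory holds a source tree with a Makefile and a working `clean` target; the sweet spot will vary with core count and I/O):

        # Sketch: time `make -jN` for several N to find the parallelism sweet spot.
        # Assumes a Makefile with a working `clean` target in the current directory.
        import subprocess
        import time

        for jobs in (1, 2, 4, 6, 8):
            subprocess.run(["make", "clean"], check=True, capture_output=True)
            start = time.time()
            subprocess.run(["make", f"-j{jobs}"], check=True, capture_output=True)
            print(f"-j{jobs}: {time.time() - start:.1f} s")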
  • Re:Support? (Score:1, Insightful)

    by Anonymous Coward on Monday May 14, 2007 @11:58AM (#19115193)
    Don't you get it? Let's say you have four processor-hungry applications that aren't multi-threaded. Cool! One runs on each core...

    We've had this for YEARS. Literally, 20 years.
  • less power (Score:5, Insightful)

    by twistedcubic ( 577194 ) on Monday May 14, 2007 @12:16PM (#19115487)
    Actually, I just got a 65W Athlon X2 4600+ from Newegg, which uses less power than my current six-year-old Athlon XP 1800+. The motherboard I ordered (ECS with the ATI 690G chipset) is supposedly also energy-efficient. I guess I could save $60 by getting a single core, but almost all single-core Athlons are rated at more than 65W. Why buy a single core when it costs more in the long term and is slower when multitasking?
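
    The long-term math is easy to sketch; the 89W single-core rating and $0.10/kWh rate below are assumptions for illustration, not quotes:

        # Payback estimate for the ~$60 dual-core premium, assuming an 89 W rated
        # single-core alternative and $0.10/kWh (both assumptions, not quotes).
        premium_dollars = 60.0
        watts_saved = 89 - 65                              # comparing rated wattages
        kwh_per_year = watts_saved * 24 * 365 / 1000       # ~210 kWh
        dollars_per_year = kwh_per_year * 0.10             # ~$21
        print(f"~${dollars_per_year:.0f}/year saved, payback in ~{premium_dollars / dollars_per_year:.1f} years")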
  • Re:Sorry what? (Score:4, Insightful)

    by LurkerXXX ( 667952 ) on Monday May 14, 2007 @12:31PM (#19115745)
    That being said, it's a lot easier to get a 10 GHz computer out of four 2.5 GHz CPUs than it is to make a single 10 GHz CPU.

    That's the entire answer right there.
  • Re:Support? (Score:3, Insightful)

    by QuasiEvil ( 74356 ) on Monday May 14, 2007 @12:56PM (#19116257)
    So I suppose whatever OS you're using only has one thread/process running at a time? I've never understood the argument that multi-core doesn't benefit the desktop user. As I look at my machine right now, I have two development environments going (one actually in debug), four browser windows, an email client, an IM client, various background junk (virus scanner, 802.1X client for the wireless), and of course the OS itself - XP. None of those needs a more powerful proc, but when they're all grabbing for CPU time it's nice to have two cores for them to run on.

    At home I'm often converting images from RAW in the background while doing postprocessing on them in the foreground. RAW->JPEG conversion is CPU-intensive, and it's nice that it doesn't bring my system to its knees: I can continue with my work while the converter maxes out one of the cores in the background.

    I've had dual procs since 1996, and would never, ever, ever go back. It's so nice to not have everything stall when a background job starts hogging the CPU.
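
    That background-conversion pattern is easy to sketch in Python; the convert() body below is a stand-in for whatever RAW converter you actually use, and the *.raw glob is just an example pattern:

        # Sketch: push CPU-heavy RAW->JPEG conversion onto spare cores so the
        # foreground stays responsive. convert() is a stand-in, not a real converter.
        from concurrent.futures import ProcessPoolExecutor
        import glob
        import os

        def convert(path):
            out = os.path.splitext(path)[0] + ".jpg"
            # ...call whatever RAW converter you actually use here, writing `out`...
            return out

        if __name__ == "__main__":
            workers = max(1, (os.cpu_count() or 2) - 1)   # leave one core for the foreground
            with ProcessPoolExecutor(max_workers=workers) as pool:
                for finished in pool.map(convert, glob.glob("*.raw")):
                    print("converted:", finished)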
  • Re:Sorry what? (Score:2, Insightful)

    by Short Circuit ( 52384 ) * <mikemol@gmail.com> on Monday May 14, 2007 @01:01PM (#19116361) Homepage Journal
    Well, there's the copious amount of per-core cache. That helps. Then there's the fact that it's a hell of a lot cheaper to make four parts that run at 2 GHz than one part that runs at 8 GHz. (Like, the latter can't even be done right now.)
  • Re:Sorry what? (Score:4, Insightful)

    by somersault ( 912633 ) on Monday May 14, 2007 @01:09PM (#19116551) Homepage Journal
    Also, your computer tends to be doing quite a lot in the background (especially with lots of third-party crapware/virus scanners/firewalls loaded onto it) rather than just running whatever app you currently want to be using. It's nice to get the full potential of one core in the app you do want to use while leaving another core to handle background services, though I don't know whether Windows automatically schedules processor time that way, and I've never tried splitting my tasks across my two cores manually. I guess my system is nippier than my old single-core one, though you tend not to notice the stuff that *isn't* there (ever got a shiny new graphics card and just gone "oh... everything's the same, but without the slowdowns"? It can be kind of anticlimactic!)
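
    For what it's worth, splitting tasks across cores by hand is doable; here's a minimal sketch for Linux (it relies on os.sched_setaffinity, which isn't available on Windows):

        # Minimal sketch: restrict the current process to CPU 0 on Linux,
        # leaving the other core(s) free for background services.
        import os

        os.sched_setaffinity(0, {0})          # pid 0 means "this process"
        print("now limited to cores:", os.sched_getaffinity(0))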
  • by agg-1 ( 916902 ) on Monday May 14, 2007 @05:36PM (#19121661)
    I hate to break the news to you, but your proposed "silver bullet" is hardly something new. Synchronous dataflow has been with us at least since the 1970s. It's great for designing hardware, DSP software and other simple kinds of algorithms, but as a panacea for all the diseases of the software world? I wish I had some of the stuff that you're smoking. :)

    Now, asynchronous dataflow (with the appropriate support for dealing with complex data structures) might actually be helpful to slash some of the complexities of developing efficient software for massively parallel computers, and in fact there has been renewed interest in such techniques since the 1990s, especially in functional programming circles (see, e.g., the work on "functional reactive programming" by Yale's Haskell group and others, http://www.haskell.org/frp/ [haskell.org]).

    But, as others have already pointed out, there are still billions of lines of "legacy" code to support if you want to go to market with such a system. So even if modern dataflow models prove useful for multicore architectures, they will still be confined to special niches for quite some time, IMHO.
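
    For a flavour of what asynchronous dataflow looks like outside the FRP literature, here's a toy sketch: two stages wired together with queues, each firing whenever an input token arrives. Purely illustrative, not a real framework:

        # Toy asynchronous dataflow: independent stages connected by queues.
        import queue
        import threading

        def stage(fn, inq, outq):
            while True:
                item = inq.get()
                if item is None:                  # sentinel: shut down, pass it along
                    if outq is not None:
                        outq.put(None)
                    return
                result = fn(item)
                if outq is not None:
                    outq.put(result)
                else:
                    print("sink got:", result)

        q1, q2 = queue.Queue(), queue.Queue()
        threading.Thread(target=stage, args=(lambda x: x * x, q1, q2)).start()
        threading.Thread(target=stage, args=(lambda x: x + 1, q2, None)).start()

        for token in range(5):
            q1.put(token)
        q1.put(None)                              # flush and shut the pipeline down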
  • Re:Sorry what? (Score:3, Insightful)

    by MrNemesis ( 587188 ) on Tuesday May 15, 2007 @12:26PM (#19131973) Homepage Journal
    It'd be nice if things worked like that, but 90% of the time you're bottlenecked on I/O anyway (usually swapping due to insufficient memory to run all those craplets), and you're hard-pressed to take advantage of one core, let alone four of the things. Of course, once everyone has their 1GB+ of RAM, then SMP might get a better chance to shine...

    I've been telling people not to bother buying fast processors for years now, unless I know they're heavily into gaming or media editing. Every pound they don't spend on that, I get them to sink into aftermarket RAM, and they soon find their 1GHz/2GB machine at home is "faster" than the 3GHz/512MB machine at work. Only then do they understand that I wasn't mad... :D
