Inside AMD's Phenom Architecture
An anonymous reader writes "InformationWeek has uncovered documentation that provides some details behind today's hype over AMD's announcement of its upcoming Phenom quad-core (previously code-named Agena). AMD's 10h architecture will be used in both the desktop Phenom and the Barcelona (Opteron) quads. The architecture supports wider floating-point units, can fully retire three long instructions per cycle, and has virtual machine optimizations. While the design is solid, Intel will still be first to market with 45nm quads (AMD's first will be 65nm). Do you think this architecture will help AMD regain the lead in its multicore battle with Intel?"
Sorry what? (Score:5, Insightful)
While I think quad-cores are important for the server room, I just don't see the business case for personal use. It'll just be more wasted energy. Now if you could fully shut off cores [not just gate them off] when they're idle, then yeah, hey, bring it on. But so long as they sit there wasting 20W per core or whatever at idle, it's just wasted power.
To get an idea of it, imagine turning on a CF lamp [in addition to the lighting you already have] and leaving it on 24/7. Doesn't that seem just silly? Well, that's what an idling core looks like: it sits there on top of the processing power you already have, wasting watts.
Tom
Why the fuss over 45nm? (Score:5, Insightful)
Sure, the 45nm process has great potential for better performance and higher efficiency, just like faster clock speeds had great potential - until AMD made a better architecture and achieved better performance at a lower clock speed than Intel's offerings at the time.
Let's wait and see how it really performs before passing judgement.
=Smidge=
Re:Support? (Score:2, Insightful)
Mmmmmmmm....
(-j6 instead of -j4, in an effort to counter I/O latencies... Actually, that'd be an interesting benchmark: figure out what the optimum level of parallelism is. Too little and processors sit idle; too much and context switching becomes an issue.)
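One way to actually measure it: time a clean rebuild at several -j levels and see where the curve bottoms out. A minimal sketch in Haskell, assuming a project where "make clean" and "make" are the right targets (the job counts are arbitrary, and you'd want several runs per setting to average out disk-cache effects):

import System.Process (callCommand)
import Data.Time.Clock (getCurrentTime, diffUTCTime)
import Control.Monad (forM_)

-- Clean, rebuild with the given number of jobs, and report wall-clock time.
timeBuild :: Int -> IO ()
timeBuild jobs = do
  callCommand "make clean > /dev/null 2>&1"
  start <- getCurrentTime
  callCommand ("make -j" ++ show jobs ++ " > /dev/null 2>&1")
  end   <- getCurrentTime
  putStrLn ("-j" ++ show jobs ++ ": " ++ show (diffUTCTime end start))

main :: IO ()
main = forM_ [1, 2, 4, 6, 8, 12] timeBuild

On a quad-core the sweet spot usually lands a bit above the core count, precisely because some jobs stall on I/O.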
Re:Support? (Score:1, Insightful)
We've had this for YEARS. Literally, 20 years.
less power (Score:5, Insightful)
Re:Sorry what? (Score:4, Insightful)
That's the entire answer right there.
Re:Support? (Score:3, Insightful)
At home I'm often converting images from RAW in the background while doing postprocessing on them in the foreground. RAW->JPEG conversion is CPU intensive, and it's nice that it doesn't bring my system to its knees. I can carry on with my work while the converter is maxing out one of the cores in the background.
I've had dual procs since 1996, and would never, ever, ever go back. It's so nice to not have everything stall when a background job starts hogging the CPU.
Re:AMD IS Doomed to Always Be a Follower Unless... (Score:2, Insightful)
Now, asynchronous dataflow (with the appropriate support for dealing with complex data structures) might actually help cut away some of the complexity of developing efficient software for massively parallel computers. In fact, there has been renewed interest in such techniques since the 1990s, especially in functional programming circles (see, e.g., the work on "functional reactive programming" by Yale's Haskell group and others, http://www.haskell.org/frp/ [haskell.org]).
But, as others have already pointed out, there are still billions of lines of "legacy" code to support if you want to go to market with such a system. So even if modern dataflow models prove useful for multicore architectures, they will still be confined to special niches for quite some time, IMHO.
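For a feel of what the dataflow style buys you, here's a toy sketch in plain Haskell (no FRP library, and nothing like the real systems at the link above): a signal is just a lazy stream of per-tick values, and every node is a pure stream transformer, which is exactly what lets a scheduler run independent nodes on separate cores.

-- Toy dataflow sketch: a Signal is the value of something at each tick.
type Signal a = [a]

-- A stateless dataflow node: combine two inputs pointwise.
liftS2 :: (a -> b -> c) -> Signal a -> Signal b -> Signal c
liftS2 = zipWith

-- A stateful node: running accumulation of its input.
accumS :: Num a => Signal a -> Signal a
accumS = scanl1 (+)

main :: IO ()
main = do
  let load  = [3, 1, 4, 1, 5] :: Signal Int   -- work arriving per tick (made-up numbers)
      done  = [2, 2, 2, 2, 2]                 -- work completed per tick
      queue = accumS (liftS2 (-) load done)   -- backlog over time
  print queue                                  -- [1,0,2,1,4]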
Re:Sorry what? (Score:3, Insightful)
I've been telling people for years now not to bother buying fast processors unless I know they're heavily into gaming or media editing. Every pound they don't spend on that, I get them to sink into aftermarket RAM, and they soon find their 1GHz/2GB machine at home is "faster" than the 3GHz/512MB machine at work. Only then do they understand that I wasn't mad...