Inside AMD's Phenom Architecture 191
An anonymous reader writes "InformationWeek has uncovered documentation that provides some details amid today's hype over AMD's announcement of its upcoming Phenom quad-core (previously code-named Agena). AMD's 10h architecture will be used in both the desktop Phenom and the Barcelona (Opteron) quads. The architecture supports wider floating-point units, can fully retire three long instructions per cycle, and has virtual machine optimizations. While the design is solid, Intel will still be first to market with 45nm quads (AMD's first will be 65nm). Do you think this architecture will help AMD regain the lead in its multicore battle with Intel?"
What?! (Score:4, Funny)
Re:What?! (Score:5, Funny)
Re: (Score:3, Funny)
Re: (Score:2)
Hey Einstein (Score:3, Informative)
Re: (Score:2)
Re: (Score:3, Funny)
Re:Hey Einstein (Score:5, Funny)
Re: (Score:2)
That's because when you make a clever joke on Slashdot...
Re: (Score:2, Funny)
Sorry what? (Score:5, Insightful)
While I think quad-cores are important for the server rooms, I just don't see the business case for personal use. It'll just be more wasted energy. Now if you could fully shut off cores [not just gate them off] when they're idle, then yeah, hey, bring it on. But so long as they sit there wasting 20W per core or whatever at idle, it's just wasted power.
To get an idea of it, imagine turning on a CF lamp [in addition to the lighting you already have] and leaving it on 24/7. Doesn't that seem just silly? Well, that's what an idling core looks like. It's in addition to the existing processing power and just sits there wasting watts.
Tom
Re:Sorry what? (Score:5, Insightful)
Re: (Score:3, Interesting)
Re:Sorry what? (Score:4, Insightful)
That's the entire answer right there.
Re: (Score:2, Insightful)
Re:Sorry what? (Score:4, Insightful)
Re: (Score:2)
Re: (Score:3, Insightful)
I've been telling people not to bother buying fast processors for years now, unless I know they're heavily into their gaming or media editing. Every pound they don't sp
Re: (Score:3, Interesting)
You can extract CPU info from the
Tom
Re: (Score:2)
Yes.
Usually multicore means a faster processor, so yes, it would help. But do you actually get better performance on 4x1GHz than you would on 1x4GHz? If not, then what you're actually looking for is a faster processor, not necessarily more cores.
Nothing will ever run faster on 4x1GHz than on 1x4GHz. But it might be related to the fact that the former is:
1) Possible (show me a 10GHz processor; 4x2.5GHz, on the other hand, is achievable. Assuming the same IPC, of course)
2) Mo
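The 4x1GHz-vs-1x4GHz point above is essentially Amdahl's law: with equal IPC, the single fast core always gets the full clock-speed factor, while the quad only approaches it as the workload's serial fraction shrinks. A minimal sketch (the fractions are illustrative, not from the thread):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Speedup of a workload on `cores` equal-speed cores,
    where `parallel_fraction` of the work can be split up."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A perfectly parallel job gets the full factor of 4...
print(amdahl_speedup(1.0, 4))   # 4.0
# ...but with 20% serial work, 4x1GHz falls well short of 1x4GHz,
# which always wins by its clock-speed factor of 4 (same IPC).
print(amdahl_speedup(0.8, 4))   # 2.5
```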
Re: (Score:3, Interesting)
But the Core 2 Duo is easily twice as fast at rendering AND is far superior when previewing video with lots of color correction or lots of layers of generated media (movie credits or text overlays are particularly harsh because of all the alpha blending for each source). The P4 system struggles t
Re: (Score:2)
For example, placing multiple layers of overlay text onto color-corrected and contrast-adjusted source footage will bring a system to a crawl, especially if 3rd-party plugins are used, and certainly if those plugins are not capable of using multiple cores (looking at Magic Bullet there). Just previewing that complicated footage in Vegas will peg a CPU because of all the blending taking plac
Re: (Score:2)
I've heard people saying that for years; explicit threading has been around for a long time now, and even in the server space I still see pretty much no one doing it "well". If you think all the desktop people are going to magically "get it" in the next 5, then good luck ... personally my next desktop machine is going to have two cores (mainly so that when one task goes nuts, it onl
Re:Sorry what? (Score:4, Informative)
Re:Sorry what? (Score:5, Informative)
So there are plenty of workstation uses for a quad core, but I agree that at the moment it's overkill for a home desktop.
less power (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Mods, pay attention (Score:2)
Re:Sorry what? (Score:5, Informative)
AMD's Cool'n'Quiet tech will shut down individual cores when you are not using them. I believe this is all new for Barcelona. It idles down cores when you are not using them fully, and it shuts off parts of cores that you aren't using (e.g. the FPU if you are only running integer instructions).
Variable voltages, variable MHz's (Score:2)
AMD's Cool'n'Quiet tech will shut down individual cores when you are not using them. I believe this is all new for Barcelona. It idles down cores when you are not using them fully, and it shuts off parts of cores that you aren't using (e.g. the FPU if you are only running integer instructions).
According to the last picture [imageID=9] in the Image Gallery, different cores on the same chip can run at different voltages and different clock speeds:
http://www.informationweek.com/galleries/showImage
Re: (Score:2)
I should point out that Intel's Core 2 Duos can do this already.
Tom
Re: (Score:2)
Uh... (Score:3, Informative)
Uh, doesn't "make -j 3" give you a good speedup? I'd imagine multi-core being great for development, at least for compiled languages.
Re: (Score:2)
I had a 2P dual-core Opteron 2.6GHz box as my workstation for several months. To be honest, I couldn't really find a legitimate use for it. And I was running Gentoo and doing a lot of my own OSS development [re: builds].
man make
Re: (Score:2)
My point though is that unless you're doing builds 24/7, it's just not worth it. That Opteron box can build LTC in 8 seconds. A decent dual core can do it in 14 seconds. A single core can do it in ~30 seconds. The single-core box also uses a lot less power.
I think a reasonable tradeoff for developers is a dual-core box. But for mo
Re: (Score:3, Informative)
Re: (Score:2)
Re: (Score:2)
Well, that's where AMD shines -- the current 65nm X2s idle at under 4W, and that's for 2 cores... So, each idles at 2W. Yeah, they are still wasting power, but not nearly as much as you make it sound. That's 17 kWh per year if you run a core *all the time*, or about $1 a year in electricity.
Mind you, Intel, idling at 3-4 times that power, is still "free" for even your high-end home user with a couple of computers ru
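The parent's figures check out; here's the back-of-the-envelope arithmetic, assuming a hypothetical electricity rate of $0.06/kWh (the rate is my assumption, not from the post):

```python
IDLE_WATTS = 2            # one 65nm X2 core at idle, per the parent post
HOURS_PER_YEAR = 24 * 365
RATE = 0.06               # assumed $/kWh; substitute your local rate

kwh_per_year = IDLE_WATTS * HOURS_PER_YEAR / 1000
cost = kwh_per_year * RATE
print(f"{kwh_per_year:.1f} kWh/year, ${cost:.2f}/year")  # 17.5 kWh/year, $1.05/year
```

At the parent's claimed 20W per idle core, the same arithmetic gives ten times that, which is where the "it adds up" argument starts to bite.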
Re: (Score:2)
Re: (Score:2)
Tom
Re:Sorry what? (Score:5, Funny)
Re: (Score:2)
Multiply that by the tens of millions of cores running around out there, and each individual 10W here and 10W there that is wasted adds up.
Also, if you read the slides you'd see that each core can clock independently, but not at an independent voltage. So the savings aren't as good as they could be anyway. And at any rate, better would be cutting cores off co
Re: (Score:2)
By your logic we could excuse any and all waste of natural resources, as no one instance would amount to much of the entire picture. It's OK for me to speed in my 10MPG SUV, cuz I'm just one person. And I should own a 5000sqft house just for myself, cuz my one house doesn't contribute that much, etc...
Point is, the vast majority of people don't need anywhere near as much computing power as they have. And where w
Why the fuss over 45nm? (Score:5, Insightful)
Sure, the 45nm process has great potential for better performance and higher efficiency, just like faster clock speeds had great potential - until AMD made a better architecture and achieved better performance at a lower clock speed than Intel's offerings at the time.
Let's wait and see how it really performs before passing judgement.
=Smidge=
Re:Why the fuss over 45nm? (Score:5, Funny)
Re: (Score:2)
Re: (Score:3, Interesting)
Indeed, let's wait for the benchmarks. I would like some more real-world and 64-bit benchmarks: most recent reviews seem to have studiously avoided those in favor of synthetic, 32-bit-only benchmarks that are not very representative and are easily skewed by processor-specific optimizations.
And I'm not sure going to a 45nm process will allow Intel to step back ahead. It seems process improvements have been yielding diminishing returns in performance-related areas. Transistor density will go up, though, so
Re: (Score:2)
Support? (Score:2, Interesting)
Re: (Score:2, Insightful)
Mmmmmmmm....
(-j6 instead of -j4 in an effort to counter I/O latencies... Actually that'd be an interesting benchmark: figure out what the optimum level of parallelism is. Too little and processors will sit idle; too much and context switches will become an issue.)
Re:Support? (Score:5, Informative)
Prevailing wisdom and personal experience suggest using "-j N+1" for N CPUs. I have a 4-CPU setup at home (dual dual-core Opterons). Here are approximate compile times for jzIntv + SDK-1600, [spatula-city.org] which altogether comprise about 80,000 lines of source:
Now keep in mind, everything was in cache, so disk activity didn't factor in much at all. But, for a typical disk, I imagine the difference between N+1 and N+2 to be largely a wash. N+1 seems to be the sweet spot if the build isn't competing with anything else. Larger increments might make sense if the build is competing with other tasks (large background batch jobs) or highly latent disks (NFS, etc). But for a local build on a personal workstation? N+1.
--Joe
Re: (Score:2)
Re: (Score:2)
I would have expected N to be the right choice, not N+1..
Re: (Score:3, Informative)
Happy to. At various points, one or more of the processes will be blocked in I/O. With N+1 tasks running, there's a higher likelihood that all N CPUs will be busy, despite the occasional I/O waits in individual processes. With only N tasks running, an I/O wait directly translates into an idle CPU during that period.
--Joe
Re: (Score:2)
Oh, and I should add, as you add more processes, you spend more time context switching and you pollute the caches more, so it's a tradeoff. That's why performance falls as you go to higher and higher parallelism. At very high parallelism, you can go off a cliff if you exceed the available system RAM. That's why kernel devs like to do "make -j" on the kernel as a VM stress test.
--Joe
Re: (Score:3, Informative)
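Joe's I/O-wait argument can be sketched as a toy utilization model: with J jobs that each spend a fraction f of their time blocked on I/O, roughly J*(1-f) jobs are runnable at any instant, so "-j N" leaves a CPU idle whenever any job blocks. The numbers are illustrative assumptions, and the model deliberately ignores the context-switch and cache-pollution costs he mentions:

```python
def expected_utilization(jobs, cpus, io_fraction):
    """Average fraction of CPU capacity kept busy when each job is
    blocked on I/O `io_fraction` of the time (toy model, no queueing)."""
    runnable = jobs * (1.0 - io_fraction)
    return min(1.0, runnable / cpus)

# 4 CPUs, each job blocked on I/O 10% of the time:
print(expected_utilization(4, 4, 0.10))  # 0.9  -> "-j 4" leaves CPUs idle
print(expected_utilization(5, 4, 0.10))  # 1.0  -> "-j 5" (N+1) fills the gaps
```

The extra job soaks up the idle time left by I/O waits, which is why N+1 tends to beat N while N+2 and beyond mostly just add switching overhead.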
Re: (Score:2)
Re: (Score:3, Interesting)
Maya 3D
Or any other 3d rendering software where every CPU cycle is used to the last drop.
But other than that I can't think of anything off the top of my head. Multiple cores are very important to these types of apps, though. If it's the difference between 12 and 6 hours of waiting for the project to render, people will go with the 6 hours.
Re: (Score:3, Insightful)
Re: (Score:2)
Quad core becomes a little tricky. When one spare cpu can run every background task wi
Re: (Score:2)
Re: (Score:2)
but are there really that many apps as of yet that can take advantage of it?
Desktop apps that can leverage quad-core... Hmm, let's see:
Intel released its first Quad almost 6 months ago and by all accounts there are plenty of customers. So, either you're correct and these buyers are morons making ~$300 [1] mistakes or you're wrong and people with the dough to pay for it actually need [2] it.
Which do you think it is?
Re: (Score:2)
Scalability, 64-bit, and FPU (Score:4, Interesting)
Core 2 Duo? (Score:4, Funny)
Colour me disappointed...
AMD IS Doomed to Always Be a Follower Unless... (Score:2, Offtopic)
Re: (Score:2, Interesting)
A smaller firm operating on tighter margins like AMD could easily go belly-up trying to break out with a new CPU microarchitecture. At least Intel could afford all of Itanic's f
Re: (Score:2)
This probably has more to do with the fact that IA64 was garbage than any inherent attachment to x86. Microsoft even went to great lengths to support it, which is much more than you can say for SPARC or POWER. There's plenty of room, especially in the *nix server market, for processors unrelated to x86. With Linux or the BSDs, all you really need to do is sen
Re: (Score:2, Informative)
Re: (Score:2)
They are following because they are barely making a profit, the last I heard. Why? Because they have to drastically cut prices to compete head-on with Intel. With a new architecture and a new market niche (mostly embedded systems and mission-critical systems), they would leave Intel in the dirt. The desktop market would follow soon afterwards, when the industry comes to its senses and realizes that it has been doing it wr
Re: (Score:2)
There's a semantic distinction between "following" and "trailing". "Following" is doing whatever your competitor is doing, after they have done it. "Trailing" is merely being behind, as in a race or other competition. AMD may be trailing due to being unprofitable and losing some market share; however, this does not indicate that they are following Intel.
It seems you want AMD to make a completely new architecture (microarch or ISA?
Re: (Score:2)
Itanium is not revolutionary. Not even IBM's Cell processor is revolutionary. All current processors are based on and optimized for the algorithm. That's 150-year-old technology! There is a better way to do things. Sooner or later we will be forced to break with the past. The early bird gets the worm.
Re: (Score:2)
Yeah, but this ain't no ordinary worm. This one will make the first computer revolution pale in comparison.
Re: (Score:2)
Is the current state of COSA a web page with some ideas, or are tools and a VM available so that people can actually play with it?
Re: (Score:2)
Unfortunately, nothing concrete is available yet. A few people are currently discussing the possibility of creating a free COSA VM or even a COSA OS on SourceForge.
Re: (Score:2)
The Actor model is what you ask for.
Re: (Score:2, Insightful)
Now, asynchronous dataflow (with the appropriate support for dealing with complex data structures) might actually be helpful to slash some of the comple
Re: (Score:2)
What's new? (Score:2)
Yeah, but when can I buy quadcores from AMD? (Score:2)
Re: (Score:2, Informative)
Re:Begging the question (Score:5, Informative)
Tom
Re: (Score:2)
Re: (Score:2)
Re:Begging the question (Score:5, Informative)
Re: (Score:2)
So you were much better off waiting until Core 2 regardless, if you wanted an Intel dual core anyway.
Re:Begging the question (Score:5, Funny)
Craptacular indeed (great new word) - the only thing craptacularer was the Celeron D they had out at the same time, which despite the name was not dual-core. Very amusing though, watching the 'tards with enough knowledge to be dangerous and who wanted a cheap PC,
"That one's a 'D', that's got 2 processors, that makes the internet faster"
Re: (Score:2)
Re:Begging the question (Score:4, Informative)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The Athlon X2 was superior to the Pentium D. It wasn't until Core 2 Duo that Intel took the lead in desktop CPUs.
I think you mean "regained". Computers did exist before 2003.
Re: (Score:3, Informative)
They have a good chance. For one, their market share is rather higher than you make it out to be: about 20% of the 80x86 market vs Intel's 80%. Also, the computer manufacturers have an interest in keeping the competition between Intel and AMD alive. Unless they behave irrationally, they will help AMD to fully break the monopoly.
But the main thing that is pending for AMD is the antitrust lawsuit. Assuming there will be a just judgment, which is not a given with the US justice system led by the likes of Alb
Re: (Score:2)
That's odd. When I built my current PC in 2003, I looked seriously at using an AMD CPU rather than Intel, and discovered that the 'equivalent' AMD CPU was not only slower than the Intel CPU but more expensive too. And, unlike someone who's never used an Intel CPU in their PC, I have no aversion to using whichever one is better.
The simple fact is that AMD had a brief period where they were technically better than
That's ONE theory, I guess.... (Score:2)
Intel should have realized, from how U.S. govt. treated Microsoft and others, that they weren't in NEED of someone like AMD to cut deeply into their sales for a while with truly competitive products.
I'd say your scenario would hold much more merit if govt. had already broken up Microsoft into separate divisions or something....
The fact is, AMD has occasionally built a very comparable, yet cheaper alternative to Intel's offerings. (Remember the success of AMD
Re: (Score:2)