Dual-Core Shoot Out - Intel vs. AMD

sebFlyte writes "The Intel vs AMD battle of the benchmarks continues. ZDNet is running its rather comprehensive-looking guide to a side-by-side test of Intel and AMD's dual-core desktop chips, the Athlon 64 X2 3800+ and the Pentium D 820. They look at pure performance, as well as the difference it makes to apps you might use on the desktop. In the end, AMD comes out as the winner. From the article: 'AMD currently offers the most attractive dual core option. The Athlon 64 X2 3800+ may cost $87 more than its Intel counterpart, the Pentium D 820, but the AMD chip is a much better performer. It also uses considerably less power.'"
  • Really? (Score:4, Insightful)

    by NVP_Radical_Dreamer ( 925080 ) on Friday November 04, 2005 @04:45PM (#13952971) Homepage
    It costs almost $100 more and is faster? What are the odds...
  • by Work Account ( 900793 ) on Friday November 04, 2005 @04:51PM (#13953047) Journal
    That is why I told everyone about the best chip overall based on VALUE.

    You can get a brand new chip that is almost as fast as any other chip in the world, but at the PERFECT sweet spot in terms of price/performance.

    Information is in my first post above, which ironically one person modded off-topic in a thread about the best consumer processors: http://hardware.slashdot.org/comments.pl?sid=167355&cid=13952965 [slashdot.org]
  • by panth0r ( 722550 ) <panth0r@gmail.com> on Friday November 04, 2005 @04:59PM (#13953129) Homepage
    Granted, it'd be cool to have a greater amount of choice when choosing a platform, but the IA32 ISA is pretty well locked in as the standard, and looking into the future with the "Apple Switch," the standard is going to stick. Anyway, what's wrong with the IA32/x86 architecture? I can think of several reasons, but seeing as how embedded the architecture is in society and business, a change to a new architecture (preferably one that doesn't have its roots in really old technology, like the x86 does) would take years, if not decades. Just a little warning: IA32 is here to stay, and if you're a developer or EE, then it seems preferable to stick with the current standard.
  • by Pharmboy ( 216950 ) on Friday November 04, 2005 @05:06PM (#13953189) Journal
    Personally, I don't use dual CPUs for servers because they are faster. As you say, they are not, kernel-wise anyway. Dual CPUs do offer higher availability, and the ability to crank the crud out of one CPU (compiling, etc.) while you can still get stuff done with the other. I make VERY limited use of dual CPUs on the desktop, but even then, I notice a difference in my ability to switch back and forth and start new apps while I am doing very heavy tasks.

    On the server side, if a single-threaded process goes haywire, instead of locking the box up, I can still log in and kill the process, no biggie. I have accidentally "infinite looped" myself to death on single-CPU boxes and had to hard boot them, whereas on the dual, that wouldn't be an issue. That is just my experience, but I've been using dual CPUs on several servers for over 6 years now. I would rather have dual 1GHz than a single 2.5GHz any day.
  • by Knight Thrasher ( 766792 ) * on Friday November 04, 2005 @05:06PM (#13953194) Journal
    HL2 ran fine at 800x600 a year ago on a $200 build I made with an AMD Athlon 2500+ Barton, 512MB of DDR333, and a GeForce 5700LE.

    A benefit that's not been discussed so far is that all the Intel or AMD backers who run out and buy handfuls of whichever they prefer rapidly drive down the price of technology that's not absolutely brand new.

    AMD's M2 release in the spring will drop the AMD X2s in price, and the s939 single-core 64-bit processors even lower. Wait until you can secure yourself true 64-bit goodness for less than $200. Right now it's hovering around that mark, just above it. When it dips, we all profit.

  • by Xonstein ( 927931 ) on Friday November 04, 2005 @05:10PM (#13953237)
    "The Intel vs AMD battle of the benchmarks continues."

    AMD has pretty much trounced Intel's performance at every desktop and server price point for the last 2 years at least, so who cares anymore? Even Dell has started carrying AMD CPU parts:

    http://tinyurl.com/c57po [tinyurl.com]

    Dell is pretty much singlehandedly holding up Intel on the desktop, as they can drive the overall system price down on volume despite the higher-priced parts.

    If Intel's little Israel division hadn't come up with the Pentium M chips, they'd be even worse off.
  • Re:Pick two (Score:3, Insightful)

    by Ignignot ( 782335 ) on Friday November 04, 2005 @05:11PM (#13953251) Journal
    Add to that "fails gracefully"

    If you have ever seen the videos of people taking the heat sinks off Intel chips while running Quake 3, and the chips surviving, then you would understand where the chips stand in this category.
  • Re:Backwards? (Score:5, Insightful)

    by Zathrus ( 232140 ) on Friday November 04, 2005 @05:18PM (#13953313) Homepage
    The AMD chip is more expensive and uses less power than the Intel chip? Isn't it usually the other way around?

    AMD chips have been the "low power" leaders for quite some time now -- at least 2 years. Pretty much since the introduction of the Athlon XP models.

    As for the price difference -- yes, the Athlon 64 X2 chips are more expensive than their Intel "counterparts", but if you look at the benchmarks or the design you'll see why -- the Intel chips are a rush job and poorly (but cheaply) designed. They don't come anywhere near the performance of the AMD design, though, and Intel has already stated that this won't change until mid-2006.

    Trust me, Dell is screaming bloody murder over this -- since the superiority of the Athlon 64 X2 chips is completely undeniable, more and more of the server market is now shifting to AMD. And Dell is still purely Intel. Thing is, even if Dell were willing to break their allegiance, it's doubtful that AMD could fulfill the quantities that Dell would want. They just don't have the fab capacity. And unless that changes, there's little reason for Dell to anger Intel (and lose some of the vast discounts that they get from Intel in the process).
  • by Waffle Iron ( 339739 ) on Friday November 04, 2005 @05:31PM (#13953414)
    I am _so_ sick of the x86 architecture

    Why? Unless you write your code in assembler (or you have some kind of irrational preference for a particular endianness), you'll never tell the difference between instruction set architectures. The only user-observable or programmer-observable difference between CPUs is speed, and x86 is faster.
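    A quick, hedged illustration of the endianness caveat (Python is chosen here purely for illustration; it is not something the poster mentioned): byte order only becomes visible when you serialize values to raw bytes, and even then the standard library lets you request either order explicitly.

        # Endianness only shows up when packing values into raw bytes;
        # ordinary arithmetic and data structures look the same on any ISA.
        import struct
        import sys

        value = 0x12345678
        print(sys.byteorder)                   # 'little' on x86
        print(struct.pack('<I', value).hex())  # force little-endian: 78563412
        print(struct.pack('>I', value).hex())  # force big-endian:    12345678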

  • Re:Really? (Score:1, Insightful)

    by Anonymous Coward on Friday November 04, 2005 @05:36PM (#13953467)
    Well, that extra 50W the Intel pulls down, at 8 hours a day, for 3 years, and 10 cents a kilowatt-hour, will cost you an extra $43 or so over those years. Never mind that it performs a lot worse, so you'll be running at full load for longer on intensive tasks or getting less performance. Oh, and the Intel chip requires a top-of-the-line Intel chipset to run it, so factor in some extra costs for the platform. At least DDR2 memory is roughly price-equal to DDR now.
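    For reference, that figure checks out from the numbers given: 50 W × 8 h/day × 365 days × 3 years = 438 kWh, and 438 kWh × $0.10/kWh is roughly $44.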
  • Re:Backwards? (Score:3, Insightful)

    by rm999 ( 775449 ) on Friday November 04, 2005 @05:41PM (#13953505)
    Oh well, Dell can easily ride it out - they probably make the majority of their money off ordinary computer users who don't know any better and have no clue what this "AMD" thing is.
  • by hattig ( 47930 ) on Friday November 04, 2005 @06:09PM (#13953745) Journal
    The issue is that the second link is to the TDP of the X2 3800+, not the actual power consumption, which will be lower than 89W, more likely around 70W.

    What matters, however, is system power consumption: CPU + chipset + everything else. Of course you can test with everything else being the same, so it comes down to the CPU + chipset. AMD has an on-die memory controller, so that is a couple of watts saved over the Intel chipset; however, Intel's chipsets are traditionally quite efficient (although whether or not the chipset for its dual-core processors is, I don't know). The best bet is to measure at the socket.

    http://techreport.com/reviews/2005q2/athlon64-x2/index.x?pg=15 [techreport.com]

    Under load:
    Pentium D 840 uses 292W at the socket.
    Athlon 64 X2 4200+ uses 178W at the socket.

    Difference is 114W. Plug that into your calculator!
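    Taking that invitation literally, a rough sketch in Python (the 8 hours/day and $0.10/kWh figures are assumptions borrowed from the earlier comment, not from this post):

        # Hypothetical running-cost estimate for the 114W difference measured above.
        watts_difference = 292 - 178   # W at the socket, from the Tech Report numbers
        hours_per_day = 8              # assumed hours under load per day
        rate_per_kwh = 0.10            # assumed electricity price in USD per kWh
        kwh_per_year = watts_difference * hours_per_day * 365 / 1000
        cost_per_year = kwh_per_year * rate_per_kwh
        print(f"{kwh_per_year:.0f} kWh/year, about ${cost_per_year:.0f}/year")
        # roughly 333 kWh/year, about $33/year (around $100 over three years)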
  • by FishandChips ( 695645 ) on Friday November 04, 2005 @06:32PM (#13953950) Journal
    AMD comes out on top quite rightfully, but actually neither of these processors offers good value for, perhaps, the majority of all computer buyers. A great deal of what folks do - word processing, surfing, email, etc. - can be done very well on a P3, a Mac Mini or even a VIA Epia combo. The "bigger is better" trend has simply landed people with behemoth-sized machines that are expensive to buy and run, and messy to maintain.

    It's also allowed free rein to OS bloat, and to 1001 WinDel reviewers who'll gladly tell us that we really must have that 5-litre SUV to run the kids a couple of miles to school. That said, if you do need this kind of power then IMHO AMD's current chips offer a superb solution, but it's not for everyone.
  • by Nom du Keyboard ( 633989 ) on Friday November 04, 2005 @07:06PM (#13954222)
    Administering the threads carries an overhead, though, which means that dual core processors are never exactly twice as fast as their single core counterparts.

    Sometimes they're faster.

    How can this be?

    Context switching between threads is expensive in terms of cycles on a microprocessor. A second processor can cut down immensely on context switching, or even virtually eliminate it when only two threads are active.
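    A rough, Linux-only sketch of that point (Python assumed; the affinity trick and workload are illustrative, not from the post): two CPU-bound workers forced onto a single core must be time-sliced and context-switched by the scheduler, while the same pair with one core each runs concurrently. Processes are used instead of threads only to sidestep Python's GIL.

        import os
        import time
        from multiprocessing import Process

        def spin(core, n=20_000_000):
            os.sched_setaffinity(0, {core})  # pin this worker to one core (Linux only)
            x = 0
            for _ in range(n):               # busy loop standing in for CPU-bound work
                x += 1

        def run_pair(cores):
            procs = [Process(target=spin, args=(c,)) for c in cores]
            start = time.time()
            for p in procs:
                p.start()
            for p in procs:
                p.join()
            return time.time() - start

        if __name__ == "__main__":
            print("sharing one core:", round(run_pair([0, 0]), 2), "s")
            print("one core each   :", round(run_pair([0, 1]), 2), "s")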

  • by kannibal_klown ( 531544 ) on Friday November 04, 2005 @07:43PM (#13954470)
    Waitaminute... The AMDs use less power and are faster than the Intels? And they're cheaper? Would somebody please explain to me again why Apple is moving to Intel?

    I'd guess quantity.

    Intel probably has the infrastructure to handle the increase in production created by Apple products, or at least appears to.

    AMD might be able to handle the load, but Apple probably doesn't want to risk it. They've been burned twice already in the last few years by companies that couldn't keep up with the needed quantity or speed increases. While AMD would definitely be able to handle the speed increases, Apple is probably worried they'll buy another dry well.

  • by SirSlud ( 67381 ) on Friday November 04, 2005 @08:25PM (#13954753) Homepage
    True true, but Apple is moving away from IBM and towards x86, not towards Intel per se. While I agree that this takes some of the sheen off the claim that it's the smartest move available, it's still better than sticking with PowerPC. Intel, being the bigger player in DRM here, is going to give Apple the corporate confidence factor of locking the OS to approved hardware, so it's a no-brainer.

    So yeah, you have a point, but I think it's largely moot. Apple wants to kick MS in the shins, not destroy it. Moving to Intel puts Apple in a position to put Intel in a bad spot: who do they treat preferentially? Apple isn't MS, but they still sell a shitload of machines. Hopefully, Intel has to become non-biased from an OS standpoint, and we all benefit. Meanwhile, AMD has the most to gain here; as the benefits of exclusive deals and advertising subsidies for OEMs are reduced because the relationships are no longer exclusive (risk sharing, it makes any business person cream in their pants), AMD suddenly has more leverage in OEM talks. It's a good time to be an OEM, I think; everyone wants to be your suitor, and you're ultimately the gatekeeper. AMD, on the outside looking in, gaining the critical praise, ramping up production, cheaper R&D labour, and all that, is suddenly about to be the belle of the ball.
