Dual-Core Shoot Out - Intel vs. AMD

sebFlyte writes "The Intel vs AMD battle of the benchmarks continues. ZDNet is running its rather comprehensive-looking guide to a side-by-side test of Intel and AMD's dual-core desktop chips, the Athlon 64 X2 3200+ and the Pentium D 820. They look at pure performance, as well as the difference it makes to apps you might use on the desktop. In the end, AMD comes out as the winner. From the article: 'AMD currently offers the most attractive dual core option. The Athlon 64 X2 3800+ may cost $87 more than its Intel counterpart, the Pentium D 820, but the AMD chip is a much better performer. It also uses considerably less power.'"
This discussion has been archived. No new comments can be posted.

  • by Work Account ( 900793 ) on Friday November 04, 2005 @03:45PM (#13952965) Journal
    The best price/performance deal is the $146 AMD 3000 chip.

    It is an amazing little bugger that can git er done with ease but does not cost an arm and a leg.
    • And Orange juice contains significantly more vitamin C than apple juice, a comparison which has just as much to do with dual-core processors as your comment about the single core Athlon 64 3000+.
    • No, the best deal is to buy absolutely nothing if you have a less-than-3-year-old system and do not play any modernish games.
    • Is that considering just the chip or considering the round of upgrades that most people would need to upgrade to it?
    • This is looking very good for Sun with their new hardware. Solaris 10, though a little more bloated than Linux, is rock stable. This is going to be the answer to their prayers.
  • Really? (Score:4, Insightful)

    by NVP_Radical_Dreamer ( 925080 ) on Friday November 04, 2005 @03:45PM (#13952971) Homepage
    It costs almost $100 more and is faster? What are the odds...
    • That is why I told everyone the best chip overall based on VALUE.

      You can get a brand new chip that is almost as fast as any other chip in the world, but at the PERFECT sweet spot in terms of price/performance.

      Information is in my first post above, which ironically one person modded off-topic in a thread about the best consumer processors: http://hardware.slashdot.org/comments.pl?sid=167355&cid=13952965 [slashdot.org]
    • It is the performance/price ratio that they talk about. The price is known, so they just need a performance benchmark. The CPU that can crank out the most FLOPs (or something like that) per unit of time per $ is the winner. You can also factor in power if you want. The most expensive != the fastest, sometimes.
    • Well if you had bothered to read the whole article, then you'd have noticed that Intel's top of the line costs $196 more than its AMD counterpart and still manages to practically suck in every department.

  • by morcheeba ( 260908 ) * on Friday November 04, 2005 @03:49PM (#13953011) Journal
    Or, put another way, the bottom-of-the-line AMD 3800+ is less than 1/3rd the price of the top-of-the-line Pentium 840 EE ($328 vs. $999), yet it still beats it in most of the benchmarks.

    Too bad they didn't compare the Pentium D 830 in the benchmarks - it is closer in price to the AMD 3800+.
    • Really, it's just mean to benchmark these AMD dual-core chips against those Intel abominations. Yes, we get it. Intel did not really expect you to buy those (except through Dell). More dignified benchmark sites let the naked emperors just sulk off the stage without all the pointing and giggling.

      Of course, this might be an AMD strategy: Announce to all hardware sites that they will loan anyone their fanciest processors if they benchmark them against Intel, and if they can get their review posted on slashdo

  • Coral Cache (Score:4, Informative)

    by bflong ( 107195 ) on Friday November 04, 2005 @03:49PM (#13953018)
  • by Dr. Zowie ( 109983 ) <slashdot@@@deforest...org> on Friday November 04, 2005 @03:49PM (#13953020)
    Isn't this like the fourth time we've seen a Xeon-vs-AMD benchmark on the front page? It's old news.

    The problem with the Xeons is they're totally throttled. The Xeon was like a V-6 engine under a VW carburetor; the dual-core Xeon is like a big-block V8 under the same carburetor.

    The AMDs have better access to RAM and better (independent) cross-CPU communication. The dual-core Xeons were clearly rushed to market to answer AMD's offering, before Intel could get their own memory-access ducks in a row.

    • This isn't an Opteron-Xeon review. It's an Athlon dual core vs. the new Intel dual core (i.e. not the P4 line).

      I mean, hell, it says it in the summary.
      • It's an Athlon dual core vs. the new Intel dual core (i.e. not the P4 line).

        The Pentium D 820 is P4 core based. Same line, just an evolution.

    • Totally OT, but this is a site for geeks, and geeks like facts, figures and statistics... so... VW were actually the first manufacturer to utilise EFI and have, in fact, had EFI in cars (thereby doing away with the carbie) for over 30 years now...
      A link to the Type 3 [wikipedia.org] tells us that "Originally a dual carburetted engine, the Type 3 engine was modified in 1968 to include fuel injection, reputedly the first mass production consumer car with such a feature."
      Anyway...
      • Totally OT, but this is a site for geeks ... VW were actually the first manufacturer to utilise EFI and have, in fact, had EFI in cars

        Cool. The PC BIOS was really starting to look old. I guess people with cars that use OpenFirmware won't be able to laugh so much anymore...

  • by Anonymous Coward on Friday November 04, 2005 @03:52PM (#13953052)
    I was doing OK with the article until I got to the two points about how their disk defrag and antivirus/spyware apps were running, slowing the machine down, and how a dual core would make this so much better. A dual core will do NOTHING for this user!!! Those two examples highlight the perfect situation where the bottleneck isn't even close to being the CPU; the disks are simply working at 100% capacity, and you can add as many cores as you want without changing that fact.


    ZDNet is usually fairly good, but not this time.

  • by Homology ( 639438 ) on Friday November 04, 2005 @03:52PM (#13953062)
    Most arguments for dual core remind me very much of similar arguments for using dual CPUs, apart from the price, that is.

    A kernel compiled for a single CPU is faster than a kernel compiled for multiple CPUs, even when you only have one CPU. This is why OpenBSD ships two kernels: 1) single CPU and 2) multiple CPUs. The main developer of DragonFly BSD said that his preference is single CPU, performance-wise (I'll leave that as a Google exercise).

    • by Pharmboy ( 216950 ) on Friday November 04, 2005 @04:06PM (#13953189) Journal
      Personally, I don't use dual CPUs for servers because they are faster. As you say, they are not, kernel-wise anyway. Dual CPUs do offer higher availability, and the ability to crank the crud out of one CPU (compiling, etc.) while you can still get stuff done with the other. I use dual CPUs very little on the desktop, but even then, I notice a difference in my ability to switch back and forth and start new apps, etc. while I am doing very heavy tasks.

      On the server side, if a single-threaded process goes haywire, instead of locking the box up, I can still log in and kill the process, no biggie. I have accidentally "infinite looped" myself to death on single-CPU boxes and had to hard boot them, whereas on the dual, that wouldn't be an issue. That is just my experience, but I've been using dual CPUs on several servers for over 6 years now. I would rather have dual 1GHz than a single 2.5GHz any day.
      • by hackstraw ( 262471 ) * on Friday November 04, 2005 @06:27PM (#13954367)
        Personally, I don't use dual CPUs for servers because they are faster. As you say, they are not, kernel-wise anyway. Dual CPUs do offer higher availability, and the ability to crank the crud out of one CPU (compiling, etc.) while you can still get stuff done with the other.

        That's called scaling.

        I would rather have dual 1GHz than a single 2.5GHz any day.

        Me too. Personally and professionally, I am simply able to do more when I have more processors available. In fact, a researcher friend of mine has a single-CPU Intel machine with hyperthreading. His research buddies like to run CPU-intensive programs wherever they can find a spare processor. It was irritating him that people were running programs on his box and it wasn't very responsive. He would renice the processes, and that helped some. He then enabled hyperthreading, and after that he didn't notice when people were running on his machine anymore.

        I've done benchmarks with "normal" applications, and overall I get the best performance when doing X number of things in parallel, where X == the number of processors, cores, or "hyper-whatevers". It's that simple. I'm saying "best performance", not linear or superlinear performance, but oftentimes I get at least 30% more out of enabling hyperthreading. It also just makes the machine smoother and more interactive. (There's a rough sketch of this rule of thumb after this comment.)

        I welcome the day when every computer has 30 or so processors. The more the better. Just so long as they go completely to sleep when not needed or in use. Someday.
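A minimal, hypothetical sketch of the rule of thumb above (sizing parallel work to the number of processors), using Python's multiprocessing; the worker function and iteration counts are made up for illustration and are not taken from the comment:

```python
# Rough illustration only: run as many CPU-bound tasks in parallel as there
# are processors/cores ("X == the number of processors").
import os
from multiprocessing import Pool

def work(n):
    # stand-in for a CPU-bound job
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    n_workers = os.cpu_count() or 1   # processors, cores, or "hyper-whatevers"
    with Pool(processes=n_workers) as pool:
        results = pool.map(work, [2_000_000] * n_workers)
    print(f"ran {n_workers} CPU-bound tasks in parallel")
```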
    • by ArbitraryConstant ( 763964 ) on Friday November 04, 2005 @04:07PM (#13953207) Homepage
      "A kernel compiled for a single CPU is faster than a kernel compiled for multipe CPU's, even when you only have one CPU. This is why OpenBSD has two kernels: 1) one cpu and 2) multpiple CPU's. The main developer of DragonBSD said that his preference is single CPU, performance wise (I'll leave that as a Google exercise)."

      Dillon said he felt FreeBSD's focus on many CPUs to the exclusion of single-CPU performance was a mistake, not that single CPUs are preferable.

      Also, most desktop workloads benefit from having two CPUs, it helps responsiveness quite a bit (even on OSes with good schedulers like Linux). There is overhead for the locking in the kernel, but the benefit almost always outweighs the cost.
      • Dillon said he felt FreeBSD's focus on many CPUs to the exclusion of single-CPU performance was a mistake, not that single CPUs are preferable.

        He said something to the effect that he would replace multi-CPU with single-CPU if he could. He wrote that within the last year or so.

        Also, most desktop workloads benefit from having two CPUs, it helps responsiveness quite a bit (even on OSes with good schedulers like Linux). There is overhead for the locking in the kernel, but the benefit almost always outweighs t

        • "He said something to the effect that he replaced multi CPU with singel CPU if he could. He wrote that in in the last year or so."

          I couldn't find the quote, but if he's making the statement in general then he's simply wrong (since there's a demonstrable benefit more often than not), and if he's talking about the few cases where there's not a benefit and the overhead is a problem then it's not news because that's been known for a while. But I'd have to see the quote...

          "It helps with responsiveness, not perfo
  • Pick two (Score:3, Funny)

    by Crouty ( 912387 ) on Friday November 04, 2005 @03:53PM (#13953067)
    So again it's
    • Cheap
    • Fast
    • Low power consumption
    Pick any two.
    • Re:Pick two (Score:3, Insightful)

      by Ignignot ( 782335 )
      Add to that "fails gracefully"

      If you have ever seen the videos of people taking the heat sinks off Intel chips while running Quake 3, and the chips surviving, then you would understand where the chips stand in this category.
      • I would hazard a guess that not more than 30% of slashdot readers are the kind of people who think it is fun to pop the heat sink off while playing Quake 3, but that is still good to know.
      • Way to cite a demo done with 3+ year old technology. The "burning AMD" syndrome was ages ago - the Athlon XPs did run hot relative to their Intel counterparts back in the day, and they definitely did not fail gracefully. However, I'm fairly sure that Athlon 64's don't suffer from the same problem - first, they *are* much lower power than comparable Intel processors these days, second, I believe (though not sure) that they have built-in thermal monitoring and autoshutdown capabilities.
      • Re:Pick two (Score:3, Informative)

        by nickysn ( 750668 )
        You're talking about the infamous THG video [tomshardware.com]. No, Athlon 64 doesn't suffer [tomshardware.com] from that. (downloading videos from THG may need registration) See also this [tomshardware.com].
    • 1) Cheap
      3) Low power consumption

      crap... I guess I can't pick two. That would require a low-power Intel part, based on this list of CPUs (the Athlon 64 3000+ gets all three, though).
    • Let's see, the AMD 3800+ has less than half the power consumption of the Intel EE, costs less than a third as much and beats it in practically every benchmark. That's the Cliff Notes, you illiterate moron.
  • by digitaldc ( 879047 ) on Friday November 04, 2005 @03:58PM (#13953111)
    Like most people, I will wait it out until the dual-core chips / products are stable and less expensive.
    Not everyone is playing Quake 4 and Half-Life 2 on a daily basis.
    • HL2 ran fine at 800x600 a year ago on a $200 build I made with an AMD Athlon 2500+ Barton, 512MB of DDR333, and a GeForce 5700LE.

      A benefit that's not been discussed so far is that all the Intel or AMD backers who run out and buy handfuls of whichever they prefer rapidly decrease the price of technology that's not absolutely brand new.

      AMD's M2 release in the spring will drop the AMD X2's in price, and the s939 single-core 64bit processors even lower. Wait until you can secure yourself true 64bit

    • I too will wait. I'm holding out for the chips with the new virtualization instructions and Xen 3. Dual core is nice but better virtualization is more important to me at this point.
  • by Daveznet ( 789744 ) on Friday November 04, 2005 @04:00PM (#13953142)
    Ahh, the AMD wins overall in performance, but can it cook me a sunny-side-up egg as fast as the Intel? :P
  • by MyOtherUIDis3digits ( 926429 ) on Friday November 04, 2005 @04:03PM (#13953169)
    I read somewhere recently that 'more watts used' = 'more powerful'
  • Plenty of people have bought the top-of-the-line chip while not realizing that the limiting reagent of their motherboard's performance could be a number of things, like bus speed and salt water. Someone do the elitenessly-challenged a favor and please post the minimum board specs one needs to take full advantage of this chip's juice.

    Oh, and no need to mention which kernels and OSes would be ideal; we already know the answer to that. [debian.org]

  • "The Intel vs AMD battle of the benchmarks continues."

    AMD has pretty much trounced Intel performance at every desktop and server price point for the last 2 years at least, so who cares anymore? Even Dell has started carrying AMD CPU parts:

    http://tinyurl.com/c57po [tinyurl.com]

    Dell is pretty much singlehandedly holding up Intel on the desktop, as they can drive the overall system price down on volume despite the higher-priced parts.

    If their little Israel division hadn't come up with the Pentium M chips, they'd be even worse off.
  • Itanium (Score:3, Interesting)

    by msbsod ( 574856 ) on Friday November 04, 2005 @04:17PM (#13953311)
    The ultimate multi-core processor technology is VLIW (or EPIC, as Intel calls it). The cores are broken up into lots of tiny pieces; instructions are distributed through various pipes and run through whatever is available in parallel. The Itanium processor is Intel's EPIC problem child. Too complex, too much heat. Maybe it is just a bit too early for this technology. I think Intel could try to start a "mobile" Itanium project. They were quite successful with their Pentium M. Maybe that will give Intel an advantage.
    Or, Intel could design a dual-Alpha processor to beat AMD, but that doesn't sound like Intel, does it? Anyone at AMD who might like the idea? ;-)


    Your PC may have Intel inside, but did you know that Intel's fabs have VMS and Alpha inside?
  • FEAR doesn't take advantage of it according to FiringSquad [firingsquad.com].
  • The same summary says both Athlon 3200 and 3800, which one is it?

  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Friday November 04, 2005 @04:36PM (#13953460)
    Comment removed based on user account deletion
    • I switched to AMD from Intel on my last build, and I've been totally happy with it...quick, quiet, and reliable, and I've been using Intel as long as you. The last time I switched over (nvidia to ATI), I wasn't nearly so pleased.

      AMD makes good chips.
    • I needed a new game machine, so I went to Newegg and splurged on an Athlon 64 X2 4400+, a couple of XFX 7800GTX boards on top of an ASUS A8N SLI motherboard, and 2GB of Corsair RAM. I have to say, this is the fastest computer I have ever had, and it ain't even the top of the AMD line.

      With all this new hardware, the case is 10deg cooler than when I had a P4 in there, off the same 500W power supply. I was still buying P4s when all my buddies had screaming AMD boxes, and I could not keep up in either Battlefield

  • By my calculations, the power difference between the Intel and the AMD will make up the difference in the chip prices in about a month of continuous operation, at least for Seattle electric rates (~$0.06/kWh)...
    • at least for Seattle electric rates (~$0.06/kWh)...

      I'll agree that, without question, the 90nm Athlon 64s completely crush Intel on power consumption... But even at the 840's TDP of 130W, that still only comes out to $5.62 total per month.

      Now, I'll even agree that AMD gives a VERY conservative TDP for their current chips, while Intel even disclaims that you can't count on theirs as an upper limit. But I don't think you can claim that the dual core P4s draw anywhere near the continual 1450W it would t
  • by zsazsa ( 141679 ) on Friday November 04, 2005 @04:44PM (#13953538) Homepage
    Okay. According to this page [xbitlabs.com], at full tilt the Pentium D 820 consumes 130.6W, while this page [hothardware.com] says the Athlon 64 X2 3800+ consumes 89W. So, how long would the Athlon have to run at full blast to make up the difference in cost of $87? Last month I paid $0.078 per kilowatt-hour. This seems to be reasonably average for the United States. 130.6W - 89W = 41.6W difference between the two. Some back-of-the-Google-calculator math reveals: (US$ 87) / (41.6 W * ((US$ 0.078) / (kW * Hr))) = 3.05871582 years. A not-insignificant amount of time. If you're in an area where electricity is more expensive, like New York or California, the amount of time is even less!

    Feel free to correct my math!
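A quick script version of the math in the comment above, taking the quoted figures (130.6W, 89W, an $87 price gap, $0.078/kWh) at face value; this is an editorial sketch, not part of the original thread:

```python
# Payback time for the $87 price gap, funded purely by the power savings.
pentium_d_820_watts = 130.6      # full-load figure quoted above
athlon_x2_3800_watts = 89.0      # TDP quoted above (actual draw may be lower)
price_gap_usd = 87.0
rate_usd_per_kwh = 0.078

watts_saved = pentium_d_820_watts - athlon_x2_3800_watts            # 41.6 W
usd_saved_per_hour = (watts_saved / 1000.0) * rate_usd_per_kwh
hours_to_break_even = price_gap_usd / usd_saved_per_hour
print(hours_to_break_even / (24 * 365), "years of full-load operation")  # ~3.06
```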
    • The issue is that the second link is to the TDP of the X2 3800+, not the actual power consumption which will be lower than 89W, more likely 70W.

      What matters is system power consumption however. CPU + Chipset + Everything Else. Of course you can test with Everything Else being the same, so it comes down to the CPU + Chipset. AMD have an on-die memory controller, so that is a couple of Watts saved over the Intel chipset, however Intel's chipsets are traditionally quite efficient (although whether or not the c
    • Or...
      If you save 5 minutes a day due to the performance difference and you make 20 dollars an hour, it will take you 52 days to make up the difference.
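The same break-even idea expressed as time saved rather than watts saved, using the figures from the comment above (an editorial sketch, not from the thread):

```python
# Days for 5 minutes/day of saved time, valued at $20/hour, to cover an $87 gap.
minutes_saved_per_day = 5
wage_usd_per_hour = 20.0
price_gap_usd = 87.0

usd_saved_per_day = wage_usd_per_hour * minutes_saved_per_day / 60.0
print(price_gap_usd / usd_saved_per_day, "days to make up the difference")  # ~52
```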

  • I have a question that has perplexed me for years now (since I built my AMD computer in 2001). If AMD consistently offers better CPUs for a lower price, why do they lag behind so much in sales? You would think at some point capitalism would kick in, people would leave Intel, and AMD would start getting a better market share. So, what's the deal?
    • Google for "AMD Intel Antitrust" and see what you find. Basically Intel has allegedly been maintaining its monopoly via unorthodox (and in some cases plain illegal) means.

      People like Dell are 100% Intel because if they sell even one AMD chip they will lose millions in back-hander "Advertising Funds" that Intel plies them with.

      The other reason they haven't been so popular in the data centre is that there has been a dearth of quality enterprise-level chipsets. The 4- and 8-ways that Sun and IBM currently sell

    • Re:Question (Score:3, Informative)

      AMD hasn't consistently offered better CPUs at a better price.

      Back when they were lagging in the performance race, with the early XP line (Palomino) versus the P4 (Northwood), AMD was trying hard just to keep up. They priced their processors typically 20% below equivalent-performing Intel processors.

      AMD also had a pathetic platform for the server space, which consisted of (at most) a 2-way Athlon MP system utilizing a single 266MHz bus. The only chipset available, the AMD 761MP, wasn't exactly a top perfor
  • Yeah I just wanna give mad props to my registers for makin this thing HAPPEN

    XMM0/MM0/EAX/AX/AL and
    XMM1/MM1/ECX/CX/CL and
    XMM2/MM2/EDX/DX/DL... I LOVE YOU GIRL...

    and I can't forget

    XMM3/MM3/EBX/BX/BL...

    You are all wonderful, thank you again!

    Er... shoot out?

  • by FishandChips ( 695645 ) on Friday November 04, 2005 @05:32PM (#13953950) Journal
    AMD comes out on top quite rightfully, but actually neither of these processors offers good value for, perhaps, the majority of all computer buyers. A great deal of what folks do - word processing, surfing, email, etc. - can be done very well on a P3, a Mac Mini or even a VIA EPIA combo. The "bigger is better" trend has simply landed people with behemoth-sized machines that are expensive to buy and run and messy to maintain.

    It's also allowed free rein to OS bloat. And 1001 WinDel reviewers who'll gladly tell us that we really must have that 5-litre SUV to run the kids a couple of miles to school. That said, if you do need this kind of power then imho AMD's current chips offer a superb solution, but it's not for everyone.
  • by Nom du Keyboard ( 633989 ) on Friday November 04, 2005 @06:06PM (#13954222)
    Administering the threads carries an overhead, though, which means that dual core processors are never exactly twice as fast as their single core counterparts.

    Sometimes they're faster.

    How can this be?

    Context switching between threads is expensive in terms of cycles on a microprocessor. A second processor can cut down immensely on context switching - or even virtually eliminate it when only two threads are active.
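A rough, hypothetical way to see this effect on Linux: pin two CPU-bound workers to a single core (forcing the scheduler to time-slice between them), then give each its own core, and compare wall-clock time. The workload and core IDs 0/1 are made up for illustration, and os.sched_setaffinity is Linux-only:

```python
# Two CPU-bound workers: time-sliced on one core vs. one worker per core.
import os
import time
from multiprocessing import Process

def burn(iterations, cores):
    os.sched_setaffinity(0, cores)   # pin this worker to the given core set
    x = 0
    for i in range(iterations):
        x += i * i

def run(core_sets):
    procs = [Process(target=burn, args=(20_000_000, cs)) for cs in core_sets]
    start = time.perf_counter()
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return time.perf_counter() - start

if __name__ == "__main__":
    print("both workers on core 0:", run([{0}, {0}]))   # constant context switching
    print("one worker per core   :", run([{0}, {1}]))   # roughly twice as fast
```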

  • by bofar ( 902274 ) on Friday November 04, 2005 @06:32PM (#13954395)
    The thing that bothers me about all these reviews is that they fail to mention that the Intel processors need (more expensive) DDR2 memory versus DDR for AMD. If one is going to compare prices of the processors, the cost of the faster memory required should be included in the price of the processor. Also note that when AMD comes out with Socket M2 processors, which support DDR2, they should benchmark even faster.
