AMD Hardware

AMD Quad-Core Opteron (Barcelona) Tech Report

crazyeyes writes "AMD has been very tardy with Barcelona. Countless AMD fans have eagerly awaited a new processor. As the day draws closer, TechARP takes a look at the upcoming quad-core AMD Opteron. Is there more to it than just its four processing cores? Will it be the Intel-killer that AMD promised long ago? From the article: 'AMD is in the same boat as ATI. Delay after delay of their long-awaited Barcelona core not only ensured the dominance of their rival, Intel, in the desktop processor market, it also ensured that Intel would be the only choice for those who want a quad-core processor. Although that wait will end in August 2007, when Barcelona is finally launched, it remains to be seen whether AMD's new processor will be able to inflict serious damage on Intel's dominance.'"
This discussion has been archived. No new comments can be posted.


  • Intel's server / workstation chipsets suck next to the ones for Opteron / AMD FX CPUs.

    FB-DIMMs cost a lot and need a lot more power than DDR ECC RAM, and the Intel chipsets have very few PCIe lanes. The nForce Pro chipsets have the lanes for 2 full x16 slots with 2 x4 slots, plus PCIe lanes for onboard SATA/SAS RAID, with x4 lanes left over that are sometimes used for PCI-X slots.

    Also, the AMD chips have a better CPU-to-CPU link.
    • Re: (Score:2, Insightful)

      by PDXNerd ( 654900 )
      It depends on the solution you are looking for. Besides, when you say Intel's chipsets suck compared to AMD's, you're comparing nForce Pro chipsets, which are Nvidia, not AMD. Try looking at Nvidia's chipsets for Intel CPUs and make the same comparison.

      On another note, this is considered news? There was, quite literally, nothing to see here. It was a couple of paragraphs with a flashy new slide in it. It lost all credibility when this line was written:

      Of course, Intel is also rushing out a similar solution, in the form of their V8 programme. So, it is a race to see which company will be the first to release an 8-core platform. AMD stands to take some wind out of Intel's sails if they are the first out with their 8-core platform.

      Intel has had an 8-core platform since last summer.

      • Nvidia's chipsets for Intel CPUs are only for the desktop CPUs, not for the workstation / server ones.

        Intel's V8 for gaming is a joke: FB-DIMMs, no CrossFire or SLI.
      • by Maniac-X ( 825402 ) on Sunday July 22, 2007 @06:09PM (#19948997) Homepage

        Intel has had an 8-core platform since last summer. Are they talking about "native" quad-core? Does a slight technical difference matter when one exists and one does not exist? How do we know "native" quad-core is better than dual-dual-core-on-a-single-chip?


        Actually it makes a lot more difference than you'd think. This is most evident in caches. Intel's quad-core has two shared L2 caches (one per two cores). AMD has a full L2 cache per core AND a shared 2MB L3 cache. Intel doesn't have an L3 cache on any of their stuff. Besides that, HyperTransport is a lot faster than Intel's dated FSB. More bandwidth and faster aggregate links mean that yes, the native quad-core will be a lot better.

        Aside from that, AMD also still has much better memory performance via the on-chip memory controller, and doubled-width op registers compared to the last-gen AMD stuff.
        • '' Actually it makes a lot more difference than you'd think. This is most evident in caches. Intel's quad-core has two shared L2 caches (one per two cores). AMD has a full L2 cache per core AND a shared 2MB L3 cache. Intel doesn't have an L3 cache on any of their stuff. ''

          Very interesting comparison. The way I see it: AMD has 0.5MB L2 per core, and 2MB L3 shared between four cores. Intel has 4MB L2 shared between two cores, and 4MB L2 shared between the other two cores. Anything where 0.5MB is enough, AMD
          • Re: (Score:3, Insightful)

            by PopeRatzo ( 965947 ) *
            It will be an easy decision for many of us. Whichever platform runs our applications the best will be the one we spend our hard-earned cash on. Personally, if the boost in productivity (music and video production) I got when I moved to a dual Xeon and a Core2Duo (2 boxes, of course) is any indication, I'm going to like this proliferation of cores.

            If only my favorite applications (like Logic Pro or Sonar or Wavelab or Nexus or Kontakt or Premiere or After Effects or even Flash) were available in
          • It's not the size that counts, it's how you use it. ;)

            Per-core cache is faster than shared cache.
            L3 is better as well because it can be used to transfer data between cores instead of going through main memory.
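The cache sizes being traded back and forth here are easy to check on a live box. A minimal sketch, assuming Linux with glibc; the _SC_LEVEL*_CACHE_SIZE sysconf names are a nonstandard glibc extension that may report 0 or -1 elsewhere:

/* Query the cache hierarchy via glibc's sysconf extension.
 * Values of 0 or -1 mean the level is absent or unreported. */
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    long l1d = sysconf(_SC_LEVEL1_DCACHE_SIZE);
    long l2  = sysconf(_SC_LEVEL2_CACHE_SIZE);
    long l3  = sysconf(_SC_LEVEL3_CACHE_SIZE);

    printf("L1d: %ld KB\n", l1d / 1024);
    printf("L2:  %ld KB (per core on K10, shared per die pair on Core 2)\n", l2 / 1024);
    printf("L3:  %ld KB (0 if the CPU has no L3)\n", l3 / 1024);
    return 0;
}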
      • Re: (Score:3, Interesting)

        Native quad-core is better than dual-dual-core because more cores can exchange cache snoop data over CPU-speed internal buses instead of low-speed external buses. Cache snooping quickly kills performance scaling on shared FSB architectures like the P3, P4 and Core 1&2. Since the same FSB is also used for memory IO, cache snooping robs some more of the FSB-limited memory performance on P3/P4/Core-1&2 FSB-based SMP architectures.

        Shared FSB systems do not scale... even Intel knows that. However, dual-d
        • by TheLink ( 130905 )
          "Native quad-core is better than dual-dual-core because more cores can exchange cache snoop data..."

          That's not really important at all. As long as something works better in practice, that's the one I'd buy/recommend.

          And from what I see, the Core 2 Duos are a LOT faster than the AMDs and in most cases the better choice.

          2 years ago I'd recommend AMDs over Intel's P4/Netburst crap. But now the Core 2's are stomping all over AMD, and with the recent Intel price cuts, AMD is in for a very bad time unless Barcelo
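Whether the snoop traffic matters is measurable rather than rhetorical. A minimal false-sharing sketch, assuming POSIX threads (build with gcc -pthread; older glibc also needs -lrt for clock_gettime). The absolute timings are illustrative only and say nothing about Core 2 versus Barcelona specifically:

/* Two threads hammer counters that either share a cache line
 * (the line ping-pongs between cores on every write) or sit on
 * separate lines (no coherence traffic between the counters). */
#include <pthread.h>
#include <stdio.h>
#include <time.h>

#define ITERS 50000000L

static struct { volatile long a, b; } same_line;
static struct { volatile long a; char pad[64]; volatile long b; } padded;

static void *same_a(void *x)   { (void)x; for (long i = 0; i < ITERS; i++) same_line.a++; return NULL; }
static void *same_b(void *x)   { (void)x; for (long i = 0; i < ITERS; i++) same_line.b++; return NULL; }
static void *padded_a(void *x) { (void)x; for (long i = 0; i < ITERS; i++) padded.a++;    return NULL; }
static void *padded_b(void *x) { (void)x; for (long i = 0; i < ITERS; i++) padded.b++;    return NULL; }

static double run_pair(void *(*fa)(void *), void *(*fb)(void *))
{
    struct timespec t0, t1;
    pthread_t ta, tb;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    pthread_create(&ta, NULL, fa, NULL);
    pthread_create(&tb, NULL, fb, NULL);
    pthread_join(ta, NULL);
    pthread_join(tb, NULL);
    clock_gettime(CLOCK_MONOTONIC, &t1);
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void)
{
    printf("same cache line:      %.2f s\n", run_pair(same_a, same_b));
    printf("separate cache lines: %.2f s\n", run_pair(padded_a, padded_b));
    return 0;
}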
    • Re: (Score:3, Insightful)

      by itzdandy ( 183397 )
      No doubt that from a design perspective the Barcelona is superior to an Intel dual/dual-core design. The problem is, of course, yields and bins. Barcelona is a more expensive design because of lower yields from larger slabs of silicone. The only way to overcome that is to make much less complex, lower-transistor-count cores to up yields. Otherwise, make a point-to-point bus working at much higher bandwidths, make separate cores, and glue them together (Intel). Intel will win this game because of better yi
      • by taniwha ( 70410 )
        "silicone"? what are you building exactly ....
      • Re: (Score:2, Funny)

        by ozmanjusri ( 601766 )
        The problem is, of course, yields and bins. Barcelona is a more expensive design because of lower yields from larger slabs of silicone.

        So you're saying AMD is likely to go tits-up?

        • I'd rather say that AMD likely won't produce the premier chips like they did when the Hammers first came out. AMD had such dominant chips then, and it really woke Intel up. A Japanese commander said something to the effect of "I fear we have woken a sleeping giant" when they attacked Pearl Harbor, and I think that is exactly what AMD did to Intel when they led the market.

          I certainly hope AMD stays in the game, but I doubt they will lead it with Barcelona.
    • Or you can't use all your memory sockets, because the memory controller for half of them is in the second, possibly absent, CPU.

      Balancing the amount of memory on each side is a good idea too.

    • Than. than. than. not then. than.
  • by Fyre2012 ( 762907 ) on Sunday July 22, 2007 @04:33PM (#19948283) Homepage Journal
    I recall Intel was in a fuss when AMD released the 64-bit chips. The market 'ooh'ed and 'aah'ed in delight at the new architecture, supposing that it would herald a new era of computing in a similar way that the jump from 16-bit to 32-bit did.

    The reality of the situation became that the great majority of Athlon64 users were running 32 bit apps, and continue to do so.

    There has yet to be a dire 'need' for 64-bit processing, much as there isn't a dire need for more than 4 GB of RAM in a desktop machine.

    At work, I'm the sysadmin for a dedicated hosting company (Linux, mostly Gentoo), and even in that market I don't know of any of my users running 64-bit. Any performance advantages are outweighed by incompatibilities and the plain old PITA of getting things working.

    That said, the delay in developing these quad core procs shouldn't put that big a dent in the pocket / market share of AMD simply because it's a niche market that has yet to be widely adopted.
    • by Inoshiro ( 71693 ) on Sunday July 22, 2007 @05:03PM (#19948507) Homepage
      "There has yet to be a dire 'need' for 64 bit processing, much to the similar way that there isn't a dire need for more than 4 GB of ram in a desktop machine. "

      1987 called, they want to use more than 64k of RAM. How can they do that without going to 32-bit?

      2007 called back, just to let you know that 4GB of RAM was $150. That's right, $150. At that point, a lot of people are starting to wake up to the unpleasant smell of Intel's PAE (that's right, segmenting, but with 32 bits!). We're also living with the limitations of the 32-bit TLB and the paging methods used. I have a machine here with 4GB of RAM, and it's not unusual because of how cheap RAM is. Linux can run it as 4GB of RAM in 64-bit mode no problem, or I can run in 32-bit with 3.6GB of RAM, because the PCI bus and other devices all map to that high region (just like everything above 640k was mapped to devices back in the 20-bit addressing days). Windows 32-bit does the same thing.

      Now, while Linux 64-bit is stable and mature (having been something I've used for 3 years, after which most of the userspace apps have been cleaned up to work), Windows 64-bit is still not all there. Naturally, the proprietary apps will always live in the land of 32-bit. Supreme Commander, a recent DX10 game, has a lot of 32-bit troubles -- running out of RAM and crashing. One of the things you have to do to play it well is add /3GB to your boot.ini, and patch the EXE to enable larger address spaces for userland applications.

      Now, 10 years ago, or even 5 years ago, that would not have even been on the radar screen. Now that you can buy 4GB of RAM for less than $200 (CAD or USD), and now that we have games and applications that need it (beyond the VFS cache; go look at some serious SQL applications or scalable web applications), I think you're way off base, and you sound like someone talking about how 64k of RAM (the 16-bit addressing limit) is more than enough for anyone.

      If all you're doing is sysadmining mom-and-pop's micro website that runs fine with 1 or 2GB of RAM, you'll never know this. If you're sysadmining a company that relies on this stuff, and has a cluster of machines that need to be up and running with gobs of RAM to buffer slower disks and backplanes, you'll know better. When normal users can get 4GB of RAM for next to nothing, the server machines had better have at least 32GB of RAM.
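The 4GB wall described above is easy to demonstrate. A minimal sketch, assuming a multilib gcc so the same file builds both ways (gcc -m32 versus gcc -m64); whether the 64-bit allocation actually succeeds depends on available RAM, swap, and the kernel's overcommit policy:

/* On a 32-bit build, size_t cannot even express a 5 GiB request;
 * on a 64-bit build the request is at least representable. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    unsigned long long five_gb = 5ULL << 30;   /* 5 GiB */

    printf("pointer size: %zu bits\n", sizeof(void *) * 8);

    if (five_gb > (size_t)-1) {
        printf("size_t can't express 5GB: this is a 32-bit build\n");
        return 0;
    }

    char *p = malloc((size_t)five_gb);
    printf("5GB malloc %s\n", p ? "succeeded" : "failed");
    free(p);
    return 0;
}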
      • Re: (Score:2, Interesting)

        by ACMENEWSLLC ( 940904 )
        The problem isn't that we don't need/want 64-bit, it's that with Microsoft it is so damned hard to get to 64-bit. My AS/400 has been running 64-bit for something like 8 years now. The conversion was transparent. With Microsoft, I can no longer use my 32-bit antivirus. I can no longer use my 32-bit device drivers, and many don't offer 64-bit versions. WTF? Who thought that was a good idea? It doesn't have to be that way. But that's the hold up. I want 64-bit so I can run more than 4GB of RAM. But I
        • Drivers have to be 64-bit for a 64-bit OS because they have to be able to talk to the devices through physical ram, without any translation. There's no easy way around this, especially for high bandwidth devices like graphics cards.

          32-bit software works because it was ALREADY USING virtual memory, so mapping the app's 32-bit virtual memory to 64-bit physical addresses isn't a whole lot different to mapping to 32-bit physical addresses.

          As for antivirus, I think that one was a difficult choice. Antivirus prog
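The driver point above comes down to address width. A toy sketch of what truncating a 64-bit physical address to 32 bits does (illustrative only; real kernels also have IOMMUs and bounce buffers, which this comment thread doesn't get into):

/* Truncating a >4GB address silently drops the high bits,
 * leaving a "pointer" to the wrong page entirely -- one reason
 * a 64-bit kernel can't simply load 32-bit drivers. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint64_t low  = 0x12345678ULL;   /* below 4GB: survives truncation */
    uint64_t high = 0x123456789ULL;  /* above 4GB: does not */

    printf("%#llx -> %#x\n", (unsigned long long)low,  (uint32_t)low);
    printf("%#llx -> %#x (high bits lost)\n",
           (unsigned long long)high, (uint32_t)high);
    return 0;
}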
      • Re: (Score:3, Informative)

        2007 called back, just to let you know that 4GB of RAM was $150

        If you'd watch the market more regularly, you'd know that RAM has priced out at anything between $30 per gigabyte and $125 per gigabyte in the past 12 months. Last summer it was around $60-$75 per GB, rising to the $125/GB figure in the fourth quarter of 2006. Right now it's bouncing around in the $30-$50/GB range.

        All depends on what week you buy it and what week your retailer bought their stock.

        I'm hoping that inexpensive ($30 or less)
    • Re: (Score:3, Interesting)

      by joib ( 70841 )

      That said, the delay in developing these quad core procs shouldn't put that big a dent in the pocket / market share of AMD simply because it's a niche market that has yet to be widely adopted.


      From what I've heard, the Intel quad cores are selling like hot cakes for running virtual machines.

      And it's not only quad core; Barcelona also brings a bunch of core improvements, sorely needed to keep AMD competitive with Core2.
    • There are advantages beyond >4GB of RAM to the AMD 64-bit architecture that you apparently do not realize. 64-bit works smooth as silk on Linux, and you can have 32-bit and 64-bit mixed together on those systems and it still holds up. The incompatibility problems you see on Linux are with desktop configurations, where you need certain closed-source Firefox plug-ins (and you foolishly installed the 64-bit Firefox instead of the 32-bit Firefox), or closed-source 32-bit drivers for your video card, sound card, etc
    • by fm6 ( 162816 ) on Sunday July 22, 2007 @06:36PM (#19949255) Homepage Journal

      There has yet to be a dire 'need' for 64 bit processing...
      As usual, slashdotters are critiquing the computer marketplace as if it were all about them. It's not.

      Of course nobody's running 64-bit applications at home or at the office. Because the dominant player there is Microsoft — whose 64-bit support on the desktop is either lame (try to find even basic drivers for XP-64) or a nightmare (try to run Vista-64 at all!). Can't really run 64-bit apps without a 64-bit OS, can you?

      On the other hand, there's a huge demand for 64-bit apps that run on high-end workstations and servers. How do you think AMD managed to grab so much market share so quickly? By finding a way to meet that demand ahead of Intel, that's how.

      If it weren't for this demand I wouldn't have a job — documenting x64 servers for Sun. Yes, Sun. It's a big profit center for them these days.

      At work, I'm the sysadmin for a dedicated hosting company (Linux, mostly Gentoo), and even in that market I don't know of any of my users running 64-bit. Any performance advantages are outweighed by incompatibilities and the plain old PITA of getting things working.

      All that tells us is that Gentoo 64-bit support sucks and that you're not supporting any high-end applications. What have you got, some low volume commerce and web presence sites? If you were doing millions of transactions a day, you'd be needing to squeeze all the performance out of your servers you could manage. Which is why the big boys run serious 64-bit OSs: RHEL, SLES, Solaris, Windows 2003.
      • Ahh more FUD (Score:4, Insightful)

        by Sycraft-fu ( 314770 ) on Sunday July 22, 2007 @10:09PM (#19951079)
        Vista 64-bit and XP 64-bit work just fine, thanks. I'm running Vista-64 right now. We make a fair bit of use of both at work, precisely because we have apps that run out of room in 4GB of RAM.

        As for your drivers comment, well let's see here: Intel has 64-bit XP and Vista drivers for their motherboards (and by extension graphics) and NICs as far back as their 865 series (anything older doesn't support 64-bit CPUs). Vista-64 has native support for older nVidia chips (GeForce 2 is the oldest I've tried) and nVidia provides downloadable drivers for their 5 (FX) series and newer. ATi likewise has support in the OS for some older chips, and downloadable drivers for the 9500 and newer for XP-64 and Vista-64. Broadcom has XP-64/Vista-64 drivers out for all their NICs (both 44XX and 57XX series). LSI has 64-bit drivers for, well, all their products that I can see for XP and Vista (and Linux and Solaris). Colorvision has 64-bit drivers and is Vista compatible. Logitech, Microsoft, and Saitek all have 64-bit drivers and support apps out for their input devices.

        I could go on but basically any modern hardware seems to have no problems at all with 64-bit drivers. In fact, on all the 64-bit Windows systems I've set up, I've never encountered a component we didn't have a driver for. I'm not saying there aren't some oddballs out there, I'm saying that the vast majority of stuff DOES have a driver and thus it is a non-issue.

        When you are countering some FUD, please don't spread your own. You may not like MS OSes, that's fine, but it is a lie to say that finding drivers for 64-bit Windows systems is hard. The vast majority of devices, including specialty devices (I've got 64-bit Vista drivers for my colorimeter and StudioCanvas, for example), have 64-bit drivers. It is just a non-issue. Far rarer is 64-bit software, but thankfully 32-bit software runs without problems on the 64-bit OS.
      • Re: (Score:3, Interesting)

        by langelgjm ( 860756 )

        Of course nobody's running 64-bit applications at home or at the office. Because the dominant player there is Microsoft -- whose 64-bit support on the desktop is either lame (try to find even basic drivers for XP-64) or a nightmare (try to run Vista-64 at all!). Can't really run 64-bit apps without a 64-bit OS, can you?

        Amen to that. I've run both XP 32- and 64-bit on this machine, and now I'm giving Vista x64 a go. XP 64-bit is a total joke - driver support is almost totally lacking, and now with Vista, I

      • Of course nobody's running 64-bit applications at home or at the office. Because the dominant player there is Microsoft -- whose 64-bit support on the desktop is either lame (try to find even basic drivers for XP-64) or a nightmare (try to run Vista-64 at all!). Can't really run 64-bit apps without a 64-bit OS, can you?
        Linux's 64-bit support isn't up to much either. Whatever OS you use, experimenting with 64-bit is a trail of tears, and of no use to most users.
        • Re: (Score:2, Informative)

          by matthewcraig ( 68187 )

          Linux's 64-bit support isn't up to much either. Whatever OS you use, experimenting with 64-bit is a trail of tears, and of no use to most users.

          This is very much untrue, as I can attest. I have been running a 64-bit OS since 2003, and it runs like a dream. I can't address all the technical reasons why, but I can say that I have no 32-bit libraries and I'm up and running. No tears here.
        • Linux's 64-bit support isn't up to much either. Whatever OS you use, experimenting with 64-bit is a trail of tears, and of no use to most users.

          If you say "end users", I'd tend to agree with you (although the situation is improving at a good pace).

          But for server users, Linux 64bit has been here for at least 2 years now. The early adopters probably started 3-4 years ago. Things were still slightly shaky 2 years ago, but are definitely pretty solid today.
      • I have an AMD X2 3800+ (s939), 2GB of ram and an nVidia GeForce 8800GTS all in an nForce 4 motherboard, and drivers for all of it are available for both Windows XP x64 and Ubuntu x86-64.

        My 250GB Windows disk is overflowing with major games, and all run wonderfully. Except for the occasional one I have to crack, either because it's too old and tries to load a 32-bit driver (understandable) or the company went for a shit copy-protection system that tries to load a 32-bit driver (eg Overlord). That's not such
      • Of course nobody's running 64-bit applications at home or at the office. Because the dominant player there is Microsoft


        Dominant? Yes. Total control? No. Plenty of us run 64bit apps, just not on Windows.
    • If you're running open-source software, then upgrading to 64-bit is trivial.
      • by DrSkwid ( 118965 )
        Have you done it?

        Because I discovered that plenty of open-source apps wouldn't compile. libdv is one, IIRC. Kino is another. Not the most important apps in the universe, but enough to make me waver.

        That said, some fine people have stepped up, done some of the dirty work, and made a 64-bit multimedia distro: http://64studio.com/ [64studio.com]

    • Re: (Score:3, Interesting)

      by SEE ( 7681 )
      The market 'ooh'ed and 'aah'ed in delight at the new architecture, supposing that it would herald a new era of computing in a similar way that the jump from 16-bit to 32-bit did.

      The 80386 was introduced in 1985, but the transition to 32 bits in software was really only done in 1995. Windows 3.1, released seven years after the 386, still ran on the 286. Word 6.0 for DOS, released in 1993, still could run on an original 8086.

      The first 64-bit x86 processors were introduced in 2003. If they "herald a new era of
    • Re: (Score:2, Interesting)

      by Thorrablot ( 590170 )
      Only a fraction of the 64-bit capable desktops and workstations are running 64-bit applications. What's more, there are very few mainstream 64-bit applications out there, despite the fact that for gaming, audio processing, image processing/photoshop apps, and video there would still be performance advantages (memory bandwidth and operation throughput), even if you don't yet have > 4GB of RAM.

      As a veteran of the 16->32 bit transition (for that matter, the 8->16 bit as well), I've been wondering
      • Re: (Score:2, Interesting)

        by Hal_Porter ( 817932 )
        Only a fraction of the 64-bit capable desktops and workstations are running 64-bit applications. What's more, there are very few mainstream 64-bit applications out there, despite the fact that for gaming, audio processing, image processing/photoshop apps, and video there would still be performance advantages (memory bandwidth and operation throughput), even if you don't yet have > 4GB of RAM.

        It's not really compelling - plus or minus a few percent. And you need to test two binaries which is expensive. So
    • Wait...You use Gentoo on 64bit boxes but compile as 32bit?

      No such thing as incompatibilities if you're compiling it.

      (I run two 64bit Gentoo servers fyi)
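For the compile-it-yourself crowd, checking what a toolchain actually targets is a one-liner. A minimal sketch, assuming a multilib gcc (build the same file with gcc -m32 and gcc -m64 to get both answers from one box):

/* Compile-time check of the target word size via gcc's
 * predefined architecture macros. */
#include <stdio.h>

int main(void)
{
#if defined(__x86_64__)
    printf("built for x86-64: %zu-byte pointers\n", sizeof(void *));
#elif defined(__i386__)
    printf("built for i386: %zu-byte pointers\n", sizeof(void *));
#else
    printf("some other target: %zu-byte pointers\n", sizeof(void *));
#endif
    return 0;
}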
  • ... it NEVER MADE a _true_ QC CPU...

    All existing Q6xx0 solutions are dual-dual core, i.e. two dual-cores sharing the same FSB - and that is _NOT_ the same as true QC as Barcelona is claimed to be.

    That difference is enough to make Barcelona the main choice for many core servers even if it were made with old K8 and not the new K10 cores.

    Intel should have true QC chips in a year or so...

    • Intel needs more than just true quad core. They also need memory controllers built into the CPUs, a CPU-to-CPU link that doesn't need to use the NB to talk to the other CPU, and the ability to have more than one NB-like chip, like you can on an AMD system. With a 2-way AMD system you can have 2 chipset links and up to 2 HTX slots.
    • by ciroknight ( 601098 ) on Sunday July 22, 2007 @05:00PM (#19948495)
      Guess what? The market doesn't give a shit, they just want multiples of 4 in one socket, period. Even AMD admits it was a mistake not to go MCM; Intel got the drop on them, and has deepened their lead quite substantially, leaving AMD sitting on their hands with no competitor for far, far too long (and their upcoming competition will quite frankly devastate them in the short run, however in the long run...).

      Intel had the option to rest on its laurels; they don't like to work any harder than necessary to remain on top, and the Core marchitecture gave them a huge.. well I'll say it.. "Leap Ahead" of the competition. Unfortunately, Intel's more of a bunny; hop a few times then get tired and sit around, whereas AMD is more of a turtle (slow to market, but rather constant). We all know who wins the race.
      • If AMD were to go MCM, their better CPU-to-CPU and CPU-to-chipset links (the HyperTransport bus) would make things a lot better than the way Intel does it.
      • by Brane2 ( 608748 )
        Not true.

        If AMD wanted to, they could have had Intel's style of "quad core" long ago.

        Hell, even two "X2 4800" dies on one substrate, connected through an HT link, would be an equivalent, and they could do it in a few weeks even if they decided to go for it _today_. There is not much to it.

        Opteron/AMD64 was _made_ so it could be connected in LEGO-like fashion...

      • Unfortunately, Intel's more of a bunny; hop a few times then get tired and sit around, whereas AMD is more of the turtle (slow to market, but rather constant). Well all know who wins the race.

        I think it's important to remember that Intel inadvertently delivered the high-end server market into AMD's lap.

        Intel had done so much heavy marketing, pushing claims that the Itanium was going to blow away all the proprietary CPU architectures, that damn near EVERYONE EOL'd their Unix servers... Alpha, MIPS, PA-RISC,

    • "True" QC, "fake" QC, what does it really matter? The only things that really matter in the end are performance and price (and possibly power dissipation). From the standpoint of a consumer, the internal technology has no importance at all.

      Now, if you said that "true" quad core was going to make the chips twice as fast as Intel's, at half the price, then that would be interesting. Of course, you could say that the chips would be twice as fast at half the price, and that would be just as interesting - the technology has nothing to do with it.
      • by Brane2 ( 608748 )
        1. It all depends on the task at hand. On tasks with intensive inter-core communication, a true Barcelona could be, say, 10x faster than any Q6xx0 on Intel's side, even without a superior core.

        2. If your general understanding of the problem is poor, then any explanation that could be "interesting" to you is likely to be marketing bull**it, optimized for technical morons.

        You can't make universally valid "X times faster/slower than" comparisons between these kinds of machines.

        Results tend to be program-and-load spec
        • "Barcelona could be say 10x faster than anything Q6xx0 on Intel's"

          BS: At what tasks?

          Because Intel has already demonstrated near-4x performance with its "untrue" quad core. You are not going to get more than 4x single-core performance. There is only a tiny margin of multi-core efficiency that AMD could improve upon.

          AMD's only real Barcelona hope is that it increases basic core performance significantly; there is no magical "real 4-core" performance leap to be had, certainly not a 10X performance increase.
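The 4x ceiling invoked here is just Amdahl's law. A worked sketch with hypothetical parallel fractions (nothing measured on real hardware):

/* Amdahl's law: speedup(n) = 1 / ((1 - p) + p/n) for parallel
 * fraction p on n cores.  Four cores reach 4x only when p = 1.0,
 * regardless of whether the die is "native" quad or MCM. */
#include <stdio.h>

static double amdahl(double p, int n)
{
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void)
{
    double fractions[] = { 0.50, 0.90, 0.99, 1.00 };
    for (int i = 0; i < 4; i++)
        printf("p = %.2f -> speedup on 4 cores: %.2fx\n",
               fractions[i], amdahl(fractions[i], 4));
    return 0;
}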
    • Doesn't matter (Score:4, Insightful)

      by Sycraft-fu ( 314770 ) on Sunday July 22, 2007 @05:03PM (#19948513)
      Geeks love to work themselves into a lather over technical differences, but to the end user, quad core is just 4 processors in a single socket; it doesn't matter how it is delivered. Now, if the Intel solution performs poorly because of the 2x2 design, then it could be a problem. However, thus far, it doesn't seem to. On the kind of apps that can use the power (like a 3D renderer, for example) they just shine.

      In the end it doesn't matter how it is delivered; it matters who can deliver good performance per $$$. Intel's quad-core chips go a long way to doing that in the markets that can use them. The reason is that it gets expensive to add physical processors to a board. A single-socket board might be $100, but the same thing in a dual-socket variety can be $400-600, and you don't even want to see the prices on quad sockets. Thus being able to drop 4 cores into a standard desktop board, even if they aren't a monolithic 4-core package, is a good deal for many.

      Technical arguments and contrived benchmarks mean nothing. The only thing that matters is how fast it runs the things you actually, really do, and how much it costs.
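And indeed, from software's point of view the two layouts look identical. A minimal sketch, assuming Linux or another system providing the common (nonstandard) _SC_NPROCESSORS_ONLN sysconf name:

/* What the OS reports is a core count, not a die layout. */
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    long n = sysconf(_SC_NPROCESSORS_ONLN);
    printf("online CPUs: %ld\n", n);  /* 4 on either quad core, MCM or native */
    return 0;
}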
    • ... it NEVER MADE a _true_ QC CPU...

      All existing Q6xx0 solutions are dual-dual core, i.e. two dual-cores sharing the same FSB - and that is _NOT_ the same as true QC as Barcelona is claimed to be.

      That difference is enough to make Barcelona the main choice for many core servers even if it were made with old K8 and not the new K10 cores.

      Intel should have true QC chips in a year or so...


      You're very convinced the difference will be drastic, which is a very funny thing to be when you've never seen a single benchmark.

      According to
      • by Brane2 ( 608748 )
        I am very "convinced that performance difference will be drastic" since:

        1. This is mainly what currently holds Opterons above Xeons on servers, despite the superior C2D core and heaps of cache.

        2. I have exchanged dual Opteron boards for single socket DC 6000+.
        Despite the HT link and dual RAM banks of existing Opterons being superior for most uses to Intel's shared FSB, there is a tangible speedup just from having a really fast inter-core communication path.

        3. AMD has onboard memory controller even now with K8 and K10 wil
    • AMD have indicated they may do MCM (Multi-Chip Module) in the future, like Intel. But since they are releasing a true quad-core CPU before Intel, it is going to give them an advantage: to make an 8-core CPU, they will just need 2 quad-core chips, whereas Intel will need 4 dual-core chips.

      I wonder how significant this technical advantage really is on various levels: performance, power consumption, reliability, yield, manufacturing simplicity, cost, etc. Could this also mean AMD will be first to market with

    • Oh god these AMD guys just won't drop this.

      WE KNOW ALREADY. The Intel quad core still performs very well in benchmarks - faster than the dual core in multi-threaded apps - so one can only conclude it damn well works.

      As for AMD's 'true quad' performing better, no one knows for sure; there are no real benchmarks yet at this point.
      AMD could release it and say 'wow, our system is 25% faster than the Q6600', and Intel could say 'err, so what' and release the Q6600 clocked 25% faster, because there's so much damn headroom
      • by ErikZ ( 55491 ) *
        I've always been a fan of AMD, but I agree with your point.

        It doesn't matter if it's two dual cores in one package. It doesn't matter if it's 4 single cores with a highly trained monkey dividing up the instructions between the cores. It's the end results that matter.

        However, right now you can't get the $300 quad core from Newegg; it's sold out. You *can* get a $61 dual core from AMD, though. And unless you require a lot of processing power, it's more than enough for most people. Especially college students o
    • ... but alas it's the Q6600 for me, as a) Intel just drop-kicked its price, and b) AMD is still nowhere near releasing quad cores for the consumer market. "Later this year" just doesn't cut it, so here's yet another person switching from AMD (Newcastle 3200) in their desktop to Intel. :P
  • Someone want to buy me one? The only thing I have to trade for it is nothing... but I got a whole lot of it.
  • My gf is going to school in late September and will need a new machine. She's currently looking at the Intel Core 2 Quad, since it just had a great price drop (it's $300 at Newegg), and she may be doing some VMware stuff. I'd love to tell her to wait for AMD benchmarks, but none are out, there is no release date, and even if we had all of that, she doesn't want to spend that much on her system, so knowing the price would be nice (it would suck to wait till September only for Barcelona to be $800).
    • by Dogers ( 446369 )
      The chip being previewed here is the server edition. You'll be wanting to look at the Phenoms, which are apparently being previewed tomorrow on Tech ARP
      • Well, I'll look for it tomorrow at work. I'm not too worried about the chip being a server edition or desktop edition. Whatever is the cheapest/best-performing is what we want to go for. Hell, one of my "desktop" machines uses an IBM Cell processor.
        • by Dogers ( 446369 )
          Well, the server version uses Socket F, the desktops will use AM2/AM2+

          But since you have a Cell system, I guess you're not too worried about upgrade paths ;)
    • She's unlikely to tax either one.

      She'll need a lot of RAM for VMWare to work well. That will have a huge impact on your cost calculations. Use RAID for performance, she'll need that too.

      Processors are important but they're not the whole answer in the performance equation. At this time the bottlenecks tend to be more in the RAM and HD I/O.

      • Well, the system will be used for many different things (from gaming to VMware to multi-threaded programming things I want to try out), and we're hoping to future-proof it so we won't feel the need to upgrade for a while. The big thing concerning me is that the Intel Core 2 chips have so many bugs in them.
        • They're called errata. The most recent bunch are more plentiful than usual but it's not unheard of. Get your microcode updates, whichever [microsoft.com] vendor [linuxbios.org] you get your chip from. AMD calls them BIOS updates which partly makes sense since you usually patch the BIOS at the same time. You get them from the OEM of your motherboard or system usually but as you see from those links operating system vendors can put them out too. The errata that have been in the press lately are unlikely to affect chips you buy right n
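Verifying that a microcode update actually took is straightforward on Linux: kernels new enough to report it expose the loaded revision in /proc/cpuinfo. A sketch (field name and availability vary by kernel version):

/* Print the running microcode revision, where the kernel reports
 * one; the value normally matches across all cores. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[256];
    FILE *f = fopen("/proc/cpuinfo", "r");
    if (!f) { perror("/proc/cpuinfo"); return 1; }

    while (fgets(line, sizeof line, f))
        if (strncmp(line, "microcode", 9) == 0) {
            fputs(line, stdout);
            break;
        }
    fclose(f);
    return 0;
}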

          • I know no system is future-proof, but if you buy right, a system can last you years. I still use my IBM ThinkPad T40 (1.5GHz Pentium M) as my laptop because it works so well for what I need. My current desktop is an AMD X2 4400+, and I am very happy with it and usually don't use 100% CPU. Since we can wait till late September to get the system, I'd rather be 100% sure of what we want than buy now and wish we'd gone with something else.
          • Re: (Score:3, Interesting)

            No computer is future-proof. You can get some extra months out of one by buying above average, but the best desktop you can get today will still look sad in three years. Pay extra for bleeding edge if you want to, but the best value is middle of the road and frequent upgrades.

            If you had written that statement in the late 90s or even as late as 2002, I'd agree with you. But system performance stopped doubling every 18-24 months a long time ago. Now it's closer to 36-60 months (although dual-core and quad-cor
              • ... but I think your information is a little dated. For many years what I said was true. Then for about 18 months it wasn't. I believe the race is back on now.

              Today's quad-core, VM-supporting, 64-bit machine will be quite useful in 8+ years, especially if you get the ULV processor. But compared to what's on the shelves at WalMart on that day, it will still look dated.

              What I wonder is what sort of hideous application would require the type of computing horsepower that should be available 8 years hence. By then

              • I still doubt that we're going to see single-core performance increases like we did in the mid/late-90s. Current CPU cores, even with the smaller processes, seem to be limited in how much they can ramp up.

                While I'm a fan of multi-core (dual-core has been a minimum recommendation from me for the past year, ever since the Athlon64 X2s broke the $150 barrier), I do question the idea of more than around 4-8 cores on a consumer / light-business desktop. For the power users, yeah, they'll be building systems with 4
                • I agree. Where we are right now is that the pace of improvement will not stop even though the baseline product is far more capable than 90% of the people need. I don't expect this to change any time soon. The person who can make efficient use of a top of the line pc will only become more rare.

                  The baseline will continue to improve at remarkable rates as each innovation on the bleeding edge drives prior art downmarket.

                  Peripherals are a different story. I expect more innovation in human-machine interface

      My gf is going to school in late September and will need a new machine. She's currently looking at the Intel Core 2 Quad, since it just had a great price drop (it's $300 at Newegg), and she may be doing some VMware stuff.

      I sincerely doubt that anyone still in school - any school - is going to overtax the current Core 2 Duo/AMD-64 X2 offerings available today. Short of running simulations of the Universe in real-time, or high resolution Maya renderings (remember when Photoshop was once the app that justified the

      • I'm currently doing research with MRI imaging on multi-core systems and she may be joining in. Plus we're Gentoo users ;)
      • $300 is not an unreasonable amount for someone to spend on the CPU in a system that they think will need horsepower. It's when you go beyond the $250-$320 price range that prices escalate rapidly compared to the performance increase. I would dissuade most users from spending $600 on a CPU, but wouldn't be as negative towards someone in the $250-$300 range.

        Of course, for the budget users, I'd be pointing them at the lowest cost dual-core CPU. Or possibly a few steps up from the bottom. There are some p
  • by Kjella ( 173770 ) on Sunday July 22, 2007 @04:46PM (#19948369) Homepage
    Intel basically took a big hammer to any AMD claims of "more affordable quad-core" with their cut today from $530 to $266 for the cheapest quad core, which I doubt AMD can do much better than. I also don't expect them to top the QX6850 for performance right off the bat, since they clearly fail to do so in dual core. AMD is bleeding a lot of money right now, and Intel knows to push when it hurts. Right now AMD is staying competitive, but with the massive cuts to margins it can't be good for either profits or R&D. Intel is not going off on a huge strategic blunder like the PIV or Itanium again; this time they're on the ball, and overclocking results suggest they have a lot of headroom.

    The latest batch of ATI cards have failed to compete with the 8800GTX and instead compete against lower clocked cards, presumably again with cut margins. Right now AMD and ATI to me look like two second place companies, and if they try to integrate closer they'll drag each other down. I'm certainly not inclined to buy those two as a package...
    • by bluSCALE4 ( 975190 ) on Sunday July 22, 2007 @04:58PM (#19948473)
      That's because you're not aware of the power of using the GPU in coordination with the CPU. Folding@home certainly shows the GPU as a force not to be neglected. You also fail to factor into your comment that quad cores are two dual cores. AMD and Motorola would do this sort of thing to claim next-tier technology when in reality they were today's tech on steroids (They often fix GHz speeds with two CPU sets). Now, for some reason, AMD has opted not to do that, and we'll see the true worthiness of this strategy with the release of the true quad core.
      • Last I heard, AMD is still a process generation behind. Intel can stuff two dual-core dies in a package and still not exceed the TDP of AMD's FX series. I think it may be that AMD just couldn't stuff two dual cores in a package because of the power consumption.

        Motorola is no longer a player in the desktop CPU market, and hasn't been for several years now; I'm curious why you bring them up. Their products haven't been put into a new notebook computer for over a year and a half now.
        • Re: (Score:2, Troll)

          by LuSiDe ( 755770 )

          Intel can stuff two dual core dies in a package and still not exceed the TDP of AMD's FX series.
          1) However, AMD's FX series have a far lower idle W (9 versus 35 on Intel, or something).
          2) If you use AMD's X2/EE versions your TDP gets far lower (whereas cost of CPU increases by merely a couple of bucks). If you use X2/EE/SFF you get a TDP of 35W.
          3) Price/performance wise, AMD is still a clear winner, hands down.
      • Re: (Score:3, Interesting)

        by Kjella ( 173770 )
        You also fail to factor into your comment that quad cores are two dual cores.

        I know, and I also don't give a shit. I got a single-socket mobo and four cores running, you don't. I don't need a special and expensive dual-socket mobo, eATX case or whatnot. That's 99% of the advantage there already. The notion that "real" quad-core makes a big difference is at best disputed, maybe if you have a lot of core-core communication but well... I don't see how that could be a very big bottleneck for normal quad-core u
    • Re: (Score:3, Informative)

      by dpilot ( 134227 )
      And if it continues to go this well, Intel will push AMD entirely out of the competitive CPU marketplace. Next they'll go after VIA in the low-end, low-power markets and drive them out, and they'll reinvigorate their efforts on IA64, attempting to go after the high-profit Sun and IBM sockets.

      In essence, the desktop will slow and rot, perhaps giving us another boneheaded move like NetBurst.

      You can take all of that with a grain of salt, but remember this... It's been hammered here many times before that a com
    • Re: (Score:3, Informative)

      by kestasjk ( 933987 )

      Intel is not going off on a huge strategic blunder like the PIV or Itanium again, this time they're on the ball and overclocking results suggest they have a lot of headroom.

      Really? I'm not so sure.

      Sooner or later they're going to have to go for something similar to an Itanium processor. Once pushing clock speed runs out, pushing cores runs out, pushing micro-op improvements runs out, they're going to start looking at the instruction set.
      You can bet that if they could change the instruction set at a whim, they would have done so long ago, and the processor would perform much better.

      I think it's inevitable that in the next 10 years things will start to look towards It

      • I think it's inevitable that in the next 10 years things will start to look towards Itanium (or an equivalent), because changing the instruction set will provide a lot of untapped processing power.

        We've been hearing things like this for a lot longer than ten years. I'm not an expert in this area (or any other, I'm more of a jackoff of all trades) but I do have a few observations to make.

        Itanium was supposed to be inherently faster than x86. It hasn't proven to be so. They did load it up with fancy FP hardware and, lo and behold, it was fast. It's still far cheaper to use a group of AMD processors if you want fast FP. And now they're going quad... That's mostly irrelevant though, the point is that

      • You can bet that if they could change the instruction set at a whim, they would have done so long ago, and the processor would perform much better.

        No. No it wouldn't. It might perform marginally better, as in maybe 1-2%, if you could get rid of all x86 cruft. That number is mostly a WAG, but I've done performance modelling studies to back it up (or more specifically, I've modelled x86 cores without x86 cruft because it's easier, and adding in the cruft for accuracy costs a percent or two on average).

        P
    • Re: (Score:3, Insightful)

      by tknd ( 979052 )

      Right now AMD and ATI to me look like two second place companies, and if they try to integrate closer they'll drag each other down.

      I look at it this way: there are only three players, AMD, Intel, and nVidia. Beyond that you're not going to find a chipset, cpu, or gpu worth anything. The only company that (now) has sufficient expertise in all three areas is AMD. Intel has done a good job with centrino, but clearly has no interests and lacks knowledge in the GPU arena (they've only done the bare minimum w

  • The Doctor: Rose Tyler, I was gonna take you to so many places. Barcelona. Not the city Barcelona, the planet Barcelona. You'll love it, fantastic place, they've got dogs with no noses!
    [laughs]
    The Doctor: Imagine how many times a day you end up telling that joke, and it's still funny!

    Well, that aside; it had to be done. But to business. I remember when 64-bit CPUs came out; Sun and a few others had them (12 years ago), and the programming took forever to catch up. They are still selling 32-bit processors
