AMD Hardware

THG Labs In Depth With AMD Spider

The Last Gunslinger writes "Tom's Hardware Guide has published detailed results of their laboratory analysis of AMD's recently released Spider platform, including the Phenom 9500 and 9600 running on 790FX chipsets. Amongst other interesting details, the 2.4GHz Phenom 9700 has been pushed back to Q1 2008. The 2.3GHz Phenom 9600 benchmarks on average 13.5% lower than Intel's Q6600 quad-core CPU...and the MSRP for the Phenom is about 13.6% less as well. Much is made of the AMD OverDrive utility, by which the THG labs were able to OC the Spider platform by 25% (3.0GHz) using air cooling alone."
  • With the new 7-series chipset family, consisting of the 790FX, 790X and 770, AMD is simultaneously unveiling the Spider platform. Up to four graphics cards can be set up as a Crossfire X configuration using the new 790FX chipset.

    Four graphics cards! Now that sounds like a gamer's wet dream. These days, CPU performance is not nearly as important as GPU performance. Four GPUs running in parallel, with the right level of support in DirectX and OpenGL, and you can just imagine the FPS! That's the real news of interest in this article, IMHO.

    • by Ed Avis ( 5917 )
      Has gcc been ported to a GPU yet? Can you compile kernels (or Gentoo) on your video card?
      • Re: (Score:2, Informative)

        Has gcc been ported to a GPU yet? Can you compile kernels (or Gentoo) on your video card?
        It looks like there might be some work ongoing in this area [gpgpu.org], yes.
        • by Wavicle ( 181176 )
          Um, that is someone talking about writing a GCC backend to target GPUs. Completely dissimilar to the question being asked: can you run GCC on a video card.
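          To make that distinction concrete, here is a rough, illustrative Python/numpy sketch (not anything from TFA, and the function names are made up): the first function is the kind of uniform, branch-free array math a GPU-targeting backend can map onto stream processors, while the second is the branchy, data-dependent work a compiler like GCC actually spends its time on - which is why "compile for the GPU" and "compile on the GPU" are very different questions.

              # Illustrative only: numpy stands in for the data-parallel style a GPU backend targets.
              import numpy as np

              def saxpy(a, x, y):
                  # One arithmetic operation applied uniformly to every element:
                  # the shape of work GPUs (and GPU-targeting compiler backends) are built for.
                  return a * x + y

              def tokenize(source):
                  # A crude lexer: data-dependent branches, variable-length state, irregular
                  # memory access -- the control-flow-heavy work a compiler is made of.
                  tokens, word = [], ""
                  for ch in source:
                      if ch.isalnum() or ch == "_":
                          word += ch
                      else:
                          if word:
                              tokens.append(word)
                              word = ""
                          if not ch.isspace():
                              tokens.append(ch)
                  if word:
                      tokens.append(word)
                  return tokens

              if __name__ == "__main__":
                  print(saxpy(2.0, np.arange(4.0), np.ones(4)))        # [1. 3. 5. 7.]
                  print(tokenize("int main(void) { return 0; }"))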
      • I don't think you can parallelize compilation of a single program. While being able to use the GPU for a stage one Gentoo install would save a lot of time, it won't really help with compiling OpenOffice.org or any other similarly enormous piece of software, which are the enormous timesinks that come with that distribution. If you planned it right and did all your compiling at once you might be able to save a little bit of time by compiling one of those big programs on one of the GPU processors while you com
        • I don't think you can parallelize compilation of a single program.

          Where do people come up with this stuff?

          It's currently annoying to parallelize the compilation of a single source file, but non-trivial applications have a whole bunch of source files - so parallelizing compilation of whole applications is really easy. In fact, compiling the Linux kernel was one of the benchmarks I remember seeing used to demonstrate the advantages of multiprocessor machines back in the '90s.
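          For anyone who hasn't seen it in action, here's a minimal sketch (in Python, with made-up file names and flags - it assumes gcc and those sources exist) of the file-level parallelism that make -jN gives you for free: compile independent translation units concurrently, then link serially.

              # A minimal sketch of file-level parallel compilation, i.e. what "make -j4" does for you.
              # The source file names and compiler flags are hypothetical.
              import subprocess
              from concurrent.futures import ThreadPoolExecutor

              SOURCES = ["main.c", "parser.c", "codegen.c", "optimizer.c"]

              def compile_one(src):
                  obj = src.replace(".c", ".o")
                  # Each translation unit is independent, so these gcc invocations can overlap.
                  subprocess.run(["gcc", "-O2", "-c", src, "-o", obj], check=True)
                  return obj

              if __name__ == "__main__":
                  # Compile all files concurrently...
                  with ThreadPoolExecutor(max_workers=4) as pool:
                      objects = list(pool.map(compile_one, SOURCES))
                  # ...then link serially, since linking needs every object file.
                  subprocess.run(["gcc", "-o", "app"] + objects, check=True)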

    • Re: (Score:3, Insightful)

      by Dunbal ( 464142 )
      Four graphics cards! Now that sounds like a gamer's wet dream.

            Yep, only you'll need a 2kW power supply. Can't wait to see your electric bill :)

      • You won't be using them 24/7, so you'd hardly notice a difference between one of these video cards and four of them.

        And your 1kW power supply is usually rated for 'peak' output, not continuous output.

        And I have space inside my case for two power supplies, so I'm all good.
        • by Dunbal ( 464142 )
          You won't be using them 24/7

                And you call yourself a "gamer"? Heh :)
        • If ATI only came through with the goods that they hyped, such as accelerated video encoding using the cards, then there would be those of us with these on 24/7. I invested in 2xX1800 (while these things were expensive) and I'm still waiting to see a realistic return on that investment.

          DUE SOLELY TO THAT, when my video encoding system gets an upgrade it's going Intel+NVidia+Linux... a complete swing from its current AMD+ATI+Windows deployment.
          • If ATI only came through with the goods that they hyped, such as accelerated video encoding using the cards, then there would be those of us with these on 24/7. I invested in 2xX1800 (while these things were expensive) and I'm still waiting to see a realistic return on that investment.

            My guess is that ATI said that "video cards can be used to accelerate video encoding", and you misunderstood it to mean "we are going to magically force the people who make your video encoding to optimize to use our current l

    • by bl8n8r ( 649187 )
      > Four graphics cards! Now that sounds like a gamer's wet dream.

      Yes, but unlike wet dreams, 4 gfx cards take a lot more power off the grid. At some point, people are going to have to get by without excessive dependency on energy. It's really just a matter of how bad things will get before people start thinking about conservative choices.
      • Re: (Score:3, Insightful)

        You seriously need to get a sense of proportion on A.) what energy supply problems our society has and B.) what stuff uses how much energy.

        I'll give you a hint - no-one's going to have to give up having a ridiculous gaming computer until long after everyone's replaced their electric ovens. Things are different for servers and workstations, but the only rational reason why power consumption matters in personal gaming machines is the fact that cooling is noisy.

    • by wwmedia ( 950346 )
      there there! do you need a tissue?
    • Re: (Score:3, Informative)

      by Hemogoblin ( 982564 )
      This seems an odd move by AMD. I thought that multi-GPU systems were a complete failure? Take a look at Valve's recent hardware survey:

      Multi-GPU Systems (1073 of 269297 Total Users (0.40% of Total) )
      NVIDIA SLI (2 GPUs) 880 82.01 %
      ATI Crossfire (2 GPUs) 193 17.99 %

      So only 0.4% of Steam's users had a multi-GPU system. Maybe this segment is actually profitable, but it's hard to imagine that with such low numbers.
      • Oh, here's the link to that survey [steampowered.com].
      • I would agree with you that it is a small segment (only 1000 Steam users), but just how much money is coming from just those 1000 users? A lot of these multi-card gamers are probably buying two relatively expensive, high-margin cards. The R&D on NVidia and ATI's side is done; now that second card bought by gamers is just extra dollars in the bank for them.

        Now I have to ask you why it's a complete failure? Because they only managed to con 1000 Steam gamers into buying an extra card? Then maybe. Becau
        • Well, that survey shows ATI sold about 200 extra cards. For the sake of the argument, let's say they get $100 in profit off each card. That's a profit of only $200,000. That barely pays for the yearly salaries of a couple of engineers.

          Of course, there could be any number of problems with this brief analysis. For example, the Steam hardware survey may not be representative of all gamers. Even so, this is at least some evidence that multi-GPU isn't the new sliced bread.
          • by Thing 1 ( 178996 )

            Of course, there could be any number of problems with this brief analysis.
            I'll say! 200 * $100 = $20,000. That's about one month's expenses for a couple engineers (not salary, total expenses).
        • by Wavicle ( 181176 )
          I would agree with you that it is a small segment (only 1000 Steam users), but just how much money is coming from just those 1000 users? A lot of these multi-card gamers are probably buying two relatively expensive, high margin cards. The R&D on NVidia and ATI's side is done, now that second card bought by gamers is just extra dollars in the bank for them.

          I'm not sure where your economic theory is coming from; I assume you don't actually work in the semiconductor industry. So let someone who does clue y
          • Your analysis is flawed. It assumes that every multi-GPU system owner owns a Steam game AND takes part in their survey. If I remember correctly, you could opt out of sharing system and usage data.

            I can only guess at the actual sales numbers, but if all you would need is to triple that to make a profit, that's easy. I doubt Steam represents more than 10% of all gamers. Then we're assuming all the people running multi-GPU setups are using it for gaming too.

            Then, on top of that, this setup is barely o
            • by Wavicle ( 181176 )
              I was giving them the benefit of the doubt. Their multi-GPU design and validation costs for each generation likely stretch beyond $10 million. nVidia doesn't have a fab of its own, so it must contract out samples for post-silicon testing. I think your "not everyone owns Steam" point is countered by my insanely low estimate of per-GPU validation cost. Remember, that's 1000 multi-GPU Steam users across every GPU that has been made. If my $1 million figure is correct, they'd have to get 30x that many people bu
      • If you ask me, it's because up until now (with the release of the 3800 series), multi-GPU systems didn't give you that much more of an increase in performance when compared to the amount of money you were spending.

        If you bought two low-end cards and ran them together, it wasn't worth it (the performance increase was negligible), but if you bought two high-end cards (which sometimes showed quite a bit of improvement), you had to spend a fortune.

        The benchmarks I've seen from this new 3800 series, though
      • Think in terms of stream compute clusters. Four-way GPU setups make complete sense in this space, and while the gamer crowd is a big spur for new tech, the high-performance cluster computing space is shaping up to be an easier target and just as much a spur for that same class of tech. Having said this, the fact that they're not upping the cache (which is where that discrepancy in performance in the benchmarks is REALLY coming from...) to match Intel's lead in this space (with an insane 8MB of L2 on the top-en
        • they're not upping the cache (which is where that discrepancy in performance in the benchmarks is REALLY coming from...) to match Intel's lead in this space

          The difference isn't that big. (~13% is mentioned in the /. blurb).
          Now throw in the price difference, and in the end it'll probably turn out, as usual, [tomshardware.com] that AMD remains an interesting solution that'll give you the most for your buck.

          And, although intel will be king for the enthusiast market, where gamers are willing to shell out big wads of cash for some "Extreme

          • by Svartalf ( 2997 )
            Heh... There IS a reason that they bought ATI. It's the very thing you're telling me here.

            To be honest, the bang-for-buck comparison at the top end is something I didn't compare. There are slots in the price points where it makes slightly more sense to buy Intel, and quite a few slots where you're better served with an Athlon64-class CPU. I picked a slot where it made sense for a Core Duo for my last purchase- in 4-6 months, the next major upgrade will be for a quad-core rig lashup. I'll probably
  • How many pages? (Score:5, Insightful)

    by Barny ( 103770 ) on Monday November 19, 2007 @08:43AM (#21406449) Journal

    PAGE 1 of 42


    Ok, I can deal with it taking a few pages, and you wanting a few ad hits, but only taking up half of my screen width, and then only using 1/3 of the remainder for text broken by seemingly useless photos... not going to bother.

    Summary 4tw
  • Did they get this platform from a guy who lives in a Ford Taurus behind a Wal-Mart?
  • If it translates to 13.5% or more extra performance in the same tests, then they're onto a winner with the overclocking audience (who love numbers and driving a hard bargain).
    • Re: (Score:2, Informative)

      by DrMrLordX ( 559371 )
      Unfortunately the stability of B2 chips past 2.3 GHz has been called into question thanks to problems [theinquirer.net] with the Translation Lookaside Buffer (TLB). Anandtech was unable to get their B2 chip stable past 2.6 GHz [anandtech.com] despite the fact that it would run at speeds as high as 3 GHz. It is telling that reviews on AMD-supplied systems (like Tom's) did not include any real stability testing of the much-touted 3 GHz B2-stepping Phenom X4.
    • by fitten ( 521191 )
      Only if they just like to overclock for the sake of overclocking. Even when you overclock the part to its highest (those who went to Tahoe played with hand-picked AMD chips that could overclock to 3GHz; everyone else had problems with stability at even 2.8GHz), it's still outperformed by a non-overclocked mid-range Intel part. So, even if some fanboi starts crowing about overclocking their Phenom(inal failure), almost anybody can say "Yeah, but my stock mid-range Intel part is still faster than your OC'd
  • by JohnnyBigodes ( 609498 ) <morphine@NosPaM.digitalmente.net> on Monday November 19, 2007 @08:53AM (#21406517)
    For Gord's sake, not THG... They're well known for accepting "tips" in the past, they have a horribly laid-out site that's 90% ads and 10% content, and their reviews are anything but "in-depth", catering to the lowest common denominator. I also love it when they draw brilliant "conclusions" that contradict their own data.

    THG is a wart on the face of internet journalism; in fact, it can't even be called that. Unfortunately they still have too much weight for $ome rea$on.
    • by Carewolf ( 581105 ) on Monday November 19, 2007 @09:57AM (#21407041) Homepage
      They used to be good; in fact, they probably used to be the best. That's why they still carry some weight this long after they sold out and lost their dominance in hardware news.
    • Re: (Score:3, Informative)

      by zerkon ( 838861 )
      The biggest reason I still go to their site is the CPU/GPU charts. Google "Tom's CPU charts" if you haven't seen them. They're handy whenever I'm building a computer for someone and need to know at a glance which $SUPERAWESOMECPUNAME is better and whether it's worth the price.
    • History repeats itself. But if that's the case then let's follow this line of logic:
      1) Intel will release a 128-bit N-core CPU that will be dirt cheap and universally accepted and praised for a decade or more (re: 8080, 8086, 80186, 80286)
      2) Intel will rely on this single, gargantuan leap forward to build its product line for the next decade. (re: 80386, 80486, Pentium etc.)
      3) A competitor will beat Intel at its own game by releasing chips that are just slightly faster, and just slightly cheaper (re: AMD)
      4
  • by Peter Cooper ( 660482 ) * on Monday November 19, 2007 @08:55AM (#21406527) Homepage Journal
    Much is made of the AMD OverDrive utility, by which the THG labs were able to OC the Spider platform by 25% (3.0GHz) using air cooling alone.

    And almost everyone with a Q6600 can get it up to 3GHz on air too, even on the stock heatsink. With something a little more special, like a Thermalright air cooler, speeds of 3.4-3.6GHz are not uncommon. If we look at the benchmarks in the TomsHardware article, the Phenom gets its ass kicked nearly everywhere across the board. It can be argued that this is because most apps are not optimized for quad-core chips yet, but even in the benchmarks where quad-core is clearly a benefit, Intel still edges out a respectable lead with its somewhat older technology.

    The advantages of the Spider platform are that you won't need to buy a new board for future processors

    We've heard that before! Okay, AMD has done something pretty clever in making the chips compatible across the board... but I'm willing to wager that the percentage of PC owners who actually upgrade their machines year by year is fairly low. There are a lot of enthusiasts who do it, and this would likely be AMD's market if their performance weren't so poor compared to Intel's nowadays, but computer parts are cheap enough to just get a new machine every couple of years instead. Certainly this won't be of any interest to the main manufacturers.

    Still, I'm glad AMD's there. Their presence is helping to keep Intel honest and the prices generally low, but as an ex-AMD diehard, I'm not seeing any reason to go back to them yet.
    • We've heard that before! Okay, AMD has done something pretty clever with making the chips compatible across the board..

      If we leave the gaming/enthusiast and CAD/3D markets out, Intel has been doing just that for years. Coupled with Intel's love for open source drivers, Intel-only systems are a great way to ensure Linux compatibility. Nice example: Lenovo's X61: lspci lists 23 devices, of which only the CardBus bridge, FireWire and SD card reader (i.e. stuff not critical to system functionality) are not made by I

    • Disclaimer: This comment deals with my experiences and opinions. Please do not reply complaining or flaming because I do not share the same opinion as you.

      I've always enjoyed AMD's products... I don't know if it's some force from above, but whenever I deal with any Intel system, it "feels" slower. I'm not saying that it is, just doesn't feel as snappy.

      I have an (older) AMD system. It's an Athlon XP 2500+, 120GB SATA drive, 1GB generic DDR RAM. I run Gentoo linux.
      I'm not a gamer nor do I do anything
      • by turgid ( 580780 ) on Monday November 19, 2007 @03:04PM (#21411745) Journal

        I've always enjoyed AMD's products... I don't know if it's some force from above, but whenever I deal with any Intel system, it "feels" slower. I'm not saying that it is, just doesn't feel as snappy.

        That's because intel's front-side bus architecture, off-chip memory controller and inefficient caches hinder performance, especially under heavy multitasking. You'll also note that multiprocessor (i.e. multicore) intel systems scale very poorly as the number of cores (or processors) goes up, compared with AMD processors, which have a more sophisticated design.

        As code becomes more parallel as a matter of course, we'll see these effects becoming more important. Next year, intel is bringing out a more AMD-like NUMA architecture (new processors, chipsets and motherboards) to try to address these issues. AMD has a five-year head start.

      • Good post, but as you're a user whose requirements are a couple of years behind the curve it doesn't make sense for you to buy cutting edge products.

        Regarding your question, nearly all of the Core 2 Duo 6x and 4x series processors are 65W. Depending on your supplier, you should be able to get a 2.33GHz Core 2 Duo on your budget which would vaguely be able to keep up with an X2 6000+ but still have significant overclocking potential (even on stock cooler - these chips work so well that they almost seem desig
      • If you can find a suitable board, use a CPU intended for laptops; they are more power efficient than desktop CPUs.
    Not to mention you lose features when you use older boards with new processors. For example, FTA says HyperTransport drops from 3.0 to 2.0 (whatever that means), and DDR3 won't be supported if your board only supports DDR2, obviously.

      I used to intern at AMD during the K8 days (Athlon 64). It was awesome that they came out with the chip and kicked Intel P4's ass. But now I guess the juggernaut took notice and focused its efforts on regaining the performance title and look at what they were able to do.

      I also
  • Canned benches (Score:5, Informative)

    by DrMrLordX ( 559371 ) on Monday November 19, 2007 @08:56AM (#21406539)
    Tom's Hardware agreed to the terms of AMD's carefully-managed benchmarking sessions. Way to drink the Kool-Aid, Tom's. Anand stuck up for his own integrity as a reviewer and produced a much better review [anandtech.com] of the chip. Moral of the story: If you want a Phenom X4, wait for the B3 stepping!
  • 42 Pages... (Score:5, Informative)

    by darthflo ( 1095225 ) on Monday November 19, 2007 @08:58AM (#21406553)
    Since THG managed to inflate this a wee bit too much, here's a quick summary of what's new:

    - Up to eight processing cores (one quad-core CPU, four single-core graphics cards)
    - Targeted, of course, at the enthusiast market.
    - Weird bug when running >2.3 GHz. Top-end model (Phenom 9700) not available until Q1 2008. Disabling the L3 cache's Translation Lookaside Buffer fixes this but costs some 10% in performance.
    - (According to THG) processors are some 13% slower and cheaper than the corresponding Intel models. Graphics performance shows more variation; nVidia stays the undisputed performance king, with its relatively new 8800 GT being arguably the best midrange choice.
    - Up to 42 PCIe 2.0 lanes total; Graphics via 2x16 or 4x8.
    - Power-efficient Northbridge (some 10 Watts of usage) and GPUs (especially in 2D mode which is, thanks to Aero, Aqua and Compiz, slowly disappearing)
    - Lots of criticism for stability problems in test systems (not too troubling) four days before launch (troubling).

    Long story short: AMD, thank you very much for trying, I'll stay with, and continue recommending, Intel/nVidia.
    • Re: (Score:3, Interesting)

      by Hemogoblin ( 982564 )
      ... and their conclusion:

      In the end, if you're looking to make the most of a long-term investment, AMD is without a doubt the better platform choice.
    • Long story short: AMD, thank you very much for trying, I'll stay with, and continue recommending, Intel/nVidia.

      Gotta love competition.

      A few years back I was a big fan of AMD/ATI for gaming. You could get a blazing fast CPU/GPU for quite a bit less than Intel and nVidia were offering. Left you money to throw into fancy cases, ginormous power supplies, light-up fans, big-ass monitors...

      These days I'm not so impressed with AMD or ATI. AMD is still making decent processors, but they don't seem to be top of t

      • These days I'm recommending Intel/nVidia to anyone interested in gaming - due largely to the fact that AMD and ATI kicked Intel's and nVidia's asses for a while... And I'm sure that eventually AMD/ATI will make a comeback, and I'll wind up using their hardware again. Back and forth...and in the end the one who really wins is the customer...

        Given similar price/performance between products, it's always a better consumer choice to buy from the financial underdog. Otherwise you risk healthy competition degradi

        • AMD has the best performance for the buck. 99% of people would be happy with AMD X2 AM2-socket processors. That's what I use. I also use nVidia graphics - on-board graphics chips are quite good these days.

          Intel CPU prices are generally higher than AMD's, and ATI drivers suck. Hence my decision. Oh, and memory bandwidth on AMD processors at least used to be higher than Intel's - not sure where that stands right now.
        • Currently AMD is producing excellent products in the midrange market segment (where almost everyone actually buys stuff)
          For my customers, I price out whatever meets their needs. We sell plenty of AMD machines and usually the video card doesn't matter at all. For myself, I do a lot of gaming, so I am not generally looking at midrange stuff - hence my preference for Intel/nVidia at the moment.
    • The sole reason Intel has an edge is AMD. Without AMD you would still be heating your home with a nice Pentium 4, single core, at about 3 GHz this year. Competition is very important. And in performance per dollar AMD is at least on par with Intel, so there is not even any harm done by suggesting it. But if everyone starts suggesting Intel, AMD will go down the drain. They are pretty close anyways.

      Or do you want to start paying upwards of 200 dollars for a processor again? Just look at what
      • "You should buy a less-optimum system now, so I can have a better one in five years."

        Screw that. I didn't buy Intel a few years ago when P4 sucked, and I'm not going to buy AMD now when they suck.

        > And remember that Intel is hated even more than Microsoft by many in the industry.
        Ermm... and AMD is hated by many as well, I assume? What's that got to do with the price of tea in China, or what processor is in my PC?
        • by Britz ( 170620 )
          Actually, if you have the money for a Core 2 Q6600 and a board to go with it, go ahead. They are so cheap because of AMD. But AMD doesn't even make processors that fast, so they have no choice but to sell theirs cheaper. And for processors half as expensive, AMD packs more punch per dollar.
  • Sort of like your wife's spending going down 13% along with her fidelity rating.

    Or, your heart surgeon costing you 13% less, but he went to medical school in Haiti.

    Or, you survived that Grizzly Bear attack, and lost only 13% of your face.

    Psst! AMD! That ain't competition, baby!

  • The 2.3GHz Phenom 9600 benchmarks on average 13.5% lower than Intel's Q6600 quad-core CPU...and the MSRP for the Phenom is about 13.6% less as well.

    So..another second-fiddle AMD chip? Are they going to try to release something better than the competition at some point or stay the cheaper #2 for another few years? I don't really understand their marketing scheme here. Gamers will pay more for better performance. Nobody is buying quad-cores just on a whim. Intel could cut their quad-core prices at any time

    • It's not about 'not trying' to create performance-leading parts, but rather the simple fact that AMD don't have the manufacturing clout that Intel does. Regardless of the engineering talent at the two firms, Intel have always been one step ahead with regard to manufacturing processes and die shrinks, and it's telling that the only time AMD have ever really led the market performance-wise for any period of time was when Intel made a huge mis-step with their Netburst architecture. Short of another slip-up fr
    • People will buy it because it is cheap. AMD survives, like they did in the past, by being cheaper than Intel and by Intel being greedy. Intel has tried the non-greed option for a year or so now in the hope of smothering AMD, but sooner or later the Intel stockholders are going to demand that Intel raise prices to what people (fanboys) are willing to pay for the privilege of rooting for #1.
    • Gamers will pay more for better performance.

      This is largely false. The vast majority of real gamers want to pay a moderate amount for decent performance. Hardcore gamers who are willing to spend $250+ on just their CPU (or video card) are a tiny minority.

      With this new Phenom release (and the 38xx video cards last week), AMD has a very competitive platform for mainstream gamers. The only thing that really sucks for them is the overhyping of the high end by benchmark sites - many people will buy Intel because

    • So..another second-fiddle AMD chip? Are they going to try to release something better than the competition at some point or stay the cheaper #2 for another few years? I don't really understand their marketing scheme here.

      Do you think AMD could have made a better part than Intel, but decided not to? What a bizarre idea. It's overwhelmingly more likely that this is the best AMD can do.

    • by Wavicle ( 181176 )
      The 2.3GHz Phenom 9600 benchmarks on average 13.5% lower than Intel's Q6600 quad-core CPU...and the MSRP for the Phenom is about 13.6% less as well.

      Yeah, this is just THG trying to please AMD. Newegg, which is not the cheapest around, sells the Retail boxed Q6600 for $279.99. The cheapest X4 9600 I can find is $291.97.

      MSRP is not street price. The Q6600 is 13.5% faster and also 4.1% cheaper.
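      Spelled out (just a quick sanity check on the parent's own figures, nothing more):

          # Street prices quoted in the parent post, late 2007.
          q6600_price = 279.99    # retail-boxed Q6600 at Newegg
          phenom_price = 291.97   # cheapest Phenom X4 9600 the poster could find

          price_gap = (phenom_price - q6600_price) / phenom_price
          print(f"Q6600 is {price_gap:.1%} cheaper at street prices")  # -> about 4.1%
          # ...while also benchmarking ~13.5% faster, per the THG/summary numbers.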
  • Ok, those virtual presentations by Hector (CEO of AMD) fail. Can't he be a bit more enthusiastic about the product line!? Another reason never to let a CEO talk to the public.
  • "PAGE 9 of 42"....

    Sorry I can't be bothered to click through 42 pages and over 200 adverts. If there were a text mode version I might consider RTFA.
  • Interesting item in TFA: if one of the four cores is unstable or malfunctions, they disable it and sell the chip as a three-core chip. Now I'm no engineer, but I certainly hope that deactivating a core is done physically at the factory and not through software/firmware; otherwise I can imagine a generation of viruses that go around zapping people's cores...!
    • That is exactly where the defective core is deactivated - where else could that happen and still let them market it as an X3? Think, man!
  • Odd pricing (Score:5, Insightful)

    by SmallFurryCreature ( 593017 ) on Monday November 19, 2007 @09:08AM (#21406633) Journal

    The AMDs are less powerful than the Intels in this race. Okay, no harm done, but why on earth does AMD then price them at the same dollar-for-performance ratio as Intel? Let's say Intel charges 100 bucks for 100 performance points. AMD now says: well, we can't give you those same 100 performance points, we can only give 80, but aren't we nice, we only charge 80 bucks for it.

    Sounds nice in theory, but if I am buying a new CPU at the top of its range (and therefore paying a premium) I want either the highest speed OR a far better deal. Computer components are often priced on a curve - the slower, the cheaper - usually leading to a sweet spot where you get the best price for performance. Is it smart of AMD to straighten this curve into a line? For 13% more power, Intel just charges 13% more? No wonder they are losing once again; they used to be the company that was the best value for money. Perhaps they need a reality check: AMD, YOU ARE NO LONGER EQUAL TO INTEL. The days when your CPUs were better are over, so you can't charge as much anymore.

    A performance of 80 for a price of 50 - now that would be sweet. I could then reason that, well, I get less power, but I save a lot of money (see the quick calculation sketched below). At this rate, I might as well buy an older Intel and get a far, far better deal.

    It seems a pity AMD is once again second; the deals were so much better when Intel and AMD were constantly at each other's throats.
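    The quick calculation mentioned above, using this post's hypothetical numbers (illustrative only - the "performance points" and prices are the poster's made-up units, not benchmark data):

        # Dollars per performance point, using the post's hypothetical numbers.
        def dollars_per_point(price, perf):
            return price / perf

        print(f"Intel, 100 points for $100: {dollars_per_point(100, 100):.3f} $/point")
        print(f"AMD as priced, 80 for $80:  {dollars_per_point(80, 80):.3f} $/point   (same ratio - the straightened 'line')")
        print(f"AMD at 80 for $50:          {dollars_per_point(50, 80):.3f} $/point   (an actual discount for giving up the top speed)")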

    • The AMDs are less powerful than the Intels in this race. Okay, no harm done, but why on earth does AMD then price them at the same dollar-for-performance ratio as Intel? Let's say Intel charges 100 bucks for 100 performance points. AMD now says: well, we can't give you those same 100 performance points, we can only give 80, but aren't we nice, we only charge 80 bucks for it.

      Because they need to make money too?
    • by Barny ( 103770 )
      Probably because these are not "top of the range" chips; they are bottom-of-the-range ones. Wait for Jan/Feb (well, maybe April the way they are going) for the high-end parts.
    • Let's say Intel charges 100 bucks for 100 performance points. AMD now says: well, we can't give you those same 100 performance points, we can only give 80, but aren't we nice, we only charge 80 bucks for it.

      Maybe I only have 80 bucks.

    • Sometimes I buy this argument. But I live in Europe, and every time I look at building systems, AMD is always cheaper *overall* compared to Intel systems with the same configuration. CPU be buggered - motherboards and memory are what drive the end price. Since AMD seems to run fine with slightly slower but much cheaper memory, AMD is king for cheap self-build systems. At our company we run Core 2 Duos in our development machines, and I don't think they are much more expensive
    • How are AMD and Intel not at each other's throats RIGHT NOW? AMD is grasping for a breath of air with the Spider platform--their stock price has been slammed. The only thing AMD is able to do right now is position themselves in the "midrange market" until their investment in ATI begins to pay off by giving them another radical design edge (with Fusion, etc.). Let's face it--any buy in the CPU market at the moment is a good deal. Spider isn't all about the CPU anyhow: if you want high-performance games,
    • by saldate ( 994781 )
      Did you read the same reviews I did? As I understand AMD's pricing structure, it's more like you get 80 performance points for 100 bucks, which is obviously even more confusing. Phenom generally performs worse than Intel's lowest-end Q6600, but costs more. Have a look at the less-biased version: http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3153 [anandtech.com]
  • by MojoKid ( 1002251 ) * on Monday November 19, 2007 @09:32AM (#21406811)

    HotHardware has some pretty extensive coverage of the platform and new Phenoms [hothardware.com] as well. There's a lot fewer pages to sift through and more data on performance.

  • which would be AWESOME!!!
  • AMD and NVIDIA: AMD motherboards still cost less and use less power as well. And where are the NV 7XXa chipset boards?
  • I didn't think THG had done anything since the early '90s.
  • Having Intel and AMD with different SSE4 instruction sets is a headache. Couldn't they just agree on a standard?
