AMD Introduces New Opterons

New submitter Lonewolf666 writes "According to SemiAccurate, AMD is introducing new Opterons that are essentially 'Opteron-ized versions of Vishera.' TDPs are lower than in their desktop products, and some of these chips may be interesting for people who want a cool and quiet PC." And on the desktop side, ZDNet reports that AMD will remain committed to the hobbyist market with socketed CPUs.
  • by erroneus ( 253617 ) on Wednesday December 05, 2012 @01:02PM (#42193079) Homepage

    I hope people are starting to sit up and take notice. The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself. Games are just about as good as they are going to get without new display technologies. The desktop PC has been maxed out and has been resorting to multi-processor and multi-core as the means to keep growing but meanwhile, the primary OS for most people running these systems is still not taking full advantage of even those advances.

    So now things are going for lower power, lower operating temperature and all that. What sort of things benefit from that? How about "embedded systems"? Things that people don't want or need to reboot? The current versions of Windows are too bloated and too power- and memory-hungry to fit within that framework, so it'll have to be another OS. We know this because of the horrible failure "Netbook" computing has been. People wanted it, but expected it to run Windows. Windows couldn't really do it effectively. (I know... people are still doing it... I've still got two netbooks running XP and going strong... but is anyone still selling XP?) Microsoft shows no remorse over their architectural choices and shows no signs of slimming down and getting lighter. So nothing points in Microsoft's direction... not even Microsoft. They are raising prices to make up for the lack of interest in what they are doing now.

    Think about what we are seeing.

    • by kwerle ( 39371 ) <kurt@CircleW.org> on Wednesday December 05, 2012 @01:16PM (#42193215) Homepage Journal

      Actually, all modern OSs do a fantastic job of taking advantage of multiple cores. It's the apps that fail to do so.

      As for OSs that take advantage of low power CPUs, you only mention MS - who (I suppose) has done a good job of this with Windows RT on the Surface. And maybe even a good job with whatever the hell Windows Phones run. It's just that consumers have not liked the apps. Of course Apple and Google both have solid contenders in the embedded space.

      So, as it always has been: "It's the applications, dummy."

      What are you trying to get at?
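
      A minimal illustrative sketch of the point above about apps rather than the OS being the bottleneck: the OS will happily schedule work across cores, but a CPU-bound task only benefits if the application splits the work itself. This is not from the thread; the chunked workload and the use of Python's multiprocessing module are assumptions for illustration.

      ```python
      # Hypothetical example: compress independent chunks of data in parallel.
      # The serial loop keeps one core busy; the Pool version lets the OS
      # scheduler spread one worker per core.
      import zlib
      from multiprocessing import Pool, cpu_count

      def compress(chunk: bytes) -> bytes:
          return zlib.compress(chunk, 9)   # CPU-bound work per chunk

      if __name__ == "__main__":
          data = [bytes([i]) * 4_000_000 for i in range(16)]   # toy workload

          serial = [compress(c) for c in data]                 # one core used

          with Pool(processes=cpu_count()) as pool:            # all cores used
              parallel = pool.map(compress, data)

          assert serial == parallel
      ```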

      • by UnknowingFool ( 672806 ) on Wednesday December 05, 2012 @01:45PM (#42193609)

        I think he's saying that CPUs bought several years ago are good enough for most people and the need to upgrade hardware every few years is not as pressing as it once was. One way to force upgrades is to bloat software like the OS so that you need new processors.

        This leaves MS in a difficult place, as most consumers tend to buy new machines to get new Windows versions instead of upgrading. There are rumors that MS is switching to a yearly release to entice consumers to upgrade. It is nearly the same model that Apple uses.

        A key difference is that while Apple might make some profit on OS upgrades, they make a lot more on hardware. Thus MS is trying to get into the hardware business as well.

        • by dbIII ( 701233 )

          I think he's saying that CPUs bought several years ago are good enough for most people

          Most people won't be buying Xeons or Opterons, but for those who do, 64 cores and 128GB of memory for under $9k means a lot more hardware for much lower budgets than expected.

        • by tyrione ( 134248 )

          I think he's saying that CPUs bought several years ago are good enough for most people and the need to upgrade hardware every few years is not as pressing as it once was. One way to force upgrades is to bloat software like the OS so that you need new processors.

          This leaves MS in a difficult place, as most consumers tend to buy new machines to get new Windows versions instead of upgrading. There are rumors that MS is switching to a yearly release to entice consumers to upgrade. It is nearly the same model that Apple uses.

          A key difference is that while Apple might make some profit on OS upgrades, they make a lot more on hardware. Thus MS is trying to get into the hardware business as well.

          If only that were true. The merging of CPU/GPGPU and then the HSA approach to ubiquitous computing is going to need OS-level tools and frameworks to make that huge leap, and to make it uniformly on a platform, so that application-layer development isn't spinning its wheels reinventing custom threading models and distributed design architectures because there doesn't exist a set of Core APIs to aid them. Apple is way ahead in this regard. AMD is also way ahead in this regard. Ironic that both MSFT and Intel are behind when

    • Comment removed based on user account deletion
      • by Jeng ( 926980 )

        So you are saying that a company should be forced to sell software they no longer want to support?

        What about that company's support costs? If the company is still selling the software, their customers are going to demand support, and the main reason the company stopped selling said product is that the product's support costs were too damn high. So how does that get the company more money? All it gets the company is pissed-off customers and frazzled tech support.

      • by Attila Dimedici ( 1036002 ) on Wednesday December 05, 2012 @01:45PM (#42193607)
        I would go a different route. When you stop selling the software, it goes public domain. Of course if copyright still only extended for a reasonable length of time, Windows XP would be public domain.
        • by erroneus ( 253617 ) on Wednesday December 05, 2012 @01:58PM (#42193775) Homepage

          Oh, I completely dig that idea. If it is of no use to you (i.e. you aren't selling it) then you have apparently exhausted its value to you as a business. It is now your responsibility, under the contract of copyright, to release it to the public domain. But no. "The value" is maintained by keeping it away from the public in order to ensure that they keep buying the same things over and over and over again. This is a public abuse which could only be enabled by copyright law.

          So copyright went from the right to copy and distribute to the right to take it away from the public and to withhold information, arts and technology.

        • Comment removed based on user account deletion
          • I partly wrote my response the way I did and where I did because of a previous response to your post that complained that your proposal would force companies to continue to support software after they considered it obsolete. My response is the answer to that, somewhat legitimate, objection to your original proposal.
            Perhaps a good solution would be to implement yours, but give companies the option of just releasing old versions to public domain if they wish to avoid any support issues that they were afrai
      • by Bengie ( 1121981 )
        I want support for Linux 1.0. What do you mean I should just upgrade to 2.6+? I thought you said old versions should be supported.
    • No, I think we have only exhausted the demands of what you might call simplistic computing - fragile algorithms that efficiently follow a usually fixed number of steps to transform their input into some determined output, like drawing a rectangle. Given any imperfect input, they simply explode. Nothing in nature works this way. Living things are more messy, rooted in pattern matching and open-ended searching. We still can't even really simulate a glass of water tipping over and spilling off the table. Co
    • by serviscope_minor ( 664417 ) on Wednesday December 05, 2012 @01:38PM (#42193519) Journal

      I hope people are starting to sit up and take notice.

      ???

      The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself.

      Not sure I follow. Transistor density has kept on increasing. It's been a little slower recently, I think, but several manufacturers are now sub 30nm for a variety of different process types.

      Games are just about as good as they are going to get without new display technologies.

      Really? Seems unlikely.

      The desktop PC has been maxed out and has been resorting to multi-processor and multi-core as the means to keep growing but meanwhile, the primary OS for most people running these systems is still not taking full advantage of even those advances.

      Are you sure? Have you looked at the recent CPU benchmarks? More and more programs are taking advantage of multiple cores. All sorts of things that people actually do, like file compression, web browsing, media transcoding. Certainly the things I do benefit from multiple cores.

      We know this because of the horrible failure "Netbook" computing has been.

      Netbook computing was fine until Microsoft moved quickly to kill it. Then the manufacturers seemed bent on suicide after that, for inexplicable reasons. Oh, and Intel came up with bizarro licensing for the Atom restricting manufacturers, yet they haven't (with few exceptions) switched to the faster and cheaper Bobcat CPUs which lack such bizarre licensing restrictions.

      Why can't I buy a machine at the low price point and low weight of the EEE 900? That machine sold many millions. Netbooks used to be sub-1 kg in the beginning. Now the lightweight ones are 1.5 kg. What happened?

      Venduhs are strange. Why did they drop all the high-res screens from laptops 10 years ago only to scramble to play catch-up after Apple decided to bring in high-res displays? Makes no sense.

      That said, there's still a quite decent range of cheap netbook machines around, but they're just not as good as they were.

      • by cynyr ( 703126 )

        I agree with the games thing. There are many ways they could use more CPU/GPU and still be useful. For example, when have you seen wind in trees in a game that actually looks like wind in real trees? Trees in games are always some sort of leaf pattern on a plane with holes in it. Any games with a good deformable environment? How about reflections in water ripples? Bullets that are actually computed using wind and the movement of the player? Actual gravity and friction?

    • Oh bull (Score:4, Interesting)

      by oGMo ( 379 ) on Wednesday December 05, 2012 @01:42PM (#42193559)

      The desire to fulfill the prophecies of Moore's law and to have ever faster and more powerful computing has already exhausted itself.

      While software has been hampered by web "technology" over the last decade, we are hardly at the pinnacle of software and computing... it's more like the Dark Ages, actually. Some stuff is being done elsewhere (GPUs, mobile), but we're still mired in fundamentally stagnant and backwards principles on the desktop (and server, really).

      Games are just about as good as they are going to get without new display technologies.

      Laughable. Let's assume anything video-related is "new display technology," and that we certainly have a long way to go to realtime radiosity and raytracing at extremely high resolution in a mobile device, then toss in 3D for good measure, so that's a given. But in terms of gameplay, all the computing and RAM you can get can be eaten up for a very long while. Simulation in games, today, isn't anything like what it could be. If I can't build a city at the SimCity level, zoom in and rampage through it at the GTA level, and walk up to each and every person on the street and learn their personal history and daily routines at an RPG level, then go into every structure and demolish it bit-by-bit with full soft-body dynamics, you've got quite a long way to go.

      The desktop PC has been maxed out and has been resorting to multi-processor and multi-core as the means to keep growing but meanwhile, the primary OS for most people running these systems is still not taking full advantage of even those advances.

      This is true to some extent, but "resorting to multi-processor and multi-core" means the desktop isn't maxed out. The primary OS (and software) may not be taking advantage of these things, but they are there and we're far from done yet.

      Microsoft shows no remorse over their architectural choices and shows no signs of slimming down and getting lighter. So nothing points in Microsoft's direction... not even Microsoft. They are raising prices to make up for the lack of interest in what they are doing now.

      Microsoft is irrelevant. They have been for a long time. They may not be going away anytime soon, but they've been irrelevant since Google used the web to effectively route technology around them (due to earlier attempted lock-in). Of course, this has resulted in aforementioned Dark Age of Software, but at least we're not stuck on one platform. We're at the point where Valve is looking to seriously move gaming away from Windows, and there are alternatives for everything else, so what happened before doesn't really apply to what can happen in the future.

      Think about what we are seeing.

      What we are seeing is ripe potential for a Computing Renaissance.

    • So now things are going for lower power, lower operating temperature and all that. What sort of things benefit from that?

      Racks and racks of servers.

      Every watt shaved off the processor's TDP can turn into double (or more) that saving once you count the reduced cooling load on the building.
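
      A back-of-the-envelope sketch of that claim; the PUE, fleet size and electricity price below are assumptions for illustration, not figures from the thread:

      ```python
      # If the facility PUE is ~2.0, every watt saved at the CPU avoids roughly
      # one more watt of cooling/overhead, so savings at the wall about double.
      watts_saved_per_cpu = 30        # e.g. a 95 W SKU instead of a 125 W SKU
      cpus = 2 * 500                  # assumed: 500 dual-socket servers
      pue = 2.0                       # assumed facility PUE
      price_per_kwh = 0.10            # assumed electricity price (USD)

      wall_watts = watts_saved_per_cpu * cpus * pue
      kwh_per_year = wall_watts * 24 * 365 / 1000
      print(f"{wall_watts / 1000:.1f} kW saved, "
            f"~${kwh_per_year * price_per_kwh:,.0f}/year")
      ```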

    • Comment removed (Score:5, Interesting)

      by account_deleted ( 4530225 ) on Wednesday December 05, 2012 @07:18PM (#42198035)
      Comment removed based on user account deletion
      • Hell, one of the Apple fanbois tried giving up ALL x86 for a month, just one month, and using nothing but his iPad and his iPhone... What happened? He gave up after a week and a half because it was hobbling him too damned much.

        Do you have a link to that story? Sounds like an interesting read but my efforts at searching for it have failed.

    • Games are just about as good as they are going to get without new display technologies.

      Hah, you're totally making that up. 3D simulation has barely scratched the surface of what is possible, and the AI of today is just a lame joke. How about audio synthesis that doesn't need voice actors? How about fully interactive worlds? How about interacting with proper physics? How about hair that looks and acts like hair?

      Trust me, the games you are playing today will look just as dated in ten years as the games you played ten years ago.

  • Keep 'em Coming (Score:5, Interesting)

    by corychristison ( 951993 ) on Wednesday December 05, 2012 @01:03PM (#42193085)

    AMD has huge advantages in the server market; I'm really surprised people are so stuck on Xeons.

    You can't cram 64 Xeon cores into a 1U. Not to mention Intel is spotty on their hardware virtualization extensions.

    Intel has the lead in power consumption, sure. But if you're looking into running anything Xen, KVM or VMware in production, the cost savings AMD brings to the table make them a competitive contender.

    I'm in the market for a new Workstation. I've been looking at an Opteron instead of the desktop models. Primary reason being 16 cores on one chip, at a lower power consumption than the 8-core Desktop model.

    • Until recently, I've been buying 100% AMD for 15 years... but AMD is so far behind that for the first time, I bought several Intel-based servers.

      Not sure what advantages you think AMD has over Intel... I would love to see a list, because frankly, it's sad to see AMD where it is.

      • by h4rr4r ( 612664 )

        AMD's advantage is lots of CPUs for cheap.
        For some workloads, that is worth it. I am using some for a VDI deployment. RAM is the limiting factor on how many desktops I can host, not CPU and not disk, because I went all SSD.

        • RAM is the limiting factor on how many desktops I can host, not CPU and not disk

          Surely in that case it's also worth going for AMD, since you also get excellent value in terms of DIMM sockets. If CPU is really not the limiting factor, you could get a 4-way 6212 (are those the cheapest?). The processors are about £200 a pop, and you get 32 DIMM sockets giving you up to 512G of RAM, using 16G DIMMs. 16G DIMMs are now at the point where they are sometimes less per GB than 8G ones.

          Between the cheaper moth
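
          A quick sketch of the arithmetic above; the socket and DIMM-slot counts are from the comment, while the DIMM price and per-desktop RAM allocation are assumptions for illustration:

          ```python
          sockets = 4                  # 4-way Opteron 6212 board
          cpu_price_gbp = 200          # "about £200 a pop"
          dimm_slots = 32
          dimm_gb = 16                 # 16 GB DIMMs
          dimm_price_gbp = 90          # assumed price per 16 GB DIMM

          total_ram_gb = dimm_slots * dimm_gb              # 512 GB
          ram_per_desktop_gb = 4                           # assumed VDI allocation
          desktops = total_ram_gb // ram_per_desktop_gb    # RAM-limited host capacity

          print(f"{total_ram_gb} GB RAM -> ~{desktops} desktops, "
                f"CPUs £{sockets * cpu_price_gbp}, RAM £{dimm_slots * dimm_price_gbp}")
          ```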

        • Re:Keep 'em Coming (Score:4, Insightful)

          by corychristison ( 951993 ) on Wednesday December 05, 2012 @02:50PM (#42194533)

          - Core density
          - Virtualization extension on all Opteron chips (and now most desktop chips, even the A6-4455M in my laptop)

          Not all Xeons have hardware virtualization. Only some of the most expensive chips have it and even then, it can be spotty.

          Bottom line, AMD wins in virtualization/"cloud" market (and supercomputing).

          • Augh. Damn me for using my phone... That message should have been for parent.

            Sorry about that.

          • - A core on an AMD system has about the same performance as a thread on a Xeon, so Opteron and Xeon are equivalent there.
            - The E3 -- the entry-level Xeon, the cheapest class of Xeon chips -- has hardware virtualization. I don't think it's on every chip, but it's really not hard to pull up Intel's site and see if a specific chip supports it.

          • Not all Xeons have hardware virtualization. Only some of the most expensive chips have it and even then, it can be spotty.

            Not true. Every Xeon since 2006 has shipped with VT-x support. Look at the Xeon 5030 [intel.com] for example. Absolute bottom of the line ($150 at launch) from 2006, and it supports VT-x.

            You're probably thinking of Intel's desktop line, where they do artificially hobble large swaths of their CPUs with respect to VT-x.

      • by Jawnn ( 445279 )

        Until recently, I've been buying 100% AMD for 15 years... but AMD is so far behind that for the first time, I bought several Intel-based servers.

        Not sure what advantages you think AMD has over Intel... I would love to see a list, because frankly, it's sad to see AMD where it is.

        That's easy. Core density per dollar in the same n rack units.
        For the same number of dollars I can buy more real estate on which to run my virtualization stacks with AMD processors than with Intel processors. And the savings extend beyond the hardware too. With more cores per socket, my VMware licensing costs (per core or per VM, however you want to break it out) can be much lower with AMD processors in those sockets. So cheaper CAPEX (hardware and license costs) and cheaper OPEX (support subscription costs) ma

      • Mostly, not having to worry about whether the HVM extensions are turned on in a particular motherboard/CPU combination. Because *all* of the AMDs (from the lowly Athlon64 X2 chips and up) have them turned on.

        And 45W parts at the desktop are *very* nice in terms of noise. Even with the stock CPU fan, it makes for a very quiet desktop.
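
        Regarding the HVM extensions mentioned above, a quick Linux-only sketch for a first check on a given box (it just reads /proc/cpuinfo; note that the flag can still be advertised even when firmware has the feature disabled, so it is not the whole story):

        ```python
        # Look for "svm" (AMD-V) or "vmx" (Intel VT-x) in the CPU flags.
        with open("/proc/cpuinfo") as f:
            flags = set()
            for line in f:
                if line.startswith("flags"):
                    flags = set(line.split(":", 1)[1].split())
                    break

        if "svm" in flags:
            print("AMD-V (svm) advertised")
        elif "vmx" in flags:
            print("Intel VT-x (vmx) advertised")
        else:
            print("no hardware virtualization flag found")
        ```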
      • by dbIII ( 701233 )
        I've got a mix, AMD for massively parallel stuff and Xeons for stuff with a low number of threads. It's commercial software from a large company so GPU stuff is going to have to wait until the hype has been around for a decade and they pay some teenagers in India to code it.
        The number one AMD advantage to me was 64 cores and 128GB on a SuperMicro board in a decent chassis for $9k. The equivalent Intel machine does have more cores but costs more than around five AMD machines.
    • I'm in the market for a new Workstation. I've been looking at an Opteron instead of the desktop models.

      Be careful. Strange workstation vendors have decided that if you're blowing $5k on a huge workstation with 2 sockets, then naturally you don't mind if it sounds like it's being powered by a gas turbine located inside the case. Oh, and the exhaust of the gas turbine is then fed into a flugelhorn or vuvuzela just in case you are hard of hearing.

      I really don't know why. I have purchased a 3 GPU machine which dissi

      • I'd probably go for a single-socket, 16-core Opteron on a Supermicro or Tyan standard ATX board I can plop into my existing chassis. Supermicro will need a breakout cable for the front-panel buttons, but no big deal. In this situation I can fit most sized heatsinks; I just need to be sure it will fit on the socket.

      • by lopgok ( 871111 )
        Do your 'workstations' have Intel Xeon CPUs or AMD CPUs? Otherwise, it is quite unlikely that they have ECC memory, which is pretty much required of real workstations. Real workstations are reliable (and quiet). Just being quiet doesn't count.
        • Do your 'workstations' have Intel Xeon CPUs or AMD CPUs?

          The loud workstation I'm referring to was an AMD one. The quiet GPU monster was an Intel Core i7, so it didn't have ECC, but neither did the 3 graphics cards which were being used for computation, so it was a bit of a wash, really. And it actually needed proper graphics cards, since it relied heavily on the texture sampling unit for the computations.

          The other nice thing about AMD is that you can get cheap machines with ECC: the standard single s

          • by lopgok ( 871111 )
            I heard the high-end Nvidia cards have ECC on their video memory. They found there were too many errors when doing computation...
            • I think they do now. Or at least the computation cards do. I don't think they have them on the consumer level graphics cards. It turned out that the consumer graphics cards were better for our needs since they had more and faster texture sampling units. Since the workload was image processing, that was necessary to get good performance. The computation cards tend not to be so good if you're doing graphics or graphics-like workloads as they dedicate more resources to floating point and fewer to pixel specifi

              • I don't think they have them on the consumer level graphics cards. It turned out that the consumer graphics cards were better for our needs since they had more and faster texture sampling units.

                I recently bought a Radeon HD 7850 2GB card that I'm pretty sure has ECC. Cost was $200. It's pretty cool - if you overclock the memory too high, the card doesn't start crashing your games or anything, it just doesn't run any faster.

                To my knowledge, all graphics cards with GDDR5 memory have ECC.

    • by Bengie ( 1121981 )
      Can you imagine the amount of heat 64 cores of Bulldozers in a 1U case would create?!

      I've been looking at custom building a server at home, and when looking at 64GB-256GB of RAM with a single or dual socket and 8-32 threads, Intel wins in price, performance, and power consumption.

      Choosing between an Intel Xeon i5 3.3GHz quad+HT with a 65W TDP and an Opteron Bulldozer 4-module/8-core 2.8GHz with a 125W TDP and lower IPC, at an almost identical price, isn't even a choice.
      • by dbIII ( 701233 )

        Can you imagine the amount of heat 64 cores of Bulldozers in a 1U case would create?!

        That's why you put the noisy little buggers in a server room and keep the door closed.
        As a relatively early adopter my 48 core AMD is in 5U, but you can get the newer 4 socket 16 core ones in 1U. I've got a pile of twin 8 core systems in 1U from a few years ago that put out about the same amount of heat as a recent 64 core system would, and they are like hairdryers.

    • It's a pity AMD's cores aren't complete cores. The cores come as a pair and they share some resources with each other. Some kind of bastardisation between Hyper-threading and true multi-core.
      • by dbIII ( 701233 )
        It entirely depends on what the code is doing whether it acts as a complete core or not. The part in question acts as two floating-point units for normal-length floating-point variables and as one for double-length floating-point variables. In a lot of situations it may act as two floating-point units for the entire lifetime of the server.
        The hype makes it look like a cripplingly stupid hack like the NetBurst stuff, but the reality is a fairly simple tradeoff that's not going to make any difference to what
        • What AMD calls two cores is actually a single core with two integer units with two ALUs each and two FPUs. What Intel calls a single core has two ALUs, 2 FPUs, and 3 SSE units. Their 'dual-core' modules are very similar to a single hyper-threading core.
  • AMD SUcks (Score:5, Funny)

    by Anonymous Coward on Wednesday December 05, 2012 @01:04PM (#42193107)

    Everyone knows that Intel is better, and competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.

    • Re: (Score:1, Insightful)

      by Anonymous Coward

      Everyone knows that Intel is better, and competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.

      Yea, because usually when a company has no competition they lower prices. Happens all the time.

      • Everyone knows that Intel is better, and competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.

        Yea, because usually when a company has no competition they lower prices. Happens all the time.

        Woooosh!

    • If I read what you wrote as a sarcastic statement intended to mean the opposite of what you wrote, you come off sounding a lot more intelligent. Intel does have a leg up at the moment, but everything else you said is incorrect and misguided. Here's hoping competition stays alive, socketed CPUs stay around, and AMD has a long life ahead of them (one they earned, of course).

    • Re: (Score:3, Insightful)

      by Anaerin ( 905998 )

      ...competition in the CPU market is not a good thing. I hope AMD goes out of business soon, so that Intel can lower the price of their chips.

      What? Competition drives innovation and lowers prices. It happened with AMD's Athlon killing the old Netburst P4s. It happened with x64 killing IA-64. Why would AMD leaving the market "let" Intel lower CPU prices?

      Oh, I'm sorry, you're just a troll, without the possibility of reasonable discourse or fair and reasoned debate. Forgive my oversight.

    • I wish I had mod points. This is the funniest thing I've read on here in a while. It goes above and beyond trolling.

    • by Jeng ( 926980 )

      You got some nice bites there, but I think you could have trolled harder.

      7/10

  • by Kjella ( 173770 ) on Wednesday December 05, 2012 @01:25PM (#42193349) Homepage

    Okay, so they're the only x86 CPUs offering 1.25V DDR3 support, but the difference between a pair of 1.25V and 1.5V DIMMs is around 4 W [tomshardware.com], and you can save 3 of those 4 W by moving to the commonly available 1.35V DDR3. Meanwhile AMD keeps putting out 125W processors like the FX-8350 that don't really compete with a 77W processor like the i7-3770K, so this "major datacenter advantage" I think I'll file under "major wishful thinking". Not to mention you're investing into a platform with little future since AMD wants to push ARM servers now. But I guess Intel has let AMD put a positive spin on continuing to deliver on old sockets.

    • by serviscope_minor ( 664417 ) on Wednesday December 05, 2012 @01:52PM (#42193711) Journal

      Huh?

      The i7 3770K has a TDP of 95W. And the FX-8350 is a very good chip and much cheaper than the i7. The benchmarks relative to the i7 are all over the place. In most cases it sits somewhere between the i5 and i7. In some cases it is destroyed by the i7; in other cases the reverse is true. The single-threaded performance is quite weak and usually substantially less than the i5's, but then the i5-to-i7 difference isn't enormous. The difference from the FX-8350 to the i7 seems to be around 20-50% in most cases.

      Curiously the AMD processors tend to stack up better on the Linux benchmark suites.

      Anyway.

      This thread is about the Opteron processors, which are still (a) competing against SB and (b) benefiting from substantially cheaper full-system costs, and (c) you aren't terribly sensitive to single-thread performance if you're buying a 4-socket server.

      Not to mention you're investing into a platform with little future

      What does that even mean? It's all x86, so even if AMD vanishes tomorrow you can keep using the servers and then transition to Intel when you need new ones. The whole point of having more than one vendor is that no matter what, you're not investing in a platform with no future.

      • The i7 3770K has a TDP of 95W

        Intel's website says 77 W [intel.com] while various [techpowerup.com] websites [flyingsuicide.net] say the retail packaging says 95 W. Weird.

      • by steveha ( 103154 ) on Wednesday December 05, 2012 @02:52PM (#42194555) Homepage

        The i7 3770K has a TDP of 95W.

        I know that, at least in the past, Intel used to issue TDP numbers that represented "typical" heat, while AMD used to issue TDP numbers that represented worst-case heat (which is what TDP ought to be IMHO). I have read here on Slashdot that more recently, AMD has started playing those games as well.

        But according to NordicHardware [nordichardware.com], in this case Intel is under-promising and over-delivering, and the chips really do dissipate only 77W despite being rated for 95W. (But how did they measure that? Is this a "typical" 77W? I guess it's not that hard to run a benchmark test that should hammer the chip and get a worst-case number that way.)

        Curiously the AMD processors tend to stack up better on the Linux benchmark suites.

        This is probably because Linux benchmarks were compiled with GCC or Clang rather than the Intel compiler. The Intel compiler deliberately generates code that makes the compiled code run poorly on non-Intel processors. The code checks the CPU ID, and the code has two major branches: the good path, which Intel chips get to run, and the poor path, which other chips run.

        http://www.agner.org/optimize/blog/read.php?i=49 [agner.org]
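
        A toy sketch of the dispatch pattern being described (Linux-only, and the fast/slow functions are placeholders, not Intel's actual dispatcher): the objection is that the branch is chosen by vendor string rather than by the feature flags the CPU actually reports.

        ```python
        # Hypothetical illustration only -- not Intel's actual dispatcher.
        def cpu_vendor() -> str:
            with open("/proc/cpuinfo") as f:        # Linux-specific
                for line in f:
                    if line.startswith("vendor_id"):
                        return line.split(":", 1)[1].strip()
            return "unknown"

        def fast_path(xs):      # stands in for SSE/AVX-optimized code
            return [x * 2 for x in xs]

        def slow_path(xs):      # stands in for the generic fallback
            return [x + x for x in xs]

        # Dispatch on vendor ("GenuineIntel") instead of on actual feature flags:
        kernel = fast_path if cpu_vendor() == "GenuineIntel" else slow_path
        print(kernel([1, 2, 3]))
        ```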

        The irony is that Intel, by investing heavily in fab technology, is about two generations ahead of everyone else, so they can make faster and/or lower-power parts than everyone else. This means they could be competing fairly and win.

        But because Intel does evil things like making their compiler sabotage their competition, I refuse to buy Intel. They have lost my business. They don't care of course, because there aren't many like me who are paying attention and care enough to change their buying habits.

        If you want the fastest possible desktop computer, pay the big bucks for a top-of-the-line i7 system. But if you merely want a very fast desktop computer that can play all the games, an AMD will do quite well, and will cost a bit less. So giving up Intel isn't a hard thing to do, really.

        • by Anonymous Coward

          The i7 3770K has a TDP of 95W.

          I know that, at least in the past, Intel used to issue TDP numbers that represented "typical" heat, while AMD used to issue TDP numbers that represented worst-case heat (which is what TDP ought to be IMHO). I have read here on Slashdot that more recently, AMD has started playing those games as well.

          But according to NordicHardware [nordichardware.com], in this case Intel is under-promising and over-delivering, and the chips really do dissipate only 77W despite being rated for 95W. (But how did they measure that? Is this a "typical" 77W? I guess it's not that hard to run a benchmark test that should hammer the chip and get a worst-case number that way.)

          Curiously the AMD processors tend to stack up better on the Linux benchmark suites.

          This is probably because Linux benchmarks were compiled with GCC or Clang rather than the Intel compiler. The Intel compiler deliberately generates code that makes the compiled code run poorly on non-Intel processors. The code checks the CPU ID, and the code has two major branches: the good path, which Intel chips get to run, and the poor path, which other chips run.

          http://www.agner.org/optimize/blog/read.php?i=49 [agner.org]

          The irony is that Intel, by investing heavily in fab technology, is about two generations ahead of everyone else, so they can make faster and/or lower-power parts than everyone else. This means they could be competing fairly and win.

          But because Intel does evil things like making their compiler sabotage their competition, I refuse to buy Intel. They have lost my business. They don't care of course, because there aren't many like me who are paying attention and care enough to change their buying habits.

          If you want the fastest possible desktop computer, pay the big bucks for a top-of-the-line i7 system. But if you merely want a very fast desktop computer that can play all the games, an AMD will do quite well, and will cost a bit less. So giving up Intel isn't a hard thing to do, really.

          AMEN!

        • I know that, at least in the past, Intel used to issue TDP numbers that represented "typical" heat, while AMD used to issue TDP numbers that represented worst-case heat (which is what TDP ought to be IMHO). I have read here on Slashdot that more recently, AMD has started playing those games as well.

          At least in the past, Intel used to issue TDP numbers that ignored the chipset, while AMD would compare CPU+chipset, because Intel's chipsets would suck down power like nobody's business and AMD's wouldn't. In the early amd64 days, companies actually built (17") laptops around the desktop Athlon64 because CPU+chipset had a lower TDP than Intel's mobile offering. When you're competing against the Mobile P4, that's not a surprise. But the same situation persisted all the way through the Core 2 Duo days! Intel's

      • The Opterons in the new series have a TDP of 65W or 95W for the 8-core models, at the expense of being clocked lower than an FX-8350, but the performance per watt is still better than for the FX-8350.

        Looking at the 4-core models, the 3350HE may be a worthy replacement for the Athlon II X4 910e I have in my current desktop:
        four cores like the Athlon, a 2.8 GHz clock speed where the Athlon had 2.6 GHz, and only 45W TDP versus the 65W of the Athlon. In terms of pricing, the 3350HE seems to be similar to where the

      • by Kjella ( 173770 )

        The i7 3770K has a TDP of 95W.

        No it doesn't, but I guess if you don't have facts, use FUD. Intel has kept a "segment TDP" on the retail packaging because they want all Sandy/Ivy Bridge motherboards, coolers etc. to support 95W processors - the maximum in the Sandy Bridge line - but the actual processor will never use more than 77W. This was explained here [nordichardware.com], but Intel's site and 99.9% of all reviews and online sites will list it as a 77W processor. In fact the 95W figure is so rare that the only reason to bring it up - particularly ignoring all

    • by dbIII ( 701233 )
      Did you really write about desktop CPUs and then use them to try to debunk the phrase "major datacenter advantage"? You did? Why? Do you really think the people reading this thread are that stupid and why do you want to manipulate them?
  • And on the desktop side, ZDNet reports that AMD will remain committed to the hobbyist market with socketed CPUs.

    On a related note, I find it quite weird that Intel willfully forfeits their own future over to motherboard makers. It makes little sense to me to depend on third parties you've absolutely no control over to fix the price of the final product that your current customers -- computer makers -- end up buying, irrespective of the fact that Intel itself makes motherboards. I must be missing something besides the obvious (aka it's thinner, which incidentally ensures that AMD has to do this too for laptops). Slashdo

    • by Eugene ( 6671 )

      Intel also makes motherboards in addition to chipsets and CPUs...

    • On a related note, I find it quite weird that Intel willfully forfeits their own future over to motherboard makers. It makes little sense to me to depend on third parties you've absolutely no control over to fix the price of the final product that your current customers -- computer makers -- end up buying

      Intel brings out a new CPU socket every week, so nobody upgrades their Intel CPU because they can't; the new CPU takes a new socket. AMD brings out new CPU sockets only rarely, so people do sometimes upgrade their CPU. Intel processors appeal to the PHB market, and AMD processors appeal to the nerd market. Intel is known for being expensive. Cost-cutting makes sense, and the socket is expensive.

  • I was hoping AMD could release a faster workstation-level Opteron 4300 to match the FX-8350. The top-end 4386 is still 3.1GHz (turbo to 3.8GHz), but
    the fastest 8-core Opteron 6328 is 3.2GHz and goes to 3.8GHz turbo (though the Opteron 4386's TDP is 95W vs the Opteron 6328's 115W), while the FX-8350 is 4GHz, turbo to 4.2GHz, and consumes 125W.
