AMD Intel Hardware

CPU Competition Heating Up In 2012?

jd writes "2012 promises to be a fun year for hardware geeks, with three new 'Aptiv-class' MIPS64 cores being circulated in soft form, a quad-core ARM A15, a Samsung ARM A9 variant, a seriously beefed-up 8-core Intel Itanium and AMD's mobile processors. There's a mix here of chips actually out, ready to be put on silicon, and in the final stages of development. Obviously these are for different users (mobile CPUs don't generally fight for market share with Itanium dragsters), but it is still fascinating to see the differences in approach and the different visions of what is important in a modern CPU. Combine this with the news reported earlier on DDR4, and this promises to be a fun year, with many new machines likely to appear that are radically different from the last generation. Which leaves just one question — which Linux architecture will be fully updated first?"
  • Evolutionary! (Score:5, Insightful)

    by Sponge Bath ( 413667 ) on Wednesday May 16, 2012 @08:12AM (#40015743)
    Evolutionary upgrades to Intel processors and memory standards, Itanium is not dead yet, AMD still can't keep up, and ARM rules low-power applications. Yes, it will be a landmark year for processors.
    • by TheRaven64 ( 641858 ) on Wednesday May 16, 2012 @08:33AM (#40015973) Journal
      I think the news is that MIPS is not dead, it's just pining for the fjords.
    • Re:Evolutionary! (Score:5, Interesting)

      by QuantumRiff ( 120817 ) on Wednesday May 16, 2012 @08:36AM (#40016003)

      Why do people keep saying AMD can't keep up? Because they don't compete in a market you care about?

      My wife's laptop has an AMD E-350. It's got an ATI GPU built onto the CPU, sucks down a whopping 9 watts (making her super-light 10.6" laptop last about 7 hours), has 4GB of RAM and a 500GB hard drive, streams HD video without a hiccup, and it was $350, about what you would pay for a nice video card. I would say AMD is competing rather well.

      In the server space, we're ditching Intel as fast as we can, because for our loads a 16-core Opteron runs Oracle at the same speed as a 12-core Intel (CPU usage is not our limiting factor; disk I/O is, for our databases), and the difference in price last time we looked was about $7k for a Dell R815 spec'd the same as a Dell R810 with dual CPUs. That difference buys a Fusion-io card, or almost another tray of drives, which would really help I/O.

      • Re:Evolutionary! (Score:5, Interesting)

        by msobkow ( 48369 ) on Wednesday May 16, 2012 @08:49AM (#40016137) Homepage Journal

        Unfortunately most tests don't cover anything business-related, like calculating join tables and processing large volumes of relational data. Instead, they report on things business couldn't care less about, like the time it takes to transcode a video file or its ability to render videogame graphics.

        The simple truth is that there are very few CPUs currently on the market which aren't perfectly capable of handling business application processing, like document editing, in a very acceptable fashion. In fact, the issue with even the "slow" CPUs is the time it takes to load and initialize an application, not their responsiveness once the application is loaded. That would seem to be more a question of storage bandwidth than of processor horsepower, but reviewers still blame the CPU for the performance.

        For that matter, even the video playback reviews are kind of pointless. Once you have enough snort to render video without dropping frames or tearing, extra power buys you pretty much nothing for video processing. While you can start turning on options in the video pipeline, the truth is the effects of those options are virtually unnoticeable unless you use a super-high-resolution screen to display expanded video.

        I think Windows RT is going to wake up a significant portion of the population to the benefits of low-power ARM processors in the real world.

        The business market requirements are not the same as the general gaming/video market's requirements.

        • In the CAD/CAM world we're still aching for faster single-thread execution. Adding cores is a nice side benefit, meaning I can surf with a bajillion tabs open to various forums without slowing down the toolpath verification running on the other screen, but there isn't much I can do right now to make that verification process faster. Some functions simply need to run sequentially, which means I need a faster clock to make any significant improvements.
      • Re:Evolutionary! (Score:4, Interesting)

        by serviscope_minor ( 664417 ) on Wednesday May 16, 2012 @09:31AM (#40016651) Journal

        My wife's laptop has an AMD E-350. It's got an ATI GPU built onto the CPU, sucks down a whopping 9 watts (making her super-light 10.6" laptop last about 7 hours), has 4GB of RAM and a 500GB hard drive, streams HD video without a hiccup, and it was $350, about what you would pay for a nice video card. I would say AMD is competing rather well.

        Which laptop is it? There seems to be a distinct lack of super-light laptops of late. My Eee 900 at 935g is currently lighter than any netbook on the market, and the Asus UX-21 (at 1.1 kg) seems to be the joint-lightest non-netbook.

        What is it? It sounds pretty good and I want one...

        In the server space, we're ditching Intel as fast as we can, because for our loads a 16-core Opteron runs Oracle at the same speed as a 12-core Intel

        That's what I found a while back, when everyone said Intel was faster: the 12 core 6100s could cram more FLOPS into 1U than the best Intel 4 socket boxes, and at a considerably lower price. Intel have substantially improved their offerings since then (AMD has not by quite so much), but the price of RAM crashed, making the CPUs and system boards a much larger fraction of the cost, increasing the advantage of AMD further.

        I actually did the calculation, since I had to budget the 5-year cost, including electricity, cooling and rack space. The AMD systems chew more power, though not by as much as the raw CPU figures suggest, since with RAM maxed out the RAM is a significant fraction of the power draw.

        End result was that the AMD systems were substantially cheaper in peak compute performance per dollar and in 5-year compute per dollar.

        Actually, the performance is very application dependent. Some codes suck on AMD; others (rarer) had the Phenom II matching an i7 for speed.

        Intel still have the single-thread performance crown, which is really useful on the desktop and laptop. If you're buying a 4-socket machine, it's a fair bet that your task is parallelizable, which reduces the advantage of Intel in the 4-socket market.

        For single socket stuff that isn't too performance sensitive, AMD has the additional advantage that cheap consumer level boards support the otherwise expensive enterprisey features of ECC memory. If you care about that sort of thing, then getting a Phenom II is much cheaper than a 1 socket Xeon.
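
Side note on the 5-year budgeting described above: the structure of that calculation is simple enough to sketch. A minimal illustration, where every figure (hardware prices, wattage, electricity rate, overhead multiplier) is a hypothetical placeholder rather than a real quote:

```python
# Hypothetical 5-year total-cost-of-ownership comparison of the kind
# described above. All numbers are made-up placeholders; substitute
# your own quotes, measured power draw, and workload figures.

def five_year_tco(hw_cost, watts, kwh_price=0.12, overhead=1.8):
    """Hardware cost plus 5 years of electricity, with a PUE-style
    multiplier to approximate cooling and rack overhead."""
    hours = 5 * 365 * 24
    energy_cost = watts / 1000 * hours * kwh_price * overhead
    return hw_cost + energy_cost

# Placeholder figures: an AMD box vs. an Intel box that benchmark
# the same on an I/O-bound database load.
amd = five_year_tco(hw_cost=14_000, watts=700)
intel = five_year_tco(hw_cost=21_000, watts=550)
print(f"AMD:   ${amd:,.0f} over 5 years")
print(f"Intel: ${intel:,.0f} over 5 years")
```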

        • Sorry, I lied a bit: it's an 11.6" screen and it has a C-60 CPU, not an E-350. Looking at the specs, it's half the power and a bit slower than the "E" class, but still perfect for what she does.

          There are several models of the Acer Aspire One that carry them; I think the one I picked up was this: http://www.newegg.com/Product/Product.aspx?Item=N82E16834215172 [newegg.com]
          I picked hers up at Costco for $350.

        • For single socket stuff that isn't too performance sensitive, AMD has the additional advantage that cheap consumer level boards support the otherwise expensive enterprisey features of ECC memory. If you care about that sort of thing, then getting a Phenom II is much cheaper than a 1 socket Xeon.

          Phenom II is almost completely gone from the channel now; have you run the numbers or done any performance testing with the AMD FX line?

          Our JBoss/MySQL-based app server is one application where Phenom II inde
        • by Anonymous Coward

          End result was that the AMD systems were substantially cheaper in peak compute performance per dollar and in 5-year compute per dollar.

          Actually, the performance is very application dependent. Some codes suck on AMD; others (rarer) had the Phenom II matching an i7 for speed.

          Intel still have the single-thread performance crown, which is really useful on the desktop and laptop. If you're buying a 4-socket machine, it's a fair bet that your task is parallelizable, which reduces the advantage of Intel in the 4-socket market.

          I bolded the important bits. Not all parallel applications are created equal. There are lots which scale well to a small number of cores, but see diminishing returns as the core count goes up. Algorithms which are "embarrassingly parallel" (that is, they scale effortlessly to very high core counts) are the exception rather than the rule. So, the advantage of the AMD approach depends a very great deal on the type of code you're running. As you found, that approach (lots of weak cores) kinda sucks for ma
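
The diminishing returns described above are just Amdahl's law. A quick sketch; the parallel fractions are arbitrary examples, not measurements of any real workload:

```python
# Amdahl's law: speedup on n cores when a fraction p of the runtime
# parallelizes perfectly. Shows why "lots of weaker cores" only pays
# off when p is very close to 1.

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.50, 0.90, 0.99):                      # example fractions
    row = [round(amdahl_speedup(p, n), 1) for n in (4, 16, 64)]
    print(f"p={p:.2f}: speedup on 4/16/64 cores = {row}")
# p=0.50 tops out below 2x no matter how many cores you add;
# even p=0.99 reaches only ~39x on 64 cores.
```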

      • I have the same chip in my laptop and I love it. I even get to play games, just finished DeathSpank.

      • I have heard AMD is very competitive in the server market. I doubt anyone would make an argument otherwise.

        Typically when we talk about this stuff, we are all talking about "Consumer Products"...

        That does not include embedded. Many of these low-power chips are one degree of separation from embedded chips.

        I would not call some of these devices a "computer" strictly speaking insofar as modern computing devices are concerned.

        Is an iPhone a computer? How about a Tablet? How about an ultralight laptop with a low powe

      • One major difference is that the R815 comes with a crap service plan, vs. the $1300 plan the R810 comes standard with. And if I/O is your bottleneck, shouldn't you be considering an R820 anyhow, with its PCIe Gen 3 interfaces and double the drive bays? Of course this will add to the cost, but there is no AMD alternative.

    • by na1led ( 1030470 )
      It's going to mean more fragmented support. You buy the latest and greatest just to find that your hardware/software doesn't support it. Developers and manufacturers are already having a difficult time supporting so many different platforms.
  • by na1led ( 1030470 ) on Wednesday May 16, 2012 @08:13AM (#40015753)
    A fast CPU and RAM are great, but we are still limited by slow, crappy hard drives (SSDs are too expensive) and OSes/software that don't take advantage of current technology, let alone the next generation.
    • by tepples ( 727027 ) <.tepples. .at. .gmail.com.> on Wednesday May 16, 2012 @08:22AM (#40015847) Homepage Journal

      slow, crappy hard drives (SSDs are too expensive)

      SSDs aren't too expensive if you don't need to keep your library of videos available at a moment's notice at all times. There exist affordable SSDs that are big enough to hold an operating system, applications, and whatever documents you happen to be working on at a given time.

        • SSDs can currently be had at $1 a gig, and even less in some cases. That being said, a 120GB drive is more than enough for your OS and the applications/games you run. Edit the registry to load your profile off a secondary drive and voilà: SSD hawtness... I saw a marginal increase when I moved to my SSD. I'm a gamer, so that's my benchmark mostly; SWTOR, for instance, loaded a good bit faster.
        • by Anonymous Coward

          "Edit the registry to load your profile off an secondary drive and viola."

          I have a similar setup (128GB SSD). You make it sound easy. I tried several ways to move everything over to a hard drive so that nothing user-related was stored on the system/boot SSD. I tried hard links, fiddling with the registry, changing environment variables, but in the end I gave up and kept the stub of my user directory on C: where Windows seemed to want it, and moved all the individual directories (Documents, Music, Picture

          • by jjjhs ( 2009156 )
            I have a hard drive; it can last indefinitely without all this nonsense to reduce wear.
          • by wbo ( 1172247 )

            I have a similar setup (128GB SSD). You make it sound easy. I tried several ways to move everything over to a hard drive so that nothing user-related was stored on the system/boot SSD. I tried hard links, fiddling with the registry, changing environment variables, but in the end I gave up and kept the stub of my user directory on C: where Windows seemed to want it, and moved all the individual directories (Documents, Music, Pictures, etc.) using the "right-click and use Location tab" approach. Microsoft doe

            • by darrylo ( 97569 )

              I've moved user directories after installation using these basic instructions [lifehacker.com], without having to resort to installation foo. I've actually done this 3-4 times over the past year, due to stupidity on my part trashing my system drive (and not having any backups, which I now do have). I've never seen any junction issues, but that's probably because I have c:\users\spoo pointing to d:\users\spoo (c:\users still exists and is valid).
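
For reference, the registry location involved in the profile-moving trick discussed above is the ProfileList key. A read-only peek using Python's standard winreg module (Windows only); actually repointing ProfilesDirectory is the unsupported part, so this sketch deliberately only reads:

```python
# Read-only look at where Windows records profile paths
# (HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList).
# Repointing ProfilesDirectory at another drive is the registry trick
# mentioned above; Microsoft doesn't officially support changing it.
import winreg

key = winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList",
)
for name in ("ProfilesDirectory", "Default", "Public"):
    value, _ = winreg.QueryValueEx(key, name)
    print(f"{name} = {value}")
winreg.CloseKey(key)
```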

      • by Anonymous Coward

        Indeed, you can get a reasonably big SSD (big enough for a normal single-OS work laptop) for less than $200, and if you shell out $400, you'll be over 256 GB.

        When the cost of labor in western countries is what it is, an SSD is an investment well worth the money, with payback time measured in months even if it only saves a mere 5 minutes per day. Oh, it also acts as nice extra protection against shocks (I'm probably not the only one who's online with a toddler on the lap).
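
That payback claim is easy to sanity-check. A back-of-the-envelope sketch; the wage, price and time-saved figures are hypothetical:

```python
# Back-of-the-envelope payback time for an SSD upgrade, assuming it
# saves a fixed amount of paid time per working day. All inputs are
# hypothetical placeholders.
ssd_price = 200.0              # USD
hourly_cost = 50.0             # fully loaded labor cost, USD/hour
minutes_saved_per_day = 5
workdays_per_month = 21

monthly_savings = minutes_saved_per_day / 60 * hourly_cost * workdays_per_month
print(f"saves ~${monthly_savings:.0f}/month; "
      f"payback in ~{ssd_price / monthly_savings:.1f} months")
```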

      • Exactly. Most people buying SSDs are using them to store OS and regularly opened programs, but have standard HDDs to store the stuff that doesn't "need" the added performance. Media is a perfect example of something that is silly to put on a SSD (unless you're actually editing said media, not merely consuming it).

        The bulk of the data being generated by most people today does not need to be stored on SSDs, really, nor should it be. It's the equivalent of buying a Ferrari to use as your daily driver.

        • by tlhIngan ( 30335 )

          Exactly. Most people buying SSDs are using them to store OS and regularly opened programs, but have standard HDDs to store the stuff that doesn't "need" the added performance. Media is a perfect example of something that is silly to put on a SSD (unless you're actually editing said media, not merely consuming it).

          Even editing media doesn't require SSDs - media is huge and even buffering a "small" amount like 1MB the seek tim

        • by Hadlock ( 143607 )

          I would really like to see smaller (32GB) SSDs hardwired to the motherboard in laptops, with the option to add a second spinning drive in the open HD bay. Unfortunately, two drive bays are reserved for the uppermost echelons of business laptops.

          • You could always buy one of the laptops that allows you to remove the optical drive and put a hard drive in its place.

      • Yep - I've got a 512GB hybrid Seagate (32GB of which is SSD) in one bay and a 1TB Western Digital in what used to be my optical bay (MacBook Pro), giving me plenty of storage for not much money at all, relatively speaking, and the 32GB does seem to help with performance, especially for what I run often.

    • by AngryDeuce ( 2205124 ) on Wednesday May 16, 2012 @08:54AM (#40016199)

      SSDs are too expensive

      Regular hard drives were just as expensive (if not more so) when they were at a comparable point in their development and life cycle.

      Here is an awful-colored chart [ns1758.ca] showing price per MB over the years. It's not so much that SSDs are really that expensive, it's that traditional HDDs have gotten ridiculously cheap, and capacities have grown beyond the storage needs of most average people. I remember actually filling up hard drives and having to buy larger and larger disks to hold my shit every couple of years, but the 500 GB WD in my current build is running at 40% capacity and I've got a lot of media on there.

      • by Kjella ( 173770 )

        but the 500 GB WD in my current build is running at 40% capacity and I've got a lot of media on there

        No you don't. But I'm finally starting to figure out what the people who lived on campus and had 100 Mbit around Y2K were talking about when they said streaming was the future, while I was still fighting with 64 kbps ISDN hoping to get a 1 Mbps ADSL line. Or even download-and-delete, which is a lesser form. Right now there are ~20,000 Blu-rays on Amazon, and there's no reason for me to have a petabyte array to store them on. That's 100 people with a 10 TB server, or 1000 people with a 1 TB disk, or 10000 people wi
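
The arithmetic behind that point, roughly. The disc count comes from the comment; the average disc size is a guess:

```python
# Rough arithmetic for "streaming beats local storage": the catalogue
# only has to exist once, however many people draw from it.
discs = 20_000          # Blu-rays on Amazon, per the comment above
avg_size_gb = 40        # guessed average per disc

catalogue_tb = discs * avg_size_gb / 1000   # ~800 TB, i.e. ~a petabyte
for users in (100, 1_000, 10_000):
    print(f"{users:>6} users sharing one copy: "
          f"{catalogue_tb / users:.1f} TB each")
```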

    • by bky1701 ( 979071 )
      "OS's / Software that don't take advantage of current technology"

      And bloat. Let's not forget bloat. There is much to be said when a modern computer running modern programs and OSes is only slightly more responsive than its equivalent in the '90s.
    • With the proper software (e.g. Flashcache) you can run a write-back cache on your SSD, so as long as your file working set is smaller than the SSD, everything will be cached.
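
Flashcache itself is a kernel-level block-device layer, so the following is only a toy illustration of the write-back idea it implements, not its actual interface:

```python
# Toy write-back cache: writes are acknowledged from the fast tier
# (the "SSD") and flushed to the slow tier (the "disk") lazily.
class WriteBackCache:
    def __init__(self, backing: dict):
        self.backing = backing    # slow tier
        self.cache = {}           # fast tier
        self.dirty = set()        # blocks not yet written back

    def write(self, key, value):
        self.cache[key] = value   # fast-path write, ack immediately
        self.dirty.add(key)

    def read(self, key):
        if key not in self.cache:              # miss: promote from disk
            self.cache[key] = self.backing[key]
        return self.cache[key]

    def flush(self):
        for key in self.dirty:                 # lazy write-back
            self.backing[key] = self.cache[key]
        self.dirty.clear()

disk = {"block0": b"old"}
c = WriteBackCache(disk)
c.write("block0", b"new")
assert disk["block0"] == b"old"   # dirty data still only on the SSD
c.flush()
assert disk["block0"] == b"new"
```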

  • by Rosco P. Coltrane ( 209368 ) on Wednesday May 16, 2012 @08:15AM (#40015785)

    When is the battery problem going to be solved? Yes, I know batteries have been getting better over the years, but devices these days have a hard time staying alive for more than 24 hours while doing anything useful.

    All these wonderful gadgets end up sucking pond water from the bottom because you need to tether them to a mains socket every few hours...

    • >> When is the battery problem going to be solved?

      Never. How do you want to "solve" that "problem" ?
      System power is a design issue, but the current state of the art is not really problematic. Of course, if you want turbo-gaming for 12 hours, it's heavy. But else ....

    • by Idbar ( 1034346 )
      Batteries improve at a bit of a slower pace. But the more power budget you give to designers, the more features they add and the more power they consume. If people are fine with having a device recharged every 24h, many designers will work with that budget in mind.
    • by msobkow ( 48369 )

      And how many people stay up 24 hours at a stretch using a battery-powered device?

      Sure, I can see the need for longer than a 12-hour lifetime in a few cases, like someone who's "off-roading" and can't plug in while they're sleeping, but the vast majority of the population just need it to function while they're awake and charge while they're sleeping, and that's all they need.

      As I've never seen a battery powered portable device that requires you to shut down in order to plug in the power suppl

        • My mom's a midwife. Her work replaced her BlackBerry with an iPhone and she went from multiple days without recharging to under a day. Given that she can be away for 30 hours straight, this meant getting multiple phone chargers (home, car, work, etc.), and she basically needs to plug it in whenever she can.

    • Comment removed based on user account deletion
      • by serviscope_minor ( 664417 ) on Wednesday May 16, 2012 @09:50AM (#40016947) Journal

        Unless you want something with the energy density of thermite

        Thermite doesn't have an especially high energy density. See here: http://en.wikipedia.org/wiki/File:Energy_density.svg [wikipedia.org]

        Pure aluminium has a moderate energy density. Once you mix in the iron oxide in stoichiometric quantities, the energy density goes down by quite a bit (a factor of 4). That still puts it better than any known battery technology, but only by a factor of 2 over zinc-air and 5 over li-poly. All the common fuels have a much higher energy density.

        The reason that thermite burns so hot is that the products of combustion have a fairly low specific heat capacity and there is no need to heat up a huge bunch of useless nitrogen (compared to burning fuel in air).

        Bottom line is that thermite beats existing battery tech by a wide margin, but falls very far short of common fuels.
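
The ratios above check out against standard reference figures. A small sketch; the thermite number falls straight out of the reaction enthalpy, and the battery and fuel values are rounded textbook approximations:

```python
# Specific energy of thermite from the reaction enthalpy:
#   2 Al + Fe2O3 -> 2 Fe + Al2O3,  dH ~ 852 kJ per mole of reaction.
dH = 852e3                        # J per mole of reaction
mass_g = 2 * 26.98 + 159.69       # grams of reactants per mole
thermite = dH / mass_g / 1000     # ~4.0 MJ/kg

approx_mj_per_kg = {              # rounded reference values
    "li-poly battery": 0.7,
    "zinc-air battery": 1.6,
    "thermite (Al+Fe2O3)": round(thermite, 1),
    "gasoline": 46.0,
}
for name, e in approx_mj_per_kg.items():
    print(f"{name:20s} ~{e:4.1f} MJ/kg")
# thermite ~2.5x zinc-air and ~5.7x li-poly, but ~1/11th of gasoline
```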

        • That is a good start but there is a bit more to it than that.

          Fe2O3 + 2Al --> 2Fe + Al2O3 is an "extremely intense exothermic reaction" [wikipedia.org]

          Also, aluminum has quite a number of useful properties that enhance the reaction -- "at least 25% oxygen, have high density, low heat of formation, and produce metal with low melting and high boiling point", etc.

          FWIW I am not sure how "fairly low specific heat capacity" helps (or even if Fe2O3 & Al have it).

          • FWIW I am not sure how "fairly low specific heat capacity" helps (or even if Fe2O3 & Al have it).

            The reaction inputs and outputs all have to be raised to the final temperature. If the products' heat capacity is high, then the final temperature will be lower, since more energy is required for each degree of temperature rise.
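
Concretely: divide the reaction enthalpy by the products' heat capacity. This crude estimate ignores melting, vaporization and the rise of heat capacity with temperature, so it lands well above the commonly quoted real-world figure of roughly 2500 °C, but it shows the mechanism. The heat-capacity values below are room-temperature textbook figures:

```python
# Crude adiabatic temperature rise for thermite: dT ~ dH / Cp(products).
# Small product heat capacity (and no inert nitrogen to warm up, unlike
# combustion in air) -> very high flame temperature.
dH = 852e3                   # J per mole of reaction: 2Al + Fe2O3
cp_al2o3 = 79.0              # J/(mol*K), Al2O3 at room temperature
cp_fe = 25.1                 # J/(mol*K), per mole of Fe
cp_products = cp_al2o3 + 2 * cp_fe

print(f"naive adiabatic rise: ~{dH / cp_products:.0f} K")  # ~6600 K
```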

  • Locked down (Score:5, Interesting)

    by tepples ( 727027 ) <.tepples. .at. .gmail.com.> on Wednesday May 16, 2012 @08:19AM (#40015813) Homepage Journal
    How many of these CPUs will appear only in devices with cryptographically locked bootloaders? The license agreement for Microsoft's forthcoming Windows RT operating system, for example, explicitly bars device manufacturers from allowing the end user to install a custom signing certificate. And even on devices that do allow homemade kernels, how many devices incorporating these non-x86 CPUs will have driver source (or even proper data sheets) that allow support for all the SoC's features in a freely licensed operating system?
    • The license agreement for Microsoft's forthcoming Windows RT operating system, for example, explicitly bars device manufacturers from allowing the end user to install a custom signing certificate.

      I missed the part where you were forced to buy Microsoft devices, instead of employing a little forward-thinking and buying a device without a locked bootloader.

      It's only an issue if you make it one. WinRT will die a death, as long as nobody buys it.

      • The license agreement for Microsoft's forthcoming Windows RT operating system, for example

        I missed the part where you were forced to buy Microsoft devices

        That or you missed the "for example".

        instead of employing a little forward-thinking and buying a device without a locked bootloader.

        To employ forward-thinking and buy an unlocked device, first you have to know that unlocked devices exist. For example, in the United States market, the most popular handheld gaming devices with physical buttons are the DS series and PSP series. Only hardcore geeks ever mail-order a GP2X product, for example; non-geeks don't even know they exist.

    • There's a very simple solution: Don't buy a device that has a locked-down bootloader that hasn't been cracked yet.

      • by tepples ( 727027 )
        The question remains the same: How many of these CPUs will appear only in devices that one should not buy by that metric?
    • by jd ( 1658 )

      The MIPS won't, since Microsoft doesn't write for it, so that's 3 of the CPUs. Same for the Itanium, since Microsoft has abandoned that. It's very unlikely Microsoft is developing for both the A9 and A15, so that eliminates half of what's left. Most ARMs won't be running an MS OS; it'll be a minority OS for a long time on that chip. So really only the AMD CPU even has the potential for vendor lock-in by Microsoft.

      • Microsoft's forthcoming Windows RT operating system, for example

        The MIPS won't, since Microsoft doesn't write for it

        I wasn't referring to Windows RT as the only example of a locked bootloader. For MIPS, I'd be more inclined to use the examples of PlayStation 2, PlayStation Portable, TiVo DVR, and various companies' set-top boxes.
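
For what it's worth, the "lock" in all of these examples is conceptually tiny. A toy sketch of the control point follows; real locked bootloaders verify an RSA/ECDSA signature against a vendor key fused into the device rather than using a bare hash allowlist, and the hash below is a placeholder:

```python
# Toy verified-boot check. The whole policy question is who controls
# the trusted set: the Windows RT terms discussed above amount to
# "the user may not add entries here".
import hashlib

TRUSTED_IMAGE_HASHES = {
    "placeholder-vendor-hash",   # burned in at manufacture
}

def boot(image: bytes) -> None:
    digest = hashlib.sha256(image).hexdigest()
    if digest not in TRUSTED_IMAGE_HASHES:
        raise SystemExit("refusing to boot unsigned image")
    # ...otherwise jump to the kernel entry point...
```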

  • 64 bit ARMv8 (Score:4, Interesting)

    by Lazy Jones ( 8403 ) on Wednesday May 16, 2012 @08:36AM (#40015989) Homepage Journal
    The 64 bit ARM architecture for server CPUs is much more interesting [eetimes.com] ...
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      No, it's not.

      ARM CPUs are actually pretty lousy when it comes to computations/watt. That crown goes to low-end Celeron CPUs, by a massive margin. It's just that ARM can operate in the very low-end, power-sipping envelope that a smartphone/tablet demands.

      You have to remember that these new ARM SoCs are actually not very fast when compared to desktop CPUs. The lowest-end single-core Celeron murders the highest-end quad-core ARM SoC in terms of computational power. This is the real reason you don't see ARM base

  • by ewg ( 158266 ) on Wednesday May 16, 2012 @08:49AM (#40016129)

    Please don't use the phrase "heating up" referring to CPUs, even as a metaphor!

  • The Itanic hasn't sunk just yet.
  • by David_Hart ( 1184661 ) on Wednesday May 16, 2012 @09:34AM (#40016711)

    I have a quad-core i5 desktop and I rarely use it now except for home video encoding/decoding and editing, and to stream media to my TV, and most of that is offloaded to the GPU. I use my PS3 and Wii for game playing. Even my relatively new HP DM4T (2010) laptop has been gathering dust lately. I've been spending most of my time, like most people, on my tablet, an HP TouchPad running CM9 Android.

    For personal use, CPUs simply do not matter any more, just battery life...

    For corporate use, CPUs matter, as we keep trying to pack more application servers onto VM hosts.

    • I have a quad-core gaming PC and I use it daily for internet access, gaming obviously, and streaming media to my TV. I don't own a games console or tablet, as I don't believe in having three devices to badly do the job of one. I've been spending most of my time, like some people (I won't presume to apply my anecdote to most people), on my PC, a Q6600 which I'm about to upgrade. It's done 5 years; it's time for something new.

      For personal use, CPUs matter a whole lot to me. My PC is my entertainment centre,
      • I find it hard to use a desktop late at night with a toddler in my lap, but a tablet (touchpad for me too, actually) works fine.

        • I would find it hard to use a desktop late at night with a toddler in my lap too.

          I decided to not have the baby. Horses for courses.
  • Look at AMD's client roadmap for 2012 [anandtech.com] and 2013 [anandtech.com]. Did you see the recent Trinity benchmarks? Sucky CPU, decent GPU. Well, look at the roadmap: those Piledriver cores are all you're going to get in AMD's "high-end" all the way through 2013. I'm sure you'll get more power in a cell phone or tablet format, but if you just want CPU power and don't care that it burns 100W because it's plugged into the wall, then the future is mostly depressing. To use a car analogy, better MPG is great but it's not exactly what's going to get cheers from the Top Gear crowd. Sure, a good soccer mom car sells, and it's the same for CPUs, but they don't excite anybody.

    • I think you missed the point where the 8150 beats the 2600K when you're using the right software, at half the price.
      Yes, with single-threaded code or code (especially benchmarks) compiled with Intel compilers, Intel CPUs are faster.

    • by tyrione ( 134248 )

      Look at AMD's client roadmap for 2012 [anandtech.com] and 2013 [anandtech.com]. Did you see the recent Trinity benchmarks? Sucky CPU, decent GPU. Well, look at the roadmap: those Piledriver cores are all you're going to get in AMD's "high-end" all the way through 2013. I'm sure you'll get more power in a cell phone or tablet format, but if you just want CPU power and don't care that it burns 100W because it's plugged into the wall, then the future is mostly depressing. To use a car analogy, better MPG is great but it's not exactly what's going to get cheers from the Top Gear crowd. Sure, a good soccer mom car sells, and it's the same for CPUs, but they don't excite anybody.

      You write like a clueless shill. More and more consumer software will be leveraging the design of Bulldozer, now replaced by the much-improved Piledriver. Even the FOSS world has a lead on the Windows world when it comes to concurrency development. LLVM/Clang/Libc++/Compiler-RT/LLDB/Libclc and more are being optimized with target hardware from AMD, ARM, Nvidia, Intel and more to take advantage of their various design tradeoffs. AMD bit the bullet and in the next 12 months it will heavily pay o

  • I found the MIPS Aptiv line interesting, and hope that they have some success in regaining some market share. Already, they've made some inroads into the Android tablet market, and their specs seem to suggest that they hold their own against ARM on power consumption, while being far more advanced in terms of 64-bit processing (MIPS has had it since the '90s, whereas ARM is only thinking about it now).

    I hope MIPS regains some of its market share in games, and becomes key in new IPv6 gear. Some more tablets base

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...