CPU Competition Heating Up In 2012?
jd writes "2012 promises to be a fun year for hardware geeks, with three new 'Aptiv-class' MIPS64 cores being circulated in soft form, a quad-core ARM A15, a Samsung ARM A9 variant, a seriously beefed-up 8-core Intel Itanium and AMD's mobile processors. There's a mix here of chips actually out, ready to be put on silicon, and in the last stages of development. Obviously these are for different users (mobile CPUs don't generally fight for market share with Itanium dragsters), but it is still fascinating to see the differences in approach and the different visions of what is important in a modern CPU. Combine this with the news reported earlier on DDR4, and this promises to be a fun year, with many new machines likely to appear that are radically different from the last generation. Which leaves just one question — which Linux architecture will be fully updated first?"
We want our 8-core i7 desktop CPU now !! (Score:2)
Intel has been dragging its feet on releasing an 8-core i7 desktop CPU to users.
It took merely two years to go from uni-processor machines to 4-core CPUs, and then progress stopped.
It has been 10 years and counting, and there is still no 8-core i7 desktop CPU.
Evolutionary! (Score:5, Insightful)
Re:Evolutionary! (Score:5, Funny)
Re:Evolutionary! (Score:5, Interesting)
Why do people keep saying AMD can't keep up? because they don't compete in a market you care about?
My wife's laptop has an AMD E-350.. it's got an ATI video card built onto the CPU.. it sucks down a whopping 9 watts, making her super light 10.6" laptop last about 7 hours.. 4GB of RAM, 500GB hard drive, can stream HD video without a hiccup, and it was $350.. about what you would pay for a nice video card.. I would say AMD is competing rather well..
In the server space, we're ditching Intel as fast as we can.. because for our loads, a 16-core Opteron runs Oracle at the same speed as a 12-core Intel (CPU usage is not our limiting factor, disk IO is for our databases), and the difference in price last time we looked was about $7k for a Dell R815 spec'd the same as a Dell R810 with dual CPUs.. That difference is a Fusion-io card, or almost another tray of drives.. which would really help IO.
Re:Evolutionary! (Score:5, Interesting)
Unfortunately most tests don't cover anything business-related, like calculating join tables and processing large volumes of relational data. Instead, they report on things businesses couldn't care less about, like the time it takes to transcode a video file or its ability to render videogame graphics.
The simple truth is that there are very few CPUs currently on the market which aren't perfectly capable of handling business application processing like document editing in a very acceptable fashion. In fact, the issue with even the "slow" CPUs is the time it takes to load and initialize an application, not its responsiveness once the application is loaded. That would seem to be more a question of storage bandwidth than of processor horsepower, but reviewers still blame the CPU for the performance.
For that matter, even the video playback reviews are kind of pointless. Once you have enough snort to render video without dropping frames or tearing, any extra power is pretty much wasted on video processing. While you can start turning on options in the video pipeline, the truth is the effects of those options are virtually unnoticeable unless you use a super-high-resolution screen to display expanded video.
I think Windows RT is going to wake up a significant portion of the population to the benefits of low-power ARM processors in the real world.
The business market requirements are not the same as the general gaming/video market's requirements.
Re: (Score:3)
Re: (Score:2)
Well, it depends. Admittedly, being essentially a luser - surf, email, watch a few vids, read too much, play a few games, and run the occasional vm - I'm part of a very small market share as distinct from any big biz or serious user stuff.
AMD is relevant to me because I could put together a box at prices that I could save up for to let me better do what I wanted. Further, I could move from a Phenom quad-core to a Phenom II hexa-core on the same mobo with only a BIOS update. Also, with the six-core I was
Re: (Score:2)
Re:Evolutionary! (Score:4, Interesting)
My wife's laptop has an AMD E-350.. it's got an ATI video card built onto the CPU.. it sucks down a whopping 9 watts, making her super light 10.6" laptop last about 7 hours.. 4GB of RAM, 500GB hard drive, can stream HD video without a hiccup, and it was $350.. about what you would pay for a nice video card.. I would say AMD is competing rather well..
Which laptop is it? There seems to be a distinct lack of super light laptops of late. My eee 900 at 935g is currently lighter than any netbook on the market, and the Asus UX-21 seems (at 1.1 kg) to be the joint lightest non-netbook.
What is it? It sounds pretty good and I want one...
In the server space, we're ditching Intel as fast as we can.. because for our loads, a 16-core Opteron runs Oracle at the same speed as a 12-core Intel
That's what I found a while back, when everyone said Intel was faster: the 12-core 6100s could cram more FLOPS into 1U than the best Intel 4-socket boxes, and at a considerably lower price. Intel have substantially improved their offerings since then (AMD not by quite so much), but the price of RAM has crashed, making the CPUs and system boards a much larger fraction of the cost and increasing AMD's advantage further.
I actually did the calculation, since I had to budget the 5-year cost, including electricity, cooling and rack space. The AMD systems chew more power, though not as much more as the raw CPU differences suggest, since with RAM maxed out, the RAM is a significant fraction of the power draw.
End result was that the AMD systems were substantially better in peak compute performance per $ and in 5-year compute power per $.
Actually, the performance is very application dependent. Some codes suck on AMD; others (rarer) had the Phenom II matching an i7 for speed.
Intel still have the single-thread performance crown, which is really useful on the desktop and laptop. If you're buying a 4-socket machine, it's a fair bet that your task is parallelizable, which reduces the advantage of Intel in the 4-socket market.
For single socket stuff that isn't too performance sensitive, AMD has the additional advantage that cheap consumer level boards support the otherwise expensive enterprisey features of ECC memory. If you care about that sort of thing, then getting a Phenom II is much cheaper than a 1 socket Xeon.
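For what it's worth, the 5-year budgeting exercise described above can be sketched in a few lines. Every number here (prices, wattages, electricity rate, PUE) is a made-up placeholder, not a figure from the parent post:

```python
# Back-of-envelope 5-year server cost comparison, in the spirit of the
# budgeting exercise above. All inputs below are hypothetical round numbers.

def five_year_cost(capex_usd, avg_watts, usd_per_kwh=0.10, pue=1.8, years=5):
    """Capex plus electricity and cooling over the service life.

    PUE (power usage effectiveness) folds the cooling overhead into the
    electricity bill as a multiplier on the IT load.
    """
    hours = years * 365 * 24
    energy_kwh = avg_watts / 1000.0 * hours * pue
    return capex_usd + energy_kwh * usd_per_kwh

# Hypothetical boxes: the AMD system draws more power but costs less up front.
amd = five_year_cost(capex_usd=14000, avg_watts=550)
intel = five_year_cost(capex_usd=21000, avg_watts=450)
print("AMD: $%.0f  Intel: $%.0f over 5 years" % (amd, intel))
```

With these placeholder inputs the extra electricity never closes the capex gap, which is the shape of the result the parent describes; plug in your own quotes and utility rates to see where the crossover sits.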
Re: (Score:2)
Answered below.. sorry, it was an AMD C-60.. and 11.6 inch.. Acer Aspire One.. I don't use it, so I don't pay attention to the specs.. I just needed to replace her old, old laptop that was heavy (17") with something small and faster/newer.. and with two little kids in the house, I am kind of figuring it's going to be amazing if it lasts two years, so dirt cheap is best.
Re: (Score:3)
I am kind of figuring it's going to be amazing if it lasts two years, so dirt cheap is best.
My experience is limited to Asus (eee 900), but the interesting thing about those netbooks was that the only thing sacrificed was speed. The build quality was excellent, surprisingly so.
Over the years, I've been kind of smug when others' vastly more expensive laptops have started to develop cracks, bad hinges, failing screens, broken power connectors, etc etc.
I've used mine a lot: it's about 4 years old and I've done
Re: (Score:3)
Sorry, I lied a bit.. it's an 11.6" screen and it has a C-60 CPU, not an E-350.. looking at the specs, it looks like it's half the power, and a bit slower than the "E" class, but still perfect for what she does.
There are several models of the Acer Aspire One that carry them; I think the one I picked up was this: http://www.newegg.com/Product/Product.aspx?Item=N82E16834215172 [newegg.com]
I picked hers up at Costco for $350..
Re: (Score:2)
Phenom II is almost completely gone from the channel now, have you run the numbers or done any performance testing with the AMD FX- line?
Our JBoss/MySQL-based app server is one application where Phenom II inde
Re: (Score:1)
End result was that the AMD systems were substantially better in peak compute performance per $ and in 5-year compute power per $.
Actually, the performance is very application dependent. Some codes suck on AMD; others (rarer) had the Phenom II matching an i7 for speed.
Intel still have the single-thread performance crown, which is really useful on the desktop and laptop. If you're buying a 4-socket machine, it's a fair bet that your task is parallelizable, which reduces the advantage of Intel in the 4-socket market.
I bolded the important bits. Not all parallel applications are created equal. There are lots which scale well to a small number of cores, but see diminishing returns as the core count goes up. Algorithms which are "embarrassingly parallel" (that is, they scale effortlessly to very high core counts) are the exception rather than the rule. So, the advantage of the AMD approach depends a very great deal on the type of code you're running. As you found, that approach (lots of weak cores) kinda sucks for ma
Re: (Score:2)
I have the same chip in my laptop and I love it. I even get to play games, just finished DeathSpank.
Re: (Score:2)
If I remember right, Oracle Enterprise Edition and Standard Edition have much different pricing on how the CPUs work.. (but Standard is locked to two physical processors).. I'm not our DBA, and don't ever have to deal with Oracle or the contracts.. so I might be wrong on that..
Re: (Score:2)
I have heard AMD is very competitive in the server market. I doubt anyone would make an argument otherwise.
Typically when we talk about this stuff, we are all talking about "Consumer Products"...
That does not include embedded. Many of these low power chips are one degree separation from embedded chips.
I would not call some of these devices a "computer" strictly speaking insofar as modern computing devices are concerned.
Is an iPhone a computer? How about a Tablet? How about an ultralight laptop with a low powe
Re: (Score:2)
One major difference is that the R815 comes with a crap service plan, vs the $1300 plan the R810 comes standard with. And if I/O is your bottleneck, shouldn't you be considering an R820 anyhow, with its PCIe Gen 3 interfaces and double the drive bays? Of course this will add to the cost, but there is no AMD alternative.
Re: (Score:2)
Too many other bottlenecks (Score:5, Insightful)
Small SSDs are cheaper (Score:5, Insightful)
slow crappy Hard Drives (SSDs too expensive)
SSDs aren't too expensive if you don't need to keep your library of videos available at a moment's notice at all times. There exist affordable SSDs that are big enough to hold an operating system, applications, and whatever documents you happen to be working on at a given time.
Re: (Score:2)
Re: (Score:1)
"Edit the registry to load your profile off a secondary drive and voilà."
I have a similar setup (128GB SSD). You make it sound easy. I tried several ways to move everything over to a hard drive so that nothing user-related was stored on the system/boot SSD. I tried hard links, fiddling with the registry, changing environment variables, but in the end I gave up and kept the stub of my user directory on C: where Windows seemed to want it, and moved all the individual directories (Documents, Music, Picture
Re: (Score:1)
Re: (Score:3)
Re: (Score:2)
I've moved user directories after installation using these basic instructions [lifehacker.com], without having to resort to installation foo. I've actually done this 3-4 times over the past year, due to stupidity on my part trashing my system drive (and not having any backups, which I now do have). I've never seen any junction issues, but that's probably because I have c:\users\spoo pointing to d:\users\spoo (c:\users still exists and is valid).
Re: (Score:1)
Indeed, you can get a reasonably big SSD (big enough for a normal single-OS work laptop) for less than $200, and if you shell out $400, you'll be over 256 GB.
When the cost of labor in western countries is what it is, an SSD is an investment well worth the money, with payback time measured in months even if it only saves a mere 5 minutes per day. Oh, it also acts as a nice extra protection against shocks (I'm probably not the only one who's online with a toddler on the lap).
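The payback claim above is easy to sanity-check. The labor rate and workdays-per-month below are made-up round numbers, not figures from the post:

```python
# Rough SSD payback estimate for the "saves 5 minutes a day" claim.
# Labor rate and workdays-per-month are hypothetical round numbers.

def payback_months(ssd_cost_usd, minutes_saved_per_day,
                   labor_usd_per_hour=50.0, workdays_per_month=21):
    # Value of time saved per month, at the assumed labor rate.
    saved_per_month = (minutes_saved_per_day / 60.0) * labor_usd_per_hour * workdays_per_month
    return ssd_cost_usd / saved_per_month

# A $200 SSD saving 5 minutes/day at $50/h labor pays for itself in a
# couple of months.
print("%.1f months" % payback_months(200, 5))
```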
Re: (Score:2)
Exactly. Most people buying SSDs are using them to store OS and regularly opened programs, but have standard HDDs to store the stuff that doesn't "need" the added performance. Media is a perfect example of something that is silly to put on a SSD (unless you're actually editing said media, not merely consuming it).
The bulk of the data being generated by most people today does not need to be stored on SSDs, really, nor should it be. It's the equivalent of buying a Ferrari to use as your daily driver.
Re: (Score:2)
Even editing media doesn't require SSDs - media is huge and even buffering a "small" amount like 1MB the seek tim
Re: (Score:2)
I would really like to see smaller (32GB) SSDs hard-wired to the motherboard in laptops, with the option to add a second spinning drive in the open HD bay. Unfortunately, two drive bays are reserved for only the uppermost echelons of business laptops.
Re: (Score:2)
You could always buy one of the laptops that allows you to remove the optical drive and put a hard drive in its place.
Re: (Score:2)
I'm relatively sure that feature is only found in business class laptops, as well.
Re: (Score:2)
Yep - I've got a 512G hybrid Seagate (32G of which is SSD) in one bay and a 1T Western Digital in what used to be my optical bay (macbook pro) giving me plenty of storage for not much money at all, relatively speaking, and the 32G does seem to help with performance especially for what I run often.
Re:Too many other bottlenecks (Score:5, Informative)
SSDs too expensive
Regular hard drives were just as expensive (if not more so) when they were at a comparable point in their development and life-cycle.
Here is an awful-colored chart [ns1758.ca] showing price per MB over the years. It's not so much that SSDs are really that expensive, it's that traditional HDDs have gotten ridiculously cheap, and capacities have grown beyond the storage needs of most average people. I remember actually filling up hard drives and having to buy larger and larger disks to hold my shit every couple years, but the 500 GB WD in my most current build is running at 40% capacity and I've got a lot of media on there.
Re: (Score:2)
but the 500 GB WD in my most current build is running at 40% capacity and I've got a lot of media on there
No you don't, but I'm finally starting to figure out what the people who lived on campus and had 100 Mbit around Y2K were talking about when they said streaming was the future, while I was still fighting with 64 kbps ISDN hoping to get a 1 Mbps ADSL line. Or even download-and-delete, which is a lesser form. Right now there are ~20,000 Blu-rays on Amazon, and there's no reason for me to have a petabyte array to store them on. That's 100 people with a 10 TB server or 1000 people with a 1 TB disk or 10000 people wi
Re: (Score:2)
And bloat. Let's not forget bloat. It says a lot that a modern computer running modern programs and OSes is only slightly more responsive than its counterpart from the 90s.
Re: (Score:2)
With the proper software (e.g. Flashcache) you can run a write-back cache on your SSD, so as long as your file working set is smaller than the SSD, everything will be cached.
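For anyone curious, a minimal Flashcache setup looks roughly like the following. This is a sketch, not a tested recipe: the device paths are placeholders, and the exact flashcache_create flags should be checked against the Flashcache documentation for your version:

```shell
# Create a write-back cache device named "cachedev" pairing an SSD with a
# spinning disk (device paths are placeholders for your own hardware).
# "-p back" selects write-back mode; "thru" and "around" are the safer
# write-through/write-around alternatives.
flashcache_create -p back cachedev /dev/ssd /dev/hdd

# Mount the combined device-mapper target instead of the raw disk.
mount /dev/mapper/cachedev /mnt/data
```

Note that write-back caching means dirty data lives only on the SSD until flushed, so an SSD failure can lose writes; that's the trade-off for the speedup.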
All that's great but (Score:3)
When is the battery problem going to be solved? Yes, I know batteries have been getting better over the years, but devices these days have a hard time staying alive more than 24 hours doing anything useful.
All these wonderful gadgets end up sucking pond water from the bottom because you need to tether them to a mains socket every few hours...
Never (Score:2)
>> When is the battery problem going to be solved?
Never. How do you want to "solve" that "problem"?
System power is a design issue, but the current state of the art is not really problematic. Of course, if you want turbo-gaming for 12 hours, it's heavy. But else
Re: (Score:3)
Re: (Score:3)
And how many people stay up 24 hours at a stretch using a battery powered device?
Sure, I can see the need for longer than a 12-hour lifetime in a few cases, like someone who's "off-roading" and can't plug in while they're sleeping, but the vast majority of the population just need it to function while they're awake and charge while they're sleeping, and that covers everything they need.
As I've never seen a battery powered portable device that requires you to shut down in order to plug in the power suppl
some people still need long life (Score:2)
My mom's a midwife. Her work replaced her BlackBerry with an iPhone and she went from multiple days without recharging to under a day. Given that she can be away for 30 hours straight, this meant getting multiple phone chargers (home, car, work, etc), and she basically needs to plug it in whenever she can.
Re: (Score:2)
Re:All that's great but (Score:4, Insightful)
Unless you want something with the energy density of thermite
Thermite doesn't have an especially high energy density. See here: http://en.wikipedia.org/wiki/File:Energy_density.svg [wikipedia.org]
Pure aluminium has a moderate energy density. Once you mix in the iron oxide in stoichiometric quantities, the energy density goes down by quite a bit (a factor of 4). That still beats any known battery technology, but only by a factor of 2 over zinc-air and 5 over li-poly. All the common fuels have a much higher energy density.
The reason that thermite burns so hot is that the products of combustion have a fairly low specific heat capacity and there is no need to heat up a huge bunch of useless nitrogen (compared to burning fuel in air).
Bottom line is that thermite beats existing battery tech by a wide margin, but falls very far short of common fuels.
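For reference, thermite's specific energy can be estimated from the reaction enthalpy. The constants below are standard textbook values, rounded:

```python
# Energy density of thermite from the reaction enthalpy:
#   Fe2O3 + 2 Al -> Al2O3 + 2 Fe,  dH ~ -851.5 kJ per mole of reaction
# Molar masses (g/mol): Fe2O3 ~ 159.7, Al ~ 26.98.

dH_kj = 851.5
reactant_mass_g = 159.7 + 2 * 26.98
thermite_mj_per_kg = dH_kj / reactant_mass_g   # kJ/g is numerically MJ/kg
print("thermite: ~%.1f MJ/kg" % thermite_mj_per_kg)

# Approximate comparison figures (not computed here, quoted for scale):
#   gasoline: ~46 MJ/kg    zinc-air: ~1.3 MJ/kg    li-poly: ~0.7 MJ/kg
```

That works out to roughly 4 MJ/kg, consistent with the parent's "better than batteries, far short of common fuels" conclusion.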
Re: (Score:2)
Fe2O3 + 2Al --> 2Fe + Al2O3 is an "extremely intense exothermic reaction" [wikipedia.org]
Also, aluminum has quite a number of useful properties that enhance the reaction -- "at least 25% oxygen, have high density, low heat of formation, and produce metal with low melting and high boiling point", etc.
FWIW I am not sure how "fairly low specific heat capacity" helps (or even if Fe2O3 & Al have it).
Re: (Score:2)
FWIW I am not sure how "fairly low specific heat capacity" helps (or even if Fe2O3 & Al have it).
The reaction inputs and outputs all have to be raised to the final temperature. If the heat capacity of the products is high, the final temperature will be lower, since more energy is required for each degree of temperature rise.
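A naive way to see this: approximate the adiabatic temperature rise as the reaction enthalpy divided by the products' total heat capacity. The Cp values below are rough room-temperature figures and latent heats of melting/boiling are ignored, so the absolute number is a large overestimate; the point is only the inverse relationship:

```python
# Naive adiabatic temperature rise for Fe2O3 + 2 Al -> Al2O3 + 2 Fe.
# Rough constant Cp values; phase changes ignored, so dT is overstated.

dH_j = 851500.0                  # J per mole of reaction
cp_products = 79.0 + 2 * 25.0    # J/K: one Al2O3 (~79) plus two Fe (~25 each)
dT = dH_j / cp_products
print("naive adiabatic rise: ~%.0f K" % dT)

# Doubling the products' heat capacity halves the rise -- exactly the
# relationship the parent post describes.
assert abs(dH_j / (2 * cp_products) - dT / 2) < 1e-9
```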
Locked down (Score:5, Interesting)
Re: (Score:2)
The license agreement for Microsoft's forthcoming Windows RT operating system, for example, explicitly bars device manufacturers from allowing the end user to install a custom signing certificate.
I missed the part where you were forced to buy Microsoft devices, instead of employing a little forward-thinking and buying a device without a locked bootloader.
It's only an issue if you make it one. WinRT will die a death, as long as nobody buys it.
First you have to know that unlocked devices exist (Score:3)
The license agreement for Microsoft's forthcoming Windows RT operating system, for example
I missed the part where you were forced to buy Microsoft devices
That or you missed the "for example".
instead of employing a little forward-thinking and buying a device without a locked bootloader.
To employ forward-thinking and buy an unlocked device, first you have to know that unlocked devices exist. For example, in the United States market, the most popular handheld gaming devices with physical buttons are the DS series and PSP series. Only hardcore geeks ever mail-order a GP2X product, for example; non-geeks don't even know they exist.
Re: (Score:2)
There's a very simple solution: Don't buy a device that has a locked-down bootloader that hasn't been cracked yet.
Re: (Score:2)
Re: (Score:2)
The MIPS won't, since Microsoft doesn't write for it, so that's 3 of the CPUs. Same for the Itanium, since Microsoft has abandoned that. It's very unlikely Microsoft is developing for both the A9 and A15, so that eliminates half of what's left. Most ARMs won't be running a MS OS, it'll be a minority OS for a long time on that chip. So really only the AMD CPU even has the potential for vendor lock-in by Microsoft.
Microsoft is not the issue (Score:2)
Microsoft's forthcoming Windows RT operating system, for example
The MIPS won't, since Microsoft doesn't write for it
I wasn't referring to Windows RT as the only example of a locked bootloader. For MIPS, I'd be more inclined to use the examples of PlayStation 2, PlayStation Portable, TiVo DVR, and various companies' set-top boxes.
Itanic (Score:2)
That's what I thought, reading that. Who buys it? If they're continuing it, Intel & HP might as well make lower-end servers and workstations, stop pretending that it's an ultra-high-performance CPU, and instead promote a variety of platforms that use it. The list of OSes that support it has shrunk, but they could offer servers with options of FreeBSD or Debian, workstations/laptops with a specially compiled PC-BSD or Debian, and try promoting it that way.
If they're not killing the CPU line, why keep it stuck
64 bit ARMv8 (Score:4, Interesting)
Re: (Score:2, Interesting)
No, it's not.
ARM CPUs are actually pretty lousy when it comes to computations/watt. That crown goes to low-end Celeron CPUs, by a massive margin. It's just that ARM can operate in the very low-end, power-sipping envelope that a smartphone/tablet demands.
You have to remember that these new ARM SoCs are actually not very fast when compared to desktop CPUs. The lowest-end single-core Celeron murders the highest-end quad-core ARM SoC in terms of computational power. This is the real reason you don't see ARM base
"Heating up"? (Score:3)
Please don't use the phrase "heating up" referring to CPUs, even as a metaphor!
Re: (Score:2)
It goes very fast in a straight line, corners horribly and requires enormous amounts of energy. :)
I'm shocked... (Score:2)
Itanic OSs (Score:2)
OpenVMS and NonStop are dead OSes. The only people interested in them would be former DEC and Tandem houses that have too much sunk in, but even they would have been better off staying with their AlphaServers or Himalaya servers rather than sinking cash into Itanium.
Not only that, you know that the platform is a loser when even Linux vendors choose to drop it - talking specifically about Red Hat, Oracle and Canonical. Even Microsoft, which previously dropped Windows NT on RISC platforms, has dropped
CPU Wars? New Boxes? What? Why? (Score:3)
I have a quad-core i5 desktop and I rarely use it now, except for home video encoding/decoding and editing and to stream media to my TV, and most of that is offloaded to the GPU. I use my PS3 and Wii for game playing. Even my relatively new HP DM4T (2010) laptop has been gathering dust lately. I've been spending most of my time, like most people, on my tablet, an HP TouchPad running CM9 Android.
For personal use, CPUs simply do not matter any more, just battery life...
For corporate use, CPUs matter, as we keep trying to pack more application servers onto virtual machines.
Tablet devices? Touchscreen Interfaces? What? Why? (Score:2)
For personal use, CPUs matter a whole lot to me. My PC is my entertainment centre,
personal use varies... (Score:2)
I find it hard to use a desktop late at night with a toddler in my lap, but a tablet (touchpad for me too, actually) works fine.
Re: (Score:2)
I decided to not have the baby. Horses for courses.
No, not really... (Score:1)
Look at AMD's client roadmap for 2012 [anandtech.com] and 2013 [anandtech.com]. Did you see the recent Trinity benchmarks? Sucky CPU, decent GPU. Well, look at the roadmap: those Piledriver cores are all you're going to get in AMD's "high-end" all the way through 2013. I'm sure you'll get more power in a cell phone or tablet format, but if you just want CPU power and don't care that it burns 100W because it's plugged into the wall, then the future is mostly depressing. To use a car analogy, better MPG is great but it's not exactly what's going
Re: (Score:2)
I think you missed the point where the 8150 beats the 2600K when you're using the right software, at half the price.
Yes, with single-threaded code, or code (especially benchmarks) compiled with Intel compilers, Intel CPUs are faster.
Re: (Score:2)
Look at AMD's client roadmap for 2012 [anandtech.com] and 2013 [anandtech.com]. Did you see the recent Trinity benchmarks? Sucky CPU, decent GPU. Well, look at the roadmap: those Piledriver cores are all you're going to get in AMD's "high-end" all the way through 2013. I'm sure you'll get more power in a cell phone or tablet format, but if you just want CPU power and don't care that it burns 100W because it's plugged into the wall, then the future is mostly depressing. To use a car analogy, better MPG is great but it's not exactly what's going to get cheers from the Top Gear crowd. Sure, a good soccer-mom car sells, and it's the same for CPUs, but they don't excite anybody.
You write like a clueless shill. More and more consumer software will be leveraging the design of Bulldozer, now replaced by the much-improved Piledriver. Even the FOSS world has a lead on the Windows world when it comes to concurrency development. LLVM/Clang/Libc++/Compiler-RT/LLDB/Libclc and more are being optimized with target hardware from AMD, ARM, Nvidia, Intel and more to take advantage of their various design tradeoffs. AMD bit the bullet, and in the next 12 months it will heavily pay o
MIPS Aptiv (Score:2)
I found the MIPS Aptiv line interesting, and hope that they have some success in regaining market share. Already, they've made some inroads into the Android tablet market, and their specs seem to suggest that they hold their own against ARM on power consumption, while being far more advanced in terms of 64-bit processing (MIPS has had it since the 90s, whereas ARM is only now thinking about it).
I hope MIPS regains some of its marketshare in games, and becomes key in new IPv6 gear. Some more tablets base