Intel Sandy Bridge Desktop and Mobile CPUs 116
Vigile writes "The new Intel Sandy Bridge architecture is being launched at CES this week but the reviews and benchmarks are out today. PC Perspective took a look at both the desktop and mobile variants, the former of which turns out to be quite an impressive processor for both highly threaded and single threaded applications. With some tweaks to the execution unit, a new Turbo Boost mode that increases clock speeds dynamically and a vastly improved integrated graphics implementation, the Core i7-2600K improves in every aspect. Also interestingly, the most expensive desktop part will start at $317, putting the screws to AMD yet again. On the mobile side of things, PC Perspective tested the quad-core Core i7-2820QM and the benchmark results are equally impressive; especially when looking at the gaming performance using integrated graphics. Sandy Bridge will no doubt put quite a dent in the discrete notebook graphics market for NVIDIA and AMD."
Impressive graphics ? (Score:2, Insightful)
What benchmarks is the poster reading exactly? On the Mac side, the SB IGP barely beat out the current nVidia 320M in shipping MacBooks at low settings (a CPU-bound scenario), and couldn't match its performance at medium settings, meaning the SB IGP is slower than nVidia's offering from 2009!
There's nothing impressive, this is standard Intel IGP fare.
Is this the Tock? (Score:3, Informative)
The Sandy Bridge architecture, aside from the die shrink and the subsequent increase in clock rate it entails, is in my opinion not that much of an improvement over the previous i7 Lynnfield architecture (i7 860, 870, 875K, 880). Here is an article that benchmarks [inpai.com.cn] a Sandy Bridge CPU vs an i7-875K with the frequency of both processors set to 3.4 GHz... not that big of an improvement.
Funny thing is, many of the articles today are praising the chip as a big improvement over Lynnfield without making it clear how much of that comes from clock speed alone.
Re: (Score:1)
Re: (Score:1)
Re: (Score:2, Informative)
Also, if they don't like you, they can disable the processor from afar. All that at no extra cost! That will be a boon for stopping computers from spreading to countries they don't like.
http://www.techspot.com/news/41643-intels-sandy-bridge-processors-have-a-remote-kill-switch.html [techspot.com]
Re: (Score:1)
Oooh, yeah, that's _totally_ what they're going to use it for! Thank you for this non-retarded post making a meaningful point!
OK, in my old age I find sarcasm more and more of a lame mechanism, but since your post is so silly I don't feel bad.
... and an Embedded DRM (Score:2)
Re: (Score:1)
Looking into this further, it looks like Intel is FINALLY including vPro/TPM by default (it's currently optional). While you could use TPM for DRM, it is also extremely useful for full disk encryption and a load of other stuff.
Re: (Score:1)
The tech basically moves the BIOS password from residing in the BIOS into the CPU itself... I'm still not sure why you would want this. The motherboard manufacturer would need to include all the equipment needed to do anything remotely, and the "kill" on the CPU can be undone.
Re: (Score:2)
That's most likely the source of the performance increase... not the clock speed increase
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
I'm a bit of a Mac fanboy, and I do like the Core2Duo / nVidia combo, but the C2D is now two generations behind. OK, Intel's IGP is roughly two generations behind nVidia, but it's harder to compare.
The only ray of hope is Intel's new mini SSD HDD - which might leave some room for a discrete GPU.
Re: (Score:1)
The 320M still ships in most notebooks that have an nVidia GPU. It is still in active use.
The real deal is integrating a decent (320M- or R5450-class) GPU into your notebook; then you don't NEED a 320M in your notebook. High-end gamers will still go Alienware, and your mom and dad will probably never use the full graphics power of their GPU. They might also find the HD video decoding capabilities of the GPU quite nice, and Intel has been quite fast in providing driver support.
This is a serious blow to nVidia. Why pay them more for a GPU the integrated graphics already covers?
Re: (Score:2)
In the case of Macs, the 320M is an integrated part. This Intel IGP is slower than an nVidia IGP from 2009. There is no evolution here at all; this is regression.
Re: (Score:2)
I don't know what benchmarks you're reading, but Anandtech's show the Intel IGP beating a discrete 320M in 5 out of 6 tests. As far as I'm concerned the evolution here is:
2009: Core2Duo + GeForce 320M integrated – 50W total
2011: Sandy Bridge quad core – 45W total
So, we've dropped power consumption slightly, added two extra cores, got a (slightly) faster graphics card, the cores on the CPU are each much faster than the cores on the original, and last but not least we've gained hardware video encoding.
Re: (Score:2)
I don't know what benchmarks you're reading, but the 320M in the MacBook Pro 13" he used is not a discrete/dedicated card; it is integrated into the system chipset, so it very much is an IGP.
Re: (Score:2)
Anandtech didn't test with the MacBook Pro – they tested a Core i5 machine with a discrete GT 320M, and the Intel IGP won 5 of 6 tests.
Re: (Score:2)
Uh? Anandtech did test the MacBook with a 320M. Check out this image from the review: http://images.anandtech.com/graphs/graph4084/34978.png. In fact, if you look closely at that graph, there is no mention of the GT 320M on it, only the Apple MBP13 (P8600 + 320M). In a low-settings scenario, SB's IGP barely beats out the nVidia 320M, but that is probably more a testament to the Core i7 chip vs the C2D than to actual IGP performance. At medium settings, the tables are reversed and the 320M beats the SB IGP.
Re:Impressive graphics ? (Score:4, Insightful)
Re:Impressive graphics ? (Score:5, Informative)
The 320M is not a discrete graphics option, it's an integrated graphics option, same as this SB GPU. So you disagree out of ignorance more than anything else. This is again a really poor showing on Intel's part.
Re: (Score:2)
Re: (Score:1)
Not sure what's Informative about your post, as it's incorrect. The 320M is not a CPU-integrated GPU, which is what the SB GPU is. It's an off-chip option. Just because it's in a little chip doesn't make it dissimilar to a discrete card - it just has a lower power envelope and a different IO configuration to memory and the CPU bus.
Guess what happens if you take the 320M and put it on a little PCI Express board with some different traces? Yep - discrete graphics!
The 320M is a discrete graphics solution, and should be treated as one.
Re:Impressive graphics ? (Score:5, Informative)
The 320M used in Macs shares memory with main system memory. That used to be the definition of an integrated graphics part. Dedicated/discrete GPUs have their own memory, hence the dedicated/discrete part of the name. I've been following graphics cards/benchmarks/terminology since the mid-90s and 3dfx's rise to fame.
The 320M I'm talking about, and that Anand used, is integrated in the chipset, same as all the Intel graphics before it, so it shares its die with a memory controller, a SATA controller, a PCI interface and a USB controller. It is the very definition of an integrated graphics part. Intel has simply moved the part off the chipset and integrated it onto the CPU die itself. That doesn't make their showing any more impressive.
Re: (Score:2)
That Intel defence has been used for 5 years now: that this time it's different.
Re: (Score:3)
Well, from the article, my impression is they are trying to work more with DirectX than any other solution. Since the Mac does not support Microsoft DirectX graphics, I assume we will still see nVidia discrete graphics for some time to come.
Re: (Score:2)
Re: (Score:2)
Well, when it's Intel, hitting scores that competitors (or rather, proper GPU manufacturers) hit 5 years ago is newsworthy.
Is it going to make a dent in AMD/nVidia notebook GPU sales? Unlikely. It's just a modern version of the GMA 950. Will it appear in laptops? Yes, of course - so did Intel's previous GPUs; Apple even called them 3D accelerators.
Additional Story Resources (Score:5, Informative)
HotHardware Mobile: http://hothardware.com/Reviews/Intel-Core-i72820QM-Mobile-Sandy-Bridge-Processor-Review/ [hothardware.com]
HH Desktop: http://hothardware.com/Reviews/Intel-Core-i72600K-and-i52500K-Processors-Debut/ [hothardware.com]
Anandtech: http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i5-2600k-i5-2500k-and-core-i3-2100-tested [anandtech.com]
Tech Report: http://techreport.com/articles.x/20188 [techreport.com]
Legit Reviews: http://legitreviews.com/article/1506/1/ [legitreviews.com] (mobile)
Legit: http://legitreviews.com/article/1501/1/ [legitreviews.com] (desktop)
Re: (Score:3, Informative)
Re:Additional Story Resources (Score:5, Informative)
Agreed! The more people read about these products, the better informed they'll be. A couple more:
bit-tech: http://bit-tech.net/hardware/2011/01/03/intel-sandy-bridge-review/1 [bit-tech.net]
Neoseeker: http://neoseeker.com/Articles/Hardware/Reviews/Intel_i7_2600K_Intel_i5_2500K [neoseeker.com]
Re: (Score:2)
Nobody cares, since the people buying these won't care about playing Black Ops at the highest settings. I sat out the Steam sale this year because almost every game is available for the Xbox 360 and PS3. I just want a laptop to play Civ 4 once in a while and store all my photos and music. And Sandy Bridge seems to spank Apple's laptops in a lot of areas now.
Re: (Score:2)
Nobody cares, since the people buying these won't care about playing Black Ops at the highest settings. I sat out the Steam sale this year because almost every game is available for the Xbox 360 and PS3. I just want a laptop to play Civ 4 once in a while and store all my photos and music. And Sandy Bridge seems to spank Apple's laptops in a lot of areas now.
The problem is that the integrated graphics just barely keeps up with the MacBook graphics at the lowest resolution because of the faster CPU, but once the resolution is increased, Sandy Bridge integrated graphics gets slaughtered by the lowly nVidia 320M in the MacBook.
but apple is pushing open CL and weaker video does (Score:2)
But Apple is pushing OpenCL, and weaker video does not help.
Re: (Score:2)
Duplicate Links (Score:3)
CmdrTaco, you duped the links, which appears to be an accident.
More Reviews... (Score:5, Informative)
Here's a few:
http://www.overclockers.com/intel-i7-2600k-sandy-bridge-review [overclockers.com]
http://legitreviews.com/article/1501/1/ [legitreviews.com]
http://www.tweaktown.com/reviews/3754/intel_core_i7_2600k_and_core_i5_2500k_sandy_bridge_cpus/index.html [tweaktown.com]
http://www.hitechlegion.com/reviews/processors/7689-intel-core-i5-2500k-processor-review [hitechlegion.com]
Re: (Score:2)
IMO, this review about how Sandy Bridge finally makes quad-core mobile processors mainstream is particularly important:
http://www.anandtech.com/show/4084/intels-sandy-bridge-upheaval-in-the-mobile-landscape [anandtech.com]
As expected (Score:2)
Personally I'm more looking forward to the octo-core units, which are scheduled for Q3 2011.
Combined with a decent dedicated GFX card they'll make a good basis for a new 3D workstation.
Goodbye LGA 1366 and 1156 (Score:5, Interesting)
I'm all for bigger and better but it's a pain to throw away a $500 motherboard [newegg.com] every 18 months because Intel decided they want to change the socket.
On the other hand the latest 6-core processors from AMD [techguy.org] still support 3+ yr old AM2+ [legitreviews.com] motherboards. It's nice to see someone still looking out for the budget shopper.
Re: (Score:3)
I would attribute this mostly to growing pains related to moving the memory controller into the CPU directly. IIRC AMD had some socket thrash when they did this (though one certainly can wish Intel learned from AMD and only changed the socket once, not twice.)
-nB
Re: (Score:1)
I don't get why 1366 gets no love, and why things like this aren't on 1366.
I would have paid extra for triple-channel over dual-channel, but it sucks when the enthusiast gear lags behind the normal stuff :/
Sure, they use higher clocks on their dual-channel sticks, but why no chipset and processors with higher-clocked triple channel?
The triple-channel CPUs perform better in SC2 even though it only uses two of the cores. Enough said =P
Re: (Score:1)
FYI, the triple-channel parts are coming out in the 3rd quarter.
The real replacement for LGA 1366 is LGA 1356 and the new LGA 2011.
Re: (Score:1)
That sucks, so no upgrades for 1366 processors until then?
Will 1356 be the new mainstream and 2011 the new enthusiast/workstation platform, or what? Guess I can check myself.
I don't know whether 1366 is worth it or not. It seems like it, but it still feels older :)
Re: (Score:2)
Might have a little bit of truth to it, but I thought this was just par for the course -- AMD did the same socket-switching BS back in the 754-939-AM2 days, when they had the fastest chips.
Re: (Score:2)
Sandy Bridge is the tock, not the tick, since a tock is a new architecture and a tick is a process shrink. Westmere was the last tick, Nehalem the tock before it. Ivy Bridge is the next tick, Haswell is reported to be the next tock, with Rockwell as the following tick.
Re:Goodbye LGA 1366 and 1156 (Score:5, Interesting)
This is pretty much the only reason I still stick with AMD. My upgrade cycles are every 2-4 years, so you'd think it would make more sense for me to go Intel since their stuff is "better". Nope! I've kept the same motherboard for the past two cycles, and even though I'm getting a better CPU (going from Athlon II X4 to Phenom II X6) and better video card (going from ATI 4850 to nVidia 570), I'm STILL going to keep the same motherboard and RAM. The Phenom II will be the third CPU I've dropped into this motherboard (Athlon X2 -> Athlon II X4 -> Phenom II X6).
Re: (Score:1)
Did AMD pay you to post this?
First, if you are shopping on a budget, you aren't going to buy a $500 dual-socket Xeon server board, you will buy an $80 budget one.
Second, if you are using a Socket 1366 or 1156 processor (newer than 18 months old), you probably aren't looking to upgrade, and if you are going to pitch your 18 month old hardware, you aren't going to be especially budget constrained.
Re: (Score:2)
I bought an i7-920 when they first came out, and I'm happy with its performance as it stands. I may look to upgrade with Ivy Bridge next year, but will probably hold off until Haswell in 2013.
Re: (Score:2)
Re:Goodbye LGA 1366 and 1156 (Score:5, Insightful)
Hint: Dropping $300 on every processor generation Intel makes is a waste of money. If you've got that much to spend, buy a more expensive CPU and keep it a generation or two longer. It's not like it breaks just because it's not the newest toy anymore, you know.
So, in order, here's why this is mostly irrelevant to the market:
1) The majority is laptops now (since 2008) and nobody upgrades the CPU there
2) Most people will get their desktop from an OEM and never upgrade
3) If you assume a new Intel will require a new mobo, you buy accordingly
Ok, so maybe you made a smart upgrade investment. Hurray, you belong to 1% of the market. Intel is still laughing all the way to the bank...
Re: (Score:2)
True, but why do I need to buy a new motherboard too? LGA 775 lasted from 2.6GHz Pentium 4s until 3GHz Core 2 Quads. [wikipedia.org] Socket 939 came out in 2004 and was used from 1GHz Athlon 64s to 3.2GHz dual-core Athlon 64 X2s. [wikipedia.org] These sockets lasted through several CPU generations without change.
Re: (Score:1)
True, but why do I need to buy a new motherboard too? LGA 775 lasted from 2.6ghz Pentium 4s until 3ghz Core 2 Quads. [wikipedia.org] Socket 939 came out in 2004 and was used from 1ghz Athlon 64s to 3.2ghz dual-core Athlon 64 X2. [wikipedia.org] These sockets lasted through several CPU generations without change.
Not quite "without change". Only late P4 era LGA 775 motherboards can accept any form of Core 2, and early C2 boards don't work with late C2s. Pretty sure much the same thing happened on the AMD side too. The socket didn't change mechanically, but the specifications for the voltage regulators feeding the CPU's core power rail did.
And maintaining the same socket for so long comes at a cost: performance. As demonstrated by Nehalem, the Core 2 architecture was definitely being held back by the ancient P4 FSB.
Re: (Score:2)
I just upgraded my 3-year-old "AM2+" board with an X4 640 AM3 (it needed a BIOS update).
Only problem is the north bridge heatsink wasn't designed for that much IO, so I'm going to have to install a fan (it's literally too hot to touch).
That plus the fact that they dominate the PassMark's CPU/$ [cpubenchmark.net] benchmark means I'll be buying AMD for a long time...
Except for their GPU offerings. They need to pull their heads out of their asses and catch up with Nvidia on Linux support.
Re: (Score:2)
Sandy Bridge has been on sale in Australia for about two months.
I just got: i5 processor, Gigabyte motherboard, 8 gigs DDR3 1333, ATI Radeon 5770, all for $740.
Re: (Score:2)
Re: (Score:2)
I'm all for bigger and better but it's a pain to throw away a $500 motherboard every 18 months because Intel decided they want to change the socket.
I hear ya, but on the other hand your new 1155 mobo is likely to have 6 Gb/sec SATA and USB 3.0, which your existing 1366 mobo most likely does not have. Changing out your motherboard won't just get you a new socket.
Re: (Score:2)
Likely? Source? Sandy Bridge doesn't guarantee USB 3.0, it's not even part of the chipset features [wikipedia.org]
Re: (Score:2)
"I hear ya, but on the other hand your new 1155 mobo is likely to have 6 Gb/sec SATA and USB 3.0, which your existing 1366 mobo most likely does not have."
Likely? Source? Sandy Bridge doesn't guarantee USB 3.0, it's not even part of the chipset features [wikipedia.org]
It's not a feature of Sandy Bridge. It's a feature of newer motherboards - look at recent offerings on Newegg [newegg.com]. Already 35 of the 271 Intel motherboards listed on Newegg have these higher-speed interfaces (some as cheap as $110). The point is that updating your motherboard likely adds additional performance independent of the processor socket.
Re: (Score:1)
I read on Wikipedia that the Sandy Bridge replacements for LGA 1366 high-end desktop CPUs aren't due out until Q3 2011. Maybe that's when X68 will be released.
Re: (Score:1)
That would have been more of an issue if the competition had better chips.
They don't.
So regardless of whether you feel Intel is holding their chips back or not, what are you going to do about it?
Re: (Score:1)
Re: (Score:2)
No, Turbo Boost does not dumb down your processor. It turns an SMP system into an asymmetric multiprocessor system on demand. If you are running a single CPU-bound thread (a pretty common workload), then it overclocks one core and underclocks the others so that you get better single-thread performance but don't overstep the CPU's thermal dissipation limit.
Ideally, you'd never use Turbo Boost, because all of your applications would be written to use n threads (where n is variable depending on the number of available cores).
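The asymmetric clocking idea can be sketched as a toy model. To be clear, this is not Intel's actual algorithm, and the clock numbers (3.4 GHz base, 100 MHz turbo bins, 1.6 GHz idle) are invented for illustration only:

```python
# Toy model of Turbo Boost-style asymmetric clocking.
# NOT Intel's real algorithm; all clock figures are made up.

def turbo_clocks(active_threads, n_cores=4, base_ghz=3.4,
                 turbo_step_ghz=0.1, idle_ghz=1.6):
    """Return per-core clock speeds for a given number of busy threads.

    Idle cores drop to a low clock, and the freed thermal headroom is
    spent boosting the busy cores, one turbo bin per idle core.
    """
    busy = min(active_threads, n_cores)
    idle = n_cores - busy
    boost = idle * turbo_step_ghz  # headroom reclaimed from idle cores
    return [round(base_ghz + boost, 2)] * busy + [idle_ghz] * idle

# One CPU-bound thread: one core boosted, three nearly parked.
print(turbo_clocks(1))   # [3.7, 1.6, 1.6, 1.6]
# Fully loaded: every core at base clock, no headroom left to spend.
print(turbo_clocks(4))   # [3.4, 3.4, 3.4, 3.4]
```

The point of the model is that total dissipation stays roughly constant while single-thread performance rises, which is exactly why the common one-busy-thread workload benefits.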
Re: (Score:2)
intel also needs more PCI-E lanes as just X16 for (Score:2)
Intel also needs more PCIe lanes, as just x16 for video is not OK with Light Peak, USB 3, CableCARD tuners and more on the way that need more than just a PCIe x1 slot. They should have 20 lanes so you can have x16 for video and x4 for an add-in card.
Re: (Score:2)
That's 16 lanes coming directly out of the CPU; the chipset also provides an additional 8 lanes.
This means that getting data from a discrete GPU to a PCIe Light Peak card will require a journey from the GPU, through the CPU's PCIe lanes, through the CPU, down whatever they're calling the front-side bus this week, into the chipset's PCIe controller, down those lanes and into the Light Peak card. I don't know if that will affect performance much.
Of course, I doubt we'll see GPU support for Light Peak monitors.
or use a Voodoo 2 like Loopback Cable (Score:2)
Or use a Voodoo 2-style loopback cable to get the video onto the Light Peak bus.
Laptops with SandyBridge's launch? (Score:2)
Anyone know whether or not Dell's M6xxx mobile workstation line will start offering Sandy Bridge processors on/around the official Sandy Bridge launch date?
I've read that Dell will roll out a new product (M6600) with Sandy Bridge, but I don't know if it's happening this week, or something later.
Re: (Score:2)
And they will reply to you describing their great value deals of the week.
Re: (Score:2)
Re: (Score:2)
and who cares? Remember PIII?
Still no decent low cost computer (Score:2, Interesting)
Re: (Score:1)
That seems pretty decent to me.
Of course, if you want to play games you will NEVER be happy....
Re: (Score:1)
Price vs Performance (Score:5, Insightful)
Also interestingly, the most expensive desktop part will start at $317, putting the screws to AMD yet again.
When has Intel ever lowered prices without needing to?
It's more likely that instead of putting the screws to AMD, Intel is worried about Bobcat and Bulldozer coming out pretty soon and is factoring that into its prices (to gain market share before the AMD chips get out). On merit, Bobcat CPUs should dominate the low-end laptop/netbook market with low power use and real integrated graphics. Bulldozer should do well in the high-end server market, again with low power and more cores... basically, where Intel CPUs have hyperthreading, Bulldozer has another actual core (for integer instructions).
Re: (Score:3)
When has Intel ever lowered prices without needing to?
Intel will act rather aggressively to deny AMD any high-margin CPUs if they can; they know keeping AMD cash-starved is the best guarantee of their continued dominance, while Intel has plenty of markets to get their margins from. Intel is way ahead of the game at this point. They know AMD has some launches coming soon, so first they empty the market of people looking for a new CPU, and second they make sure that instead of reviews praising AMD for forcing Intel into price cuts, the story becomes "too little, too late."
Re: (Score:2)
Re: (Score:1)
And has a very compelling mobile platform coming.
9 watts, better than Atom by far, with an incremental DX11-class embedded graphics improvement. And an 18-watt part competitive with current-generation mid-range desktops.
I know I would love an Ontario pad, netbook, or even laptop. Still more power than I'd be willing to use in a phone.
Re: (Score:2)
They only missed the opportunity in the sense that the Brazos platform is not quite out yet. This combination will kill pads and the derivatives may kill phones.
Re: (Score:1)
ARM-based devices are set to take more than 10% of all US Internet browsing in 2011, with no signs of slowing down anytime soon.
Not just that, but nVidia just announced [engadget.com] their Project Denver, which is an ARM core integrated with their graphics and aimed at the desktop. And as mentioned in the same article (and expanded on [engadget.com]), Windows 8 will have ARM support.
I have a strong personal bias against x86 and would be reasonably delighted to see ARM start to take over on the desktop in the next few years, but only time will tell.
Beware if you want to install Linux! (Score:1, Troll)
According to semiaccurate:
" If you try to use Sandy Bridge under Linux, it is simply broken. We tried to test an Intel DH67BL (Bearup Lake) with 2GB of Kingston HyperX DDR3, an Intel 32GB SLC SSD, and a ThermalTake Toughpower 550W PSU. At first we tried to install vanilla Ubuntu 10.10/AMD64 from a Kingston Datatraveler Ultimate 32GB USB3 stick. The idea was that it would speed things up significantly on install.
That's when the crippling bug surfaced. It seems the USB3 ports on the Intel DH67BL don't want to work.
Re: (Score:2)
It's not like Charlie hasn't shown himself to be a complete and total ass before, as Intel pointed out at the end of the article. Demerjian is a whiny tosser who has been a spouter of bullshit for quite some time now.
Re:Beware if you want to install Linux! (Score:4, Informative)
Uh... so you install a several-months-old version of Linux on a brand new architecture and it doesn't work, therefore the architecture is "broken"? There are fully 100% open source drivers available for Sandy Bridge RIGHT NOW. Phoronix [phoronix.com] (usually a purveyor of sensationalism, but a voice of reason in this case) goes out of its way to detail exactly what you need to run Sandy Bridge with 100% open source code. Now... is it all released yet? No, but at the same time, you have to remember that SB isn't even officially on sale yet. It WILL be fully supported in the next round of distro updates, and you can get all the stuff to run it right now if you are truly as l33t as you think you are. I'm just sitting back and waiting for the AMD fanboys to scream about how AMD is so wonderful and all AMD graphics work perfectly in Linux, when someone gets glxgears running on a 6000 series part in 6 months...
Re: (Score:1)
Semiaccurate.. hmm. Seems a fitting name for such idiocy.
Sandy Bridge does not have USB3. That motherboard may have a USB3 chip onboard, but it has nothing to do with Sandy Bridge, other than the lame fact that the SB chipsets don't have USB3 yet.
Re: (Score:2)
You mean that just-announced and just-released hardware risks breaking compatibility with what, an 18+ month old platform? Shocking!
Re: (Score:2)
That's when the crippling bug surfaced. It seems the USB3 ports on the Intel DH67BL don't want to work.
This is hardly surprising. The board's only certified to comply with the USB2 specification.
http://www.intel.com/support/motherboards/desktop/sb/CS-026528.htm [intel.com]
Euler3D -- compiler? (Score:2)
There's one benchmark here that could reasonably be compiled on a processor-specific basis, to show what the processor really can do (as opposed to all the other benchmarks, which are based on proprietary least-common-denominator executables): Euler3D.
And there are processor-specific enhancements that could have a great influence (150%?) on this code's performance... As it happens, this benchmark is the one of greatest professional interest to me, anyway :-)
I'd really like to know how its performance would compare.
Treacherous Computing (Score:3)
Re: (Score:1)
Re: (Score:1)
Thanks for pointing that out--I was wary after the kill switch news, but after reading those, this Gulftown will be the last Intel processor I use.
Wow "Turbo Boost Mode" (Score:1)
Turbo boost (Score:2)
" a new Turbo Boost mode that increases clock speeds dynamically "
Dynamically? I want my Turbo Boost button [codinghorror.com] back. 66 megahertz or bust!
What screws? (Score:2)
I don't see any screws. The business analysts at Intel determined the "correct" price based on the performance of the chip. AMD's most expensive desktop CPU is $265. About 17% less than the i7-2600K. The graphs I'm looking at on Anandtech right now show this to be about in line with the performance delta between these 2 CPUs.
Intel has almost always had an idiotically-priced Extreme Edition CPU as well.