AMD Launches New Richland APUs For the Desktop, Speeds Up To 4.4GHz
MojoKid writes "AMD recently unveiled a handful of mobile Elite A-Series APUs, formerly codenamed Richland. Those products built upon the company's existing Trinity-based products but offered additional power and frequency optimizations designed to enhance overall performance and increase battery life. Today AMD is launching a handful of new Richland APUs for desktops and small form factor PCs. The additional power and thermal headroom afforded by desktop form factors has allowed AMD to crank things up a few notches further on both the CPU and GPU sides. The highest-end parts feature quad CPU cores with 384 Radeon cores and 4MB of total cache. The top-end APUs have GPU cores clocked at 844MHz (a 44MHz increase over Trinity) with CPU core boost clocks that top out at a lofty 4.4GHz. In addition, AMD's top-end part, the A10-6800K, has been validated for use with DDR3-2133MHz memory. The rest of the APUs max out with a 1866MHz DDR3 memory interface."
As with the last few APUs, the conclusion is that the new A10 chips beat Intel's Haswell graphics solidly, but lag a bit in CPU performance and power consumption.
I love my AMD (Score:5, Interesting)
Re: (Score:2, Funny)
Yeah! After waiting 4 months after Bulldozer launched to get that $189 price, and now waiting another 8 months after Piledriver launched to get the current $180 price, you got almost-as-good-as-Intel-in-a-couple-of-synthetic-benchmarks performance for the low low price of $369 in 2013!!!!
Those blubbering morons who bought the 2600K in 2011 for $350 are stuck with outdated crap that will finally be eclipsed when Steamroller launches next year*! What a ripoff!
* Assuming that they spent extra for the K-series
Re: (Score:2)
I have the Piledriver and it is fast enough for my needs. Even in the area where it is supposedly weaker, FP computation, I can run real-time ray-tracing benchmarks at 1/3 to 1/5th the speed of a 1 TFLOPS GPU.
Re:I love my AMD (Score:5, Funny)
Is that what that was? I hope CajunArson isn't getting paid to shill, that post was so poorly written I honestly can't tell whether it's meant to be anti-Intel or anti-AMD. My money's on both: he's secretly using a VIA processor.
Re:I love my AMD (Score:5, Insightful)
Depends heavily on use, though.
Intel's been focusing on single-thread performance and power efficiency - Haswell basically did nothing for performance, giving a few percentage points of improvement, but dropped the power consumption down to the point that putting it in a tablet actually makes sense. Idle power was a particular focus.
AMD's been focused more on multi-threaded performance, cramming a ton of cores onto one chip. In some cases that works well, but in others they suffer heavily. They also focused on integer, not floating-point, performance. Sadly, even when playing to AMD's strengths, Intel's process node advantage (and compiler advantage, oftentimes) lets them at least keep pace.
I will agree that AMD has been much better at socket compatibility. My 2006 Intel motherboard is now three sockets out of date, while my similar-age AMD board would probably work with a current Bulldozer. And AMD's pricing, thankfully, reflects their performance. I might be getting one of the Richland chips for a low-cost SFF build I'm planning.
Re: (Score:1)
while my similar-age AMD board would probably work with a current Bulldozer.
Yeah, no. That whole "socket compatibility" thing is long gone at AMD. Socket FM1 was released in 2011, FM2 in 2012, and probably FM3 this year.
Compare that with Intel's sockets 1156 (2009), 1155 (2011), 1150 (2013).
Sure, AM3 is better, let's see how it works out with their next AM3+ release.
Benchmarks vs. Business PCs (Score:3, Interesting)
I'm trying to figure out right now whether office PCs will see the difference between AMD and Intel. It seems like as long as you install plenty of RAM, pretty much anything should handle a moderately multitasking business PC for at least a few years. I keep seeing posts of Intel vs AMD benchmarks, but even with the benchmarks being what they are, how much difference will a nontechnical end user really notice in an office environment? I run an AMD A8 quad core laptop at home, but it runs Linux and does just fine. I don't want to judge Windows performance based on my experience with Linux though.
business users won't notice (Score:3)
I have a 2yr old core i3 laptop that runs office apps just fine. It'll do high def streaming just fine too. "Regular" office stuff just isn't all that strenuous.
There are scenarios where you would see a difference, but they tend to be more technical users...video editing or transcoding, source code compilation, database indexing, numerical simulation, etc.
Re: (Score:2)
As I see it:
If you care about performance, you buy Intel.
If you care about power consumption, you buy Intel.
If you're cheap and don't care about power consumption and want to play games on really low graphics settings, you buy AMD.
You can add "if you care about market conduct, you buy AMD."
Full disclosure, I care about performance, so my plinko chip doesn't fall that far.
Re: (Score:2)
Re: (Score:2)
If you are in a mobile setting and need OpenCL acceleration, it is very, very hard to beat an AMD laptop. I get good battery life and still have nice GPU acceleration. If I switch to the dedicated GPU, the battery life drops quite a lot on both Intel and AMD systems, so it is nice to have the APU for doing GPU acceleration with OpenCL.
Choices, choices... (Score:1)
I'm not sure which to go with any more- still leaning towards Intel since I'll be getting a separate graphics card and I like their raw power, but at the same time, it's hard to beat the price on AMD.
Good thing I've still got a month to mull it over.
Re: (Score:2)
Re: (Score:1)
You never had (cpu related) cooling problems with your intel machines? That would be quite helpful to know, actually!
Re: (Score:2)
Re: (Score:1)
+1 Helpful, good sir.
Re: (Score:1)
I'm not sure which to go with any more- still leaning towards Intel since I'll be getting a separate graphics card and I like their raw power, but at the same time, it's hard to beat the price on AMD. Good thing I've still got a month to mull it over.
You can get a cheap quad-core AMD - the Athlon series for the FM2 socket comes with a disabled GPU and a low price (70 euros or maybe lower for a quad).
I have an A8-5500 and it's just perfect for my needs. I run Linux on it and so far it's flawless (even the maligned fglrx driver runs perfectly). The GPU in it handles everything I need and it runs cool & quiet with its DEFAULT heatsink (which is small).
Anyway, I find all these benchmark wars a bit like pissing contests since, as PC sales too suggest, the curren
Re: (Score:1)
Just the facts (Score:2)
Xbox One GPU cores: 768
PS4 GPU cores: 1152
Re: (Score:3)
You're a console fanboy, aren't you? (Score:2)
Here, have some more facts:
Radeon 7870GE GPU Cores: 1280
Radeon 7950 GPU Cores: 1792
Radeon 7970GE GPU Cores: 2048
Radeon 7990 GPU Cores: 4096
Oh, and don't forget the clock speeds. The A10 and PS4 (and probably the Xb1) run at 800MHz, while many of the discrete cards run at 1GHz (only the 7950 runs at less, at 850MHz).
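For what it's worth, the cores-times-clock point is easy to put in rough FLOPS terms. A quick sketch, using the core counts and clocks quoted in this thread and the usual 2-FLOPs-per-core-per-cycle assumption for these GPUs (all figures assumed, not measured):

```python
# Peak single-precision throughput estimate: cores * clock * 2 FLOPs/cycle.
# Core counts and clocks are the figures quoted in this thread (assumed stock).
parts = {
    "A10 APU":       (384, 800),
    "PS4":           (1152, 800),
    "Radeon 7870GE": (1280, 1000),
    "Radeon 7970GE": (2048, 1000),
}

for name, (cores, clock_mhz) in parts.items():
    gflops = cores * clock_mhz * 2 / 1000  # MHz * 2 ops/cycle -> GFLOPS
    print(f"{name}: {gflops:.0f} GFLOPS")
```

By this crude estimate the PS4 lands around 1.8 TFLOPS and the 7970GE around 4 TFLOPS, which matches the ratio the core counts alone suggest.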
Re: (Score:2)
Re: (Score:2)
Sorry, must be spending too much time reading comments on other sites. Thought you were trying to "prove" how "PC gaming" "sucks", just like 80% of the guys on most gaming sites. Apologies.
Re: (Score:2)
You can't really criticise people on other sites when you're doing the exact same thing they do - you're jumping at the guy like a rabid fanboy.
FWIW I was always a PC gamer, I bought a 360 in 2006 and enjoyed a lot of games on it, I had an N64, Gamecube and Wii but never ended up playing them much largely playing PC or 360 instead. I'm back to PC gaming now, mostly Diablo 3, Starcraft 2, and Wargame (both versions).
Is it really so hard to realise that all platforms have some decent games and that over time
A10 performance (Score:3)
Anyway, that gaming-grade computer was $575 retail at my shop and ran most modern games at medium to high settings. It blows away a GT 430 and most GT 440s, so that's nice. Now if someone wants a doable graphics card with good video encoding speed to boot, boom, APU. These are amazing for that! Usually you're talking about a $500 computer going to $650 minimum to bump up the power supply to support a GTX 640 or 650 minimum to even call it a gaming computer. Now you just swap an i3 out for a 4-core APU and tada, basic gaming computer.
the only question that matters (Score:1, Funny)
Re:I beg your pardon (Score:5, Informative)
Re: (Score:2)
I was confused what Auxiliary Power Units had to do with this.
I think aviation/spaceflight has dibs on this acronym.
Re: (Score:2)
Re: (Score:2)
Does this offer any benefit at all when you're using a discrete card? Or for those of us who always do, is it just a complete waste of silicon integrating it in?
Re: (Score:2)
You can watch videos with the APU while the separate GPU is busy mining Litecoins.
It depends on the kind of videos whether you like your graphics cards discreet.
Re: (Score:3, Informative)
And after the APU is done processing, it says in a catchy Middle Eastern accent, "THANK YOU, COME AGAIN!"
India is not in the middle east.
Re: (Score:2)
Re: (Score:2)
I find it hard to understand how “technically” most of the Middle East [wikipedia.org] is in Europe, when none of the countries in the area are on the European continent. Only a small part of contemporary Turkey is west of the Bosporus.
Re:I beg your pardon (Score:5, Informative)
http://en.wikipedia.org/wiki/AMD_Accelerated_Processing_Unit [wikipedia.org]
APU is the only unusual acronym in the summary. It refers to a chip with both the CPU and graphics processor on the same die. It was previously called Fusion, but trademarks got in the way.
Re: (Score:2)
So if I buy this I won't need my Radeon 7850 video card anymore? Should I sell it on ebay now before resale value plummets?
Or is this APU just a slightly better version of motherboard integrated graphics that's been around for decades? Not fit to play 3-D games?
Re: (Score:1)
Thanks. I was wondering myself. For some reason all my brain could come up with was "analog processing unit" but I was pretty sure that wasn't right.
Re:I beg your pardon (Score:5, Informative)
Here is your Radeon HD7850: http://www.gpureview.com/Radeon-HD-7850-card-678.html [gpureview.com]
It has 1024 Shader Processors ("Radeon Cores" in the summary), and (stock) is clocked at 860MHz. The 8670D included in this new APU has 384 Shader Processors, and is clocked at 844MHz. So about 2/5ths of the computing power; presuming all other factors are equal.
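As a sanity check on the "2/5ths" figure, the crude cores × clock ratio works out; a minimal sketch using the stock numbers above (it deliberately ignores architecture, bandwidth, and driver differences):

```python
# Crude relative-throughput estimate: ratio of (shader cores * clock).
# This ignores everything except raw shader count and clock speed.
def relative_throughput(cores_a, clock_a_mhz, cores_b, clock_b_mhz):
    return (cores_a * clock_a_mhz) / (cores_b * clock_b_mhz)

# 8670D (384 cores @ 844 MHz) vs. HD 7850 (1024 cores @ 860 MHz)
ratio = relative_throughput(384, 844, 1024, 860)
print(f"{ratio:.2f}")  # ~0.37, i.e. roughly 2/5ths
```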
So while for high-end gaming, it won't quite cut it (Turning on most of the shiny and enabling it across 3 monitors with Eyefinity would make it beg) - it should be plenty powerful for light/medium gaming on a single monitor, or any light/moderate duties across multiple monitors with Eyefinity.
Re:I beg your pardon (Score:4, Informative)
If you feel the 7850 is needed then these will be too slow for you.
The GPU in the A10-5800 (the one currently on the shelves) is fairly accurately labeled a 6550d and requires settings to be turned down to Low@720p/1366x768 to get acceptable performance in a game like Battlefield3. The new APU is only incrementally more powerful and faster.
What these "APU" chips (which in my mind includes Haswell chips) are obsoleting are the low-end budget cards with 64-bit GDDR5 and 128-bit DDR3 that get put in a lot of office desktops.
err 7660d not 6550d (Score:2)
But whatever.
Re: (Score:2)
http://en.wikipedia.org/wiki/AMD_Accelerated_Processing_Unit [wikipedia.org]
APU is the only unusual acronym in the summary. It refers to a chip with both the CPU and graphics processor on the same die. It was previously called Fusion, but trademarks got in the way.
Unfortunate, because it already stands for Auxiliary Power Unit in aerospace. But I think we've passed peak TLA.
Re: (Score:1)
Intel and AMD now have graphics hardware built into their CPUs.
AMD has traditionally had better graphics but worse CPUs when both are integrated, compared to Intel's similar offerings. This trend appears to continue in the next generation.
Re: (Score:1)
4.4 GHz? Oddly not mentioned in TFA....
See page 2 [hothardware.com]
Re: (Score:2, Informative)
P.S. --> the score in question from my previous post was for Cinebench 11.5, but there are many many others like it. And don't think that OpenCL holds any miracles for Trinity either, the 4600 is actually a better OpenCL part than it is a GPU.
Re:Fascinating misuse of adjectives there! (Score:5, Informative)
P.S. --> the score in question from my previous post was for Cinebench 11.5, but there are many many others like it. And don't think that OpenCL holds any miracles for Trinity either, the 4600 is actually a better OpenCL part than it is a GPU.
Really? Because the one OpenCL benchmark [hothardware.com] I can find in TFA pegs the new chips at 2.5 times faster than the 4600 that comes with the i5-4670k. I wouldn't consider a part that is less than half as fast to be "better." Maybe that's just me? Could be. Also, I wouldn't say "at best" 20% faster when several benchmarks peg it at 30% or more. The Enemy Territory: Quake Wars [hothardware.com] high-res benchmark, in particular, is... hilariously one sided (and since most people are going to be playing at high-res settings, it's a benchmark that actually matters). Actually, all the high-res gaming tests are, with the new chips often coming in close to twice the Haswell chips. In fact, the Cinebench 11.5 tests [hothardware.com] peg the Richland at 60% faster than the i5-4670k, so I'm not sure where the hell you got any of your numbers from.
Re: (Score:2)
It's the same story every generation.
CPUs: AMD wins in the $/performance category but loses in terms of pure performance. For 2 generations AMD won the pure performance crown as well.
Onboard (or on-die) GPUs: Intel's implementation will get you moderate FPS on games released in [PurchaseYear - 1] at a sub-native resolution. AMD's implementation will at least run medium settings around 30 fps at native resolution. AMD wins in the $/performance category AND the pure performance category.
Discrete GPUs: A
Re: (Score:2)
It's the same story every generation.
CPUs:
CPUs: The low price AMD units win the performance/purchase price competition. Low price Intel units win the performance/(purchase + operating costs) competition, at least they do if your computer is on very much and you pay for your own electricity.
Re: (Score:2)
Re: (Score:2)
Lately I've
Re: (Score:1)
AMD does really well on multi-threaded performance as well. I went with the 8-core AMD for my development workstation, as it handles the background services during development (webserver, SQL database, NoSQL database, etc.) better. At the same price AMD spanks Intel for my use case, and I would have to spend a lot more to get the same level of performance (more expensive MB and CPU). Of course, most people would be happy with any >= $150 CPU combined with enough RAM and an SSD these days.
Lately I've suggested people get the cheapest laptop they can find (mostly in the $300-450 range) and max out the RAM and put in an SSD. Generally a better experience than more expensive laptops... unless you need a high end display.
Oh absolutely. Everyone who considers Bulldozer a failure is obviously not doing any heavily-multi threaded work.
We recently bought a new server (ESXi host) and we went with AMD because of the cost and scalability.
We ended up spending the $3000 difference (had we gone with Intel) on SSDs (vs. 15k SCSI drives).
Re: (Score:2)
Exactly. In reality 99% of people don't actually need raw performance. And 90% of the 1% are idiots.
My current computer is an AMD Phenom II X4 (well, actually an X3, but Asus kindly allows me to unlock the extra core for free). It's been serving me nicely for the last 4 years.
I went with it because of the good performance/price point. I still don't actually have any plans to replace it.
The target of the APU is exactly like my computer. Work machines and everything from a casual gamer down.
That's a pretty massive
Re:Fascinating misuse of adjectives there! (Score:5, Insightful)
Yeah you're right. Fuck AMD. Let's support Intel, the anti-competitive market-abusing cocksuckers who had to secretly pay off Michael Dell to use their chips. That's a company I want to support with my money.
Re: (Score:2)
Re: (Score:1)
Piledriver is not the only CPU core they have. The Jaguar core (to be used in the PS4 and XBox One) is low power and has better performance than the Intel Atom cores.
Re: (Score:2)
Considering the ARM Cortex-A15 has better performance than the Atom, that's not saying much. Intel's been pretty neglectful of the Atom line.
Re: (Score:2)
I'll support the company that develops the best technology. When AMD builds their own fab again and develops 22nm process technology, I'll support them.
Re: (Score:2)
Might be never because if they did they would probably spin it off again.
Re: (Score:2)
Re: (Score:2)
And as soon as AMD started abusing a position of power, I would switch to buying Intel or products from some other competing vendor. Until that time, I'm going to give AMD the benefit of the doubt.
Re: (Score:2)
Re: (Score:1)
If I buy a $350 CPU and a discrete GPU it will beat the hell out of a $150 CPU, did I get your logic right?
Re: (Score:3)
If I buy a $350 CPU and a discrete GPU it will beat the hell out of a $150 CPU, did I get your logic right?
I think the point is that regardless of whether you buy Intel or AMD, you'll still probably be playing new games on the lowest graphics settings available. If you actually bought a PC to play games, you'll be buying a discrete graphics card, so the on-chip GPU is just a waste of space unless the OS is able to switch back to it for the desktop to save power when you're not running games.
Re: (Score:2)
Gee, if only AMD had a line of chips that didn't have the graphics unit in them...
If only AMD had a line of chips which didn't have the graphics unit in them and were actually competitive with anything other than Intel's low-end parts....
Re: (Score:3)
If only people would stop measuring their penis sizes and focus on what it's used for. Can the CPU play games? Without even breaking a sweat. If you need a CPU that is 75% idle while playing a game, then that's your preference though.
Re: (Score:2)
Re:Fascinating misuse of adjectives there! (Score:5, Interesting)
Richland's GPU is at best about 20% faster than the intentionally-midrange HD-4600 GPU in Haswell. Add in any form of desktop GPU, including midrange models from 2011, and Haswell wins by a landslide.
Yes, if you buy a $250-$350 CPU and then add a $100 video card, it will outperform a $150 all-in-one unit. No shit.
On the CPU side, I recall seeing a delightfully hilarious graph where a 6800K overclocked to 5GHz had exactly half the score of the (stock-clocked) 4770K. Before we get to the usual "But AMD is cheap!" argument: when you take into account the $150 price of the 6800K and the $350 price of the 4770K, AMD only wins on price/performance if you intentionally buy the most expensive Haswell model available and intentionally don't overclock it while also overclocking the crap out of the 6800K.
You're looking at this from an enthusiast perspective. But if I'm building a system for someone who mostly does web surfing, Office, and occasionally some light gaming like WoW and The Sims, then an AMD APU starts to look a lot better from a price/performance perspective. You assume that as long as the performance per dollar stays high, the buyer is willing to spend as much as necessary, but that's simply not true for most users. Probably 90% of users will never even hit the maximum limit of an A10-6800K, so for these people, Haswell is overkill.
Small form factor, etc (Score:2)
Also, consider things like smaller form factor cases, or even laptops. In many cases where space is a concern, a decent mobile APU is better than a CPU+GPU.
In other situations, well, good enough is good enough. I'm building a small luggable (basically suitcase-PC) for LAN parties, to replace a shuttle which I previously used to drag around.
Some people show up with *huge* Antec cases and dual CPUs, capable of playing [latest shooter] at >1080p at super-high detail, and then we end up playing Starc
Re: (Score:2)
You're looking at this from an enthusiast perspective. But if I'm building a system for someone who mostly does web surfing, Office, and occasionally some light gaming like WoW and The Sims, then an AMD APU starts to look a lot better from a price/performance perspective. You assume that as long as the performance per dollar stays high, the buyer is willing to spend as much as necessary, but that's simply not true for most users. Probably 90% of users will never even hit the maximum limit of an A10-6800K, so for these people, Haswell is overkill.
For gaming the graphics part is most important, but otherwise as a general rule light usage is poorly threaded and heavy usage well threaded. Often the "snappiness" of the computer is based on the performance of a single thread. So for the non-gamer I'd go with high single thread performance, for the gamer I'd suggest a discrete card but for the right level of casual gamer I guess an APU is what serves them best.
Re: (Score:2)
"But if I'm building a system for someone who mostly does web surfing, Office, and occasionally some light gaming like WoW and The Sims, ..."
Then who cares about the "price/performance perspective" either? When performance doesn't really matter, there's no reason to consider it in the equation at all.
Re:Fascinating misuse of adjectives there! (Score:5, Informative)
Richland's GPU is at best about 20% faster than the intentionally-midrange HD-4600 GPU in Haswell.
Yes, but what OpenGL features does the Haswell APU have compared to the full GL 4.3 found in the AMD version? How good are the Intel drivers? How many textures can I bind at once? What anti-aliasing modes does it support? What are the max number of shader varying/uniform attribs? How many shader instructions can I fit within my shaders? Back in 1999, comparing raw polygon speed may have meant something, but these days it's not really as interesting as the rest of the details....
Did it ever occur to you to look it up? (Score:3)
Intel provides rather extensive technical documentation of all their products. http://www.intel.com/content/www/us/en/processors/core/CoreTechnicalResources.html [intel.com] is the page with basic datasheets (basic in this case meaning a couple hundred pages, their more detailed ones are a thousand). If you truly are as interested in the technical details as you pretend, then go look them up.
However if you are just throwing out technical shit in an attempt to deflect the argument then knock it off. Particularly since m
Re: (Score:1)
If we weren't interested in specs we would be using a tablet.
Re:Did it ever occur to you to look it up? (Score:5, Informative)
Intel provides rather extensive technical documentation of all their products. http://www.intel.com/content/www/us/en/processors/core/CoreTechnicalResources.html [intel.com] is the page with basic datasheets (basic in this case meaning a couple hundred pages, their more detailed ones are a thousand). If you truly are as interested in the technical details as you pretend, then go look them up.
I've had a look through, but apart from saying "it has 20 execution units", it doesn't really mention any specific figures (for the actually useful information). It does however state that it's OpenGL4.0, which is a little disappointing (a step up from 3.2, but it's still lagging behind AMD & NVidia).
However if you are just throwing out technical shit in an attempt to deflect the argument then knock it off. Particularly since much of what you are asking for are the kind of the things that would be of concern for high end dedicated GPUs for particular applications, not for an integrated controller for general use.
Well, I'm a graphics engineer in the games industry by trade, so I guess you could say I have a passing interest. The things I am asking for, are things that can help improve the performance of the products I work on. Now you might not find this stuff particularly interesting, however I do. So as a very simple example, I have an order-independent-transparency pass to handle pixel perfect transparency. On the current integrated AMD GPU, I can basically pick between any number of algorithms to achieve this (weighted average, dual depth peeling, etc, etc). Now, which one I choose, is going to be largely affected by what GPU resources I need to use for other things, and this includes: memory, the max number of shader attribs, the max number of bindable texture units, etc; but in general, I have resources to spare, so I am free to pick and choose.
The problem with Intel APUs in the past, is that whilst the last generation may have implemented OpenGL 3.2 to the letter, the max attrib counts and shader instructions were significantly lower than the AMD/Nvidia equivalents. This means you typically have to insert an Intel only codepath, where you will either just rip out the nice stuff, or you'll end up using a much slower multipass technique. As a result, making frame-rate comparisons in any game is most likely to be meaningless (since there is a good chance they are running a simplified codepath for intel).
It's all well and good, and matters for certain markets and applications, but those markets are generally not the ones using an integrated GPU. Most people just care how fast it runs their stuff.
Yes, and no. It's very true that most people just want their stuff to run quickly. However, to say that the legions of people out there running low-powered ultrabooks and cheap generic laptops don't care about this stuff is complete and total bullshit. You might imagine that all gamers have £3000 desktop rigs with all the trimmings, but the reality is in fact very different. If I can spend a few months optimising the graphics routines to run a game smoothly at 720p on an Intel APU, then the market sector into which we can sell our product has more or less tripled. Even if you don't go to the effort, you will probably be forced into making those optimisations anyway. Honestly, you would be surprised at just how many people ignore the minimum system requirements on a game, and simply assume their "i3 Dell laptop is brand new, so it should play the latest games". What are you going to do? Refund half of your sales? Or fix it? If you see sense, you'll fix it, and then most of your users will have the luxury of being able to ask how quickly it runs....
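For illustration, the capability-driven codepath selection being described might look something like the following sketch. The caps table and function names are hypothetical; real code would fill the table by querying `glGetIntegerv` with limits such as `GL_MAX_TEXTURE_IMAGE_UNITS`:

```python
# Hypothetical capability table -- real code would populate these values via
# glGetIntegerv (GL_MAX_TEXTURE_IMAGE_UNITS, GL_MAX_IMAGE_UNITS, version, ...).
def pick_oit_algorithm(caps):
    """Choose an order-independent-transparency technique from GPU limits."""
    if caps["gl_version"] >= (4, 2) and caps["max_image_units"] >= 8:
        return "per-pixel linked lists"   # needs image load/store
    if caps["max_texture_units"] >= 16:
        return "dual depth peeling"       # needs plenty of bindable textures
    return "weighted average"             # cheapest single-pass fallback

# Example: a GL 4.0 part with no image units but 16 texture units
intel_caps = {"gl_version": (4, 0), "max_image_units": 0, "max_texture_units": 16}
print(pick_oit_algorithm(intel_caps))  # -> "dual depth peeling"
```

The point is only that the algorithm choice falls out of the reported limits, which is why raw frame-rate comparisons across vendors can be comparing different codepaths.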
Re: (Score:2)
They provide GL 4.3 support in the APU driver? Nice, I might buy one now.
Re: (Score:2)
The market for both series of products is one where you don't add anything. Use the on-CPU graphics. If you are using a graphics card instead of the integrated graphics, then neither Haswell nor Richland is of interest to you. You have a Sandy Bridge-E or a non-APU AMD equivalent, which you're using along with the graphics card.
Re: (Score:2)
Also, P.S.
APU is taken. [wikipedia.org] They need to come up with a different acronym. They can't toss the "well that's not computing!" card either, because it's still taken. [wikipedia.org]
Re: (Score:2)
Who in their right mind would intentionally buy a cpu that uses thermal paste under the heatspreader?
The 99.9% of PC users who don't overclock them?
Re: (Score:1)
Even if you're not overclocking, you can expect a shorter service life from the parts.
Funny thing about the overclockers - they're good at showing up design flaws that can affect the rest of us. The on-the-fly overclocking most modern CPUs do on their own now (Intel 'turbo speed' is just a situationally triggered overclock) just makes it more relevant.
Re: (Score:1)
Even if you're not overclocking, you can expect a shorter service life from the parts.
It's very unlikely that you would ever experience this in practice.
Funny thing about the overclockers - they're good at showing up design flaws that can affect the rest of us.
Funny thing about the overclockers -- they have a vastly overinflated sense of their expertise and relevance.
The on-the-fly overclocking most modern CPUs do on their own now (Intel 'turbo speed' is just a situationally triggered overclock) just makes it more relevant.
Turbo is not an overclock. Intel closes timing over all turbo boost clock speeds (as well as other parameters like the classic trio of process variation, voltage, and temperature). Overclocking is when you operate a chip at some combination of PVT and frequency where the manufacturer didn't close timing.
Intel's turbo is really about
Re: (Score:1)
The whole heat spreader design is so stupid. Instead of a thin aluminum (or whatever) cap, why not make it a thick copper block with fins and fan mounting points, and attach that directly to the core right at the factory? You'd get much better results.
Yeah, then you have a larger and less flexible design that OEMs have to deal with. I say fuck em.
Re: (Score:2)
Re: (Score:3, Informative)
Intel can get away with soldered-in components because they change the socket type so often that people are unlikely to be able to upgrade the processor anyways. AMD, OTOH, has a tradition of not forcing you to do that every single time you upgrade.
Personally, I refuse to buy Intel parts, and quite frankly, the way I use my computer, I don't need the overpriced solutions that Intel is pushing.
Re:Still a step behind Intel (Score:4, Insightful)
In practice, how often do people upgrade a CPU in the same mobo these days anyway? Even in server settings it's not that common; it's more common to buy a CPU/mobo package, and keep it until it's time to replace both.
Re: (Score:2)
Can't speak to everyone else, but I "upgraded" the APU in my HP laptop to an A8 from an A4 last year, and if the new FS sockets were still backwards compatible, I'd have upgraded again (and that still irritates me that it's not). The computer is fine.. keyboard, lcd, drive, etc.
Re: (Score:2)
In practice, how often do people upgrade a CPU in the same mobo these days anyway?
Which people? I upgraded my X3 720 to an X6 1045T. Almost the same base clock, it overclocks itself to the same speed I was able to get out of my 720 when I am running few threads so I don't even lose single-thread performance, cost me $120 shipped and taxed... used. And my 720 cost me only $110 shipped and taxed, new. I did this because I could. And I went AM3 in the first place with the expectation that I'd be able to do this.
Re: (Score:3)
Well there's three things:
1) Ability to upgrade
2) Ability to mix/match motherboard/CPU
3) Replacement cost if it fails out of warranty
On the other hand, if you buy a new motherboard/CPU combo you have a working old machine to sell or re-purpose; if you upgrade just the CPU, the old low-end CPU with no motherboard will usually be a complete write-off. The BGA package is cheaper, which might offset the lack of choice, and most of the functionality is now on the processor or chipset anyway. The repair cost is pretty re
Re: (Score:2)
With Intel it is less likely to be a possibility. I have built low priced machines for family and friends with AM3 twin core CPU's, all of them are upgradable to six core now (and some were already upgraded to four core when those were cheap).
Re: (Score:2)
I am not sure but it looks like you're confusing what you do for what everybody else does. I really don't know but, well... I upgrade CPUs sometimes. It doesn't help that I have a number of fairly new PCs and am always tweaking and poking. But, yeah, I buy new CPUs, update RAM, upgrade GPUs, etc...
Re: (Score:2)
That doesn't really make much sense. Your Northwood PC was a Socket 478 machine. Intel came out with this socket when AMD was in the middle of the long-lived Socket A era, and Intel stuck with it while AMD went through Sockets 754 (which was pretty short-lived) and 939. When DDR2 came out both Intel and AMD came out with new sockets. Intel created LGA775, which was another fairly long-lived socket as it spanned the later P4's, the Pentium D, then the Core 2 Duo (including the Conroe) and the Core 2 Quad
solder in kills MB choice so you may not be able t (Score:3)
Soldered-in CPUs kill motherboard choice, so you may not be able to get a board with what you need.
Say you need a board with lots of slots but not so much in the CPU? Sorry, the boards with lots of slots only come with the high-end CPUs.
Need a lot of CPU power but not all the OC features and other stuff found in higher-end motherboards? No, we don't have mid-range or lower boards with fast CPUs.
Re: (Score:2)
Re: (Score:2)
Haswell has launched. They could have made a socket for the R-series, but it couldn't have been the same socket as the other processors. The socket is LGA1150, not 1550. No Intel part uses GDDR, it's all eDRAM + system memory (DDR3). I guess we'll see when Intel releases their sub-$300 line, so far it's only been the top models on display. Personally I ordered an i7-4665T for a fanless build, looks to pack an awful lot of power in a 35W TDP.
Re: (Score:2)
Re: (Score:2)
Funnily enough, this is a similar solution to the AMD chip that will be in the Xbox One (chip-on-package eDRAM). When will AMD bring that tech to their PC part line?
In the second half of 2013 [extremetech.com].
Re: (Score:2)
GPUs need access to very fast memory, and that's not something that can be provided on memory modules.
Isn't that all onboard the graphics card now?
In my experience, and in just about all the benchmarks I've looked at ever, memory speed has absolutely 0 effect apart from some very specific situations. In many cases, lower latency, slower RAM will be quicker.
It's all a bit pointless, really, IMO, since RAM is RAM is RAM. I've never seen a system significantly (ie more than 10%) improve by adding different
Re:Look at all that speed (Score:5, Informative)
Huh? The front-side bus hasn't existed in years. AMD abolished it way back in 2003 when they moved the Athlon 64's memory controller on-die. Intel did the same thing with Nehalem in 2008.
Perhaps you just meant that there isn't enough memory bandwidth to use the GPU to its full potential with games? The good news is that AMD's upcoming Kaveri will have GDDR5 support [brightsideofnews.com], with a homogenous memory architecture similar to the new consoles.
Re: (Score:2)
Yeah but it's only 128bit and it's mutually exclusive with DDR3 so you cap out at 4GB with yet-unreleased high density GDDR chips, not exactly useful for a general purpose computer.
Re: (Score:2)
128bit GDDR5 is still twice the bandwidth of 128bit DDR3 (quad-pumped, not double-pumped). And won't they have banked memory, so you can have more DIMMs than channels (just like you can have four DIMMs in a dual-channel DDR3 system)?
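The double-vs-quad-pumped arithmetic is straightforward. A sketch, where the 1066 MHz command clock is just an illustrative baseline chosen to match DDR3-2133 (shipping GDDR5 typically runs a much higher command clock on top of the 4x transfer rate):

```python
def peak_bandwidth_gbs(bus_width_bits, transfers_per_clock, clock_mhz):
    """Peak bandwidth in GB/s: bytes per transfer * effective transfer rate."""
    return (bus_width_bits / 8) * transfers_per_clock * clock_mhz / 1000

ddr3  = peak_bandwidth_gbs(128, 2, 1066)  # DDR3-2133: double-pumped
gddr5 = peak_bandwidth_gbs(128, 4, 1066)  # GDDR5: effectively quad-pumped
print(f"DDR3: {ddr3:.1f} GB/s, GDDR5: {gddr5:.1f} GB/s")  # ~34 vs ~68
```

So at equal command clock and bus width, GDDR5 delivers exactly twice the peak bandwidth of DDR3.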
Re: (Score:2)
With the GDDR5 version memory will be soldered onto the mobo, along with the APU. No DIMMs. Though it isn't clear to me if it maxes out at 4GB or 8GB - might well be 4GB which makes it useful for gaming but sucks if you want to do crazy hungry web browsing or something else on the side.
Re: (Score:1)
Well, seeing as the only chips in the wild at the moment all have 8 gigs soldered on (PS4 and Xbone), I'd have to guess that the limit is at least 8 gigs.