AMD Next-Gen Kaveri APU Shipments Slip To 2014
MojoKid writes "The story around AMD's upcoming Kaveri continues to evolve, but it's increasingly clear that AMD's 3rd generation APU won't be available for retail purchase this year. If you recall, AMD initially promised that Kaveri would be available during 2013, and even published roadmaps as recently as May showing the chip shipping at the beginning of the fourth quarter. What the company is saying now is that while Kaveri will ship to manufacturers in late 2013, it won't actually hit shelves until 2014. The reason Kaveri was late taping out, according to sources, was that AMD held the chip back to put some additional polish on its performance. Unlike Piledriver, which we knew would be a minor tweak to the core Bulldozer architecture, Steamroller, Kaveri's core architecture, is the first serious overhaul of that hardware. That means it's AMD's first chance to really fix things. Piledriver delivered improved clock speeds and power consumption, but CPU efficiency barely budged compared to 'Dozer. Steamroller needs to deliver on that front."
what about the next NON APU chips? (Score:2)
I don't like Intel's lack of PCIe lanes in the i5 and lower-range i7 parts.
Also, where are the next-gen higher-end i7s with QPI?
Re:what about the next NON APU chips? (Score:4, Informative)
Steamroller FX chips aren't even on the roadmap at this time. That doesn't mean they will never come, but AMD is clearly prioritizing its APUs over enthusiast-oriented chips right now.
Re: (Score:3)
Re: (Score:2)
I also really can't blame them in this matter. As I have been saying for years, and as I've seen time and time again with my own two eyes, CPUs passed "good enough" several releases ago; even the Phenom II-based chips and first-gen Bulldozers really are overkill for all but a handful of users.
This.
I have a Phenom II 955, and the only problem I have is that the motherboard doesn't support SSDs properly. The CPU handles new games fine. I've upgraded my graphics card and RAM, and bought two SSDs (I'm thinking about buying a third so I can use my 512 GB SSD exclusively for game installs; the old 256 GB one went into the laptop), but not the CPU. The only thing at the moment that would force me to upgrade my CPU is the motherboard (I think I can still get an AM3-compatible board though). If I had to buy a new CP
Re: (Score:2)
Re: (Score:2)
Presumably the console chips are already in production. AMD's been fabless for quite some time, so there shouldn't be any reason production is an issue -- and looking at console sales over the years, it's not as if they're ramping up for a new iPhone, even given that both new consoles are using AMD chips.
They ought to have plenty of resources available for new chips if they have a rationale for making them. Low end users rarely clamor for a hot new processor, only a cheaper chip with the same hotness. And
Re:what about the next NON APU chips? (Score:4, Informative)
Re: (Score:2)
But all those designs are now fixed, and the console makers won't be asking for new ones, except maybe a die shrink, for 5-10 years based on the last generation (Nintendo 6 years, Sony and Microsoft 8 years); if anything, I would expect this generation to last longer. Those royalty payments will be much needed, but that market won't provide any new business until 2020.
Re: (Score:1)
That's fine, who actually needs a new CPU? I think anyone who has a working computer and buys a new computer right now of any type other than a laptop is a nutcase. I can understand moving to Haswell for mobile or even small-scale off-grid, because of the amazing power consumption benefits. They really are astounding, especially considering Intel's poor past record on power consumption with literally every processor before Haswell except the Mobile P3 — some of the CPUs had pretty low consumption, but
Re: (Score:2)
I'm a nutcase, then. I put together the current box in summer of '09, and I'm on the third CPU, an AMD 1090T. If there's no successor to Piledriver, then I'm still gonna save up for what will be a significant upgrade to my system. If I'm lucky enough to get affordable housing, I'd even be able to build a new system with the eight-core.
I _like_ having extra cores - they make it easy to have different things running, playing with virtual machines, crunching for WCG. The hit I take on heat and electricity is offset
Re: (Score:2)
I just swapped out my AMD 1090T system for an Intel i7 socket 2011 system. I also put my AMD system together in 2009, and I was getting tired of waiting... 6 AMD cores at 3.2GHz ain't what it used to be (or, perhaps, the work I'm doing has grown in size), and the new AMD architecture still has issues, as well as not coming close to matching Intel on critical things like media processing. I mean, I can get better-than-realtime rendering on AVC video now, at least for some purposes. Also got me up to 64GB of RAM, which
Re: (Score:2)
Sounds like a sweet system, what with the CPU, scads of RAM, and the screens. Video editing is one of those things that "I want to learn how but don't have time/energy/focus for right now".
I like where AMD is going with their arch and HSA, from the tiny bit I can follow of such things; it looks to offer a logical, streamlined approach to future workloads. That's part of the problem: AMD is often ahead of its time. They offered full 64-bit, then cores, now the [bundled/hybrid?] cores, etc., while waiting for programmers to cat
Re: (Score:2)
Re: (Score:2)
You actually can max out an X6 doing PC-DAW processing, if you're lazy. But if you're building for an actual studio, they probably learned on physical boards, and will use buses properly, use plug-ins more sparingly, and find that, if anything, the HDD is the bottleneck.
And yeah, it's the constant price pressure that has made most laptops pretty disposable. My son's only 22, and he's managed to go through four laptops... the current AMD A8-based unit is dandy for grad school. Previously, he had pushed for gaming per
Re: (Score:2)
Re: (Score:2)
If you want lots of PCIe lanes, the Intel 4820K is coming, somewhat soon.
Re: (Score:2)
The 4820K doesn't support ECC RAM, though.
Re: (Score:2)
I agree that, since I'm going to get a dedicated graphics card anyway, I'd much prefer some more focus on the CPU. That being said, an APU could open new possibilities for using OpenCL and the like without bogging down the main CPU cores or competing against graphics for the GPU's precious shader units. Now I'm intrigued - I wonder if this arrangement would actually work...
Re: (Score:2)
They need to scale the GPU part of the APU to deliver a decent level of OpenCL performance. But AMD has one big advantage -- a big problem with OpenCL is latency -- there's overhead in compiling the OpenCL code, overhead in communicating with the GPU, etc. So you typically see CPU use drop in OpenCL applications, and some of the CPU time being spent is "wasted" on OpenCL overhead. So in short, as CPU speed increases, you need a much faster or better-architected GPU to make that extra processing worthwhile.
Bu
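To make that overhead concrete, here is a minimal OpenCL host-code sketch (hypothetical kernel and buffer sizes, chosen only for illustration) that times the fixed costs -- platform setup, JIT compilation of the kernel, buffer copies -- separately from the actual work. On a discrete GPU, everything measured in the first interval is pure latency that a small job pays every time:

/* Minimal sketch (hypothetical kernel/sizes): time OpenCL's fixed
   overhead -- setup, JIT compile, copies -- separately from the kernel. */
#include <CL/cl.h>
#include <stdio.h>
#include <time.h>

static double now(void) {
    struct timespec t;
    clock_gettime(CLOCK_MONOTONIC, &t);
    return t.tv_sec + t.tv_nsec * 1e-9;
}

int main(void) {
    const char *src =
        "__kernel void scale(__global float *x) {"
        "  x[get_global_id(0)] *= 2.0f; }";
    float data[4096] = {0};
    size_t n = 4096;

    double t0 = now();
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL); /* runtime JIT compile */
    cl_kernel k = clCreateKernel(prog, "scale", NULL);
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE, sizeof(data), NULL, NULL);
    double t1 = now(); /* everything above is pure overhead */

    clEnqueueWriteBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);
    clFinish(q);
    double t2 = now();

    printf("setup+compile: %.3f ms, copy+run: %.3f ms\n",
           (t1 - t0) * 1e3, (t2 - t1) * 1e3);
    return 0;
}

For a tiny kernel like this, the first number typically dwarfs the second, which is exactly why a faster CPU alone makes the offload less attractive.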
oh noooo (Score:1)
An unnecessarily overpowered chip will be delayed, so more of the hardware features no one asked for will be delivered to a market that usually works in symbiosis with the Microsoft inefficiency treadmill but is now being destroyed entirely by that same company.
Re:oh noooo (Score:5, Insightful)
I would love to see what these chips do for engineering simulations. In simulation software there is a lot of back and forth between parts that can be done on a GPU for a huge performance gain and parts that work best on a CPU. The problem is that you mostly end up running them CPU-only, because the overhead of handing off to the GPU and getting a result back is so high. Kaveri is the first chip I know of that can do a zero-copy transfer between the GPU and CPU. It may not be great for all apps, but it should be AMAZING for engineering sims if they are modified to take advantage of it.
Some of the papers I read found that, on simulations large enough for the GPU to make a difference, you could get a 50x performance increase; theoretically it should have been around 200x or so, but the overhead of loading and retrieving the data was still very large.
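In today's OpenCL, the closest you can get to that zero-copy handoff is host-visible memory. Here is a sketch (hypothetical helper and sizes, assuming the runtime maps a CL_MEM_ALLOC_HOST_PTR buffer without a physical copy on shared-memory hardware, which is the behavior APU runtimes aim for):

/* Sketch (hypothetical helper): zero-copy style sharing on an APU.
   CL_MEM_ALLOC_HOST_PTR asks the runtime for host-visible memory;
   on shared-memory hardware the map/unmap below need not copy. */
#include <CL/cl.h>

void run_shared_step(cl_context ctx, cl_command_queue q, cl_kernel kernel) {
    size_t bytes = 1 << 20;
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_ALLOC_HOST_PTR | CL_MEM_READ_WRITE,
                                bytes, NULL, NULL);

    /* CPU phase: map the buffer and fill it in place. */
    float *p = (float *)clEnqueueMapBuffer(q, buf, CL_TRUE, CL_MAP_WRITE,
                                           0, bytes, 0, NULL, NULL, NULL);
    for (size_t i = 0; i < bytes / sizeof(float); i++)
        p[i] = (float)i;
    clEnqueueUnmapMemObject(q, buf, p, 0, NULL, NULL);

    /* GPU phase: run the parallel step on the same allocation. */
    clSetKernelArg(kernel, 0, sizeof(buf), &buf);
    size_t n = bytes / sizeof(float);
    clEnqueueNDRangeKernel(q, kernel, 1, NULL, &n, NULL, 0, NULL, NULL);
    clFinish(q);
    clReleaseMemObject(buf);
}

What Kaveri's HSA promises beyond this is pointer-level sharing without even the map/unmap bookkeeping, so the CPU and GPU phases of a solver can ping-pong cheaply.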
Re: (Score:2)
Looks like the developers you work with haven't discovered asynchronous operation and the principle of overlapping communication and computation.
Re: (Score:2)
In a simulation there are a lot of steps where you have some linear calculations, then other calculations that can run thousands-wide in parallel, but the next linear calculation depends on the parallel calculation. Async makes no sense for that: the next linear calculation can't be run until the previous parallel calculation has finished. Think of some steps as simple linear equations and the next step as a set of codependent equations that need to be converged. Doing that in parallel is vastly faster
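The dependency pattern described here looks like this in OpenCL host code (hypothetical kernel handles; each enqueue waits on the previous step's event, so there is nothing to overlap):

/* Sketch (hypothetical kernels): a serial chain of dependent steps.
   Each step waits on the previous one via an event, so the usual
   "overlap communication with computation" trick buys nothing. */
#include <CL/cl.h>

void timestep_loop(cl_command_queue q, cl_kernel linear_step,
                   cl_kernel parallel_step, size_t n, int steps) {
    cl_event done = NULL;
    for (int i = 0; i < steps; i++) {
        cl_event e1, e2;
        size_t one = 1;
        /* Serial part: depends on last iteration's parallel result. */
        clEnqueueNDRangeKernel(q, linear_step, 1, NULL, &one, NULL,
                               done ? 1 : 0, done ? &done : NULL, &e1);
        /* Wide parallel part: depends on the serial part just queued. */
        clEnqueueNDRangeKernel(q, parallel_step, 1, NULL, &n, NULL,
                               1, &e1, &e2);
        if (done) clReleaseEvent(done);
        clReleaseEvent(e1);
        done = e2;
    }
    if (done) { clWaitForEvents(1, &done); clReleaseEvent(done); }
}

Every trip around that loop pays the CPU-GPU handoff cost, which is why a zero-copy APU could matter so much for this workload.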
Re: (Score:1, Insightful)
An unnecessarily overpowered chip will be delayed, so more of the hardware features no one asked for will be delivered to a market that usually works in symbiosis with the Microsoft inefficiency treadmill but is now being destroyed entirely by that same company.
Why don't you wait for it to come out before criticizing it? AMD is in trouble, and it would hurt us all, including the Intel fanboys reading these comments, if AMD were gone.
I noticed Intel is already raising prices on the newer i7s for no other reason than that they think there is no competition, as everyone bashes AMD on every tech website.
Bulldozer is actually slower than the older Phenom II per clock cycle and is a crappy chip. That is true. I still own a Phenom II, but my crappy mobo is showing its ag
Re: (Score:2)
That four months is four months of lost revenue for AMD. Back when I worked in the chip business we typically had about six months after release where we could charge a premium for the new chips before the competition released something better. Unless they slipped too, every day we slipped was a day of premium, high-margin sales lost.
Yeah, we'd probably still be selling those chips two years later, but they'd then be at bargain basement prices with tiny margins.
Re:OMG four whole months to wait. (Score:4, Insightful)
Re: (Score:3, Insightful)
The fact that you can get an 8 core, 4 GHz CPU for $200 is a big plus for some people.
I guess, for people who like big numbers; never mind that it's usually just breaking even with competing quad-cores that have lower frequency but higher IPC. The FX-8350 has single-threaded performance equal to the Phenom II X6 1100T and the Intel Pentium G840. It can win some multi-threaded tests because, price-wise, it competes against Intel's hyperthreading-crippled processors, but it's not an impressive chip. At least it sucks less than the FX-8150, which really was the worst of Bulldozer.
Re:OMG four whole months to wait. (Score:4, Insightful)
AMD has clearly lost the performance war. But I'm still hoping the brand sticks around because I believe it's the only thing keeping Intel CPU prices low.
But in any event, I think the really important point is at the end of this article - http://hothardware.com/News/Praying-For-Consoles-AMD-Details-2013-Game-Plan-Offers-Updates-on-New-APU-Performance/ [hothardware.com] - AMD is banking its future on APUs in embedded applications, low-end laptops, and consoles. Unless they get into tablets and mobile devices in a big way, they're planning to grow their share of a market that's shrinking rapidly. "King of console processors" is meaningless if 90% of the demographic that played Xbox 360 in 2005 is playing on an iPad in 2020.
Re: (Score:3)
AMD has clearly lost the performance war. But I'm still hoping the brand sticks around because I believe it's the only thing keeping Intel CPU prices low.
Intel CPU prices were higher when AMD was competitive with them.
ARM is Intel's real competitor at the moment, not AMD.
Re:OMG four whole months to wait. (Score:5, Insightful)
ARM is a threat to Intel in the near future, and indirectly: people are gravitating towards tablets and smartphones instead of buying desktops. However, those of us who actually need desktops today have only Intel and AMD to turn to, and Intel's margins are too high and their products too artificially crippled for my tastes, which is why I sincerely root for AMD's success.
Re: (Score:2)
But, as I said, Intel CPU prices were higher when AMD's high-end was competitive with Intel's high-end.
Intel has no competition for their high-end desktop CPUs. So why don't they push prices up much higher? It's not because you can turn around and buy a similar performance CPU from AMD.
Re: OMG four whole months to wait. (Score:2)
Compared to top-end graphics cards (Nvidia Titan, for example), the top-end Intel chips (3970X) are relatively unsophisticated next to Intel's own $300 line. That is, they are already inflating their margins by astronomical amounts because they have no competition at the top end.
Re: (Score:1)
You ask why they don't keep pushing prices up (you could ask that all the way to infinity dollars) as if you think Intel's pricing is reasonable for their top-end CPUs and they are doing consumers a favor by not asking more.
Compared to top-end graphics cards (Nvidia Titan, for example), the top-end Intel chips (3970X) are relatively unsophisticated next to Intel's own $300 line. That is, they are already inflating their margins by astronomical amounts because they have no competition at the top end.
And just think what will happen if AMD liquidates and closes its doors because it can't field a competitive CPU. We will most certainly accelerate into a post-PC world, as the cost of notebooks and desktops at the low end doubles very quickly.
Please, AMD, do not mess this one up. My Phenom II is beginning to show its age, and in 2 years I am planning on replacing it, perhaps with this new chip.
Re: (Score:2)
It's the economy[, stupid]. People won't pay a thousand dollars for a magic CPU right now. Intel is already charging what the market will bear. AMD is chasing the budget segment because it's large and growing, and there's no way they can compete with Intel for the high end; Intel can just ratchet down their prices until they're only making a reasonable amount of profit. Instead, AMD is taking over the market where Intel is not competitive, which is the only thing that makes sense. The Intel-based
Re: (Score:2)
By the end of the year, Intel will be making these at 22nm, and by 2015 they'll be down to 14nm. I don't think there's anyone planning to man
Re: (Score:3)
The CPU prices were higher because it was more expensive to make CPUs back then. With the past technology you couldn't put as many dies on the wafer.
Re: (Score:2)
Intel used to have a monopoly grip on the workstation and server markets. Intel's P4 debacle greatly loosened that grip: Opteron leap-frogged Intel, and demand was such that even the most dedicated Intel OEMs started offering Opteron servers and workstations. AMD isn't gone from that market by any means, so if Intel slips up just a bit, AMD will be there to take up the slack. So Intel has good reason to be much more careful abo
Re: (Score:1)
AMD has clearly lost the performance war. But I'm still hoping the brand sticks around because I believe it's the only thing keeping Intel CPU prices low.
A while back their CEO fired most of their processor design team during lean times, apparently thinking they could re-hire them when fat times arrived again. Alas, firing your core talent during lean times means you will never have fat times ever again. I'm rather doubtful AMD will ever come anywhere near the pinnacles of the past and in fact wondering whether I should permanently write them off.
Re: (Score:2)
To be fair, AMD has always been in a fundamentally tough position: for many years Intel has been able to spend more on research and development (R&D) than AMD made as gross income for the year. That's a David-and-Goliath position, except this Goliath is wearing a kick-ass helmet. I don't know if it was ever feasible for AMD to eat a big chunk of Intel's market.
Slashdot has linked articles discussing the events you mentioned. AMD's best products came when they used software to design C
Re: (Score:2)
AMD has clearly lost the performance war. But I'm still hoping the brand sticks around because I believe it's the only thing keeping Intel CPU prices low.
I'm not so sure, actually. I think a big part of today's relatively low prices is Intel competing with itself.
We're well and truly into the age of 'good enough' computing. You don't need a new computer to run the latest Windows or Office version, or even most games. Unless you're transcoding hours of video or playing today's games on high detail settings, that four-year-old Core 2 Duo is fine, and will be for a while yet.
If Intel raise their prices, the risk isn't that their customers will flock to AMD. The risk
Re: (Score:1)
They won't flock to AMD.
Most sales these days come from big-name OEMs like HP and Dell. They mostly carry Intel lines, because customers see the stickers and it's a name they're familiar with. Worse, geeks who go to Slashdot, or its Redmond fanboy version aka Neowin.net, have been warning users away from AMD for a few years now when those users ask for advice.
There are still some who homebrew systems, but they are almost all performance-oriented folks who buy Intel anyway. Even a cheap i3 build will cost $200 more than buyin
Re: (Score:2)
And yet my $400 HP Llano notebook has made me more satisfied than any laptop I've had in the past :)
Re: (Score:1)
HP: DM1Z Notebook - AMD APU models only
HP: Walmart Pavilions - AMD AM3 Athlon X2-X3-X4s (Phenoms also) Complete Desktops
HP: Walmart Laptops - AMD models, walk in and smell the chips
Acer: Walmart Laptops/Desktops - almost exclusively AMD systems
These are in stock and are what many people buy every day for home and small-business use.
Re: (Score:2)
So maybe that will keep Intel's prices low no matter what happens to AMD. If the latest and greatest Intel Core iWhatever costs too
Re: (Score:2)
A great tablet (Current generation iPad, Google Nexus 10, Asus Transformer Pad Infinity) and twenty of the best games and hundreds of other useful apps and a decent and very easy to use web browser and all sorts of video streaming services might cost you $600 total. A console and ten of the best games and few other apps and a web browser that's a pain to us
Re: (Score:3)
I guess, for people who like big numbers; never mind that it's usually just breaking even with competing quad-cores that have lower frequency but higher IPC.
If by "breaking even" you mean pretty much beating the similarly priced Intel processor on every multithreaded task, and even beating the $$$ top-end Intel i7 on some, then yes.
A good fraction of what I do requires more than one core these days. In things like parallel compiling, I believe the top AMD part was, in fact, king of the hill. That's a big p
Re: (Score:2)
I imagine the next-gen consoles being all AMD has something to do with that.
Aren't the new consoles the first time in decades that a console is much less powerful than a typical gaming PC at release? The original Xbox, for example, was a PC with a decent CPU for that era and a faster GPU than you could buy for a PC, and only really limited in RAM.
Re: (Score:2)
Actually, this is the first generation that has similar memory to a PC. The PS3 launched with 256MB of RAM when a low-end PC might have 2GB, i.e. only 1/8th of the memory. The new consoles have 8GB when a typical gaming PC might have 9-10GB.
Although they've still found a way to ruin that with bloated operating systems taking up nearly half the memory.
Re: (Score:2)
Low-end PCs at the time shipped with 512MB, not 2GB. If they had shipped with 2GB, Vista might have fared better.
Re: (Score:2)
The original Xbox had a REALLY SLOW CPU for a new piece of hardware from that era. NV2A is in between NV20 and NV25, though, so the GPU was actually competitive when it came out.
Re: (Score:2)
No. That's four months without the increased sales that new feature may inspire (they hope), but in the meantime their current stuff is still selling. It should be obvious. Please sober up before posting again.
What is next? (Score:1)
I like AMD now for budget builds. I loved AMD when they were smacking Intel. However, this line of chip names has to end soon. There are only so many cool-sounding pieces of heavy construction equipment.
The New Phenom VIII x16, based on Suction Excavator technology!
or
From the new Skip Loader core comes the AMD Skippy x8!
or
Our new Pipelayer core provides all the uumph you need to penetrate difficult projects.
These just don't have the same ring.
Re: (Score:2)
AMD APUs have the highest performance per dollar. (Score:5, Interesting)
The AMD APUs really are a great melding of price vs performance. Sure Intel has faster CPUs, but they're also more than twice as much! The highest end APU is $150, and the highest i7 is $340. The i7 will have higher CPU performance, but most games aren't CPU bound, they're GPU bound. The AMD APUs have decent GPUs. They won't replace your high end GPU if you're playing Battlefield at 1080p, but if you're a mid-level gamer they perform great. Plus you can always add a decent GPU for $150 and you're still less than that 4700 i7!
Re: (Score:2)
Comparing the most expensive chips isn't fair or useful. Intel's most expensive chips can cost a lot more because AMD doesn't have anything competitive.
An AMD FX-8350 costs $200. In Intel land, the i5-3570 is the right cost equivalent, at about $215. Intel's lead is so large that even a previous-generation unit from their lineup performs approximately on par with AMD's current models. Which of those two is faster depends on the benchmark [techreport.com].
At the $100 end of the market, there are a few really cheap models where A
Re: AMD APUs have the highest performance per doll (Score:3)
Re: (Score:1)
You will have to spend that extra 75 bucks on cooling and a power supply, though, so you really end up spending about the same for a lower-performing chip.
That's kind of what happened to me. I haven't had an Intel system since the Pentium 1. I went to upgrade / build a new box, so I was getting a new motherboard and video card anyway. Both the AMD and Intel Gigabyte-brand motherboards were 80-ish bucks, and the 8-core AMD was cheaper, but knowing from my quad-core I instantly needed a $40 fan, because the OEM fans work, but sound like
Re: (Score:3)
The boxed processor I bought (FX-4300, quad @ 3.8GHz, OC'd to 4.1GHz) is using the stock AMD HSF that it came with. It usually runs around 42C and gets up to maybe 53-54C under a load that pegs all 4 cores (movie encoding, which I often do).
At no point is the CPU fan ever even audible. The only thing I can hear other than some HDD whine is the PSU fan on my Antec Neo Eco 520.
And how do you figure the AMD processor uses 220W more than the Intel? The Bulldozer TDP is 125W (AMD FX-8350 Vishera 4.0GHz).
Re: (Score:1)
The AMD-branded Cooler Master that came in the AMD box with mine never had a problem keeping things cool, but it was so damn loud my wife complained about it from across the apartment, two rooms down the hall.
TDP has little to do with how much power a CPU actually pulls from the supply; the 8-core at 4.2GHz under full load will draw almost 300 watts.
My 4170 quad draws 198 watts and my old ATI 6870 draws 247 under full load, so without even thinking about the motherboard, fans, optical disks, shit plugged into the por
Re: (Score:2)
Something's wrong with your wattage figures; it sounds like you're measuring the whole system's power use while stressing the CPU with a burn program, and likewise with the GPU (with something like FurMark).
The point still stands, but to me the more annoying point is paying the electric bill.
To min/max the game, what you need to do is build an Intel system with a lowest-end mobo (around $50), an Intel 3770 or 4770 or 4570, a good 300W or 350W PSU, the stock cooler, and max it out at 16GB. You can't run your 4170 on a low en
Re: (Score:1)
Agreed; that's why my most recent build was an i7 3770K. In the middle and high end AMD just can't wrangle the numbers, and in the low end no one cares what's in their $299 Walmart machine.
Re: (Score:2)
I think an FX-6300 is half decent, especially if you want to play with virtualisation and IOMMU (AMD's equivalent of Intel's VT-d). You had the worst of the bunch with that FX-4100.
Re: (Score:2)
Here's a dirty little secret about the "temperatures" reported by AMD (and probably Intel) processors: they are *not* calibrated against anything. You can't say the processor runs at 53-54C with any certainty (it can be +/- 10% or more when compared to a
Re: (Score:1)
According to Tom's Hardware, a Core i3 can fucking beat that 8-core, especially in Skyrim and Crysis.
FYI, I am typing this on an AMD Phenom II, sadly, as I am not an Intel troll. The FX really is a crappy chip, and there is no sense trying to defend it, since the people who play games or do any graphics work use dedicated graphics anyway. Intel's integrated graphics are fine for office work and web browsing in this day and age.
Here's hoping the next generation fixes the problems.
Re: (Score:1)
Are there really that many people playing Skyrim and Crysis? Particularly playing those games at low graphics settings in order not to be GPU limited? A lot of my clients still have a Core 2 Duo or Phenom II and don't need more power. Even worse, a lot of their employees still have a P4 at home and see no reason to upgrade. Also, considering the Xbox One and the PS4 both have an AMD processor (and not a fast one), it's kind of obvious there is very little need for a Core i7 for most people, even for gaming.
Perso
Re: (Score:1)
The FX is pretty garbage. I have an FX-4170, and while it's noticeably better than my old Phenom II 720 and can outrun an i5 in daily operations, it can't hold a candle to the i7.
Most disappointing AMD chip I have ever bought.
Intel have 30 people working on Intel graphics (Score:4, Interesting)
The AMD APUs really are a great melding of price vs performance....
Even though I loathe the 70% gross margin that Intel insists on, they have 20-30 people working on Linux drivers http://www.phoronix.com/scan.php?page=news_item&px=MTI5MTI [phoronix.com] vs 5 from AMD. There is more than one way to measure bang for buck. That said, when I buy a separate graphics card it will be AMD.
Re: (Score:1)
Wait a second. You know the AMD drivers are for shit, but you're going to willingly choose AMD? The company that pays lip service to open source but just trickles out information slowly, so that the free driver will always suck and never support even some of their old hardware, like the R690M?
Why would you pay for abuse? As long as AMD is being a collection of asshats about video drivers, giving them money is just voting for asshattery.
All my CPUs have been AMD for ages now but every time I try an ATI GPU I win
Re: (Score:2)
Lip service? They've pretty much released docs for everything under the sun that cleared the legal dept's requirements.
Nice weasel words. Get back to me when they release enough information for the R690M to work. That's only been out for years and years.
Re: (Score:2)
But software bugs != lack-luster support
So I'll take the "lack-luster support" then, rather than the bugs. Funny how the nvidia approach to free open-source drivers is working out better: nvidia doesn't collaborate at all, so the devs are entirely on their own; in the end they have to figure everything out themselves, and everything gets supported, with no non-free dependency and no graphical corruption in a fucking window manager.
Re: (Score:2)
Re: (Score:1)
I went to upgrade my quad-core FX last year; in order to get the 8-core I needed a new fan (because the ones AMD ships are a joke... they keep it cool, running at 6k RPM and loud as a jet) and a new power supply.
By the time I bought those two things I was at the price of a 3770K and still didn't get close to its performance... bought Intel.
Until AMD can get their power-to-performance ratio in line, they are just not worth looking at in the mid-to-upper end, and no one cares about the low end; go get yourself a $99 off l
Re: (Score:1)
Throw in a motherboard and RAM, and possibly a graphics card, and picking that CPU vs the Intel one makes less of a difference.
For me personally, I need to get a new HDD and PSU, want to get a new case, and need a new monitor too.
Reason to pick AMD? None.
Re: (Score:1)
The AMD APUs really are a great melding of price vs performance. Sure Intel has faster CPUs, but they're also more than twice as much! The highest end APU is $150, and the highest i7 is $340. The i7 will have higher CPU performance, but most games aren't CPU bound, they're GPU bound. The AMD APUs have decent GPUs. They won't replace your high end GPU if you're playing Battlefield at 1080p, but if you're a mid-level gamer they perform great. Plus you can always add a decent GPU for $150 and you're still less than that 4700 i7!
This is why my next machine will be an AMD APU. While I have a standalone card now, if it dies I'd likely just move to using the APU alone. I don't think it'd present a major problem, especially whenever it is I upgrade. They're only getting better.
Um no. (Score:2)
1) AMD's highest-end APU is not $150, it's $330, with tiers all the way down.
2) If you're GPU bound, you can get an i5 for $100 less, or an i3 for even less, and put the difference towards a GPU.
3) Games are also mostly limited to 1 core, more or less, making about 7 of AMD's cores more or less useless in this regard.
AMD is not great on price vs performance. AMD is good on price at the low end. If you are building a basic machine on the cheap, AMD is your chip right now (or a business server). However, if you wish to use it for any
Old news delayed, shipment to Slashdot slips (Score:2)
This has been known for two or three months, and even then it was not a big surprise, as the original target was "H2 2013" with no firm commitment.
A chance to fix things? (Score:2)
Re: (Score:1)
I've been using ATI parts with X since the mid-90s, and never had any complaints about driver support (except that for around two years in the 90s I was forced to use a non-free X server [not by ATI; I don't remember who made it, but I ripped it from Red Hat and used it with Slackware] for the GPU on my laptop). It is better now than it ever was. The only real complaint is that the newer free drivers for AMD (e.g., the radeon driver) rely on non-free firmware -- but, of course, there are folks working to remedy this issue.
The only company making a GPU that is in the same class of commitment to supporting free software on their GPUs as AMD is Intel.
Oh, you are running some binary blob. Try the free radeon driver. It works quite well -- especially with a 3.11+ kernel, which gets you power saving on GPUs using the radeon driver (way cooler running on my laptop, and you won't have to run an -rc kernel in a week or two, once 3.11 is released for real). If you need OpenCL, well, then you are forced to the binary blob for now, but I expect this will change within a couple of years at the most.
The bashing for me is that Linux SUCKS at driver support! I had an ATI 5750 and can only run 2009-era distros, and can't run updates on them, because Linux lacks a stable ABI -- socialists like RMS feel binary blobs are evil, that everything should be open-sourced, and that forcing you to recompile them enforces this freedom.
I call that slavery, as I lose the freedom to run the OS and X.org I want. Every other OS, including other free ones like FreeBSD, has an ABI, and because it is stable you can even add them as kernel modules in newer ver
Re: (Score:3)
Re: (Score:1)
You're full of it. There's no reason you can't keep running the same OS kernel as long as you want, without breaking anything. It's YOUR decision to upgrade to the latest shiny new kernel; if you're THAT worried about keeping up to date, then this is the price you pay. I mean, it's fun to blame Linux for this. It really is. But all those Android tablets are running it just fine without driver issues.
If you REALLY wanted to see Linux better supported by driver creators, you'd pressure them, not Linux. A stable ABI isn't the answer you're looking for when simply recompiling the driver is good enough for most bleeding-edge users to play with it. The kernel simply doesn't change THAT much, so extensive testing isn't necessary as often as people seem to think.
Look, my system runs fine when I install it. Then an Ubuntu update comes in and the screen goes black. I am the user at this point and not technical at all. In real life I know to Alt-Fkey to another tty and run kill -9 from a shell to fix it, but good luck with that for 99% of other users.
I would still be using Linux now otherwise. Android is stable because each minor release has the same ABI. Hairyfeet said the same thing about why he does not sell Linux at his shop: he puts it in and the customer always com
Re: (Score:1)
And Hairyfeet is a goddamn Linux idiot. I'm running Gentoo x86_64 and guess what: I haven't had hardware suddenly stop working because of an update, and the reason is I don't run a crap distro like Ubuntu that tries to hide everything from the user. They're really no better than MS in that regard. If you want stability, you need to be running Debian itself, as they don't screw the drivers up like Ubuntu does.
Even for Gentoo, hardware drivers are no longer an issue unless you're replacing a god
Name Kaveri means Friend in Finnish (Score:4, Informative)
Re: (Score:2)
Furthermore, "apu" means help as a standalone word, or helping/auxiliary as a prefix. So "apuprosessori" would mean a coprocessor.
Also, to nitpick a little, "kaveri" is more like a buddy, or even a (random) guy, as opposed to a close/true friend.
Re: (Score:2)
Indeed. "Ystävä" can be used for a close friend.
But anyway, "Kaveri APU chip" quite literally says "helper buddy chip" -- the name of this AMD product sounds incredibly cute to the Finnish ear. Even though Kaveri is actually a river.
As a sidenote, Roccat [roccat.org] is a German company which makes PC peripherals which carry Finnish names.
Re: (Score:2)
Re: (Score:2)
I know Intel can do it, but they simply don't want to cannibalize sales of their power-inefficient high-end chips.
Missed the newest Haswell line, eh?
Re:Greater power efficiency please (Score:5, Insightful)
AMD and, to a lesser extent, Intel are misreading the mass market. What everybody except those hardcore GamerZ (rhymes with lamers) wants isn't more "powerful" desktop systems that consume enough watts to power a Third World household with room to spare, but more power-efficient APUs, aka SoCs or systems-on-a-chip. I know Intel can do it, but they simply don't want to cannibalize sales of their power-inefficient high-end chips.
How has Intel misread the market? Ivy Bridge was Sandy Bridge with much lower load power. Haswell is Ivy Bridge with much lower idle power. True, Intel is still struggling to compete in the smartphone/tablet segment that is dominated by ARM, but Haswell is far superior to past Intel chips when it comes to power consumption.
Re: (Score:2)
Indeed, for the first time Intel is the clear winner in power consumption. Until Haswell, the Intel CPU+chipset would always consume more power than the AMD CPU+chipset. To me, that is the real story of the modern CPU, not whose CPU is fastest.
Re: (Score:1)
> Haswell is far superior to past Intel chips when it comes to power consumption.
Only the laptop chips. The desktop chips are actually more power-hungry. On top of that, the laptop chips are only more power-efficient when doing light work; when fully loaded they have no advantage over Ivy Bridge.
Re: (Score:1)
I don't think you can be power-efficient when using max power for gaming, editing, etc.
Re:Greater power efficiency please (Score:4, Insightful)
So basically, you haven't looked at Intel CPUs of the past 2 years at all, right?
Re: (Score:2)
Re: (Score:2)
I'd consider an ARM desktop if there actually were motherboards to buy!
I only know of one; it's 349 euros and has a Tegra 3, which is outdated but has PCIe. Tegra 4 is a better fit for a desktop, CPU-wise, but doesn't have PCIe.
http://shop.seco.com/gpudevkit/gpudevkit-detail.html [seco.com]
What you would need is a Tegra 5, which will just come with desktop graphics, so the feature level and driver support will both be easier. Just use the nvidia driver or nouveau, presumably, and have real OpenGL, not OpenGL ES. But we don't
Re: (Score:2)
Temash has a 3.5W to 5.9W TDP, and that's the max power use for CPU+GPU+southbridge. Low-end Kabini is 9W. Haswell is at 10W.
So yes, I'd say fast ARM and slow x86 meet at a similar point; a few years ago you had an Atom smartphone which was fast and worked. You have a Toshiba tablet with Tegra 4 that overheats, though that's bad design and that ARM SoC is a semi-failure.
Note that "idle" on a modern desktop is not so much 0 to 1% CPU use; I have Firefox using 30 to 50% of one CPU core right now doing who knows what. To
Re: (Score:2)
There are image processing techniques that are still too compute-intensive for routine use. Linear motion-blur correction for a 4000x3000 image can run several minutes on a state-of-the-art processor. Now upgrade that to an algorithm that searches for the sharpest possible image across a set of nonlinear, multi-direction blurs: come back tomorrow, and if the CPU hasn't fried itself it still won't be done.
The ability to use CPU power far exceeds any likely improvement in the foreseeable future.
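For context, the standard linear model behind that correction, assuming a Wiener-style frequency-domain inversion (one common approach, not necessarily what the poster's software uses), is

B = K * I + N, \qquad \hat{I}(f) = \frac{\overline{K(f)}}{|K(f)|^2 + 1/\mathrm{SNR}(f)}\, B(f)

where B is the blurred image, I the sharp image, K the motion-blur kernel, N noise, and * denotes convolution. A single inversion is a handful of FFTs over 12 megapixels; the nonlinear search described above repeats that for every candidate blur direction and length and then scores each result for sharpness, which is what turns minutes into a day.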
Re: (Score:2)
Better than that: my Ivy Bridge 2.6GHz i7 Mac Mini (the entire computer, not just the CPU) idles below 15W and maxes out below 60W at full load.
Re: (Score:2)
Lemme guess, native English-speaker on ibogaine with too much crank and Jack on the side? Awesome, man. First belly laugh of the day.
Re: (Score:2)
Dunno what your needs really are, but if you want a crapload of connectors you can look at a 990FX-chipset mobo like the Gigabyte GA-990FXA-UD3; you can plug lots of PCIe 4x or 8x or 1x cards into that, ergo additional SATA and LAN controllers. Drop an old 1MB or 2MB vid card in the PCI slot if all you want is a display for installing it, else a Radeon 5450 or GeForce 6200 or 210 in a PCIe 16x slot will do.
For a relatively low-power CPU, well, lol, you can go with an Athlon II X2, even though that's a 20