Intel and AMD May Both Delay Next-Generation CPUs
MojoKid writes "AMD and Intel are both preparing to launch new CPU architectures between now and the end of the year, but rumors have surfaced that suggest the two companies may delay their product introductions, albeit for different reasons. Various unnamed PC manufacturers have apparently reported that Intel may push back the introduction of its Ivy Bridge processor from the end of 2011 to late Q1/early Q2 2012. Meanwhile, on the other side of the CPU pasture, there are rumors that AMD's Bulldozer might slip once again. Apparently AMD hasn't officially confirmed that it shipped its upcoming server-class Bulldozer products for revenue during August. This is possible, but seems somewhat unlikely. The CPU's anticipated launch date is close enough that the company should already know if it can launch the product."
Collusion (Score:4, Interesting)
Ya right (Score:4, Interesting)
The current situation is Intel is slaughtering AMD. AMD hasn't had an architecture update in a long, long time and it is hurting them. Clock for clock their current architecture is a bit behind the Core 2 series, which is now two full generations out of date. Their 6 core CPU does not keep up with Intel's 4 core i7-900 series CPU, even on apps that can actually use all 6 cores (which are rare). Then you take the i5/7-2000 series (Sandy Bridge) which are a good bit faster per clock than the old ones and there is just no comparison.
On top of that, Intel is a node ahead in terms of fabrication. All Sandy Bridge chips, and many older ones, are on 32nm. AMD is 45nm at best currently. Not only does that equal more performance but it equals lower heat for the performance, particularly for laptops. Then of course Intel is talking about Ivy Bridge, which is 22nm, another node ahead. Their 22nm plant is working and they've demonstrated test silicon so it will happen fairly soon.
The situation is not good for AMD. All they've got is the low end and that is getting squeezed hard by Intel too. They need a more efficient CPU and they need it badly. Delaying is not something they want to do, Bulldozer has been fraught with delays as it is. They've been talking about it for a long time, like since 2009, and delivered nothing.
They have every reason to want to get Bulldozer out as soon as possible and preferably before Ivy Bridge. Each generation that Intel releases that they don't have a response for just puts Intel that much farther ahead.
Now that said, Intel may well have decided to hold Ivy Bridge if AMD can't deliver Bulldozer because they don't need to. Sandy Bridge CPUs are just amazing performers, they don't need anything better on the market right now. However I can't imagine AMD colluding with Intel on this. They are not in a good situation.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Intel's problem that everyone has been sounding the alarm on is that in the coming years being the x86 people with court mandated competiti
Re:Ya right (Score:4, Insightful)
I don't believe the bit about 32nm is accurate. I just ordered a new laptop with an AMD A6-3400 CPU. This is a Fusion-based chip and is 32nm.
As far as the performance claims regarding 6 core AMD chips, I have to agree with that. However, the cost of an Intel chip is not worth it. My 6 core AMD upgrade saved me hundreds of dollars. It still doubled my Starcraft 2 framerate over my Phenom 9600 X4.
Intel stuff is faster if you have the money. It's not fanboyism, just practical price/performance based on benchmarks.
Re: (Score:2)
I bought a nice 6 core Phenom II X6 1035T. It is underclocked to only 2.6 GHz, but for $450 I got 8 gigs of RAM and hardware virtualization support to run VMware. With the 6 cores and 8 gigs of RAM it rocks to have 3-4 VMs running for the price I paid.
With the ATI 5750 that came with it, games run reasonably well too. As soon as I upgrade the PSU I plan to flash the BIOS so I can clock my CPU to 3.2 GHz. Asus crippled the board, but there are hacks to get around it.
For value, the AMD phemom II is only 4-7% slo
Re: (Score:2)
Intel core-i5-2310 Sandy Bridge - $190.
AMD Phenom II X6 1090T Black - $170.
That's a $20 difference and BTW the i5 blows away the Phenom (any Phenom). You don't even need an i7.
Intel is able to price their CPUs at a bit of a premium over AMD, which is why Intel is rolling in money and AMD is not. But there's a good reason why Intel has that pricing power, and it's one word: "Sandy Bridge".
It is also true that the absolute highest-end unlocked Intel cpu is priced at a very serious premium... but if you are try
Re: (Score:2)
Re: (Score:2)
I've recently switched from all-Nvidia to AMD GPUs and was pleasantly surprised when the drivers horror story just didn't happen. Apparently the monthly release cycle did wonders for them, both on Windows and Linux.
Re: (Score:2)
Re: (Score:2)
Let me be clear, when I was talking about price, I was including the fact that I upgraded an existing system with a 6 core CPU rather than buying new RAM + motherboard + CPU to switch to an Intel chip.
There are workloads that AMD chips beat Intel chips. The benchmark mentioned by the other poster is an example. One benchmark does not prove anything, but I'm certainly happy with my purchase.
What I like most about the AMD CPU is that it's great for building packages. I use it occasionally on the MidnightBS
Re: (Score:2)
Intel's HD Graphics are still a joke (my amd netbook, 1.6GHz dual core, beats an i5 in graphics, ffs)
Re: (Score:2)
ECC is nice, sure - but if you buy RAM that works, it's not really necessary.
some people actually use their computer (Score:2)
Re: (Score:3)
Hm, actually, if you're going to have a pile of heavy-duty VMs running concurrently, a higher number of slightly less powerful cores are going to be much better than a
Re: (Score:2)
for something more useful than Starcraft
If you think you need more CPU for desktop than for gaming, you are doing something seriously wrong, no matter how many VMs you use. Seriously, check the hardware requirements for, say, Starcraft 2. It totally owned my machine before I last upgraded. The same machine that practically flies for development, VMs and computational fluid dynamics. Yes, it uses on-access virus scanning and W7.
4G ram
There's your problem. VMs are big. Swapping to hard disk is slow. More CPU won't help. You need more RAM.
Either that or y
Re: (Score:2)
It depends on the games you are looking at. Most of the first person shooters are highly GPU focused, with very little AI involved. Much of this comes from releasing the same game for consoles as well as PCs: developers go for the lowest common denominator, which generally means a low-end CPU, and even the graphics tend to avoid being cutting edge so the PC version stays almost identical to the console version.
Re: (Score:2)
If you're running numerous VMs, RAM is your problem. Given enough RAM, a core 2 duo will run several VMs in a test environment just fine. With 4GB ram, you can throw the fastest CPU at it you like; if you start running into swap, you're fucked.
In fact, that goes for almost every non-gaming task you'd use a box for these days, other than transcoding video and a few other CPU bound niche tasks.
Re: (Score:2)
Comment removed (Score:4, Informative)
Re: (Score:2)
Intel's compiler is hard to beat, and so a great many people use it. The fair test would be the intel compiler on Intel, and a compiler of AMD's specification on AMD's chips.
Re: (Score:2)
Re: (Score:2)
People don't really care about clock for clock these days. The kicker is energy efficiency and cost for performance.
AMD do reasonably well on energy efficiency and really well in the "bang for bucks" department. Yep, Intel currently outstrip them on the high end, but AMD have a lot of the mid-to-low-range market, and still have a good showing in the server market. I wouldn't exactly call that 'getting slaughtered'.
Still, as you say, they do need to get newer architectures out the door to keep bei
Re: (Score:2)
AMD do reasonably well in the energy efficiency
Where? Every benchmark I've seen puts AMD well behind the i3 and i5 in performance/power and the idle consumption of the i3 and i5 isn't much worse than an Atom.
Re: (Score:2)
To an extent yes, but Atom sucks, I mean seriously, Intel ought to have been too embarrassed to let that dog see the light of day. And yes, the Intel offerings do offer better battery life, but at a cost, the only ones I looked at were several hundred dollars more. Battery life is great, but with that much extra on the price tag you might as well just buy a couple extra batteries.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
The trouble with MSWindows 8 on ARM (if you even consider MS an option) is software from other companies. It won't run. It won't even install. (That's a prediction, not tested. But a reasonable bet.)
I don't expect MSWindows8 on ARM to have ANY effect on desktop sales, even at the extreme low end. It may be fine for phones, where there *aren't* any legacy problems. (Not the way I'd bet, but a possibility.) But I don't see any possibility for it on a general purpose computer. Going to run an X86 emula
Re: (Score:2)
Re: (Score:2)
Remember how emulating a x86 on PPC was a piece of cake
As someone who's written x86 emulators, that comment would have destroyed my laptop if I had been drinking coffee at the time.
Re: (Score:2)
Re: (Score:3)
That's benchmark. Real life usage is likely to be very different.
So the i5 uses less power than the best remotely comparable Phenom at idle, uses less power under 100% load, yet magically uses more power in between?
I guess it's possible, but not exactly likely.
Re: (Score:3)
... wait. Yes, I do. To get readers.
From the article:
The CPU's anticipated launch date is already close enough that the company should already know if it can launch the product or not; waiting until now to announce a delay isn't something Wall Street would take kindly. Moreover, AMD has been fairly transparent about its launch dates and delays ever since the badly botched launch of the original K10-based Phenom processor back in 2007. Llano has been shipping for revenue for several months, and we're not aware of any 32nm production troubles at GlobalFoundries.
Re: (Score:2)
On top of that, Intel is a node ahead in terms of fabrication. All Sandy Bridge chips, and many older ones, are on 32nm. AMD is 45nm at best currently.
No, the Llano chips are shipping and are 32nm SOI. However, those are only lower-power mainstream cores; they need the high-power Bulldozer cores to compete with Intel on performance. But yes, AMD still ships very many 45nm chips.
Re: (Score:3)
The current situation is Intel is slaughtering AMD.
Then why do I only buy AMD (and ARM) these days? Frankly, Intel just seems to fib about their power envelope every generation and I do not, repeat, do not like to be surrounded by noisy computers. Currently running a quietized 4 way Phenom II box, very happy with it. I have not been happy with any intel box as a workstation for quite some time. Nothing beats the Pentium M in my aging Shuttle for a basically silent server (21 db @ 3 meters). Every other Intel box I have run recently requires stupid amounts o
Re: (Score:2)
Then why do I only buy AMD (and ARM) these days? Frankly, Intel just seems to fib about their power envelope every generation and I do not, repeat, do not like to be surrounded by noisy computers. Currently running a quietized 4 way Phenom II box, very happy with it.
You'd have been happier with an i3 or i5. I can just hear the fans on my i5 server when I stand with my ears a few inches away from it.
Re: (Score:2)
Not at the same price point. For what he'd spend on the i3/i5 he could probably have water cooling or something else that is ridiculous. That is the thing these kinds of comparisons always leave out: cost. I've been running AMD systems for a while now; I can upgrade a box every two years where buying Intel would mean I'd be upgrading every four years. While in years 1-2 the Intel system would be somewhat faster, in years 3-4 the AMD system would be miles ahead. Plus I'm sinking less money into wh
Re: (Score:2)
That is basically my point. It is better to spend $200 on a motherboard+CPU every two years than $400 every 4 years.
If you spend $400 you get a marginally better system for the first two years, but the $200 system two years later will greatly outperform it.
Equipment like cases, optical drives, PSUs, etc that don't depreciate much is a different story - buy something decent and don't replace it.
A CPU or motherboard is a rapidly depreciating asset. Every month its value drops considerably. The best thing y
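The upgrade-cadence argument above can be sketched as a toy model. All numbers here are illustrative assumptions, not benchmarks: assume performance per dollar doubles roughly every two years, and that a $400 board+CPU combo is about 1.5x faster than a $200 one bought the same year.

```python
# Toy model of the "cheap build every 2 years vs. premium build every
# 4 years" argument. The doubling period and the 1.5x premium are
# assumptions for illustration only.

DOUBLING_PERIOD = 2.0  # years per doubling of perf-per-dollar (assumed)

def perf_at_purchase(price, year):
    """Relative performance of a build bought in `year` (year 0 = now)."""
    premium = 1.5 if price == 400 else 1.0  # assumed high-end premium
    return premium * 2 ** (year / DOUBLING_PERIOD)

# Strategy A: one $400 build at year 0, kept for four years.
# Strategy B: a $200 build at year 0, replaced by another at year 2.
# Total spend is $400 either way.
print(perf_at_purchase(400, 0))  # 1.5 -- strategy A, years 0-4
print(perf_at_purchase(200, 0))  # 1.0 -- strategy B, years 0-2
print(perf_at_purchase(200, 2))  # 2.0 -- strategy B, years 2-4
```

Under these assumptions the premium build wins for the first two years (1.5 vs. 1.0), but the refreshed cheap build wins for the next two (2.0 vs. 1.5), which is the trade-off the comment describes.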
Re: (Score:2)
You'd have been happier with an i3 or i5. I can just hear the fans on my i5 server when I stand with my ears a few inches away from it.
Both Intel and AMD give the TDP of their four core parts (i5 for Intel, Phenom II for AMD) as 95 watts. The difference is that I believe AMD. And I wonder about your belief that an i5 box would be quieter than mine with similar components, or your definition of what a few inches is, or whether you have the stereo on when you post to Slashdot. I have had people tell me with great assurance that an Xbox 360 is quieter than a PS3, or that they can't hear their PS3 when watching a movie, both patently absurd wi
Re: (Score:2)
Both Intel and AMD give the TDP of their four core parts (i5 for Intel, Phenom II for AMD) as 95 watts
And I've never seen the power consumption of the i5-2400 go much above 50W in benchmarks. So perhaps Intel are lying; the i5 seems to use about half as much power as they claim it does.
Perhaps rather than believing either company's numbers you should actually try measuring them?
Re: (Score:2)
There are worse things to be; AMD needs all the help they can get, and I really don't want the CPU market to become a one-horse race.
Re: (Score:2)
Re: (Score:2)
It's the desktop and server CPU market though. Anything else in those areas is just a tiny niche.
To be fair, ARM is likely to eat into the low end of that market over the next few years. My Atom-based server/DVR was fast enough until we got OTA HD here and it became too slow for transcoding, and the Ion Xbmc box is plenty fast enough for video playback or general desktop usage; my i5 laptop spends most of its time at 1.2GHz, where it's probably not much faster than an Atom.
So if ARM can produce a chip at least that fast (if they haven't already) I think there's a chunk of the x86 market waiting for the
Re: (Score:3)
There are Nvidia Tegra 2s being sold clocked at 1.2 GHz (dual core) right now. The Tegra 3 line will be quad core 1.5 GHz with 1.5 GB RAM. With Win8 supporting ARM, I can easily see ARM netbooks/laptops becoming commonplace within the next few years.
Re: (Score:2)
Guess what: web servers aren't everything.
Re: (Score:2)
Re: (Score:2)
Then why do I only buy AMD (and ARM) these days?
Because you're an AMD fanboy?
Or maybe because I get more for my money and I am pleased with the mips per watt performance?
Re: (Score:3)
No offense, but I'm typing this on the $350 15" Acer laptop with an E-350 (zacate fusion processor). I played portal 2, start to finish, on this thing at medium settings. It gets about 6 hours of battery life out of light web browsing. Intel may be killing AMD on the low end, but based on my comparison to a $600 HP probook with an i3-2ksomething, it's indistinguishable at web browsing and word processing, and the i3 just fails any time you try to run a game.
Not saying that the sandybridge i3 isn't a bett
Re: (Score:2)
Re: (Score:2)
Re:Ya right (Score:5, Informative)
Actually, in Opteron vs. Xeon, AMD is doing quite well. Clock speed only gets you so far if you're bottlenecked on memory bandwidth.
Re: (Score:2)
Of course this gets modded up even though Xeons have had more memory bandwidth than Opterons for over 3 years....
Re: (Score:2)
Opteron has 4 channels/CPU vs. Xeon's 3. Both are DDR3, at the same speed. Previous generations of Intel chipsets had problems achieving full speed on the memory bus in multi-DIMM configurations.
So what would make you say the Xeon has more memory bandwidth?
Re: (Score:2)
What are you talking about? Current Opterons have a 2-chip MCM module (see the AMD fanboys squirm over that after they denigrated Intel for doing it first). Each chip in the module has a 2-channel memory controller that isn't any different from the ones on the desktop Phenoms. Meanwhile, Intel has been selling single-chip solutions with 4 memory channels since 2009, with lower-end models having 3 channels (Intel also had a "real" 8 core CPU out before Bulldozer despite AMD's marketing claims that they inve
Re: (Score:2)
The high end Xeon chips had THREE memory channels according to Intel right up till the last batch released Q2 of 2011, did they lie or did you mean for over 3 weeks (rather than years)? Note that the Westmere runs in the mid-2 GHz speed range and at 130Watts.
Personally, I don't care if there's one or 2 dies in the package for the 8 and 12 core chips (though if the workload is memory bound, the 12 core shouldn't be used); it doesn't seem to matter much since there will be 2 or 4 chips on the board anyway. F
Re: (Score:2)
"The current situation is Intel is slaughtering AMD"
Frankly, in the consumer space ARM is slaughtering Intel. The truth is that today 90% of all desktops have more than enough CPU power. The most intensive thing most computers do today is play back HD video. Sure, the i7 Sandy Bridge is blindingly fast, but most people don't need the speed or the price tag. The new A8s and i3s show where the future is going: fast enough, with good enough graphics, at a low price.
Re: (Score:3)
Basically you are right. AMD has nothing even remotely close to SandyBridge and Bulldozer won't get them there either. I've been a long-time AMD fan, and over the years AMD has saved me bundles of money with their socket compatibility.
But AMD has to make a socket switch now and there are way too few AM3+ mobos available. Not only that, but the mobos that are available are wired for compatibility: they will work with AM3+ CPUs but they won't be able to make use of all the new performance capabilities. So
Re: (Score:3)
The general tradeoff between Intel and AMD is that AMD optimizes its instructions to run fast from cache and poorly when cache misses occur. Intel optimizes its instructions to run fast (as is possible) when cache misses occur and to run modestly otherwise. That's the best way I can describe it.
For a long time AMD was able to compensate by placing larger caches on the cpu die, and Intel didn't care and generally had smaller caches. That changed with core 2 duo and later chips (particularly SandyBridge).
Re: (Score:2)
Re: (Score:2)
Perhaps it's Intel wanting to keep AMD alive for anti-trust reasons. They could very well continue to slaughter AMD, but is it in the best interests of Intel? If AMD dies, then Intel's going to ge
Re:Ya right - fabrication (Score:2)
45 nm is correct for the AMD Phenom II series. But the "Llano" APUs for low-end desktops are already in 32 nm. So you could say Intel is half a step ahead right now. Overall, however, I agree that AMD is under pressure and cannot afford artificial delays in their products.
What they still have are some niches where Intel has slacked off or does not compete for other reasons. The most important one right now are the APUs. AMD's Brazos platform does well on netbooks, and IMHO the LLano is a good choice for che
Re:Ya right (Score:4, Informative)
The current situation is Intel is slaughtering AMD.
Except where AMD is slaughtering Intel, of course.
HP, Asus, MSi, and Lenovo have all adopted the E-350 over the Atom alternatives in notebooks and low end laptops.
Remember that notebooks and laptops are replacing desktops in the typical home. Intel is probably pretty worried that they have absolutely no competitor to the E-350 that doesn't both cost significantly more and draw significantly more power.
Re: (Score:2)
Llano. AMD have been fucking up badly recently but even they wouldn't name their new chip Lamo.
Re: (Score:2)
Llano, on the other hand, has much better graphics compared to an Intel Atom, hell, even a full-speed i5! Bulldozer will have even better graphics. If you just run IE 9/10, Flash, and Office, the AMD Llano and Bulldozer will seem faster and less choppy. They can also run games like World of Warcraft. Atom subnotebooks suck and can't do these things or run these games.
Llano has much better graphics than Intel's offerings, too good in fact. They stuffed in far more performance than has ever been seen on an embedded GPU, and bottlenecked it severely on the CPU's memory bus. It can't run full speed. It can't even run half speed without running into bandwidth limits. There is good reason why its discrete brethren are reaching into the triple digits. All they're doing is sucking down more power and real estate on silicon that can't be properly used.
Bulldozer won't change
Re: (Score:2)
It can share fine if it uses the same dedicated memory controller without doing an interrupt to the CPU or chipset each time it needs to access ram. That is what crippled the other integrated chipsets.
No it's not. Older integrated GPUs were crippled by low core performance, not memory bandwidth (though, to be fair, they couldn't have high core performance because the memory bandwidth was so low).
Modern CPUs want a lot of memory bandwidth. Modern GPUs want a staggering amount of memory bandwidth (the GTX 580, for example, has around 200 gigabytes per second and even the 8600 GTS has 32 gigabytes per second, whereas a dual-channel DDR3-1600 system only has 25 gigabytes per second). Stick both of those on one c
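The bandwidth figures quoted above follow from a simple formula: peak bandwidth is the effective transfer rate times the bus width times the number of channels. A quick back-of-the-envelope check (the GTX 580's ~4008 MT/s effective memory clock is an assumed public spec):

```python
# Back-of-the-envelope check of theoretical peak memory bandwidth.
# bandwidth = transfer rate (MT/s) x bus width (bytes) x channels

def bandwidth_gb_s(mt_per_s, bus_width_bits, channels=1):
    """Theoretical peak memory bandwidth in (decimal) GB/s."""
    return mt_per_s * (bus_width_bits // 8) * channels / 1000.0

# Dual-channel DDR3-1600: two 64-bit channels at 1600 MT/s.
print(bandwidth_gb_s(1600, 64, channels=2))  # 25.6 GB/s

# GeForce GTX 580: 384-bit GDDR5 at ~4008 MT/s effective (assumed spec).
print(bandwidth_gb_s(4008, 384))             # ~192.4 GB/s, i.e. "around 200"
```

Which matches the comment's point: even a fast dual-channel desktop memory system delivers roughly an eighth of what a high-end discrete GPU gets, so an integrated GPU sharing the CPU's memory bus is starved.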
Re: (Score:3)
Intel only beats AMD with their most recent SandyBridge chips
Intel has been beating AMD since the Core-2, only AMD fanboys claim otherwise. Mostly by saying 'but, but, if you run benchmarks at 3840x2160 then the CPU is irrelevant'. Well, duh.
Re: (Score:2)
Intel has been beating AMD since the Core-2, only AMD fanboys claim otherwise.
Uh what? Maybe since i5. Core 2 was never that exciting.
Re: (Score:2)
It may not have been that exciting, but it had AMD beat in pure performance by a comfortable margin.
Re: (Score:2)
Ivy Bridge can't compete with ARM... but similarly, ARM can't compete with Ivy Bridge either. IB will have its biggest advantage in thinner-lighter notebooks (Macbook Air & the new Ultrabooks being put out by lots of different vendors). The projected power envelope is about 17 watts which is much much higher than even the tablet-level ARM chips. At the same time, its performance will destroy anything that ARM will have in the next 5 years (the newest ARM chips that are coming out next year are just b
Re: (Score:2)
I think this was true ~8 months ago but Intel mobos are priced about the same as AMD mobos these days... really ever since SandyBridge came out. There is so much chip integration now that the only real differentiation between mobos is added features and BIOS software.
Most of the costs involved in building a gaming system are unrelated to the cpu. I don't count built-in graphics as being decent, though you might, and I've fried enough systems with cheap PSUs that I don't buy cheap PSUs any more. So my con
Re: (Score:2)
It depends what you mean by server. The twelve core AMD chips beat anything Intel can sell you if you have tasks that are going to use as many cores as they can get. Stick four of them in a SuperMicro board with a bit of memory and you have something that will outperform a far more expensive four socket Intel Xeon based machine.
Windows 8 (Score:2)
Gee, those delays mean the brand new shiny chips will just happen to come out with Windows 8. Coincidence?
Not only can you finally ditch that aging Vista or XP machine for shiny Windows 8, but now you can have a shiny new CPU too!
Re: (Score:2)
Possibly, but realistically most people would do fine with AMD's Fusion core processors, the ones they've already released. There are legitimate reasons to have more power, but for the things that people typically do, it's more than enough power.
Re: (Score:2)
As would my 486DX and debian with you.
Re: (Score:2)
Re: (Score:2)
Nice! Specs? How many gear-yards do you spin?
Re: (Score:2)
old news? (Score:3)
Probably not relevant to Moore's Law (Score:4, Interesting)
The most naive question to ask is whether this sort of delay is relevant to Moore's Law and similar patterns. There are a variety of different forms of Moore's Law. We've seen an apparent slowdown in the increase in clock speed http://www.tomshardware.com/reviews/mother-cpu-charts-2005,1175.html [tomshardware.com]. The original version of Moore's Law was about the number of transistors on a single integrated circuit, and that's slowed down also. A lot of these metrics have slowed down.
But this isn't an example of that phenomenon. This appears to be due more to the usual economic hiccups and the lack of desire to release new chips during an economic downturn (although TFA does note that this is a change from Intel's normal approach to recessions). This is not by itself a useful data point, so there is no further need to panic.
On a related note, there's been a lot of improvement in the last few years simply from making algorithms more efficient. As was discussed on Slashdot last December http://science.slashdot.org/story/10/12/24/2327246/Progress-In-Algorithms-Beats-Moores-Law [slashdot.org], by a variety of benchmarks linear programming has become 40 million times more efficient in the last fifteen years, and only a factor of 1,000 or so is due to better machines, with a factor of about 40,000 attributable to better algorithms. So even if Moore's Law is toast, the rate of effective progress is still very high. Overall, I'm not worried.
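The factors quoted above are multiplicative, which is easy to verify: a ~1,000x hardware speedup combined with a ~40,000x algorithmic speedup gives the overall figure.

```python
# Sanity check on the quoted linear-programming speedup factors:
# the hardware and algorithm improvements multiply together.
hardware_factor = 1_000     # speedup attributed to better machines
algorithm_factor = 40_000   # speedup attributed to better algorithms
print(hardware_factor * algorithm_factor)  # 40000000 -- "40 million times"
```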
Re: (Score:2)
Does it have to be? Moore doesn't work at Intel anymore so they might have a new plan. It was going to have to hit a physical limit at some point anyway which Moore would have very clearly known when he proposed it in the first place.
Re: (Score:2)
AMD! (Score:2)
Re: (Score:2)
I don't buy Intel necessarily for the CPU, I buy Intel for the supporting chipsets. Intel chipsets in most instances are rock solid....with excellent driver support for both Windows and Linux. That being said, I'm glad AMD exists....if AMD didn't exist, it would be necessary for Intel to create one ;)
-M@
Stagnant Economy (Score:2)
I have to wonder how much of this is due to the stagnating economy in much of the developed nations. My recollection is that the last time the economy went south, all sorts of projects were either postponed, put on hold, or simply ended.
Waiting on the die. (Score:2)
Processor technology is at a point where more is gained by shrinking the die than by changing the design. Because of SMP, sophisticated design changes are not needed to gain performance. Intel knows they can seriously move ahead of AMD if the next processor is successfully reduced to 22nm. They might as well wait. AMD isn't able to drop cash on reducing die size on every other release. If they can put off releasing their next processor to a point when they can afford smaller-die production, they will get a lot more out of i
Re: (Score:2)
" would have made it way ahead of the curve predicted by Gordon Moore"
They are ahead of the curve. Intel is postponing because of their recent tri-gate tech. Intel could still release 22nm right now if they wanted, but it wouldn't be worth it with such a huge difference in power leakage.
Now, if you want to look at processing power, look at GPUs. They're beyond doubling every 18 months already and are expected to approach 2.5x-3x/18months in the next 2-3 years.
Re: (Score:2)
Re:Seems perfectly reasonable (Score:4, Interesting)
I was surprised that the original date, when it was planned to be released back then, would have put it way ahead of the curve predicted by Gordon Moore. I was saying that it should be delayed some time so it would actually be closer to the prediction, and it has been delayed; however, I will never know whether it was done for that reason.
So... If I'm reading you correctly... the reason that they should have delayed the hardware wasn't because of something like it being too expensive to produce for expected market, too difficult to produce in sufficient yields, or any other technical or business reasons that might exist, but because the number of transistors involved didn't match up to a prediction made 30-some-odd years ago?
You realize that prediction has only "come true" when you average the graph over a very long period, and there are significant statistical outliers (that represent significantly successful chips in their day) along that plot?
Wait, wait... you're trolling right? I admit, you got me!
Re: (Score:2)
Re: (Score:2)
A quick Google would answer your own questions. The K5 was released (it was a competitor to the Intel Pentium and IBM/Cyrix 586), and the K6 was up against the PII and suffered from major thermal problems. While it did give Intel a little bit of a worry as far as sales go, the K6/K6-II weren't exactly powerful. It wasn't until the Athlon (K7) that Intel shat themselves.
Success is a very loose term for a processor with major problems.
Re: (Score:2)
Re: (Score:2)
I thought 3dnow! appeared in K6-2?
That was a decent chip for a while (around 300MHz, for the price; they didn't OC as nicely as cellies though); the alternative was buying a Celeron and overclocking it.
(disc, I got a pair of jeans with a k6-2 550mhz)
Re: (Score:3)
Re: (Score:2)
Yeah, my K6-2, which was stock 300MHz, would only go to a bit under 400MHz OC'd; no amount of cooling helped, and the 300MHz Celerons of that time did 450MHz fairly regularly. After that I went with a K7, and then a Duron.
Been rocking on Intels for a few years lately though, because I've been content with work-provided workstation laptops. They're beasts compared to what used to be usual, and can even play new games well.
lately
Re: (Score:2)
Geez, this brings back memories...
OCs, all the new chips, chipsets, MBs, from the early P5/P6s to the P4
But the market had its way, and it's cheaper to buy a new system (NB) nowadays. Not to mention speeds really haven't gone up like they used to (clock speeds in particular, but yeah, the MHz is a myth).
We now do everything we need, except for the ultra needs of a few percent of the people. And you can always fire EC2 instances for sheer computing power.
Re: (Score:2)
Depends how you measure. I've recently gone from a Q6600 Core2 Quad, to an i7 2720 in my new macbook pro. Yes, desktop, to mobile CPU.
The i7 kicks the living shit out of my Core2 in transcoding video. Sure, it's a bit of a niche task, but CPUs have in general been "fast enough" for the day-to-day non-niche desktop crap since the original Pentium, really.
Re: (Score:2)
Re: (Score:2)
any AM3+ motherboard will be able to support the next generation of CPUs alone is a good enough win in my book. As soon as the box is delivered I have to have my computer upgraded in under 10 minutes
FTFY
No, but seriously, we went from AM2 to AM2+ to AM3 to AM3+, so compatibility is not that great. I just got my AM3 box last month (because going Intel would cost me about $100 more and perform a tad slower in Premiere) and I don't know if I'll be able to upgrade to Bulldozer since not all AM3 boards will work with AM3+ CPUs. An AM2+ Phenom II X4 920 owner will certainly have to buy a new mobo. Sure, it's a bit better than what Intel does, but you can only go one generation further and, frankly, I don't t
Re: (Score:2)
We're simply looking at an increasing gap in the demand for CPU power.
For most people (documents, spreadsheets, email, browsing) an i5 is more than powerful enough to satisfy the current demand.
OTOH there are specialized areas (visual effects, simulation,...) where even the current top-of-the-line models of Sandy Bridge Xeons reach their limit way too easily.