Why Intel Leads the World In Semiconductor Manufacturing
MrSeb writes "When Intel launched Ivy Bridge last week, it didn't just release a new CPU — it set a new record. By launching 22nm parts at a time when its competitors (TSMC and GlobalFoundries) are still ramping their own 32/28nm designs, Intel gave notice that it's now running a full process node ahead of the rest of the semiconductor industry. That's an unprecedented gap and a fairly recent development; the company only began pulling away from the rest of the industry in 2006, when it launched 65nm. With the help of Mark Bohr, Senior Intel Fellow and the Director of Process Architecture and Integration, this article explains how Intel has managed to pull so far ahead."
What's the mystery? (Score:5, Funny)
Andy Grove paid billions to get access to Area 51 alien technology back in 1998. What's so hard to understand?
Re:What's the mystery? (Score:5, Funny)
Andy Grove paid billions to get access to Area 51 alien technology back in 1998. What's so hard to understand?
Ah, it's that chip from the android that came from the future. What could possibly go wrong?
Re: (Score:3)
Andy Grove paid billions to get access to Area 51 alien technology back in 1998. What's so hard to understand?
Ah, it's that chip from the android that came from the future. What could possibly go wrong?
It was the chip used by the mother ship in Independence Day that could run the virus from Goldblum's Powerbook. It already had cross platform virtualization technology and was years ahead of its time.
Re: (Score:2)
except the aliens were still running IPv4, I guess for some legacy servers...
Maybe there's an Area52, in Tel Aviv (Score:2)
Remember Pentium M?
Intel had to rely on the Pentium M to pull itself out of that big sinkhole back then
Re: (Score:2)
And use its monopoly so OEMs would not use AMD processors
Re:What's the mystery? (Score:5, Insightful)
Company A produces a better product than Company B.
Company A has better marketing than Company B.
Company A's prices are nearly the same as Company B's.
Company A for the win.
No conspiracy, no evil. Their customers want a good product at a fair price, and that is what they provide.
Right before Intel released their Core processors, AMD had a very strong showing. Then Intel released a much better product, took their #1 spot back, and put distance between themselves and their competitor.
Now AMD will need to make a much better product, market their product better, and/or lower their costs.
And AMD got kind of an unexpected break (Score:5, Informative)
So the original Athlon was a shot out of the blue; it was the first AMD chip that really competed with Intel chips. Intel had to stop sandbagging and release faster P3 chips (it was capable of making them, it just wasn't because it didn't need to). AMD legitimately brought some serious competition. It was badly hamstrung by having horrible, horrible motherboard chipsets, but there you go.
Now the Athlon maintained competitiveness the next generation... because Intel fucked up. Their NetBurst architecture wasn't very good. I don't fault Intel on this; their research showed it would scale really well MHz-wise, possibly up to 10GHz, so the slower IPC wouldn't matter. However, it didn't scale, so they had a slower architecture compared to AMD. The problem? AMD wasn't updating. They just kept doing minor rehashes of the same thing.
Then, as you say, Intel dropped Core. They hadn't been standing still; they never do. They corrected the mistakes of NetBurst and made a chip that was very fast per clock. AMD was still playing with old tech, and Intel pulled way ahead. Then it got even worse as it continued: Intel kept revising their chip while AMD kept playing with the same basic thing. Their Bulldozer launch got pushed back and back. When it finally did happen recently, it was not at all competitive with Sandy Bridge, and of course Intel has now just launched Ivy Bridge.
So AMD's initial competitiveness was no fluke; they dropped a good product. But the length it went on was kind of a fluke, since Intel screwed up and AMD didn't do anything to improve their tech in a big way.
Re:And AMD got kind of an unexpected break (Score:5, Interesting)
It might be worth pointing out that Core wasn't on the roadmap. It was a happy accident.
The design came from the Pentium M, which was just a rehashed Pentium 3. The P4 "NetBurst" had been on the roadmap for a decade when it came out, to be followed by IA-64.
The Pentium M was intended to be a "mobile" chip to put into mid-range laptops where the P4 was too big and hot. The then relatively unknown Israel design team was put on it and produced a really remarkable product that scaled far better than they expected. As a result, after release, they were set to the task of improving it and reworking it into a real desktop chip (the Core) and then, because it was still so tiny, the Core Duo and later the Core 2 Duo.
Talk about flukes. Sometimes engineering stems from them. It's not that they did it wrong; it's just how it is.
As far as I know, they are still benefiting from some of the amazing hand layouts that were done on the Pentium M and early Core chips. Nobody else would even consider doing a manual layout on a modern chip. They had a few people who did just that, and it made all the difference.
Re:What's the mystery? (Score:5, Insightful)
Re: (Score:3)
And of course, on price/performance AMD has been ahead of Intel any number of times.
Re:What's the mystery? (Score:5, Insightful)
Uh, there's a reason many Linux/BSD distros refer to their x86-64 port as "AMD64" - AMD *invented* the 64-bit x86 extension. And they've always been just slightly ahead of the curve on multi-core - for quite some time, AMD had viable dual-core desktop processors while Intel's were nigh-unusable due to heat and performance.
And there's a reason many supercomputers are built around massive piles of Opterons - AMD makes a superior massive-number-crunching processor.
Not to mention that the classic Athlon was flat-out *better* than the Pentium III. Even as early as the K5, AMD had technically-superior designs held back by implementation issues. Just like Bulldozer, come to think of it. It's a very *interesting* design, and I'm not entirely convinced the rather obvious shortcomings are due to faulty design work, rather than faulty production work.
Oh, and the Fusion "APUs" are great low/middle-end laptop chips. Far, far better integrated graphics with comparable CPU performance and power draw, compared to Intel's offerings. If I were to buy a laptop for standard home usage, I'd grab one of those.
Then there's the whole graphics thing. Sure, you could argue that's more ATI than AMD, but they're definitely beating Intel in the graphics market, that's for sure.
Intel makes for awesome Linux boxes. (Score:5, Insightful)
My desktop right now runs Windows on a first-generation Core i5 with an AMD Radeon 6870 added in. When that machine gets replaced with another gaming Windows machine in a year or two, I'll be pulling the AMD graphics out of it and running on the i5's integrated Intel graphics. It will be super low-maintenance in Linux. None of this rebuilding fglrx or nVidia modules every time you upgrade the kernel.
When I go looking for a Linux machine, the very first thing I check off is "Intel graphics." Yup, then it's a buy.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Just for the record... I've used an Ubuntu box for 5+ years of WoW and now SWTOR, under Wine. The proprietary nVidia drivers are easy to install and work well under Linux, and have coped with my moves from a 7600 to a 9600GT to a 460.
Is it masochism? Yes, a bit, but I've not really got much use for a Windows box aside from gaming, so I thought I'd give it a go.
Re: (Score:2)
Just so you know, the new on-chip video stuff is pretty damn good.
Soon it will be like the sound chip: good for almost everything.
This is the same thing we have gone through many times.
Networking:
Card
On board but sucks, use card
Chip price falls
On board pretty good, use card for network gaming
Corporations stop using the card.
Mobo manufacturers hire network chip experts.
On board good for everything except some extreme situations.
Same thing for sound.
Same thing will happen to graphics.
There will be the really hig
Re: (Score:3)
There are open source drivers for Radeon too. They might not perform as well as the closed drivers, but they still outperform most Intel cards while being just as convenient; plus you have the option of using the closed drivers if you want the extra performance.
Re: (Score:2)
They don't work at all on the R690M. Never buying anything with ATI in it again unless it's dirt cheap used, and NO MORE LAPTOPS WITH ATI GRAPHICS EVAR. It's only fairly recently that the open source drivers don't actually choke on all the various Rage, Rage Pro, etc. chips. I know because I have three old laptops with 'em and they always gave me lots of grief. Don't get me started on the radeon driver.
I hope one day that the OSS video drivers are worth a damn, but until then, there's nVidia. Well, and Intel, but ser
Re: (Score:2)
Re: (Score:2)
Intel, with their open-source graphics stack, makes for some of the easiest-to-maintain Linux boxes around. I'm typing this right now on Arch with Intel graphics. Sure, they don't have a lot of "gaming punch" but they are darn stable and just work with Linux.
If you don't need gaming performance, you can go with anything on the market. I know that the AMD open source drivers are very stable and support Compiz-like effects, and the same is probably true for NVidia.
Re: (Score:3)
This is quite true; Intel is the way to go if you don't want to game, no doubt about that. You even get some pretty good 3D and graphics acceleration for movies and stuff as well.
Nvidia plus the binary blob is the way to go if you *DO* want to game, and you'll probably want a distro that includes the binary blob to avoid manual installation issues (like Arch).
Re: (Score:2)
Re: (Score:2)
Hmm, I have an older MacBook with Intel graphics and I've had quite a bit of trouble with Xorg on it. I've found a version that mostly works and I'm scared to upgrade further for fear of breaking it. Maybe I'm just unlucky.
Not that ATI and NVIDIA are great either. Back when I last worked with an ATI card in a Linux box, fglrx had a nasty habit of crashing the system when it couldn't find its module, and the nVidia Linux binary driver wouldn't work with my latest card (an EVGA GT 430 dual DVI, interestingly
Re: (Score:2)
FWIW, the GTX 680 supports up to four displays per (single GPU) card. I haven't tried it yet, but I would expect it to work much better than the 4 monitors across 2 cards support that one was limited to in the past.
Multimonitor across multiple cards has never been stellar in any OS, probably because it's a very niche use case, but multimonitor on a single card is a very common use case.
Re: (Score:3)
Have you ever seen a Mac Pro with multiple cards? It's flawless. I've built advertising/media displays with up to 16 monitors. Never an issue.
Mark Bohr? (Score:2)
Answer: Taiwan (Score:2)
Explained in "Great by Choice" (Score:2)
good silicon, bad architecture (Score:2)
If Intel had kept their perpetual ARM license, they could rule the world. But even with cutting-edge fabs, they are going to be overrun by a more ubiquitous CPU architecture.
Intel should stop making x86s. And especially stop the nonsense of trying to use x86s to compete against stream processors in GPUs and HPC. But it is really too late for them; they gave their ARM license to Marvell (who have basically pissed away a good opportunity as well by not aggressively pursuing new designs based on the li
Apple is not a semiconductor company (Score:5, Informative)
Apple is a product company. It designs its products, and then someone else makes them. Many components, like the processor, are third party, and companies like Apple design a system around them.
After that the design goes to Samsung, and it's manufactured by Samsung. I think Samsung uses TSMC fabs.
So if Apple wanted to have a 22nm chip, it could:
1. Build a fab (invest many billions)
2. Pay TSMC and partner with them on the tech (invest some billions).
Return on investment may not justify the cost.
As you go smaller, you do gain an area and cost advantage, but you also run into a lot of issues related to physics. So 28->22nm is not easy, and it's really commendable that Intel has done it.
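To put a rough number on that area advantage, here is a back-of-the-envelope sketch. It assumes ideal linear scaling of every feature, which real designs never achieve, and it ignores design rules, analog/IO blocks, and yield entirely:

```python
# Back-of-envelope die-area scaling for a process shrink.
# Assumes ideal linear scaling of every feature; real designs shrink less,
# and this ignores design rules, analog/IO areas, and yield entirely.

def area_ratio(old_nm: float, new_nm: float) -> float:
    """Ratio of new die area to old die area for the same design."""
    return (new_nm / old_nm) ** 2

for old, new in [(28, 22), (32, 22)]:
    r = area_ratio(old, new)
    print(f"{old}nm -> {new}nm: die is ~{r:.0%} of the original area "
          f"(~{1 - r:.0%} smaller)")

# 28nm -> 22nm: die is ~62% of the original area (~38% smaller)
# 32nm -> 22nm: die is ~47% of the original area (~53% smaller)
```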
Re: (Score:3)
Indeed, there's a reason fabs are seen as national treasures.
Re: (Score:2)
As you go smaller, you do gain an area and cost advantage, but you also run into a lot of issues related to physics. So 28->22nm is not easy, and it's really commendable that Intel has done it.
There's also the limited competition for TSMC. Since nobody but Intel has access to Intel's plants (partnering with FPGA companies doesn't count), the rest of the market "has to" go with TSMC even if they're behind Intel. Their competition is GloFo and UMC, neither of which is impressing much. So what if Intel has 22nm? AMD still has to buy from TSMC. nVidia still has to buy from TSMC. Apple still has to buy from TSMC. They simply don't feel the pressure that their customers do; they sell and make a profit anyway
Re:Apple is not a semiconductor company (Score:5, Informative)
Let me say a few words here, as I worked in the semiconductor industry for over 28 years. So you fully understand just what it means to build a semiconductor foundry these days, here is a thought experiment I worked through a few years back:
1) You want to build a facility for manufacturing a widget.
2) That facility will cost you between $3B and $5B.
3) In order to justify the ROI on that facility, you need to take at least 5% of the total worldwide market share for that widget.
4) You get to scrap your factory in 3 years.
My numbers may be a little outdated today, but that only means my cost projections are too low, as well as the total market share. From a purely accounting standpoint this is nuts. When I got into the business in the early 70s there were hundreds and hundreds of fabrication facilities. Every start-up had its own fab. Today you can count the premier companies that have fabs on maybe one hand, and the total number of significant players in the semiconductor market with their own fabs on both hands.
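A quick sketch of that math, using only the figures from the thought experiment above; the straight-line depreciation model is deliberately simplistic and ignores staff, materials, power, and any profit:

```python
# Minimal sketch of the ROI logic above. The fab cost and 3-year life come
# from the parent post; everything else is a deliberate simplification.

fab_cost_usd = 4e9        # midpoint of the $3B-$5B range quoted above
useful_life_years = 3     # "you get to scrap your factory in 3 years"

# Gross margin the fab's output must generate each year merely to pay for
# the building and equipment before they are scrapped:
required_per_year = fab_cost_usd / useful_life_years
print(f"Needed just to cover the fab itself: ${required_per_year / 1e9:.2f}B per year")
# -> about $1.33B per year, before a single wafer, engineer, or kWh is paid for
```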
Intel deserves very high kudos for what they have accomplished. The risk they take is enormous, but they demonstrate time and time again what a manufacturing powerhouse they really are.
Fab has become a service (Score:2)
You are right, and you are also right that your numbers are on the low side.
Consider the auto analogy.
If you start an auto company, for parts like common rails you will mostly buy tech from either Bosch (likely) or Delphi;
you won't start from scratch.
Same deal here. Designing a chip does not require much investment.
As you go further down the line, you need more investment.
For example:
1. Algo development - chip architecture - basically an algo or an idea
2. RTL (actual behavioural model) - now you need simulators
Re:Fab has become a service (Score:5, Funny)
"Consider the Auto analogy."
No.
Re: (Score:3, Informative)
Intel's really in another league.
There's a reason Intel shows yields on a log scale. Hint: it's not because they're low. The rest of the industry is reasonably happy at 50-70% yields (and everyone knows it, since every buyer sees the yields on the chips it's buying). That's why Intel dominates. Getting to smaller feature sizes means they can make smaller die, which means more die per wafer, which means cheaper CPUs. People arguing over the little performance gain are missing the fact that going from 32nm to 22n
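A rough sketch of why that matters for cost per chip. The dies-per-wafer formula is a standard approximation (area term minus an edge-loss term); the die areas, wafer cost, and yield figures below are illustrative guesses, not any foundry's real numbers:

```python
import math

# Why smaller dies and higher yield translate into cheaper CPUs.
# Die areas, wafer cost, and yields are illustrative, not real foundry data.

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2: float, yield_fraction: float,
                      wafer_cost_usd: float = 5000.0) -> float:
    return wafer_cost_usd / (gross_dies_per_wafer(die_area_mm2) * yield_fraction)

# A 200 mm^2 die at 60% yield vs. a shrunk 150 mm^2 die at 80% yield:
for area, y in [(200, 0.60), (150, 0.80)]:
    print(f"{area} mm^2 at {y:.0%} yield: {gross_dies_per_wafer(area)} gross dies/wafer, "
          f"~${cost_per_good_die(area, y):.0f} per good die")
# -> roughly $27 vs $15 per good die under these made-up assumptions
```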
Re: (Score:2)
You don't notice the price getting cut in half because you're the end user. When Intel sells to motherboard manufacturers such as Asus, Gigabyte, and all those hundreds of Chinese and Taiwanese firms, it's constant price negotiation. In order to build in periodic price reductions, these die shrinks need to happen; it's not just Intel's margins that are supported by it.
I agree about the advantages of owning one's own fabs. It's not just the turnaround time, but when a company owns its own fabs, then in o
Re: (Score:2)
Re: (Score:3)
There is plenty of demand for ICs built on non-leading-edge technology. For instance, On Semiconductor has custom foundry services at 0.18u, 0.25u, 0.35u, 0.6u, and 0.7u. 0.35 micron is about 14 years old now, IIRC.
Re: (Score:3)
I think he may have meant an accounting write-down, so that the fab is depreciated and anything produced by it is a windfall. This is a result of annually depreciating the new equipment that gets installed in a fab, which is typically used for 1 or 2 product nodes. Once most production has moved to a new process, the company can either use this node to produce something low-tech and cheap, whose cost would be supported by the fact that the fab is depreciated, or, if no such opportunity is there, th
commendable maybe (Score:2)
It's notable. It'll be commendable if the huge investment they made to pull it off gives them an advantage worth more than the investment. You can't know that until you've gone to mass production and seen what the problems are.
Re: (Score:2, Informative)
Anobit and P.A. Semi are fabless companies. They design chips but do not manufacture them.
Re: (Score:2)
Doesn't matter. A lot of semiconductor companies design, but avoid the cost of owning and running their own fabs. Apple's ownership of PA Semi and now Anobit was a pretty low profile entry into the semiconductor market.
But even there, they never took the initiative in developing their own architecture. PA Semi was doing a PPC, and Apple could easily have taken that and had them put together a roadmap that the Mac platform needed. Instead, the team was converted to an ARM team. Apple's role in semicon
Re: (Score:2)
Re: (Score:2)
Re:How come Apple (Score:4, Insightful)
Let's see. How come GM doesn't have these tire factories that Firestone and Michelin have? I wonder... GM is making a heck of a lot more money than Firestone, though...
Re: (Score:2)
This decade or the previous decade? The 2000s were pretty sad, but in the 2010s GM has shown a pretty robust recovery.
The example doesn't really fly though, GM still makes a lot of its own parts, they haven't farmed out the core product.
Re: (Score:2)
The example doesn't really fly though, GM still makes a lot of its own parts, they haven't farmed out the core product.
What do they actually make themselves aside from bodies, blocks, and differentials/axles? Do they even make all that themselves any more? Are they still casting in this country?
Re:How come Apple (Score:4, Interesting)
Re: (Score:2)
Apple takes technology designed and manufactured elsewhere (though they may make some minor tweaks), combines it with their own software, and turns it into slick products that people are prepared to pay a premium for. That requires design and marketing prowess but not any great technical capability.
Re:Hmmm. (Score:5, Funny)
Spying... on their competitors who are all years behind them?
You must be pretty high up in the CIA to have thought of such a genius spying scheme.
Re: (Score:2)
Can't really chalk MSAA up to anything AMD did, more what they didn't do... they didn't throw as much money at devrel as NVIDIA.
Because of Tim Sweeney's anti-PC stance and the universal adoption of his engine, DX10.1/11 truly portable MSAA implementations have been hugely delayed, and most devs have chosen to go with NVIDIA's free on-site developers and their DX9 MSAA hack. AMD has been able to get this to work on their hardware, but it's artificially restricted to NVIDIA hardware without hacked binaries (whi
Re: (Score:3)
Is this true or just trolling?
Every benchmark I've seen so far has shown performance increases and power consumption decreases at around the same price.
If your statement is true then that suggests a lot of review sites out there are spoofing their results and that's very very bad.
Sure, if you have a Sandy Bridge chip there isn't a whole lot of reason to upgrade, because for most users anything made in the last 5 years or so can handle everything you'd want to throw at it.
Re: (Score:2)
Is this true or just trolling?
From what I understand it's about 30% more efficient; I don't know about it being any faster, though. I also thought it could power down cores independently.
Re: (Score:2)
Re: (Score:3, Informative)
Most of the complaints are geared towards the overclocking space; it doesn't go as far as the old Sandy Bridge did.
http://www.overclockers.com/intel-i7-3770k-ivy-bridge-cpu-review/
http://www.overclockers.com/ivy-bridge-temperatures
http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/
http://www.anandtech.com/show/5763/undervolting-and-overclocking-on-ivy-bridge
etc.
Re: (Score:3, Interesting)
It's true, more or less. [slashdot.org]
Re: (Score:2)
Not exactly true, but also not exactly false. They're packing just over 400M additional transistors on it, most of which are dedicated to graphics processing. Some minor tweaks to the cores give a very modest improvement over a Sandy Bridge at the same clock. When Intel packed the cores into a much smaller space simply to make room for the larger on-board graphics unit, they actually shot themselves in the foot a bit. Very similar thermal loads are having to be displaced through a much smaller ar
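To put rough numbers on that thermal-density point: the 95W/77W TDPs below are the official figures cited elsewhere in the thread, while the die areas are commonly reported approximations for the quad-core parts, so treat the result as indicative only:

```python
# Rough power-density comparison behind the "smaller area" point above.
# Die areas are approximate, commonly reported figures for the quad-core parts.

chips = {
    "Sandy Bridge, 32nm quad-core": {"tdp_w": 95, "die_mm2": 216},
    "Ivy Bridge,   22nm quad-core": {"tdp_w": 77, "die_mm2": 160},
}

for name, c in chips.items():
    density = c["tdp_w"] / c["die_mm2"]
    print(f"{name}: {c['tdp_w']} W over {c['die_mm2']} mm^2 -> {density:.2f} W/mm^2")

# Even though the TDP drops, the heat is concentrated in a smaller die,
# so the power density rises slightly (~0.44 -> ~0.48 W/mm^2).
```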
Re:Too bad their 22nm 3D failed (Score:5, Interesting)
Of course, anyone who actually needs decent graphics wouldn't be using the on-chip graphics anyway, so I question just how useful this really is.
Re:Too bad their 22nm 3D failed (Score:5, Insightful)
Of course, anyone who actually needs decent graphics wouldn't be using the on-chip graphics anyway, so I question just how useful this really is.
There's a whole world of people who would quite like decent graphics, but who don't want to spring another hundred bucks or two to get something fancy. There's also the mobile market (laptops, tablets, etc.) where fitting an extra graphics card looks more like a liability than a good thing. Overall, it looks to me like a smart area for Intel to pitch their transistor budget at.
Re: (Score:2)
Re:Too bad their 22nm 3D failed (Score:4, Informative)
Traditionally, Intel has always been able to show lower power consumption and more than a tangible performance improvement when just doing a process shrink, but the Ivy Bridge does nothing extra in terms of performance and consumes not lower power than its older 32nm sibling
Is there any reason the parent is at +4, Interesting and not -1, Troll? Are the AMD fanbois really so desperate that they have to mod up blatant lies? Ivy Bridge uses 25-30W lower power at stock speed to deliver marginally better than SB CPU performance and considerably better (but still crappy) GPU performance. The only people that whine are those who want a 4.5+ GHz overclock. Anandtech called it quite possibly the strongest tick [Intel] has ever put forth [anandtech.com], but I guess if you don't like reality you can invent your own.
Re:Too bad their 22nm 3D failed (Score:5, Interesting)
The shrink from 22 to 32nm is a staggering size change - 33% finer lithography - and it uses their much-hyped 3D transistor technology on top of things. Yet, Ivy Bridge, being just a shrink of the older Sandy Bridge die, shows no improvements over the 32nm version. Traditionally, Intel has always been able to show lower power consumption and more than a tangible performance improvement when just doing a process shrink, but the Ivy Bridge does nothing extra in terms of performance and consumes not lower power than its older 32nm sibling - and let's not mention the inefficient heat packaging causing temperatures hotter than the 32nm Sandy Bridge. There's a problem here, Intel.
While I will accept that you reversed some numbers (the shrink was from 32 to 22, not the other way around) and Intel is using tri-gate transistors, most everything else you describe is just flat-out wrong. Ivy Bridge DOES show lower power consumption at stock voltages (TDPs of 77W vs 95W are a testament to that), and it delivers higher performance at that lower power consumption (though not by huge amounts, nor was it intended to be). Since it is lower power than Sandy Bridge at the same frequency, it is not having any issues related to thermals and packaging.
Now, if you want to rant about the fact that it doesn't handle overvoltage well for overclocking purposes, that is fine, but it is a separate discussion from stock behavior. What you are seeing now is that Intel (probably extremely wisely for the market they are chasing most heavily [anandtech.com]) has tuned their process node for stock voltages, but this results in very leaky transistors at high voltages. Additionally, while the current packaging has the ability to remove heat just fine at stock voltages, when you start leaking too much the heat builds up too quickly, which certainly is a 22nm node issue and not actually a packaging issue. [tweaktown.com] Quite possibly, though how far in the future I can't begin to guess, they will tweak the process for the Extreme Edition CPUs to make them handle an overclock without leaking so much, but that will take some time learning how they can play with the various knobs to get what they want without destroying what they need.
This leaves me with the feeling that the only problem here is your expectations of a CPU that was manufactured with the intent of taking the mobile market by storm (and they have tuned the process properly for that) when what you want is an overclocking king. Let's see how they tune the process technology for the Extreme Edition (and hopefully copy it into other desktop-bound CPUs) before deciding that they have screwed the pooch on being able to overclock.
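A crude sketch of why overvolting for a big overclock is so costly in power terms: dynamic power scales roughly with frequency times voltage squared, and leakage rises even faster with voltage. The baseline clock, voltage, leakage share, and exponent below are illustrative assumptions, not measured Ivy Bridge figures:

```python
# Crude model of why pushing voltage for a big overclock hurts so much.
# Baseline operating point, leakage share, and exponent are illustrative
# assumptions, not measured Ivy Bridge data.

def relative_power(freq_ghz: float, volts: float,
                   base_freq: float = 3.5, base_volts: float = 1.05,
                   leakage_share: float = 0.25, leakage_exp: float = 3.0) -> float:
    dynamic = (freq_ghz / base_freq) * (volts / base_volts) ** 2
    leakage = (volts / base_volts) ** leakage_exp
    return (1 - leakage_share) * dynamic + leakage_share * leakage

print(f"Stock (3.5 GHz @ 1.05 V): {relative_power(3.5, 1.05):.2f}x power")
print(f"OC    (4.5 GHz @ 1.25 V): {relative_power(4.5, 1.25):.2f}x power")
# ~1.0x vs ~1.8x: a ~30% clock bump at higher voltage pushes close to twice
# as much heat through the same package, which is where the overclocking
# temperature complaints in this subthread come from.
```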
Re: (Score:2, Insightful)
Why the hell did this get marked interesting?
1. If it was "just" a shrink over 32nm and had the same die size and power consumption, it would be no improvement. Instead the die size is quite a bit smaller than at 32nm. Past shrinks have kept the die size only slightly smaller because they added more cache or other logic in addition to the shrink; this time things are mostly the same, resulting in a smaller die. This means more dies per wafer and, eventually, lower cost (noting a move to 450mm wafers within the ne
Re: (Score:2)
When you've mistaken overclocking for normal operation, I think you've either missed the boat or you're not being honest about your agenda.
IB is hotter when overclocking, but at recommended settings it is cooler and a little faster than SB. If you only care about overclocking, that's fine, but to not say that much is not being honest with us.
Or did you not even bother to understand the article you're referencing?
Re: (Score:2)
When you've mistaken overclocking for clocking differently from how the processor was marked for economic reasons, you've missed the boat, too.
Intel marks perfectly good processors down to lower speeds intentionally because people won't buy them at the top price point. But they still make the top price point available to people who have more money than time to dick with overclocking. And there's enough crap parts in the channel (that won't OC well) to keep some people buying the expensive chips.
How the proc
Re: (Score:3)
"How the processor performs when overclocked (or really, not factory-underclocked) provides an extremely revealing look at business practices and yields."
It's an interesting look to be sure, but considering the IB chips perform fine and use less power than SB at their marked clock speeds, there's nothing you can derive from that look to back up "the Ivy Bridge does nothing extra in terms of performance and consumes not lower power than its older 32nm sibling" in the post you appear to be defending here...
Re: (Score:2)
Not to mention the damned-if-they-do-damned-if-they-don't aspect of that line of reasoning. If the 22nm parts don't overclock as well as the predecessors, their node transition "failed". If they do, Big Bad Chipzilla is gouging us by artificially restricting the supply of high-clock parts.
Re: (Score:2)
That's completely wrong.
Bottom line: the gap from rated speed to actual usable speed is shrinking, a lot.
The fact that some people want to run the chip outside its performance spec is fine, but don't go crying to Intel when it doesn't work the way you want it to.
I can add nitrous to my Dodge, but when the engine gets damaged, it's not Dodge's fault that they didn't design the ability to handle it into the Caravan's engine.
"Intel marks perfectly good processors down to lower speeds intentionally because people won't buy them at the top
Intel Development Model (Score:3, Informative)
Two words: Tick Tock.
Intel's development model [intel.com] is Tick (die shrink), Tock (new features). It's been this way for many years. Honestly, I'm not sure why you expected a Tick to have any new features. They did call Ivy Bridge a "tick plus," but even then I wouldn't expect any major overhaul in features or performance. Tick is a manufacturing process improvement, not an architecture improvement.
As far as heat packaging, I believe others have covered that sufficiently.
Re: (Score:2)
You missed this one, I guess. [slashdot.org]
Re: (Score:2)
That's just for overclocking. Regular usage is improved.
Re:Too bad their 22nm 3D failed (Score:4, Insightful)
And you seem to have missed the part where "running hotter than SandyBridge" applies only to overclocking. Yes, IB is a worse overclocker than SB, but under normal conditions IvyBridge is faster and uses less power than SandyBridge. Remember that overclockers are a tiny portion of the market. IvyBridge isn't the amazing revolutionary chip some people were expecting but it is a successful, evolutionary step forward. Just like most processor generations.
Re:Too bad their 22nm 3D failed (Score:5, Insightful)
1. Ivy Bridge is a die shrink, nothing more, nothing less. Everybody who really thought it would be light-years ahead of Sandy Bridge in terms of performance was simply deluded. It's a new process node that nobody has yet perfected; that will come with Haswell.
2. All this ire is unfairly directed at Intel. Given that AMD seems to have no idea what it's doing at the moment, Intel can relax and do as they please. If you want to be pissed at anybody, be pissed at AMD for not being anywhere near competitive and pushing Intel to continuously raise their game.
Since 1998 I've only ever used AMD CPUs in my builds. When I came to build a new rig in February, I simply couldn't justify buying AMD for my CPU again because they were so far behind and there was zero indication that Bulldozer would rectify that. It's sad watching them busily engage in killing themselves off as a serious desktop CPU manufacturer and leaving Intel to potentially become lazy and overpriced, but that isn't Intel's fault.
Re:Too bad their 22nm 3D failed (Score:5, Informative)
> but that isn't Intel's fault.
Actually it is, to at least some extent. Go back a few years, to when Intel was making misstep after misstep and AMD was coming on gangbusters with the K8. At that point, Intel had missed the market so badly that had they been AMD they would have gone under. But they weren't AMD; they were Chipzilla. AMD enjoyed a good product cycle with the K8, until Intel managed to come back. But they didn't enjoy the great product cycle they should have. Their great product cycle was turned into a merely good product cycle because Chipzilla twisted a few arms and kept the K8 out of key opportunities.
The other piece of reality is that Intel combines first-rate process technology with first-rate design capability. (I say "capability" because more than once they've shown themselves to be very capable of taking their eye off the ball, design-wise.)
AMD's biggest problems have always been financing and less-than-best process technology. Bulldozer is a misstep, agreed. But it's not a misstep of the degree of NetBurst or IA-64. Had the K8 gotten the success it deserved, AMD would have been better able to properly fund their design shop. That wouldn't have helped their process problems, however.
The simple fact is that the way things are today, Intel can afford to screw up badly, and can recover. None of their competitors can.
Re: (Score:2)
I'm suggesting that the original sans-L2-cache Celeron, the entire NetBurst architecture, and IA64 were all pretty bad missteps.
Chipzilla twisted a lot of arms to hinder and delay K8 adoption in the marketplace. It should have done better.
Re: (Score:2)
Re: (Score:2)
Whose ire exactly are we talking about here? It's not mine. I've used mostly AMD processors myself for the last 15 years, but that will probably change because of the better power and thermal characteristics of the Sandy Bridge series. My next system from scratch will likely host an Intel CPU.
As far as the parent of my original comment, I think he probably read that earlier article BEFORE someone corrected the summary to qualify that it involved overclocking (the comments make clear it wasn't initially s
Re: (Score:2)
No, the correct moderation for a factually incorrect comment is "overrated".
Agreed. Moderation winds up smelling a lot more like politics than peer review, though. Thus someone moderates him Troll instead.
Re: (Score:2)
It is being punished as a troll because it is wrong in annoyingly misleading ways. It probably would not have been punished as a troll if the GP had said "in the area of overclocking, 22nm is a fail because..." By making a sweeping comment that only applies to a very small subset of the market, potentially informative became troll. The mods got it right.
Re: (Score:2)
Fine, then mark the previous article - or the editor who approved it - as Troll, and leave this guy the hell alone for being misled by it.
Re: (Score:2)
Yep, totally not his/her fault for spreading lies and not bothering to read the several articles linked to the previous /. story or the commentary attached to the /. article. S/he deserves being modded up, plus some bonus karma for being such a victim, and not having lies modded down so that they stop sustaining themselves is a necessary sacrifice. The fact that s/he is posting as AC and will not suffer/benefit beyond that one post notwithstanding.
Sorry. I think I need more coffee.
Re: (Score:2)
Just because a post might be incorrect doesn't warrant moderating it as Troll. Just leave it unmoderated. Nowhere did I advocate a positive moderation.
Re: (Score:2)
People should be punished for spreading lies and commenting in a factual tone about things they don't know anything about. This letting people go unpunished for spreading shit needs to end. Because it's not only that person, it's the people who read the comments. The poster is putting shit in the swimming pool.
Re: (Score:2)
You must be new here. Moderation != peer review. You've made it clear that's the delusion you live in, but it's not the observable fact. Moderation is politics and emotion and groupthink most of the time.
Re: (Score:2)
Just because a post might be incorrect doesn't warrant moderating it as Troll. Just leave it unmoderated.
Re: (Score:3)
Overrated I would think.
There needs to be a factually wrong mod. The lack of one causes a lot of problems.
Re: (Score:2)
It would be interesting if actual experts could have a special account, and they could mod +1 factual / -1 wrong in the area of their expertise.
It would be too expensive for /. to do, but it would be interesting.
Hell, I'd like to see an experiment just to see if having bona fide experts modding would change the conversation into a more productive one.
Re: (Score:2)
How it's worded will also dictate if something is a troll. All the bold letters in the world won't change that fact.
"Your mother is overweight" - Observed fact
"Your mother is a fat-ass lard beast" - Observed fact, by a troll
"Your mother is thin" - Incorrect
"Your mother is a thin whore" - Incorrect, and a troll
"Your mother's children aren't as smart as people would like them to be" - Observed fact
"Your mother's children are a bunch of drooling potato heads" - Fact, and a troll
Today's lesson was brought to you
Re: (Score:2)
No, a factually incorrect comment should be modded down so I don't have to waste my time reading it. Mod it "overrated" rather than "troll".
Re: (Score:2)
Agreed. Moderation winds up smelling a lot more like politics than peer review, though. Thus someone moderates him Troll instead.
Only partially wrong (Score:3)
TI closed a lot of fabs. Not all TI-designed stuff gets manufactured in TI fabs; much of it actually goes to TSMC
Re: (Score:3)
I work at TI. We do have our own fabs, but we also outsource manufacturing to foundries too. The new 65nm flash process I work on was developed at TSMC, and all manufacturing will be done there. I know that other processes run in TSMC as well, but I'm not sure which ones (we have a lot).
Re: (Score:2)
This is called business, using whatever advantage you have to compete against a competitor. Last time I checked Intel was a business.
Re: (Score:2)
Having superior technology and then hobbling it so that it's no better than (or even inferior to) the competition is not good business...
If they built ARM chips on the smaller fab process they would be able to easily lead the market.
With Atom they are barely competitive, while also being incompatible with everyone else.
Re: (Score:3)
pushing their own inferior architecture and holding everyone back
Oh brother, not this again.
On the low end, Intel will never beat ARM because of the large, expensive instruction decoder. That applies to the deep embedded stuff.
Cellphone chips aren't low-end any more. They're getting bigger and bigger and bigger. For big processors, Intel does very well, as they have the best OoO scheduling and branch prediction, which keeps the large, expensive ALUs busy, giving a very high IPC.
In 5 years time, ARM will nee
Re: (Score:2)
Personally i find this despicable and extremely arrogant, pushing their own inferior architecture and holding everyone back when they could be making ARM chips that were superior to everyone else's.
That's because you're not very smart. Intel tried that already. They weren't very good at it. They went back to making AMD64 chips, which they are pretty good at. Now, if you merged AMD and Intel into one company, besides the fact that we'd all be fucked, they'd also make some truly awesome processors.
Re: (Score:2)
That's ridiculous. When Intel rolls out a new processor architecture it puts a whole range of products on it, and adoption of the new CPUs is very rapid. The Ivy Bridge CPUs are absolutely mainstream designs with integrated GPUs and low power consumption, rather than high-end designs that sell in small volume.
Intel also introduced a new mobile design that looks extremely interesting - the first processor with vertical transistors. And the smaller feature size and power consumption is a huge win in this market.
Re: (Score:2)