
Why Intel Leads the World In Semiconductor Manufacturing

MrSeb writes "When Intel launched Ivy Bridge last week, it didn't just release a new CPU — it set a new record. By launching 22nm parts at a time when its competitors (TSMC and GlobalFoundries) are still ramping their own 32/28nm designs, Intel gave notice that it's now running a full process node ahead of the rest of the semiconductor industry. That's an unprecedented gap and a fairly recent development; the company only began pulling away from the rest of the industry in 2006, when it launched 65nm. With the help of Mark Bohr, Senior Intel Fellow and the Director of Process Architecture and Integration, this article explains how Intel has managed to pull so far ahead."
This discussion has been archived. No new comments can be posted.


Comments:
  • by Anonymous Coward on Wednesday May 02, 2012 @05:01AM (#39865785)

    Andy Grove paid billions to get access to Area 51 alien technology back in 1998. What's so hard to understand?

    • by Chrisq ( 894406 ) on Wednesday May 02, 2012 @05:24AM (#39865857)

      Andy Grove paid billions to get access to Area 51 alien technology back in 1998. What's so hard to understand?

      Ah, it's that chip from the android that came from the future. What could possibly go wrong?

      • Andy Grove paid billions to get access to Area 51 alien technology back in 1998. What's so hard to understand?

        Ah, it's that chip from the android that came from the future. What could possibly go wrong?

        It was the chip used by the mothership in Independence Day that could run the virus from Goldblum's PowerBook. It already had cross-platform virtualization technology and was years ahead of its time.

    • Remember the Pentium M?

      Intel had to rely on the Pentium M to pull itself out of that big sinkhole back then.

    • by jellomizer ( 103300 ) on Wednesday May 02, 2012 @10:14AM (#39867593)

      Company A produces a better product than Company B.
      Company A has better marketing than Company B.
      Company A's prices are nearly the same as Company B's.

      Company A for the win.

      No conspiracy, no evil. Their customers want a good product at a fair price, and that is what they provide.
      Right before Intel released their Core processors, AMD had a very strong showing. Then Intel released a much better product, took the #1 spot back, and put distance between themselves and their competitor.
      Now AMD will need to make a much better product, market their product better, and/or lower their prices.

      • by Sycraft-fu ( 314770 ) on Wednesday May 02, 2012 @12:44PM (#39869605)

        So the original Athlon was a shot out of the blue: it was the first AMD chip that really competed with Intel's. Intel had to stop sandbagging and release faster P3 chips (it was capable of making them, it just wasn't because it didn't need to). AMD legitimately brought some serious competition. The Athlon was badly hamstrung by horrible, horrible motherboard chipsets, but there you go.

        Now, the Athlon maintained competitiveness into the next generation... because Intel fucked up. Their Netburst architecture wasn't very good. I don't fault Intel for this; their research showed it would scale really well MHz-wise, possibly up to 10GHz, so the lower IPC wouldn't matter. However, it didn't scale that way, so they ended up with a slower architecture compared to AMD. The problem? AMD wasn't updating. They just kept doing minor rehashes of the same thing.

        Then, as you say, Intel dropped Core. They hadn't been standing still; they never do. They corrected the mistakes of Netburst and made a chip that was very fast per clock. AMD was still playing with old tech, and Intel pulled way ahead. Then it got even worse as things continued: Intel kept revising their chip while AMD kept playing with the same basic thing. AMD's Bulldozer launch got pushed back and back, and when it finally did happen recently, it was not at all competitive with Sandy Bridge, and of course Intel has now just launched Ivy Bridge.

        So AMD's initial competitiveness was no fluke; they dropped a good product. But how long it lasted was kind of a fluke, since Intel screwed up and AMD didn't do much to improve their tech in a big way.

        • by SecurityTheatre ( 2427858 ) on Wednesday May 02, 2012 @05:20PM (#39872943)

          It might be worth pointing out that Core wasn't on the roadmap. It was a happy accident.

          The design came from the Pentium M, which was just a rehashed Pentium 3. The P4 "Netburst" had been on the roadmap for a decade by the time it came out, with IA-64 to follow.

          The Pentium M was intended to be a "mobile" chip to put into mid-range laptops where the P4 was too big and hot. The rather unknown Israeli design team was put on it and produced a really remarkable product that scaled far better than anyone expected. As a result, after release, that team was given the job of improving it and re-working it into a real desktop chip (the Core) and then, because it was still so tiny, the Core Duo and later the Core 2 Duo.

          Talk about flukes. Sometimes engineering stems from them. It's not that they did it wrong; it's just how it is.

          As far as I know, they are still benefiting from some of the amazing hand-layouts that were done on the Pentium M and early Core chips. Nobody else would even consider doing a manual layout on a modern chip. They had a few people who did just that and it made all the difference.

  • by MnemonicMan ( 2596371 ) on Wednesday May 02, 2012 @05:16AM (#39865835)
    Intel, with their open-source graphics stack, makes for some of the easiest-to-maintain Linux boxes around. I'm typing this right now on Arch with Intel graphics. Sure, they don't have a lot of "gaming punch" but they are darn stable and just work with Linux.

    My desktop right now has Windows and is running a first-generation Core i5 with an AMD Radeon 6870 added in. When that machine gets replaced with another gaming Windows machine in a year or two, I'll be pulling the AMD graphics out of it and running on the i5's integrated Intel graphics. It will be super-low-maintenance in Linux. None of this rebuilding fglrx or nVidia modules every time you upgrade the kernel.

    When I go looking for a Linux machine, the very first thing I check off is "Intel graphics". Yup, then it's a buy.
    • Okayyyy... so you'll use Intel integrated graphics in your next Windows gaming machine that'll run on Linux? That sounds like masochism; do you actually play games, or are Minesweeper and Tetris all you need to run on your rig?
      • No, when I buy my next Windows machine it'll have either AMD or nVidia graphics. My current desktop, which has AMD graphics, will have that card pulled at that time, which means the machine will then be using its Intel graphics, the ones integrated on the processor right now. That machine, the old one once I get my new one, will run Linux. With Intel graphics. And I don't play games under Linux; that's what Windows is for, which will be the new machine, in a year or two.
      • by pr0nbot ( 313417 )

        Just for the record... I've used an Ubuntu box for 5+ years of WoW and now SWTOR, under Wine. The proprietary nVidia drivers are easy to install and work well under Linux, and have coped with my moves from a 7600 to a 9600GT to a 460.

        Is it masochism? Yes, a bit, but I've not really got much use for a Windows box aside from gaming, so I thought I'd give it a go.

      • by geekoid ( 135745 )

        Just so you know, the new on-chip video stuff is pretty damn good.
        Soon it will be like the sound chip: good for almost everything.
        This is the same thing we have gone through many times.
        Networking:
        Card.
        On-board but it sucks, use a card.
        Chip price falls.
        On-board pretty good, use a card for network gaming.
        Corporations stop using the card.
        Mobo manufacturer hires a network chip expert.
        On-board good for everything except some extreme situations.

        Same thing for sound.
        Same thing will happen to graphics.
        There will be the really hig

    • by Bert64 ( 520050 )

      There are open source drivers for Radeon too. They might not perform as well as the closed drivers, but they still outperform most Intel cards while being just as convenient, plus you have the option of using the closed drivers if you want the extra performance.

      • They don't work at all on the R690M. I'm never buying anything with ATI in it again unless it's dirt cheap used, and NO MORE LAPTOPS WITH ATI GRAPHICS EVAR. It's only fairly recently that the open source drivers stopped choking on all the various Rage, Rage Pro, etc. chips. I know because I have three old laptops with 'em and they always gave me lots of grief. Don't get me started on the radeon driver.

        I hope one day the OSS video drivers are worth a damn, but until then, there's nVidia. Well, and Intel, but ser

        • I had to chuck a pretty good laptop (IBM T60) because ATI stopped making drivers for the video card entirely, Windows or Linux.
    • by fa2k ( 881632 )

      Intel, with their open-source graphics stack, makes for some of the easiest-to-maintain Linux boxes around. I'm typing this right now on Arch with Intel graphics. Sure, they don't have a lot of "gaming punch" but they are darn stable and just work with Linux.

      If you don't need gaming performance, you can go with anything on the market. I know that the AMD open source drivers are very stable and support Compiz-like effects, and the same is probably true for NVidia.

    • This is quite true; Intel is the way to go if you don't want to game, no doubt about that. You even get some pretty good 3D and video acceleration for movies and stuff as well.
      Nvidia plus the binary blob is the way to go if you *DO* want to game, and you'll probably want a distro that includes the binary blob (like Arch) to avoid manual installation issues.

  • Any relation to Niels [nobelprize.org] and Aage [nobelprize.org]? No wonder Intel's headed for the atomic realm.
  • Taiwan is to circuitry what Japan was to radio technology. http://www.intel.com/jobs/china/students/ [intel.com]
  • Go read "Great by Choice", where Intel's strategy (aka "Intel delivers") is explained. In a nutshell, they realized early on (like in the '70s) that it wasn't enough to make good chips; you have to make lots of them, perfectly. So they are heavily into the manufacturing side and making sure it works really, really well.
  • If Intel had kept their perpetual ARM license, they could rule the world. But even with cutting-edge fabs they are going to be overrun by a more ubiquitous CPU architecture.

    Intel should stop making x86s. And especially stop the nonsense of trying to use x86s to compete against stream processors in GPUs and HPC. But it is really too late for them; they gave their ARM license to Marvell (who have basically pissed away a good opportunity as well by not aggressively pursuing new designs based on the li
