Hardware Technology

Moore's Law Limits Pushed Back Again 334

quackking writes "Since the weather in Rochester stinks, people spend a lot of time indoors making cool stuff. And at Rochester Institute of Technology, they have figured out how to make silicon chips with 38 nanometer rules... this is an order of magnitude better than what is standard at present. The process is called liquid-immersion nanolithography, a cool idea - starting with the commonly-observed phenomenon that things look bigger under water - they submerge the silicon wafer."

Comments Filter:
  • Power use(rs) (Score:5, Interesting)

    by Space cowboy ( 13680 ) * on Sunday April 04, 2004 @04:47PM (#8763370) Journal

    Betcha the top-of-the-range chips that Intel and AMD make will *still* manage to consume ~100W of power :-)

    Talking of power users, it could make for some seriously large on-die L2 (or even L1) cache though. Since the fetch-from-memory is like hitting a brick wall (for a CPU), the more the better - look at how the P4EE performs compared to the non-EE version...

    I guess it could also be used for lots of on-chip cores. 16 CPUs per die would be nice, although they'd have to have a large die for all the memory traces going to the motherboard. Even AMD's HyperTransport might struggle with that :-)

    Simon.

    • Re:Power use(rs) (Score:5, Informative)

      by imsabbel ( 611519 ) on Sunday April 04, 2004 @04:57PM (#8763462)
      Er, they will use even more power.
      Because leakage only increases with further process shrinks. And it's increasing A LOT.
      With 0.35 um processes, leakage power was in the 0.x percent range; with Intel's 90nm process it's already 20-30%.
      More logic transistors will only result in even more waste, at least with traditional design rules...
      • Re:Power use(rs) (Score:3, Informative)

        by Anonymous Coward
        Leakage doesn't mean more power. It means more relative power - as you said, 20-30% and increasing. The problem is that leakage is making it harder to decrease power consumption.
    • Re:Power use(rs) (Score:4, Interesting)

      by master_p ( 608214 ) on Monday April 05, 2004 @05:46AM (#8767319)

      Since the fetch-from-memory is like hitting a brick wall

      DRAM should be eliminated then. We should all move to SRAM (the kind of RAM used for cache). It is expensive right now, but if produced in large quantities, it will become just as cheap as DRAM currently is.

  • by unassimilatible ( 225662 ) on Sunday April 04, 2004 @04:49PM (#8763388) Journal
    You know, the ones that say,

    "objects may be larger than they appear."

    I have them all over my bedroom.

  • Cool Idea? (Score:5, Interesting)

    by !ucif3r ( 713159 ) on Sunday April 04, 2004 @04:49PM (#8763400) Homepage

    As these chips get smaller and smaller, 'cool' is the one thing that isn't going to be synonymous with them.

    Any idea how they are going to deal with the stability and cooling of these new chips? New computers already use some pretty crazy cooling systems.

    Will watercooled systems become the norm?

    • Re:Cool Idea? (Score:2, Insightful)

      Water cooling!
    • Re:Cool Idea? (Score:5, Insightful)

      by NTmatter ( 589153 ) on Sunday April 04, 2004 @05:04PM (#8763504) Homepage
      It's only natural that if they're made underwater, they should be cooled underwater. From what I can see, there are only three things preventing widespread adoption:

      1) Cost
      2) Difficulty of setup
      3) Public knowledge

      Cost will naturally come down as usage increases. Setup, on the other hand, is still a rather difficult process involving thermal paste, clamps, and lots of water. Most people won't accept it until there's a big funnel marked "ADD WATER HERE". Compounding the problem of non-acceptance is the fact that most people have never seen, or even thought about, this kind of cooling. Simply put, it's a rare event for a stock PC to spontaneously overheat. The major PC manufacturers tend to prevent that from happening, and can't sell PCs that aren't just "plug and play", in the hopes that customers won't be driven away by complexity of setup.

      So, common watercooling will be a fair ways away, as it still needs to be perfected. I suspect that one of the first major barriers will be modding a case to heat a tank of tropical fish. Overclockers may want to use this technique to simulate conditions outside a geothermal vent and run their own curious creature farms.
      • Re:Cool Idea? (Score:3, Interesting)

        by sumdumass ( 711423 )
        Well I'm kinda wondering if they can incorporate some sort of heat pipe directly into the chip and extend the regular air cooling system. Such a system could include a heatsink on the bottom of the chip as well as the top, along with a couple of channels that run completely through the chip, letting hot air escape into the heatsink area where the fan's airflow would create a vacuum or convection, constantly refreshing the air inside the chip.

        Would negate the need for water cooling at least for a while.
      • by James Lewis ( 641198 ) on Sunday April 04, 2004 @08:39PM (#8764794)
        I would think stuff like VapoChill [asetek.com] would be more likely. The biggest drawback of watercooling is that it well... uses water. Installation is still less than easy for vapochill, but I expect that to be something that can be made much easier, and besides, most people don't ever touch their CPU anyway. That's another thing, you can set this up and ship it, but you probably wouldn't want to do that with a watercooling system. Also, if companies are going to invest in this sort of thing, the longest lasting solution should be sought. Watercooling is better than aircooling (usually) but even it will hit a wall fairly quickly, because it still uses room temperature to cool the CPU. Vapochill keeps the processor -20 C to 20 C, and there's room for improvement on that with better refrigerant tech.
    • Re:Cool Idea? (Score:5, Informative)

      by stox ( 131684 ) on Sunday April 04, 2004 @05:06PM (#8763516) Homepage
      Water cooled mainframes were fairly common 20 years ago.
      • Water cooled mainframes were fairly common 20 years ago.

        They were also quite expensive. Even today, a good water-based cooling setup for your CPU is costly; in any case, far more costly (and cumbersome) than most CPU coolers.

    • It's called BTX. (Score:2, Informative)

      by Anonymous Coward
      Look up the BTX system spec. Most of the internal components are moved around from ATX, moved closer to the air intake at the front. I doubt it will become standard to move to a position-dependent and highly-delicate system like watercooling. *Maybe* alcohol recirculation like that one laptop, probably heatpipes, but not watercooling.
    • Re:Cool Idea? (Score:5, Interesting)

      by randyest ( 589159 ) on Sunday April 04, 2004 @05:23PM (#8763613) Homepage
      Not today, but probably very soon, chips will cool themselves [eetimes.com] thanks to nano-technology and Purdue University's clever self-ventilating technology where microfluidic-like layers pump heat-laden air off-chip using a classic "corona wind" effect.

      It's very new news, and not a lot of people realize the impact yet. Cooling is the problem, not scaling. This will allow performance and levels of integration several orders of magnitude greater than otherwise possible.

      Good article too (linked above). Check it out. The printed version of EE Times has some great explanatory graphics too, but those don't seem to be in the web version.
    • Smaller chips are cooler than bigger chips (based on transistor size, all else being equal). They need less voltage to run because there is less distance for the signals to travel.

      The heat problem is due to raising the clock at the same time and/or keeping the size the same but adding more transistors (e.g. P4).

      • Re:Cool Idea? (Score:2, Interesting)

        by !ucif3r ( 713159 )

        That is only partly true. By cramming more onto a smaller space you increase the problems associated with heat. While the overall heat per transistor has increased, the susceptibility of the transistors to heat-induced instability is worse, and heat dissipation is worse because there is less area over which to dissipate it.

        Also, the distance really has little to do with the voltage; it is more related to the resistance and current. The lower voltages come from other improvements.

        The power savings
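
      A rough way to see the trade-off argued in this sub-thread is the standard CMOS dynamic-power relation, P ≈ α·C·V²·f. The sketch below is mine, with invented numbers (not real Intel or AMD figures), just to show how the V² savings from a shrink can be eaten by a higher clock and the extra switched capacitance of more transistors:

      ```python
      # Illustrative only: classic dynamic (switching) power relation for CMOS,
      # P ~ alpha * C * V^2 * f. All numbers are made up for the example.

      def dynamic_power(activity, capacitance_f, voltage_v, frequency_hz):
          """Approximate switching power (watts) of a CMOS chip."""
          return activity * capacitance_f * voltage_v ** 2 * frequency_hz

      # Hypothetical older part: higher voltage, lower clock, fewer transistors.
      old_chip = dynamic_power(0.2, 40e-9, 1.8, 1.0e9)
      # Hypothetical shrunk part: lower voltage, but a higher clock and more
      # total switched capacitance from the extra transistors on the die.
      new_chip = dynamic_power(0.2, 60e-9, 1.3, 3.0e9)

      print(f"old ~{old_chip:.0f} W, new ~{new_chip:.0f} W")   # ~26 W vs ~61 W
      ```

      And that is before the leakage component discussed further up the thread, which scales badly on its own.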

    • I think the challenge will be getting the heat from the die to the heatsink fast enough, as opposed to getting the heat off the heatsink itself.
  • So Small (Score:5, Funny)

    by Jozer99 ( 693146 ) on Sunday April 04, 2004 @04:51PM (#8763410)
    If they are getting so small, shouldn't we start calling them flakes or crumbs, not chips?
    • Nah, flakes and crumbs have negative connotations. We should just call them light chips (made with less oil).
    • Re:So Small (Score:2, Funny)

      by dj245 ( 732906 )
      If they are getting so small, shouldn't we start calling them flakes or crumbs, not chips?

      Why not "Chiplets", which is what I have taken to calling the numerous small broken bits of potato chips in the bottom of the potato chip bag.

  • by Rosco P. Coltrane ( 209368 ) on Sunday April 04, 2004 @04:52PM (#8763419)
    starting with the commonly-observed phenomenon that things look bigger under water

    I knew there was a perfectly rational explanation for why love in the swimming pool always seems more passionate...
    • Re:That explains (Score:3, Insightful)

      by dmayle ( 200765 )
      OT, but obviously spoken by someone who's never had sex in a swimming pool. Besides the chlorination issue (too much and it burns later), water is one shitty lubricant, and manages to wash away the natural ones. It's still fun (do it if you get the chance, especially in a hot tub), but it's not more passionate, just a little more naughty...
  • by Anonymous Coward on Sunday April 04, 2004 @04:55PM (#8763439)
    My experiences with submerging sensitive electronic devices have never been positive ones.
  • by amorphosamon ( 310953 ) on Sunday April 04, 2004 @04:58PM (#8763466) Homepage
    When I first read the article and noticed that the technique involved submersion in water, I thought to myself: "How about vacuums, or zero-gravity? How will these things affect creation processes?"...

    Just google for "Emil Piscani". Not quite what I wanted, but interesting enough...
    • When I first read the article and noticed that the technique involved submersion in water, I thought to myself: "How about vacuums, or zero-gravity? How will these things affect creation processes?"

      The trick is to use the high index of refraction of the liquid to slow the light. Slower light means a shorter wavelength for a given frequency, and a shorter wavelength means smaller features before the technique hits its limits.

      Vacuum and zero gravity may have other useful effects but unless they enable something new they seem unlikely to affect feature size by the
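
      To put a rough number on the shorter-wavelength point above: the effective wavelength in a medium is just the vacuum wavelength divided by the refractive index. A quick sketch (the 1.44 index for water at 193nm is an assumed, approximate value):

      ```python
      # Effective wavelength in the immersion fluid: lambda_medium = lambda_vacuum / n.
      wavelength_vacuum_nm = 193.0   # ArF excimer laser used in current scanners
      n_water_193nm = 1.44           # assumed/approximate index of water at 193 nm

      print(f"{wavelength_vacuum_nm / n_water_193nm:.0f} nm")   # ~134 nm effective wavelength
      ```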
  • Stupid question... (Score:5, Insightful)

    by imsabbel ( 611519 ) on Sunday April 04, 2004 @05:00PM (#8763480)
    Everybody knows that you can increase the fidelity of your lithographic process by increasing the numerical aperture of the optical system.
    But is 38nm an "order of magnitude better" than current processes?
    Intel is shipping 90nm Prescotts (even if that process sucks from a leakage-power standpoint, that has nothing to do with the lithography per se), and 65nm is in development, with sample chips demonstrated.

    So it's more like a "binary order of magnitude" better than current processes...
  • bigger? (Score:5, Funny)

    by MoFoQ ( 584566 ) on Sunday April 04, 2004 @05:00PM (#8763482)
    sadly, neither my wallet nor my paycheck gets this phenomenon.

    wonder what would happen to CowboyNeal when he's placed under water....or the national debt....
  • Useful Links (Score:5, Informative)

    by intrinsicchaos ( 652706 ) on Sunday April 04, 2004 @05:01PM (#8763484)
    Here are a few useful links to the RIT website to learn more about this topic:

    Main Site: http://www.rit.edu
    Kate Gleason College of Engineering: http://www.rit.edu/~630www/index.htm
    Microelectronic Engineering Department: http://www.microe.rit.edu/
    Optical Lithography Research: http://www.rit.edu/~635dept5/

    While RIT-bashing is one of the most popular activities around here, RIT isn't such a bad place to be, even if you're a liberal arts major (which I am). They do some pretty neat stuff around here!

  • by G4from128k ( 686170 ) on Sunday April 04, 2004 @05:01PM (#8763485)
    High-end microscopes go a step further than these chip makers and use high-index, low-dispersion oils instead of water, as explained in these formulas [microscopyu.com] and this intro tutorial [microscopyu.com]. Replacing the air between the lens and wafer with a denser, high-index fluid increases magnification and increases the effective aperture of the optical system. A larger aperture increases the theoretical resolution of the system.
    • See this article [eetimes.com], which introduces some of the reasons why things aren't as simple for chip fabrication as they are for microscopy. It describes some of the challenges of immersion lithography in the latest processes. Problems include: few liquids transmit light well at 157nm wavelengths, contamination, lens/mask damage, and so on.

      Still, the work of the researchers should eventually get us there.
  • Good Idea/ Bad Idea (Score:5, Interesting)

    by Black Mage Balthazar ( 708812 ) on Sunday April 04, 2004 @05:02PM (#8763492)
    This all seems to be panning out, with the major manufacturers on board. So we have extended the life of current silicon chip fabrication? My question is whether or not this is a good thing in regards to innovation (not to knock the innovation contained in this technique).

    What I mean is: developing new materials to create processors, or reinventing current methods to save space and power, rather than finding new ways for the status quo to stick around longer. The September 2003 issue of Wired magazine had a cover story about creating flawless synthetic diamonds, and the possible uses for them as processor components. It turns out that a diamond semiconductor has been developed.

    It has met with major disapproval from both the diamond cartels (i.e. DeBeers, as synthetic diamonds have the potential to damage their business) and semiconductor manufacturers (since they have so much invested in silicon).

    It's possible that both could work together, as diamond semiconductors are in their relative infancy, and this could provide an interim solution. Or, the big manufacturers could try to drag Moore's Law on as long as possible with silicon.

    • by odano ( 735445 )
      I hope we haven't hit a point that many other industries hit (Music with the internet, Car companies with hybrid cars, etc), in which legal struggles and companies vying for position create massive delays on new and better technology.
    • I was reading a book the other day (Hello World: A Life in Ham Radio) and it talked about how DeBeers got their mines. In the 20s or 30s they bought all the mining rights in some country for the next 99 years for some pathetically small sum. Do they have other mines or what? Because if not, won't they lose their rights in about 10-20 years? Anyone know? Thanks.
  • by bmac ( 51623 ) on Sunday April 04, 2004 @05:13PM (#8763567) Journal
    Won't all these relatively linear improvements to the fabrication tech be irrelevant once purely optical chips are rolling? Even though they are many years away, it seems to me that they will be *many* orders of magnitude faster than our current electricity-based chips. Does anyone here know if these current fab techs will be used for optical-based chips? I would imagine that optical chips would require an entirely different production method. Also, wouldn't optical chips run *much* cooler?

    Sorry for the basic questions, but I'm just a programmer :-)

    Peace & Blessings,
    bmac
    • "Won't all these relatively linear improvements to the fabrication tech be irrelevant once purely optical chips are rolling."

      No. Not at all.

      1.) It will be a LOOOOOOOOOOOONG time before all chips are optical. It will take a while for the optical processors to be developed so that they are as fast as whatever the current batch of processors is, and as cheap. Even then, it's difficult to imagine all the chip makers scrapping all their factories and going all optical.

      2.) I imagine there'll be some big r
    • by Arch_dude ( 666557 ) on Sunday April 04, 2004 @06:18PM (#8763984)
      Sorry, no. Optical chips are less dense than electronic chips, for the simple reason that photons are "bigger" than electrons. A visible photon has a wavelength of about 800nm, so a waveguide for that photon must have roughly equivalent dimensions.

      From a "speed" standpoint, light in glass moves at C/I (where I is the index of refraction of the material). For optical glass, I = 1.5 or thereabouts, so light moves at two thirds the speed of light in a vacuum. Electrical pulses in copper move at about 0.87 C, while radio frequencies in a "transmission line" such as a coaxial cable move at nearly C. Thus, an optical signal on a fiber travels at about 200 km/ms, while a radio signal on a coaxial cable travels at about 300 km/ms.

      Electronic switches (transistors) can operate up to about 150GHz (in the lab, bleeding-edge). These devices are 90nm or so. Photonic switches can operate at perhaps 40GHz. Those devices are big discrete components, and even if they are miniaturized they will need to be at least the size of a photon wavelength, ten times bigger than the electronic switch in each dimension. Photonics is a wonderful science, and will lead to further dramatic decreases in the cost of communications, but the physics is all wrong for replacing electrons with photons as a way to miniaturize a computer. Look to nanotech for that, but that's another story.
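
      The propagation figures in the comment above are easy to sanity-check. Here is a quick sketch using the comment's own numbers (1.5 for optical glass, 0.87c for pulses in copper, "nearly c" for coax):

      ```python
      # Signal propagation speeds, using the figures quoted in the parent comment.
      C_KM_PER_MS = 299_792.458 / 1000.0        # speed of light, ~300 km per millisecond

      fiber  = C_KM_PER_MS / 1.5                # light in glass (n ~ 1.5): ~200 km/ms
      copper = 0.87 * C_KM_PER_MS               # electrical pulse in copper: ~260 km/ms
      coax   = 0.99 * C_KM_PER_MS               # RF on a transmission line: "nearly c"

      print(f"fiber ~{fiber:.0f} km/ms, copper ~{copper:.0f} km/ms, coax ~{coax:.0f} km/ms")
      ```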
  • Is it good news? (Score:4, Interesting)

    by plusser ( 685253 ) on Sunday April 04, 2004 @05:17PM (#8763589)
    This is good news for the computer industry, but not necessarily good news for everybody else. The problem with decreasing die geometries is that it makes the devices far more susceptible to the effects of atmospheric radiation. This means that if there is heightened sunspot activity, a solar flare, or the devices are used in the upper atmosphere, there is a significant risk that the devices may not operate correctly, introducing soft errors in transistor gates and the possibility of latch-up. This at best will result in random system failures that many users would think are a symptom of bad software or poorly configured hardware.

    Techniques to avoid this can be implemented by using built-in redundancy, but this introduces its own problems, as it will depend on how quickly the device can reset itself. There is also the issue that the whole device will be much smaller, so several separate devices may be required, as having correction on the same die would introduce additional risks. It looks like early days for the development of this technology, and it would be useful for domestic applications. Whether it would be used for industrial, space, and aeronautical applications will depend on the success of reducing the atmospheric radiation effects.
    • by trampel ( 464001 )
      Come on - exactly the same arguments were raised in the late 80s with the advent of the 1Mbit DRAM chip. Most of these fears apparently were unfounded, or countered by technical measures. Also, the technique for high-reliability applications is not redundancy, but error correction; even high-end Intel server chipsets support it nowadays.
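
      For readers who haven't seen it, the "error correction, not redundancy" idea can be illustrated with a toy single-error-correcting Hamming(7,4) code. Real server ECC uses wider SECDED codes (e.g. 72 bits protecting 64) in hardware, so treat this purely as an illustrative sketch:

      ```python
      # Toy Hamming(7,4): 4 data bits protected by 3 parity bits; any single
      # flipped bit (e.g. from a radiation-induced soft error) is corrected.

      def hamming74_encode(d):
          """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p4, d2, d3, d4]."""
          d1, d2, d3, d4 = d
          p1 = d1 ^ d2 ^ d4
          p2 = d1 ^ d3 ^ d4
          p4 = d2 ^ d3 ^ d4
          return [p1, p2, d1, p4, d2, d3, d4]

      def hamming74_correct(c):
          """Return (corrected codeword, position of the flipped bit, or 0 if none)."""
          c = list(c)
          s1 = c[0] ^ c[2] ^ c[4] ^ c[6]      # parity over positions 1,3,5,7
          s2 = c[1] ^ c[2] ^ c[5] ^ c[6]      # parity over positions 2,3,6,7
          s4 = c[3] ^ c[4] ^ c[5] ^ c[6]      # parity over positions 4,5,6,7
          syndrome = s1 + 2 * s2 + 4 * s4     # 0 means no single-bit error detected
          if syndrome:
              c[syndrome - 1] ^= 1            # flip the offending bit back
          return c, syndrome

      code = hamming74_encode([1, 0, 1, 1])
      code[4] ^= 1                            # simulate a single-bit upset
      fixed, pos = hamming74_correct(code)
      print(pos, fixed)                       # pos == 5, original codeword recovered
      ```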
  • All very nice (Score:3, Insightful)

    by fr0dicus ( 641320 ) on Sunday April 04, 2004 @05:18PM (#8763593) Journal
    But until hard drives move on, all this CPU power is pretty pointless.
  • by jmarcand ( 248367 )
    This technology has been in the works for some time, here are some relevant press releases from the three major litho tool vendors:
    ASML [asml.com]
    Canon [canon.com]
    Nikon [nikon.co.jp]
  • by www.fuckingdie.com ( 759660 ) on Sunday April 04, 2004 @05:23PM (#8763612) Homepage
    38nm chips, while not all that revolutionary in CPU design in my opinion, will undoubtedly spark a revolution in the chip cooling industry. That is, unless the major problems of thermal loss due to electron tunneling can be dealt with. (I have no doubt that high-K materials and such will eventually help to deal with these issues, but given current trends in the CPU industry, i.e. faster = hotter, I am not all that optimistic.)

    I can see the guys who design chip coolers getting excited about this, because let's face it - this is their porn.

  • by Bobb Sledd ( 307434 ) on Sunday April 04, 2004 @05:27PM (#8763637) Homepage
    Unfortunately, they found out later that a nanometer looks bigger submerged, too. :-/

  • Until we move past the hard drives of today to the solid-state drives of tomorrow, I'm not too excited about the power of any chip. Current hard drive technology is the Achilles' heel of any system nowadays in terms of speed.

    The PCI bus has come of age, but it makes no sense to have all this speed and power if we can't stuff data into, or retrieve it from, a 150MB/s device as we please.
    • The thing is, this isn't Soviet Russia, you can't just reallocate labor into different fields. If people want to work on electrical CPUs then they will and if they think it is lucrative and interesting enough to do hard drive work then they will also... we can't just say "Hey you, work on hard drives and be more communist."
  • Optics (Score:5, Insightful)

    by vlad_petric ( 94134 ) on Sunday April 04, 2004 @05:32PM (#8763657) Homepage
    While I am certainly not a physicist, the limiting factor for lithography is the wavelength - your precision is directly proportional to the wavelength you're using. This is why you can't just use more powerful/precise lenses - the wavelength is still the same once the light is back in the air.

    An approach to reducing the wavelength is to simply go to a higher frequency, but that poses some tough challenges (x-rays don't behave like visible light). What these researchers have done is to change the environment so that the wavelength is smaller (light is about 25% slower in water, and because the frequency doesn't change, the wavelength gets shorter). Anyway, 25% is hardly "an order of magnitude".

    • Wavelength IS important, but they still manage to make features that are smaller than the wavelength they are using :) They use interferometry to etch tiny channels with large hammers...

      E.g. if you want to etch a cross:
        |
      --|--
        |

      You make a lithography mask somewhat like this:
       |||
      ==.==
       |||

      But the price of a mask set with this tech is very high :)
  • This water phenomenon has worked in the lobster business for years. Maybe there will be a suit of the Sun Microsystems/ Microsoft or HP/Gateway proportions?

    Doubtful, lobsters hate litigation.

  • by Performer Guy ( 69820 ) on Sunday April 04, 2004 @05:39PM (#8763707)
    People, it's called refraction. We don't need some dumbed-down nonsense with no real detail; for most people (especially around here) it is completely counterproductive and only confuses the issue.

    The technique mentioned seems to be forming a liquid lens. So my question is, why is this any different from a conventional lens? Lenses are used all the time in lithography.

    Does it help with diffraction limited optics or is there some other reason?

    They're talking about increasing the imaging limits even at current wavelengths by effectively changing the lensing system, so it has to be something to do with reducing diffraction IMHO.

    So... anyone here care to offer a reasonable technical description of why this is?

    Does a single refractive interface with the wafer submerged and therefore inside the lens help reduce diffraction?
    • Does a single refractive interface with the wafer submerged and therefore inside the lens help reduce diffraction?

      It annoys me too, but for a different reason. The fact that things look bigger in water is completely irrelevant to the subject at hand - that is just an optical trick to our eyes (which are outside the water) when the water is contained in a round container (a flat container does not exhibit this effect!) and would make no difference to a lithography system.

      Contrary to the parent comment's

  • by Bendebecker ( 633126 ) on Sunday April 04, 2004 @05:43PM (#8763736) Journal
    "starting with the commonly-observed phenomenon that things look bigger under water"

    This seems suspiciously like the Doctor's explanation as to why the inside of the TARDIS is bigger than the outside: Take two cubes, one bigger than the other. Now hold the bigger one at arm's length, just far enough away so that it looks like it can fit in the smaller one. That's how it's done!
    Leela: "That's nonsense."
    Doctor: "Nonsense? That's transdimensional engineering, a key Time Lord discovery."
  • ALRIGHT! (Score:2, Interesting)

    by 2057 ( 600541 )
    I am going to RIT next year, so this is great. If this tech gets picked up, my Comp Eng degree will be worth a lot more! And my parents wanted me to go to NJIT (it's scary; it's in Newark). This almost makes the 7k out of pocket worth it... almost... sigh
  • by Brandybuck ( 704397 ) on Sunday April 04, 2004 @05:52PM (#8763783) Homepage Journal
    Chips keep getting smaller, but the laws of physics remain the same. We're getting leakage and reducing the die size only makes it worse. Pretty soon we're going to have chips so small and so hot, that they'll be better at producing fusion than processing data.

    We need a new direction. Moore's law is still in effect, but it doesn't dictate die sizes, only speed and cost. The most obvious alternate road is parallel processing. Multiple chips in other words. We're already doing this. Outside of the PC world this is old hat. We think we're all 1337 because we have a four-way Xeon server, but the non-PC world just yawns at this.

    My prediction: PCs will have 64 processors, each of which will be cooler than today's 3GHz+ P4, but together will provide an order of magnitude more processing power. Software (or compilers) will have to be designed for this new architecture, but it's the only way we're going to see a PC capable of running Longhorn or Linux 2.8 that doesn't take 500 watts.

    P.S. In the meantime, software follows Moore's Anti-law, which states that software will waste all additional resources provided by Moore's Law. If only software would keep up with hardware I would be ecstatic. When WP5.1 did 98% of what Word XP does today, but did it in 640K on a 16-bit processor, it's hard to say software is improving in any area but the GUI.
    • One problem with parallel processing: it only works with problems that can be broken into several sub-problems that can be solved individually (for example, graphics processing, most mathematical calculations). But parallel processing doesn't work when every step requires input from the previous one (I think AI falls under this category... or most common day-to-day applications). In the end, parallel processing kicks ass when doing complicated math and graphics processing, but will actua
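
      The usual way to put numbers on the observation above (my addition, not the commenter's) is Amdahl's law: if a fraction p of the work can be parallelized, n processors give a speedup of 1 / ((1 - p) + p/n). A quick sketch with hypothetical workload mixes:

      ```python
      # Amdahl's law: serial dependencies cap the benefit of adding cores.
      def amdahl_speedup(p, n):
          """p = parallelizable fraction of the work, n = number of processors."""
          return 1.0 / ((1.0 - p) + p / n)

      for p in (0.5, 0.9, 0.99):                       # hypothetical workload mixes
          print(f"p = {p:.2f}: 64 cores -> {amdahl_speedup(p, 64):.1f}x")
      # Even 99%-parallel code tops out near 39x on 64 cores; a 50%-parallel
      # "day to day" workload barely reaches 2x.
      ```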
    • You're spot-on with the heat issue, and thanks to some clever blokes at Purdue University, here's your new direction [eetimes.com].

      Unfortunately for you, it's not the de-integration you predict, but rather still more integration, with a very clever, very scalable, amazingly efficient, built-in cooling system.
    • Re:Moore's law (Score:3, Insightful)

      by JCholewa ( 34629 )
      > Moore's law is still in effect, but it doesn't dictate die sizes, only speed and cost

      I ... no, wait. Moore's Law does dictate die sizes, and it doesn't dictate speed and cost! Moore's Law has to do with the increasing density of transistors inside microchips, which directly ties into the size of the die (if you double the number of transistors per square millimeter, you can have the same number of transistors in a smaller die!). Speed is just an occasional side effect that has to do with shorter pa
  • by putigger ( 632291 ) on Sunday April 04, 2004 @05:53PM (#8763792)
    The article is unclear. All they're talking about is using liquid immersion optics. So rather than shrinking the wavelength, which can no longer be done without switching to reflective optics, they're increasing the numerical aperture by imaging through a liquid. This idea has been around for a while and is nothing new. It still does not address the exponentially increasing cost of optical lithography tools; in fact, liquid immersion optics will only complicate matters further and continue to drive costs up.
  • Heat sink (Score:4, Insightful)

    by Anonymous Coward on Sunday April 04, 2004 @06:11PM (#8763935)
    The stupidest thing about this industry is that while the chips are getting smaller, the heatsinks are getting larger.

    Sorry, but the heatsink is part of the CPU, IMO. The CPU cannot run without the heat sink, so it has to be considered a single unit.

    Given that, I'd say that CPUs are increasing in size, not decreasing. My 486-66 may have been 100 times slower than today's P4-3GHz, but it was also 50 times smaller, since it needed no heatsink. Thus, I'd say we've only improved the speed-to-size ratio by a factor of two.

  • by adzoox ( 615327 ) * on Sunday April 04, 2004 @06:12PM (#8763939) Journal
    I remember seeing a video of the IBM plant where the chips in the Power4 and Power5 family are made (aka G5 for Apple) - I thought I saw that the chips were submerged in a liquid and this reduced the need for "clean rooms" - although this could have been a "negative wash" to clean the wafers.
  • http://216.239.41.104/search?q=cache:PfSYNDpv4bYJ:oemagazine.com/newscast/2004/020404_newscast01.html+immersion+lithography+rochester+institute+of+technology&hl=en&lr=lang_en&ie=UTF-8
    http://biz.yahoo.com/prnews/040218/nyw086_1.html
    http://216.239.41.104/search?q=cache:Lurb_nMoW6YJ:www.siliconstrategies.com/story/OEG20030227S0068+immersion+lithography&hl=en&lr=lang_en&ie=UTF-8
    http://www.future-fab.com/document.asp?d_id=1896

    Btw, the last article is written by Dr Bruce
  • by SensitiveMale ( 155605 ) on Sunday April 04, 2004 @06:34PM (#8764071)
    I was in the pool!
  • Shrinkage (Score:3, Funny)

    by duckpoopy ( 585203 ) on Sunday April 04, 2004 @07:09PM (#8764304) Journal
    Then when you take the wafer out of the water : Shrinkage. Brilliant idea!
  • Hopefully once this [wired.com] process gets cheaper and more refined, we'll have optical processors and buses.

    A snip from the article:

    But the greatest potential for CVD diamond lies in computing. If diamond is ever to be a practical material for semiconducting, it will need to be affordably grown in large wafers. (The silicon wafers Intel uses, for example, are 1 foot in diameter.) CVD growth is limited only by the size of the seed placed in the Apollo machine. Starting with a square, waferlike fragment, the Linares process will grow the diamond into a prismatic shape, with the top slightly wider than the base. For the past seven years - since Robert Linares first discovered the sweet spot - Apollo has been growing increasingly larger seeds by chopping off the top layer of growth and using that as the starting point for the next batch. At the moment, the company is producing 10-millimeter wafers but predicts it will reach an inch square by year's end and 4 inches in five years. The price per carat: about $5

    I don't know what we can look forward to after that =P


  • by Cutie Pi ( 588366 ) on Sunday April 04, 2004 @10:52PM (#8765544)
    Here's how it works:

    All optical systems have a certain "numerical aperture" or NA, which is equal to the index of refraction of the immersion medium times the sine of the half angle between the lens and the imaging plane, i.e. NA = n * sin(theta), where theta is that half angle. In traditional lithography systems, the immersion medium is air, with an index of ~1.0, so the theoretical maximum NA is 1.0, since the half angle cannot be greater than 90 degrees.

    NA is basically a measure of how many diffracted rays make it through the lens. When light passes through the mask, the light diffracts, i.e. spreads out. Light that spreads out the most contains the highest-resolution information. The lens's job is to collect as many rays as possible and focus them to form an image. Of course, not all of the rays are collected, so the image is degraded somewhat. Lens designers try as hard as possible to collect as many orders as they can (i.e. increase NA) to give the highest possible resolution.

    The resolution of the system is actually proportional to the wavelength divided by NA, so there are two approaches to making smaller printed features. First, the wavelength can be made smaller. This has been done over the years: wavelengths have moved from 436nm to 365nm to 248nm and now to 193nm. An attempt was made to move to 157nm, but the materials challenges proved too difficult. (These wavelengths, by the way, are either peaks of the mercury spectrum or various excimer laser wavelengths.) Second, the NA can be made larger. This has also been done, with each generation of imaging systems having a higher NA; NA has gone from 0.3 or so to 0.75 or 0.80. Every time a move in wavelength or NA has occurred, a tremendous amount of research and development has been needed. Also, the imaging systems have become increasingly complex. A state-of-the-art 193nm "scanner" now runs around $15 million.

    Immersion lithography works because you can increase the NA above 1. Water has an index of refraction greater than 1.0 (it's 1.333 for visible light, not sure for 193nm). Of course, this is all math. What's really going on is that rays diffracted at such a large angle that they would normally be totally internally reflected inside the lens can now be transmitted. As I said, the more diffracted rays that make it through, the higher the resolution you can achieve.

    Although I doubt water immersion will be good enough for the 38nm node, other immersion liquids with a higher index of refraction will increase the NA further still, and push the resolution even higher. It is thus very likely that optical lithography, whose death has been predicted forever, will continue to be the dominant technology in making microchips.
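
    Tying the numbers in this explanation together: the printable feature size is often estimated with the Rayleigh-style relation R = k1 * lambda / NA. The sketch below is illustrative only; the k1 = 0.3 process factor and the sin(theta) value are assumptions, and 1.44 is an approximate index for water at 193nm (the comment notes the exact value is uncertain).

    ```python
    # Rayleigh-style resolution estimate, R = k1 * lambda / NA (illustrative numbers).
    def min_feature_nm(wavelength_nm, na, k1=0.3):
        return k1 * wavelength_nm / na

    wavelength = 193.0            # ArF scanner wavelength, as in the comment above
    dry_na = 0.85                 # near the practical limit for imaging in air (NA < 1.0)
    wet_na = 1.44 * 0.93          # assumed n of water at 193 nm times an assumed sin(theta)

    print(f"dry: ~{min_feature_nm(wavelength, dry_na):.0f} nm")   # ~68 nm
    print(f"wet: ~{min_feature_nm(wavelength, wet_na):.0f} nm")   # ~43 nm
    ```

    Higher-index immersion fluids would push NA, and hence the printable feature size, further still, which is the point of the last paragraph above.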
