Intel Hardware

Intel Moves Up 32nm Production, Cuts 45nm

Vigile writes "Intel recently announced that it is moving up the production of 32nm processors in place of many 45nm CPUs that have been on the company's roadmap for some time. Though spun as good news (and sure to be tough on AMD), the fact is that the current economy is forcing Intel's hand: the company is unwilling to invest much more in 45nm technology that will surely be outdated by the time the market cycles back up and consumers and businesses start buying PCs again. By focusing on 32nm products like Westmere, Intel's first CPU with integrated graphics, Intel is essentially placing a $7 billion bet on an economic turnaround in 2010."
  • by Chabo ( 880571 ) on Tuesday February 10, 2009 @07:46PM (#26806107) Homepage Journal

    ... AMD has 45nm. [wikipedia.org]

  • by Jurily ( 900488 ) <jurily&gmail,com> on Tuesday February 10, 2009 @07:52PM (#26806167)

    That was true for the last 20 years. The only problem is that today no one really cares about CPU speed anymore. 32nm technology will allow Intel to put more cores on a die; they'll get marginal frequency improvements, if any. We just need to wait for applications to follow and learn to use 16 cores and more (a minimal sketch follows this comment). I know my workload could use 16 cores, but the average consumer PC? Not so sure. That's why I'd like to see prices start to fall, instead of the same prices for more powerful PCs.

    We don't need more cores. Someone should have realized it by now. Raw CPU output isn't what the market needs anymore (even on Gentoo, which is kinda hard to accept).

    We need the same CPU with less power usage.
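
    A minimal sketch of what "learning to use 16 cores" can look like for an embarrassingly parallel workload, using Python's standard multiprocessing module. The crunch function and the chunk count are hypothetical stand-ins, not anything from this thread:

        # Python: spread independent work units across all available cores.
        from multiprocessing import Pool, cpu_count

        def crunch(chunk):
            # Hypothetical CPU-bound work unit: a sum of squares over a slice of integers.
            start = chunk * 1_000_000
            return sum(i * i for i in range(start, start + 1_000_000))

        if __name__ == "__main__":
            chunks = list(range(32))  # more chunks than cores keeps every core busy
            with Pool(processes=cpu_count()) as pool:
                results = pool.map(crunch, chunks)
            print(cpu_count(), "cores, total =", sum(results))

    The parent's caveat is the real catch: this only pays off when the work divides cleanly into independent pieces, which most consumer software doesn't.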

  • by digitalunity ( 19107 ) <digitalunityNO@SPAMyahoo.com> on Tuesday February 10, 2009 @07:54PM (#26806189) Homepage

    I disagree strongly. Processor speed is still very important - just not for the average consumer. For quite some time now, the majority of consumer applications have been IO and/or GPU bound.

    There is no such thing as a 'fastest useful processor' for some people, primarily in research and academia.

  • by CannonballHead ( 842625 ) on Tuesday February 10, 2009 @08:11PM (#26806419)

    *wakes Gogo0 up*
  • by RajivSLK ( 398494 ) on Tuesday February 10, 2009 @08:18PM (#26806517)

    most people already have computers

    Really? Have an eye-opening look here:

    http://www.economist.com/research/articlesBySubject/displayStory.cfm?story_id=12758865&subjectID=348909&fsrc=nwl [economist.com]

    Computer ownership is really very low worldwide. Even the US has only 76 computers per 100 people, and keep in mind that figure includes people like me who, between work and home use, have 4 computers alone.

    Some other shocking figures:
    Italy 36 computers per 100 people
    Mexico 13 computers per 100 people
    Spain 26 computers per 100 people
    Japan 67 computers per 100 people
    Russia 12 computers per 100 people

    And the billions of people in China and India don't even make the list.

    Seems to me that there are a lot more computers Intel could be selling in the future. The market is far from saturated.

  • by zippthorne ( 748122 ) on Tuesday February 10, 2009 @08:47PM (#26806845) Journal

    32nm means that the same processor can take roughly half the area on the die. You could use that to get more cores, or you could just use it to get more dies out of the wafer.

    I think someone noted not too long ago that the price of silicon (in ICs) per unit area hasn't changed much over the years, but the price per element has certainly gone down thanks to process shrinks.

    If you change nothing else, your 32 nm chip will consume less power and cost less than an otherwise nearly identical 45 nm chip.
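
    The "half the area" claim checks out, since a linear shrink applies in both dimensions. A quick back-of-envelope check in Python (the wafer die count is a made-up illustration):

        # Python: back-of-envelope die-area scaling from a 45nm to a 32nm process.
        old_nm, new_nm = 45.0, 32.0
        area_ratio = (new_nm / old_nm) ** 2     # the linear shrink applies in both dimensions
        print("area ratio: %.2f" % area_ratio)  # ~0.51, i.e. roughly half the area

        # Dies per wafer scale roughly with the inverse of die area
        # (this toy model ignores edge losses and yield).
        dies_before = 400                       # hypothetical count on one wafer
        print("dies after shrink: ~%d" % round(dies_before / area_ratio))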

  • by mephistophyles ( 974697 ) on Tuesday February 10, 2009 @08:48PM (#26806857)
    I wasn't around when they landed someone on the moon so I can't quite comment on that bit, but I can tell you what I (and the rest of my kind) use the extra processing power for:

    Finite Element Analysis (simulating car crashes to make cars safer before we crash the dummies in them)
    Multibody Dynamics (simulating robot behavior saves a ton of money; we can evaluate the different options before we build 10 different robots or spend a year figuring something out by trial and error)
    Computational Fluid Dynamics (designing cars, jets, and pretty much anything in between, like windmills: how they affect their surroundings and how efficient they are)
    Simulating Complex Systems (designing control schemes for anything from chemical plants to cruise control to autopilots)
    Computational Thermodynamics (working on that tricky global warming thing, or just trying to figure out how best to model and work with various chemicals or proteins)

    These are just the uses (that I know of) where more raw power helps out in mechanical engineering. I still have to wait about an hour for certain simulations or computations to run, and they're not even all that complex yet. Even a few percent improvement in how fast these things run can save us tons of time in the long run. And time is money...
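
    To make the "even a few percent" point concrete, a toy calculation; the speedup and run count are hypothetical stand-ins, and only the hour-long run comes from the comment above:

        # Python: how a modest per-run speedup compounds over many simulation runs.
        runtime_hours = 1.0    # one run takes about an hour (per the comment above)
        speedup = 0.05         # hypothetical: hardware that's roughly 5% faster
        runs_per_year = 1000   # hypothetical: runs a research group queues in a year

        saved = runtime_hours * speedup * runs_per_year
        print("~%.0f engineer-hours saved per year" % saved)  # ~50 hours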
  • by hydertech ( 122031 ) on Tuesday February 10, 2009 @09:01PM (#26807009) Homepage

    Intel announced today that it was investing $7bln to build new manufacturing facilities in the US to manufacture these chips.

    The new facilities will be built at existing manufacturing plants in New Mexico, Oregon, and Arizona. Intel is estimating 7,000 new jobs will be created. BizJournals.com [bizjournals.com]

  • by Anonymous Coward on Tuesday February 10, 2009 @09:01PM (#26807011)

    We already have lower-power CPUs. See the Athlon X2 series (a lower-power Athlon64 X2), 11 of which have a 45W TDP. Don't forget the laptop CPUs that happen to fit in desktop boards, and then the Intel Atom and VIA Nano. And that's only on the x86 side; there is also ARM and others. The die shrink will help lower power usage as well.

  • by wiredlogic ( 135348 ) on Tuesday February 10, 2009 @09:24PM (#26807277)

    That would be the Cyrix MediaGX circa 1997.

  • by Chabo ( 880571 ) on Tuesday February 10, 2009 @09:37PM (#26807393) Homepage Journal

    Additional disclaimer: I'm not a CPU engineer, and this is still based on things I read on public websites.

    I can't find the article, but Anandtech explained this well. Apparently the high-k process used in Intel's 45nm and smaller chips makes for incredibly low leakage currents.

    I did, however, find a graph that shows total system power consumption moving from 65nm (Conroe) to 45nm (Penryn), at the same clock speed: http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=3137&p=6 [anandtech.com]

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Wednesday February 11, 2009 @12:58AM (#26808293) Journal

    Again: What quality of movie?

    I can watch 1920x1080 movies, smoothly, at least 30fps, if not 60. A quick calculation (sketched after this comment) shows that the poor machine would likely be using over half its RAM just to store a single frame at that resolution. I'd be amazed if your 486 could do 640x480 at an acceptable framerate -- note that we had a different measure of "acceptable" back then.

    Also consider: Even if we disregard Flash, I am guessing talking to the network -- just straight TCP and IP -- is going to be its own kind of difficult. Keep in mind, Ogg Vorbis was named for how it "ogged" the audio, and machines of the time couldn't really do much else -- while decoding audio.

    Yes, there are hacks we could use to make it work. There are horribly ugly (but efficient) codecs we could use. We could drop JavaScript support, and give up the idea of rich web apps.

    And yes, there is a lot of waste involved. But it's been said before, and it is worth mentioning -- computers need to be faster now because we are making them do more. Some of it is bloat, and some of it is actual new functionality that would've been impossible ten years ago.
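
    The frame-buffer arithmetic alluded to above, assuming uncompressed 24-bit color and an 8 MB 486; both are assumptions, since the parent doesn't specify either:

        # Python: sanity-check the "over half its RAM for one frame" claim.
        width, height = 1920, 1080
        bytes_per_pixel = 3                        # assuming uncompressed 24-bit RGB
        frame_mb = width * height * bytes_per_pixel / 2.0**20
        print("one frame: %.1f MB" % frame_mb)     # ~5.9 MB

        ram_mb = 8                                 # assuming a well-equipped 486 of the era
        print("fraction of RAM: %.0f%%" % (100 * frame_mb / ram_mb))  # ~74%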

  • by Anonymous Coward on Wednesday February 11, 2009 @03:08AM (#26809219)

    The high-k process addresses gate leakage (current flowing from the gate into the substrate even when the transistor is not switching) by using a material with a higher dielectric constant (the k) to provide a better insulating layer between the gate and the substrate; traditionally that layer was silicon dioxide (formed by oxidizing the silicon). The channel, through which current flows from source to drain when the transistor is on, forms in the silicon just beneath this insulating layer.

    I'm not into materials, chemistry or quantum mechanics, but the reason leakage increases with shrinking transistor size is that as the transistors become smaller, so does the thickness of that insulating layer. What's more, it's become so thin that quantum tunneling of electrons through it is now a major contributor to the leakage.
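
    A toy model of why thickness matters so much: direct tunneling current falls off roughly exponentially with barrier thickness (a WKB-style approximation; the decay length below is an illustrative made-up value, not a measured one):

        # Python: toy model of gate tunneling leakage vs. insulator thickness.
        import math

        decay_length_nm = 0.1    # hypothetical characteristic decay length, illustration only

        def relative_leakage(thickness_nm):
            # Exponential falloff: each extra decay length of insulator
            # suppresses the tunneling current by a factor of e.
            return math.exp(-thickness_nm / decay_length_nm)

        # Halving an already-thin oxide raises leakage enormously:
        ratio = relative_leakage(0.6) / relative_leakage(1.2)
        print("1.2nm -> 0.6nm oxide: ~%.0fx more leakage" % ratio)  # ~403x

    That steep exponential is why a physically thicker high-k layer that delivers the same gate capacitance is such a win.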

  • by symbolset ( 646467 ) on Wednesday February 11, 2009 @03:32AM (#26809327) Journal

    The smaller feature sizes bring power savings as well. So they're taking the server of yesteryear and putting it in your pocket. They're delivering the technology that'll bring the next billion users online because those folks don't have the watts to burn that we do.

    They're also working to solve the whole I/O problem with servers that happens when you get too much processing power in one box.

    In fact, they're pretty well focused on not just learning new things and creating new products, but in delivering new technologies that improve the way we work and live. And then letting go of it so we can figure new ways to use it that haven't occurred to them.

    That's so different from the next story down, where another company is getting raked over the coals for dumping money into R&D. That other company is so famous for clinging to every ounce of leverage it can squeeze out of every vague interpretation or use of its innovations, and so toxic to deal with, that it could deliver an all-electric hovercraft that cured cancer and nobody would want to partner with it.

    Sweet.

"I've seen it. It's rubbish." -- Marvin the Paranoid Android

Working...