Intel Hardware

Intel Moves Up 32nm Production, Cuts 45nm 193

Vigile writes "Intel recently announced that it is moving up production of 32nm processors, displacing many of the 45nm CPUs that have been on the company's roadmap for some time. Though spun as good news (and sure to be tough on AMD), the fact is that the current economy is forcing Intel's hand: the company is unwilling to invest much more in 45nm technology that will surely be outdated by the time the market cycles back up and consumers and businesses start buying PCs again. By focusing on 32nm products like Westmere, Intel's first CPU with integrated graphics, Intel is essentially placing a $7 billion bet on an economic turnaround in 2010."
This discussion has been archived. No new comments can be posted.

  • by CarpetShark ( 865376 ) on Tuesday February 10, 2009 @07:47PM (#26806115)

    I know my workload could use 16 cores, but the average consumer PC? Not so sure.

    The average consumer PC uses:

    * word processing, which barely needs it, but can use it when performance matters, for background work like print jobs, grammar checking, and speech recognition
    * spreadsheets, which lend themselves very well to multithreading
    * games, which could lend themselves well if engines start doing things like per-creature AI and pathfinding in proper threads (ignoring work that's already on the GPU, like physics and graphics)
    * web browsing. Admittedly, web pages are not the ideal scenario for multicore, but with multiple tabs and multiple subprograms (Flash, JavaScript, downloads, etc.) all running in threads, this could utilise multicores well too. Presumably future use of more XML and the like will help push the boundaries there. And if we ever go down the road of RDF on the desktop, multicores will be very useful for collecting and merging data streams, running subqueries, and so on.
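The multi-tab browsing case above can be sketched with a worker pool: each tab's work is an independent job, so a small pool of workers can spread the jobs across available cores. A minimal structural illustration in Python (the per-tab workload here is a hypothetical stand-in, and Python's GIL aside, the shape is the point):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for one tab's work (decode, layout, script).
# Each tab is independent, which is what makes a pool effective.
def render_tab(tab_id):
    return sum(i * i for i in range(10000))

# 8 "tabs" handled by 4 workers: job count and thread count are decoupled.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(render_tab, range(8)))

print(len(results))  # every tab finished
```

The same structure applies whether the jobs are downloads, plugin instances, or script contexts: queue the work, keep the worker count fixed.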

  • by von_rick ( 944421 ) on Tuesday February 10, 2009 @08:28PM (#26806627) Homepage

    Great point. People who bought their machines when processors were at 65nm won't need to replace them until about 2011. By then, according to Intel's own prediction, we would be in the sub-10nm range.

    This is from an article from mid 2008: full article [crn.com]

    Intel debuted its 45nm process late last year and has been ramping its Penryn line of 45nm processors steadily throughout this year. The next die shrink milestone will be the 32nm process, set to kick off next year, followed by 14nm a few years after that and then sub-10nm, if all goes according to plan.

  • by Chabo ( 880571 ) on Tuesday February 10, 2009 @08:36PM (#26806731) Homepage Journal

    If someone made a CPU with many cores (>25, let's say), then one easy way to use all those cores would be to have each NPC have their own pathfinding thread.

    The problem right now in game design is the wide variety of hardware on the market. You still have gamers like me running single-core machines, and you have people running quad-core Hyper-Threaded machines. As a game studio, you have to code for everyone. If you made a thread for each NPC now, the task switching alone would choke the CPU in most games.

    You can read about Valve's difficulties making the Source engine multi-threaded in their paper "Dragged Kicking and Screaming: Source Multicore". http://valvesoftware.com/publications.html [valvesoftware.com]
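The thread-per-NPC problem described above is usually avoided with a fixed-size job pool: pathfinding requests go into a queue and a handful of workers drain it, so the thread count stays constant no matter how many NPCs exist. A rough sketch (the NPC jobs and the toy 1-D "pathfinder" are made up for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

# Toy pathfinding request: (npc_id, start, goal) on a 1-D line.
# A real engine would run A* over a navmesh here; the point is that
# each request is an independent job, not an independent thread.
def find_path(job):
    npc_id, start, goal = job
    step = 1 if goal >= start else -1
    return npc_id, list(range(start, goal + step, step))

jobs = [(npc, 0, npc + 3) for npc in range(100)]  # 100 NPCs...

with ThreadPoolExecutor(max_workers=4) as pool:  # ...only 4 threads
    paths = dict(pool.map(find_path, jobs))

print(len(paths))  # 100
```

This also sidesteps the hardware-variety problem the comment raises: the worker count can be set to the machine's core count at startup, so the same code scales from single-core to many-core boxes.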

  • by Sycraft-fu ( 314770 ) on Tuesday February 10, 2009 @08:58PM (#26806951)

    For one thing, Intel has always been ahead of pretty much everyone on fab processes. This isn't to say Intel will skip 45nm; they can't, as they already produce 45nm chips in large quantities. They have a 45nm fab online in Arizona cranking out tons of chips. Their Core 2s were the first to go 45nm, though you can still get 65nm variants, and all their new Core i7s are 45nm. So they've been doing it for a while, longer than AMD has (AMD is also at 45nm now).

    The headline isn't great because basically what's happening is Intel isn't doing any kind of leapfrog. They are doing two things:

    1) Canceling some planned 45nm products. They'd planned on rolling out more products on their 45nm process and are now canceling some of those. So they'll be doing fewer 45nm products than originally planned, not none (since they already ship some).

    2) Redirecting resources to accelerating the 32nm timescale. They already have all the technology in place for this; now it is the implementation phase. That isn't easy or fast. They have to retool fabs or build new ones, work out all the production problems, and design chips for the new process. This is already under way; a product like this is in the design phase for years before it actually hits the market. However, they are going to direct more resources to it to try to make it happen faster.

    More or less, they are just trying to shorten the life of 45nm. They want to get 32nm out the door quicker. To do that, they are going to scale back new 45nm offerings.

    Makes sense. Their reasoning is basically that the economy sucks right now, so people are buying less tech, and rolling out new products isn't likely to make them a whole lot of money. It's not as if their current products are crap, either; they compete quite well. So rather than offer incremental upgrades that people probably aren't that interested in unless they're buying new anyway, they'll wait. They'll try to have 32nm out the door sooner so that when the economy does recover, their offerings are that much stronger.

    Overall, probably a good idea. Not many people are buying systems just to upgrade right now, so having something just a bit better isn't a big deal. If someone needs a new system, they'll still buy your stuff; it's still good. Get ready so that when people do want to buy upgrades, you've got killer stuff to offer.

  • by plague911 ( 1292006 ) on Tuesday February 10, 2009 @09:10PM (#26807107)
    Just a guess of mine, but the fact of the matter is that some semiconductor PhDs out there think the end of the line is coming for the reduction in device feature size. I believe my professor last term said he figured the end would come around the 22nm mark, not much further. I could be wrong about the exact number (I hated that class). But the point is, once the end of the line is reached, profits hit a brick wall and the whole industry may take a nose dive. Right now, every year there is something bigger and better released. But what happens when the technology stagnates? There will probably still be progress, but the rate of progress will likely slow substantially. In short, semiconductor companies may be in a race, but none of them wants to finish it.
  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Tuesday February 10, 2009 @09:20PM (#26807235) Journal

    Why would even watching a video on youtube need a 16-core processor?

    You clearly underestimate how much Flash sucks.

    People got along just fine on Pentium II's.

    And they did quite a lot less. Ignoring Flash, those Pentium IIs, I'm guessing, are physically incapable of watching a YouTube video, and are certainly incapable of watching an HD video from, say, Vimeo.

  • by adpowers ( 153922 ) on Tuesday February 10, 2009 @10:27PM (#26807535)

    Yeah, I noticed that this morning when I read about the investment. They closed a bunch of older facilities in Asia, laying off the workers, and are building the new fancy fabs in the US (and creating high paying jobs in the process).

    Of course, the next thing that came to mind was whether Slashdot would cover that aspect of the story. Sure enough, Slashdot's summary completely disregards that Intel is creating jobs in America. I suspect there are two reasons for this: 1) it hurts Slashdot's agenda to report on companies insourcing, since readers should only hear about outsourcing by "the evil corporations"; and 2) Intel is the big bad wolf, so we can't report anything good they do.

  • Re:Safe Bet (Score:5, Interesting)

    by artor3 ( 1344997 ) on Tuesday February 10, 2009 @11:35PM (#26807735)

    FYI, poor people don't disappear when you stop looking at them.

    Having large amounts of poverty in the nation will breed crime, reduce sales, cause layoffs, and generally decrease the quality of life for those of us who planned ahead.

    Sometimes it sucks to be one of the responsible ones. If you didn't learn that throughout grade school and college, then I don't know what more to tell you.

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Wednesday February 11, 2009 @01:58AM (#26808763) Journal

    But how much of that is the need for raw power VS the problem of really crappy code?

    There's a lot of each.

    Every now and then, I run a test of a YouTube (or other) video played in its native Flash player, and in a third-party player like VLC or mplayer.

    Not only is the mplayer version higher quality (better antialiasing) and more usable (when I move my mouse to another monitor, Flash fullscreen goes away), but it's the difference between 30-50% CPU for the tiny in-browser Flash version and 1% or less for fullscreen mplayer.

    In Flash 10 -- yes, I'll say that again, flash TEN, the latest version -- they finally introduced hardware acceleration and 3D graphics. I think I know why it's called Flash 10 -- it finally lets Flash developers do what desktop developers were doing 10 years ago.

    now I have noticed the software has taken on the SUV model of not caring how crappy the resource suckage as long as you can add more crap to it.

    There are other reasons to hate SUVs...

    But here's a good reason to think about functionality and development time long before you think about performance:

    While there are obvious exceptions, you pretty much always find that performance is inversely proportional to readability, maintainability, and stability.

    Specifically, there was a study which showed that bugs per LOC remain constant across languages. So, if it takes me 500 lines to do something in assembly, and 100 lines to do it in C, and 10 lines to do it in Ruby, I'll take the Ruby version unless I have a very good reason not to. There's less chance I'll screw something up, and it's probably much clearer what I mean.

    The tools will catch up -- Ruby just got twice as fast. And the really performance-critical stuff, I can rewrite in C, or even assembly, if I must. But for the most part, even on a netbook, there's tons of resources to throw at the problem, versus the amount of programmer resources it might take.

    Because with the economy in the toilet and prices likely to go no where but up

    ...

    Do you have any idea how the economy works?

    when it comes to energy we could all use more efficient machines

    Ok, quick question: How much power does your system use? This laptop typically uses less than 25 watts for the whole system. The adapter is rated for 90 watts, but that's just headroom.

    It has 128 gigs of disk space, 4 gigs of RAM, and dual 2.5GHz CPUs that run at 800MHz most of the time.

    If a program is 10 megs, uses 200 megs of RAM, and uses some 20% of a single core, I really don't care. I might bitch about how it could be more efficient, but it works, and I can run it.

    If a program is 100 kilobytes, uses 2 megs of RAM, and less than 1% of a single core, fine! Great! But if that program also crashes periodically, isn't 64-bit compatible, and is missing large chunks of functionality, and despite being open source I'm loath to try to add it myself, as my C fu is not strong and their code isn't very readable... I'd say it's not worth it, and I'd seriously consider the inefficient alternative.

    You are right -- you shouldn't need a dual core to watch Youtube. But wishing for the "good old days" is just as foolish.
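The bugs-per-LOC argument a few paragraphs up is easy to feel in practice: a task that is a handful of lines in a high-level language expands to dozens in C, with every extra line a chance for a bug. A small Python illustration of the kind of task meant (the word-count example itself is made up, not from the study the comment cites):

```python
from collections import Counter

# Word-frequency count: a few lines here. The equivalent C needs a hash
# table, manual allocation, and string handling -- many times the LOC,
# and, per the constant-bugs-per-LOC observation, many times the bugs.
def top_words(text, n=2):
    return Counter(text.lower().split()).most_common(n)

print(top_words("the cat and the dog and the bird"))
# -> [('the', 3), ('and', 2)]
```

And, as the comment says, if profiling later shows this is the hot path, just that one function can be rewritten in C while the rest stays short and readable.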

  • Re:Safe Bet (Score:3, Interesting)

    by Kirijini ( 214824 ) <kirijini@nOSpam.yahoo.com> on Wednesday February 11, 2009 @02:28AM (#26808977)

    "We shouldn't call a Trillion dollars of pork a stimulus"

    A whole hell of a lot of the stimulus package is tax cuts, over 250 billion dollars' worth. Another $350 billion is going to education, healthcare (like Medicaid), and food stamps. You can't call any of that pork. That money isn't going to special projects in congresspeople's districts. All of it goes to the states, or relieves the tax burden on individuals and employers.

    There are also things like highway maintenance, energy investment, and some telecom stuff. You might consider that pork. It's not; it's an investment in infrastructure. Massive investments like this produce demand for labor and resources and create opportunities for entrepreneurs to form small businesses, or for small businesses to become big ones. Some of it is short-term and in established markets, like road work and building construction contracting. Some of it is long-term investment in developing or new markets, like alternative energy and electronic medical records.

    It is spending, instead of cutting. But look at the plan. It's not increasing the size of the national government. It's mostly aid to the states. You want to prevent pork? Then pay attention to your state legislature. They're gunna be the ones spending most of it.

    "This is the real estate buying opportunity of a lifetime."

    You can't buy if you don't have resources to pay with. If you have a huge amount of hard savings (cash or gold in your mattress), then you're right: go out there and buy some foreclosed homes. If you have a huge amount of mutual funds, stocks, real estate, etc., then you've been losing value and probably can't afford to buy new property. If you're like most people and borrow the huge amount of money you need to buy real estate, what assurance do you have that you can pay it back? Your job? How do you know you're gunna keep it through the bad economy? Nobody knows how bad this is gunna get.

    This isn't an opportunity for buying real estate. The opportunity comes when the recovery is underway, as people feel more secure and credit loosens up.
