
Ivy Bridge Running Hotter Than Intel's Last-gen CPU

Posted by Soulskill
from the water-under-the-bridge dept.
crookedvulture writes "The launch of Intel's Ivy Bridge CPUs made headlines earlier this week, but the next-gen processor's story is still being told. When overclocked, Ivy Bridge runs as much as 20C hotter than its Sandy Bridge predecessor at the same speed, despite the fact that the two chips have comparable power consumption. There are several reasons for these toasty tendencies. The new 22-nm process used to fabricate the CPU produces a smaller die with less surface area to dissipate heat. Intel has changed the thermal interface material between the CPU die and its heat spreader. Ivy also requires a much bigger step up in voltage to hit the same speeds as Sandy Bridge."
  • Good! (Score:5, Funny)

    by Kenja (541830) on Saturday April 28, 2012 @11:45AM (#39832033)
After switching all my lights to LED bulbs, it's a bit cold in my office. A new, hotter CPU could be just what I need.
    • Re: (Score:2, Offtopic)

      by AC-x (735297)

      Wait, you were in an office not lit with horrible fluorescent strip lighting?

    • This reminds me of something that happened in one of our old computer labs, one which we affectionately called "The Ice Box". As the story goes the A/C was originally designed on the assumption that the room would contain a few dozen computers, which it did, and an equal number of CRTs, which it did not. I suppose it's because the new building took so long going up that they missed the big switch to LCDs.

      The whole thing is probably apocryphal but I found it quite amusing all the same; there's something quit

      • by nschubach (922175)

        Obviously, if it's twenty degrees outside you need more heat, not an air conditioner. ;)

      • by Eskarel (565631)

I've had people tell me that LCDs actually generate close to the same amount of heat, but when I was at university I worked in a computer lab which switched, and it was noticeably cooler. This was, however, a state school with a shared heating system between multiple buildings, which meant they did some silly things like turning the heat on and off based on a calendar rather than the weather, so I can't attest scientifically that the difference was entirely down to the LCD switchover.

      • by ultranova (717540)

        As the story goes the A/C was originally designed on the assumption that the room would contain a few dozen computers, which it did, and an equal number of CRTs, which it did not. I suppose it's because the new building took so long going up that they missed the big switch to LCDs.

        Not to mention the invention of the thermostat.

    • Hotter != more heat (Score:5, Informative)

      by Ken_g6 (775014) on Saturday April 28, 2012 @01:46PM (#39832759) Homepage

After switching all my lights to LED bulbs, it's a bit cold in my office. A new, hotter CPU could be just what I need.

      You're confusing temperature and heat. A candle burns hotter than a person, but a person puts out more heat (100W) than a candle (80W). Likewise, Ivy Bridge puts out less heat than Sandy Bridge, even though it's hotter.
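The temperature/heat distinction above can be made concrete with a toy steady-state model: junction temperature is ambient plus power times junction-to-ambient thermal resistance, so a chip dissipating less total heat can still run hotter if its thermal resistance is worse. The numbers below are purely illustrative, not measured Ivy/Sandy Bridge values.

```python
# Steady-state model: T_junction = T_ambient + P * theta,
# where theta is junction-to-ambient thermal resistance (degC/W).
# Illustrative numbers: the "IB-like" chip dissipates less power,
# but its smaller die / worse TIM give it a higher theta.

def junction_temp(t_ambient, power_w, theta_c_per_w):
    """Return junction temperature in degC."""
    return t_ambient + power_w * theta_c_per_w

t_amb = 20.0
t_sb = junction_temp(t_amb, power_w=230.0, theta_c_per_w=0.26)  # SB-like
t_ib = junction_temp(t_amb, power_w=200.0, theta_c_per_w=0.34)  # IB-like

print(f"SB-like: {t_sb:.1f} C at 230 W")  # cooler despite more heat
print(f"IB-like: {t_ib:.1f} C at 200 W")  # hotter despite less heat
```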

      • by Rozine (1345911)
        If you immediately know the candle light is fire, then the meal was cooked a long time ago...
      • by hey! (33014)

        A candle burns hotter than a person? How did they find that out? Burn the test subject in a calorimeter?

      • by thegarbz (1787294)

The takeaway comment here is: if you want to heat a room, forget about lighting a fire, just go invite some hot bodies over.

      • Can I restate what you said but a little more clearly?

        The candle has higher power density than a person.

        Or more specifically...

IB power output (or heat output, same thing) is LESS than SB, but it fluxes out of a smaller die area, so the junction temperature on the lid of the package is higher. The package psi-JC (junction-to-case thermal resistance, a package property) has not improved significantly, so the same heat flux rate from the die, through the lid (and through the heatsink, and through the chassi

    • I partially heat my home office with my Pentium 4.

      Really. Sometimes I boot my old box just because the room is chilly.

      • Do some BitCoin mining on your GPU, with aggression set to maximum.

        It'll be like a tropical island within the hour, except without the wonderful ocean breeze that normally accompanies it.

    • A bit cold?

      I once worked at a company that moved in to an office space before heating was installed. It was the middle of winter, in Canada, and all the equipment was brought up via crane through an opening in the side of the building.

      I used a computer running a busy loop program to heat my office. It worked well as a space heater.

  • by Trepidity (597) <delirium-slashdot@@@hackish...org> on Saturday April 28, 2012 @11:55AM (#39832087)

    It's clear in the article, but the headline here sort of implies that the chips run hotter in general, whereas this test is only saying the new chips run hotter when overclocked. From what I can find, when run at the rated voltages/speeds, Ivy Bridge CPUs run at about the same temperature as last gen's CPUs.

    • by Alastor187 (593341) on Saturday April 28, 2012 @12:24PM (#39832245)

      It's clear in the article, but the headline here sort of implies that the chips run hotter in general, whereas this test is only saying the new chips run hotter when overclocked. From what I can find, when run at the rated voltages/speeds, Ivy Bridge CPUs run at about the same temperature as last gen's CPUs.

Seems like that would make sense if at normal 'voltage/speed' the Ivy Bridge is using less power. Based on the numbers in the link, the Ivy Bridge has a higher overall thermal resistance, junction-to-air, of roughly 30% [=((100C-20C)/(80C-20C))*(231W/236W)]. Based on other reviews the Ivy Bridge processors use less power at stock frequency/voltage, so that may be offsetting much of the temperature rise due to the increase in package resistance and heatsink interface resistance under normal conditions.

Power dissipation increases exponentially with increases in frequency/voltage, and it appears to rise faster with the Ivy Bridge processors. So as the power dissipation approaches or exceeds that of the Sandy Bridge processor, much higher processor temperatures will be measured on the Ivy Bridge because of the higher thermal resistances.

      I think this is a non-issue for the average consumer. However, overclockers would probably be better off with the Sandy Bridge hardware.
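The ~30% figure above follows directly from the definition of thermal resistance, theta_JA = (Tj - Ta) / P. The sketch below uses the parent's numbers (100°C at ~236 W for Ivy Bridge, 80°C at ~231 W for Sandy Bridge, 20°C ambient), so treat them as that poster's readings rather than verified specs.

```python
# Junction-to-ambient thermal resistance: theta_JA = (Tj - Ta) / P
def theta_ja(t_junction_c, t_ambient_c, power_w):
    """Thermal resistance in degC per watt."""
    return (t_junction_c - t_ambient_c) / power_w

# Numbers quoted in the parent post (not verified specs):
theta_ib = theta_ja(100.0, 20.0, 236.0)  # ~0.34 C/W
theta_sb = theta_ja(80.0, 20.0, 231.0)   # ~0.26 C/W

increase = theta_ib / theta_sb - 1.0
print(f"Ivy Bridge theta_JA is {increase:.1%} higher")  # ~30.5%
```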

      • Ivy Bridge is smaller in area than Sandy Bridge. Assuming I got the right numbers from Wikipedia, 160 mm^2 vs 216. That's 74% the area for heat transfer.
        • Ivy Bridge is smaller in area than Sandy Bridge. Assuming I got the right numbers from Wikipedia, 160 mm^2 vs 216. That's 74% the area for heat transfer.

Agreed, but it doesn't necessarily scale linearly when including spreading effects. Using the numbers you provided, one would estimate a ~35% increase in temperature at a given power dissipation when comparing the Ivy Bridge to the Sandy Bridge. Based on the linked article the increase was only ~30%, which seems reasonable if expecting slightly improved heat transfer performance due to spreading across an oversized heatsink.

          With the limited amount of hardware information in this thread, one could probab
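The area-scaling estimate in this subthread can be checked in a couple of lines: if thermal resistance scaled purely inversely with die area, the 216 → 160 mm² shrink would predict a ~35% larger temperature rise at fixed power, close to the ~30% inferred from the measurements. This is a back-of-envelope model that ignores heat spreading, which plausibly accounts for the gap.

```python
# Die areas as cited in this subthread (Wikipedia figures).
area_sb_mm2 = 216.0  # Sandy Bridge
area_ib_mm2 = 160.0  # Ivy Bridge

# If theta_JA scaled purely inversely with die area, the temperature
# rise (Tj - Ta) at a fixed power would grow by the inverse area ratio.
predicted_increase = area_sb_mm2 / area_ib_mm2 - 1.0
print(f"predicted increase in delta-T: {predicted_increase:.0%}")  # 35%
```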

      • However, overclockers would probably be better off with the Sandy Bridge hardware.

        At the same clock, probably, but with the die shrink, Ivy Bridge should get better performance per watt than the Sandy Bridge OC'd to the same power. I may be making the silly assumption that OC is still about getting more performance and not just pushing hardware beyond its limits.

        How Intel prices those isn't directly relevant to their performance, of course, since Ivy Bridge is the "new hotness".

    • by tommasorepetti (2485820) on Saturday April 28, 2012 @12:36PM (#39832299)
Completely agree. The overclocking community is somewhat overrepresented in reviews of computer hardware. Overclockers are, in general, very knowledgeable, so I am not saying that their voices as part of the reception are a problem--it is, after all, often overclockers who push the limits of current-generation architectures and empower consumers. It is important to note, however, that thermal issues when overclocked are secondary to efficiency and power consumption for well over 99% of all computing applications. I work in HPC and obviously care about eking out performance from my platforms, but I have never overclocked a CPU. A modest performance increase is completely secondary to jeopardizing the reliability of a computer system. As far as I am concerned, this particular critique is irrelevant, and I think that many other lay people and professionals would feel the same way. I am much more interested in knowing whether the longevity of the new chips is commensurate with that of the previous generation.
      • by spire3661 (1038968) on Saturday April 28, 2012 @01:18PM (#39832563) Journal
Overclocking had its day back when the Celeron 300A was out. Now it's all poseurs OCing to get a few more frames and burning out their CPUs. Very VERY few of them OC for anything more than e-peen. Sure, you'll get some folding guys or dudes running triple 4k monitors. When I OC'd back in the day it was so I could MOVE faster in Quake 3, not so I could post benchmarks. Overclocking should be used to reach a performance level you couldn't otherwise get with money.
        • Re: (Score:2, Funny)

          by Anonymous Coward

          I also care deeply about how other people use their hardware, and would like to subscribe to your newsletter.

        • As a card-carrying dyke who's fully paid up on her membership dues, I couldn't care less about e-peen.

          My computer is overclocked because I do a lot of video encoding. For gaming it doesn't make a difference (and if you check the logs, most of the time in most games the CPU underclocks itself to 1.6GHz anyway), but when you're doing a video encode, particularly a large video encode or a lot of transcoding (the kind of operation that will keep your CPU pegged for 3 days in a row, and I'm talking about a Core

        • by hairyfeet (841228)

Not to mention, how many actually DO OC their chips now? 2%? 3%? If it reached even 5%, frankly, I would be amazed. I too OCed back in the MHz wars, but that was because both the hardware AND the software were leaping ahead so quickly that unless you were uber-rich, the machine you bought last week would have struggled to run the latest software sometimes not even a year later. My CPUs went from 300 to 733 to 1100 to 1700MHz, and that was like a 4-year stretch max, and I wasn't always able to be the latest and grea

          • by citizenr (871508)

            Not to mention how many actually DO OC their chips now? 2%? 3%? if it reached even 5% frankly I would be amazed.

Intel wouldn't be actively fighting OC in its products if it was only 2-3%. Socket 775 was the last one that allowed you to OC the shit out of every CPU you put in it (I'm typing this from a C2D 2.4@4GHz, a Celeron 420 1.6@3GHz before that). Basically almost 100% more CPU with a few BIOS changes, and a lot more fps in games (especially on weaker CPUs).
It used to be possible to buy a $25-40 CPU and OC it to a $100-150 performance level.

            This time has passed with new sockets. Now Intel has OC tax, OC'able CPUs start at $200 and

            • by hairyfeet (841228)

              But what you are talking about is not because the market is large, quite the contrary, it is because Intel knows that the market isn't large and wants to be able to force an upsell on those that are into OCing. I mean why let someone OC a $200 CPU and get the same performance as a $400 CPU when you can force them to buy the $400 CPU just to be able to OC at all? This also lets them build OCing as a market for the "elite" and as we have seen many will try to buy the more expensive unit if they think it someh

              • by drsmithy (35869)

                But what you are talking about is not because the market is large, quite the contrary, it is because Intel knows that the market isn't large and wants to be able to force an upsell on those that are into OCing.

                If you think Intel don't let you overclock so they can upsell to some single-digit (being _very_ generous) percentage of customers (who are nearly entirely budget-driven in their purchasing), you're off in la-la land.

                • by hairyfeet (841228)

Uhhhh... you haven't actually BEEN to any of the OCing forums of late, have you, friend? The guys on those sites are frankly spending more on their liquid coolers than I did on my whole PC. They are RAIDing SSDs, dual and triple SLI or Crossfire is the norm, and many are sporting 24 and 32GB of RAM. Hell, check the top 30 leaderboards of any OCing forum: these guys are NOT cheap.

So I'm sorry, friend, but while OCing USED to be poor folks squeezing a little more performance out of a Celeron 300A, those days are long past.

                  • by drsmithy (35869)

Uhhhh... you haven't actually BEEN to any of the OCing forums of late, have you, friend? The guys on those sites are frankly spending more on their liquid coolers than I did on my whole PC. They are RAIDing SSDs, dual and triple SLI or Crossfire is the norm, and many are sporting 24 and 32GB of RAM. Hell, check the top 30 leaderboards of any OCing forum: these guys are NOT cheap.

                    I'd be astounded if any more than a vanishingly tiny proportion of people overclocking did this.

                    It's like going to a $CAR forum and concluding

                    • Hairy is correct. The OCing scene is a lot like the PC version of street drag racing. Just as expensive too. OCing used to mean cheating the system. You cranked up a shitty little Celeron 300 with a better HSF and called it a day. Now, it's all about who's the better benchmark queen.

                    • by drsmithy (35869)

                      The OCing scene is a lot like the PC version of street drag racing.

                      Which pretty much proves my point.

Unless you think every Tom, Dick & Harry who chips his car's engine or puts on a high-flow exhaust is a street racer?

                    • by hairyfeet (841228)

                      Thanks, but notice how he STILL doesn't get it? He thinks the poor are still OCing but the simple fact of the matter is that poor folks are happy with what they have, frankly they don't have enough work to push the CPU they have so OCing (which naturally shortens the life of the CPU, just as drag racing shortens the life of the engine) simply holds no appeal to them.

                      But you look at even the top 50 slots on any leaderboard or go to ANY forum that deals in OCing (which of course if one were poor and OCing th

            • by drsmithy (35869)

Intel wouldn't be actively fighting OC in its products if it was only 2-3%.

              Intel don't block overclocking because of some insignificant number of computer enthusiasts, they do it to help prevent widespread fraud by unscrupulous operators who pass off overclocked chips as genuine.

        • by drsmithy (35869)

          Overclocking should be used to reach a performance level you couldn't otherwise get with money.

When the 300A was released, you could already buy a full Pentium II at 450MHz.

          • by toddestan (632714)

            Depending on what you were doing, the full speed on-die 128k L2 cache on the overclocked 300A could be faster than the half-speed off-die 512k cache on the P2.

An E1-stepping Ivy 3770K CPU can hit 4.6GHz (while keeping temps under 70°C when benched), while the current Sandy Bridge 2600K CPU easily reaches 4.9GHz under the same conditions. That is 300MHz more, and the power consumption after overclocking is greater too, so for enthusiasts this means no deal. Hopefully this is only an early-model issue; people are now waiting for a new stepping or a different Ivy lineup.
      • The general benchmarking consensus seems to be that IB is ~10% faster than the comparable SB chips, without OCing. Throw that "invisible" 10% onto the OCed clock speed, and IB should still be coming out ahead, at a SB-style 5.1 GHz.

        (I haven't seen any OC benchmark comparisons yet, to see if this is actually true)

    • by Anthony Mouse (1927662) on Saturday April 28, 2012 @12:42PM (#39832331)

      This is true as far as it goes, but the behavior when overclocked is telling for more than how well you can overclock: At the risk of stating the obvious, the chips the overclockers are having heat issues with are the ones Intel is manufacturing. That means Intel isn't going to be able to ramp the clock speed very easily for the same reasons that the overclockers are running into trouble, unless there is some significant and avoidable flaw in the chip or the process that they can remove in future revisions.

      On the plus side, this gives AMD a little breathing room to try to catch up a little.

    • by Kjella (173770)

Even though TDP has gone down from 95 to 77W, the die size has shrunk from 216 to 160 mm^2, so power density is up from 0.44 to 0.48 W/mm^2. It's probably getting harder and harder to make heat sinks that spread it effectively enough, particularly with overclocking. For the non-overclocker I'd say the new chips are clearly better though, as they're fan-noise, battery-life and electricity-bill friendly, with a small boost in performance and $5-10 cheaper than the equivalent SB. And a better IGP if you'll ever us
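The parent's arithmetic checks out (keeping in mind that TDP is a rated worst-case envelope, not measured draw, so W/mm² here is only a rough proxy for actual heat flux):

```python
def power_density(tdp_w, die_area_mm2):
    """Rated TDP spread over die area, in W/mm^2."""
    return tdp_w / die_area_mm2

sb = power_density(95.0, 216.0)  # Sandy Bridge: ~0.44 W/mm^2
ib = power_density(77.0, 160.0)  # Ivy Bridge:   ~0.48 W/mm^2

print(f"SB: {sb:.2f} W/mm^2, IB: {ib:.2f} W/mm^2")
```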

    • by Mattsson (105422) on Saturday April 28, 2012 @01:41PM (#39832723) Homepage Journal

      Also, if a large part of the reason why the Ivy Bridge CPU runs hotter is the smaller area of the chip and the changed thermal interface materials, this means that while the new CPU chip might run hotter than the previous one, it doesn't put out more heat.
      The CPU is hotter but the heat sink is cooler since the energy can't be transferred from the chip to the heat sink fast enough.
      If this is the case, then Intel need to do something about the CPU package before going to higher frequencies.
      It also means that people needing the extra heat in their cold rooms would be disappointed since the heat output would be lower, not higher. ;-)

      • Intel need to do something about the CPU package before going to higher frequencies

        This is the story of die shrink - more performance per area, less heat per performance, but more heat per area.

I remember when my 486s ran without any passive (much less active) cooling at all. Today even my Atoms struggle with passive cooling solutions.

Typical Slashdot sensationalism to leave that out of the headline. I clicked expecting another Prescott/Pentium D fiasco, but no. It's not even some kind of merit-less non-story, just one being misrepresented by the submitter. It was even tagged "false" in Firehose and got posted as-is anyway.
  • Speed? (Score:2, Insightful)

    by cbreak (1575875)

    Ivy also requires a much bigger step up in voltage to hit the same speeds as Sandy Bridge.

    I get the feeling that they have very weird notions about what constitutes CPU Speed...

    • by Surt (22457)

      They omitted the word clock, but I'd still say their meaning was clear.

  • by edxwelch (600979) on Saturday April 28, 2012 @11:59AM (#39832115)

Remember a year ago Intel was bragging that their new 3D tri-gate process would be 50% more power efficient: http://www.intel.com/content/www/us/en/silicon-innovations/standards-22nm-3d-tri-gate-transistors-presentation.html [intel.com].
Comparing the i7 3770K against the 2600K, which is clocked at the same frequency, it's only 17% more power efficient: http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/20 [anandtech.com]
Also, you have to bear in mind that some of the power saving is due to the DDR controller power gating.

    • by gstrickler (920733) on Saturday April 28, 2012 @12:25PM (#39832249)

      Read the Anandtech review, that's total system power consumption. If you compare just the CPU power consumption it's ~33% more power efficient (66W increase from idle to load for Ivy Bridge vs 98W increase for SB). And if you look at the GPU intensive comparisons, IB is ~20% more power efficient, but that's including a ~33% increase in GPU cores and an increase in GPU clock, for an ~40% increase in performance while using 20% less power. For the first generation chips on a brand new production process, those are very good results. I expect to see them improve as their 22nm tri-gate process matures.
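The ~33% figure above comes from comparing idle-to-load power deltas rather than total wall power, which isolates the CPU's contribution. A sketch using the deltas quoted in the parent (taken as given from the review, not independently measured):

```python
# Wall-power deltas from idle to load approximate the CPU's own load
# power, since the rest of the system draws about the same at idle.
delta_ib_w = 66.0  # Ivy Bridge idle-to-load increase (from the review)
delta_sb_w = 98.0  # Sandy Bridge idle-to-load increase

saving = 1.0 - delta_ib_w / delta_sb_w
print(f"Ivy Bridge draws {saving:.0%} less CPU power at load")  # ~33%
```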

      • People are just whiny these days. They expect every new product to completely blow away the old one in all respects, and cry when that doesn't happen.

Same shit with the nVidia GTX 680. It is by all accounts a great card. The fastest single consumer GPU ever, much more power-efficient per unit of work done, emits less heat, and so on. However, it doesn't completely crush the previous generation of hardware. It is only faster, not crushingly faster. So people get all mad about it as though if evil nVidia had just

    • by SurfsUp (11523)

      Remember a year ago Intel was bragging about their new 3d tri-gate process would be 50% more power efficient: http://www.intel.com/content/www/us/en/silicon-innovations/standards-22nm-3d-tri-gate-transistors-presentation.html [intel.com].

Yes, the slides are unambiguous: "Greater than 50% reduction in active power going from 32nm to 22nm". Now Intel tells us they have been predicting modest efficiency gains all along. For the last few months, maybe. The truth is, Intel realized months ago the process would not meet expectations and fired up the spin machine back then.

  • by ganjaganja (2031696) on Saturday April 28, 2012 @12:04PM (#39832151)
    Not all of us do overclocking. Subject is misleading.
    • by arbiter1 (1204146)
I agree; I mean, only enthusiasts really overclock. For MOST people a stock-speed CPU is way more than fast enough. I am a pretty heavy gamer and I have a first-gen i7 870; the most overclocking I do on it is the turbo boost from 2.9 to 3.2GHz built into the CPU, and it's a fast CPU for everything I use it for. No need to overclock it.
    • by Mashiki (184564)

If you have a Sandy Bridge CPU you overclock by default. The CPU supports dynamic OCing to increase performance based on available cooling capacity and CPU/GPU demand.

    • I think these days technically ALL of us do overclocking thanks to Turbo Boost and similar such technologies which up the frequency of the processor when only some cores are loaded. [intel.com]

  • by game kid (805301) on Saturday April 28, 2012 @12:21PM (#39832233) Homepage

    So lemme see here...Intel's new CPU dies are now smaller (good), which makes them less dissipative of heat (bad), so they decide to use worse thermal paste stuff?

    Seems legit.

    • so they decide to use worse thermal paste stuff?

      I don't think I've used Intel or AMD-supplied thermal paste in 10 years - I haven't done the precise math, but I assume I make it back on the electric bill over time by using less active cooling energy. And I rarely overclock anything.

      • by hairyfeet (841228) <bassbeast1968&gmail,com> on Saturday April 28, 2012 @06:42PM (#39833955) Journal
They aren't talking about using thermal paste on the actual heatsink, friend; they are talking about the thermal paste between the actual chip die and the INSIDE of the heat spreader, on top of which you then personally use whatever compound you wish. You see, this is why I don't believe TFA: all of those rumors have so far been based on engineering samples, which are just that, samples of an unfinished chip given to reviewers. I just can't see a company as successful as Intel hobbling their latest chips by using some dirt-cheap thermal paste at the critical juncture between the actual die and the heat spreader just to save a few pennies.
    • by bloodhawk (813939)
As others have pointed out, the /. summary is highly misleading; the article is about heat at overclocked speeds. The CPUs dissipate heat and use less power quite happily at normal operating clock speeds.
  • Emphasis on sample
If the retail releases also have this issue, then it's newsworthy.
  • The majority of people do not overclock their CPUs so this is not an issue for the majority.

Power consumption varies with the square of the voltage (P=V^2/R) while varying only linearly with frequency. If it takes significantly more voltage to overclock, then it's no wonder the power usage is so high.
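This comment describes the standard dynamic-power model for CMOS, P ≈ C·V²·f, which is why a voltage bump hurts far more than a frequency bump. A quick sketch (the scaling percentages are illustrative, not Ivy Bridge measurements):

```python
# Dynamic CMOS power: P ~ C * V^2 * f (C = switched capacitance).
def relative_power(v_scale, f_scale):
    """Power multiplier after scaling voltage and frequency."""
    return v_scale ** 2 * f_scale

# +20% clock at stock voltage, vs. +20% clock that needs +15% voltage:
print(f"{relative_power(1.00, 1.20):.2f}x power")  # 1.20x
print(f"{relative_power(1.15, 1.20):.2f}x power")  # 1.59x
```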

Looking at the pictures in the Overclockers.com link, you'd probably get better thermal dissipation if Intel had left the heat spreader off, with nothing except the protective overcoat on the back of the chip.

    Actually, I bet modders are going to start cracking the IHS off for that very purpose, in order to directly contact their heatsinks.

  • This is a Feature (Score:4, Insightful)

    by Sarusa (104047) on Saturday April 28, 2012 @05:48PM (#39833781)

    It's hotter when overclocked. Overclockers love having to run pipes and submerge things. How are you going to justify hauling out the liquid nitrogen if it's running cool?

    Meanwhile everyone else is happier that it runs cooler, takes less power, is faster, and even costs less than Sandy Bridge.

    This is Win Win, people.

  • Seriously. What kind of quantum chromodynamics calculations are you simply not getting done today?

  • by sidthegeek (626567) on Saturday April 28, 2012 @08:27PM (#39834445)
    This brings new meaning to the term "burning bridges". ;-)
