Hardware

Is Overclocking Over?

MrSeb writes "Earlier this week, an ExtremeTech writer received a press release from a Romanian overclocking team that smashed a few overclocking records, including pushing Kingston's HyperX DDR3 memory to an incredible 3600MHz (at CL10). The Lab501 team did this, and their other record breakers, with the aid of liquid nitrogen which cooled the RAM down to a frosty -196C. That certainly qualifies as extreme, but is it news? Ten years ago, overclocking memory involved a certain amount of investigation, research, and risk, but in these days of super-fast RAM and manufacturers' warranties it seems a less intoxicating prospect. As it becomes increasingly difficult to justify why a person should overclock at all, has the enthusiast passion for overclocking cooled off?"
  • First post! (Score:5, Funny)

    by Anonymous Coward on Thursday December 22, 2011 @05:43AM (#38457776)

    Why? Because I've overclocked, so I'm faster than y'all!

  • No (Score:5, Insightful)

    by iB1 ( 837987 ) on Thursday December 22, 2011 @05:45AM (#38457786)
    Overclock your smartphone or tablet instead
    • Re:No (Score:5, Insightful)

      by robthebloke ( 1308483 ) on Thursday December 22, 2011 @05:49AM (#38457800)
      which kinda defeats the point of the industry's drive towards more efficient devices with longer battery life. I overclocked my netbook once. Most pointless thing I've ever done. It's now underclocked to eke out a little more battery life.....
      • Re:No (Score:5, Interesting)

        by Anonymous Coward on Thursday December 22, 2011 @05:55AM (#38457836)

        I overclock my Nook Color from 800MHz to 1200MHz. I overclock my phone from 1GHz to 1.4GHz. My phone CPU's voltage doesn't change one bit and my Nook Color's CPU voltage is mildly higher. The CPU is far and away one of the least power-consuming components of these devices -- the NC's screen uses around 1W and the CPU about 35mW. Unless you're overclocking the LCD, the change in battery life is infinitesimal.

      • Re:No (Score:4, Interesting)

        by repvik ( 96666 ) on Thursday December 22, 2011 @06:02AM (#38457878)

        You'll save a lot more battery if you undervolt your CPU. E.g. my Galaxy Nexus by default runs at 1350mV. I can run it perfectly fine on 1200mV, even overclocked to 1.4GHz. By my (possibly completely wrong) logic, the faster the CPU runs, the shorter the time it spends in higher voltage states. Thus, overclocking and keeping the same voltage (or even undervolting) actually saves energy.

        (Of course, underclocking also achieves the same since the voltage is lowered automatically. But then I've got a slower device rather than a faster device, while using more or less the same amount of juice.)

        • Re:No (Score:5, Informative)

          by DeathToBill ( 601486 ) on Thursday December 22, 2011 @06:10AM (#38457912) Journal

          Your logic is wrong.

          Every time a FET switches, it requires a certain number of electrons to move to or from the gate to create an electric field in the substrate to open or close a conducting pathway. This is a current flowing through a resistance and it dissipates power as heat. Assuming that the leakage current on the gate is very small compared to the switching current, the energy required to switch the FET (call it Es) is constant regardless of the clock speed. So the power dissipated by each FET (call it Pf) is:

          Pf = Es x fc

          where fc is the clock frequency in Hertz.

          Why do you suppose that frequency scaling is an effective way of saving power?

          • http://www.lesswatts.org/projects/applications-power-management/race-to-idle.php [lesswatts.org] suggests that it's better to run faster for a short time than to run slowly.

            • Of course it does, because there is another component of power consumption in electronic devices that does not change with frequency.
              But this assumes you are at 100% CPU load the whole time in both cases.
              Not very likely.

              • by dgatwood ( 11270 )

                If it isn't going near 100% for a significant percentage of that time, the CPU should have been scaled back to a lower clock speed and, likely, a lower core voltage, so I would argue that not only is it likely, it should be nearly guaranteed. Am I missing something?

                Yes, I know there are sometimes performance reasons to leave the CPU going at a higher speed for short periods of time just in case it is needed for something else high-power shortly thereafter, which makes the relationship between clock speed a
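
                For a rough numeric illustration of the race-to-idle argument in this sub-thread, here is a minimal C sketch; every power figure in it is invented for illustration, and it assumes a fixed amount of work and a fixed awake window rather than any real device:

                #include <stdio.h>

                int main(void)
                {
                    const double work   = 10.0;   /* abstract units of work to get done */
                    const double window = 10.0;   /* seconds the device stays awake     */

                    /* Fast mode: 2 units/s at 2.0 W, then idle at 0.1 W for the rest. */
                    double busy_fast = work / 2.0;
                    double e_fast = busy_fast * 2.0 + (window - busy_fast) * 0.1;

                    /* Slow mode: 1 unit/s at 1.2 W, busy for the whole window. */
                    double e_slow = (work / 1.0) * 1.2;

                    printf("race-to-idle: %.1f J   run-slow: %.1f J\n", e_fast, e_slow);
                    /* With these made-up numbers, finishing fast and then idling wins
                       (10.5 J vs 12.0 J), but only because idle power is tiny compared
                       with either active figure; under different assumptions, e.g. no
                       chance to idle at all, the answer can go the other way. */
                    return 0;
                }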

          • Some processors benefit from undervolting and some don't, and I have no idea what the difference is, but there must be one, because it works better for some processors than for others. It is said that undervolting most big and powerful processors makes very little difference in power consumption or heat dissipation, and only switching the transistors less (typically through clock reduction, but intel will switch off whole cores now, and IIRC AMD can shut off groups of cores if you have 6 or more of them) ac

          • Re:No (Score:5, Informative)

            by gmarsh ( 839707 ) on Thursday December 22, 2011 @07:40AM (#38458302)

            Actually, voltage matters substantially.

            The gate of a FET is effectively a capacitor. Even with the FET in the on state, if you keep increasing the gate voltage it'll still keep taking electrons. And like a capacitor, energy stored in a FET gate = 1/2*C*V^2. You also have source/drain and gate/drain (Miller) capacitance - source/drain has to be discharged (another 1/2CV^2 loss) and the Miller capacitance has to be discharged and then charged at the opposite polarity (a CV^2 loss).

            Overall, neglecting leakage current, power loss is proportional to frequency, but it's also proportional to voltage squared.

            Power loss is also proportional to transistor count, which is why ARM is such a low power processor.
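
            To make that f and V-squared scaling concrete, here is a minimal C sketch of the standard dynamic-power estimate P ~ a*C*V^2*f; the activity factor, capacitance, and operating points below are invented placeholders, not figures for any real chip:

            #include <stdio.h>

            /* Dynamic switching power: activity factor a, switched capacitance c (F),
               core voltage v (V), clock frequency f (Hz). */
            static double dyn_power(double a, double c, double v, double f)
            {
                return a * c * v * v * f;
            }

            int main(void)
            {
                const double a = 0.2, c = 1e-9;                /* assumed, for illustration */
                double p_stock = dyn_power(a, c, 1.35, 1.0e9); /* 1.0 GHz at 1.35 V */
                double p_oc    = dyn_power(a, c, 1.20, 1.4e9); /* 1.4 GHz at 1.20 V */

                printf("stock:        %.2f W, %.2f nJ/cycle\n", p_stock, 1e9 * p_stock / 1.0e9);
                printf("oc+undervolt: %.2f W, %.2f nJ/cycle\n", p_oc,    1e9 * p_oc / 1.4e9);
                /* Power rises roughly linearly with f but with the square of V, so the
                   undervolted overclock here draws slightly more power yet spends less
                   energy per cycle. */
                return 0;
            }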

        • by Calydor ( 739835 )

          Diminishing returns, basically.

          If, for instance, you allowed it to slowly render that huge page you're looking at, working in the background while you read what was already rendered, you wouldn't waste a lot of power or time while just reading.

          It's like fuel economy in a car - the car has a 'best speed' for the amount of miles it'll go on a gallon. Same thing is true for CPUs.

      • Re:No (Score:5, Funny)

        by StripedCow ( 776465 ) on Thursday December 22, 2011 @06:17AM (#38457932)

        That's why you should have overclocked your battery too.

      • by Nursie ( 632944 )

        N900 - overclocking involves lowering the voltages and increasing the maximum burst speed, the theory being that if you get the work done faster you can go to sleep sooner and save power.

        I'm not sure it really works, but it does make the UI more responsive.

    • Why?!

      What are you doing with your phone where say 10% will make much difference? Mid-range smartphones are already multi-core with hardware accelerated graphics and 512MB RAM or more. They're happily playing GTA3 now. Wait another couple of years and they'll probably be playing GTA IV. Graphics rendering is massively parallel and so easy to improve just by packing in more transistors. Better to just wait for the performance to double a few times rather than try to get tiny performance gains with exponential

      • My first computer cost £100-200
        My second computer cost £100-200
        My third computer cost £100-200
        My fourth computer cost £100-200
        My fifth computer cost £100-200
        My Smartphone cost £100-200

        All ran all the apps I wanted when I bought them, but were too slow for new ones ...

        I spent most of my time running apps that did much the same things on all of them, (web browsing, email, programming)

        Overclocking only extends the useful life of a computer by reducing its lifespan ...

    • I would if I could, but I'm not that good at tablet hacking. Currently it can almost, but not quite, handle playing episodes of Friendship is Magic. I want to be able to watch those while on train journeys, but in high-motion scenes it struggles no matter what player I use - I suspect because the embedded h264 acceleration isn't being used. If I could get just 10% more processing performance, it should be able to manage.
  • by X-Power ( 1009277 ) on Thursday December 22, 2011 @05:46AM (#38457788)

    For me, It's fun and I could care less what some dude did with liquid nitrogen.

    First computer, I just used Asus Overclock and felt I got more for my money.
    Second computer, I started fiddling with manual settings.
    Third computer I pushed it until I couldn't get rid of the heat with air cooling.
    Fourth and current computer, water cooled and running awesome (6 cores at 4.3 GHz).

    Each time I felt the progress, it's like leveling your character, but the character is you, and the game is real life!

    • by jones_supa ( 887896 ) on Thursday December 22, 2011 @06:03AM (#38457884)

      For me, It's fun and I could care less what some dude did with liquid nitrogen.

      About this post, it's hard to determine whether this should be "could care less" or the classic "couldn't care less". :)

      You could be interested in liquid nitrogen because you're an overclocker, or not interested because you don't want to go to such an advanced level, it just being a fun hobby.

    • I always found that overclocking was very anti-climactic. It's noticeable if you have a really awful computer, but if your computer is already running okay then it makes no difference to add a little extra performance. It's like adding more RAM. There are fewer slowdowns, but technically nothing is actually speeding up.

      Also to me it still sounds like you're levelling something external, ie your computer. Levelling your knowledge very, very slightly too, but it's nothing compared to the levelling you'd feel i

  • Maybe, maybe not... (Score:5, Interesting)

    by RogueyWon ( 735973 ) * on Thursday December 22, 2011 @05:50AM (#38457806) Journal

    From a gaming perspective (typically one of the big drivers of overclocking), a few factors that might argue "yes, it's over":

    1) For quite a few years now, PC games haven't been forcing the kind of upgrade cycle that they did over the previous 20 years. When Crysis appeared in 2007, it was a game that gave many people an "upgrade or don't play it" choice. And after that... the industry retreated. Consoles were the primary development platforms at the time and few PC games pushed significantly past the capabilities of the consoles. Not only did we not see any games more demanding than Crysis, but the vast majority of PC games released were substantially less demanding. As a gamer, if you had a PC that could run Crysis well, you did not need an upgrade. This situation lasted 4 years.

    2) Performance has become about more than clock speeds. The main advances in PC gaming technology over the last few years have come from successive versions of DirectX. You can't overclock a machine with a DirectX 9 graphics card so that it can "do" DirectX 10. Same goes for DX10/11.

    3) As the entry barriers to PC gaming get lower, the average knowledge level of users falls. PC gaming is, in general, easier and more convenient than it has been at any time in the past. Pick up an $800 PC, grab Steam and off you go. If you just want to play games and are using an off-the-shelf PC from a big manufacturer, you don't need to worry about switching around graphics drivers, sorting out hardware conflicts or any of the other little niggles that used to make PC gaming such a "joy". You can even find cases where PC gaming is easier than console gaming; the PS3, with its incessant firmware updates and mandatory installs, has taken us a long way from the "insert game and play" roots of console gaming. People who are new to PC gaming just won't be coming from the kind of mindset that even considers overclocking as something you might remotely want to do.

    4) Among "old school" PC gamers, I think there's been a growing recognition that overclocking has its downsides as well. In an economic downturn, when money is tight, you don't necessarily want to go risking a huge reduction in the lifespan of your expensive toys.

    That said, there are a couple of factors that might argue the other way (closely connected to the earlier arguments):

    1) System requirements are finally on the move again. After years in stasis, 2011 has seen the release of a number of games with equivalent or higher requirements than Crysis. Bulletstorm started the trend, but Battlefield 3 and - to an even greater extent - Total War: Shogun 2 have really started to push the envelope on PC hardware. A lot of developers openly admit to being bored with console hardware. Even though they still get most of their sales from the consoles, they are using the PC to push beyond what they can achieve there, both to get their studio noticed and to get themselves ready for developing for the next round of console hardware.

    2) The downturn also means that people feeling a squeeze on their budgets may be looking to get as much bang for their buck in terms of performance as possible. If you think that your new, overclocked PC will last long enough that you will be able to afford a replacement when it does start to give out, then why not take the risk?

    • by ripdajacker ( 1167101 ) on Thursday December 22, 2011 @06:14AM (#38457924) Homepage

      I think you're right. I've overclocked my i5 750 from 2.66 to 3.15, and the speed increase is... well, hard to spot. In benchmarks I certainly see it. It was much easier to do than in the good old days when it was jumper settings.

      I think the gist of it, at least for me, is that there's no fun in it anymore. I have relatively high-end gear, at least at time of purchase, and it all basically guides you to overclocking. It's not as badass as it used to be.

      This may be a bit biased, since I now have a much larger disposable income compared to when I was overclocking.

    • by LoRdTAW ( 99712 )

      2) Performance has become about more than clock speeds. The main advances in PC gaming technology over the last few years have come from successive versions of DirectX. You can't overclock a machine with a DirectX 9 graphics card so that it can "do" DirectX 10. Same goes for DX10/11.

      Not necessarily; video cards are the dominant force in today's gaming rigs. The CPU has taken a back seat to the GPU as both graphics and physics calculations are run on the GPU. If anything, GPU overclocking should be the focus.

  • by 1s44c ( 552956 ) on Thursday December 22, 2011 @05:50AM (#38457812)

    Few people have any real need to sacrifice stability for a little more speed. Overclocking is pretty pointless for anyone with a modern CPU.

    • by petes_PoV ( 912422 ) on Thursday December 22, 2011 @06:58AM (#38458112)

      For a small proportion of the population (but, possibly, a large proportion of slashdot-ers) a PC is not a platform for doing useful work or serving entertainment; it's a source of "fun" in its own right. In past decades the people who like to play with their computers would have been out in the yard, covered in oil, fiddling with a junky old car, or tuning a valve radio. Now they get their satisfaction from squeezing the last few MHz out of their PCs - whether there is any need or use for those few extra cycles is immaterial.

      And for those with more of a software bent than a hardware leaning, there's always OSS - which serves a similar purpose.

  • It used to mean Windows would run faster, games would run faster, everything was FASTER MAN!!!!!111one.

    But now overclocking for the at-home folks is a case of hitting a button in your BIOS, or in some cases a physical button on the motherboard, and it'll do some overclocking for you, automatically. As it's become more automated, the newsworthy stuff becomes more and more expensive to implement and show off, and so most things are less newsworthy and so it appears "overclocking" happens less. In reality I'd expect it happens a lot more, and maybe even when people aren't fully aware of what they are doing.

    Also, systems being so much faster now, they generally provide the speed that users require of them, unless the users are the kind to push systems to overclock simply for the hell of it, like the guys who get in the news. However, you don't see these guys then gaming and getting 200fps on these systems, or anything exciting like that anymore. It's simply overclocked and shown to be "stable" at said speed. No one ever goes "let's see how many FPS we can get outta this baby now!"; it's all become very much a concept thing rather than actually running systems at these speeds for any sensible amount of time.

    • Speaking as someone who was overclocking Cyrix chips and AMD K6s, I'm super-glad that now I can just run a program and have the computer overclock and stress test while I sleep. I bought a 2.8 GHz processor and it goes 3.4 GHz for no additional cost. That's a small bump, but it cost me nothing, so it's very difficult to complain. Every car is different and some are just a little better built than others and could take more tuning, and lo and behold, the car's computer is self-tuning, and tunes itself for ef

  • by Toasterboy ( 228574 ) on Thursday December 22, 2011 @05:53AM (#38457828)

    Look, digital electronics are still subject to analog limitations. When you overclock, you squeeze the hysteresis curve, increasing the probability that your chip incorrectly interprets the state of a particular bit as the opposite value, i.e. you get random data corruption. This is why you eventually start crashing randomly the more you overclock.

    While overclocking a chip that has been conservatively binned simply to reduce manufacturing costs, but is actually stable at higher clock rates, is reasonable, trying to overclock past the design limits is pretty insane if you care at all about data integrity. Also, you tend to burn out the electronics earlier than their expected life due to increased heat stress.

    I never overclock.

    • by nzac ( 1822298 ) on Thursday December 22, 2011 @06:02AM (#38457874)

      You just don't get the overclocker's mentality.
      Either it's all part of the fun, the risk adding to it, or you're getting the most out of what you paid for while still staying within stable limits.
      I don't think many overclockers care about random data corruption unless they blue-screen, or they turn it off when they need stability.

    • by Rockoon ( 1252108 ) on Thursday December 22, 2011 @07:19AM (#38458210)
      While "low end" chips may be conservatively binned to reduce manufacturing costs, thats really not the whole story at all.

      When the highest-end chips can be clocked from 3.8 to 4.5GHz and higher using the stock CPU cooler, doesn't it make you wonder why Intel/AMD do not sell any 4.5GHz versions of these chips? It's because the OEMs fuck up case cooling every single time.

      If Intel sold a 4.5GHz i7, Dell would still put it into a case with horrible venting and only a single fan that has been poorly placed, and then Intel would be footing the bill for loads of warranty replacements. The reason the i7 980Xs cost so much isn't just because Intel was taking advantage of performance enthusiasts' irrationality... it's because the Dells of the world fuck up cooling every single time. The Sandy Bridge i7s perform nearly as well but run a lot cooler, so they can survive the harsh conditions the OEM is going to hamstring them into, and THAT is the main reason why they are so much cheaper than the 980Xs.
      • I'm sure it's more to do with the fact that Intel do not want to advertise a CPU with a TDP of 200W.

          ...yeah... that's it... Intel wouldn't want to advertise and sell faster CPUs... because of a number that consumers don't understand or care about...

          Are you not from earth or some shit?
  • by DeathToBill ( 601486 ) on Thursday December 22, 2011 @05:54AM (#38457830) Journal

    Ten years ago, CPU and RAM speed were really big factors in how fast your PC felt. We've spent the last ten years optimising the hell out of them, while still using 7200RPM spinning disks (if you're lucky). So, surprise surprise, today disk IO is what limits your PC's performance. Why overclock your RAM? It makes (almost) no difference to your IO speed.

    I got a new laptop just over three years ago. It had a 2.4GHz processor. I got my next new laptop a few weeks ago. It has a... 2.5GHz processor. Clock speeds have become almost irrelevant. What makes the new sucker fly is the SSD. Unfortunately, there is no BIOS setting, however risky, to change from disk to SSD.

    • Hmmmm, 5 years ago I was running a single core at 1.4GHz.

      Now I'm running 6 cores at 2.4GHz.

      But, yes, your point is valid ;)

  • by Anonymous Coward on Thursday December 22, 2011 @05:54AM (#38457832)

    That's like saying competitive soccer going broke would stop EVERYONE EVER from playing soccer with their friends.

    Not everyone overclocks to beat a record.
    Hell, "overclock" a toaster if you have to. 2 second cold toast anyone? (the best toast)
    But really, there are still plenty of things you can overclock to beat records, such as what iB1 mentioned up there, overclock a smartphone or tablet.
    Overclock a Beagleboard, or a Raspberry Pi when it comes out, Arduinos. All these compact computers are pretty much sitting around waiting to be hit by the overclocking spirit.

  • No (Score:4, Informative)

    by lga ( 172042 ) on Thursday December 22, 2011 @05:54AM (#38457834) Journal

    No. Next question.

    Seriously though, both Intel and AMD sell multiplier-unlocked CPUs as a feature, and the winners of tests in PC Pro magazine are overclocked by the system builder. You can even buy upgrade bundles pre-overclocked. My latest motherboard came with one-click overclocking software and can adjust the clock speed through a web page while playing a game. Liquid coolers are mainstream. Overclocking is definitely not dead.

  • Huh, no (Score:4, Informative)

    by buserror ( 115301 ) on Thursday December 22, 2011 @05:56AM (#38457844)

    "has the enthusiast passion for overclocking cooled off"

    Not from my 5.0GHz Core i7 2600K anyway -- the tools have become better, the mobos are generally better built and more tolerant to punishment (some have 2 oz copper), the power rails are a LOT more controllable than before, and in general the IC companies that make the power ICs have progressed a lot too in that time, so you can overclock more easily and quickly, get better results and, in general, extract quite a bit more, without nitrogen.

    And I compile distros all day, so to me going from 3.8GHz max to 5.0GHz stable (and quiet!) is awesome; make -j10 FTW!

    • by Viol8 ( 599362 ) on Thursday December 22, 2011 @06:27AM (#38457966) Homepage

      [Sultry babe walks up]
      "Hello, and what do you do?"

      [nasal voice]
      "I compile distros all day. Yes, did you know that Slackware on average compiles 20% faster than Debian for 64 bit but if I overclock my Core i7 by raising power rail voltage and tweeking the quantum flux capacitor.... hello, where are you going..hello? Hey, come back, did I mention its a 2600K? Hello?"

      • by buserror ( 115301 ) on Thursday December 22, 2011 @07:13AM (#38458176)

        Is this really "slashdot.org" where "nerds" used to be around? You know, nerds, who do technically oriented stuff "just because they can"?

        The various comments on this topic - including the one above - make me wonder, really, whether "nerd" has become more of a "I'm such a nerd, babe, look, I installed an app on my smartphone".

        Or /. has been mirrored to "hipster.com" and I'm accessing the wrong portal

        • by Viol8 ( 599362 )

          Oh dear. Someone's had a sense of humour bypass.

          You did rather set yourself up for it saying you compile distros all day. I mean, even for a nerd that's a bit of an odd thing to do. Once a week/month to rebuild a kernel, sure, we've all done that at some point, but building entire distros every day? Why??

          • Ever heard of embedded development? Or maybe you think that distros themselves just appear magically as an ".iso" file brought by Father Xmas? I'm sure you're very proud of having recompiled your kernel at some point, and that seems to have given you enough insight into general software development to make large, broad statements about it all.
            Actually, I /do/ find it funny, but not in the way you probably intended.

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      Why the fuck would anyone "compile distros all day" on their personal computer? If you're doing it for work, use the work machines. If you're doing it for a hobby, dude, get a better fucking hobby.

  • Gains aren't there (Score:5, Insightful)

    by DNS-and-BIND ( 461968 ) on Thursday December 22, 2011 @05:58AM (#38457858) Homepage
    In the days of the 300MHz Celeron, you could overclock it to 450MHz and gain 50% improvement. That extra 150MHz represented several hundred dollars straight to Intel, which you kept in your pocket by overclocking. These days, a few percent? It's just not worth the trouble any more.
  • Hahahaha (Score:3, Interesting)

    by aitikin ( 909209 ) on Thursday December 22, 2011 @06:01AM (#38457870)
    "...has the enthusiast passion for overclocking cooled off?"

    That's like saying, "Do nerds no longer need a proxy for phallic measurement?" As long as there's still testosterone (even if it is minimal in some here) there'll still be people (men mostly) looking to say "We did it first!"
    • by Viol8 ( 599362 )

      I think even most geeks think of overclockers as a little bit obsessive and kind of out there.

  • by dan_barrett ( 259964 ) on Thursday December 22, 2011 @06:03AM (#38457882)

    I think CPU speed is less of an issue these days; e.g. Core2 onwards processors are generally "fast enough" for most users.
    Compare the change in noticeable speed between a 386 and 486, or even Pentium vs Pentium 2 or 3, to today's Core2/Athlon vs Core i5/Phenom.
    Most people don't notice the jump in CPU performance on modern processors.

    The other traditional bottlenecks are rapidly disappearing too; e.g. a midrange DirectX 10 graphics card is good enough to play all but the most demanding games these days, and memory and disk speed and capacity are generally outpacing most people's demands.

    People will still overclock for the challenge of it, but I think there's no tangible day-to-day benefit anymore.

    As someone above mentioned, the real performance battle has moved to portable devices, e.g. how much performance can you get from a tablet or phone, given a fixed battery capacity?

  • ... between a tiny bunch of geeks who had more money than sense. Someone should have told them that if they really wanted to play their first person shooter faster they should overclock the graphics card, not waste time on the CPU.

    Yeah, I'll get modded down for offending the high priest overclockers who read this, but really, if you spend 1000s on a special cooling system for your CPU just so it runs 25% faster so you can get even more unnoticeable frames per second, you really need to get out more.

    • Don't worry, those guys then convince themselves they have better visual perception than normal people so they don't feel stupid that they paid said 1000s. Much like the audiophile who buys a massively expensive sound system. If anyone asks them why the hell, they subtly (or not so much) imply that they have a hearing range far in excess of your standard human's, and of course a better appreciation for music anyways :)
      • by Viol8 ( 599362 )

        Isn't that the truth! It reminds me of people who still claim that LPs sound better than CDs, even though the LP stereo system is a hack that doesn't always reproduce phase properly, and the audio, before it's recorded on an LP, has to go through a compressor first to limit the amplitude because of physical restrictions in the offset of the groove, and also the high-frequency response is limited because the groove simply can't be machined to undulate accurately enough to reproduce them, especially at 33rpm cl

    • by llZENll ( 545605 )

      Not everything is about gaming; I overclocked for faster compile cycles, and it makes a HUGE difference. And for gaming the market has long since figured it out, hence the huge market for overclocking GPUs: you can get custom water blocks, heatsinks, memory coolers, all specifically for video cards. The market for overclocking GPUs dwarfs the market for overclocking CPUs of yesteryear; people have been overclocking GPUs since they first came out from 3dfx.

    • by jamesh ( 87723 )

      if they really wanted to play their first person shooter faster they should overclock the graphics card, not waste time on the CPU.

      I always thought they did that too?

      Yeah, I'll get modded down for offending the high priest overclockers who read this, but really, if you spend 1000s on a special cooling system for your CPU just so it runs 25% faster so you can get even more unnoticeable frames per second, you really need to get out more.

      Maybe it's been a while since you were a kid but there's something enticing about "sticking it to the man"... robbing Intel of those few $$$ by taking a cheap CPU and running it as fast as an expensive CPU. Intel (and AMD probably) know exactly what's going on and how best to make money out of it though :)

      • by Viol8 ( 599362 )

        Sure, you stick it to Intel. But this grand gesture then means you give twice the extra money you'd have spent on an equivalent out-of-the-box CPU to some other faceless corp who provides overclocking kit and who may be just as venal and grasping as Intel/AMD/whoever. Plus you reduce the life of your CPU. *shrug*

      • Yes, in fact, any decent graphics card now has overclocking options right in the settings, so that you don't have to use a third party tool that may do it wrong and confuse the driver. For nVidia, for example, you just bring up the nVidia control panel and select "Performance" -> "Device settings" from the tree and you can fiddle with the clocks and on some cards even the voltages.

    • by ewhenn ( 647989 )

      Not sure what you are on about spending $1000 for 25% more performance. I have a cheap $22 CM212+ cooler, that's a pretty far cry from $1000. The gains are absolutely worth it, I have a 2500K that has a stock speed of 3.3 GHz, it's overclocked to 4.5 GHz, or about a 36% increase.

      $22 for 36% more performance is absolutely worth it, maybe not for gaming now, but it's definitely useful for other tasks.

      Personally, I do a decent amount of encoding video files, and the speed increase is absolutely time saving.

  • by NSN A392-99-964-5927 ( 1559367 ) on Thursday December 22, 2011 @06:24AM (#38457960) Homepage

    When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is probably wrong.

    Therefore, I rest my case.

    Cheers Arthur a true friend who is missed but still there :)

  • PCs are so fast these days that, for our simple minds, there is no longer a need to overclock.

    And GET OFF MY LAWN!

  • by llZENll ( 545605 ) on Thursday December 22, 2011 @06:37AM (#38458018)

    Well, let me dig up my test results spreadsheet from when I first got my 2500K CPU. Times are in seconds to complete my task in Visual Studio 2010; the first set of numbers is the system at stock clock, the second set is overclocked at 5GHz. During my game development, most of my day consists of building the game, loading the game and testing out changes or additions, so the reduction from doing that in 32s to 21s is absolutely huge; even doing code changes that don't require a total rebuild, I am waiting 3s less. It may not sound like a lot, but when you are focused any time saved is very important - you can only be focused for so long.

    task                                               stock (s)   5GHz (s)
    build debug from clean                                  12.9        6.9
    built already, go and load all effects and units         8.2        5.6
    at title screen all loaded, start medium map            19.6       14.3
    modify main.h, build, load to splash screen              3.4        2.1
    modify main.h, load into medium map                     31.9       20.9
    modify main.h, optimal load, no sound, small map        16.9       10.3
    running in game, modify main.h, apply changes           10.0        6.7
    average                                                 14.7        9.5

    system is 2500K, C300 SSD, 16GB memory
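
    For anyone who wants to check the arithmetic, here is a minimal C sketch that recomputes the per-task speedup and the two averages from the numbers quoted above; nothing in it is new data, just the table re-typed:

    #include <stdio.h>

    int main(void)
    {
        /* Times in seconds, copied from the table above: stock vs. 5 GHz. */
        const double stock[] = { 12.9, 8.2, 19.6, 3.4, 31.9, 16.9, 10.0 };
        const double oc[]    = {  6.9, 5.6, 14.3, 2.1, 20.9, 10.3,  6.7 };
        const int n = sizeof stock / sizeof stock[0];
        double sum_stock = 0.0, sum_oc = 0.0;

        for (int i = 0; i < n; i++) {
            sum_stock += stock[i];
            sum_oc    += oc[i];
            printf("task %d: %5.1fs -> %5.1fs  (%.0f%% less time)\n",
                   i + 1, stock[i], oc[i],
                   100.0 * (stock[i] - oc[i]) / stock[i]);
        }
        printf("average: %.1fs -> %.1fs\n", sum_stock / n, sum_oc / n);
        return 0;
    }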

    • by jamesh ( 87723 )

      There's always someone who comes along and spoils an argument with facts and evidence :)

    • Just to add another data point, I was able to push my 2500k to 5.0GHz (liquid cooling) and it ran Prime95/etc. stable ... but did get fairly warm ... and there was an intermittent bug on restore from sleep that may be fixed in the next version of ASUS Bios.

      So I backed off to 4.7GHz ... runs a lot cooler and has been rock-solid stable. So I basically got an extra GHz in performance for free ... and yeah, as the OP says, all those reductions in time add up and make for a more pleasing experience.

      • Restore from sleep, I find that very interesting. I just installed a BIOS update to my Gigabyte GA-MA770T-UD3P 1.0 and it fixed a sleep problem that I never had before overclocking with AMD OverDrive. At least, I think that's what fixed it; I didn't change anything else, but there could have been a Windows update in there someplace.

  • by damn_registrars ( 1103043 ) <damn.registrars@gmail.com> on Thursday December 22, 2011 @06:44AM (#38458048) Homepage Journal
    I tend to underclock more often now, to reduce power consumption on my systems. Of course, I don't play any games on my systems, so I am almost never pushing the capabilities of the hardware.
  • -ster.

    the hardware vendors devote tens of billions of dollars every year to keep up with Moore's law. this has led to the fallacy that CPU power, RAM and storage are infinite, so many of today's coders don't even bother to optimize their code.

    consider initializing a 2D array in a nested loop. if you increment columns in the inner loop, the memory cache will speed you up. but a dumb mistake could increment rows instead. in that case the cache misses actually slow your code down dramatically (see the sketch below).

    it is wrong that
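
    A minimal C sketch of that loop-order point; the array size is an arbitrary illustration, and the only claim is about the access pattern, not about any particular machine's timings:

    #define N 4096

    static int grid[N][N];

    int main(void)
    {
        /* C stores 2D arrays row-major, so this inner loop walks memory
           sequentially and the hardware prefetcher keeps the cache warm. */
        for (int row = 0; row < N; row++)
            for (int col = 0; col < N; col++)
                grid[row][col] = 1;

        /* Same result, but the inner loop now strides a full row
           (N * sizeof(int) bytes) between accesses, so most of them miss
           the cache and this version typically runs several times slower. */
        for (int col = 0; col < N; col++)
            for (int row = 0; row < N; row++)
                grid[row][col] = 2;

        return 0;
    }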

  • 1. People who do overclock and reap the benefits.

    2. People who don't and like to moan about it.
  • with the aid of liquid nitrogen which cooled the RAM down to a frosty -196C [...] has the enthusiast passion for overclocking cooled off?

    I see what you did there.

  • Overclocking is fun. Maybe you'll find a new cooling method too! http://www.bbc.co.uk/news/science-environment-16285036 [bbc.co.uk]
  • I remember how in 1998 it was said that overclocking is dangerous and that it can kill your processor.
    My dad still uses my old overclocked Intel Celeron 300A nowadays. I only had to reduce the amount of RAM (one bank is dead and searching for a new one is not worth it).

    • I remember how in 1998 it was said that overclocking is dangerous and that it can kill your processor.

      Back then, it could. If you had clocked up those 300As just a little more you could have burned them up real good. Modern CPUs have thermal protection and it's relatively difficult to kill them by overclocking.

  • by C_Kode ( 102755 ) on Thursday December 22, 2011 @09:28AM (#38459074) Journal

    I haven't overclocked since I bought a 1.3GHz AMD Duron around 2002. It didn't have much headroom to OC, though. I think I could only get it to around 1.4GHz and change.

    Lately I don't see much use in OCing. Chips are plenty fast enough for almost anything you throw at them today and you can buy faster chips relatively cheap in the next year or so when your current chip is becoming insufficient.

    Both the Intel i5-2500K and AMD Phenom II X6 1100T sit around $200. They aren't the fastest on the market, but at around $200 they are cheap and easily fast enough to handle anything you throw at them.

  • by eepok ( 545733 ) on Thursday December 22, 2011 @12:08PM (#38460816) Homepage

    I overclocked my first computers (2000-2004). I bought a budget system in parts, put it together, got online, and learned that I could make my computer even faster with a little risk and careful effort.

    But then the prices of components began to fall and I stopped overclocking new rigs in 2004. Why? Because a normal $30 heatsink was barely enough to keep some of the hotter processors cool without overclocking... and I was not willing to risk losing my processor for a few more FPS in Counter-Strike or whatever I was playing that month.

    Fast-forward to now, I still leave my main computer on 24/7, but as a career-person, I need to save more (house, retirement, vacations to placate the lady) and spend less on utilities. I also have less time to clean the dust out of computer cases that effectively had hoovers for cooling. So where I used to go for a balance of cost, heat, and overclockability, I now look at cost, heat, and power consumption. I now take pride in being able to comfortably play modern games (though not at max settings) on a rig supported by a 260 watt power supply. I have no guilt leaving that on overnight.

    Note: I never got into water-cooling. I never had the space or disposable income to mess around with the kit or the risk.

    • Hear, hear. Low power is the new stupid (half kidding) thing to obsess over. My most-used home computer with a UI is an Atom. In 2010, when people were drooling over how great Sandy Bridge might be, and how much kickass-per-$ the X6 Phenoms offered, I was looking for an Athlon II 240e for my server to downgrade to (eventually finding, to my joy, a 610e for sale, so that I could finally pay $130(?) for the downgrade), just so I could say I had a 45W-TDP-but-still-reasonably-powerful-for-transcoding CPU. No

  • by Animats ( 122034 ) on Thursday December 22, 2011 @12:16PM (#38460906) Homepage

    Many industrial PCs are underclocked. They have more CPU power than they need, and they need more reliability and temperature range than the consumer manufacturers provide.

    The end of overclocking is coming anyway, because speed-of-light lag across the chip, rather than transistor switch time, is becoming the bottleneck. No amount of cooling will help with speed-of-light lag.
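
    For a sense of scale, here is a minimal C sketch of how far a signal could travel per clock period even at the vacuum speed of light, which is a generous upper bound since real on-chip wiring is considerably slower:

    #include <stdio.h>

    int main(void)
    {
        const double c_mm_per_s = 3.0e11;                   /* speed of light, mm/s */
        const double clocks_hz[] = { 3.0e9, 5.0e9, 8.0e9 };

        for (int i = 0; i < 3; i++) {
            double period_s = 1.0 / clocks_hz[i];
            printf("%.0f GHz: period %5.1f ps, light covers at most %5.1f mm\n",
                   clocks_hz[i] / 1e9, period_s * 1e12, c_mm_per_s * period_s);
        }
        /* At 5 GHz the bound is about 60 mm per cycle, only a few die-widths,
           and the real RC-limited wire delay is far worse, which is why clock
           distribution rather than raw transistor speed becomes the limit. */
        return 0;
    }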

  • by BLKMGK ( 34057 ) <morejunk4me.hotmail@com> on Thursday December 22, 2011 @01:20PM (#38461688) Homepage Journal

    I still overclock, and nearly every PC I've ever owned has been overclocked, including an 8088 clocked up with a radio crystal back in the day (not a great idea). I was playing with water cooling and Peltiers before you could buy ANY hardware for that off the shelf too. Cut-down heatsinks, PVC caps, fountain pumps, and overseas-sourced Peltiers made for some really quick computers for their time! Games were fast, looked great, and I ran RC5 cracking programs to use up idle cycles for years.

    Fast forward to the present. I still game but I am not quite into the really crazy high-end stuff. I still use a PC for gaming almost exclusively. I no longer run programs in the background to eat up spare cycles, and the cooling of my room thanks me for it. I AM running a water-cooled CPU though, using mostly off-the-shelf stuff that doesn't leak; my CPU is rock stable and not quite pushed to the edge. I upgraded my computer in the not so distant past for more speed and I'm pondering doing it again, to the later Sandy Bridge architecture from my older i7 920 (4.1GHz). I'm also looking at the new 6-core CPUs that have come out, but they strip H.264 instructions apparently. :-(

    Why? Well, it certainly isn't gaming, since right now games seem woefully poor at using multiple cores! Now I have another "hobby" and that is compressing video. I buy BluRay, rip them, and put them on my personal server for viewing on efficient Atom-powered STBs (overclocked though lol). When I was doing this with a C2D running in the mid 3-4GHz range, some movies like Watchmen took 8 hours or more to encode with my high settings. Now I can do a movie in 2 hours or less while still having CPU available to do other things. If I move to the more efficient CPUs produced now, and especially if x.264 supports their encoding instructions one day, my times will drop again as I should be able to hit close to 5GHz. At that point I'll either encode with higher settings or just enjoy that it's as fast as it's going to get. I boot from an SSD so that's quick enough. My video card is a fairly pedestrian GTX275 which might get a bump too, I'm not sure.

    I have tinkered with using the GPU to encode as well. Right now my CPU alone can keep up with encoding on my GPU alone but mixing my GPU and CPU together (I found ONE package doing that and it wasn't x.264) was noticeably faster but severely limited my encoding options so I've stuck to CPU brute force. I'm hoping that with CUDA being open sourced more programs will begin using the GPU too.

    I've processed A LOT of video and I do video for friends too sometimes. Being able to run tons of apps, lots of browser windows, and generally not care too much about what is and isn't running is a side benefit. I may try BF3 out but doubt it'll be so much better than UT2K4 that I'll be sold as it will likely be exponentially more difficult to play. I'd love to find more things to do with the CPU power I have and I do try to use power wisely. My server(s) are actually underclocked and sleep their drives when not being accessed, my video front-ends draw less than 15watts apiece, the PSU in this box is Silver rated and under 650watts. Rendering or compiling code would be fun but I am neither developer nor artist. Those of us who are could certainly find value in overclocking! I know a certain Apple guy who was pretty butt hurt his 8 core powerhouse costing quite a bit more than my computer couldn't encode video as quickly when he challenged me :-)

    P.S. Yeah, I tinker with cars too, it's fun. I also laugh at those who talk about "shortened CPU lives" - get a clue. I have had exactly ONE CPU die and that was within the first 24 hours - warranty replaced. I've had overclocked CPUs go for 5 years or more, being passed down with no issue. This 920 has seen temps as high as 90C under full load for hours at a time when I had voltages too high, and it's still ticking fine. My current peak is recorded as 75C. If you REALLY want to drop some heat, water-cool the video card; sadly these water blocks tend to be pretty custom and I don't do it since an upgrade on the video card means a costly new block.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...