Is Overclocking Over?
MrSeb writes "Earlier this week, an ExtremeTech writer received a press release from a Romanian overclocking team that smashed a few overclocking records, including pushing Kingston's HyperX DDR3 memory to an incredible 3600MHz (at CL10). The Lab501 team did this, and set their other records, with the aid of liquid nitrogen, which cooled the RAM down to a frosty -196°C. That certainly qualifies as extreme, but is it news? Ten years ago, overclocking memory involved a certain amount of investigation, research, and risk, but in these days of super-fast RAM and manufacturers' warranties it seems a less intoxicating prospect. As it becomes increasingly difficult to justify why a person should overclock at all, has the enthusiast passion for overclocking cooled off?"
First post! (Score:5, Funny)
Why? Because I've overclocked, so I'm faster than y'all!
No (Score:5, Insightful)
Re:No (Score:5, Insightful)
Re:No (Score:5, Interesting)
I overclock my Nook Color from 800 MHz to 1200 MHz. I overcock my phone from 1 GHz to 1.4 GHz. My phone CPU's voltage doesn't change one bit and my Nook Color's CPU voltage is mildly higher. The CPU is far and away one of the least power-consuming components of these devices -- the NC's screen uses around 1W and the CPU about 35mW. Unless you're overclocking the LCD, the change in battery life is infinitesimal.
Re:No (Score:5, Funny)
I overcock my
And that little slip right there says everything about the reasons for overclocking.
Re: (Score:3, Interesting)
Noticed the typo just a litttttttle too late, heh.
Actually the Nook is one of the best devices to overclock that I've ever come across -- it's really quite slow running CM7 or CM9 and the extra CPU speed helps immensely. CM9 is unusable without the extra boost, and the overclock takes CM7 from somewhat laggy to silky smooth in most operations.
On my Epic, the overclock isn't super useful now that I'm running CM7 on it too. The normal stock Samsung build of Android has slight amounts of lag without the mild o
Re:No (Score:5, Funny)
Re:No (Score:4, Funny)
Re:No (Score:5, Funny)
My ex-wife overcocked and ended up divorced because of it.
Re:No (Score:4, Funny)
Is your name Lorena?
Re:No (Score:4, Informative)
Re:No (Score:4, Interesting)
You'll save a lot more battery if you undervolt your CPU. E.g., my Galaxy Nexus by default runs at 1350mV. I can run it perfectly fine on 1200mV, even overclocked to 1.4GHz. By my (possibly completely wrong) logic, the faster the CPU runs, the less time it spends in higher voltage states. Thus, overclocking while keeping the same voltage (or even undervolting) actually saves energy.
(Of course, underclocking also achieves the same since the voltage is lowered automatically. But then I've got a slower device rather than a faster device, while using more or less the same amount of juice.)
Re:No (Score:5, Informative)
Your logic is wrong.
Every time a FET switches, it requires a certain number of electrons to move to or from the gate to create an electric field in the substrate to open or close a conducting pathway. This is a current flowing through a resistance and it dissipates power as heat. Assuming that the leakage current on the gate is very small compared to the switching current, the energy required to switch the FET (call it Es) is constant regardless of the clock speed. So the power dissipated by each FET (call it Pf) is:
Pf = Es x fc
where fc is the clock frequency in Hertz.
Why do you suppose that frequency scaling is an effective way of saving power?
Re: (Score:3)
http://www.lesswatts.org/projects/applications-power-management/race-to-idle.php [lesswatts.org] suggests that it's better to run faster for a short time than to run slowly.
Re: (Score:2)
Of course it does, because there is another component of power consumption in electronic devices that does not change with frequency.
But this assumes you are at 100% CPU load the whole time in both cases.
Not very likely.
Re: (Score:3)
If it isn't going near 100% for a significant percentage of that time, the CPU should have been scaled back to a lower clock speed and, likely, a lower core voltage, so I would argue that not only is it likely, it should be nearly guaranteed. Am I missing something?
Yes, I know there are sometimes performance reasons to leave the CPU going at a higher speed for short periods of time just in case it is needed for something else high-power shortly thereafter, which makes the relationship between clock speed a
Re: (Score:2)
Some processors benefit from undervolting and some don't, and I have no idea what the difference is, but there must be one, because it works better for some processors than for others. It is said that undervolting most big and powerful processors makes very little difference in power consumption or heat dissipation, and only switching the transistors less (typically through clock reduction, but Intel will switch off whole cores now, and IIRC AMD can shut off groups of cores if you have 6 or more of them) ac
Re:No (Score:5, Informative)
Actually, voltage matters substantially.
The gate of a FET is effectively a capacitor. Even with the FET in the on state, if you keep increasing the gate voltage it'll still keep taking electrons. And like a capacitor, energy stored in a FET gate = 1/2*C*V^2. You also have source/drain and gate/drain (Miller) capacitance - source/drain has to be discharged (another 1/2CV^2 loss) and the Miller capacitance has to be discharged and then charged at the opposite polarity (a CV^2 loss).
Overall, neglecting leakage current, power loss is proportional to frequency, but it's also proportional to voltage squared.
Power loss is also proportional to transistor count, which is why ARM is such a low power processor.
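To put rough numbers on the voltage-squared point (a back-of-the-envelope sketch with invented capacitance and activity values, not data for any real chip), dynamic power P ≈ a·C·V²·f behaves like this:

```c
#include <stdio.h>

/* Crude CMOS dynamic power model: P = a * C * V^2 * f
 * (activity factor a, switched capacitance C, supply V, clock f).
 * All numbers below are invented purely for illustration. */
static double dynamic_power(double a, double c_farads, double volts, double hz)
{
    return a * c_farads * volts * volts * hz;
}

int main(void)
{
    const double a = 0.2;    /* assumed activity factor         */
    const double c = 1e-9;   /* assumed switched capacitance, F */

    double stock     = dynamic_power(a, c, 1.35, 1.2e9); /* 1.35 V @ 1.2 GHz */
    double oc_same_v = dynamic_power(a, c, 1.35, 1.4e9); /* 1.35 V @ 1.4 GHz */
    double oc_uv     = dynamic_power(a, c, 1.20, 1.4e9); /* 1.20 V @ 1.4 GHz */

    printf("stock:                  %.2f W\n", stock);     /* 0.44 W */
    printf("overclock, same V:      %.2f W\n", oc_same_v); /* 0.51 W, linear in f */
    printf("overclock + undervolt:  %.2f W\n", oc_uv);     /* 0.40 W, V^2 wins it back */
    return 0;
}
```

With these made-up values the overclocked-and-undervolted case draws less than stock, which is the effect the Galaxy Nexus poster above describes; leakage (static) power is ignored here.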
Re: (Score:2)
Diminishing returns, basically.
If, for instance, you allowed it to slowly render that huge page you're looking at, working in the background while you read what was already rendered, you wouldn't have a lot of wasted power/time while just reading.
It's like fuel economy in a car - the car has a 'best speed' for the amount of miles it'll go on a gallon. Same thing is true for CPUs.
Re:No (Score:5, Funny)
That's why you should have overclocked your battery too.
Re: (Score:3)
N900 - overclocking involves lowering the voltages and increasing the maximum burst speed, theory being that if you get the work done faster you can go to sleep sooner and save power.
I'm not sure it really works, but it does make the UI more responsive.
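The race-to-idle argument is easier to see with some entirely made-up platform numbers; the point is that the screen, RAM and radios keep drawing power for as long as the job is still running, so finishing sooner and putting the whole device to sleep can win:

```c
#include <stdio.h>

/* Race-to-idle sketch with invented numbers. A fixed job is done either
 * slowly for the whole 10 s window, or at double speed followed by deep sleep.
 * Energy = P_awake * t_busy + P_sleep * t_sleep. */
int main(void)
{
    const double window   = 10.0; /* seconds                                        */
    const double platform = 0.8;  /* assumed screen/RAM/radio power while awake, W  */
    const double p_sleep  = 0.05; /* assumed whole-device deep-sleep power, W       */

    /* slow CPU: 0.5 W, busy for the entire window */
    double e_slow = (0.5 + platform) * window;

    /* fast CPU: 1.0 W, finishes the same job in half the time, then sleeps */
    double t_busy = window / 2.0;
    double e_fast = (1.0 + platform) * t_busy + p_sleep * (window - t_busy);

    printf("run slowly the whole time: %.2f J\n", e_slow); /* 13.00 J */
    printf("race to idle:              %.2f J\n", e_fast); /*  9.25 J */
    return 0;
}
```

Whether it pans out in practice depends on how low the sleep state really is and how quickly the device can get back into it, which is presumably why the parent is "not sure it really works".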
Re:No (Score:4, Informative)
In fact I've read a lot more bullshit here, but to address one more thing in specific: most modern devices are made using CMOS technology, or at the very least using FETs. A common misconception I see here is people assuming that CMOS devices use power while in a stable state. The fact is, they don't if they're well designed; the power being used is to charge the gates of the FETs. Once they're charged, the only power use is to compensate for leakage currents from the FET gates to other parts of the substrate. If you don't believe me, build a small circuit using FETs (think something like a bunch of flip-flops) and switch it a bit at a fairly high frequency. Stop the clock. Put a capacitor over your power supply. Then disconnect the circuit so it's powered from the capacitor. It'll most likely keep its state for weeks.
Re:No (Score:5, Informative)
I'm sorry, but it sounds like you barely passed EE 101 a week ago. The information you have is very inaccurate and quite outdated.
In modern CMOS geometries, a large amount of power is wasted on leakage. That means that while the dynamic power scales linearly with frequency (at a constant voltage), the static power (leakage) does not.
However, if you *can* overclock significantly at a constant voltage, there probably is power headroom the manufacturer did not use properly, or expected the devices to be unreliable with reduced voltage at the original frequency. Dynamic voltage scaling is not new.
Re: (Score:3)
Re:No (Score:4, Informative)
Re: (Score:3)
Re: (Score:2)
Why?!
What are you doing with your phone where, say, 10% will make much difference? Mid-range smartphones are already multi-core with hardware-accelerated graphics and 512MB RAM or more. They're happily playing GTA3 now. Wait another couple of years and they'll probably be playing GTA IV. Graphics rendering is massively parallel and so easy to improve just by packing in more transistors. Better to just wait for the performance to double a few times rather than try to get tiny performance gains with exponential
Re: (Score:3)
My first computer cost £100-200
My second computer cost £100-200
My third computer cost £100-200
My fourth computer cost £100-200
My fifth computer cost £100-200
My Smartphone cost £100-200
All ran all the apps I wanted when I bought them, but were too slow for new ones ...
I spent most of my time running apps that did much the same things on all of them, (web browsing, email, programming)
Overclocking only extends the useful life of a computer by reducing its lifespan ...
Re: (Score:2)
When we get quad-core, 2+ GHz phones, it should be easy enough to do a scaled-back version of GTA IV. A few fewer pedestrians spawned slightly less far away, fewer bits and pieces flying off of things, it can probably be done.
Re: (Score:2)
It's not dead, it's fun! (Score:5, Informative)
For me, it's fun and I could care less what some dude did with liquid nitrogen.
First computer, I just used Asus Overclock and felt I got more for my money.
Second computer, I started fiddling with manual settings.
Third computer I pushed it until I couldn't get rid of the heat with air cooling.
Fourth and current computer, water cooled and running awesome (6 cores at 4.3 GHz).
Each time I felt the progress, it's like leveling your character, but the character is you, and the game is real life!
Re:It's not dead, it's fun! (Score:4, Interesting)
For me, it's fun and I could care less what some dude did with liquid nitrogen.
About this post: it's hard to determine whether this should be "could care less" or the classic "couldn't care less". :)
You could be interested in liquid nitrogen because you're an overclocker, or not interested because you don't want to go to such an advanced level, it just being a fun hobby for you.
Re:It's not dead, it's fun! (Score:5, Insightful)
Morons with no actual understanding of the language say "could care less." It's just that there's a lot of them.
Re:It's not dead, it's fun! (Score:5, Informative)
No, "I couldn't care less" means "it's not possible to care even less than I already do, even if I wanted to". It means I care the least possible amount. I have reached the bottom.
"I could care less" (but I don't, meaning that I do care a certain amount) means that there is still a margin between the amount I care and the least possible amount of care. I haven't reached the bottom. Nothing is said about the size of that margin, so this statement really doesn't say anything.
Re:It's not dead, it's fun! (Score:5, Funny)
Can we quantify "care" with units? I'm still having trouble grasping this.
Re: (Score:3)
I always found that overclocking was very anti-climactic. It's noticeable if you have a really awful computer, but if your computer is already running okay then it makes no difference to add a little extra performance. It's like adding more RAM. There are fewer slowdowns, but technically nothing is actually speeding up.
Also, to me it still sounds like you're levelling something external, i.e. your computer. Levelling your knowledge very, very slightly too, but it's nothing compared to the levelling you'd feel i
Re:It's not dead, it's fun! (Score:5, Funny)
All I do these days if I'm feeling daring is activate the 'high performance' power profile in Windows.
Re:It's not dead, it's fun! (Score:5, Informative)
Maybe, maybe not... (Score:5, Interesting)
From a gaming perspective (typically one of the big drivers of overclocking), a few factors that might argue "yes, it's over":
1) For quite a few years now, PC games haven't been forcing the kind of upgrade cycle that they did over the previous 20 years. When Crysis appeared in 2007, it was a game that gave many people an "upgrade or don't play it" choice. And after that... the industry retreated. Consoles were the primary development platforms at the time and few PC games pushed significantly past the capabilities of the consoles. Not only did we not see any games more demanding than Crysis, but the vast majority of PC games released were substantially less demanding. As a gamer, if you had a PC that could run Crysis well, you did not need an upgrade. This situation lasted 4 years.
2) Performance has become about more than clock speeds. The main advances in PC gaming technology over the last few years have come from successive versions of DirectX. You can't overclock a machine with a DirectX 9 graphics card so that it can "do" DirectX 10. Same goes for DX10/11.
3) As the entry barriers to PC gaming get lower, the average knowledge level of users falls. PC gaming is, in general, easier and more convenient than it has been at any time in the past. Pick up an $800 PC, grab Steam and off you go. If you just want to play games and are using an off-the-shelf PC from a big manufacturer, you don't need to worry about switching around graphics drivers, sorting out hardware conflicts or any of the other little niggles that used to make PC gaming such a "joy". You can even find cases where PC gaming is easier than console gaming; the PS3, with its incessant firmware updates and mandatory installs, has taken us a long way from the "insert game and play" roots of console gaming. People who are new to PC gaming just won't be coming from the kind of mindset that even considers overclocking as something you might remotely want to do.
4) Among "old school" PC gamers, I think there's been a growing recognition that overclocking has its downsides as well. In an economic downturn, when money is tight, you don't necessarily want to go risking a huge reduction in the lifespan of your expensive toys.
That said, there are a couple of factors that might argue the other way (closely connected to the earlier arguments):
1) System requirements are finally on the move again. After years in stasis, 2011 has seen the release of a number of games with equivalent or higher requirements than Crysis. Bulletstorm started the trend, but Battlefield 3 and - to an even greater extent - Total War: Shogun 2 have really started to push the envelope on PC hardware. A lot of developers openly admit to being bored with console hardware. Even though they still get most of their sales from the consoles, they are using the PC to push beyond what they can achieve there, both to get their studio noticed and to get themselves ready for developing for the next round of console hardware.
2) The downturn also means that people feeling a squeeze on their budgets may be looking to get as much bang for their buck in terms of performance as possible. If you think that your new, overclocked PC will last long enough that you will be able to afford a replacement when it does start to give out, then why not take the risk?
Re:Maybe, maybe not... (Score:5, Insightful)
I think you're right. I've overclocked my i5 750 from 2.66 to 3.15GHz, and the speed increase is... well, hard to spot. In benchmarks I certainly see it. It was much easier to do than in the good old days of jumper settings.
I think the gist of it, at least for me, is that there's no fun in it anymore. I have relatively high-end gear, at least at time of purchase, and it all basically guides you towards overclocking. It's not as bad ass as it used to be.
This may be a bit biased, since I now have a much larger sum of disposable income compared to when I was overclocking.
Re: (Score:3)
2) Performance has become about more than clock speeds. The main advances in PC gaming technology over the last few years have come from successive versions of DirectX. You can't overclock a machine with a DirectX 9 graphics card so that it can "do" DirectX 10. Same goes for DX10/11.
Not necessarily; video cards are the dominant force in today's gaming rigs. The CPU has taken a back seat to the GPU, as both graphics and physics calculations are run on the GPU. If anything, GPU overclocking should be the focus.
Pointless in most cases (Score:4, Insightful)
Few people have any real need to sacrifice stability for a little more speed. Overclocking is pretty pointless for anyone with a modern CPU.
It's not about using it. (Score:5, Insightful)
For a small proportion of the population (but, possibly, a large proportion of slashdot-ers) a PC is not a platform for doing useful work or serving entertainment; it's a source of "fun" in its own right. In past decades the people who like to play with their computers would have been out in the yard, covered in oil, fiddling with a junky old car, or tuning a valve radio. Now they get their satisfaction from squeezing the last few MHz out of their PCs - whether there is any need or use for those few extra cycles is immaterial.
And for those with more of a software bent than a hardware leaning, there's always OSS - which serves a similar purpose.
Re: (Score:3)
'If you know what you're doing' excludes most computer users.
It's not for most people, who just want to run office tools, read web pages, and maybe play games. They can't handle the possible instability and won't notice the gains in most cases. You always sacrifice stability when overclocking; you're just hoping the difference is too small to be noticeable.
Re: (Score:3)
For some families of chips, they are literally the same chip, fused to disable cores or drop the clock speed. This means that the chips are designed to do more, but are intentionally crippled. It's cheaper for them to have a single manufacturing line than one for each chip in the family.
No, it simply doesn't provide the extras it used t (Score:5, Interesting)
It used to mean windows would run faster, games would run faster, everything was FASTER MAN!!!!!111one.
But now overclocking for the at-home folks is a case of hitting a button in your BIOS, or in some cases a physical button on the motherboard, and it'll do some overclocking for you, automatically. As it's become more automated, the newsworthy stuff becomes more and more expensive to implement and show off, so most things are less newsworthy and it appears "overclocking" happens less. In reality I'd expect it happens a lot more, maybe even when people aren't fully aware of what they are doing.
Also, systems being so much faster now generally provide the speed that users require of them, unless they are the kind of users who push systems to overclock simply for the hell of it, like the guys who get in the news. However, you don't see these guys then gaming and getting 200fps on these systems, or anything exciting like that anymore. It's simply overclocked and shown to be "stable" at said speed. No one ever goes "let's see how many FPS we can get outta this baby now!"; it's all become very much a concept thing rather than actually running systems at these speeds for any sensible amount of time.
Re: (Score:2)
Speaking as someone who was overclocking Cyrix chips and AMD K6s, I'm super-glad that now I can just run a program and have the computer overclock and stress test while I sleep. I bought a 2.8 GHz processor and it goes 3.4 GHz for no additional cost. That's a small bump, but it cost me nothing, so it's very difficult to complain. Every car is different and some are just a little better built than others and could take more tuning, and lo and behold, the car's computer is self-tuning, and tunes itself for ef
Re: (Score:2)
Yep, that's one reason.
Another is that it's simply too expensive to do the kind of overclock you'd have done 5 years ago, for the same increase in %. I guess chip makers are getting better at running the chips at their limit already, rather than shipping them lower than they "could" go, tho I don't have any stats to back this up.
Most people don't understand that it's a bad idea. (Score:5, Informative)
Look, digital electronics are still subject to analog limitations. When you overclock, you squeeze the hysteresis curve, increasing the probability that your chip incorrectly interprets the state of a particular bit as the opposite value, i.e. you get random data corruption. This is why you eventually start crashing randomly the more you overclock.
While overclocking a chip that has been conservatively binned simply to reduce manufacturing costs but is actually stable at higher clock rates is reasonable, trying to overclock past the design limits is pretty insane if you care at all about the data integrity. Also, you tend to burn out the electronics earlier than their expected life due to increased heat stress.
I never overclock.
Re:Most people don't understand that it's a bad id (Score:4, Insightful)
You just don't get the overclocker's mentality.
Either it's all part of the fun, the risk adding to it, or you're getting the most out of what you paid for while still staying within stable limits.
I don't think many overclockers care about random data corruption unless they blue screen, or they turn it off when they need stability.
Re:Most people don't understand that it's a bad id (Score:4)
When the highest-end chips can be clocked from 3.8 to 4.5GHz and higher using the stock CPU cooler, doesn't it make you wonder why Intel/AMD don't sell any 4.5GHz versions of these chips? It's because the OEMs fuck up case cooling every single time.
If Intel sold a 4.5GHz i7, Dell would still put it into a case with horrible venting and only a single, poorly placed fan, and then Intel would be footing the bill for loads of warranty replacements. The reason the i7 980Xs cost so much isn't just because Intel was taking advantage of performance enthusiasts' irrationality... it's because the Dells of the world fuck up cooling every single time. The Sandy Bridge i7s perform nearly as well but run a lot cooler, so they can survive the harsh conditions the OEM is going to hamstring them into, and THAT is the main reason why they are so much cheaper than the 980Xs.
Re: (Score:2)
I'm sure it's more to do with the fact that Intel do not want to advertise a CPU with a TDP of 200W.
Re: (Score:2)
Are you not from earth or some shit?
The law of diminishing returns applies (Score:5, Insightful)
Ten years ago, CPU and RAM speed were really big factors in how fast your PC felt. We've spent the last ten years optimising the hell out of them, while still using 7200RPM spinning disks (if you're lucky). So, surprise surprise, today disk IO is what limits your PC's performance. Why overclock your RAM? It makes (almost) no difference to your IO speed.
I got a new laptop just over three years ago. It had a 2.4GHz processor. I got my next new laptop a few weeks ago. It has a... 2.5GHz processor. Clock speeds have become almost irrelevant. What makes the new sucker fly is the SSD. Unfortunately, there is no BIOS setting, however risky, to change from disk to SSD.
Re: (Score:2)
Hmmmm, 5 years ago I was running a single core at 1.4GHz.
Now I'm running 6 cores at 2.4GHz.
But, yes, your point is valid ;)
Competitive Overclocking != all Overclocking (Score:5, Informative)
That's like saying competitive soccer going broke would stop EVERYONE EVER from playing soccer with their friends.
Not everyone overclocks to beat a record.
Hell, "overclock" a toaster if you have to. 2 second cold toast anyone? (the best toast)
But really, there are still plenty of things you can overclock to beat records, such as what iB1 mentioned up there: overclock a smartphone or tablet.
Overclock a BeagleBoard, a Raspberry Pi when it comes out, or an Arduino. All these compact computers are pretty much sitting around waiting to be hit by the overclocking spirit.
Re: (Score:3)
OMG, you've just reminded me of the "hell in a kettle" overclocked kettle video on YouTube.
http://www.youtube.com/watch?v=rGL67coOdOk [youtube.com] for those wondering.
No (Score:4, Informative)
No. Next question.
Seriously though, both Intel and AMD sell multiplier-unlocked CPUs as a feature, and the winners of tests in PC Pro magazine are overclocked by the system builder. You can even buy upgrade bundles pre-overclocked. My latest motherboard came with one-click overclocking software and can adjust the clock speed through a web page while playing a game. Liquid coolers are mainstream. Overclocking is definitely not dead.
Huh, no (Score:4, Informative)
"has the enthusiast passion for overclocking cooled off"
Not from my 5.0GHz Core i7 2600K, anyway -- the tools have become better, the mobos are generally better built and more tolerant of punishment (some have 2 oz copper), the power rails are a LOT more controllable than before, and in general the IC companies that make the power ICs have progressed a lot too in that time, so you can overclock easier and quicker, get better results, and in general extract quite a bit more, without nitrogen.
And I compile distros all day; to me, going from a 3.8GHz max to 5.0GHz stable (and quiet!) is awesome; make -j10 FTW!
I bet you're the life and soul of a party (Score:5, Funny)
[Sultry babe walks up]
"Hello, and what do you do?"
[nasal voice]
"I compile distros all day. Yes, did you know that Slackware on average compiles 20% faster than Debian for 64 bit but if I overclock my Core i7 by raising power rail voltage and tweeking the quantum flux capacitor.... hello, where are you going..hello? Hey, come back, did I mention its a 2600K? Hello?"
Re:I bet you're the life and soul of a party (Score:4, Insightful)
Is this really "slashdot.org", where "nerds" used to be around? You know, nerds, who do technically oriented stuff "just because they can"?
The various comments on this topic - including the one above - make me wonder, really: has "nerd" become more of a "I'm such a nerd, babe, look, I installed an app on my smartphone" thing?
Or /. has been mirrored to "hipster.com" and I'm accessing the wrong portal
Re: (Score:2)
Oh dear. Someone's had a sense of humour bypass.
You did rather set yourself up for it by saying you compile distros all day. I mean, even for a nerd that's a bit of an odd thing to do. Once a week/month to rebuild a kernel, sure, we've all done that at some point, but every day, building entire distros? Why??
Re: (Score:3)
Ever heard of embedded development? Or maybe you think that distros themselves just appear magically as an ".iso" file brought by Father Xmas? I'm sure you're very proud of having recompiled your kernel at some point, and that seems to have given you enough insight into general software development to make large, broad statements about it all.
Actually, I /do/ find it funny, but not in the way you probably intended.
Re: (Score:3, Interesting)
Why the fuck would anyone "compile distros all day" on their personal computer? If you're doing it for work, use the work machines. If you're doing it for a hobby, dude, get a better fucking hobby.
Re: (Score:3)
Which games do you know of that are CPU bound at 100FPS? There aren't very many... if you're getting crappy gaming performance, you'd be better off overclocking your GPU.
Gains aren't there (Score:5, Insightful)
Re: (Score:2)
Why was this rated -1?
Re: (Score:3)
Trolling is a bitch.
Hahahaha (Score:3, Interesting)
That's like saying, "Do nerds no longer need a proxy for phallic measurement?" As long as there's still testosterone (even if it is minimal in some here) there'll still be people (men mostly) looking to say "We did it first!"
Re: (Score:3)
I think even most geeks think of overclockers as a little bit obsessive and kind of out there.
Can't notice the difference anymore (Score:5, Insightful)
I think CPU speed is less of an issue these days; e.g. Core 2 onwards, processors are generally "fast enough" for most users.
Compare the change in noticeable speed between a 386 and 486, or even Pentium vs Pentium 2 or 3, to today's Core2/Athlon vs Core i5/Phenom.
Most people don't notice the jump in CPU performance on modern processors.
The other traditional bottlenecks are rapidly disappearing too; e.g. a midrange DirectX 10 graphics card is good enough to play all but the most demanding games these days, and memory and disk speed and capacity are generally outpacing most people's demand.
People will still overclock for the challenge of it, but I think there's no tangible day-to-day benefit anymore.
As someone above mentioned, the real performance battle has moved to portable devices, e.g. how much performance can you get from a tablet or phone, given a fixed battery capacity?
Overclocking was only ever a dick waving contest... (Score:2, Funny)
... between a tiny bunch of geeks who had more money than sense. Someone should have told them that if they really wanted to play their first person shooter faster they should overclock the graphics card, not waste time on the CPU.
Yeah, I'll get modded down for offending the high priest overclockers who read this, but really, if you spend 1000s on a special cooling system for your CPU just so it runs 25% faster so you can get even more unnoticeable frames per second, you really need to get out more.
Re: (Score:2)
Re: (Score:3)
Isn't that the truth! It reminds me of people who still claim that LPs sound better than CDs, even though the LP stereo system is a hack that doesn't always reproduce phase properly, and the audio, before it's recorded on an LP, has to go through a compressor first to limit the amplitude because of physical restrictions in the offset of the groove, and also the high frequency response is limited because the groove simply can't be machined to undulate enough, accurately enough, to reproduce them, especially at 33rpm cl
Re: (Score:3)
Not everything is about gaming; I overclocked for faster compile cycles, and it makes a HUGE difference. And for gaming the market has long since figured it out, hence the huge market for overclocking GPUs: you can get custom water blocks, heatsinks, and memory coolers, all specifically for video cards. The market for overclocking GPUs dwarfs what the market for overclocking CPUs was in yesteryear; people have been overclocking GPUs since they first came out from 3dfx.
Re: (Score:2)
if they really wanted to play their first person shooter faster they should overclock the graphics card, not waste time on the CPU.
I always thought they did that too?
Yeah, I'll get modded down for offending the high priest overclockers who read this, but really, if you spend 1000s on a special cooling system for your CPU just so it runs 25% faster so you can get even more unnoticeable frames per second, you really need to get out more.
Maybe it's been a while since you were a kid, but there's something enticing about "sticking it to the man"... robbing Intel of those few $$$ by taking a cheap CPU and running it as fast as an expensive CPU. Intel (and AMD, probably) know exactly what's going on and how best to make money out of it, though :)
Re: (Score:2)
Sure, you stick it to Intel. But this grand gesture then means you give twice the extra money you'd have spent on an equivalent out-of-the-box CPU to some other faceless corp who provide overclocking kit and who may be just as venal and grasping as Intel/AMD/whoever. Plus you reduce the life of your CPU. *shrug*
Re: (Score:2)
Yes, in fact, any decent graphics card now has overclocking options right in the settings, so that you don't have to use a third party tool that may do it wrong and confuse the driver. For nVidia, for example, you just bring up the nVidia control panel and select "Performance" -> "Device settings" from the tree and you can fiddle with the clocks and on some cards even the voltages.
Re: (Score:3)
Not sure what you are on about, spending $1000 for 25% more performance. I have a cheap $22 CM212+ cooler; that's a pretty far cry from $1000. The gains are absolutely worth it: I have a 2500K with a stock speed of 3.3 GHz, overclocked to 4.5 GHz, or about a 36% increase.
$22 for 36% more performance is absolutely worth it, maybe not for gaming now, but it's definitely useful for other tasks.
Personally, I do a decent amount of encoding video files, and the speed increase is absolutely time saving.
In the words of Sir Arthur C. Clarke! (Score:3, Insightful)
When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is probably wrong.
Therefore, I rest my case.
Cheers, Arthur, a true friend who is missed but still there :)
Translation: (Score:2)
PCs are so fast these days that, for our simple minds, there is no longer a need to overclock.
And GET OFF MY LAWN!
Huge difference for game development (Score:5, Informative)
Well, let me dig up my test results spreadsheet from when I first got my 2500K CPU. Times are in seconds to complete my task in Visual Studio 2010; the first set of numbers is the system at stock clock, the second set is overclocked at 5GHz. During my game development, most of my day consists of building the game, loading the game and testing out changes or additions, so the reduction from doing that in 32s to 21s is absolutely huge; even doing code changes that don't require a total rebuild, I am waiting 3s less. It may not sound like a lot, but when you are focused, any time saved is very important; you can only be focused for so long.
Task                                                Stock (s)   5GHz (s)
build debug from clean                                 12.9        6.9
built already, go and load all effects and units        8.2        5.6
at title screen all loaded, start medium map           19.6       14.3
modify main.h build load to splash scrn                 3.4        2.1
modify main.h load into medium map                     31.9       20.9
modify main.h optimal load no sound, small map         16.9       10.3
running in game, modify main.h apply changes           10.0        6.7
average                                                14.7        9.5
system is 2500K, C300 SSD, 16GB memory
Re: (Score:2)
There's always someone who comes along and spoils an argument with facts and evidence :)
Re: (Score:2)
So I backed off to 4.7GHz ... runs a lot cooler and has been rock-solid stable. So basically got an extra GHz in performance for free ... and yea, as the OP says, all those reductions in time add up and make for a more pleasing experience.
Re: (Score:2)
Restore from sleep, I find that very interesting. I just installed a bios update to my Gigabyte Ga-MA770T-UD3P 1.0 and fixed a sleep problem that I never had before overclocking with AMD overdrive. At least, I think that's what fixed it, I didn't change anything else, but there could have been a windows update in there someplace.
Re: (Score:3)
I agree with this. As a gamer myself, I am tired of how long it takes to build the game from source in Visual Studio. When will they optimize?!
Give Clang and LLVM a try (Score:3)
I don't know how it compares to Visual Studio, but it completes builds quicker and generates faster machine code than GCC does.
The Clang command line is mostly GCC-compatible. The parser is largely Visual Studio-compatible.
It is also Open Source under a BSD-style license.
A friend gave a talk at Microsoft one day. Upon his return he told me why Windows was so slow. It turns out that all the OEMs - Dell, Gateway, HP and the like - donate hardware to Microsoft's coders so they can be certain that the next ve
I'm usually doing the opposite... (Score:4, Insightful)
we wouldn't need over clocking if our code were fa (Score:2)
-ster.
the hardware vendors devote tens of billions of dollars every year to keep up with Moore's law. this has led to the fallacy that CPU power, ram and storage are infinite, so many of today's coders don't even bother to optimize their code.
consider initializing a 2d array in a nested loop. if you increment columns in the inner loop, the memory cache will speed you up. but a dumb mistake could increment rows instead. in that case the cache actually slows your code down dramatically.
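A minimal C sketch of that loop-order point (timings are machine-dependent, and an optimizing compiler may interchange the loops, so compile without heavy optimization to see the effect):

```c
#include <stdio.h>
#include <time.h>

#define N 4096
static float a[N][N];

int main(void)
{
    clock_t t0, t1;

    /* Cache-friendly: the inner loop walks adjacent addresses,
     * so each cache line that is fetched gets fully used. */
    t0 = clock();
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            a[i][j] = 1.0f;
    t1 = clock();
    printf("inner loop over columns: %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

    /* Cache-hostile: the inner loop strides N*sizeof(float) bytes,
     * touching a new cache line (and eventually a new page) every step. */
    t0 = clock();
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            a[i][j] = 2.0f;
    t1 = clock();
    printf("inner loop over rows:    %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
    return 0;
}
```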
it is wrong that
you wouldn't need to continue your first sentence (Score:2)
in your comment if you didn't begin the comment in the subject line.
See how annoying that is? The subject line is for a subject line, not for the beginning of your comment.
In this discussion. (Score:2)
2. People who don't and like to moan about it.
Hehe (Score:2)
I see what you did there.
It's fun? (Score:2)
It reminds me of old times... (Score:2)
I remember how in 1998 it was said that overclocking was dangerous and that it could kill your processor.
My dad still uses my old overclocked Intel Celeron 300A nowadays. I only had to reduce the RAM usage (one bank is dead and searching for a new one is not worth it).
Re: (Score:2)
I remember how in 1998 it was said that overclocking is dangerous and that it can kill your processor.
Back then, it could. If you had clocked up those 300As just a little more you could have burned them up real good. Modern CPUs have thermal protection and it's relatively difficult to kill them by overclocking.
Duron 1.3GHz (Score:3)
I haven't overclocked since I bought a 1.3GHz AMD Duron around 2002. It didn't have much headroom to OC, though. I think I could only get it to around 1.4GHz and change.
Lately I don't see much use in OCing. Chips are plenty fast enough for almost anything you throw at them today, and you can buy faster chips relatively cheaply in the next year or so when your current chip is becoming insufficient.
Both the Intel i5-2500K and AMD Phenom II X6 1100T sit around $200. They aren't the fastest on the market, but at around $200 they are cheap and easily fast enough to handle anything you throw at them.
Overclocking is Out, Low-Power is In (Score:3)
I overclocked my first computers (2000-2004). I bought a budget system in parts, put it together, got online, and learned that I could make my computer even faster with a little risk and careful effort.
But then the prices of components began to fall, and I stopped overclocking new rigs in 2004. Why? Because a normal $30 heatsink was barely enough to keep some of the hotter processors cool without overclocking... and I was not willing to risk losing my processor for a few more FPS in Counter-Strike or whatever I was playing that month.
Fast-forward to now, I still leave my main computer on 24/7, but as a career-person, I need to save more (house, retirement, vacations to placate the lady) and spend less on utilities. I also have less time to clean the dust out of computer cases that effectively had hoovers for cooling. So where I used to go for a balance of cost, heat, and overclockability, I now look at cost, heat, and power consumption. I now take pride in being able to comfortably play modern games (though not at max settings) on a rig supported by a 260 watt power supply. I have no guilt leaving that on overnight.
Note: I never got into water-cooling. I never had the space or disposable income to mess around with the kit or the risk.
Low-Power is In (Score:3)
Hear, hear. Low power is the new stupid (half kidding) thing to obsess over. My most-used home computer with a UI is an Atom. In 2010, when people were drooling over how great Sandy Bridge might be, and how much kickass-per-$ the X6 Phenoms offered, I was looking for an Athlon II 240e for my server to downgrade to (eventually finding, to my joy, a 610e for sale, so that I could finally pay $130(?) for the downgrade), just so I could say I had a 45W-TDP-but-still-reasonably-powerful-for-transcoding CPU. No
Underclocking is big (Score:3)
Many industrial PCs are underclocked. They have more CPU power than they need, and they need more reliability and temperature range than the consumer manufacturers provide.
The end of overclocking is coming anyway, because speed of light lag across the chip, rather than transistor switch time, is becoming the bottleneck. No amount of cooling will help with speed of light lag.
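A back-of-the-envelope check of that claim (the 0.5c wire speed and 20 mm die size here are rough assumptions; real on-chip RC-limited wires are considerably slower, so the budget is even tighter):

```c
#include <stdio.h>

/* Rough numbers only: how far a signal can travel in one clock period. */
int main(void)
{
    const double c_mm_per_ns = 299.792; /* speed of light in mm per ns          */
    const double wire_factor = 0.5;     /* assumed on-chip propagation vs c      */
    double freqs_ghz[] = { 1.0, 3.0, 5.0, 10.0 };

    for (int i = 0; i < 4; i++) {
        double period_ns = 1.0 / freqs_ghz[i];
        printf("%5.1f GHz: period %.3f ns, signal reach ~%.0f mm (die ~20 mm across)\n",
               freqs_ghz[i], period_ns, c_mm_per_ns * wire_factor * period_ns);
    }
    return 0;
}
```

Around 5 GHz the per-cycle reach is already of the same order as the die size, which is the sense in which propagation delay, not transistor switching, becomes the wall.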
I still overclock! (Score:3)
I still overclock, and nearly every PC I've ever owned has been overclocked, including an 8088 clocked up with a radio crystal back in the day (not a great idea). I was playing with water cooling and Peltiers before you could buy ANY hardware for that off the shelf, too. Cut-down heatsinks, PVC caps, fountain pumps, and overseas-sourced Peltiers made for some really quick computers for their time! Games were fast, looked great, and I ran RC5 cracking programs to use up idle cycles for years.
Fast forward to the present. I still game, but I am not quite into the really crazy high-end stuff. I still use a PC for gaming almost exclusively. I no longer run programs in the background to eat up spare cycles, and the cooling of my room thanks me for it. I AM running a water-cooled CPU though, using mostly off-the-shelf stuff that doesn't leak; my CPU is rock stable and not quite pushed to the edge. I upgraded my computer in the not so distant past for more speed, and I'm pondering doing it again, moving to the later Sandy Bridge architecture from my older i7 920 (4.1GHz). I'm also looking at the new 6-core CPUs that have come out, but they strip H.264 instructions apparently. :-(
Why? Well, it certainly isn't gaming, since right now games seem woefully poor at using multiple cores! Now I have another "hobby" and that is compressing video. I buy BluRay discs, rip them, and put them on my personal server for viewing on efficient Atom-powered STBs (overclocked though, lol). When I was doing this with a C2D running in the mid 3-4GHz range, some movies like Watchmen took 8 hours or more to encode with my high settings. Now I can do a movie in 2 hours or less while still having CPU available to do other things. If I move to the more efficient CPUs produced now, and especially if x264 supports their ENcoding instructions one day, my times will drop again, as I should be able to hit close to 5GHz. At that point I'll either encode with higher settings or just enjoy that it's as fast as it's going to get. I boot from an SSD so that's quick enough. My video card is a fairly pedestrian GTX275, which might get a bump too, I'm not sure.
I have tinkered with using the GPU to encode as well. Right now my CPU alone can keep up with encoding on my GPU alone, but mixing my GPU and CPU together (I found ONE package doing that, and it wasn't x264) was noticeably faster yet severely limited my encoding options, so I've stuck to CPU brute force. I'm hoping that with CUDA being open sourced, more programs will begin using the GPU too.
I've processed A LOT of video and I do video for friends too sometimes. Being able to run tons of apps, lots of browser windows, and generally not care too much about what is and isn't running is a side benefit. I may try BF3 out, but I doubt it'll be so much better than UT2K4 that I'll be sold, as it will likely be exponentially more difficult to play. I'd love to find more things to do with the CPU power I have, and I do try to use power wisely. My server(s) are actually underclocked and sleep their drives when not being accessed, my video front-ends draw less than 15 watts apiece, and the PSU in this box is Silver rated and under 650 watts. Rendering or compiling code would be fun, but I am neither developer nor artist. Those who are could certainly find value in overclocking! I know a certain Apple guy who was pretty butt hurt that his 8-core powerhouse, costing quite a bit more than my computer, couldn't encode video as quickly when he challenged me :-)
P.S. Yeah, I tinker with cars too, it's fun. I also laugh at those who talk about "shortened CPU lives" - get a clue. I have had exactly ONE CPU die, and that was within the first 24 hours - warranty replaced. I've had overclocked CPUs go for 5 years or more, being passed down with no issue. This 920 has seen temps as high as 90C under full load for hours at a time, back when I had voltages too high, and it's still ticking fine. My current peak is recorded as 75C. If you REALLY want to drop some heat, water cool the video card; sadly these water blocks tend to be pretty custom, and I don't do it since an upgrade on the video card means a costly new block.