AMD Breaks Overclocking Record With Bulldozer 193
MojoKid writes "AMD recently held a press event at their Austin headquarters, offering hands-on time with the company's upcoming Bulldozer-based FX line of processors. Many of the details disclosed are still under NDA embargo, but AMD is allowing a sneak peek today to go along with a claimed Guinness World Record announcement. A team of overclocking enthusiasts and AMD engineers had a sampling of early AMD FX processors running at around 5GHz with high-end air and water cooling, in the 6GHz range with phase-change cooling, and well over 8GHz on liquid-nitrogen and liquid-helium setups. Voltages of over 1.9v were used as well for some of the more extreme tests. The team had access to dozens of early FX processors and methodically worked through a batch of chips until ultimately hitting a peak of 8.429GHz using liquid helium, breaking the previous world record of 8.309GHz for modern processor frequency." Update: 09/13 13:54 GMT by T : Adds user Vigile: PC Perspective was there and took some photos and video of the event.
Hmmmm...... (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Thank you - I was sorta scratching my head on that one. I had a 386 SX, then a 486 DX, and I was pretty sure that the FPU was separate on the 386.
Yup , my mistake (Score:2)
I shouldn't rely on memory. Still, 386 or 486, the point still stands.
Re: (Score:2)
I happened to grow up on a 386SX... I know that pain all too well.
Re: (Score:2)
Poor you, sounds like you didn't have a turbo button.
Re: (Score:2)
It would seem perverse to have a separate chip for the FPU these days, and I suspect in 5 or 10 years the same will be said for standalone GPUs.
Was there ever a competitive market for FPUs, or did only one FPU make and model work with each CPU? GPUs are meant to be replaced.
Re: (Score:2)
There was only one manufacturer at the time, really: Intel. Although Cyrix did start off as a math co-processor developer, come to think of it.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Ahh no.
The FPU was integrated in the 486DX; you could get a cheap 486SX that didn't have an FPU, because heck, who needed one except for CAD users?
http://en.wikipedia.org/wiki/X87 [wikipedia.org]
For the history of the x87 family.
10 years? Naw, I give it five max. Once APUs can play games at 1080p with all the eye candy, almost no one will buy a separate GPU. The day that they can drive two 1080p displays at that level, it will be all over.
Intel GMA (Graphics My ...) (Score:2)
Once APUs can play games at 1080p with all the eye candy
Given Intel's history of lagging behind NVIDIA and AMD in GPU capability, I don't see Intel GMA becoming able to "play games at 1080p with all the eye candy" any time soon. Does Sandy Bridge change this? Or were you talking about AMD CPUs with an AMD GPU on-die?
Re: (Score:2)
Sandy Bridge was (from what I've heard and read) a major improvement, but not really a game-changing one. Not enough to game at full settings at 1080p, but enough to watch videos at 1080p.
Re: (Score:2)
Re: (Score:2)
And when Intel began integrating the math chips on all future processors, I was sad, because the IIT math chips I'd had in my previous systems were much more capable than those integrated by Intel. See all about x87s [cpu-info.com] here.
Re: (Score:2)
10 years? Naw, I give it five max. Once APUs can play games at 1080p with all the eye candy, almost no one will buy a separate GPU. The day that they can drive two 1080p displays at that level, it will be all over.
1. Heat. If you have a hot CPU and a hot GPU, having both together is not such a good idea.
2. Upgrades. No more swapping graphics cards without swapping CPU (and if you're unlucky then mobo, memory etc. too)
3. If you look at the benchmarks for your favorite game, a discrete nVidia may be a better choice.
Don't get me wrong, for laptops, nettops and SoC chips this makes perfect sense but for a powerful desktop a discrete graphics card makes just as much sense. I don't think the discrete GPU is going away any
Re: (Score:2)
Once APUs can play games at 1080p with all the eye candy almost no one will buy a separate GPU.
Yeah, because no-one will find a use for all the extra power that a discrete GPU would give you.
The only reason you can play modern games on a PC with a low-end GPU with anything like the graphics you could get from a high-end GPU is that most games have been crippled to be playable on consoles with five-year-old GPUs.
Re: (Score:3)
Wow, it is amazing sometimes what people read.
"Yeah, because no-one will find a use for all the extra power that a discrete GPU would give you."
I never said that at all. I said almost no one will buy a separate GPU. Notice the word "almost", which means that some people still will.
The issue here is economies of scale. Even today the majority of systems probably use integrated graphics. Most systems sold today are notebooks, and most notebooks use IG. Throw in all the corporate desktops, school machines
Re: (Score:2)
I see this argument over and over, but it isn't analogous. The very first x86 integrated FPU was *faster* than the i387. Right now, the integrated GPU is an order of magnitude less capable than the top end discrete GPU.
Your comparison does not hold true.
Re: (Score:2)
It would seem perverse to have a separate chip for the FPU these days, and I suspect in 5 or 10 years the same will be said for standalone GPUs.
The FPU didn't use 300W.
Putting a high-end GPU onto a CPU would double the cost, require adding a huge memory bus to support the required bandwidth, make it insanely difficult to cool and force you to upgrade the CPU whenever the GPU got too slow for new games.
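The bandwidth half of that argument is easy to check with back-of-the-envelope numbers. A minimal sketch in Python, using assumed, era-typical figures (a 256-bit GDDR5 card versus dual-channel DDR3-1333), purely for illustration:

```python
# Rough peak-bandwidth comparison: discrete graphics card vs. the CPU's
# shared memory bus. Figures below are assumed era-typical values, not
# specs for any particular product.
# Peak bandwidth (GB/s) = bus_width_bits / 8 * transfers_per_second / 1e9.

def peak_bw_gb_s(bus_bits: int, transfer_mt_s: float) -> float:
    """Theoretical peak bandwidth in GB/s for a given bus width and rate."""
    return bus_bits / 8 * transfer_mt_s / 1000

discrete = peak_bw_gb_s(256, 4000)   # 256-bit GDDR5 at 4 GT/s (assumed)
shared   = peak_bw_gb_s(128, 1333)   # dual-channel DDR3-1333 (assumed)

print(f"discrete card : {discrete:.0f} GB/s")   # ~128 GB/s
print(f"shared memory : {shared:.1f} GB/s")     # ~21 GB/s
# Roughly a 6x gap: an on-die GPU sharing the CPU's memory controller has
# far less bandwidth to feed it, which is the "huge memory bus" problem.
```

The exact parts don't matter; any realistic numbers you plug in show the same order-of-magnitude gap the parent is describing.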
No New Egg Warranty (Score:3)
I bet these liquid-helium cooling kits do not come with a warranty from New Egg!
Overclocking Demonstration (Score:5, Informative)
E-peen (Score:2)
Re: (Score:2)
It's not even that, since the record is not for processing speed - just clock speed! One hopes they at least ran some benchmarks, but the article doesn't say anything about it. So, this is more like "highest RPM for an internal combustion engine" or something like that. (Which, not coincidentally, is most easily done on a small-displacement engine with little torque).
Re: (Score:2)
Umm... Just about any CPU will run for 5-plus years. Depending on what you are using your PC for, just about any CPU you get from Intel or AMD will do that.
And if this CPU will handle this level of abuse odds are that it will last for a very long time on your desktop system.
Hothardware ? (Score:2)
Shouldn't this be really, really cool hardware instead if they are using liquid-helium and all that ?
Obviously (Score:2)
You can break almost anything with a bulldozer. Records are especially easy.
It's been a long time.. (Score:2)
But.. imagine a Beowulf cluster of these!
Other benchmarks (Score:2)
It's all about memory speed! (Score:2)
Why haven't we learned our lessons yet? Processors are bottlenecked by memory speed. We were only able to get processors to go faster in the past through large memory caches and superscalar design. It's a dead end.
We need new memory technologies. We need new memory and memory buses that are able to run faster than processors. Preferably an optical technology not subject to radio and inductive interference. Solve this and systems will seem like 10 THz superscalar designs.
Then we need to reverse past trends an
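The memory-wall argument above can be put in rough numbers. A sketch, assuming an illustrative ~60 ns full main-memory latency (actual figures vary by platform):

```python
# How many CPU cycles one uncached DRAM access costs at various clocks.
# 60 ns is an assumed, roughly typical main-memory latency for the era;
# the exact number depends on the platform.

DRAM_LATENCY_NS = 60.0

def stall_cycles(clock_ghz: float) -> float:
    """CPU cycles spent waiting on a single cache-missing memory access."""
    return DRAM_LATENCY_NS * clock_ghz

for ghz in (1.0, 3.0, 8.429):
    print(f"{ghz:6.3f} GHz -> {stall_cycles(ghz):7.1f} cycles per miss")
# Raising the clock does nothing for DRAM latency, so each miss simply
# costs proportionally more cycles -- hence ever-larger caches.
```

At the record 8.429 GHz, a single miss costs on the order of 500 cycles, which is why clock speed alone stopped being the whole story.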
Re: (Score:3)
Don't you think those of us working on these chips have thought of that? This problem was recognized back in the 1940s, for goodness' sake. Quote:
Liquid He in Texas? (Score:2)
Re: (Score:2)
Re:Signal propagation limits (Score:5, Funny)
I thought the Police solved the Synchronicity problem in 1983....
https://secure.wikimedia.org/wikipedia/en/wiki/Synchronicity_(album) [wikimedia.org] - a wiki article about their research.
Re:Signal propagation limits (Score:5, Funny)
Re: (Score:2)
>>Due to the speed of light, isn't there a limitation on how fast you can clock a CPU die? As I know, eventually you will run into problems maintaining synchronicity.
A nanosecond = one foot. (Take that, metric system!)
You can thereby calculate the theoretical maximum, just by plugging in clock rate and die size.
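That rule of thumb can be sketched numerically. This is only an optimistic ceiling: it uses the speed of light in vacuum, and real on-die signals propagate considerably slower.

```python
# Upper bound on how far a signal can travel in one clock cycle.
# Light covers ~0.3 m (about one foot) per nanosecond in vacuum;
# on-chip wires are much slower, so this is a best case.

C_VACUUM_M_PER_S = 299_792_458

def max_travel_mm(clock_hz: float) -> float:
    """Distance light covers in one clock period, in millimetres."""
    period_s = 1.0 / clock_hz
    return C_VACUUM_M_PER_S * period_s * 1000

# At the record 8.429 GHz, one cycle allows at most ~35.6 mm of travel
# even at light speed -- already comparable to a large die's diagonal
# once slower on-chip propagation is accounted for.
print(f"{max_travel_mm(8.429e9):.1f} mm per cycle at 8.429 GHz")
```

So at record clock rates the "one foot per nanosecond" budget really does start to bite, which is the parent's point about synchronicity.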
Re: (Score:2)
It isn't necessarily a problem if you have multiple signals "in flight" on the same line; but it is a problem if signals that are supposed to arrive simultaneously on different lines stop doing so, as increasing clock speeds ratchet up the requirements for what "simultaneous" means.
You can already see it happening with external busses: check out the traces between th
Re: (Score:2, Funny)
>>Second is a metric system unit...
Yeah, but a foot isn't. That's the funny thing. A foot is a better natural unit for distance than a meter.
We really ought to be using kilofeet and millifeet rather than meters. =)
Re: (Score:2)
A foot is a better natural unit for distance than a meter.
No it isn't. It just seems natural to you since you're used to it. To everybody not still wedded to archaic units, meters are perfectly natural and usable.
Re: (Score:3)
>>No it isn't. It just seems natural to you since you're used to it. To everybody not still wedded to archaic units, meters are perfectly natural and usable.
Natural in the scientific sense, not the psychological sense. In other words, Kelvin are a better unit for temperature than Celsius, since you have to convert Celsius to Kelvin in order to do any thermodynamic calculations. A nano-lightsecond (aka a foot) makes a lot more sense to work with than the rather arbitrary meter unit. Call it a nls.
In fa
Re: (Score:2)
The speed of light is a limitation. Like a sibling commenter says, the speed of light is one foot per nanosecond. Propagation time through transistors is slower than this, but it's a good estimation tool. However, you don't have to have an operation go all the way through the system in a single clock cycle. Hence, pipelining.
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Protection (Score:2)
Does anyone else think that for handling liquid nitrogen and liquid fucking helium that these "experts" seem to be throwing caution to the wind? Seems more like a rave than a science experiment.
Re: (Score:2)
Re: (Score:2)
Even if it was dangerous... what a cool story to share. "Yeah, I was overclocking a new line of processors and I spilled this all over my arm."
Re:Protection (Score:4, Funny)
Next on Fox News:
"My CPU overheated while I watched porn. Wanted to add some nitrogen and that's how I lost my John Thomas"
Re: (Score:3)
Is that part of the Republican debate?
Re: (Score:2)
Re: (Score:2)
That's possible via the Leidenfrost Effect [wikipedia.org], but not for long.
Re: (Score:2)
This makes me wonder why you'd use such cooling on hot processors then. You'd have to maintain a close relative temperature across the entire processor in order to prevent that vapor barrier.
Re: (Score:2)
Re: (Score:3)
I picked up this trick in an undergrad lab from a professor, though he didn't quote it by name. Pour LN2 out of the container onto the palm of your hand - it steams and falls off onto the floor. The key is to position your palm so that nothing collects there - everything rolls off - and not to do it for too long, because even though there is a vapor barrier, it is still chilling your hand.
It looks pretty neat though, and has a lot of wow factor to someone who doesn't understand that it can easily be done s
Re: (Score:3)
Liquid nitrogen vaporizes, forming a gaseous layer which protects exposed skin. You *can* put it in your mouth, and look like you are breathing steam, but you can't swallow it (as the gaseous layer gets forced aside causing you frostbite when the liquid presses against your internal linings, and it rapidly expands in your stomach, which makes you expand too, possibly fatally). See http://darwinawards.com/personal/personal2000-25.html [darwinawards.com]
You can wash your face with it, but your hair / body hair *can* perforate t
Re: (Score:3)
No. Liquid nitrogen is easy to use safely. It's common in undergrad-level labs, can be reasonably easily purchased (like dry ice) by just about anyone, and only requires minor safety procedures. Liquid helium is more dangerous, but is common in graduate-level labs and is not particularly hard to work with if you know what you're doing. It is really quite expensive, though, and a complete waste to use for cooling something that's producing ~100W of heat, since it doesn't have a very high heat capacity.
It's p
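The parent's "complete waste" point about helium checks out on the back of an envelope. A sketch using approximate handbook values for liquid helium (latent heat of vaporization ~20.7 kJ/kg, density ~0.125 kg/L):

```python
# Rough liquid-helium boil-off rate needed to absorb a CPU's heat output
# by evaporation alone. Handbook values are approximate.

LATENT_HEAT_J_PER_KG = 20_700   # LHe latent heat of vaporization, ~20.7 kJ/kg
DENSITY_KG_PER_L = 0.125        # LHe liquid density, ~0.125 kg/L

def litres_per_hour(watts: float) -> float:
    """Litres of LHe boiled off per hour to carry away `watts` of heat."""
    kg_per_s = watts / LATENT_HEAT_J_PER_KG
    return kg_per_s / DENSITY_KG_PER_L * 3600

# An overvolted CPU dissipating ~300 W would boil off hundreds of litres
# of helium per hour -- expensive coolant for a publicity run.
print(f"{litres_per_hour(300):.0f} L/hour at 300 W")
```

Even at a modest 100 W the figure is well over a hundred litres per hour, which is why helium cooling only makes sense for short record attempts.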
Re: (Score:2)
Yeah, if you get too excited during the 'hands-on time', you'll end up losing your willy.
Re: (Score:3)
It's not that dangerous - the system operates just like a regular refrigerator. The cooling system is filled with helium gas at room-temperature. A compressor is used to compress the gas to high pressure - this causes heat to be emitted. As the pressure decreases, the helium liquefies before reaching the processor where it then heats up again and becomes gas again and the cycle is repeated.
It's not like some dude is standing on a wheeled office chair above the PC, with a flask of liquid helium in one hand,
Re: (Score:2)
Does anyone else think that for handling liquid nitrogen and liquid fucking helium that these "experts" seem to be throwing caution to the wind? Seems more like a rave than a science experiment.
I was handling liquid nitrogen straight out of high school (16-17), and used liquid helium in a limited fashion at the same employer
Re: (Score:2)
Re: (Score:2)
Re:Distraction. (Score:5, Insightful)
Re: (Score:2)
I was going to say something similar but you beat me to it.
Re: (Score:3)
Then why did you post instead of just modding him up?
You need mod points for that stunt.
Re: (Score:2)
Do you often launch into rants that have no relation at all to the post you are replying to?
Said post didn't mention "too much processing power" or "overkill" or the requirement of any other person than themselves.
In fact, all they said is that they'd take the one with 50% of the performance for 10% of the price, since it seems good enough. Obviously faster would be better, but that poster happens to get more utility spending the money on something else.
Why would you get so worked up about someone elses simple e
Re: (Score:2)
So you also don't understand the meaning of need. OK.
Clearly his computer lets him do what he does on it; that's so obvious it is a tautology. Thus it meets his need.
If you want to be technical, of course he doesn't "need" it at all. I promise, take that computer away completely and he continues to live.
You miss the point. (Score:2)
I bought a new laptop this past January. I *could* have bought a blazing fast i7 with discrete graphics but I settled for an i3 with integrated. It was good enough for my needs, cost less, and the battery lasts longer.
Sure, I could use blazing fast speeds, but I don't *need* them. And I don't want them enough to pay the extra costs in dollars and energy consumption.
Re: (Score:3, Informative)
Okay, your post is a little bit of flamebait, but I'll bite.
I can purchase the A-3850 for about $40 less than the i5-2400. Certainly, the i5 will trounce the A-3850 up to the point where you want to process graphics. Once you start to process graphics, you can't really compare the Intel HD 3000 to the on-die Radeon 6550. If I want comparable graphics, I have to purchase a $150 graphics card. I can also purchase the ASUS FM1-Pro motherboard for about $50 less than a similar ASUS board for the i5 (with comp
Re: (Score:2)
The OEMs who ship the bulk probably don't even notice them, and I'm guessing that the market for ludicrously overclocked servers is pretty much nonexistent.
Re: (Score:2)
Re: (Score:2)
HFT servers have rack units with hot-swappable CPUs. They're considered consumable. They overclock them to high heaven and pop in new ones when they burn out. I wouldn't be surprised if they've come up with a magazine / chain loading system by now...
Re: (Score:2)
Okay, excluding the L33T gamers, super heavy CAD users, HD video producers, Movie F/X houses, and research labs, no one needs the power of Intel's high-end desktop chips. Frankly, some of those users, like the video and science users, do most of their stuff on the server CPUs anyway.
Impractical... Well, yes, today, but I thought the same thing when I heard about Intel running a CPU at 100 MHz with LN2 cooling back in the '80s. Of course that was when we thought an 8 MHz system was fast. They were getting 5 MHz on a
Re: (Score:2)
Okay, excluding the L33T gamers, super heavy CAD users, HD video producers, Movie F/X houses, and research labs, no one needs the power of Intel's high-end desktop chips.
I'm none of the above, but I don't like waiting more than a minute for a program to compile. And I compile often.
Re: (Score:2)
Get an SSD and more ram.
Even then, unless you are writing giant modules, which you are not supposed to do, you will not need to wait more than a minute to compile. After all, you only compile the module you are working on.
I use a Core2 and my compile times are well under a minute. Now if I do a clean and build it is longer but no one does that often.
Re: (Score:2)
I believed not so long ago that CPUs were way over the speeds the average consumer needs, that is until I watched processor usage on a modern laptop doing live high resolution video chat.
For people who want to do face-to-face teleconferencing, high-speed real-time video encoding at good resolutions and framerates will really eat up your CPU power. It's not an edge case anymore either, as more and more people expect to be able to do high-quality video calls (and not the crap MSN pawned off on people for years
Re: (Score:2)
Well on a modern system that should be off-loaded to the GPU. Even then I would bet that a Sandy-Bridge i3 or i5 would be more than good enough for that task. You sure don't need a quad core i7 for that.
Re: (Score:2)
Re: (Score:2)
"Okay excluding the L33T gamers, super heavy CAD users, HD video producers, Movie F/X houses, and research labs no one needs the power of Intel's high end desktop chips. Frankly some of those users like the video and science users do most of their stuff on the server CPUs anyway."
Bull fucking shit.
Many non-geek amateurs/home users definitely want as much CPU performance as possible. In fact, as someone else wrote in a comment on this site a few months back, many software dev geeks need less CPU/RAM/disk tha
Re: (Score:2)
They don't play with Intel at the high end in the desktop but then I don't know many people that pay $1000+ for a desktop CPU.
They don't play with Intel at the "high end" of $280 http://www.electronics-emporium.com/products/Intel-Core-i7%252d2600K-Processor-3.4GHz-8-MB-Cache-Socket-LGA1155-%252d%252d-1FO004ROHKFN06.html [electronics-emporium.com], which quite a few people do pay for.
Re: (Score:2)
Stop being so willfully obtuse. Intel sells parts all over the range, competing at nearly every pricepoint, and in general offering a better value proposition to boot (performance/price). For example:
http://techreport.com/articles.x/21208/18 [techreport.com]
So yeah, cut the fanboy crap, and the FUD about Intel being so expensive. You can get newer Intel parts just a hair above $100, and they're still good in that range. At least as good as, if not better than, similarly priced AMD chips.
Re: (Score:2)
Every single time you find a benchmark which significantly disagrees with the large sample set taken by PassMark on cpubenchmark.net, you have found a benchmark that is complete bullshit. Now that you know that techreport puts out bullshit benchmarks... will you ever attempt to cite them again?
Re: (Score:3)
You can make excuses all you want for why you would rather believe the obviously less realistic benchmarks of the tech sites, but they are still just excuses.
While you think I need to get over myself.. you need to get over the obvious flaws in the "information" delivery framework that you have been listening to.
You claim that I am listening to one source, when in fact...
Re: (Score:2)
Impractical PR stunts have their uses. They help to focus attention on what is possible today, and what will become commonplace in years to come. 8 GHz may remain impractical for another two years, or ten years. But, eventually, people will come to expect that, and more. Someone has to do the early experimenting!
Re: (Score:2)
And someone did. Back in the NetBurst era, 8 GHz was attained. Must mean NetBurst was a great architecture.... :snicker:
Re: (Score:2)
Re: (Score:2)
Would you care to do the math on FLOPS per dollar? Because if you care about money at all, AMD has been winning the performance game for a while now.
Re: (Score:2)
Not quite true. It was true for some real-world workloads, while Intel won at some other real-world workloads. Then came Sandy Bridge and kicked AMD's ass all over. I went from being a 9 year long user of AMD over to Sandy Bridge when it was time to replace my old, aging Athlon 64 3500+, mobo and RAM, because for what I do, the i5-2500 beat the shit out of the X6 1055T(which was more expensive than the i5-2500 at the time too...)
And Sandy Bridge is low and mid end for Intel. High-end desktop/consumer perfor
Re: (Score:2)
Re: (Score:2, Funny)
Give away the processor, and sell the liquid helium. The Gillette model all over again.
Re: (Score:2)
AMD has been far ahead of Intel in chip design. They have been behind in core design. AMD got rid of the FSB with a NUMA earlier, had real quad cores with L3 cache earlier, and properly integrated graphics earlier (Intel's CPU performance drops quickly when the integrated GPU is used in SB).
And in the real world Intel has still been ahead of them for most uses since at least the Core-2.
As for this demo, good luck running your CPU at 1.9V for more than a few weeks.
Re: (Score:2)
1.9V lifespan would be measured in minutes or hours, not days or weeks. At room temperature, 1.9V would damage the processor almost immediately, if not kill it.
In fancy demonstrations and testing like this, the main magic comes not from voltage, but from superconductivity within the processor at the cold temperatures. The cold also increases the lifespan of the chip when using exorbitant amounts of voltage. This doesn't matter to most people or maybe even the poster I'm replying to, but I'm mentioning it h
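The warning about 1.9V follows from first-order CMOS scaling: dynamic power goes roughly as C·V²·f. A sketch with an assumed stock operating point (the stock figures below are illustrative, not AMD's actual specs):

```python
# First-order dynamic-power scaling for CMOS: P ~ C * V^2 * f.
# The stock voltage/frequency below are assumed for illustration only.

def power_ratio(v_new: float, f_new: float,
                v_old: float, f_old: float) -> float:
    """Dynamic power at (v_new, f_new) relative to (v_old, f_old),
    assuming switched capacitance C stays constant."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Assumed stock point: 1.3 V at 3.6 GHz. Record run: 1.9 V at 8.429 GHz.
ratio = power_ratio(1.9, 8.429, 1.3, 3.6)
print(f"~{ratio:.1f}x stock dynamic power")
# Roughly 5x the stock power in the same silicon -- survivable only
# because extreme cooling carries the heat away and the run is short.
```

Leakage and electromigration make the real picture worse than this first-order estimate, which is consistent with the parent's "minutes or hours" lifespan claim at room temperature.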
Re: (Score:2)
Re: (Score:2)
check here: http://en.wikipedia.org/wiki/Point_(geometry) [wikipedia.org]
Re: (Score:2)
I at least lol'd at your creativity.
Also Goatse is on a Russian domain now? In Soviet Russia, Goatse disgusted by YOU!
Re: (Score:2)
Overall, clocking a CPU the fastest (that we know of) (for right now): cool. Congratulations, dudes. You did something awesome and we all recognize how nifty that is.
Over-clocking a CPU the most: AMD employees ought to be disqualified, at least when they're using AMD chips. For an overclocking record, you shouldn't be connected to the organization that rated the chip in the first place. If I were to make my own 3-GHz-worthy Pentium 4 then I could rate it as a 100 MHz processor and I would win an overclocking record too. How do we really know these aren't 6 GHz Bulldozer chips to begin with, but under-rated, other than AMD's word that they're not? Not that I seriously believe shenanigans are happening, but the potential for it makes it not cool. AMD employees going for an OC record should have to use an Intel chip (or ARM or whatever, but not AMD) and vice-versa.
This isn't necessarily a record for overclocking. This is a record for CLOCKING. It doesn't matter if the chip was rated at 9GHz or 9MHz. The fact is that no chip, regardless of its initial rating, has ever reached a clock speed that high.
Re: (Score:2)
Pretty sure it's an absolute GHz record, not one relative to the chip's nominal speed.
Re: (Score:2)
If I understand the summary correctly, it's not the record for the most overclocking-- rather, it's the record for the fastest clock speed of a computer.
8+ GHz is no simple feat.