Overclocked Radeon Card Breaks 1 GHz
dacaldar writes "According to Yahoo Finance, noted Finnish over-clockers Sampsa Kurri and Ville Suvanto have made world history by over-clocking a graphics processor to engine clock levels above 1 GHz. The record was set on the recently-announced Radeon® X1800 XT graphics processor from ATI Technologies Inc."
Huzzah! (Score:5, Funny)
Re:Huzzah! (Score:4, Funny)
Re:Huzzah! (Score:4, Informative)
http://www.xtremesystems.org/forums/showthread.php?p=1104977#post1104977 [xtremesystems.org]
sampsa is asked
Were you able to run any benchmarks at that speed or was that a Windows stable shot???? Anyway that is still hella fast with no artifacts.
sampsa's response
Just a Windows shot in 2D.
so still impressive, but not what they describe
A Fraud! A Sham! A Scandal! (Score:2, Insightful)
Awesome! (Score:1, Funny)
First Imagine a Beowulf Cluster of those post
First Can Netcraft confirm that? post
Re:Awesome! (Score:2, Funny)
Hmm. Your ideas are intriguing to me and I wish to subscribe to your
newsletter.
One wonders... (Score:5, Interesting)
Re:One wonders... (Score:3, Funny)
Re:One wonders... (Score:1)
Look at the bottom of the page. It's just a press release announcing that ATI was the first to get to 1 GHz. Basically a "fuck you" to Nvidia, nothing more.
Re:One wonders... (Score:5, Insightful)
Since their product is still mostly vapor (you can't buy it yet), and nVidia is currently owning them in the high-end market because ATI's product is so late, one has to grasp at straws in order to try to look l33t in the eyes of potential purchasers.
Wish they'd spend less time yapping and more time actually putting product on the shelves.
Nice overclock in any case, but ATI putting out a press release about it is kinda silly
Re:One wonders... (Score:4, Informative)
Re:One wonders... (Score:2)
The X1800 XL is out (though even it is not yet buyable in most parts of Europe), but the XT is nowhere to be seen - and it's been three weeks since the 'launch'.
No big surprise (Score:5, Funny)
Overclocked Radeon Card Breaks
1 GHz
Was wondering why an overclocked card breaking is such a big deal
Benchmarks? (Score:5, Funny)
Re:Benchmarks? (Score:2)
Video drivers, as notoriously buggy and fragile as they are, don't handle clock speed changes very well at all. You can get a few percent without much problem, but the difficulty starts climbing much faster than with CPU overclocking.
Congrats, though - it's only a matter of time until it happens in production chips.
Re:Benchmarks? (Score:5, Informative)
http://www.muropaketti.com/3dmark/r520/12419.png [muropaketti.com]
Pictures of their setup/methods:
http://www.muropaketti.com/3dmark/r520/ghz/ [muropaketti.com]
Re:Benchmarks? (Score:2)
Re:Benchmarks? (Score:5, Informative)
in other words... still impressive (no other chip has been able to overclock to 1 GHz, even in 2D mode) but not quite what you were hoping for
GPU to excel CPU (Score:1)
Re:GPU to excel CPU (Score:3, Interesting)
Re:GPU to excel CPU (Score:2)
Umm, you're out of your mind. Or more precisely, you're trying too hard to guard your statements. "not that much faster" is rubbish.
Obviously, for "general purpose computing" a GPU would not only not be "that much faster" than a CPU, but indeed, it would be significantly slower
If this weren't so, we'd of course be using our GPUs as CPUs (or more likely, construct CPUs the way GPUs are constructed)
Re:GPU to excel CPU (Score:3, Interesting)
What I've been waiting for is some sort of mathematics program (I used to use Mathematica in college) that could utilize this concentrated pow
Re:GPU to excel CPU (Score:3, Informative)
Re:GPU to excel CPU (Score:5, Informative)
Re:GPU to excel CPU (Score:3, Informative)
Generic GPU programming [gpgpu.org]
Re:GPU to excel CPU (Score:4, Informative)
Go here [gpgpu.org] for several examples of this -- far from simply having been proposed, it's been done a fair number of times.
The thing to keep in mind with this is that while the GPU has a lot of bandwidth and throughput, most of that is due to a high degree of parallelism. Obviously 1 GHz hasn't been a major milestone for CPUs for quite a while, but CPUs are only recently starting to do multi-core processing, while GPUs have been doing fairly seriously parallel processing for quite a while.
Along with that, the GPU has a major advantage for some tasks in having hardware support for relatively complex operations that require a fair amount of programming on the CPU (e.g. multiplying and inverting small vectors; it typically has a single instruction to find the Euclidean distance between two 3D points).
That means the GPU can be quite a bit faster for some things, but it's a long ways from a panacea -- you can get spectacular results applying a single mathematical transformation to a large matrix, but if you have a process that's mostly serial in nature, it'll probably be substantially slower than on the CPU.
Along with that, development for the GPU is generally somewhat difficult compared to development on the CPU. Writing the code itself isn't too bad, as there are decent IDEs (e.g. ATI's RenderMonkey), but you're working in a strange (though somewhat C-like) language. Much worse is the essentially complete lack of debugging support. Along with that, you have to take the target GPU into account in the code (to some extent). I just got a call in the middle of a meeting this morning from one of my co-workers, pointing out that some of my code works perfectly on my own machine, but not at all on any of his. I haven't had a chance to figure out what's wrong yet, but I'm betting it stems from the difference in graphics controllers (my machine has an nVidia board but his has Intel "Extreme"(ly slow) graphics).
--
The universe is a figment of its own imagination.
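The point above about applying a single mathematical transformation to a large matrix can be sketched in plain NumPy - a CPU-side illustration of the data-parallel style GPUs are built around, not actual shader code (the array sizes and transform here are made up):

```python
import numpy as np

# A million 3D points, one per row -- a stand-in for per-vertex data.
points = np.random.rand(1_000_000, 3)

# One uniform transform applied to every point at once: the
# embarrassingly parallel pattern the parent comment describes.
scale = np.diag([2.0, 2.0, 2.0])
transformed = points @ scale.T

# The "single instruction for Euclidean distance" mentioned above,
# expressed as one whole-array operation.
distances = np.linalg.norm(transformed, axis=1)

print(transformed.shape, distances.shape)  # (1000000, 3) (1000000,)
```

The serial-versus-parallel caveat still applies: each of these whole-array operations maps well to a GPU precisely because no element depends on any other.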
Also the precision of GPUs is limited (Score:2)
16-bit floating point numbers may be a problem due to their limited precision.
Re:Also the precision of GPUs is limited (Score:2)
Re:Also the precision of GPUs is limited (Score:2)
Still, to put it into perspective, Intel floating point goes up to 80 bits (less if you're using the vector units).
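The precision worry is easy to demonstrate: half precision has only an 11-bit significand, so 2049 is the first integer it cannot represent. A quick check with NumPy's `float16` (which mirrors the 16-bit format in question; this is an illustration, not GPU code):

```python
import numpy as np

# Adding 1 to 2048 in half precision rounds straight back to 2048:
# the increment is silently lost.
a = np.float16(2048.0) + np.float16(1.0)
print(a, a == np.float16(2048.0))  # 2048.0 True

# Double precision (or x86's 80-bit extended format) keeps it exact.
b = np.float64(2048.0) + np.float64(1.0)
print(b)  # 2049.0
```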
Re:GPU to excel CPU (Score:2)
GPU chips are designed to do a certain type of calculation (matrix multiplication) as quickly as possible, and, unsurprisingly, they can do it a lot faster than chips designed for a much wider array of calculations. 3D graphics is a sufficiently popular application that it has caused chips to be designed for rapid matrix multiplication, but various other applications requi
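The matrix multiplication workload described above is, at its core, the 4x4 homogeneous transform applied to every vertex in a 3D scene. A minimal CPU-side sketch in NumPy (illustrative only; the specific transform is invented):

```python
import numpy as np

# 4x4 homogeneous transform: translate by (1, 2, 3).
translate = np.array([
    [1.0, 0.0, 0.0, 1.0],
    [0.0, 1.0, 0.0, 2.0],
    [0.0, 0.0, 1.0, 3.0],
    [0.0, 0.0, 0.0, 1.0],
])

# A vertex in homogeneous coordinates.
vertex = np.array([1.0, 1.0, 1.0, 1.0])

# A GPU performs this matrix-vector product for every vertex in the
# scene, every frame -- hence hardware dedicated to exactly this.
print(translate @ vertex)  # [2. 3. 4. 1.]
```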
Speed play offs. (Score:5, Funny)
A bit presumptious? (Score:5, Insightful)
I think that's going a bit far. Good for them and everything, but world history? V-E day, Einstein's 1905, Rosa Parks refusing to give up her seat on the bus- these events impact world history (sorry for the all-Western examples); making a chip oscillate faster than an arbitrary threshold does not.
Re:A bit presumptious? (Score:5, Funny)
Re:A bit presumptious? (Score:2)
Re:A bit presumptious? (Score:2)
Cuz I already had one WWII event.
My post was not intended to be a comprehensive list of great events in World History. I also omitted, e.g., the expulsion of the Persians from Greece circa 480 BC, and so forth.
What matters with Rosa Parks was what happened afterward. A civil rights movement that fundamentally impacted US culture, and the rights of ~20M African Americans. Maybe not up there with WWII, but not inconsequential.
And it was also a nod to acknowledge her due to her recent death.
3D Mark? (Score:1)
Re:3D Mark? (Score:2)
http://www.muropaketti.com/3dmark/r520/12419.png [muropaketti.com]
Re:3D Mark? (Score:2)
Global Warming (Score:1, Redundant)
The culprit (Score:5, Funny)
World history? (Score:3, Insightful)
Uh, it's cool and all but not likely to be in the history books. (easy on that hyperbole, will ya)
Re:World history? (Score:3, Funny)
(*Think about it*^)
Re:World history? (Score:2)
Until someone writes the Wikipedia entry for it. I can see it five years from now...
On this date in 2005 [wikipedia.org], ATI Technologies [wikipedia.org] successfully crossed the 1 GHz [wikipedia.org] barrier with Computer Graphics Cards [wikipedia.org], ushering in a new era in computer gaming [wikipedia.org].
Re:World history? (Score:2)
Someday people will ask... (Score:5, Funny)
Seriously, "world history"? There's no historical significance here. It was inevitable, and no big deal.
Historical? (Score:1, Insightful)
Re:Historical? (Score:2)
How is this any more historical than overclocking it to 993 MHz?
What if Apollo 11 had travelled 99.3% of the way to the moon? What if the Manhattan Project built a bomb with only 99.3% of a critical mass of Uranium in it? What if the Continental Congress had gotten 99.3% of the votes to declare independence, but then decided just to stay a British colony?
History is made by those who achieve something, not by those who just come really close and then fail. Centuries from now, when this event is looked
Re:Historical? (Score:2)
We'll just see (Score:5, Funny)
Re:We'll just see (Score:2)
Re:We'll just see (Score:2)
Re:We'll just see (Score:2)
What's the point of these tests? (Score:4, Insightful)
And as for being the first people in the world to do this... the chances of that are small. I'm sure there are people at Radeon (and other companies) who have done things far more bizarre, but didn't announce it to the world.
Re:What's the point of these tests? (Score:2)
Re:What's the point of these tests? (Score:3, Insightful)
Really, I don't think it's interesting whatsoever. It's like testing the strength of various bulletproof glass samples at a temperature of -100 C. The fact is, bulletproof glass is not used in such environments so the test gives no useful information.
Besides, since when are geeky pursuits practical?
I can't believe you're being serious. My geeky pur
Re:What's the point of these tests? (Score:2)
Re:What's the point of these tests? (Score:2)
Yeah, but it's also well known that if you force more air/gasoline/nitrous into a car engine, it will go faster. But people continue to try to break land speed records. It's human nature. People do it just for the sake of doing it.
The company is ATI, and no, they don't do anythi
Re:What's the point of these tests? (Score:2)
Yeah I goofed, sorry.
They don't care what happens to their chips at -80C or +180C. All they do is test them a little bit beyond the limits of their recommended operating range.
I highly doubt it. I have some pretty intimate knowledge of what sorts of things go on at a certain giant chipmaker, and believe me, all kinds of crazy shit has been done that you don't know about simply because they don't tell you. I imagine ATI is similar. I don't know if they've done anything exactly like
Re:What's the point of these tests? (Score:2)
Re:What's the point of these tests? (Score:2)
I am in no way a chip engineer (in fact I recently dropped out of a computer engineering program to switch to business), so it's just speculation on my part, but the logic seems to w
Not for the weak (Score:5, Insightful)
It seems we may have a ways to go before it can be done with standard air cooling. I actually didn't think that operating temperatures for these processors went down to -80C.
Why so cold? (Score:2)
Re:Why so cold? (Score:2)
Re:Why so cold? (Score:2)
Actually I doubt that you understand my point, but to answer your question: anything accepting heat at a temperature well below that of the uncooled system is OK. This need not be a cryogenic temperature.
If you would stick the processor in ice water it would remain at more or less zero degrees C, the only problem being that evapo
Re:Why so cold? (Score:2)
Re:Why so cold? (Score:2)
Firstly, it is rather easy to make a permanent 'ice water reservoir' (or similar) while it is nontrivial to produce liquid nitrogen. Admittedly for a short test this is no real issue (bear in mind I never said it was BAD, just asked if it had any advantages to use liquid N).
Secondly this large temperature difference between the coolant and a
Re:Not for the weak (Score:2)
comon now (Score:5, Funny)
GPU vs. CPU Speed (Score:1, Interesting)
Are they made on a different process? Are they made with different materials? Are there significantly more transistors on a GPU?
Why don't we have a 3Ghz GPU?
Re:GPU vs. CPU Speed (Score:4, Informative)
GPUs have also tended to focus on parallel execution - at least over the last few years - increasing the number of pixels handled at the same time to compensate for not being able to hit multi-GHz speeds. So yes, they have many more transistors than typical CPUs (the 7800 GTX might break 300 million, well over 250 million) - and of course heat is an issue if you push the voltage and/or clock speeds too far. The last few generations of GPUs have been up around 65-80W real-world draw, more than most CPUs out there. And of course GPUs have very little room for cooling in those expansion slots.
Re:GPU vs. CPU Speed (Score:2)
You say this as if it is both true and obvious. In fact, it is false. For several years now, floating point arithmetic has been just as fast as (and in many cases faster than) integer arithmetic. I have actually sped up code that was previously cleverly written to use integers, just by switching to doubles.
What is still very slow is converting from integers to floats, and vice versa
Re:GPU vs. CPU Speed (Score:2)
Re:GPU vs. CPU Speed (Score:5, Insightful)
To answer both questions: graphics are trivial to parallelize. You know to start with that you'll be running essentially the same code for all pixels, and each pixel is essentially independent from its neighbours. So doing one or twenty at the same time is mostly the same, and since all you need is to make sure the whole screen is rendered, each pipeline just needs to grab the next unhandled pixel. No synchronization difficulties, no nothing. Since pixel pipelines don't stop each other for syncing, you effectively have a 24 GHz processor in this beast.
On the other hand, you have an Athlon 64 X2 4800+ (damn, that's a needlessly big, numbery name). It has two cores, each running at 2.4 GHz (2.4 * 2 = 4.8, hence the name, I believe). However, for safe use of two processors for general computing purposes, lots of timing trouble has to be handled. Even if you do have those two processors, a lot of time has to be spent making sure they're coherent, and the effective performance is well below twice that of a single processor at twice the clock speed.
So, if raising the speed is easier than adding another core, and gives enough performance benefits to justify it, without the added programming complexity and errors (there was at least one privilege elevation exploit in linux that involved race conditions in kernel calls, IIRC), why go multiple processor earlier than needed? Of course, for some easily parallelized problems, people have been using multiprocessing for quite a while, and actually doing two things at the same time is also a possibility, but not quite as directly useful as in the graphics card scenario.
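The "each pixel is independent" argument above can be sketched as a toy software renderer splitting scanlines across a worker pool. Nothing here is real GPU code; the stand-in shader and image size are invented for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 48

def shade(x, y):
    # Stand-in "pixel shader": the same code runs for every pixel,
    # and each result depends only on (x, y) -- no shared state.
    return (x * 31 + y * 17) % 256

def render_row(y):
    return [shade(x, y) for x in range(WIDTH)]

def render_serial():
    return [render_row(y) for y in range(HEIGHT)]

def render_parallel(workers=4):
    # Rows are handed to the pool in any order; because no pixel
    # reads another pixel's output, the only "synchronization" is
    # collecting the finished rows back in order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_row, range(HEIGHT)))

print(render_parallel() == render_serial())  # True
```

This is exactly why adding pixel pipelines scales so cleanly, while two general-purpose cores need the coherency machinery described above.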
And you thought slashdot was bad (Score:1)
It was 2D mode only (Score:5, Interesting)
Re:It was 2D mode only (Score:5, Funny)
[1] Going downhill
Re:It was 2D mode only (Score:2, Interesting)
The phrase "maximum system stability" though might be misleading. If you define it as just POSTing, then man I've done some awesome overclocking myself!
Interesting that these overclockers are "noted", and "Finnish." That does sort of give
Cheating (Score:2)
I'm gonna wait until you can get 1 GHz with a practical cooling solution before getting too excited (though the way CPUs are heading these days, cryogenic cooling may come as stock in a few years!)
Re:Cheating (Score:2)
GPUs, on the other hand, aren't anywhere near as overclockable, so this is quite the hack.
Re:Cheating (Score:2)
Quake IV (Score:1, Funny)
This record already broken (Score:3, Informative)
Hot Hot Hot! (Score:2, Funny)
Re:Hot Hot Hot! (Score:2)
FPS (Score:5, Funny)
That's sad... (Score:5, Funny)
Re:That's sad... (Score:2)
Curious thought - the human brain has about 50% of its mass devoted to processing sight and patterns in images... Sounds about right, doesn't it?
Re:That's sad... (Score:2)
the brain also relies on quantum effects for certain computational effects, has a 1.5 GB hardware ROM (DNA) that includes the full specification of every tissue and every protein in the body... and can store and recall an i
Re:That's sad... (Score:2)
It's true for me...
Because it's worth zilch without... (Score:5, Informative)
O RLY? (Score:3, Informative)
BFG GeForce(TM) 7800 GTX OC(TM) with Water Block. Factory overclocked to 490MHz / 1300MHz (vs. 400MHz / 1000MHz standard), this built-to-order card will feature a water block instead of a GPU fan for those wanting to purchase or who may already have an existing liquid-cooled PC system. BFG will hand-build your card using Arctic Silver 5 Premium Thermal Compound. Easily hooked up to any existing 1/4" tubing system or to 3/8" tubes with the included adapters, this card runs cool and silent. BFG Tech is proud to offer their true lifetime warranty on this graphics card. (Card with water block requires internal or external water cooled system, sold separately.)
Re:O RLY? (Score:2)
Re:O RLY? (Score:3, Insightful)
It'll just about... (Score:5, Funny)
Re:It'll just about... (Score:2)
Thank you very much, you bastard.
warming up? eke! (Score:2)
To -80 degrees? Ouch! How cold were you before that?
BFD! (Score:2)
WTF? (Score:2)
On the other hand, 'almost 2 GHz memory speed' is a little more meaningful. At least they mentioned that.
n00b (Score:2, Funny)
Re:Wow, watch the paint dry too. (Score:2)