Overclocked Radeon Card Breaks 1 GHz

dacaldar writes "According to Yahoo Finance, noted Finnish over-clockers Sampsa Kurri and Ville Suvanto have made world history by over-clocking a graphics processor to engine clock levels above 1 GHz. The record was set on the recently-announced Radeon® X1800 XT graphics processor from ATI Technologies Inc."
  • by syphax ( 189065 ) on Wednesday October 26, 2005 @04:49PM (#13884029) Journal
    have made world history

    I think that's going a bit far. Good for them and everything, but world history? V-E day, Einstein's 1905, Rosa Parks refusing to give up her seat on the bus- these events impact world history (sorry for the all-Western examples); making a chip oscillate faster than an arbitrary threshold does not.
  • World history? (Score:3, Insightful)

    by Seanasy ( 21730 ) on Wednesday October 26, 2005 @04:51PM (#13884053)
    ... have made world history...

    Uh, it's cool and all, but not likely to be in the history books. (Easy on that hyperbole, will ya?)

  • Re:One wonders... (Score:5, Insightful)

    by Jarnis ( 266190 ) on Wednesday October 26, 2005 @04:51PM (#13884056)
    ... because ATI made a big press release about it.

    Since their product is still mostly vapor (you can't buy it yet), and nVidia is currently owning them in the high-end market because ATI's product is so late, one has to grasp at straws to try to look l33t in the eyes of potential purchasers.

    Wish they'd spend less time yapping and more time actually putting product on the shelves.

    Nice overclock in any case, but ATI putting out a press release about it is kinda silly.
  • Historical? (Score:1, Insightful)

    by Anonymous Coward on Wednesday October 26, 2005 @04:51PM (#13884060)
    How is this any more historical than overclocking it to 993 MHz? It's not! 1 GHz is just a nice round number. If I overclock one to 1.82 GHz tomorrow, no one will care!
  • by pclminion ( 145572 ) on Wednesday October 26, 2005 @04:52PM (#13884067)
    If you cool a chip, you can make it run faster. This is a matter of physics that doesn't need to be tested any more than it already has been. In some small way I appreciate the geek factor but I'm far more interested in geek projects that have some practical use.

    And as for being the first people in the world to do this... the chances of that are small. I'm sure there are people at ATI (and other companies) who have done things far more bizarre, but didn't announce it to the world.

  • Not for the weak (Score:5, Insightful)

    by Mr. Sketch ( 111112 ) <<moc.liamg> <ta> <hcteks.retsim>> on Wednesday October 26, 2005 @04:52PM (#13884071)
    The team, optimistic that higher speeds could ultimately be achieved with the Radeon X1800 XT, attained the record speeds using a custom-built liquid nitrogen cooling system that cooled the graphics processor to minus-80 degrees Celsius.

    It seems we may have a ways to go before it can be done with standard air cooling. I actually didn't think that operating temperatures for these processors went down to -80C.
  • by pclminion ( 145572 ) on Wednesday October 26, 2005 @05:07PM (#13884173)
    Yes, but a lot of different chips have different overclocking potential. It's interesting to see which can be pushed the furthest, even if it's impractical.

    Really, I don't think it's interesting whatsoever. It's like testing the strength of various bulletproof glass samples at a temperature of -100 C. The fact is, bulletproof glass is not used in such environments so the test gives no useful information.

    Besides, since when are geeky pursuits practical?

    I can't believe you're being serious. My geeky pursuits pay for my house.

  • by xouumalperxe ( 815707 ) on Wednesday October 26, 2005 @05:40PM (#13884426)
    Well, while the CPU people are finally doing dual core processors (essentially, two instruction pipelines in one die, plus cache et al), the GPU people have something like 24 pipelines in a single graphics chip. Why is it that the CPU people have such lame parallelism?

    To answer both questions: graphics are trivial to parallelize. You know to start with that you'll be running essentially the same code for every pixel, and each pixel is essentially independent of its neighbours. So doing one or twenty at the same time is mostly the same, and since all you need is to make sure the whole screen is rendered, each pipeline just needs to grab the next unhandled pixel. No synchronization difficulties, no nothing. Since pixel pipelines don't stop each other to sync, you effectively have a 24 GHz processor in this beast.
    On the other hand, you have an Athlon 64 X2 4800+ (damn, that's a needlessly big, numbery name). It has two cores, each running at 2.4 GHz (2.4 * 2 = 4.8, hence the name, I believe). However, for safe use of two processors for general computing purposes, lots of timing trouble has to be handled. Even if you do have those two processors, a lot of time has to be spent making sure they're coherent, and the effective performance is well below twice that of a single processor at twice the clock speed.

    So, if raising the speed is easier than adding another core, and gives enough performance benefit to justify it, without the added programming complexity and errors (there was at least one privilege escalation exploit in Linux that involved race conditions in kernel calls, IIRC), why go multiprocessor earlier than needed? Of course, for some easily parallelized problems people have been using multiprocessing for quite a while, and actually doing two things at the same time is also a possibility, but not quite as directly useful as in the graphics card scenario.
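The parallelism argument above can be sketched in Python. This is a minimal, made-up illustration (the shade() function and the 4x4 framebuffer are hypothetical stand-ins, not anything from a real GPU): because each pixel's output depends only on its own coordinates, "pipelines" can pick up pixels in any order with no synchronization, and the rendered result is identical.

```python
def shade(x, y):
    """Toy per-pixel shader: output depends only on (x, y), never on
    another pixel's result. This independence is what makes the work
    embarrassingly parallel."""
    return (x * 31 + y * 17) % 256

WIDTH, HEIGHT = 4, 4
pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]

# Sequential render, row by row.
sequential = {(x, y): shade(x, y) for (x, y) in pixels}

# "Parallel" render: process pixels in an arbitrary (here, reversed)
# order, the way independent pipelines would grab whatever pixel is
# next. No locks, no coordination, and the framebuffer comes out the
# same because no pixel reads another pixel's output.
scrambled = {(x, y): shade(x, y) for (x, y) in reversed(pixels)}

assert sequential == scrambled
```

Contrast this with general-purpose CPU code, where threads routinely share mutable state and the locking the comment describes becomes unavoidable.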
  • Re:O RLY? (Score:3, Insightful)

    by GarfBond ( 565331 ) on Wednesday October 26, 2005 @06:15PM (#13884702)
    Good job not even reading the summary. The card you referenced is under half the speed of this ridiculously overclocked ATI card, which happens to be 1 GHz core / 1.8 GHz memory.
  • by TheVorp ( 888771 ) on Thursday October 27, 2005 @10:57AM (#13888969) Homepage
    Not sure if anyone else noticed, but the video card in question was actually overclocked to 1003.50 MHz, not exACTly a true GHz, eh? This is the same kind of math that rips me off of gigs on my mp3 player and HDD. Nevertheless, I welcome the first video card to be clocked to 0.97998046875 GHz!
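The 0.97998046875 figure above is just binary-prefix arithmetic (treating one "giga" as 1024 mega-units, the same convention that shrinks advertised gigs on storage). Clock speeds actually use decimal SI prefixes, so the card really did pass 1 GHz; this quick check just reproduces the commenter's math:

```python
mhz = 1003.50          # the overclocked core speed quoted above

# "Binary" gigahertz, as the joke computes it: giga = 1024 mega.
binary_ghz = mhz / 1024
print(binary_ghz)      # 0.97998046875

# Decimal (SI) gigahertz, the convention clock rates actually use.
decimal_ghz = mhz / 1000
print(decimal_ghz)     # 1.0035
```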
