Overclocking the AMD Spider

An anonymous reader writes "AMD has released two videos that show an overview of the new AMD Spider platform and how easy it is to overclock it with a single tool. The AMD Spider is based on AMD Phenom processors, the newly released ATI Radeon HD 3800 series discrete graphics and AMD 7-Series chipsets."
This discussion has been archived. No new comments can be posted.

Comments:
  • My old PC can't handle Crysis and I've been waiting to see the Phenom before upgrading!
  • I was kinda enthused by the fact that he's an active member of the overclocking community, but to stop the video right before it actually starts with the overclocking, and then call it a video about overclocking the platform, seems...specious.

    From the first video, the platform looks interesting, but will it be able to do any of those things with just one video card (rather than FOUR)?
  • by twfry ( 266215 ) on Saturday November 17, 2007 @09:09AM (#21389457)
    I really don't see where the need to overclock comes from anymore. Today's speeds are pretty darn fast, and I'd assume that if you actually have a real need for more processing power, you should be able to come up with the couple hundred bucks for another socket/proc.

    Lately I've been undervolting to build silent systems. The latest AMD Brisbane processors at 2.1GHz can be undervolted to 1.05V and still pass my stress tests at speed, and stay below 40C with the 'silent' fan modes.
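    [Editor's note: a minimal sketch of the kind of check described above, assuming a Linux box where a hwmon driver (coretemp, k8temp, etc.) exposes temperatures under /sys/class/hwmon and a stress workload is already running in another terminal. The paths and the 40 C ceiling are illustrative, not the poster's actual setup.]

    ```python
    #!/usr/bin/env python3
    # Poll hwmon temperature sensors while a stress test runs elsewhere.
    # Assumes Linux with a hwmon driver exposing millidegree readings;
    # paths and the 40 C ceiling are illustrative only.
    import glob
    import time

    THRESHOLD_C = 40.0   # the ceiling mentioned in the comment above
    INTERVAL_S = 5

    def read_temps_c():
        temps = []
        for path in glob.glob("/sys/class/hwmon/hwmon*/temp*_input"):
            try:
                with open(path) as f:
                    temps.append(int(f.read().strip()) / 1000.0)
            except (OSError, ValueError):
                pass
        return temps

    while True:
        temps = read_temps_c()
        if temps:
            hottest = max(temps)
            status = "OK" if hottest <= THRESHOLD_C else "over threshold"
            print(f"hottest sensor: {hottest:.1f} C [{status}]")
        else:
            print("no hwmon temperature sensors found")
        time.sleep(INTERVAL_S)
    ```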
    • Re: (Score:3, Interesting)

      Everyone wants different stuff. I have less than no interest in a quiet (let alone silent) system, but I am interested in a fast system. I have never been an overclocker, but I can easily understand those who are... it's about squeezing every last drop of performance out of that chip. No different from wanting a silent system, really, as in both cases you're in a relative minority that's taking a concept to its extreme; they're just different areas.
      • by Bert64 ( 520050 )
        But how about squeezing every last bit of performance out of the software that runs on those chips?

        Incidentally, I want quiet and power-efficient for the systems I keep running 24/7 at home (my laptop, though it's usually suspended, and a media server)... Though for a system that boots up to play games and then gets turned off, I'm not so concerned.
        In a datacenter I want power efficiency and performance; noise is irrelevant.
        • But how about squeezing every last bit of performance out of the software that runs on those chips?
          This is good too, but by and large, that's the programmers' concern, not yours.
          • by Bert64 ( 520050 )
            Well, it seems pointless to go to such extremes to get as much performance as possible from your hardware (potentially shortening its lifespan or voiding its warranty), only to then waste it running inefficient software. Surely a better idea would be more efficient software coupled with more reliable hardware.

            As to your suggestion of the software performance being the programmer's concern, it could similarly be said that hardware performance is the concern of the engineers who designed it.
              • Not true. In general, it's easier to get more performance out of your hardware, especially with something like overclocking, which is about as simple as changing a software setting.
              • by sznupi ( 719324 )
                I can think of a few programs (won't mention them by name, they're pet peeves of many users here...) that feel slow no matter how fast the machine is.

                OTOH I use a few equivalents which feel much faster even on a 5-year-old machine... and it would seem that recognising good coding and NOT having to upgrade your machine constantly is quite easy...
          • This is good too, but by and large, that's the programmers' concern, not yours.

            It doesn't seem that way some days. I know that my current PC is orders of magnitude faster with respect to hardware. But it seems that programmers, or programming houses really, are just using the faster hardware to put out less efficient code. I say programming houses because they are the ones that pay the bills and want things done faster at the expense of quality so that they can sell more crap.

            I wonder how fast some

            • I wonder how fast something like Open Office would run if coded with the efficiency needed to run a program on older computers (not necessarily PCs) when they had to pay attention to resources and cycles.

              But today's programs are, in general, orders of magnitude more complex than those delightfully handcrafted versions of AppleWorks and WordPerfect from the 1980s you're feeling nostalgic for.

              The effort required to build a piece of software does not scale linearly with the complexity - a piece of software tha

      • I'd humbly argue that people making quiet/silent machines are far from the minority.

        More and more people are hooking up a computer to their TV, and with that, DVR and playback software on their computers. Now, when they pay $2000 for the screen and sound, you don't want some box going zzzzzzzzzzzzzzzzzzzzzz on the side. Does the DishNetwork box make nasty whirrrs? Does the DVD player grind when you use it?

        I don't think so. In that same light, nobody wants computers that growl like sleeping dinos.
        • In that same light, nobody wants computers that growl like sleeping dinos.
          I want a loud computer for the bedroom, to cut down on how much my wife complains about my snoring. I wouldn't like something like that as an HTPC, though.
      • Because I actually need the speeed? Core 2 E6420 @ 3.4GHz here.
    • Re: (Score:3, Interesting)

      by ceeam ( 39911 )
      Why not? Moderately overclocked CPUs generally don't consume much more power (they're usually idle, and when they're not, they finish their tasks _faster_), they're not louder, and they're not even less reliable. Usually overclocking these days is just reversing "market positioning" and restoring the proper, designed CPU speed. And if you ever do video encoding or play CPU-bound sims, you can never have enough.

      But I like silent systems too. Overclocked ones can be silent as well. The days of PIV and
      • by EdZep ( 114198 )
        Has AMD always been OC friendly? I remember when Intel was actively discouraging the practice so as not to have sales of more expensive CPUs undercut.
        • by ceeam ( 39911 )
          Who cares whether they have "always been" friendly or not? Current lines from both manufacturers are very OC friendly.
        • Re: (Score:3, Informative)

          by tlhIngan ( 30335 )

          Has AMD always been OC friendly? I remember when Intel was actively discouraging the practice so as not to have sales of more expensive CPUs undercut.

          Well, traditionally, AMD always had supply issues, so their chips tended not to be very overclockable (they had problems with yields of higher-end chips, so there were no high-end parts to re-mark as lower-end chips). However, they were easy to overclock, usually with the aid of conductive ink to restore the bridges that set the clock frequencies and multipliers of the

      • I can't imagine the cinematic quality of watching a movie with 4(!) GPUs (right next to each other for maximum cooling inefficiency) in my home. Maybe if it were The Perfect Storm, or a movie about propeller aircraft, it wouldn't be noticeable. Don't get me wrong, I MIGHT like to have that system (OK, I'd probably love it), but it seems like an inelegant way to do something.
    • Re: (Score:3, Insightful)

      by Jartan ( 219704 )

      I really don't see where the need to overclock comes from anymore.


      You seem to be looking at it from a non-gaming perspective. Considering the article is about a gaming system, that seems a bit off-topic as far as viewpoints go.
      • I really don't see where the need to overclock comes from anymore.

        You seem to be looking at it from a non-gaming perspective. Considering the article is about a gaming system, that seems a bit off-topic as far as viewpoints go.

        Also, I think it is popular among poor people because you can spend $80 on a processor and overclock it to match a processor worth triple that. See the Intel E2180 for example. For me as a college student, this is great.

    • Why is it overclock and undervolt and not overvolt and underclock?
      • Why is it overclock and undervolt and not overvolt and underclock?

        Semantics, and all that. The word choice in each phrase reflects the intended goal of the activity the phrase was created to describe.

        Overclocking is seeking the highest clock speed at which that particular processor can run, in order to maximize CPU capability, with only secondary considerations about the energy/thermal factors. It is not seeking to increase the voltage, although that is the unfortunate side effect. Thus it

    • Several reasons. I'll reference the AMD 5000+ Black Edition because I just picked one up for the new computer I'm building ($130). It's built on the 65nm process, and it overclocks easily to 3.3-3.5 GHz. I enjoy overclocking (it's a hobby) and I enjoy having a fast machine, even if I'm not using it. I also do a lot of video/audio encoding and ripping, so every MHz counts to get those tasks done quicker. So I'm getting an extra (effective) 1.8 GHz or so out of this CPU, and overall a damn fast processor, for 130 bucks.
    • I personally want both. I want a system that will self-overclock automatically and thermally throttle itself down, and which when idle will undervolt and underclock to save power, much like using the ondemand governor for Linux power management, except at a deeper level. This principle is inherent in many larger, mechanical systems, like your car's engine (provided you have an oxygen sensor). In cold weather, the air is denser, so you can burn more fuel, and you get more power. In hot weather, th
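      [Editor's note: a rough sketch of the half of that wish which already exists: switching the Linux cpufreq governor (e.g. to ondemand) through sysfs. The paths are the standard cpufreq ones; writing needs root, and the available governors depend on the kernel and driver, so treat this as an illustration rather than a recipe.]

      ```python
      #!/usr/bin/env python3
      # Show (and optionally set) the Linux cpufreq scaling governor via sysfs.
      # Standard cpufreq interface; writing a governor requires root.
      import glob
      import sys

      GOV_FILES = sorted(glob.glob(
          "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor"))

      def show():
          for path in GOV_FILES:
              with open(path) as f:
                  print(path, "->", f.read().strip())

      def set_governor(name):
          for path in GOV_FILES:
              with open(path, "w") as f:   # needs root
                  f.write(name)

      if __name__ == "__main__":
          if len(sys.argv) == 2:           # e.g. ./governor.py ondemand
              set_governor(sys.argv[1])
          show()
      ```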
    • You overclock because most games can still only use one processor core. Dual cores let you offload everything but the game onto the other core, but you're still only running the game on just one core, and you want that core running as fast as possible. Yes, this is changing and more games can take advantage of multiple CPUs, but that's the exception right now, not the rule. Additionally, even if the game can run on multiple cores and use them all, you'd still overclock memory as far as it'll go.
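      [Editor's note: a hedged, Linux-only illustration of the "offload everything but the game" idea, pinning one process to a single core with os.sched_setaffinity (Python 3.3+). The PID and core number are placeholders; in practice you'd more likely reach for taskset or the game's own launcher.]

      ```python
      #!/usr/bin/env python3
      # Restrict a process to one CPU core on Linux (what taskset does).
      # os.sched_setaffinity is Linux-only; PID and core are placeholders.
      import os
      import sys

      def pin_to_core(pid, core):
          os.sched_setaffinity(pid, {core})
          print(f"pid {pid} now limited to core(s) "
                f"{sorted(os.sched_getaffinity(pid))}")

      if __name__ == "__main__":
          target = int(sys.argv[1]) if len(sys.argv) > 1 else 0  # 0 = this process
          pin_to_core(target, 0)
      ```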
    • Yeah, except you can generally save around $50 by overclocking. That's why I do it. I can pretty much get processors for half the price of what they would be at the speed I'm running 'em. (Yes, I'm using $50 processors. What can I say, I'm on a budget.)
    • I really don't see where the need to overclock comes from anymore. Today's speeds are pretty darn fast and I'd assume that if you actually have a real need for more processing power, that you should be able to come up with the couple hundred bucks for another socket/proc.

      Quick price-point comparison here: I spent $170 on my processor (2.33 GHz) and about $60 (fan, thermal grease, and a bit extra on the case) on its cooling system. While I don't have it overclocked (the cooling system was mostly for fun), the whole thing should easily overclock to match Intel's fastest Core 2 Duo (3.0 GHz).

      The processor I would be matching is about $280, so my net gain in terms of cash is pretty minimal (though more fun, in my opinion).

      However, the processor I have, or that more expensive p

    • Why overclock? Why not? That's my question. Pretty damn fast? For whom? For a slashdotter who's programming in FORTRAN, or for a gamer who needs more clocks?

      With today's heatsinks, the most you'd need is a $40-50 heatsink, and your CPU can reach the speeds of a processor that costs double or triple what yours did. Most video cards out there can overclock without any modifications to the cooling.

      Overclocking is safe too, if you know what you're doing. If your PC starts displaying artifacts on the screen you k
  • by florescent_beige ( 608235 ) on Saturday November 17, 2007 @09:12AM (#21389467) Journal
    Here. [youtube.com]
  • by skoaldipper ( 752281 ) on Saturday November 17, 2007 @09:12AM (#21389471)
    The marketing train...

    I felt like I just got run over. Nice job, AMD. Actually, the first flashvert was pretty slick with the transformer, and was fairly informative. Honestly, I didn't extract much information from the overclocking one, except for its availability date.

    Forgive me, but it's early Saturday morning here. And in the spirit of today's morning cartoon ritual, while munching on some Lucky Charms cereal, I fully expected the overclocking advert to finish with...

    "Shh! Be vewy vewy qwiet, I'm on my AMD hunting for more raw bits! Eh. Heh! Heh! Heh! Heh!"
  • DISCRETE (Score:4, Informative)

    by Sockatume ( 732728 ) on Saturday November 17, 2007 @09:39AM (#21389633)
    Discrete = distinct, separate. Discreet = subtle, low-key. That is all.
  • What's new? (Score:5, Insightful)

    by mollymoo ( 202721 ) on Saturday November 17, 2007 @10:09AM (#21389851) Journal
    Perhaps I'm missing something, but this is nothing new at all, is it? I mean, the only "innovation" here is that one company is making the CPU, chipset and graphics card. You know, like Intel has been for years. But AMD makes one where the graphics card is targeted at gamers. Whoop-de-fucking-do.
    • Re:What's new? (Score:4, Interesting)

      by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Saturday November 17, 2007 @10:32AM (#21390013) Homepage Journal
      That IS new, and it IS a big deal. It is a sign! It's a sign that there are enough consumers who want their games to just fucking work on a PC without having to worry about what hardware they're going to buy, like a console. Sure, you don't get all the benefits of console gaming, but you don't get all the drawbacks, either. So now AMD is interested in catering to this market - it means that the market [probably] exists, which indicates that the overall gaming market is growing. That's not news to most of us, but it's still a positive sign of the direction in which the market is heading. Personally, I am more interested in integrated systems today because I am no longer chasing the latest and greatest; I just want something cheap that works. My primary system is now a laptop (albeit the most powerful one that was available at the time I purchased it) and I like it that way. I am down to one desktop system, and I have drive sleds for it so it can be a variety of testing systems. Everything else is a laptop or some other SFF unit (like my iopener, or my xbox).
    • by Bert64 ( 520050 )
      Intel's videocards have always been very much budget cards... Fine for general office computers but useless for gaming or heavy video related work.
    • Re:What's new? (Score:5, Informative)

      by moosesocks ( 264553 ) on Saturday November 17, 2007 @01:00PM (#21390981) Homepage

      I mean, the only "innovation" here is that one company is making the CPU, chipset and graphics card. You know, like Intel has been for years. But AMD makes one where the graphics card is targeted at gamers. Whoop-de-fucking-do.


      Not quite. The role of the GPU is stepping up to be much more important than "just games".

      Newer operating systems rely extensively on the GPU to render the desktop, apply various effects to it, and so on. These tasks can be as simple as alpha blending (a small CPU-side sketch of that case follows this comment), or as complex as providing a hardware-accelerated version of Photoshop.

      It's not quite there yet on Windows (Vista implements it rather poorly), but Linux and OS X have been using OpenGL acceleration on the desktop for quite some time now. In what might be a first for a 'desktop' feature, support for it on Linux is actually quite good, and provides a rather nice UI experience (once you turn all of Compiz's superfluous effects off, that is).

      I'm going to jump in here as a part-time Apple fanboy, and also point out that Apple's very heavily pushing its set of accelerated 2D Graphics libraries [arstechnica.com] toward developers to integrate into their applications to provide a more natural and fluid experience. In 10.5, OpenGL rendering is pervasive in almost every part of the user interface. Once you've got that framework in place, it becomes very easy to do all sorts of fun stuff without worrying about bogging down the CPU.

      Even fast modern CPUs perform miserably when it comes to graphics operations, as they're not designed to cope with vector and matrix operations. With high-resolution displays becoming prevalent these days, it makes a good deal of sense to offload as much of the processing as possible to the GPU. If you implement this properly in the operating system, it's even transparent to the users AND developers. It's very much a no-brainer.

      Many GPUs these days also provide accelerated support for video encoding/decoding, which is also a rather strenuous task for a normal desktop CPU to handle efficiently. Video editing applications can also take advantage by providing realtime previews of HD video rendered with effects applied to it.

      Anyone who's done a substantial amount of video editing knows just how welcome this would be. Ironically, it's a shift back to an older paradigm: the Amiga Video Toaster included an array of specialized graphics hardware to do all of the dirty work, and did it in real time.

      This might also translate into some sort of energy savings, given that modern CPUs consume very little power when idle, although this is pure speculation on my part.

      There are all sorts of fun applications for this sort of technology once the frameworks are in place. Read up on Apple's 'Core' set of libraries for a fascinating peek into the future of UI and software design. Pixelmator [pixelmator.com] is one of the first applications to take extensive advantage of these features, and is an absolute joy to work with. Although its feature set isn't as extensive as Photoshop's, it's damn impressive for a 1.0 product, and I daresay it's a hell of a lot more useful to mainstream audiences than the GIMP is, and has a sexy UI to boot. Dragging the sliders when tweaking a filter, and watching the ENTIRE image smoothly change as you drag, seems like nirvana to photographers and graphic artists (even on somewhat old hardware).

      So yes. This is a big deal. Everyday desktop software is transitioning toward relying upon the GPU for basic tasks, and AMD has stepped up to the plate to provide a decent set of entry-level graphics hardware to fill in the gap. Remember the state of video hardware before nVidia came along, and introduced the TNT2 and later the Geforce2-MX? Before them, decent 3d graphics hardware was an extravagant luxury. Afterward, it was easily affordable, and nearly ubiquitous.

      I should also point out that Intel's graphics hardware is absolute shit. That comparison's just not fair.
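      [Editor's note: as a concrete illustration of the alpha-blending case mentioned above, here is a small CPU-side sketch in Python/NumPy of the per-pixel compositing a desktop does for every translucent window on every frame, which is exactly the kind of work a GPU-accelerated desktop offloads. The resolution and pixel data are made up; this is not AMD's or Apple's actual code.]

      ```python
      #!/usr/bin/env python3
      # CPU-side alpha blending of a translucent layer over an opaque desktop.
      # Illustrative only: sizes and pixel data are random.
      import numpy as np

      H, W = 1200, 1920                                    # one full-screen layer

      fg_rgb = np.random.rand(H, W, 3).astype(np.float32)  # translucent window
      fg_a   = np.random.rand(H, W, 1).astype(np.float32)  # its alpha channel
      bg_rgb = np.random.rand(H, W, 3).astype(np.float32)  # opaque desktop behind

      def blend_over_opaque(fg_rgb, fg_a, bg_rgb):
          # Classic alpha blend: out = fg * a + bg * (1 - a)
          return fg_rgb * fg_a + bg_rgb * (1.0 - fg_a)

      frame = blend_over_opaque(fg_rgb, fg_a, bg_rgb)
      print(f"blended {H * W} pixels, {frame.nbytes / 1e6:.1f} MB per frame")
      ```

      [Doing that 60 times a second for several layers is trivial for a GPU's dedicated hardware but a noticeable load for a general-purpose CPU, which is the point the comment above is making.]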
      • I'm going to give you a great big "THANK YOU" for that link to Pixelmator. I did not know about this program. I'm really impressed with the low price, too. With so many programs well over $100, at $59 I'm going to have to give it a try, and buy it if I do like it.
        • No problem! It's shareware, so do give it a try before plunking down $60 for it. It IS of course missing some of the features you'd expect in Photoshop, although it's got more of the 'essentials' than the GIMP presently does. It's also

          There seem to be a few inexpensive graphics apps coming onto OS X, rushing to fill in the gap, given that there weren't really many options apart from the GIMP and Photoshop (one's rather undesirable, and the other's rather expensive and outdated).

          Pixelmator leads the pack,
      • Sorry, but Spider is not an entry-level system designed to pep up your accelerated desktop experience, it's intended to give you more fps in Crysis. The graphics cards which are part of Spider are ATI's fastest cards; they are gaming cards through and through and cost over $200. So while accelerated desktops are the future, AMD Spider isn't targeted at that future.

    • I mean, the only "innovation" here is that one company is making the CPU, chipset and graphics card. You know, like Intel have been for years. But AMD make one where the graphics card is targeted at gamers. Whoop-de-fucking-do.

      Soon ATI/AMD will be releasing a new high-end GPU series, called Stream [sci-tech-today.com], as a competitor to nVidia's Quadro FX series.

      Traditionally, ATI supported only 24-bit floating point numbers on their consumer-grade GPU's [whereas nVidia & Matrox supported 32-bits on their consumer-gra
    • by ameoba ( 173803 )
      The bit about overclocking is significant because there was a big stink recently about how the Phenom CPUs were going to be launching at fairly low clock speeds. If the overclocking is well supported, it really changes what shipping at low clock speeds means.
  • by bl8n8r ( 649187 )
    Sucking up mass jigawatts of power off the grid to juice 4 video cards for gaming is insane. The target groups for this rig are people with compensation problems or with no concept of, or care for, energy conservation. We're moving in the wrong direction, folks.
    • Re: (Score:3, Insightful)

      by slyn ( 1111419 )
      I don't think it's a problem you'll need to worry about anytime soon. According to this [steampowered.com], only 0.41% of about 165,000 Steam users (when I just checked) have 2 GPUs. The number is probably way smaller for 3-card users, and probably barely anyone has a 4-card setup. The performance just doesn't scale well enough in SLI/Crossfire for it to be worth it to buy two GPUs. IIRC the performance increase in framerate is only around 30% if you are using two of the same model of GPU. It's just not cost-effective enough
    • Re: (Score:2, Insightful)

      by Wrath0fb0b ( 302444 )
      Sitting at home with any amount of computing power has to be more energy efficient than taking a car anywhere.
      • by Kupfernigk ( 1190345 ) on Saturday November 17, 2007 @12:07PM (#21390629)
        In order to get to and from the office in a small European city car, with about the same real-world consumption as a Prius, I use enough fuel to produce about 6 kWh of electricity, enough to run a 4-GPU, 2-screen rig for a morning (including the monitors). That is on the very low side for commutes; the guy who commutes from the next large city in his SUV uses as much fuel in a day as I do in two weeks. If one of the ultimate goals of these systems is virtual working in a photorealistic environment, they could be big enough to need a substantial water-cooling system and still reduce global warming.
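        [Editor's note: back-of-the-envelope arithmetic behind that comparison, with every input being an assumption for illustration rather than the poster's actual figures; fuel energy content, power-plant efficiency, commute length and rig draw all vary.]

        ```python
        #!/usr/bin/env python3
        # Rough check: "a commute's worth of fuel ~ a morning of a 4-GPU rig".
        # All numbers are assumptions for illustration.
        petrol_kwh_per_litre = 9.5    # ~34 MJ/L of chemical energy
        plant_efficiency = 0.35       # typical thermal power plant
        commute_litres = 1.8          # small city car, modest round trip

        electricity_kwh = commute_litres * petrol_kwh_per_litre * plant_efficiency
        print(f"commute fuel as electricity: ~{electricity_kwh:.1f} kWh")

        rig_watts = 1000              # 4 GPUs + CPU + board (guess)
        monitor_watts = 150           # two screens (guess)
        morning_hours = 5

        rig_kwh = (rig_watts + monitor_watts) / 1000 * morning_hours
        print(f"4-GPU, 2-screen rig for a morning: ~{rig_kwh:.2f} kWh")
        ```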
        • by Shark ( 78448 )
          It isn't called global warming anymore. The new official name for it is "climate change". The UN would be in quite a pickle justifying carbon taxes if it turns out that we aren't warming in a few decades, but you can never argue that the climate isn't changing. I say they're just taking the safe bet, regardless of which side of that "consensus" (god forbid there be a debate) you are on.

          That said, I am *totally* for energy efficiency, if only because waste is bad. But if they really want to save
    • Re: (Score:1, Informative)

      by Anonymous Coward
      Since when is forward the wrong direction?
      What's wrong with having 4 graphics cards? Especially in this case ones that _aren't_ heavy on the noise or wattage side. 4 cards could be used for graphics, or some combination of graphics and physics, or just heavy "general purpose" compute power (where I use the term "general purpose" as loosely as can be applied to a graphics card...make no mistake that the kinds of apps that a GPU can accelerate are rather specialized).
    • by Kjella ( 173770 )
      Oh, give me a break. While the cards suck up a little power, you're not polluting significantly, not more than our most efficient power plants anyway. You're not contributing to the throw-away society, you're not littering or using a product that's made from animals, used for animal testing, endangering any species, or much of anything else. I doubt it's any worse than buying a bigger car, a bigger house, imported foods or a helluva lot of normal social activities. If you ever drove down to McDonalds in your SUV,
    • Dude. Have some fun in your short short life.
  • I'm desperately waiting for the ATI Imageon embedded in next-gen smartphones and Pocket PCs. I'm overclocking my Blackjack, and when I get dedicated graphics, things will be no different.
    • by fitten ( 521191 )
      Imageon, apply directly to your smartphone.
      Imageon, apply directly to your smartphone.
      Imageon, apply directly to your smartphone.
  • If AMD really wants to show that they're serious about letting the overclocking community have their way, why don't they just unlock the clock multiplier on the CPU? I remember that way back with the original Athlon, you could accomplish this with just a mechanical pencil and be well on your way to melting your CPU (with the plus side of not having to change your frontside bus, thus keeping other system components like the chipset and memory fairly happy as well). The problem was, of course, that their higher-e
  • I really don't get it. Having been a regular AMD user since 1989, I see the market share AMD gained during the last few years slipping, its lead position overtaken, its finances floundering.
    Of course, Spider has the potential to win over the hard-core gamers and overclockers (and maybe the energy-conscious underclockers). But - I didn't do the research here, just wild guessing - for every hard-core gamer, 10 or 100 CPUs are sold to the general public (desktop). And another 10 or 100 CPUs are sold to be used in servers.
    In order to sur
    • This is actually for noobs. Noobs who want cheap gaming from their Dell instead of a GMA.
      I imagine the next step that will actually make a difference to AMD's market share will be the laptop version, since Intel's graphics are still very common in portables. This is all good for AMD; OEMs are where the big money is, not us geeks buying parts on newegg.
