Overclocking the AMD Spider
An anonymous reader writes "AMD has released two videos that show an overview of the new AMD Spider platform and how easy it is to overclock it with a single tool. The AMD Spider is based on AMD Phenom processors, the newly released ATI Radeon HD 3800 series discrete graphics and AMD 7-Series chipsets."
Re: (Score:1)
Photos can be seen at the links below:
http://www.forumpix.co.uk/uploads/1195321353.jpg [forumpix.co.uk]
http://www.forumpix.co.uk/uploads/1195321370.jpg [forumpix.co.uk]
http://www.forumpix.co.uk/uploads/1195321383.jpg [forumpix.co.uk]
http://www.forumpix.co.uk/uploads/1195321393.jpg [forumpix.co.uk]
Just release it already (Score:2, Offtopic)
Re:Just release it already (Score:5, Funny)
Overclocking? (Score:1)
From the first video, the platform looks interesting, but will it be able to do any of those things with just one video card (rather than FOUR)?
Why overclock when you can undervolt? (Score:3, Interesting)
Lately I've been undervolting to build silent systems. The latest AMD Brisbane processors at 2.1GHz can be undervolted to 1.05V and still pass my stress tests at speed, and stay below 40C with the 'silent' fan modes.
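For anyone who wants to verify that kind of undervolt, here's a minimal sketch of the check I'd run: poll the CPU temperature sensors while a stress test runs in another terminal, and flag anything over the target. It assumes a Linux box exposing temperatures through the standard hwmon sysfs files (k8temp or similar for that era of AMD chips); the paths, polling interval, and the 40C limit are illustrative, not anything AMD ships.

    # Minimal undervolt sanity check: poll hwmon CPU temperature sensors while a
    # stress test runs elsewhere, and flag anything above a threshold.
    # Assumes Linux with a hwmon temperature driver loaded; paths and the 40 C
    # limit are illustrative, not universal.
    import glob
    import time

    LIMIT_C = 40.0          # the "stays below 40C" target mentioned above
    POLL_SECONDS = 5
    DURATION_SECONDS = 300

    def read_temps_c():
        temps = []
        for path in glob.glob("/sys/class/hwmon/hwmon*/temp*_input"):
            try:
                with open(path) as f:
                    temps.append((path, int(f.read().strip()) / 1000.0))  # millidegrees -> C
            except (OSError, ValueError):
                pass  # sensor may be unreadable; skip it
        return temps

    end = time.time() + DURATION_SECONDS
    worst = 0.0
    while time.time() < end:
        for path, temp in read_temps_c():
            worst = max(worst, temp)
            if temp > LIMIT_C:
                print(f"WARNING: {path} reports {temp:.1f} C (limit {LIMIT_C} C)")
        time.sleep(POLL_SECONDS)
    print(f"Hottest reading over the run: {worst:.1f} C")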
Re: (Score:3, Interesting)
Re: (Score:2)
Incidentally, I want quiet and power-efficient for the systems I keep running 24/7 at home (my laptop, though it's usually suspended, and a media server)... Though for a system that boots up to play games and then gets turned off, I'm not so concerned.
In a datacenter I want power efficiency and performance; noise is irrelevant.
Re: (Score:2)
Re: (Score:2)
As to your suggestion that software performance is the programmer's concern, it could similarly be said that hardware performance is the concern of the engineers who designed it.
Re: (Score:2)
Re: (Score:2)
OTOH I use a few equivalents which feel much faster even on a 5-year-old machine... and it would seem that recognising good coding, and NOT having to upgrade your machine constantly, is quite easy...
Re: (Score:2)
It doesn't seem that way some days. I know that my current PC is orders of magnitude faster with respect to hardware. But it seems that programmers, or programming houses really, are just using the faster hardware to put out less efficient code. I say programming houses because they are the ones that pay the bills and want things done faster at the expense of quality so that they can sell more crap.
I wonder how fast some
Re: (Score:2)
But today's programs are, in general, orders of magnitude more complex than those delightfully handcrafted versions of AppleWorks and WordPerfect from the 1980s you're feeling nostalgic for.
The effort required to build a piece of software does not scale linearly with the complexity - a piece of software tha
Re: (Score:2)
More and more people are hooking up a computer to their TV, and with that, DVR and playback software on their computers. Now, when they pay $2000 for the screen and sound, you don't want some box going zzzzzzzzzzzzzzzzzzzzzz on the side. Does the DishNetwork box make nasty whirrrs? Does the DVD player grind when you use it?
I don't think so. In that same light, nobody wants computers that growl like sleeping dinos.
Re: (Score:2)
I want a loud computer for the bedroom, to cut down on how much my wife complains about my snoring. I wouldn't like something like that as an HTPC, though.
Re: (Score:1)
Re: (Score:3, Interesting)
I like silent systems too. But overclocked ones can be silent as well. The days of PIV and
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Bus overclocking has gotten so much easier anyhow, since most chipsets let you do it without altering the PCI bus, for example. That used to be the issue back in the day - peripherals would start to flake out.
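To put numbers on that: the CPU clock is just the reference/bus clock times the multiplier, and on older chipsets the PCI clock was derived from that same bus through a fixed divider, which is why peripherals flaked out. A back-of-the-envelope sketch, with all figures hypothetical:

    # CPU clock = reference clock x multiplier. Numbers are made up, just to
    # show why locking the PCI bus matters when pushing the reference clock.
    multiplier = 10.5
    stock_ref_mhz = 200.0
    oc_ref_mhz = 230.0

    print(f"stock CPU: {stock_ref_mhz * multiplier:.0f} MHz")        # 2100 MHz
    print(f"overclocked CPU: {oc_ref_mhz * multiplier:.0f} MHz")     # 2415 MHz

    # On older chipsets the PCI clock came from the same bus via a fixed
    # divider, so raising the bus dragged peripherals out of spec:
    pci_divider = 6  # hypothetical divider giving ~33 MHz at stock
    print(f"PCI at stock: {stock_ref_mhz / pci_divider:.1f} MHz")
    print(f"PCI overclocked: {oc_ref_mhz / pci_divider:.1f} MHz (outside the 33 MHz spec)")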
Re: (Score:3, Informative)
Well, traditionally, AMD always had supply issues, so their chips tended not to be very overclockable (they had problems with yields of higher-end chips, so there were no high-end parts to remark as lower-end chips). However, they were easy to overclock, usually with the aid of conductive ink to restore the bridges that set the clock frequencies and multipliers of the
4 GPUs!!! = Loud, Hot, Expensive (Score:2)
Re:4 GPUs!!! = Loud, Hot, Expensive (Score:4, Funny)
That's a lot of GPUs. 4 factorial factorial is about a mole.
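In case anyone wants to check the math, (4!)! is 24!, which really does land within a few percent of Avogadro's number:

    # Checking the joke: (4!)! = 24! is indeed about a mole.
    from math import factorial

    four_bang_bang = factorial(factorial(4))   # (4!)! = 24!
    avogadro = 6.022e23

    print(f"(4!)! = {four_bang_bang:.3e}")                        # ~6.204e+23
    print(f"ratio to Avogadro: {four_bang_bang / avogadro:.2f}")  # ~1.03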
Re: (Score:1)
Re: (Score:3, Insightful)
You seem to be looking at it from a non-gaming perspective. Considering the article is about a gaming system, that seems a bit off-topic as far as viewpoints go.
Re: (Score:1)
You seem to be looking at it from a non-gaming perspective. Considering the article is about a gaming system, that seems a bit off-topic as far as viewpoints go.
Also, I think it's popular among poor people because you can spend $80 on a processor and overclock it to match a processor worth triple that. See the Intel E2180, for example. For me as a college student, this is great.
Re: (Score:2)
Re: (Score:2)
Why is it overclock and undervolt and not overvolt and underclock?
Semantics, and all that. Each phrase's wording reflects the intended goal of the activity it was coined to describe.
Overclocking is seeking the highest clock speed at which that particular processor can run, in order to maximize CPU capability, with only secondary consideration for the energy/thermal factors. It is not seeking to increase the voltage, although that is the unfortunate side effect. Thus it
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
I really don't see where the need to overclock comes from anymore. Today's speeds are pretty darn fast, and I'd assume that if you actually have a real need for more processing power, you should be able to come up with the couple hundred bucks for another socket/proc.
Quick price-point comparison here: I spent $170 on my processor (2.33 GHz) and about $60 (fan, thermal grease, and a bit extra on the case) on its cooling system. While I don't have it overclocked (the cooling system was mostly for fun), the whole thing should easily overclock to match Intel's fastest Core 2 Duo (3.0 GHz).
The processor I would be matching is about $280, so my net gain in terms of cash is pretty minimal (though more fun in my opinion).
However, the processor I have, or that more expensive p
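Working the parent's numbers out explicitly (quoted prices only, ignoring electricity and warranty):

    # The overclock-the-cheap-chip route vs. buying the faster chip outright,
    # using the prices quoted above.
    cheap_cpu = 170    # 2.33 GHz part
    cooling = 60       # fan, thermal grease, case extras
    fast_cpu = 280     # 3.0 GHz part it would be overclocked to match

    total_oc_build = cheap_cpu + cooling
    print(f"overclocked route: ${total_oc_build}")        # $230
    print(f"buy-the-fast-chip route: ${fast_cpu}")        # $280
    print(f"net savings: ${fast_cpu - total_oc_build}")   # $50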
Why overclock? It's cost efficient! (Score:2)
With today's heatsinks, at most what you'd need is a $40-50 heatsink, and your CPU can reach the speeds of a processor that costs double or triple what yours did. Most video cards out there can overclock without any modifications to the cooling.
Overclocking is safe too, if you know what you're doing. If your PC starts displaying artifacts on the screen you k
Full Video on Youtube (Score:5, Informative)
Re:Full Video on Youtube (Score:5, Informative)
Re: (Score:1)
Choo! Choo! All aboard! (Score:3, Insightful)
I felt like I just got run over. Nice job, AMD. Actually, the first flashvert was pretty slick with the transformer, and was fairly informative. Honestly, I didn't extract much information from the overclocking one, except for its availability date.
Forgive me, but it's early Saturday morning here. And in the spirit of today's morning cartoon ritual, while munching on some Lucky Charms cereal I fully expected the overclocking advert to finish with...
"Shh! Be vewy vewy qwiet, I'm on my AMD hunting for more raw bits! Eh. Heh! Heh! Heh! Heh!"
Wascally raw bits? (Score:2)
Re: (Score:1)
DISCRETE (Score:4, Informative)
Re: (Score:1)
Re:DISCRETE (Score:5, Funny)
Re: (Score:1)
seperate =
Re: (Score:1)
sepErate = sepArate (Score:2)
Well, maybe not on
What's new? (Score:5, Insightful)
Re:What's new? (Score:4, Interesting)
Re: (Score:2)
Re:What's new? (Score:5, Informative)
Not quite. The GPU's role is becoming much more important than "just games".
Newer operating systems rely extensively on the GPU to render the desktop, apply various effects to it, and so on. These tasks can be as simple as alpha blending, or as complex as providing a hardware-accelerated version of Photoshop.
It's not quite there yet on Windows (Vista implements it rather poorly), but Linux and OS X have been using OpenGL acceleration on the desktop for quite some time now. In what might be a first for a 'desktop' feature, support for it on Linux is actually quite good, and provides a rather nice UI experience (once you turn all of Compiz's superfluous effects off, that is).
I'm going to jump in here as a part-time Apple fanboy, and also point out that Apple's very heavily pushing its set of accelerated 2D Graphics libraries [arstechnica.com] toward developers to integrate into their applications to provide a more natural and fluid experience. In 10.5, OpenGL rendering is pervasive in almost every part of the user interface. Once you've got that framework in place, it becomes very easy to do all sorts of fun stuff without worrying about bogging down the CPU.
Even fast modern CPUs perform miserably when it comes to graphics operations, as they're not designed to cope with vector and matrix operations. With high-resolution displays becoming prevalent these days, it makes a good deal of sense to offload as much of the processing as possible to the GPU. If you implement this properly in the operating system, it's even transparent to the users AND developers. It's very much a no-brainer.
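To make that concrete: even the "simple" alpha-blending case is a per-pixel multiply-add over millions of independent pixels, which is exactly the data-parallel shape GPUs are built for. Here's a tiny CPU-side numpy sketch just to show the structure of the work; a real compositor would hand the same math to the GPU as a shader, and the 1920x1200 layer size is arbitrary.

    # Alpha blending one full-screen RGBA layer over another: the same
    # independent multiply-add applied to every pixel. This runs on the CPU via
    # numpy purely to illustrate the shape of the workload.
    import numpy as np

    h, w = 1200, 1920
    src = np.random.rand(h, w, 4).astype(np.float32)   # RGBA source layer
    dst = np.random.rand(h, w, 4).astype(np.float32)   # RGBA destination layer

    alpha = src[..., 3:4]                               # per-pixel source alpha
    blended = src[..., :3] * alpha + dst[..., :3] * (1.0 - alpha)

    print(blended.shape, blended.dtype)                 # (1200, 1920, 3) float32
    print(f"{h * w:,} pixels blended with the same independent multiply-add")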
Many GPUs these days also provide accelerated support for video encoding/decoding, which is a rather strenuous task for a normal desktop CPU to handle efficiently. Video editing applications can also take advantage by providing realtime previews of HD video rendered with effects applied.
Anyone who's done a substantial amount of video editing knows just how welcome this would be. Ironically, it's a shift back to an older paradigm: the Amiga Video Toaster included an array of specialized graphics hardware to do all of the dirty work, and did it in real time.
This might also translate into some sort of energy savings, given that modern CPUs consume very little power when idle, although this is pure speculation on my part.
There are all sorts of fun applications for this sort of technology once the frameworks are in place. Read up on Apple's 'Core' set of libraries for a fascinating peek into the future of UI and software design. Pixelmator [pixelmator.com] is one of the first applications to take extensive advantage of these features, and is an absolute joy to work with. Although its feature set isn't as extensive as Photoshop's, it's damn impressive for a 1.0 product; I'd dare say it's a hell of a lot more useful to mainstream audiences than the GIMP, and it has a sexy UI to boot. Dragging the sliders when tweaking a filter, and watching the ENTIRE image smoothly change as you drag, seems like nirvana to photographers and graphic artists (even on somewhat old hardware).
So yes. This is a big deal. Everyday desktop software is transitioning toward relying upon the GPU for basic tasks, and AMD has stepped up to the plate to provide a decent set of entry-level graphics hardware to fill in the gap. Remember the state of video hardware before nVidia came along, and introduced the TNT2 and later the Geforce2-MX? Before them, decent 3d graphics hardware was an extravagant luxury. Afterward, it was easily affordable, and nearly ubiquitous.
I should also point out that Intel's graphics hardware is absolute shit. That comparison's just not fair.
Re: (Score:2)
Re: (Score:2)
There seem to be a few inexpensive graphics apps coming onto OS X, rushing to fill in the gap, given that there weren't really many options apart from the GIMP and Photoshop (one's rather undesirable, and the other's rather expensive and outdated).
Pixelmator leads the pack,
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
AMD & 64-bit CAD/CAM/OpenGL (Score:2)
I mean, the only "innovation" here is that one company is making the CPU, chipset and graphics card. You know, like Intel have been for years. But AMD make one where the graphics card is targeted at gamers. Whoop-de-fucking-do.
Soon ATI/AMD will be releasing a new high-end GPU series, called Stream [sci-tech-today.com], as a competitor to nVidia's Quadro FX series.
Traditionally, ATI supported only 24-bit floating point numbers on their consumer-grade GPUs [whereas nVidia & Matrox supported 32 bits on their consumer-gra
Re: (Score:2)
I don't see the value (Score:2, Insightful)
Re: (Score:3, Insightful)
Re: (Score:2, Insightful)
I think this counts as insightful (Score:4, Insightful)
Re: (Score:1)
That said, I am *totally* for energy efficiency, if only because waste is bad. But if they really want to save
Re: (Score:1, Informative)
What's wrong with having 4 graphics cards? Especially in this case ones that _aren't_ heavy on the noise or wattage side. 4 cards could be used for graphics, or some combination of graphics and physics, or just heavy "general purpose" compute power (where I use the term "general purpose" as loosely as can be applied to a graphics card...make no mistake that the kinds of apps that a GPU can accelerate are rather specialized).
Re: (Score:2)
Re: (Score:3)
Where's Imageon... (Score:1)
Re: (Score:1)
Imageon, apply directly to your smartphone.
Imageon, apply directly to your smartphone.
This is just marketing plush (Score:1)
Priorities? (Score:2)
Of course, Spider has the potential to win over the hard-core gamers and overclockers (and maybe the energy-conscious underclockers). But - I didn't do the research here, wild guessing - for every hard-core gamer, 10 or 100 CPUs are sold to the general public (desktop), and another 10 or 100 are sold for use in servers.
In order to sur
Re: (Score:2)
I imagine the next step that will actually make a difference to AMD's market share will be the laptop version, since Intel integrated graphics are still very common in portables. This is all good for AMD; OEMs are where the big money is, not us geeks buying parts on newegg.
Re: (Score:2, Informative)
Maybe it was a problem at my end, but he just talked about this mythical tool for a while, then just as you really start to get into it - the video ends.
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)