AMD Demonstrates "Teraflop In a Box"
UncleFluffy writes "AMD gave a sneak preview of their upcoming R600 GPU. The demo system was a single PC with two R600 cards running streaming computing tasks at just over 1 Teraflop. Though a prototype, this beats Intel to ubiquitous Teraflop machines by approximately 5 years." Ars has an article exploring why it's hard to program such GPUs for anything other than graphics applications.
ubiquitous (Score:5, Insightful)
Look up 'ubiquitous' before you whine about how far behind Intel might seem to be.
Though having one demonstration will help spur the demand, and the demand will spur production, I still think it'll be five years before everybody's grandmother will have a Tf lying around on their checkbook-balancing credenza, and every PHB will have one under their desk warming their feet during long conference calls.
Re:OOOoooo (Score:5, Insightful)
Simple: they aren't available. PCs don't typically come with DSPs. But they do come with graphics, and if you can use the GPU for things like this, it's a nice dovetail. For someone like that radio manufacturer, there's no need to force the consumer to buy more hardware. It's already there.
The first rule of teraflop club... (Score:5, Insightful)
And the second rule of teraflop club...
Don't mention the wattage...
Back here in the real world where we PAY FOR ELECTRICITY, we're waiting for some nice FLOPS/Watt. Keep trying, guys.
And they announced this some time ago, didn't they?
Worthless Preview (Score:3, Insightful)
It also included some pictures of the cooling solution that will completely dominate the card. Not that a picture of a microchip with "R600" written on it would be a lot better, I guess. Although the pictures are fuzzy and hard to see, it looks like it might require two separate Molex connections, just like the 8800s.
Re:Not sonar? (Score:4, Insightful)
You use ambient sound instead of radiating a signal yourself, and you try to resolve the entire environment, rather than just the sound emitting elements in the environment. This makes you a lot harder to detect; it also makes resolving what is going on a lot more difficult. Hence the need for lots of CPU power. In the water or in the air. Passive sonar - at least typically - is intended to resolve (for instance) a ship or a weapon that is emitting noise. But the sea is emitting noise all the time - waves, fish burping, whale calls, shrimp clicking - all kinds of noise, really. Using that noise as the detecting signal is the trick, and it isn't very similar to normal sonar, in terms of what kind of computations or results are required. Classic sonar gives you a range and bearing; this kind of thing is aimed at giving you an actual picture of the environment. It's a lot harder to do, but man, is it cool.
Well...duh (Score:5, Insightful)
See, in the early days the FPU was a separate chip (anyone remember buying an 80387 to plug into their mobo?). Writing code to use the FPU was also a complete pain in the ass, because you had to use assembly, with all the inherent memory management and interrupt handling headaches. FPUs from different vendors weren't guaranteed to have completely compatible instruction sets. Because it was such a pain in the ass, only highly special-purpose applications made use of FPU code. (And it's not that computer scientists hadn't thought up appropriate abstractions to make writing floating-point code easy. Compilers just weren't spitting out FPU code.)
Then, things began to improve. The FPU was brought on die, but as an optional component (think 486SX vs 486DX). Languages evolved to support FPUs, hiding all the difficulty under suitable abstractions so programmers could write code that just worked. More applications began to make use of floating-point capabilities, but very few required an FPU to work.
Finally, the FPU was brought on die as a bog-standard part of the CPU. At that point, FPU capabilities could be taken for granted, and an explosion of applications requiring an FPU for decent performance ensued (see, for instance, most games). And writing FPU code is now no longer any more difficult than declaring type float. The compiler handles all the tricky parts.
I think GPGPU will follow a similar trajectory. Right now, we're in phase one. Using a GPU for general-purpose computation is such an incredible pain that only the most specialized applications are going to use GPGPU capabilities. High-level languages haven't really evolved to take advantage of these capabilities yet. And again, it's not as though computer scientists don't have appropriate abstractions that would make coding for GPGPU vastly easier. Eventually, GPGPU will become an optional part of the CPU. Eventually high-level languages (in addition to the C family, perhaps FORTRAN or Matlab or other languages used in scientific computing) will be extended to use GPGPU capabilities. Standards will emerge, or where hardware manufacturers fail to standardize, high-level abstractions will sweep the details under the rug. When this happens, many more applications will begin to take advantage of GPGPU capabilities. Even further down the road, GPGPU capabilities will become bog standard, at which point we'll see an explosion of applications that need them for decent performance.
Granted, the curve for GPGPU is steeper because this isn't just a matter of different instructions, but a change in memory management as well. But I think this kind of transition can and will eventually happen.
I get a kick out of the tags people assign ... (Score:2, Insightful)
Re:OOOoooo (Score:2, Insightful)
NOTE: it's not the cost of the units, but the cost of doing anything useful with them. For a person NOT integrating the parts into mass-produced items, it's only suitable for someone doing something simple as a hobby, or for learning. I would *guess* that building anything to solve a problem in practice would take an incredibly large amount of time and skill, both of which are valuable resources even if they are your own. The cost of parts is only the total cost if you consider your time to be worthless. Making a DSP output a nice spectrogram of the airwaves wandering past your house is fine; making one that can perform underwater imaging is a different kettle of fish. Building something that can do that and then writing the code for it would not be a one-man job, and it would not be cheap.
Lunch money for public high school over 10 years: $10,000
College education: $100,000
Ability to read: Priceless.