Audio Processing on Your Graphics Card? 335
edsarkiss writes "BionicFX has announced Audio Video EXchange (AVEX), a technology that transforms real-time audio into video and performs audio effect processing on the GPU of your NVIDIA 3D video card, the latest of which are apparently capable of more than 40 gigaflops of processing power compared to less than 6 gigaflops on Intel and AMD CPUs." Another reader points out a story on Tom's Hardware.
Makes perfect sense... (Score:5, Interesting)
Having all that processing power available to do more than just shift pixels makes perfect sense. I'm just surprised that nobody thought of doing it sooner.
Re:Makes perfect sense... (Score:5, Informative)
Emmm, what [psu.edu] about [psu.edu] this [psu.edu], for example?
3dfx Commercials (Score:3, Funny)
About 4-5 years ago there were some 3dfx commercials where an engineer walks around the plant talking about how powerful their new processor is and how it could be used to "save the world." Then over the loudspeaker comes the message "Scrap that, we are going to use it for games instead." Next we see the engineers all crowded around a computer, and one screams "Blow his freaking head off!"
Ad Critic used to have them before they went for profit.
Re:Makes perfect sense... (Score:5, Interesting)
Back in the 386 days, one of our professors was working on liquefaction (the ground sometimes behaves like a liquid during earthquakes). The models were very trig-intensive and took forever on a desktop. So he wrote the simulations in PostScript and sent them to the printer, where its processor could do the work much faster.
Re:Makes perfect sense... (Score:4, Informative)
The printers had the best processors available for his work. Of course, he was the first of our profs to have a linux box on his desktop, and the first to do parallel processing on the several sun sparc 2 workstations we had.
He was always pretty clever at using the computing resources at hand!
Re:Makes perfect sense... (Score:3, Informative)
I really can't understand this hostility towards PDF. Some renderers may be quite slow, but Apple has shown that it's perfectly possible to write an efficient Preview application. Perhaps someday Adobe will step up to the challenge.
Re:Makes perfect sense... (Score:3, Funny)
Re:Makes perfect sense... (Score:3, Informative)
It's easy to be surprised when you're wrong: BrookGPU: General Purpose Programming on GPUs [slashdot.org] December 2003.
Re:Makes perfect sense... (Score:4, Informative)
Re:Makes perfect sense... (Score:5, Informative)
And another post:
How can the price range be so low when the processing power is claimed to be so many times faster than Intel chips?
First, silicon area doesn't necessarily mean performance. The whole reason that IBM, AMD and Intel are building multi-core chips [slashdot.org] is that so much of the area in a modern microprocessor is spent on workarounds for different structural hazards rather than on real work. The GPUs are huge because they are parallel mathematical computation engines. On a FLOP per sq. mm basis, they are a LOT more efficient than a single-core CPU could hope to be.
As WIAKywbfatw points out, GPUs became more powerful than CPUs (on a FLOP basis) a decade or more ago. This was the whole reason Intel created the AGP port - to prevent the GPU from becoming the center of the computer (it was a huge threat to their business).
Today, silicon is more and more about customization... on a FLOP basis, the chips in HD digital TVs have nearly the performance of the latest P4 - but at MUCH less cost, because they are less flexible (a LOT less flexible). Their design optimizes single-precision floating-point performance. You can't use that power for a long-running simulation ("scientific computing") - only for graphics, where single precision is still orders of magnitude more precise than the monitor can display.
Are you totally fucked up? (Score:3, Insightful)
GPUs were NEVER a threat to CPUs. They only became usable for ANYTHING but graphics with the introduction of vertex and pixel shaders, e.g. with the R100 or NV10 chips. Really usable are only chips with PS2.0, and even those can rarely achieve "better than CPU" performance, even with tuned algorithms (the main problems are memory access fragmentation breaking the caching strategies and causing pipeline stalls (wasting 100s of cycles), and multipass overhead, because many implementations need 1000s of passes).
10 years
Re:Makes perfect sense... (Score:3, Informative)
NVIDIA used the term GPU [nvidia.com] to refer to its fixed-function T&L-capable NV10 chip, which was released on August 31, 1999 as the GeForce 256.
The AGP 1.0 standard dates to 1996, and it was intended to provide fast bandwidth for textures and video.
Not a fair comparison (Score:3, Informative)
GPUs are special purpose... CPUs are not.
Great for audio workstations... (Score:4, Interesting)
And the ability to get a few frags in while the band is taking a break isn't too bad either!
Re:Great for audio workstations... (Score:2, Insightful)
Re:Great for audio workstations... (Score:2, Insightful)
Re:Great for audio workstations... (Score:3, Funny)
Re:Great for audio workstations... (Score:2, Funny)
Re:Great for audio workstations... (Score:4, Insightful)
Code coprocessor (Score:3, Insightful)
Oh, wait. They already are, but they're just trying to do most of this stuff with an x86 chip. Silly. It's not inconceivable that the future of PCs is a block of powerful media processors where the x86 chip will end up being th
Re:Code coprocessor (Score:5, Insightful)
Re:Great for audio workstations... (Score:5, Interesting)
So far Cann cannot get as much performance out of the GPU as he would like. "Right now, getting the data back from the video card is very slow, so the overall performance isn't even close to the theoretical max of the card. I am hoping that the PCI Express architecture will resolve this. This will mean more instances of effects running at higher sample rates," he said.
So it appears that there may really be a problem here... a GPU will normally do a bunch of calculations, then the raster goes *out* to the monitor, not *back* to the bus, so I can see how getting data back out to the bus might be an issue. A "real" DSP/audio card would certainly be better, and they aren't *all* as expensive as the original article would have you believe... a quick Google found at least one [mtlc.net] decent-looking DSP card for ~$500 out there, and I'm sure there are others, probably cheaper (the quoted price is for a card *and* a stack of software), if you looked around a bit. If you're considering plunking down the cash for a PCI Express machine and a good GPU, you probably have ~$500 for a good DSP card, too, and a special-purpose solution *designed* for the purpose at hand is almost always going to be better than repurposing a *different* special-purpose product.
Did that make sense? What I'm trying to say is that you'd be much better off buying an actual DSP audio card than buying two GPUs. That'd just be silly. This repurposed GPU stuff is just for folks unwilling to buy an extra card, but who have a nice GPU already.
40gflops?! how well does it crack dnet keys? (Score:4, Interesting)
Price range of $200 to $800... (Score:2)
Re:Price range of $200 to $800... (Score:3, Insightful)
Re:Price range of $200 to $800... (Score:2, Funny)
Re:Price range of $200 to $800... (Score:3, Funny)
Re:Price range of $200 to $800... (Score:2, Funny)
Re:Price range of $200 to $800... (Score:2)
Re:Price range of $200 to $800... (Score:4, Interesting)
GPUs are not really all that powerful compared to a CPU, but they're working with a totally different set of constraints.
Re:Price range of $200 to $800... (Score:3, Funny)
I bet someone out there on
Re:Price range of $200 to $800... (Score:5, Insightful)
That's what GPUs are designed for -- performing massively iterative algorithms on sets of data and returning the processed dataset. There are lots of algorithms that might benefit from this: encoding better digital video, searching for patterns, crunching numbers for encryption, etc. There are also lots of algorithms that would be NO GOOD -- SQL select statements, for example, or rendering web pages. Basically, any time processing is low and I/O is high, the GPU is a bad idea.
Think of the GPU as a tiny little distributed computing network on your own computer. And thank the video game industry for finally making signal co-processors commercially viable.
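The parent's distinction (massively iterative math good, I/O-heavy work bad) can be sketched in code. This is only an illustration of the data-parallel style a GPU favors, with NumPy on the CPU standing in for the GPU's parallel ALUs; none of it is BionicFX's actual code.

```python
import math

import numpy as np

# A GPU-style "kernel": the same arithmetic applied independently to every
# element of a large dataset (here, a soft-saturation curve on audio samples).
def kernel(x):
    return np.tanh(2.0 * x)

# "Upload" a big batch of samples and process them all in one shot,
# the way a fragment program touches every pixel of a texture.
samples = np.linspace(-1.0, 1.0, 1_000_000)
processed = kernel(samples)

# The same work expressed element by element, the way a serial CPU loop
# would run it. The answers match; only the execution model differs.
reference = [math.tanh(2.0 * x) for x in samples[:5]]
```

The key property is that no element's result depends on any other element's, so the work parallelizes trivially. An SQL select or a web-page render has the opposite shape: lots of branching and data movement per unit of arithmetic.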
Re:Price range of $200 to $800... (Score:3, Informative)
Re:Price range of $200 to $800... (Score:2)
=Smidge=
Re:Price range of $200 to $800... (Score:2)
How right you are....
Excellent general purpose coprocessor (Score:4, Insightful)
Personally, I'd like to see search algorithms (perhaps data search, perhaps even video search) move to such a co-processor.
Re:Excellent general purpose coprocessor (Score:2)
You could get some sweet real-time full-bitrate audio effects out of that puppy. It was the only time I ever found assembly worth writing.
It actually seems a *little* odd to me that you'd use a chip designed strictly with video in mind to do something like audio processing, but why not? Actually... well, is there a reason you might not? What happens when your machine asks the GPU to do some 'normal' graphics work when you have this stuff installed? What's y
Re:Excellent general purpose coprocessor (Score:3, Funny)
Oh, I'm sure you'll be able to buy a GoogleCard(TM) for your machine in the next few years...
Hmmm (Score:5, Funny)
Re:Hmmm (Score:2)
That is, until they move the CPU onto the graphics card... make the graphics card the motherboard... and start making peripherals to take the workload off your all-in-one motherboard...
... digistyle!
Lather rinse repeat
This is called... (Score:2, Informative)
A more detailed description of the WoR is available here [cap-lore.com].
Re:Hmmm (Score:2, Funny)
KFG
Re:Hmmm (Score:2)
Pretty soon my graphics card is going to do more, cost more, heat up more, be louder and use more electricity than the rest of my computer combined.
It's called an NVIDIA GeForce 6800 Ultra [compusa.com].
Re:Hmmm (Score:2)
Re:Hmmm (Score:3, Interesting)
In other words: the interfaces of a computer are (often) intended to provide immersive experiences for their users. Computer users are humans, so you would expect the processing power dedicated to each component of I/O to reflect the discernment of humans in their corresponding sense.
In yet more words: if
made possible by Doom (Score:3, Interesting)
Re:made possible by Doom (Score:2)
switch GPU and CPU (Score:4, Funny)
Re:switch GPU and CPU (Score:2)
I've thought the same thing though.
Re:switch GPU and CPU (Score:2)
Re:switch GPU and CPU (Score:3, Informative)
There are also many optimizing tricks involving GPUs that may lend themselves to certain tasks more
Re:switch GPU and CPU (Score:4, Insightful)
<analogy accuracy="flawed at best">
The CPU's a generalist and can treat most patients in a fair amount of time. The GPU is a specialist, however. If you know any of these in real life, you know that they can do one thing, and one thing only. In this case, it's graphics. You ask them to do something else, like gardening, and they look at you like you're from outer space.
</analogy>
Re:switch GPU and CPU (Score:5, Funny)
> A processor to sort out and verify that Network activity is correct.
> A processor to adjust Audio properly
> A processor for Graphics
I think you meant:
One processor for the audio kings playing their song
One for the graphics-lords under their rainbows
One for network men, pushing bits along
One for the dark lord through his dark windows
One processor to rule them all, One processor to discover them
One processor to bring them all and on the bus bind them...
Re:switch GPU and CPU (Score:2)
The console's CPU is not being used at all, only the graphics co-processor is being used.
http://www.simulationinformation.com/entertainm
DSPonGPU (Score:2)
Sounds like an acid trip (Score:5, Funny)
Re:Sounds like an acid trip (Score:2)
Reminds me of (Score:2)
Question (Score:2)
many professionals use floating point audio (Score:2)
In your CD player, maybe. CDs represent audio using 16-bit integer samples. Currently, professional audio is often recorded at 24 bit integer, and then immediately converted to 32 bit floating point.
32-bit FP audio has a much larger dynamic range. If you use a 16-bit audio stream, raising the volume can cause clipping (if the values exceed 2^15), and lowering the volume will lose information (the same information is represented usi
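The clipping point above can be shown with a few lines of arithmetic. This is a toy model of fixed 16-bit storage versus float headroom, not any particular DAW's gain code; the numbers are illustrative.

```python
INT16_MAX = 32767
INT16_MIN = -32768

def clip16(value):
    """What fixed 16-bit storage does to an out-of-range sample."""
    return max(INT16_MIN, min(INT16_MAX, value))

peak = 30000                  # a 16-bit sample near full scale
doubled = peak * 2            # 60000: no longer fits in 16 bits (> 2^15 - 1)
stored_16 = clip16(doubled)   # hard-clipped to 32767: information destroyed

# In 32-bit float the same gain change is harmless and reversible.
as_float = peak / 32768.0     # normalise to roughly [-1.0, 1.0)
doubled_f = as_float * 2.0    # ~1.83: fine, float dynamic range is enormous
restored = round(doubled_f / 2.0 * 32768.0)   # back to 30000, losslessly
```

Once the signal is clipped in the integer domain, halving the gain again gives you 16383, not the original 30000; in float, the round trip is exact.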
Re:Sound is all about floating point (Score:2)
parent makes some excellent points, especially in wondering what sort of latency might be involved in doing this processing in the GPU. I'm thinking this type of approach is only going to see big rewards in certain conditions - like, say, you're running some sort of CPU with relatively poor floating-point per
Wanted: 2 AGP slots (Score:5, Funny)
Now I'm going to have to find a motherboard that I could use to play Doom3 on that supported 2 video cards
(one for video, one for sound)
These innovations are getting pricey!!!
Re:Wanted: 2 AGP slots (Score:2)
Re:Wanted: 2 AGP slots (Score:3, Informative)
My POS ATI has audio and firewire. (Score:2)
GPGPU.org (Score:3, Informative)
Re:GPGPU.org (Score:2)
blah blah blah (Score:2)
"can reach more than"... "Capable of more than"...
So what's the real-world performance?
This is like those radio commercials where a store sells candy bars for $0.30, and then trumpets "up to 70% off everything in the store!"
GPU vs CPU (Score:2)
Re:GPU vs CPU (Score:3, Funny)
Re:GPU vs CPU (Score:2)
I'm not really skilled in this area, but I believe the CPU is more like a jack of all trades, whereas a GPU is specialized to just do the math involved for graphics
Re:GPU vs CPU (Score:2)
Re:GPU vs CPU (Score:3)
Re:GPU vs CPU (Score:2)
Graphics processors have very very specialized ops -- operations which are hardware pathways. If you take a RISC processor and tell it to rotate a matrix of numbers, then you have to reduce the problem
Re:GPU vs CPU (Score:2)
Re:GPU vs CPU (Score:2)
There is the Sh language [sourceforge.net] that tries to balance workload between the CPU and the GPU.
However, the CPU is a general purpose processor. The GPU is evolving into a general purpose parallel processor. That means the CPU can do this, then do that, then do something else very well. The GPU can do the exact same thing many times very well. So each processor has its pros.
As
Great. (Score:5, Funny)
You know... (Score:2)
Just a Thought (Score:2)
M
Re:Just a Thought (Score:2)
what about another AGP slot dedicated to a sound / graphics effect processor? I mean, the benefits would be huge.
M
Jesus (Score:4, Informative)
GFX cards are streaming supercomputers (Score:5, Informative)
A good site for information on it is www.gpgpu.org, where there are perhaps 200 different projects related to general purpose GFX card use.
As the capabilities of graphics cards expand and become more esoteric, perhaps game developers will begin to eschew the use of certain graphics features in favor of using those parts of the pipeline to perform generic calculations, such as physics.
Perhaps there are also ways of performing such calculations and using the results as decorative graphics, i.e. when we're showing decorative ripples on water, perhaps those ripples are artifacts of some calculation that is being used elsewhere in the game.
Coprocessor? (Score:3, Interesting)
The GPU is of course heavily optimized (over a regular CPU) for video, and perhaps some of those optimizations would carry over to audio as well. In the future, if such things pick up, one might well see more "multimedia" cards that incorporate a mixed GPU/SPU, or perhaps dual processors?
Comment removed (Score:3, Funny)
like apples core image (Score:3, Interesting)
Supports ATI and NVIDIA (the lib figures out if you have a usable graphics card, else it just uses the CPU)
http://www.apple.com/macosx/tiger/core.html [apple.com]
Goes to show... (Score:4, Interesting)
If the CPU were nothing but a router that directed data to dedicated hardware (network cards, GPU with integrated physics engine, hard disk controller, etc.), we could get away from inefficient execution tied up in an architecture that 99% of the market depends on.
Computers were built with modularity in mind. We need to get back to those roots, as it's not only a good idea, but the only way we're going to get past some performance barriers.
This is old (Score:5, Informative)
I have never seen audio before... (Score:3, Funny)
This would probably look best when viewed with a Viewsonic monitor.
Short Memory... (Score:4, Insightful)
The concept of using a CPU to do I/O and other "OS stuff" for a vector processor is a wee bit older than that.
Maybe you remember the Cray 1? Or all those i860's we used to use on cards back in the 286 days?
Those who forget history are doomed to post on
Apple's Core Video Technology (Score:3, Interesting)
Nintendo 64 did this - new HW expands old tricks (Score:5, Informative)
If you think about it, things like bilinear/trilinear filtering are perfect for resampling, and graphics blend ops like add/subtract/modulate are great for audio mixing and can be done with even older fixed-function hardware and a bit of programming effort. The programmability of new hardware with pixel and vertex shaders improves the generic applications of the GPU by orders of magnitude and allows significantly more non-graphics algorithms to be implemented.
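The mapping from blend ops to audio operations is direct enough to sketch. Here NumPy arrays stand in for textures/sample buffers; the test signals and the `lerp_read` helper are made up for illustration, not taken from any GPU API.

```python
import numpy as np

# Two mono "tracks" as float sample arrays (hypothetical test tones).
t = np.linspace(0.0, 1.0, 8000)
track_a = np.sin(2 * np.pi * 440 * t)
track_b = np.sin(2 * np.pi * 660 * t)

# Additive blend == mixing two tracks together.
mix = track_a + track_b

# Modulate blend (per-element multiply) == envelope shaping / ring modulation.
envelope = np.linspace(1.0, 0.0, 8000)      # a simple linear fade-out
faded = track_a * envelope

# Bilinear-filter-style linear interpolation == naive resampling.
# Reading "between" two samples, the way a texture unit reads between texels:
def lerp_read(signal, pos):
    i = int(pos)
    frac = pos - i
    return (1 - frac) * signal[i] + frac * signal[i + 1]
```

Each of these is exactly the kind of per-element arithmetic a fixed-function blend unit or texture filter already does in hardware; the "programming effort" is mostly packing samples into something the card treats as pixels.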
Overclocking: (Score:4, Funny)
Wait - what happens to the Chipmunk mp3s?
Re:Overclocking: (Score:3, Funny)
Wait - what happens to the Chipmunk mp3s?
You won't hear anything, but your dog will be really pissed off.
what about compression algorithms? (Score:2)
Video processing -- aka MPEG2 encoding? (Score:3, Interesting)
Anyone know any more about this? Audio is nice, but it's not nearly as CPU-intensive as video transcoding.
SETI/Folding (Score:3, Interesting)
Latency? (Score:3, Insightful)
What kind of latency does this pose?
There are currently less expensive audio DSP cards on the market (the UAD-1 by Universal Audio/Kind of Loud [uaudio.com], and the TC PowerCore [tcelectronic.com]), and nowadays they don't cost much more than a GPU. However, on both of those cards the latency is pretty harsh. Many audio systems will compensate for the latency in some instances, although some can't/don't compensate for bussed effects, which is unfortunate, as reverb is the greatest reason to use a card like this, and it is typically a bus effect, so the extra delay incurred acts to set a huge, usually inappropriate predelay.
Of course there will always be those willing to work around the potential latency issues, however that defeats the purpose that they state on their site (no more freezing/bouncing/yelling at the machine).
This is exactly why Pro Tools TDM systems are still in vogue for higher-end studios and producers. The TDM hardware does just about everything as offloaded DSP, so the latency is extremely low, fixed, and documented. You can look up (command-click on the track volume display, actually) the amount of latency on a track in samples, and if there is a need to compensate then you can figure it out. Typically one doesn't need to compensate for only 20 samples of latency, as that is less than you might find in an analog studio using digital effects.
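The samples-to-milliseconds conversion behind those numbers is simple enough to write down. The 4096-sample round-trip figure below is a hypothetical bus latency for illustration, not a measured value for any card.

```python
def latency_ms(samples, sample_rate=44100):
    """Convert a plug-in delay reported in samples to milliseconds."""
    return samples / sample_rate * 1000.0

# The 20-sample TDM figure quoted above, at CD sample rate:
tdm = latency_ms(20)       # roughly 0.45 ms: inaudible as predelay

# A hypothetical round trip over the bus to an off-board DSP or GPU
# might instead cost thousands of samples:
bus = latency_ms(4096)     # roughly 93 ms: a very audible predelay on a reverb bus
```

At 20 samples you are well under a millisecond; at a few thousand samples the delay lands squarely in the range where a reverb's predelay becomes an obvious (and usually wrong) musical choice.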
Re:Latency? (Score:3, Interesting)
Also, "which is unfortunate as reverb is the greatest reason to use a card like this, and it is a bus effect typically, and the extra delay incurred acts to set a huge, usually inappropriate predelay."
Which is why their first stated proof-of-concept algorithm will be a convolution-based reverb
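Convolution reverb itself is a single, very GPU-friendly operation: convolve the dry signal with a sampled impulse response. A minimal sketch, using a synthetic decaying-noise impulse response in place of a real room recording (BionicFX's actual implementation is not public):

```python
import numpy as np

# A made-up impulse response: a noise burst with an exponential decay,
# standing in for a sampled room or hall.
rng = np.random.default_rng(0)
ir = rng.standard_normal(2000) * np.exp(-np.linspace(0, 6, 2000))

# The dry signal: a single click (unit impulse) at time zero.
dry = np.zeros(4000)
dry[0] = 1.0

# The wet signal: the click now "rings out" with the shape of the IR.
wet = np.convolve(dry, ir)
```

Because convolution is just a huge number of independent multiply-accumulates, it maps well onto a shader pipeline; the hard part, as noted upthread, is getting the result back off the card fast enough.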
OpenAL (Score:3, Informative)
Given that OpenAL is backed by sound card manufacturers, I wonder if they would ever concede to using GPUs to accelerate 3-D sound. I hope that the apparent conflict of interest doesn't hinder progress, if GPUs can really make a difference.
OpenAL is the one cross-platform audio API I've tried that actually _works_, while the other cross-platform options seem to either be stagnant, incomplete, just plain garbage, or so lacking in documentation that no mere mortal could figure them out. Here's to hoping that OpenAL and cross-platform audio on UNIX keeps getting better and better, because we really do need it.
Re:I've always wondered (Score:2)