Nvidia Discloses Details On Next-Gen Fermi GPU 175
EconolineCrush writes "The Tech Report has published the first details describing the architecture behind Nvidia's upcoming Fermi GPU. More than just a graphics processor, Fermi incorporates many enhancements targeted specifically at general-purpose computing, such as better support for double-precision math, improved internal scheduling and switching, and more robust tools for developers. Plus, you know, more cores. Some questions about the chip remain unanswered, but it's not expected to arrive until later this year or early next."
Re:But does it... (Score:1, Insightful)
More importantly, does it run physx in a machine that also has a non-nvidia gpu?
Oh, wait. No, it doesn't [slashdot.org].
Re:AWESOME (Score:5, Insightful)
Re:Honestly, at this point... (Score:2, Insightful)
Comment removed (Score:5, Insightful)
Re:But does it... (Score:2, Insightful)
You understand that these GPUs are made by Nvidia, right? So how could they run something on a machine with a non-Nvidia GPU if the GPUs the article refers to are made by Nvidia?
What exactly were you trying to say? I'm not quite sure.
Re:AWESOME (Score:3, Insightful)
Re:But does it... (Score:4, Insightful)
Can also be useful in graphics (Score:3, Insightful)
It depends on what you are doing, but when you get something that involves a lot of successive operations, even 32-bit FP can end up not being enough precision. You get truncation errors and those add up to visible artifacts. This could also become more true as displays start to take higher precision input and even more true if we start getting high dynamic range displays (like something that can do ultra-bright when asked) that themselves take floating point data.
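A minimal sketch of the truncation problem the parent describes, using Python's standard library to round through 32-bit precision (the `f32` helper is my own illustration, not anything GPU-specific): repeatedly accumulating a small value in single precision drifts visibly away from the exact answer, while double precision stays essentially exact.

```python
import struct

def f32(x: float) -> float:
    """Round a Python float (64-bit) to the nearest IEEE 754 single."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Add 0.01 a hundred thousand times; the exact answer is 1000.0.
n, step = 100_000, 0.01
acc32, acc64 = 0.0, 0.0
for _ in range(n):
    acc32 = f32(acc32 + f32(step))  # every add rounds to a 24-bit mantissa
    acc64 = acc64 + step            # 53-bit mantissa: far smaller error

err32 = abs(acc32 - 1000.0)
err64 = abs(acc64 - 1000.0)
print(err32, err64)  # err32 is typically far larger than err64
```

In image processing the same effect shows up per pixel: each filter pass truncates a little, and after enough passes the accumulated error crosses the threshold of a visible banding artifact.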
Re:But does it... (Score:3, Insightful)
Actually, I can think of at least one other major computer manufacturer who makes products that nerf other manufacturers' products. I think they're located in Cupertino.
Re:Games before hardware (Score:4, Insightful)
That's because most games are now written for consoles and then ported to PC, so the graphics requirements are based on what's in an Xbox 360. Unfortunately, consoles are on something like a 5-year cycle. People are now buying a game console plus a cheap PC for their other stuff, for less than the ol' gaming rig used to cost. Makes sense in a way.
Re:AWESOME (Score:3, Insightful)
There were a handful of issues behind that decision. One of them was that some GPGPU platforms fail silently [nvidia.com], which, in practice, means that you start crunching numbers with less than the expected mantissa and therefore get considerably larger rounding errors. That can bring disastrous results. Another issue was that even the announced double-precision support of some products was a bit flawed, as it failed to comply with IEEE 754, the standard for floating-point arithmetic. [wikipedia.org] Even though the non-compliance came down to only a handful of issues, relying on GPGPUs that don't conform to the standard would force someone to spend considerable time formally checking what effects that non-compliance would have on the project being developed. That would take precious man-hours from projects which may already be poorly manned. Worse, the work would be rendered waste as soon as the next GPGPU generation either fully complied with IEEE 754 or, in the worst case, failed to comply in some other respect, and the poor chap assigned to verify the effects of the non-compliance would be forced to do everything from scratch, once again.
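The "less than the expected mantissa" failure mode is easy to sketch in plain Python (the `f32` helper is my own stand-in for single-precision hardware, not an actual GPGPU API): an IEEE 754 single carries a 24-bit significand (~7 decimal digits) versus 53 bits (~16 digits) for a double, so an increment below single-precision epsilon simply vanishes, with no error or warning of any kind.

```python
import struct

def f32(x: float) -> float:
    # Round a 64-bit Python float to the nearest IEEE 754 single.
    return struct.unpack('f', struct.pack('f', x))[0]

# An increment below the format's epsilon is silently absorbed.
tiny = 1e-8                        # smaller than float32 eps (~1.19e-7)
print(f32(1.0 + tiny) == 1.0)      # True: lost in single precision
print((1.0 + tiny) == 1.0)         # False: still representable in double
```

GPU-side non-compliance quirks often have the same silent character; flushing denormals to zero is a commonly cited example, and code that never checks for it just quietly loses small values.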
So, to sum things up, GPGPU support for double-precision math is, in fact, great news. It means that everyone may have their own personal vector-processing supercomputer on their desktop. Heck, even on laptops. That may not mean much for the proverbial Joe Sixpack (at least not beyond the "oohh... shiny graphics" side of things), but being able to crunch a lot more numbers in the same time frame means the world to anyone writing or using number-crunching software, which is a lot of people.