AMD's Latest Server Compute GPU Packs In 32GB of Memory
Deathspawner writes: Following up on the release of 12GB and 16GB FirePro compute cards last fall, AMD has just announced a brand-new top end: the 32GB FirePro S9170. Targeted at DGEMM computation, the S9170 sets a new record for GPU memory on a single card, and does so without a dual-GPU design. Architecturally, the S9170 is similar to the S9150, but is clocked a bit faster, and is set to cost about the same as well, at between $3,000 and $4,000. While AMD's recent desktop Radeon launch might have left a bit to be desired, the company has proven with the S9170 that it's still able to push boundaries.
Re: (Score:3)
640 kilo-terabytes should be good. That way we can buffer over a full year of 8K video.
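For what it's worth, the arithmetic on that joke holds up even for uncompressed footage. A quick back-of-envelope sketch (all figures assumed: raw 8K at 60 fps, 24-bit color):

```python
# Back-of-envelope: does 640 kilo-terabytes hold a year of raw 8K video?
width, height, bytes_per_pixel, fps = 7680, 4320, 3, 60

bytes_per_second = width * height * bytes_per_pixel * fps   # ~6 GB/s uncompressed
seconds_per_year = 365 * 24 * 3600

year_of_8k_pb = bytes_per_second * seconds_per_year / 1e15  # ~188 PB
buffer_pb = 640e3 * 1e12 / 1e15                             # 640 kilo-terabytes = 640 PB

print(f"a year of raw 8K: {year_of_8k_pb:.0f} PB; buffer: {buffer_pb:.0f} PB")
```

Roughly 188 PB per year of raw footage against a 640 PB buffer, so over three years even without compression.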
Finally (Score:3, Funny)
So what happened to the comments lately? (Score:1)
Starting at about the time that the "read the X comments" link was removed, I noticed a big drop in the number and quality of comments and moderation. Am I the only one?
Also, what happened to polls? I've only seen one since they removed the sidebar. Have those been killed off?
Re: (Score:3)
Rather than improving the site and taking on refugees from Reddit, they decided to double down on terrible design and appeal to the worst of them.
It's like Italy deciding to turn away all the moderates and only deciding to accept
Meanwhile those of us that have been around Slashdot since 2000 have pretty much thrown our hands up.
I wish I had enough knowledge and free time to rewrite INN with moderation. Usenet 2.0. It seems that the Eternal September has hit the web.
Re: (Score:2)
"Meanwhile those of us that have been around Slashdot since 2000 have pretty much thrown our hands up."
Imagine how I feel.
I have to agree with you. I really want a new old Slashdot, but I doubt that the changes I would make would make everyone happy.
First 3x Posts are Retarded (Score:2, Funny)
1) You're old... We get it!
2) 2013 called, they want their fad back
3) "Render" Hentai? Unless you count encoding DVD rips in some lossy codec to be "rendering": it really won't. Do you have some fucked up 3D animation hentai running on OpenGL or DirectX?
Re: (Score:2)
Do you have some fucked up 3D animation hentai running on OpenGL or DirectX?
In a world where we have VR headsets there's a possibility their answer is yes.
What are these used for? (Score:1)
Re: (Score:3)
These are just a huge waste of money. You can use the same money to buy several gaming cards; each of them is way faster than a workstation card for many types of operations. Also, since GPU computation is highly parallel, there's no point in having one super card instead of many cheap cards combined.
Re:What are these used for? (Score:4, Interesting)
Re: (Score:1)
each of them is way faster than workstation card for many types of operations
Many, but not all. There are a wide variety of computational uses for GPUs these days. Some are processing-speed bound, some are bandwidth bound, others are memory bound. Adding a bunch of cards isn't going to help with computations that require large intermediate buffers with a lot of cross dependence, as you'll just be stuck waiting for the same memory to be moved back and forth between cards, assuming it can even fit in them.
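A rough sketch of that last point, with assumed (not benchmarked) figures: once an intermediate buffer has to bounce over PCIe instead of staying in local GDDR5, each pass over it gets an order of magnitude slower.

```python
# Illustrative only -- assumed figures, not benchmarks:
#   ~320 GB/s local GDDR5 bandwidth on a Hawaii-class card,
#   ~16 GB/s for a PCIe 3.0 x16 link between two cards.
buffer_gb = 24.0       # hypothetical intermediate buffer; fits on one 32GB card
local_gbps = 320.0     # one pass over the buffer in local memory
pcie_gbps = 16.0       # one pass when the buffer must cross to another card

print(f"pass in local memory: {buffer_gb / local_gbps * 1e3:7.1f} ms")  # ~75 ms
print(f"pass over PCIe:       {buffer_gb / pcie_gbps * 1e3:7.1f} ms")   # ~1500 ms
```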
Re: (Score:1)
I don't believe the Hawaii chip used for this card has ECC protection for any of its shader array SRAMs. That may significantly reduce the usefulness of this product for high-volume installations (MTBF will likely be unacceptably low). Well, maybe one could make do, but computations would have to be double-checked or otherwise qualified. A 32GB framebuffer is nifty, for sure, but definitely a niche product for the foreseeable future.
Re: (Score:1)
Scientific applications that need to process what we in the business call "An unbelievable metric fuckton of linear algebra."
A (now depressingly standard) billion-cell 3D CFD simulation = solving a minimum of 5 billion simultaneous nonlinear equations per timestep = solving probably some dozens of 5-billion-variable linear algebra problems per timestep since implicit methods are all the rage.
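To make that concrete at toy scale: each implicit timestep boils down to sparse linear solves like the one below. This sketch uses SciPy purely as a stand-in (real runs use GPU solvers and billions of unknowns, not a thousand):

```python
# Toy version of the linear solve inside one implicit timestep.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 1000  # unknowns -- billions in a real CFD run
# Tridiagonal Laplacian-style matrix as a stand-in for a linearized flow operator
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

x = spla.spsolve(A, b)         # one of dozens of such solves per timestep
print(np.allclose(A @ x, b))   # True
```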
SETI@Home? (Score:1)
DGEMM - Double precision GEneral Matrix Multiplication (Score:3, Informative)
D = Double precision (as opposed to S = Single precision, C = Complex single precision, or Z = Complex double precision)
GE = GEneral, as opposed to HErmitian, for example.
M = Matrix
M = Multiplication
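For the curious, the name maps straight onto a BLAS call. A minimal sketch using SciPy's BLAS bindings (a GPU deployment would use a library like AMD's clBLAS instead, but the interface follows the same naming):

```python
# DGEMM computes C = alpha * (A @ B) + beta * C in double precision.
import numpy as np
from scipy.linalg.blas import dgemm

a = np.random.rand(512, 256)
b = np.random.rand(256, 128)

c = dgemm(alpha=1.0, a=a, b=b)  # beta defaults to 0, so this is just A @ B
print(np.allclose(c, a @ b))    # True
```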
An APU with even 16GB of integrated memory (Score:1)
An APU with even 16GB of integrated memory would be news, let alone 32. A video card with 32GB of memory is not. A GPU with 32GB of integrated memory would be news, but that is not what this is. This is a video card with 32GB of memory.
Of course, AMD's own press release gets this wrong, but that's no excuse. It just means this is C&P bullshit.
Re: (Score:1)
Putting 32GB of RAM on a card with a 512-bit memory bus is impressive (that's a lot of traces!) regardless of the fact that it isn't an APU. It is interesting to see this dual direction from AMD, however. While they're releasing HBM and the associated "Fury" products, the hard 4GB limit is obviously impacting competition and makes the newly released product non-competitive. It does seem like once they can improve the memory limit and get the cost down, that type of product would be a great addition to an APU.
Re: (Score:2)
I'm confused. How would a GPU with 32GB of "integrated memory" be news, but a GPU with 32GB of [non-integrated*] memory is not news? I'm not sure what you mean when you say "integrated memory". This is not the graphics half of an APU; it is a discrete card, and nowhere does the summary or the press release state otherwise. The term "compute GPU" just means it's targeted at computing workloads, not graphics workloads.
What exactly is your complaint?
* Not even sure what this means, but you seem to be contrasting
Re: (Score:2, Informative)
Like every other Hawaii-based professional card, it does [amd.com].
What puts "G" into "GPU"? (Score:1)
Is it still a "Graphics Processing Unit" [wikipedia.org] if it does not even offer any way to connect a display to it?
Well, maybe, it just means "Ginormous" now...
Type Mismatch Error in 10 (Score:1)
The real question here is: when did "compute" go from being a verb to an adjective?