NVIDIA Drops Surprise Unveiling of Pascal-Based GeForce GTX Titan X (hothardware.com) 134
MojoKid writes from a report via HotHardware: Details just emerged from NVIDIA regarding its powerful upcoming Pascal-based Titan X graphics card, featuring a 12-billion-transistor GPU codenamed GP102. NVIDIA is obviously having a little fun with this one: at an artificial intelligence (AI) meet-up at Stanford University this evening, NVIDIA CEO Jen-Hsun Huang first announced, and then actually gave away, a few brand-new Pascal-based NVIDIA TITAN X GPUs. Apparently, Brian Kelleher, one of NVIDIA's top hardware engineers, made a bet with Huang that the company could squeeze 10 teraflops of computing performance out of a single chip. Jen-Hsun thought that wasn't doable in this generation of product, but apparently Brian and his team pulled it off. The new Titan X is powered by NVIDIA's largest GPU -- the company says it's actually the biggest GPU ever built. The Pascal-based GP102 features 3,584 CUDA cores clocked at 1.53GHz (the previous-gen Titan X has 3,072 CUDA cores clocked at 1.08GHz). The specifications NVIDIA has released thus far include: 12 billion transistors, 11 TFLOPS FP32 (32-bit floating point), 44 TOPS INT8 (new deep learning inferencing instructions), 3,584 CUDA cores at 1.53GHz, and 12GB of GDDR5X memory (480GB/s). The new Titan X will be available August 2nd for $1,200, direct from NVIDIA.com.
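As a sanity check of those headline numbers, here's a minimal sketch in CUDA-style C++ host code (the 2 FLOPs per core per clock from fused multiply-add, and the 4x INT8 factor from 4-wide dot-product instructions, are standard assumptions on my part, not NVIDIA's stated math):

#include <cstdio>
int main() {
    // Peak FP32 = cores * clock * 2 FLOPs (one fused multiply-add per clock)
    double fp32_tflops = 3584 * 1.53e9 * 2 / 1e12;       // ~10.97, i.e. the "11 TFLOPS"
    std::printf("FP32: ~%.2f TFLOPS\n", fp32_tflops);
    // The new INT8 inferencing instructions do a 4-element dot product per op
    std::printf("INT8: ~%.0f TOPS\n", fp32_tflops * 4);  // ~44 TOPS
    return 0;
}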
Glad to see Pascal making a comeback. (Score:5, Funny)
I thought it had been surpassed by C++, but this is great for everyone.
Re: Glad to see Pascal making a comeback. (Score:5, Funny)
begin
  writeln('Woosh.')
end.
Re: (Score:2)
Well, the summary doesn't clarify what "Pascal-based" actually means. Is it a new processor, a fab process, a developer methodology, or was it just manufactured under high atmospheric pressure? Turns out it's an architecture used only at Nvidia -- in other words, it's a marketing name that's utterly meaningless.
Re: (Score:2)
I thought it had been surpassed by C++, but this is great for everyone.
Nah it just morphed into Delphi and got hacked to death
Re: (Score:2)
I thought it had been surpassed by C++, but this is great for everyone.
Nah it just morphed into Delphi and got hacked to death
Wait, I thought that Modula-2 was the successor...
Re: (Score:2)
I thought it had been surpassed by C++, but this is great for everyone.
Nah it just morphed into Delphi and got hacked to death
Wait, I thought that Modula-2 was the successor...
Wait for it....
Oberon
Re: (Score:2)
Oberon
That's the hostname of my FreeNAS file server. Not sure why my file server is relevant to this discussion.
Thanks Nvidia (Score:1, Flamebait)
Re:Thanks Nvidia (Score:5, Insightful)
Nvidia uses scientists' names for its products: Tesla, Fermi, Kepler, Maxwell, Pascal, and Volta (the next version after Pascal).
If the leading edge of "consumer" graphics cards is of any interest to someone, they'd know what Pascal is, since it was announced over two years ago.
Re: (Score:1)
It still sounds silly in this case, even if you're familiar with the industry. They should just name them after numbers to avoid confusion. When version 7 of 9 comes out, it'll be sweet.
Re: (Score:2)
I just have to say... Scientists have awesome names.
Re: (Score:2)
Well I'm sure Nvidia could have named it after scientists named Smith, Jones, Doe, Smith (a different one), and Johnson, but those names aren't very exciting.
+1 Re:Thanks Nvidia (Score:1)
I too thought it had something to do with the programming language. I remember taking C and Pascal the same semester in college. Big mistake!
Why not just refer to it as "Pascal architecture" in the story summary? I get that people who follow this might know that's what was meant, but not everyone spends thousands of dollars on video cards or follows things like this. I would think that for a summary story, it would be a little more front-page-friendly. But then again, I prefer the /. of old.
Re: (Score:3)
FYI, Intel's "Skylake" has nothing to do with the sky or lakes.
Re: (Score:2)
Aww damn really? I guess I'll have to cancel my order.
Re: (Score:2)
So its loudest supporters include a notorious spammer known for making poor choices (like spamming Slashdot). I don't know if you thought your input would help Pascal's image, but it certainly hasn't :)
Re: (Score:2)
You do know that Windows 10 ignores its hosts file when it comes to telemetry, right?
Re: (Score:2)
dammit really? There goes that idea....
On a bet with Jen the Hsuang (Score:4, Funny)
So bloody fast it actually made the Kessel run in 12 parsecs.
Re:On a bet with Jen the Hsuang (Score:5, Funny)
Yes, but it had to shoot first.
Re: (Score:2)
It's so powerful, it shot first, then created a whole separate timeline in which it shot second.
Re: (Score:2)
Yes, but it had to shoot first.
Well, draw first anyway.
But... (Score:3, Interesting)
Re:But... (Score:4, Funny)
It can probably blow the fucking doors off Crysis.......
Re: (Score:2)
Re: (Score:2)
It already did the sequel, while banging your sister.
With the mochaccino in its left hand...
And on full resolution, high detail.
Backwards, uphill, both ways.
Any questions?
Re: (Score:2)
Re: But... (Score:2)
Re: (Score:3)
You first adopters that have fat wallets, please start buying.
The Nvidia 1060 6GB 192-bit video cards are supposed to start at $250. That's $100 more than the Nvidia 950 2GB 128-bit video cards. I'm set up for auto-notify at Newegg.
Re: (Score:3)
The card you should be comparing it to in the hierarchy is the GTX 960 4GiB
Re: (Score:3)
The card you should be comparing it to in the hierarchy is the GTX 960 4GiB
I stand corrected. Thank you.
Re: (Score:3)
Re: (Score:2)
Huh? No it's not. It's pretty decent in power/performance ratio, comparing favourably to slower cards from AMD of the same era.
Heat? (Score:2, Interesting)
Re: (Score:3)
The card looks like it will fit in a standard case, but the cooling tower will be the size of a small house.
Sorry, this isn't an AMD card...
Re: (Score:2)
Some people here still quote power use/heat output figures from the Fermi architecture, especially the 4x0 series, which debuted in 2010...
Re: (Score:2)
but the cooling tower will be the size of a small house.
You do realize the actual cooler is in the picture in the article, right?
http://hothardware.com/Content... [hothardware.com]
And that it's part of the card, which fits in most cases (I'm not aware of any standard for graphics card lengths; fitting on a mini-ITX board is a possible exception).
A house for a family of mice, small snakes, or some small fish, I guess.
Judgment Day (Score:3, Funny)
Re: (Score:2)
There's a flaw in your logic: Skynet [blogspot.com] is already a time-traveler. [dailymail.co.uk]
Re: (Score:1)
Now we just need to perfect time travel.
Perfect?
We don't even know if it's possible in the future-to-past direction.
Re: (Score:1)
Or better:
Now we just need to perfect time travel.
Perfect?
We don't even have any proof that it's possible to travel backwards in time!
My other post made it a "maybe," whereas this one is a "not as far as we know." I remember seeing something, likely here on Slashdot, which suggested it wouldn't be possible. But maybe that's not proven either. Maybe it would have to be perfected once in use, but currently it's less about perfecting it and more about (not) being capable of doing it at all.
Pascal (Score:1)
For the ones wondering: http://www.nvidia.com/object/gpu-architecture.html
Re: (Score:3)
And for the little kids still confused about the Pascal/C++ jokes: https://en.wikipedia.org/wiki/... [wikipedia.org]
basically a supercomputer on a card (Score:1)
Basically a supercomputer on a card. I'd be *really* interested in finding out whether those cores are individually addressable, and how the memory is set up. I remember the computer my Dad did his PhD calculations on -- an IBM 704 with memory expanded to a whopping 48K.
Re: (Score:1)
It really is a supercomputer on a card, complete with a device driver that's effectively a batch job scheduling system. There are a lot of limitations, though: sets of threads all need to run the same code and, as much as possible, follow the same branches in that code. It's not like having a couple thousand individual CPUs you can program, by any stretch.
This tends to work well for things like image processing or machine learning, and not nearly as well for tasks like sorting or searching.
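To make the "same branches" point concrete, here's a minimal CUDA sketch (my own illustration, not from the comment above): when threads in the same 32-wide warp take different sides of a branch, the hardware runs the two paths one after the other instead of in parallel.

// A data-dependent branch splits the warp: even and odd threads
// execute their paths serially rather than simultaneously.
__global__ void divergent(float *out) {
    int i = threadIdx.x + blockIdx.x * blockDim.x;
    if (i % 2 == 0)
        out[i] = i * 2.0f;   // half the warp runs this path first...
    else
        out[i] = i * 3.0f;   // ...the other half waits, then runs this one
}
int main() {
    float *d_out;
    cudaMalloc(&d_out, 256 * sizeof(float));
    divergent<<<1, 256>>>(d_out);   // one block of 256 threads = 8 warps
    cudaDeviceSynchronize();
    cudaFree(d_out);
    return 0;
}

Sorting and searching branch on the data constantly, which is why they map poorly; image filters and neural-network math mostly send every thread down the same path.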
Re: (Score:2)
Re: (Score:2)
nVidia set up us the surprise.
For great justice.
(I think they meant that nVidia dropped the surprise at the unveiling (on the audience), not dropped it from the unveiling.)
1.21 Jiga flops! (Score:2)
Great Scott! 1.21 jiga flops! 1.21 jiga flops?! What was I thinking!!
Oh, it was on a dare...
Not HBM2???? (Score:2)
Why GDDR5X? HBM2 triples the memory throughput. If they want a monster card that is overkill for today, it should at least incorporate the king of memory buses.
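Peak bandwidth is just per-pin data rate times bus width, so here's a quick comparative sketch in CUDA-style host code (the 10 Gbps/384-bit GDDR5X and 2 Gbps/4096-bit HBM2 figures are my assumptions, which would put HBM2 at roughly double, not triple, this card's throughput):

#include <cstdio>
// Peak memory bandwidth in GB/s = per-pin rate (Gbps) * bus width (bits) / 8
static double bw(double gbps_per_pin, int bus_bits) {
    return gbps_per_pin * bus_bits / 8.0;
}
int main() {
    std::printf("GDDR5X: %.0f GB/s\n", bw(10.0, 384));   // matches the quoted 480 GB/s
    std::printf("HBM2:   %.0f GB/s\n", bw(2.0, 4096));   // 4 stacks x 1024 bits -> ~1 TB/s
    return 0;
}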
Re: (Score:3)
Nvidia has yet to use HBM in a major product let alone HBM2. HBM2 isn't volume ready quite yet, and AMD allegedly has some form of "dibs" on getting priority on production from at least Hynix.
Re: (Score:2)
Because HBM1 is a dead end? It's the same as HBM2, except outdated.
As for GDDR5X, even that has rather low availability; otherwise it would be used on more cards, not only top-end ones.
Re: (Score:2)
Re: (Score:3)
So where are those sub-$100 Pascal-based videocards?
One man's affordable is another man's can't-afford-one. The same goes for the Titan X.
Re: (Score:1)
An affordable video card is totally capable of outputting 4k
So where are those sub-$100 Pascal-based videocards?
https://www.youtube.com/watch?v=gfvM5JX1Mk4
This is Intel Atom:
http://liliputing.com/2015/04/... [liliputing.com]
"Intel says its new chips can play 4K videos at 60 frames per second when theyâ(TM)re encoded at up to 250 Mbps bitrates. 1080p videos can play at up to 240 frames per second."
No need to buy a graphics card at all for 4K playback.
Re: (Score:2)
I didn't see the "video" at the end of his comment. Since we're talking about GPUs I assumed we were talking about 4K gaming.
Re: (Score:1)
So read it again:
I didn't see the "video" at the end of his comment. Since we're talking about GPUs I assumed we were talking about 4K gaming.
"An affordable video card is totally capable of outputting 4k or even 8k video"
Re: (Score:2)
Yeah, now let's have that Intel GPU do some actual 3D work and see how it performs at 4K. Hint: it will be terrible, which is why Nvidia is still in business.
Re: (Score:1)
Yeah, but this person thought 4 and 8K video was all that mattered.
Why the fuck he thinks 8K video is important but not games is beyond me. How many people have an 8K display? How many have 8K content to watch? How many would suffer by settling for 4K content right now?
Re: (Score:1)
There's little reason to release sub $100 cards.
For those purposes you've got integrated graphics.
Re: (Score:2)
Well, sub-$200 cards then.
Re: (Score:1)
Nvidia has cheap cards in that category.
I assume you'll see a GTX 1050 at some point too.
Re: (Score:1)
But that's not $100 or even $150 territory for those who want it.
There's the RX 480, 470, and 460 for them so far, but I can only assume Nvidia will release cards for that market too.
Re:Who gives a fuck? (Score:4, Informative)
Re: (Score:2, Informative)
The card is designed for data mining and neural network research; it's not for games or even remotely intended to be used for them.
Bullshit.
http://www.geforce.com/hardwar... [geforce.com]
"With the DNA of the worldâ(TM)s fastest supercomputer and the soul of NVIDIA® Keplerâ architecture, GeForce® GTX TITAN GPU is a revolution in PC gaming performance."
I admit I don't completely know who they focus on with the Titan cards.
10-11 TFLOPS single-precision performance with this one.
317-343 GFLOPS double precision.
159-171 GFLOPS half precision (shouldn't that one be higher?)
The idea with the more professional card is to hit 5
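Those double- and half-precision figures fall out of the FP32 number and the throughput ratios GP102 is commonly reported to have; a quick check (the 1/32 FP64 and 1/64 FP16 rates are assumptions on my part, not stated in the article):

#include <cstdio>
int main() {
    double fp32 = 3584 * 1.53e9 * 2 / 1e12;                // ~10.97 TFLOPS FP32
    std::printf("FP64: ~%.0f GFLOPS\n", fp32 / 32 * 1e3);  // 1/32 rate -> ~343
    std::printf("FP16: ~%.0f GFLOPS\n", fp32 / 64 * 1e3);  // 1/64 rate -> ~171, hence "shouldn't that be higher?"
    return 0;
}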
Deep learning and gaming (Score:1)
At an artificial intelligence (AI) meet-up at Stanford University this evening, NVIDIA CEO Jen-Hsun Huang first announced, and then actually gave away a few brand-new, Pascal-based NVIDIA TITAN X GPUs.
In fact, the Titan X is currently the preferred GPU for deep learning [timdettmers.com] thanks to its 12GB memory. And I'd argue that this card can be a great GPU for both gaming and deep learning (unlike the Quadro, which is largely for CAD-like applications).
Re: Who gives a fuck? (Score:2)
Re: (Score:1)
Bzzzzt totally wrong.
That would be the Tesla product range. Titan is a gaming GPU.
Re: (Score:2)
Actually, the Tesla line of cards is for data mining and neural network research.
Nvidia has three PC lines of cards:
GeForce: gaming.
Quadro: CAD/CAM and professional graphics.
Tesla: GPU compute -- AI, data mining, and other GPU-compute functions.
The Titan is part of the GeForce line and is a bit of an odd duck. Like every other gaming card, it can be used for AI, data mining, CAD, and even professional graphics, but it is a gaming card. A very high-end, expensive gaming card.
Re: (Score:2)
The first Titan had high-speed double precision (FP64), as did the regular GTX 480/580 before it. The Maxwell Titan didn't, but it offered a very large 12GB of RAM instead, a size previously available only on the highest-end Quadro. And today we get a new one with fast support for 8-bit integers as the differentiating feature, which comes as a surprise. The 1080 Ti ought to have 12GB of RAM (because of a 384-bit bus, and because the 1080 has 8GB already).
That's to say, the idea of what a Titan is for is evolving a
Re: (Score:2)
PROTIP: it's means it is.
It's been nice proving you wrong.
Re: (Score:1)
Grand Theft Call of Crysis Duty.
I love that series. I'm currently playing Grand Theft Call of Crysis Duty II: Rise of the Polygons
Re:Who gives a fuck? (Score:5, Informative)
A lot of researchers are using GPUs for things very different from graphics. A professor was telling me just last week that the threshold for a machine learning training algorithm being interesting was whether it could be trained on a problem in a week or less [note: once trained, it does its job much faster; that's just the get-it-ready-to-go time], and that GPUs were often used for that training. The bit-width requirements are modest, but the amount of data to process is huge.
Of course, he went on to show how the approaches his students had come up with were faster and more power-efficient by orders of magnitude for many common algorithms, but still, they were trying to improve on the normal way of doing things, which is to get up and running fast using GPUs as a source of number crunching.
In summary, people who don't actually need this much horsepower buying it helps keep it in development for the smaller number of people who are actually doing something useful with it.
Re: (Score:3)
There are *other* uses for GPUs. They make great compute processors for specific kinds of problems, some of which are NOT directly related to pushing pixels out to a screen.
Re: (Score:1)
Not need more computing power? I think you're on the wrong site, maybe you should be over here [huffingtonpost.com], or maybe here, because no one will ever need more than 640 kb [computerworld.com].
Re: (Score:1)
An affordable video card is totally capable of outputting 4k or even 8k video with no problem.
Encoding that or playing video games or using it for totally unrelated things is a different story though.
As you're posting as AC, maybe you don't even WANT an answer. And now that you'll likely get several, will that help educate you and make you reconsider? Not likely. Because it's not a card FOR YOU, and that's all that matters to you. That doesn't mean it's useless for everyone else.
Of course people will buy this and can afford it.
A friend just bought a GTX 1080 and a 34" 21:9 100 Hz G-sync screen (and the rest
Re: Who gives a fuck? (Score:2)
Re: (Score:1)
Then again, DX12 with dual Nvidia cards doesn't seem to work in Hitman or Rise of the Tomb Raider (though it does in Ashes of the Singularity).
I assume more games will add support eventually, but dual cards aren't the best option, and I'd suggest he just wait and upgrade to a single Volta card instead.
Re: (Score:2)
> Nobody has the money to afford one of these things
Speak for yourself. I'll be getting one for sure.
> You don't need one, either. It serves no purpose.
Completely untrue. You need something as powerful as this, or even more, to play AAA games like Elite: Dangerous at maximum image quality (i.e. including, say, 2x supersampling) in high-definition VR at 90 frames/sec x 2 (one per eye) without dropping frames (i.e. making you feel nauseous).
Re: (Score:2)
Since when is Elite: Dangerous an AAA game?
Re: (Score:2)
Please now back up your assertion that there's some magical affordable GPU out there that can render modern 3D software at 4K or 8K at a constant 60 fps, with a link to some kind of... what do we call it? Proof.
This isn't for simple video playback, numb nuts. This is for 3D render, and massively parallel floating point math (read: CUDA apps).