$3000 GeForce GTX TITAN Z Tested, Less Performance Than $1500 R9 295X2
Vigile writes: NVIDIA announced its latest dual-GPU flagship card, the GeForce GTX Titan Z, at the GPU Technology Conference in late March with a staggering price point of $2999. Since that time, AMD announced and released the Radeon R9 295X2, its own dual-GPU card with a price tag of $1499. PC Perspective finally put the GTX Titan Z to the test and found that from a PC gamer's view, the card is way overpriced for the performance it offers. At both 2560x1440 and 3840x2160 (4K), the R9 295X2 offered higher and more consistent frame rates, sometimes by as much as 30%. The AMD card also takes up only two slots (though it does have a water cooling radiator to worry about) while the NVIDIA GTX Titan Z is a three-slot design. The Titan Z is quieter and uses much less power, but gamers considering a $1500 or $3000 graphics card are likely not overly concerned with power efficiency.
So glad it's over (Score:5, Interesting)
This is ridiculous.
Re:So glad it's over (Score:5, Informative)
I'm so glad that I got the gaming bug out of my system when a ridiculously-priced video card was $300, and mainstream cards were in the $90-160 range... This is ridiculous.
That's still pretty much the case; the difference today is that some people make, or try to make, their living off playing & broadcasting their gameplay. This means they need to be able to run the latest games at the highest specs, record and livestream all at the same time without missing a beat.
Re: (Score:2)
Isn't the recording & encoding part mostly CPU-dependent? And even if the graphics card is used to encode the video, isn't there dedicated H264 encoder hardware on these cards (meaning a budget card from the same generation shouldn't be any slower in this aspect)?
Re: (Score:2)
You're correct. Newer Intel CPUs can use a technology called Intel Quick Sync [wikipedia.org] to speed up streaming and video encoding. Basically it uses the hardware encoder on Intel CPUs to perform the encoding.
Streaming software like OBS [obsproject.com] supports Quick Sync. Impact on CPU and GPU usage is much lower since it's using the iGPU (which would normally be disabled when playing games with a discrete video card). It's basically using silicon which would otherwise go to waste, since most people disable the integrated video on In
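As an illustration of how little extra load hardware encoding adds, here is a minimal sketch of offloading an H.264 encode to Quick Sync via ffmpeg, driven from Python. It assumes an ffmpeg build with the h264_qsv encoder enabled and a compatible Intel iGPU; the file names are hypothetical.

```python
import subprocess

# Transcode a capture with Intel Quick Sync (h264_qsv) instead of a software
# x264 encode, so the CPU and discrete GPU stay free for the game itself.
# "gameplay_raw.mkv" / "gameplay_qsv.mp4" are hypothetical file names.
subprocess.run([
    "ffmpeg",
    "-i", "gameplay_raw.mkv",   # source capture
    "-c:v", "h264_qsv",         # hand H.264 encoding to the iGPU
    "-preset", "fast",
    "-b:v", "6M",               # ~6 Mbit/s, a typical 1080p streaming bitrate
    "-c:a", "copy",             # leave audio untouched
    "gameplay_qsv.mp4",
], check=True)
```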
Re: (Score:2)
Most live streams barely do VGA quality, never mind 1080p. And most video cards can do 1080p quite easily, so even if you live stream, 1080p is the max other people are going to see. Gaming on a 10 4K monitor s
Re: (Score:2)
I have a 180 dollar gaming card that plays everything very well.
This is, frankly, stupid. There is no gain, and professional gamers want all the particle effects and distractions turned off.
Re: (Score:2)
I put 6 GB of RAM in even though Crucial and Dell both tell you it won't work. Should have gone for 8 so I could have a bigger RAM drive. - I actually ran out of memory the other night running 64-bit Waterfox! (that was a first.) I put a ragged old OCZ SSD in it that I bought for $20 when OCZ put themselves out of business. Then I put Windows 8 on it for $30. It refuses to update to 8.1. (how bad are
Re: (Score:2)
I'm typing this post on an old Dell Latitude D420, which still works fine for surfing the web, though I have to limit youtube-type video to lower reso
Re: (Score:3)
Re: (Score:1)
Re: (Score:2)
http://en.wikipedia.org/wiki/A... [wikipedia.org]
Heheheh.
ASCI Red
Speed: 1.3 TFLOPS
Ranking: #1 TOP500 June 2000
Power: 850 kW
Re: So glad it's over (Score:3)
Re: (Score:2)
People who think they need 60+ Hz are just stupid: your eye can't refresh that quickly, so who cares if your screen can? Often they are pushing high res on screens that max out at less than the framerate their GPU is pushing. So now not only their eye but their hardware can't use the frames. I get that when the system gets busy (or the game gets complex) frame rates can drop, but I'm not sure upping the peak framerate is the best answer. Gaming rigs likely should be configured to have most system processes bound
Re: (Score:2)
Actually, your eye can detect changes at greater than 60 Hz; it simply can't register individual still frames at anywhere near that speed. Much like how it can still detect the existence of detail far smaller than the smallest discrete pixel it can resolve.
The other place where higher frame rates factor in is latency: at 60 Hz there is a ~17 ms delay between one frame and the next - and any action taken at the beginning of the frame will not be reflected until the next frame is rendered. Admittedly that's not
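For reference, a quick sketch of the arithmetic behind that ~17 ms figure (a simple worst-case estimate that ignores engine, driver and display latency):

```python
# One frame lasts 1000/Hz milliseconds; an input that arrives just after a
# frame begins may not show up until the frame after next, so the worst case
# is roughly two frame intervals (engine, driver and display latency ignored).
for hz in (60, 120, 144):
    frame_ms = 1000.0 / hz
    print(f"{hz} Hz: {frame_ms:.1f} ms per frame, "
          f"worst case ~{2 * frame_ms:.1f} ms before the input is visible")
```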
Re: (Score:2)
I'm so glad that I got the gaming bug out of my system when a ridiculously-priced video card was $300, and mainstream cards were in the $90-160 range...
These cards exist because they make them for the compute/workstation/enterprise market, so why not rebrand and sell for some insane amount of money? Just like Intel's $999 processors wouldn't exist without the Xeon line. You get plenty of bang for the buck for $150-250, with the "normal" enthusiast cards topping out at $500-700, which I assume is not that much more after inflation. Of course if you insist on playing Crysis in UltraHD with everything dialed up to max nothing will be enough, but many games the last
Re: (Score:2)
Actually a few 780s in SLI will run that just fine.
Re: (Score:2)
That's what the Quadro line is for.
Re: (Score:2)
Yeah, but if you've got a line of cards where you flip a bit for the drivers to read and treat it differently, then why not take another swipe at that: take the top of the line from that line and flip a bit to say it's something else...
Now there are so many YouTube wannabe professionals who can make good money from it, and so many review sites, that they'll get to selling 10,000 units for that shit alone, easily justifying a production run. Of course for 3k you can get a fucking laptop to play every game
Re: (Score:2)
I agree, it is dumb. There are suckers who'll pay it though.
Re:So glad it's over (Score:4, Informative)
Extreme compatibility -- work on all nvidia cards and use none of the new hardware features.
Extreme performance -- work on only the latest cards and use all of the latest hardware features.
Nobody is buying $3K cards to play video games; they are using them to solve engineering problems. Video games are just a convenient way to benchmark performance that is easily understood by laymen.
Re: (Score:2)
>Nobody is buying $3K cards to play video games, they are using them to solve engineering problems,
Are you sure? I haven't paid much attention lately, but there was a time when CAD applications demanded far more accurate internal computations that a gaming card simply couldn't deliver. The Quadro, IIRC, was far more expensive than any gaming card, and also considerably slower. What it offered to justify the price was far more accurate rendering, especially where the depth buffer is concerned.
Of course
Re: (Score:3)
Standard consumer goods practice; always make sure you have at least one ridiculously expensive version.
Doesn't need to be any better, just far more expensive.
There's always people who associate "expensive" with "good" and some can even afford it.
Same goes for TVs, hi-fi equipment, musical instruments, tools, sports equipment, cars, etc...
People don't buy very high end video cards ... (Score:3, Interesting)
Keep that in mind when you see that great price for a used high end card. The card probably ran for an extended period of time overclocked to just under its "melting point" and just got replaced by an ASIC miner.
That was last year (Score:3)
People who mine either mine scrypt-style currencies, which still run better on GPUs, or have been using ASIC miners for at least a year already. Used high end cards are either NVIDIA, which are sold because the gamer wants something new or is short on cash, or AMD, when the owner wants a faster GPU for either gaming or scrypt coin mining. For scrypt coin mining on AMD, overclocking the GPU doesn't work; in general you have to clock down a bit unless you are lucky and you can overclock the memory enough to maximi
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Except that the Titan isn't really a gaming card. The big draw is the double precision floating point performance. The GTX 780 - which is the same part for gaming purposes - is about 700 dollars (they have almost identical single precision performance, which is what gaming uses), so two 780s would be about 1500 dollars (to compare to the Titan Black dual-GPU monstrosity).
And you don't need top end parts unless you're gaming on 4K (which is either a 3500 dollar monitor for a good one, or a ~500 dollar Seiki TV tha
Re: (Score:2)
OK, tell that to NVidia [geforce.com]:
Hard to get more definitive than that.
OK, you can argue that NVidia is simply lying; that they engineer the
Re: (Score:2)
Re: (Score:1)
The Titan-Z was and is a PR product. It was conceived simply to create buzz around nVidia. They had the misfortune that AMD put out a better card before they could get the darn thing to market though. First they delayed it, then as pressure mounted they finally sneaked it out without much of the ado they were hoping for. I doubt there exists or will ever exist more than a couple of hundred Titan-Zs IN THE WORLD.
Anyone who tells you that this card "is for X" where X is something else than PR is wrong and/or
Re:So glad it's over (Score:4, Funny)
I doubt there exists or will ever exist more than a couple of hundred Titan-Zs IN THE WORLD.
Am I the only one who read that in Jeremy Clarkson's voice?
Re: (Score:2)
I've owned two "top end" (as opposed to merely "high end") graphics cards in the days before I had a mortgage and when the top end of the market was still only in the $1,000 range. The first was an Nvidia 7950 GX2 and the second was an Nvidia 590. Both of them, frankly, were cranky, unreliable and difficult. It was also rare that I took them anywhere near their performance limits. This latest trend towards super-priced cards is a combination of R&D and willy waving.
This wouldn't be slashdot without a car an
So glad it's over (Score:2)
High end video cards have always been $600-$1000 ever since the 386 days. Unless you quit your gaming bug before then, you never had a current gen high end video card.
Re: (Score:2)
Actually, when the NVIDIA RIVA TNT came out and people still had their 3dfx Voodoo 2 cards, they were a lot cheaper than they are now.
Re: So glad it's over (Score:2)
My first new video card was an NVIDIA RIVA TNT. It was made by Diamond, and I bought it with my 486/25. It was $600.
Wrong premise (Score:5, Insightful)
These cards should have been tested from the perspective of high performance computing or scientific application.
Re:Wrong premise (Score:5, Insightful)
These cards should have been tested from the perspective of high performance computing or scientific application.
I don't think nVidia would want that.
Re:Wrong premise (Score:5, Informative)
Re: (Score:3)
Wrong premise (Score:1)
Gaming graphics cards are optimised for high-end graphics rendering - scientific graphics cards are optimised for crunching numbers/running simulations.
That's like testing a car by trying to drive it underwater.
Re: (Score:2)
Overpaying by 20X makes you much cooler than overpaying by 10X. The metric is bragging rights, not actual performance, and definitely not some cost/benefit analysis.
Re: (Score:2)
These cards should have been tested from the perspective of high performance computing or scientific application.
Nah, virtual currency mining. :-)
Re: (Score:2)
These cards should have been tested from the perspective of high performance computing or scientific application.
Exactly.
Using the same base assumption, I have conducted research that finds a two billion dollar supercomputer cluster from IBM is way overpriced from grandma's email and Facebook browsing point of view.
I have also concluded my research showing the NASA space shuttles are way overpriced from a "running to the corner store for milk" point of view.
Now where are my millions of research dollars?!
Re: (Score:1)
These cards should have been tested from the perspective of high performance computing or scientific application.
+5 Insightful for complete BS? The NVIDIA Quadro exists for that reason (or, for some applications, the Tesla). GTX is FOR GAMES and totally unoptimised for crunching other data from 3D modelling/rendering, CAD, video encoding, compositing, scientific models/simulations, etc. In price-for-price comparisons you'd find the likes of a K2000, rather than a GTX 780, would be better for non-game work. GTX cards do work better in some non-game applications, but generally workstation cards beat them in most areas and software.
IIRC Adobe Premiere Pro pla
Re: (Score:3)
NVIDIA does let people use the full compute feature set and performance, barring ECC memory, on a GTX Titan. Memory capacity is high too (6GB), though now there's also a GTX 780 with that amount.
They have this hierarchy (based on virtually the same cards, but drivers and segmentation differ, in increasing price order)
GTX 780 and 780 Ti (3GB or 6GB) < GTX Titan (6GB) < Tesla (w/ 5GB, 6GB or 12GB) < Quadro K6000 (12GB)
That gives:
- gaming and GPGPU, double precision FP artificially much slower
- gaming a
Quiet is important (Score:5, Insightful)
Don't underestimate the beauty of a quiet, powerful computer.
I won't buy a $3000 GPU any more than I'll buy a $1500 one, but I did buy the GTX 780 over the cheaper but somewhat more powerful R9 250 solely on the basis of it being cooler.
Re: (Score:3)
GTX 780 over the cheaper but somewhat more powerful R9 250
That's one heckuva typo. (I *hope* that's a typo)
Re: (Score:3)
Yes, it was the 290, not the 250, sorry.
They were competing against each other; the AMD card had slightly better bang for the buck but was reportedly quite hot, and some boards were quite noisy.
Re: (Score:2)
I really just wish desktops were capable of only turning on the discrete GPU when playing games, and relying on the CPU's built-in one the rest of the time. (Or is it possible nowadays and I never found out?)
Re: (Score:2)
It's a common laptop feature, but it works because both the awesome GPU and the cheap GPU are integrated.
You can do it on the desktop, you just have to buy a 3dfx Voodoo card :) (it had a passthrough cable so you would plug it into your regular video card then your monitor into the 3dfx card... without that you'd need to plug your monitor into your fancy gaming video card whenever you wanted to use it).
Re: (Score:2)
It might become possible in the future, or in select integrated desktops; for now, at least, the modern big GPUs have much better power management than before. Showing the desktop or even idling with the screen turned off was a huge power waste when you ran e.g. a Radeon 4870 or GTX 275, but with a GTX 780 or Radeon 7970 it's almost a gentle power bump next to not having the card in the first place. Of note is Radeon "ZeroCore Power", which does shut the card down, but only when the PC's display goes into standby.
Nv
Re: (Score:1)
Re: (Score:2)
What's the idle power consumption on one of these bad boys? Many systems with many-hundred-watt TDPs idle under 100W...
Re: (Score:2)
What is Lucid Virtu? I did google for it but what appeared to be the manufacturer's page was obscured by some request to "like" something or other on Facebook.
I don't have a facebook account so I closed the browser tab. Stupid fucking company.
Re: (Score:2)
I did buy the GTX 780 over the cheaper but somewhat more powerful R9 250 solely on the basis of it being cooler.
Damn hipsters!
Good excuse... (Score:2)
CrossFire/SLI compatibility (Score:3)
Do games these days typically take full advantage of such setups? I haven't really paid too much attention to gaming/hardware in the past few years, but it seemed as if support for dual GPU's was less than stellar.
I.e., the only true advantage was an increase in the memory available to apps -- computationally, very few games took advantage of the additional GPU.
Has this changed, or (equally likely) am I completely off base on the state of affairs?
Re: (Score:1)
Re: (Score:1)
You are missing something. Every aspect of life that isn't World of Warcraft.
Re:$3,000?? (Score:5, Insightful)
They don't. What they need this for is ghetto floating point development hardware. This is cheap by those standards and offers far more precision than consumer grade GPUs.
Re: (Score:2)
Re: (Score:3)
Double precision floating point hardware, designed to do things like physics simulation. This thing has no ECC and some other similar tradeoffs, so it's fairly cheap at only 3k.
Here's an example of a non-ghetto version: http://h30094.www3.hp.com/prod... [hp.com]
Re: (Score:2)
Re: (Score:2)
Unlike AMD hardware that performs fp64 math on fp32 hardware at significantly reduced performance
I'm almost certain this is false. Got a source?
AMD has been known [tomshardware.co.uk] to outperform nVidia at double-precision work.
Re: (Score:1)
You buy it because you accidentally set Crysis to maximum quality, and now you can't change it back because on your cheap $400 card your mouse is moving about 1 pixel an hour.
Re: (Score:2)
Re: (Score:2)
You are missing 120 fps along with strobing, especially in indie games like Path of Exile or Minecraft, where the games haven't been properly optimized.
An NVIDIA GTX 780 Ti is the best performance for $700 without breaking the bank.
Re: (Score:2)
I'm lost. Why do people need a $3,000 video card to play games like World of Warcraft?
For the same reason you need a space shuttle rocket to go to the corner store for milk.
The same reason you need IBM's Watson to balance your checkbook in Excel.
I can play it fine on a $50 video card that takes one slot and a 15 inch monitor. Framerate is so fast that I had to turn on V-sync.
Indeed... Are we learning anything yet?
I must be missing something.
That goes without saying.
The detail you are missing is that you don't need a literal atomic scale physics simulator just to play games.
[Homer Simpson] I just need something that can send email
[Sales Guy] Oh my, you'll need a top of the line model for that! This baby here NASA uses to calculate their taxes!
[Home
3000? (Score:2, Informative)
Gamers spending $3000 on a video card aren't overly burdened with intelligence.
Re: (Score:3)
Man I wish I could be that dumb.
Re: (Score:1)
Wrong tests (Score:5, Insightful)
The Titan shouldn't be considered a top-end gaming card. It should be treated as a budget Tesla card - even at $3k, it's the cheapest card in Nvidia's lineup with full double-precision floating point performance (which no game uses, but is common in scientific computing, Tesla's market). And on tests using that, the single-GPU Titan and Titan Black outperform the 295X2 by a large amount [anandtech.com]. AT hasn't gotten to test a Titan Z yet, but you can tell it's going to wipe the floor with the 295X2.
Yes, Nvidia advertised the original Titan as a super-gaming card, and to be fair it was their top-performing gaming card for a while. But once the 780 Ti came out, that was over, and since everyone expects a 790 dual-GPU gaming card to be announced soon, buying any Titan for gaming is a fool's choice.
Nvidia seems to still be advertising it as a top-end gaming card, presumably trying to prove the old adage about fools and their money. It just comes off as a scam to me, but anyone willing to spend over a grand without doing some proper research probably deserves to be ripped off.
Re:Wrong tests (Score:4, Informative)
Case in point: linear algebra libraries (like 80% of scientific computing). Basically, people are modifying algorithms so that the bulk of the computation is done in single precision and then cleaned up in double. Those mixed-mode algorithms often outperform pure DP ones even on non-crippled cards (for example, the MAGMA library).
People don't like to be screwed with...
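For illustration, a rough CPU-side numpy sketch of the single-precision-then-double-precision refinement idea described above. Real libraries such as MAGMA do this on the GPU with a reused factorization, so treat this as a toy version only.

```python
import numpy as np

def solve_mixed_precision(A, b, iters=3):
    """Solve Ax = b by solving in float32 and refining the residual in float64.

    Toy version of mixed-precision iterative refinement; production libraries
    reuse a single factorization and run the cheap part on the GPU.
    """
    A32 = A.astype(np.float32)
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                                    # residual in double
        dx = np.linalg.solve(A32, r.astype(np.float32))  # correction in single
        x += dx.astype(np.float64)
    return x

rng = np.random.default_rng(0)
n = 500
A = rng.random((n, n)) + n * np.eye(n)   # well-conditioned test matrix
b = rng.random(n)
x = solve_mixed_precision(A, b)
print("max residual:", np.max(np.abs(A @ x - b)))
```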
Re: (Score:3)
Re: (Score:3)
The Titan shouldn't be considered a top-end gaming card. It should be treated as a budget Tesla card - even at $3k, it's the cheapest card in Nvidia's lineup with full double-precision floating point performance (which no game uses, but is common for scientific computing, Tesla's market).
This. For gaming there is virtually no difference between a 780 Ti (~$700) and a Titan Black (~$1000). They look identical on gaming benchmarks. I imagine that a pair of 780 Tis in SLI would outperform the Titan Z when it comes to gaming (the Titan Z is underclocked compared to the Titan Black), and for less than half the price.
The difference is the unlocked floating point capability and added VRAM. The Titans are for number crunching. The Titan Z crushes the AMD R9 295X2. Well, that and gamers looki
cherry-picked benchmarks (Score:1)
If you do a broader range of benchmarks you'll see the 290x beats the Titan on most compute benchmarks:
http://www.anandtech.com/show/... [anandtech.com]
Re: (Score:2)
It's not cherry-picking if benchmarks like that are the primary reason to use a Titan. Particularly when I explicitly said so.
Re: (Score:1)
So, only the benchmarks that you say are important count. That's pretty much the dictionary definition of cherry picking
You keep using that word... (Score:3)
You're arguing with the antecedent. I'm saying "if you care about X, the Titan is good", and you're accusing me of cherry-picking because the Titan is bad at Y and Z, even though I specifically called it out as not being good for anything except X in a performance-per-penny measure.
I am saying that one of the principal reasons to buy a Titan is if you have a heavy double-precision compute load. I then provided a benchmark showing that a Titan beats the 295X2 in such a load. It would be cherry-picking if I p
Re: (Score:2)
To be fair, the upcoming "Star Citizen" will be using double precision floats to model its huge universe in proper detail.
That being said I still of course agree.
Re: (Score:2)
It only makes sense if you need CUDA and a lot of DP performance, but no ECC or professional drivers, and have a lot of money. I'm not sure who those people are.
Workstations, perhaps? There's a lot of scientific computing done using desktop-sized workstations, not supercomputers. And they're spending several grand on Xeon CPUs anyways, so a $3K GPU isn't that much more.
64Bit floating point and compute mode (Score:3, Interesting)
Re: (Score:2)
Re: (Score:1)
Frame rate is the wrong metric (Score:2)
It's SCrypt Hashes per Second per Watt of energy consumed. And SCrypt Hashes per Second per Dollar of GPU.
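A toy sketch of that metric, with entirely made-up numbers and hypothetical card names, just to show the calculation:

```python
# Entirely made-up numbers for two hypothetical cards, just to show the metric.
cards = {
    "hypothetical card A": {"khs": 900, "watts": 300, "price": 400},
    "hypothetical card B": {"khs": 500, "watts": 150, "price": 200},
}
for name, c in cards.items():
    print(f"{name}: {c['khs'] / c['watts']:.2f} kH/s per watt, "
          f"{c['khs'] / c['price']:.2f} kH/s per dollar")
```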
Re: (Score:2)
Should we start worrying about nVidia? (Score:2)
Re: (Score:2)
Link? The only benchmark lists I could find there only tested FPS. And we all know that such tests are quite silly now, as frame quality and to some extent latency are what matter these days.
AMD Consistent framerate, since when? (Score:1)
the R9 295X2 offered higher and more consistent frame rates
http://cdn.pcper.com/files/ima... [pcper.com]
But not "stable", "consistent" or "smooth". This is still a major issue with the core of all AMD cards which hasnt been fixed.
You get what you pay for. Nvidia might be the "expensive" of the bunch, just wish i forked out a little more instead of getting my HD7770.
Re: (Score:2)
the R9 295X2 offered higher and more consistent frame rates
http://cdn.pcper.com/files/ima... [pcper.com]
But not "stable", "consistent" or "smooth". This is still a major issue with the core of all AMD cards which hasnt been fixed.
You get what you pay for. Nvidia might be the "expensive" of the bunch, just wish i forked out a little more instead of getting my HD7770.
Do you realize that in the graph you linked no card dips below 50fps at any time? In fact, if you count the occasional peaks crossing the (ridiculously low) 15ms/66fps threshold, the Titan Z shows 6 frames slower than 15ms and the 295X2 shows 4 frames at more than 15ms (if I count correctly). You really can't argue that the Titan Z is smoother. All cards are extremely smooth.
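For anyone wanting to repeat that kind of count on their own captures, a minimal sketch (the frame-time values here are illustrative, not taken from the linked graph):

```python
# Count frames slower than a threshold in a list of per-frame render times (ms).
# The values below are illustrative only, not read off the linked graph.
frame_times_ms = [14.2, 13.8, 16.1, 14.9, 15.4, 13.1, 14.4, 15.2]

threshold_ms = 15.0
slow = [t for t in frame_times_ms if t > threshold_ms]
print(f"{len(slow)} of {len(frame_times_ms)} frames above {threshold_ms} ms "
      f"(i.e. below {1000 / threshold_ms:.1f} fps)")
```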
ATI Catalyst (Score:1)
AMD fp64 rate (Score:4, Informative)
I would just like to point out that the 295X2 has superior absolute gaming performance and superior fp32 performance but, just like most gaming NVIDIA products, the fp64 rate is crippled to 1/8 of the fp32 rate by configuration in order to preserve a profit margin for the costlier "pro" products. The hardware itself is capable of a 1/2 fp64 rate and should be superior to the Titan Z if AMD decides to offer "pro-level support".
As proof, consider the fp64 rate of the single-chip AMD W9100, sold at ~$4000, which is 2.6 TFLOPS (http://www.amd.com/Documents/FirePro_W9100_Data_Sheet.pdf), versus the 2.7 TFLOPS of the Titan Z (1/3 fp32 rate, see http://en.wikipedia.org/wiki/G... [wikipedia.org]). AMD could unlock the 295X2 to its full potential of 5.2 double precision TFLOPS and release it any day if they want, crushing the Titan Z.
Honestly, instead of the Titan Z, I'd rather buy the AMD W9100 for $4000 and get equivalent double precision compute rate, better perf/W and, most importantly, certification for pro applications and ECC memory. That is certainly worth the extra $1000 in this product segment.
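A back-of-the-envelope check of those figures, using peak fp64 = peak fp32 x (fp64:fp32 ratio) with approximate spec-sheet numbers (treat the output as ballpark only):

```python
# peak fp64 ~= peak fp32 * (fp64:fp32 ratio); all figures are approximate
# values from public spec sheets, so treat the output as ballpark only.
cards = {
    "GTX Titan Z (1/3 rate)":    (8.1,  1 / 3),   # ~8.1 fp32 TFLOPS, both GPUs
    "FirePro W9100 (1/2 rate)":  (5.2,  1 / 2),   # single Hawaii chip
    "R9 295X2 as shipped (1/8)": (11.5, 1 / 8),   # two Hawaii chips, gaming config
}
for name, (fp32_tflops, ratio) in cards.items():
    print(f"{name}: ~{fp32_tflops * ratio:.1f} fp64 TFLOPS")
# An "unlocked" 295X2 at a 1/2 rate would land around twice the W9100,
# i.e. roughly the 5.2 fp64 TFLOPS figure quoted above.
```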
I don't think NVidia expects to sell many of them (Score:1)
Re: (Score:3)
ASIC makers have done it for the government. GPU mining is a thing of the past.
Prices still haven't come down though.
Re: (Score:2)
Re: (Score:2)
They solved their problem (speed) without solving the problem that it caused for all the bystanders (price).
So yes, it's a pretty typical private sector solution that doesn't fix any of the problems caused to the larger audience by the original product, forcing customers to pick up the tab.
Re: (Score:2)
TIL: Supply and demand is a "problem" that needs to be "solved".