NVIDIA Launches GTX 750 Ti With New Maxwell Architecture
Vigile writes "NVIDIA is launching the GeForce GTX 750 Ti today, which would normally merit just a passing mention as a new $150 mainstream graphics card. But the company is using it as the starting point for its Maxwell architecture, which is actually pretty interesting. With a new GPU design that reorganizes the compute structure into smaller blocks, Maxwell is able to provide 66% more CUDA cores on a die that is just 25% bigger than the previous generation, all while continuing to use the same 28nm process technology we have today. Power and area efficiency were the target design points for Maxwell, as it will eventually be integrated into NVIDIA's Tegra line, too. As a result, the GeForce GTX 750 Ti is able to outperform AMD's Radeon R7 260X by 5-10% while using 35 watts less power."
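(For a rough sense of what those percentages work out to, here is a back-of-the-envelope Python sketch; the CUDA core counts of 384 for the Kepler GK107 and 640 for the Maxwell GM107, and the 118/148 mm2 die sizes, are published spec-sheet figures rather than claims made in the summary itself.)

# Rough compute-density comparison: previous-gen GK107 vs Maxwell GM107 (GTX 750 Ti).
# Core counts and die sizes below are published figures, not claims from the summary.
kepler_cores, kepler_die_mm2 = 384, 118
maxwell_cores, maxwell_die_mm2 = 640, 148

core_increase = maxwell_cores / kepler_cores - 1       # ~0.67 -> the "66% more CUDA cores"
die_increase = maxwell_die_mm2 / kepler_die_mm2 - 1    # ~0.25 -> the "25% bigger" die
density_gain = (maxwell_cores / maxwell_die_mm2) / (kepler_cores / kepler_die_mm2) - 1

print(f"Cores: +{core_increase:.0%}, die area: +{die_increase:.0%}, "
      f"cores per mm^2: +{density_gain:.0%}")          # roughly +67%, +25%, +33%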
Re: (Score:3)
It's always easy for nvidia to say their graphics cards outperform AMD cards in computation, but difficult to make it happen. Nvidia is great at selling hype, nothing more.
Thanks to the Litecoin and Bitcoin miners, Nvidia cards are the only ones left on store shelves that don't carry a damn 150% to 200% markup over MSRP.
If Maxwell can also mine coins, don't expect any reasonably priced GPUs for years to come.
Re: (Score:2)
Re:believe it when I see it (Score:4, Informative)
What else would you use an amd card for?
If you want to game and/or use Linux with any performance and stability, you need an Nvidia card.
Intel integrated GPUs work fine for Linux, though, if you don't need the performance.
Re: (Score:2)
What else would you use an amd card for?
For anything you want? They even come with HW documentation these days. Also, Linus. ;-)
Re: (Score:2)
What else would you use an amd card for?
For anything you want? They even come with HW documentation these days. Also, Linus. ;-)
It comes with Linus? Is there a super efficient NVIDIA invented process that bundles a small piece of him on die in place of a core in every chip?
Re: (Score:2)
Re: (Score:2)
Nope, you got it right the first time - it would have to be NVIDIA for it to work in the first place, AMD can't find their arses with both hands, a map, a team of dedicated researchers and a reality TV challenge show.
Re: (Score:2)
ATI's driver is free, open-sourced, and spec'd, unlike the Nvidia one, which is a binary blob.
Re:believe it when I see it (Score:4, Insightful)
Well, for one, you can use it.
Re: (Score:1)
Re: (Score:2)
I got an AMD A10 5600K chip and was having some issues with FPS in Xonotic. Wasn't sure WTF and blamed it on AMD. Picked up a used Nvidia GTX 280. Well, the graphics seemed much smoother for about 5 min, then the slowdowns started. WTF? Anyway, to make a long story short, I got a big fat CPU cooler and out of the blue the A10 chip with open-source Radeon drivers can play the game maxed out with no issues.
So here I was blaming AMD for crappy drivers when the CPU was overheating with the stock cooler. Oh and what d
Re: (Score:1)
It was a "used" nvidia gtx 280. You don't know if something happened to it.
Re: (Score:1)
Maybe AMD should think about making more. Just an idea.
Re: (Score:1)
Yes, but it is "months in advance". The cryptocoin thing started in November. Besides that, TSMC is only at 80% capacity
FUCK BETA (Score:4, Interesting)
Hype or not, games on my gtx 760 look amazing. Looks like they are testing the waters for the next flagship.
Re:believe it when I see it (Score:2)
Honestly, my three-year-old GTX 580 makes games look amazing, and it is still surprisingly capable with modern games (really, only Crysis 3 on Ultra made it wheeze). This is thanks to how anemic the GPUs in the last-gen game consoles were, but I'd wager it still holds up well with the launch titles released for the XBOne or PS4. I suppose I'll get a new GPU in a year or two, but after that I think I'll be fine until the PS5/XBox-NextWhatever is released.
My days of upgrading video-cards on a yearly basis see
Re: (Score:3)
Don't worry. The 260X they're comparing it to is still 2x faster at bitcoin mining. Still faster per-watt as well.
Re: (Score:2)
Re: (Score:2)
What does scrypt performance have to do with bitcoin?
Re (Score:2)
Won't you spend more $$$ on electricity for bitcoin mining than what you get out of it, unless you steal the electricity?
Re: (Score:2)
Apparently you're supposed to make money off transaction fees.
That's beside the point. blackraven14250 attempted to rebut a statement with something completely irrelevant.
Re: (Score:2)
Believe it! (Score:1, Insightful)
It's all about what you're trying to do. Nvidia usually has an edge in the reliability/gaming sector, while AMD has an edge in the mining/hashing sector. To say that Nvidia is pure hype is, ironically, hyperbole.
Re: (Score:2)
It's all about what you're trying to do. Nvidia usually has an edge in the reliability/gaming sector, while AMD has an edge in the mining/hashing sector. To say that Nvidia is pure hype is, ironically, hyperbole.
This chip has more hash performance and shaders at lower wattage than an ATI card.
This one might be the preferred one for miners: while it is 3x slower, it uses 1/5th the power of a full dual 290X setup. Expect $600 and limited-to-no availability once the miners discover it :-)
Re:Believe it! (Score:4, Insightful)
I had understood that anyone with half a brain was on ASICs now.
Then again anyone with half a brain wouldn't be joining the pyramid scheme so late in the game.
Re: (Score:3, Interesting)
I had understood that anyone with half a brain was on ASICs now.
Then again anyone with half a brain wouldn't be joining the pyramid scheme so late in the game.
Butterfly, which makes the ASICs, has been busted taking as long as 8 months to deliver orders.
Basically you plow down $2,000 for the units, they keep them for 7 months and mine coins with your device, then they sell it to you after the cost of the coins has gone up 3000%, so you lose your investment and Butterfly keeps the interest made ... similar to banks withholding cash for 48 hours, etc.
But at least banks give the money back after 72 hours after they short stocks and keep the interest on your cash first. These
Re: (Score:3, Informative)
Re: (Score:2)
I had kind of wondered right off the bat why a company advertising devices which print money wouldnt use them to print money.
Possibilities:
1) Theyre phenomenally dumb, but still capable of making ASICs
2) They used the ASICs ahead of time, then sold the ASICs once the value proposition of said ASICs had plummetted
3) They realized that there was no more value to BitCoin than to roulette
None of those sound terribly appetizing. Sure you can make money there, just like you can with horse racing and house flippi
Re: (Score:2)
If youre the first into the ASIC world, you would be absolutely crazy not to run your ASICs for a few weeks before sending shipments off. Its like playing the stock market or blackjack with a 30% discount: statistically, the advantage is so great that you cant lose.
Except in this case theyre not even risking money-- theyre just pushing the ship date back a few weeks.
Re: (Score:3, Informative)
I had understood that anyone with half a brain was on ASICs now.
That's true for Bitcoin, which uses SHA-256 as its hashing algorithm. But for Litecoin, Dogecoin, and a bunch of other knock-off "altcoins", the proof-of-work is scrypt, and that is difficult to support on ASICs because of the memory requirements.
There are some Scrypt ASICs currently being tested, but hash rates are quite modest and they focus more on saving power than on outgunning the top AMD video cards.
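(As a rough illustration of that memory requirement, here is a minimal Python sketch using the standard library's hashlib.scrypt; the N=1024, r=1, p=1 values are Litecoin's published proof-of-work parameters, the 128*r*N-byte scratch estimate is the usual rule of thumb, and this is just a sketch, not actual mining code.)

import hashlib

# Scrypt forces every hash attempt to touch a sizable scratch buffer (~128 * r * N bytes),
# which is what makes dedicated ASICs harder and costlier to build than SHA-256 ones.
N, r, p = 1024, 1, 1          # Litecoin's proof-of-work parameters
scratch_bytes = 128 * r * N   # ~128 KiB of fast memory per hashing core

header = b"example block header bytes"   # Litecoin hashes the header as both password and salt
digest = hashlib.scrypt(header, salt=header, n=N, r=r, p=p, dklen=32)
print(f"scrypt needs ~{scratch_bytes // 1024} KiB per core, digest = {digest.hex()[:16]}...")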
Re: (Score:2)
Anyone want to guess how the company making those devices would know the performance stats of the ASICs if they arent ready to ship?
Heres a hint: Theyre probably not waiting on manufacturing-- theyre waiting for someone to get around to shutting the devices off.
Re: (Score:2)
Oh, and "Heres" is not a word. Neither is "theyre".
Sure they are. Heres was the ancient Greek god of punctuation. He played a stringed instrument called a theyre. It was shaped like an apostrophe which is also a Greek word, a portmanteau of apoplexy and catastrophe, referring to Heres' frequent reaction to poor grammar.
Re: (Score:2)
However, double-precision math is further pared back to 1/32 the rate of FP32; that was 1/24 in the mainstream Kepler-based GPUs.
Apparently, they achieved it at least partially by further carving up FP64 capabilities - even the cheapest AMD stuff has 1:16, as lousy as it is for some applications. Oh, what the hell. They're just gaming it. ;-)
Re: (Score:2)
It makes sense to cut down on die space and power usage by removing capabilities that almost no one uses. Why should 99% of gamers have to carry the burden for 1% of HPC users? Presumably Nvidia will create a successor card to Tesla that will include full FP64 capability on the Maxwell platform. It won't come cheap, though.
Re: (Score:2)
... capabilities that almost no one uses. Why should 99% of gamers have to carry the burden for 1% of HPC users?
"Almost no one?" There's also less demanding engineering workstations and even content creation scenarios where better DP support would come in handy. But, having said that, AMD's APUs make probably much more sense for these than a gaming card with limited memory.
Re: (Score:2)
"Almost no one?" There's also less demanding engineering workstations and even content creation scenarios where better DP support would come in handy.
Yeah, exactly. That's "almost no one" when compared to the size of the gaming market.
Only requires... (Score:2)
... five expansion slots to fit the fans, this time!
Re: (Score:3)
Re: (Score:2)
Yeah, I saw that. I wasn't surprised. That seems to be the common configuration these days. (AMD is guilty of it, too.)
Fortunately, the days of packing my computers with expansion cards are long gone, anyway.
Won't stop me from making a cheap joke about it, though.
Maxwell? (Score:4, Funny)
That sounds Smart...
I'll get me coat.
Re: (Score:2)
I got a lot of problems, but 99 ain't one of them.
Re:Maxwell? (Score:5, Funny)
Good to the last drop.
Heat and noise.... (Score:4, Interesting)
Re:Heat and noise.... (Score:5, Informative)
The benchmarks on Phoronix [phoronix.com] did temperature, and commented on (though didn't measure) noise. It was actually a fairly comprehensive, well-done benchmark; the only thing missing was frame latency measurements.
Re:Heat and noise.... (Score:5, Funny)
I opened the link and scrolled through it, only to get all excited that my card (the R9 290) was trouncing everything else in one of the charts!
Then I realized I was looking at the temperature charts....
Re:Heat and noise.... (Score:5, Informative)
They are at Anandtech. [anandtech.com] They do noise/temps/power at idle, in a game, or under full synthetic load. They even do an overclock and then re-compare game/synth numbers.
Re: (Score:2)
They are not, because both of those numbers depend heavily not only on how the device is used, but also on its enclosure. Do you have the card in SLI mode, inside a rack with 100 other cards all running bitcoin miners? Well then, noise and heat will be through the roof. Do you spend most of your time in a terminal emulator, and is your case water-cooled? Well then they're going to be pretty low. "Average" is totally subjective, and I think it best to leave those measurements up to an external review.
Re:Heat and noise.... (Score:4, Insightful)
By your standard, almost anything would be subjective. Let's go through your line of thinking:
The tester chose an enclosure you probably don't have at home. As such, the card will not demonstrate the same values in your enclosure at home. As a result the tests are "subjective".
Power consumption? Well, you've probably got a different PSU. Subjective.
FPS? You've probably got a different CPU, different OS configuration, motherboard, hard disk... Subjective!
In summary: if the tester uses the same enclosure for every card they test, I don't see how it's subjective. Sones or dB as units of loudness are measurable, as is temperature. Or do you want to tell us that, say, the distance to Betelgeuse is subjective just because you don't happen to have the proper equipment to measure it?
Re: (Score:3)
Forget enclosures. Power up the device with no enclosure, give me the numbers at 1 meter distance.
Now we have a COMMON bar to use to judge.
Re: (Score:2)
I doubt miners care all that much about sound, ops per watt being far more relevant to their usage. If you've got a bank of dozens or hundreds of cards being hammered 24/7, it's going to be loud, period.
For everyone else I would think that the relevant questions would be how loud it is while being hammered by a graphically demanding game, watching a movie, and using a word processor. And these days it sounds like the last two usage cases tend to be comparable. As for the influence of case, etc. I'd say that
Re: (Score:2)
No, I've been saying they should do that for any femputer part with a fan.
What's the noise at 1 meter at mid power and max power?
TYVM
Re: (Score:3)
Operating temperature and noise output would only be valid measurements for the reference card. Once it starts getting manufactured by PNY or Diamond or eVGA or whomever, they'll be using their own coolers and their own variations of the NVIDIA board.
Re: (Score:2)
Power consumption is heat generation. If you decrease power consumption, this should also reduce noise since a slower fan can be used.
66%? big deal. (Score:2)
Going from 1 unit to 1.25 units in linear size is 56% bigger in area, so they only gained about 10%?
Re:66%? big deal. (Score:4, Insightful)
If you had read the article, you would have known that they went from 118 mm2 to 148 mm2, i.e. a 25% increase in area.
If Slashdot entered the 21st century, it would be able to render superscript.
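(For anyone tripped up by the same linear-vs-area confusion as the parent, a quick check; the 118 and 148 mm2 figures are the ones quoted above.)

# The article's 25% figure is an increase in die AREA, not in linear dimension.
linear_scale = 1.25
area_if_linear = linear_scale ** 2    # 1.5625 -> the "56% bigger in area" misreading above
actual_area_ratio = 148 / 118         # ~1.25  -> the real 25% area increase

print(f"1.25x in each dimension would be +{area_if_linear - 1:.0%} area; "
      f"148 mm^2 / 118 mm^2 is only +{actual_area_ratio - 1:.0%}")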
Re:66%? big deal. (Score:5, Funny)
If Slashdot entered the 21st century, it would be able to render superscript.
Maybe the beta supports it :)
Re:66%? big deal. (Score:5, Informative)
I just checked; it does not. (I tried both ways: using unicode character entities and using the <sup> tag.)
Wow (Score:2)
5-10% better than a cheaper rival card that came out 5 months ago.
Go nvidia, go!
Re: (Score:3)
5-10% better than a cheaper rival card that came out 5 months ago.
Go nvidia, go!
I'm no nVidiot, but 5-10% improvement at a substantial power savings in the same price bracket is indeed an impressive feat.
This being a brand new architecture means that later cards can also reap these benefits.
AMD and their OEMs are still slowly trotting out 290X cards with decent cooling at inflated prices. The sooner nVidia gets their next architecture out there the sooner we'll see new products / price drops on the AMD side. The sooner that happens, the sooner we see new products / price drops on the
Re: (Score:3, Insightful)
Meanwhile, in CPU land we've been stuck with years of Intel charging $BUTT for marginally better
If you think Haswell, Ivy Bridge or Sandy Bridge were 'marginally better' you aren't paying attention.
Re: (Score:2)
Re:Wow (Score:4, Insightful)
You say the big advance is in power, then mention the 290X, which has a single-precision GFLOPS/W figure of 19.4, between the new GTX 750's 19.0 and the GTX 750 Ti's 21.8.
The 290X has a double-precision GFLOPS/W of 2.6; the GTX 750 Ti gets 0.68. Compared to the 65W TDP Radeon 250's double-precision figure of 0.74, it's a loser.
This is just hype and selective benchmarks for a new architecture that was supposed to be 20nm. They couldn't get it built on 20nm so they've had to stick with 28.
If it was 20nm, it probably would be better all round.
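(The efficiency figures being compared here are just peak throughput divided by board power. A minimal sketch that reproduces the single-precision numbers, assuming roughly 5,600 SP GFLOPS / 290 W for the 290X and 1,306 SP GFLOPS / 60 W for the GTX 750 Ti from published specs, not from this thread.)

# Perf-per-watt is peak GFLOPS divided by TDP; the absolute FLOPS and TDP values
# below are approximate published specs, used only to reproduce the figures quoted above.
cards = {
    "R9 290X":    {"sp_gflops": 5600, "tdp_w": 290},
    "GTX 750 Ti": {"sp_gflops": 1306, "tdp_w": 60},
}
for name, c in cards.items():
    print(f"{name}: {c['sp_gflops'] / c['tdp_w']:.1f} SP GFLOPS/W")
# -> roughly 19.3 and 21.8, matching the single-precision comparison above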
Re: (Score:2)
You say the big advance is in power, then mention the 290X, which has a single-precision GFLOPS/W figure of 19.4, between the new GTX 750's 19.0 and the GTX 750 Ti's 21.8.
The 290X has a double-precision GFLOPS/W of 2.6; the GTX 750 Ti gets 0.68. Compared to the 65W TDP Radeon 250's double-precision figure of 0.74, it's a loser.
This is just hype and selective benchmarks for a new architecture that was supposed to be 20nm. They couldn't get it built on 20nm so they've had to stick with 28.
If it was 20nm, it probably would be better all round.
You can't compare the flagship 290x to the low-end 750, lol.
Beyond that, nVidia doesn't give a shit about double-precision compute - never has, never will. Most other people don't, either, even people doing computing on GPUs.
nVidia is still very much gamer-focused, despite CUDA still having a huge market advantage over ATI Stream / OpenCL / DirectCompute / AMD APP Acceleration / etc. nVidia's new architecture is promising, and this card's launch is giving us a taste.
I currently have only AMD cards in my
Re: (Score:2)
I brought up the 290X only in reference to AMD's inability to properly fill the channel months after launch, and the market's inflated prices due to the lack of competition. And you're missing the point entirely as well.
You too are trotting out a pointless comparison. The 7790 is a high-midrange part, the 750 is low end.
The only comparison nVidia is using for pricing is gaming performance (where they consistently win) and feature set (3D, shadow play, gsync, etc.). nVidia doesn't price these cards out ba
Double precision isn't useful in this situation (Score:2)
Games don't make use of double precision math on a GPU. Really the only thing that does is some GPGPU apps (plenty of others are SP). So it makes no sense to optimize for it, and nVidia does not in their consumer cards, particularly low end ones like the 750.
Don't go and try to sniff around to find benchmarks that make your favourite product win, as it is rather silly. Ya, there's a lot the 290X is better at, but that doesn't mean it is relevant. The idea here is for reasonable graphics (as in gaming, multi
Re: (Score:2)
I'm no nVidiot, but 5-10% improvement at a substantial power savings in the same price bracket is indeed an impressive feat.
Substantial power savings, but certainly not the same price bracket. It is a $149 card so most reviews don't pit it against the much cheaper $119 R7 260X. In fact newegg right now sells an XFX OC edition of 260X for $119 and a Sapphire for $114 after rebate. Let's not mention that the Maxwell card gets trounced even by the cheaper 260X at many OpenCL tests - and reduced FP64 to 1/32 (vs the previous gen 1/24), so compute is also out. You have to pay dearly for the efficiency, nVidia as usual demands a price
Re: (Score:2)
I'm no nVidiot, but 5-10% improvement at a substantial power savings in the same price bracket is indeed an impressive feat.
Substantial power savings, but certainly not the same price bracket. It is a $149 card so most reviews don't pit it against the much cheaper $119 R7 260X. In fact newegg right now sells an XFX OC edition of 260X for $119 and a Sapphire for $114 after rebate. Let's not mention that the Maxwell card gets trounced even by the cheaper 260X at many OpenCL tests - and reduced FP64 to 1/32 (vs the previous gen 1/24), so compute is also out.
You have to pay dearly for the efficiency, nVidia as usual demands a price premium. Basically thank god for AMD being able to keep up, otherwise nVidia would be selling their cards 2X and 3X the price!
Nobody buys AMD for compute except Bitcoin miners. The market is priced for gaming. nVidia's 750 is priced exactly where it needs to be for gaming.
Re: (Score:2)
5-10% better than a cheaper rival card that came out 5 months ago.
This is a mobile-first design. Look at the power consumption figures to see why it is a major advance.
Nvidia's L2 Cache Jump (Score:2)
The Radeon R7 260 it is being compared against has only 768 KB, and Kepler units had 256-320 KB.
The performance improvement could simply be the L2 being larger, which means it is paging out to its memory less.
Re: (Score:3)
It's also only measuring single-precision performance. The AMD GPUs are more power-efficient at double precision.
Re: (Score:2)
video encoding (Score:1)
7xx or 8xx? (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Unfortunately, Wikipedia doesn't even bother to list the architecture in its giant table [wikipedia.org] - a very important aspect of a graphics card.
Please tell me I'm dreaming! (Score:1)
AMD's saving grace has been efficiency and price (Score:1)
"Mainstream" graphics card? (Score:2)
I wonder who they want to sell to, when it comes to "mainstream" cards.
When it comes to graphics I consider myself mainstream. Watching video, running the OS, an occasional photo edit - that's about the heaviest it goes. I rarely play games (and those are not graphics intensive, just online games), I don't do CAD or anything else that's graphically intensive.
Motherboards come with graphics built in, and that works just fine for those not into hardcore gaming or hardcore graphic design work. Both relative sm
What difference would cores make if they're crippled? (Score:2)
As an owner of several nVidia products, I am appalled at what nVidia does to the non-Quadro cards!
So you can choose a crippled gaming card that can't do math well, or a workstation card that can't cool itself and doesn't really know what to do with shaders.
Tell your marketing department that a loyal customer will seriously give AMD/ATI a close look the next time around.