NVIDIA Launches New $219 Turing-Powered GeForce GTX 1660 (hothardware.com) 101
MojoKid writes: NVIDIA took the wraps off yet another lower-cost Turing-based graphics card today, dubbed the GeForce GTX 1660. For a $219 MSRP, the card offers a cut-down NVIDIA TU116 GPU comprised of 1408 CUDA cores with a 1785MHz boost clock and 6GB of GDDR5 RAM with 192.1GB/s of bandwidth. Generally speaking, the new GeForce GTX 1660 is 15% to 30% faster than NVIDIA's previous-generation GeForce GTX 1060, but it doesn't support the ray tracing and DLSS features that the majority of NVIDIA's new Turing cards support. Performance-wise, the GeForce GTX 1660 is also faster than an AMD Radeon RX 590 overall. Boards from various OEM partners should be in the channel for purchase this week.
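The quoted 192.1GB/s figure is easy to sanity-check: peak memory bandwidth is just bus width times effective data rate. A minimal sketch, assuming the commonly published GTX 1660 specs of a 192-bit bus at an 8Gbps effective rate (neither figure is stated in the summary itself):

```python
# Peak memory bandwidth = (bus width in bits / 8 bits-per-byte) * effective data rate
# Assumed GTX 1660 specs (not stated in the summary): 192-bit bus, 8 Gbps per pin.
bus_width_bits = 192
data_rate_gbps = 8.0  # effective GDDR5 transfer rate per pin

bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")  # 192 GB/s, in line with the quoted ~192.1GB/s
```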
Re: Nah. (Score:1)
Exactly. Plus they were 3D-only cards that required a VGA passthrough cable, and it only worked with their proprietary Glide API. Fuck 3DFX.
Re: (Score:2)
The original Voodoo/Voodoo2 cards were an add-on with a VGA passthrough cable, but that was at the birth of 3D acceleration. Voodoo Rush cards existed just fine alongside these as an all-in-one video card. Once we got to the Voodoo Banshee, the add-on card platform was retired. And Nvidia and ATI were just getting started, albeit with garbage hardware initially. The Voodoo 3 was an amazing card and on
Re: Nah. (Score:2)
LIAR! (Score:1)
https://www.theverge.com/2013/1/16/3881826/amd-accuses-bob-feldstein-of-stealing-documents-nvidia
https://hothardware.com/news/nvidia-geforce-partner-program
https://www.techradar.com/news/nvidia-faces-allegations-of-anti-consumer-practices-in-the-graphics-card-world
https://www.cnet.com/news/former-nvidia-engineer-charged-with-insider-trading/
https://www.extremetech.com/extreme/145830-industrial-espionage-amd-files-suit-against-former-employees-for-alleged-document-theft
Turing Powered (Score:2)
Adding a new step to the GPU process, not only do you have to feed the card textures and shaders, now you ALSO have to feed it arbitrary state machines to compute in order to provide the power it needs to perform.
Re: (Score:2)
Huh? WTF are you blathering about?
Re: (Score:2)
Maybe jokes should be funny.
Re: (Score:2)
Maybe jokes should be funny.
Yeah, but anyone could do it that way.
Given RX 570s are going for $130 (Score:4, Interesting)
Also, to my shock and awe, AMD works now. Longtime PC gamers will remember a period of almost five years when their GPU drivers were a disaster. I've been gaming on one for two weeks now with zero crashes (knock on wood). The only downside is power consumption; it pulls about 80 watts more than a GTX 1060. But at $100 it's hard to complain.
Um... they're on Newegg for that price right now (Score:2)
I spent a few months sniping eBay to get that 580 for $100, though. You'll usually pay $120-$140. But a GTX 1060 6GB will set you back $160-$180 (was trying to snipe one of those too, still am, albeit at $100
The RX 570 is _nuts_ though. It outdoes the 1060 3GB for $50 less. Heck, I've seen the 8GB version go for $140 on sale. Even with the power consumption (figure 40-60 watts more), it's a no-brainer unless you've got a cheap OEM computer (in which case, yeah, you're stuck with nv
I'm not really shilling (Score:2)
Something I have noticed: my RX 580 is underclocked (1200MHz; XFX apparently ships their GPUs underclocked and then has instructions on overclocking them). I haven't bothered pushing it to its full allowed clock of 1300MHz (it's one of the early, low-end models) because it's
Re: (Score:2)
It is just the way of things. US corporations, when they get dominant market share, just turn into arseholes and take crappy shortcuts; look no further than the latest iteration of the Nvidia Shield, just full of cheap shortcuts to maximise profits based on pretending it is like the previous model. So AMD damages its rep and moves heaven and earth to rebuild it with hard work, whilst Nvidia is selling out the trust it built to screw customers and maximise profits. Given time, AMD will dominate and Nvidia will
Re: (Score:2)
Well, is it possible that the same money-grabbing, psychopathic corporate executives move between AMD and NVidia? Just a thought...
Re: (Score:2)
The RX 580 has been the top-selling card on Amazon for a long time; it's hard to argue with the value. But otherwise AMD is barely hanging onto its roughly 17% add-in GPU share. The Radeon VII is kickass, but who knows when the supply is coming back, and not everybody wants or needs a high-end card. There is a lot of Nvidia hate going around, and from where I sit it's richly deserved. I'm in that camp myself; I'd rather eat a turd than give money to NVidia. Seems like I've got lots of company there.
Re: (Score:2)
...contains a lot of people who have been abused by and consequently hate Nvidia.
Re: (Score:2)
That "loses money" refrain is apocryphal; it's based on the wholesale price of HBM2 not declining at all in 20 months, thus defying the law of semiconductor gravity. More likely, AMD underestimated demand.
I've read bad things about AMD's top end (Score:2)
As for the RX 580 being top-selling, I'm pretty sure that's due to miners. Miners made AMD cards rare as hen's teeth right when they fixed their stability issues, killing their market share in P
Re: (Score:2)
mostly that they're pushing them too hard to hit competitive numbers. You end up with a card that's unstable out of the box.
Complete rubbish; nobody is complaining about instability out of the box. Rather, there are complaints about buggy tweaking tools. Enthusiasts quickly found that the Radeon VII can be aggressively undervolted at default clock, and in fact can be overclocked while undervolted. [reddit.com]
Re: (Score:2)
As for the RX580 being top selling, I'm pretty sure that's due to miners.
You can be pretty sure you're wrong, confirmed by Steam's hardware survey that shows RX580 and other 500 series steadily increasing total installed share.
Think crypto mining (Score:2)
Re: (Score:2)
As for the RX580 being top selling, I'm pretty sure that's due to miners.
You can be pretty sure you're wrong, confirmed by Steam's hardware survey that shows RX580 and other 500 series steadily increasing total installed share.
Yes, as the miners now sell their used 580s to gamers who were essentially locked out due to inflated prices for over a year.
Re: (Score:2)
Re your mining theory, it probably explains why prices remain fairly high for Vega 56 and 64, and it might explain why the Radeon VII sold out in about two hours, when miners found out about the double-precision floating point performance (3.4 TFLOPS). I seriously doubt that miners are buying 500 series cards now, especially as there are many used ones on the market if they really do feel the need.
You know what miners really aren't buying? Nvidia cards. It really never made much sense, and now that profitability
Re: Given RX 570s are going for $130 (Score:2)
Steam says otherwise. Gamers gave the finger to AMD and bought the 1050 Ti over the much superior RX 570, 15 to 1!
The Steam hardware survey has Nvidia owning 85% of the market, with Intel and AMD fighting for the 15%. Gamers want GameWorks-optimized titles, and Jayz2cents and Linus Tech Tips whore for Nvidia with a vengeance, making commenters think you're an idiot to consider anything but Nvidia.
AMD is screwed as people are so brainwashed
570s were still being bought for mining (Score:2)
Meanwhile you could still get 1050s for $200 (crazy, since it was supposed to be a $120 card, but so be it). 1050 Tis were pushing $250, but again, you take what you can get when a bloody RX 570 is going for $350.
Re: (Score:2)
Steam hardware survey says 75% Nvidia, not 85%.
Re: (Score:2)
I have no problem with AMD, and own an integrated AMD GPU on my i7-8705g.
But I don't get why you so ravenously try to alter reality to make AMD look like rainbows come out of its ass.
Re: (Score:2)
AMD is screwed as people are so brainwashed
Producing buggy cards with barely working drivers is not consumers being brainwashed. Sometimes there's more to life than raw performance per dollar. That said AMD has gotten *MUCH* better since they first released the 5xx series.
Re: (Score:2)
Sounds like you're a bit out of touch. AMD drivers have been great for years, particularly on Linux. Open sourcing the mainstream driver was brilliant, and they got a lot of loyal customers for that.
Re: (Score:2)
NV drivers on Linux are vastly superior in performance and stability, and lack of quirkiness.
You want real fun? Try to get good performance on a dedicated AMD GPU on a laptop that shares an integrated Intel GPU.
DRI_PRIME=1. Come on. Bring back my Catalyst control panel, please.
Though the fact that AMD's drivers are open source is something I appreciate very much.
Re: (Score:2)
With two free games, and RX 580s readily available for $120 on eBay (just got one for $100), they're probably feeling a bit of pressure on the low end.
If you're not interested in gaming there's always a cheaper slower option on the market.
Pedantry (Score:1)
Since the Slashdot janitors don't edit, all I can do is rant:
offers a cut-down NVIDIA TU116 GPU comprised of 1408 CUDA cores
"Comprise" means "include," not "compose." "Comprise" implies these are all the things included, whereas "include" leaves open the possibility that you didn't list everything.
It's never correct to say "comprised of," any more than "included of"; that's just someone trying to look smart but instead babbling nonsense. If you want to say "composed of," say "composed of." If you want to use "comprise," use it just like you would "include."
So "offers a cu
Re: (Score:2)
Or cut all that bullshit and say 'With'.
So "offers a cut-down NVIDIA TU116 GPU with 1408 CUDA cores"
C'mon man, get with the times: if it don't fit in a tweet, people don't have the time or brains to pay attention to it.
Re: (Score:3)
Re:Pedantry (Score:4, Informative)
WRONG!
There's some jackass going around trying to convince everyone of that, and he's dedicated his life to eradicating all instances of "comprised of" from Wikipedia, and the shitty "news" articles that covered his efforts are almost assuredly why you "know" this "fact".
But that jackass is WRONG! The usage of "comprised of" is perfectly valid, and has been in standard usage for ages. It comes from the Latin comprehendere, and basically means to bring shit together (com) before (pre) taking it (hendere). Comprise means to collectively make up, form, or constitute.
3 books that comprise a volume are the 3 books comprising that volume, and that volume is comprised of (or by) those 3 books.
The only thing you are even close to correct on is the idea that "com" may imply completeness, as in "complete". But you're still wrong because "complete" itself refers to the fucking groups of soldiers that absolutely did have things not included. When 10 guys die or are incapacitated you would complete your unit by adding more from your slaves / subjects that weren't initially included. Hell, a unit of soldiers is also known as a "complement". Complete doesn't mean everything is included, but that nothing necessary is missing. Thus a GPU "comprised of" 1408 CUDA cores is perfectly valid as long as they didn't sell it as a GPU that should have more CUDA cores. They have different SKUs for that.
Re: (Score:1)
https://duckduckgo.com/?q=comp... [duckduckgo.com]
Every result I see says I'm right. I see no one arguing the other way.
Re: (Score:2)
Not wrong, but no one cares either.
The Oxford dictionary specifically calls out that "comprised of" in this sense is common in the English language but also classically and grammatically incorrect.
This usage is part of standard English, but the construction comprise of, as in the property comprises of bedroom, bathroom, and kitchen, is regarded as incorrect.
Re: (Score:3)
WRONG!
There's some jackass going around trying to convince everyone of that, and he's dedicated his life to eradicating all instances of "comprised of" from Wikipedia, and the shitty "news" articles that covered his efforts are almost assuredly why you "know" this "fact".
But that jackass is WRONG! The usage of "comprised of" is perfectly valid, and has been in standard usage for ages. It comes from the Latin comprehendere, and basically means to bring shit together (com) before (pre) taking it (hendere). Comprise means to collectively make up, form, or constitute.
3 books that comprise a volume are the 3 books comprising that volume, and that volume is comprised of (or by) those 3 books.
The thing I've discovered about pedants is that those who are most pedantic about something tend to be the ones who know the least about that subject. English language pedants doubly so. Over here a lot of people get hot under the collar if you say "can I get" despite it being perfectly cromulent. Same with using literally as hyperbole. Even the Oxford English Dictionary now literally lists the hyperbolic definition of literally.
People who actually know a lot about language (or other subjects) tend to be
Re: (Score:1)
It hurts me to say it, but you are absolutely correct, and I endorse your rant.
Re: (Score:3)
The Oxford Dictionaries online dictionary regards the passive form "comprised of" as standard English usage [wikipedia.org]
Re: (Score:2)
I want ... it NOT to run at 95c and use 300 watts please AMD!
I'm afraid those days are done. We're pretty much at the end of the road for die shrinks. There are some who are hopeful that we'll make it to 5nm, and some Cthulhu dreamers thinking we'll get to 3.5nm in a few more years.
Welcome to the end of the road.
Next up are multi-layer dies, which have the same surface area but 2-4 layers of transistors -- meaning 2-4x the heat to transfer out of the same contact patch.
You thought heat was a problem before? Hold my beer.
Re: (Score:2)
They're going to have to figure out a way to reduce voltage, then...
Re: (Score:2)
Wait. For. Navi.
Anyone buying a new GPU now is a fool! Navi will be out, and it'll likely disappoint, but it'll at least be a 7nm GPU.
Nvidia will likely trot out their 7nm offerings in the fall. These will blow anything AMD has out of the water, but Nvidia will probably charge way too much for them, so some of the Navi parts may be better for a given price point.
Note that AMD already released 7nm GPUs in the compute-focused Vega cards and their Radeon VII rebrand. This is Vega. Vega is shit. It is hot,
Re: I wish AMD would release new cards (Score:2)
Nope it doesn't match the 2080ti so gamers will assume it sucks for all editions like always. Nvidia has an incredible mindshare and cult following if you go on YouTube reviewer sites like Jayz2cents
Re: (Score:3)
but it'll at least be a 7nm GPU.
I'll take irrelevant shit for $100 Jim!
I agree with waiting for Navi to see what the competition brings to the table and at what price. But there are few things I could give less of a shit about than the transistor gate size on the die itself. Tell me the performance, the price, and whether it's possible for it to not sound like a vacuum cleaner; those are the only things that come into any buying decision.
Oh, and whether it has RGB lighting, because you know you can't build a computer in 2018 that doesn't look like a 70s disco on t
Re: (Score:2)
15% boost? sounds pretty underwhelming otherwise.
Welcome to the silicon wall. It took Nvidia 36 months and roughly 50% more transistors to get that 15% of performance.
The days of doubling your performance every couple of years are done.
Re: (Score:2)
Nah, AMD let Nvidia sit on their ass that long, and all Nvidia had to do was rebrand their machine learning cards. They didn't even design and fab something new based on that architecture. They just gave people cut down chips and said the tensor cores would run raytracing (via Nvidia's proprietary, game-crippling middleware RTX) and shitty AI upscaling that's worse than regular upscaling (DLSS).
Re: (Score:2)
Welcome to the silicon wall. It took Nvidia 36 months and roughly 50% more transistors to get that 15% of performance. The days of doubling your performance every couple of years are done.
Then Nvidia crippled it with 2/3rds the memory bandwidth of the 1660 Ti, so it's actually capable of considerably more. As for the long term, the human brain does everything it does in 15-20W. Maybe we can't do it with die shrinks, but something tells me there's still a ton of potential in low-power computing.
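The "2/3rds" figure checks out against commonly published numbers (assumed here, not given in the comment): roughly 192GB/s for the GTX 1660's 8Gbps GDDR5 versus 288GB/s for the 1660 Ti's 12Gbps GDDR6, both on a 192-bit bus:

```python
# Assumed published peak bandwidths in GB/s (not stated in the comment above).
gtx_1660_bw = 192.1     # 8 Gbps GDDR5, 192-bit bus
gtx_1660_ti_bw = 288.0  # 12 Gbps GDDR6, 192-bit bus

ratio = gtx_1660_bw / gtx_1660_ti_bw
print(f"{ratio:.2f}")  # ~0.67, i.e. about two-thirds
```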
Re: (Score:1)
Re: (Score:2)
The RX 580 has retailed for under $200 for months now, which is why it's Amazon's top-selling card. The 1660 isn't going to displace the 580; rather, it's going to eviscerate the RTX line. Nvidia knows it, and that's why they needed the Mellanox distraction.
Re: (Score:1)
If you paid attention to the YouTube tech scene, dozens of videos were made about that point yesterday.
If you haven't noticed, creimer is a retard.
Fuck off.
Except for a lucky strike of course (Score:2)
What is this in terms of SETI@home or some other such thing? Every couple of years I can redo, in a few weeks, everything I've done over the previous decade and a half, making all that effort pointless. I suppose nobody should run any number crunching until the year 2525 and get everything from now until then done by February 2525.
15% Better Than 1060 (Score:2)
What the 1660 boils down to is ~15% more performance than a 1060 for the same price. Same amount of VRAM, also.
Get your act together, AMD, we're heading toward Intel-style 7% gains per GPU generation.
Re: (Score:2)
AMD is eating Nvidia's lunch on the low end with the integrated Vega CPUs. A 2400G is about equivalent to a GT 1030 in performance.
You get a quad-core, 8-thread CPU, and it comes with the equivalent of a free 1030 built in, for $135? Makes a hard case for buying anything from Nvidia's low end.
I would be more interested (Score:2)
In seeing Nvidia come back to reality when it comes to their GPU pricing as of late.
Walked through a Fry's recently and saw an entire SHELF full of 2080 Ti cards (maybe 30+ units) at $1500 each.
I'm pretty sure the price is WHY the shelf was still full of them.
Similar to the lesson Apple had to learn with their overpriced iPhone X: there is a limit to what people are willing to pay for any given product.
AWESOME (Score:2)
This is awesome: another stupidly expensive video card that will be obsolete in 6 months. Woo hoo!
Okay, maybe it'll actually be obsolete in 3 months, but hey- for that 90-day window I'll have a video card that my friends won't geek-shame me over. I won't have to hang my head in shame because my video card doesn't have the latest GPU made from genuine imported yak kidneys or whatever.