
NVIDIA Launches New $219 Turing-Powered GeForce GTX 1660 (hothardware.com) 101

MojoKid writes: NVIDIA took the wraps off yet another lower-cost Turing-based graphics card today, dubbed the GeForce GTX 1660. For a $219 MSRP, the card offers a cut-down NVIDIA TU116 GPU comprised of 1408 CUDA cores with a 1785 MHz boost clock and 6GB of GDDR5 RAM with 192.1 GB/s of bandwidth. Generally speaking, the new GeForce GTX 1660 is 15% to 30% faster than NVIDIA's previous-generation GeForce GTX 1060, but it doesn't support the ray tracing and DLSS features that most of NVIDIA's other new Turing cards do. Performance-wise, the GeForce GTX 1660 is also generally faster than AMD's Radeon RX 590. Boards from various OEM partners should be in the channel for purchase this week.
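For readers who like to sanity-check spec sheets, here is a rough sketch of where numbers like these come from. The 192-bit memory bus and 8 Gbps effective data rate below are the commonly reported GTX 1660 memory specs rather than figures from the summary, so treat this as a back-of-the-envelope estimate, not official math:

    # Back-of-the-envelope GTX 1660 figures (assumed specs: 192-bit memory bus,
    # 8 Gbps effective GDDR5 data rate, 1408 CUDA cores, 1785 MHz boost clock).
    bus_width_bits = 192
    data_rate_gbps = 8.0                # effective Gbit/s per memory pin
    cuda_cores = 1408
    boost_clock_hz = 1785e6

    bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8        # bits -> bytes
    fp32_tflops = cuda_cores * boost_clock_hz * 2 / 1e12        # 2 FLOPs/core/clock (FMA)

    print(f"Peak memory bandwidth: ~{bandwidth_gb_s:.0f} GB/s")  # ~192 GB/s
    print(f"Peak FP32 throughput:  ~{fp32_tflops:.1f} TFLOPS")   # ~5.0 TFLOPS

Both results line up with the ~192 GB/s bandwidth quoted above and the roughly 5 TFLOPS of single-precision compute usually cited for this card.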
This discussion has been archived. No new comments can be posted.

  • Adding a new step to the GPU process, not only do you have to feed the card textures and shaders, now you ALSO have to feed it arbitrary state machines to compute in order to provide the power it needs to perform.

  • by rsilvergun ( 571051 ) on Thursday March 14, 2019 @04:54PM (#58274866)
    With two free games bundled and RX 580s readily available for $120 on eBay (just got one for $100), they're probably feeling a bit of pressure on the low end.

    Also, to my shock and awe, AMD works now. Longtime PC gamers will remember a period of almost 5 years when their GPU drivers were a disaster. I've been gaming on mine for 2 weeks now with zero crashes (knock on wood). The only downside is power consumption: it pulls about 80 watts more than a GTX 1060. But at $100 it's hard to complain.
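    As a rough aside on that 80-watt gap, the running cost depends entirely on usage and local rates; the 3 hours per day of load and $0.13/kWh below are illustrative assumptions, not figures from the comment:

        # Rough yearly cost of an extra 80 W of draw while gaming.
        # Assumptions (not from the comment): 3 hours of load per day, $0.13 per kWh.
        extra_watts = 80
        hours_per_day = 3
        dollars_per_kwh = 0.13

        extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
        extra_cost_per_year = extra_kwh_per_year * dollars_per_kwh
        print(f"~{extra_kwh_per_year:.0f} kWh/year extra, roughly ${extra_cost_per_year:.0f}/year")
        # ~88 kWh and ~$11 per year -- small next to the difference in purchase price.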
    • The RX 580 has been the top-selling card on Amazon for a long time; it's hard to argue with the value. But otherwise AMD is barely hanging onto its roughly 17% add-in GPU share. The Radeon VII is kickass, but who knows when supply is coming back, and not everybody wants or needs a high-end card. There is a lot of Nvidia hate going around, and from where I sit it's richly deserved. I'm in that camp myself; I'd rather eat a turd than give money to Nvidia. Seems like I've got lots of company there.

      • mostly that they're pushing them too hard to hit competitive numbers. You end up with a card that's unstable out of the box. You can tweak voltages and such until the card is stable, and you'll get the performance that was promised on the tin, but it just feels like if I'm blowing $700 on a GPU I shouldn't have to.

        As for the RX580 being top selling, I'm pretty sure that's due to miners. Miners made AMD cards rare as hen's teeth right when they fixed their stability issues, killing their market share in P
        • mostly that they're pushing them too hard to hit competitive numbers. You end up with a card that's unstable out of the box.

          Complete rubbish; nobody is complaining about cards being unstable out of the box. Rather, there are complaints about buggy tweaking tools. Enthusiasts quickly found that the Radeon VII can be aggressively undervolted at its default clock, and in fact can be overclocked while undervolted. [reddit.com]

        • As for the RX580 being top selling, I'm pretty sure that's due to miners.

          You can be pretty sure you're wrong; Steam's hardware survey confirms it, showing the RX580 and other 500-series cards steadily increasing their total installed share.

          • 580s have sold like hot cakes, but it's only the last 3-4 months they've been available to gamers.
          • by drnb ( 2434720 )

            As for the RX580 being top selling, I'm pretty sure that's due to miners.

            You can be pretty sure you're wrong; Steam's hardware survey confirms it, showing the RX580 and other 500-series cards steadily increasing their total installed share.

            Yes, as the miners now sell their used 580s to gamers who were essentially locked out due to inflated prices for over a year.

        • Re your mining theory: it probably explains why prices remain fairly high for the Vega 56 and 64, and it might explain why the Radeon VII sold out in about two hours, when miners found out about its double-precision floating point performance (3.4 TFLOPS). I seriously doubt that miners are buying 500-series cards now, especially as there are many used ones on the market if they really do feel the need.

          You know what miners really aren't buying? Nvidia cards. It really never made much sense, and now that profitability
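          For context on that 3.4 TFLOPS figure: the Radeon VII runs double precision at a quarter of its single-precision rate, so the number falls out of the shader count and clock. The 3840 stream processors and ~1.75 GHz sustained clock below are the commonly cited Radeon VII specs, used here only as assumptions for a rough sketch:

            # Rough Radeon VII double-precision estimate (assumed specs: 3840 stream
            # processors, ~1.75 GHz sustained clock, FP64 at 1/4 the FP32 rate).
            stream_processors = 3840
            clock_hz = 1.75e9
            fp64_ratio = 1 / 4

            fp32_tflops = stream_processors * clock_hz * 2 / 1e12   # 2 FLOPs/clock (FMA)
            fp64_tflops = fp32_tflops * fp64_ratio
            print(f"FP32: ~{fp32_tflops:.1f} TFLOPS, FP64: ~{fp64_tflops:.1f} TFLOPS")
            # ~13.4 TFLOPS FP32 and ~3.4 TFLOPS FP64, matching the figure above.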

      • Steam says otherwise. Gamers gave the finger to AMD and bought the 1050 Ti over the much superior RX 570, 15 to 1!

        The Steam hardware survey has Nvidia owning 85% of the market, with Intel and AMD fighting over the remaining 15%. Gamers want GameWorks-optimized titles, and Jay2cents and Linus Tech Tips whore for Nvidia with a vengeance, making commenters think you're an idiot to consider anything but Nvidia.

        AMD is screwed as people are so brainwashed

        • The miners didn't get scared off until the 550, which is pretty useless (it's a bit faster than integrated AMD graphics). Yeah, even 560s had shot up in price.

          Meanwhile you could still get 1050s for $200 (crazy, since it was supposed to be a $120 card, but so be it). 1050 Tis were pushing $250, but again, you take what you can get when a bloody RX 570 is going for $350.
        • Steam hardware survey says 75% Nvidia, not 85%.

          • Sure, but it also says more people are using their integrated Intel chipset than are using the first AMD chip to show up on the list, 17 entries down.
            I have no problem with AMD, and own an integrated AMD GPU on my i7-8705g.
            But I don't get why you so ravenously try to alter reality to make AMD look like rainbows come out of its ass.
        • AMD is screwed as people are so brainwashed

          Producing buggy cards with barely working drivers is not consumers being brainwashed. Sometimes there's more to life than raw performance per dollar. That said, AMD has gotten *MUCH* better since they first released the 5xx series.

    • With two free games bundled and RX 580s readily available for $120 on eBay (just got one for $100), they're probably feeling a bit of pressure on the low end.

      If you're not interested in gaming, there's always a cheaper, slower option on the market.

  • Since the Slashdot janitors don't edit, all I can do is rant:

    offers a cut-down NVIDIA TU116 GPU comprised of 1408 CUDA cores

    "Comprise" means "include" not "compose". "Comprise" implies these are all the things included, whereas "include" leaves open the possibility that you didn't list everything.

    It's never correct to say "comprised of" any more than "included of"; that's just someone trying to look smart but instead babbling nonsense. If you want to say "composed of", say "composed of". If you want to use "comprise", use it just like you would "include".

    So "offers a cu

    • Or cut all that bullshit and say 'With'.
      So "offers a cut-down NVIDIA TU116 GPU with 1408 CUDA cores"

      Cmon man get with the times if it don't fit in a tweet people don't have the time or brain to pay attention to it.

    • by Xenx ( 2211586 )
      While there is contention among language professionals, "comprised of" is considered standard English usage by multiple well-regarded sources. Further, it is commonplace in both writing and speech. As the intended audience is the general public, and not just language professionals, it makes sense for them to use it as they have.
    • Re:Pedantry (Score:4, Informative)

      by sexconker ( 1179573 ) on Thursday March 14, 2019 @05:42PM (#58275052)

      WRONG!

      There's some jackass going around trying to convince everyone of that, and he's dedicated his life to eradicating all instances of "comprised of" from Wikipedia, and the shitty "news" articles that covered his efforts are almost assuredly why you "know" this "fact".

      But that jackass is WRONG! The usage of "comprised of" is perfectly valid, and has been in standard usage for ages. It comes from the Latin comprehendere, and basically means to bring shit together (com) before (pre) taking it (hendere). Comprise means to collectively make up, form, or constitute.

      3 books that comprise a volume are the 3 books comprising that volume, and that volume is comprised of (or by) those 3 books.

      The only thing you are even close to correct on is the idea that "com" may imply completeness, as in "complete". But you're still wrong because "complete" itself refers to the fucking groups of soldiers that absolutely did have things not included. When 10 guys die or are incapacitated you would complete your unit by adding more from your slaves / subjects that weren't initially included. Hell, a unit of soldiers is also known as a "complement". Complete doesn't mean everything is included, but that nothing necessary is missing. Thus a GPU "comprised of" 1408 CUDA cores is perfectly valid as long as they didn't sell it as a GPU that should have more CUDA cores. They have different SKUs for that.

      • by lgw ( 121541 )

        https://duckduckgo.com/?q=comp... [duckduckgo.com]

        Every result I see says I'm right. I see no one arguing the other way.

      • Not wrong, but no one cares either.

        The Oxford dictionary specifically calls out that "comprised of" in this sense is common in the English language but also classically and grammatically incorrect:
        This usage is part of standard English, but the construction comprise of, as in the property comprises of bedroom, bathroom, and kitchen, is regarded as incorrect.

      • by mjwx ( 966435 )

        WRONG!

        There's some jackass going around trying to convince everyone of that, and he's dedicated his life to eradicating all instances of "comprised of" from Wikipedia, and the shitty "news" articles that covered his efforts are almost assuredly why you "know" this "fact".

        But that jackass is WRONG! The usage of "comprised of" is perfectly valid, and has been in standard usage for ages. It comes from the Latin comprehendere, and basically means to bring shit together (com) before (pre) taking it (hendere). Comprise means to collectively make up, form, or constitute.

        3 books that comprise a volume are the 3 books comprising that volume, and that volume is comprised of (or by) those 3 books.

        The thing I've discovered about pedants is that those who are most pedantic about something tend to be the ones who know the least about that subject. English language pedants doubly so. Over here a lot of people get hot under the collar if you say "can I get" despite it being perfectly cromulent. Same with using literally as hyperbole. Even the Oxford English Dictionary now literally lists the hyperbolic definition of literally.

        People who actually know a lot about language (or other subjects) tend to be

    • "Comprise" means "include" not "compose". "Comprise" implies these are all the things included, whereas "include" leaves open the possibility that you didn't list everything.

      It's never correct to say "comprised of" any more than "included of"; that's just someone trying to look smart but instead babbling nonsense. If you want to say "composed of", say "composed of". If you want to use "comprise", use it just like you would "include".

      It hurts me to say it, but you are absolutely correct, and I endorse your rant.

  • What is this in terms of SETI@home or some other such project? Every couple of years, new hardware can redo in a few weeks everything I've crunched over the previous decade and a half, making all that effort pointless. I suppose nobody should run any number crunching until the year 2525 and then get everything from now until then done by February 2525.

  • What the 1660 boils down to is ~15% more performance than a 1060 for the same price. Same amount of VRAM, also.
    Get your act together, AMD, we're heading toward Intel-style 7% gains per GPU generation.
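    To put those percentages in perspective, here is a quick compound-growth sketch; the 15% and 7% figures come from the comment above, and the rest is just arithmetic:

        import math

        # How many GPU generations does it take to double performance at a given
        # per-generation improvement? Compound growth: 2 = (1 + gain) ** n.
        def generations_to_double(gain_per_gen):
            return math.log(2) / math.log(1 + gain_per_gen)

        for gain in (0.15, 0.07):   # 15% (1060 -> 1660) vs. "Intel-style" 7%
            print(f"{gain:.0%} per generation -> "
                  f"{generations_to_double(gain):.1f} generations to double")

    At 15% per generation it takes about five generations to double performance; at 7% it takes more than ten.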

  • It's nice to see Nvidia come back to reality when it comes to their GPU pricing as of late.

    Walked through a Fry's recently and saw an entire SHELF full of 2080 Ti cards (maybe 30+ units) at $1500 each.
    I'm pretty sure the price is WHY the shelf was still full of them.

    Similar to the lesson Apple had to learn with their overpriced iPhone X, there is a limit to what people are willing to pay for any given product.

  • This is awesome: another stupidly expensive video card that will be obsolete in 6 months. Woo hoo!

    Okay, maybe it'll actually be obsolete in 3 months, but hey, for that 90-day window I'll have a video card that my friends won't geek-shame me over. I won't have to hang my head in shame because my video card doesn't have the latest GPU made from genuine imported yak kidneys or whatever.
