
Nvidia Announces 192-Core Tegra K1 Chips, Bets On Android

Posted by samzenpus
from the get-it-while-it's-hot dept.
sfcrazy writes "Nvidia just announced Tegra K1, its first 192-core processor. NVIDIA CEO Jen-Hsun Huang made the announcement at CES 2014. He also said that Android will be the most important platform for gaming consoles. 'Why wouldn't you want to be open, connected to your TV and have access to your photos, music, gaming. It's just a matter of time before Android disrupts game consoles,' said Huang." Nvidia's marketing department created a crop circle to promote the chip after CEO Jen-Hsun Huang declared that it was so advanced that "it's practically built by aliens."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by mrchaotica (681592) * on Monday January 06, 2014 @10:36AM (#45877621)

    Nvidia's just saying that because they lost the bid for all the consoles.

    (It doesn't mean it's not true, though.)

    • by UnknowingFool (672806) on Monday January 06, 2014 @10:44AM (#45877661)
Well, maybe, but there's also the trend that most people are playing games on smartphones, not consoles. For everyone who bought a new game for the PS4 or Xbox One, there are probably 10x as many consumers who bought Candy Crush Saga. Not everyone wants to spend hours in front of a TV or monitor playing games. Some people just want a bit of downtime between doing other things.
      • by Kjella (173770)

Recently I've switched to driving, but before that it used to be the bus, tram or subway, and that's perfect downtime to try catching a star in Angry Birds or whatever. Mobile gaming is usable in a lot of places consoles could never reach.

      • You can't take your console into another room while your dad watches a football game. You can't take your console for a drive. And the interface for surfing the web or posting to Twitter on your console usually sucks.

I thought about getting a PlayStation 4, but I think I'm going to save up and get a relatively high-end Android hybrid tablet with an attachable keyboard, like the next-generation equivalent of the ASUS Transformer TF701T (or whatever the designation is). My kids and I can use it as a tablet.
        • by aliquis (678370)

          Kjella above you:

Recently I've switched to driving, but before that it used to be the bus, tram or subway, and that's perfect downtime to try catching a star in Angry Birds or whatever. Mobile gaming is usable in a lot of places consoles could never reach.

          You:

          You can't take your console into another room while your dad watches a football game. You can't take your console for a drive. And the interface for surfing the web or posting to Twitter on your console usually sucks.

I still don't think they're even comparable products. I have a DS. I rarely sit down on the couch to play a game on it. I did play on the toilet, even if the visit could then stretch closer to an hour. For some games with simpler content, maybe half an hour.

The thing is, you wouldn't get very far in Fallout on the bus even if the trip took 30 minutes. You wouldn't shoot very accurately in some shooting game either. You wouldn't get to play Zelda until you had accomplished whatever you wanted to finish.

          • I'm mostly thinking in terms of having a family. If I get a gaming console, that just increases the fighting for use of the television. It's also a tremendous help for when we go on long drives to visit family members.

            I don't type much on the tablet touch-screen except for searches through Netflix movie lists and lists of books I own. If I want to get any serious work done with one, I would use an external monitor and bluetooth keyboard and mouse.
        • by aliquis (678370)

          Or put it otherwise:

Yeah, sure, you can play games on your phone on the bus, on the toilet, sitting in bed before going to sleep, or when some better screen & device is occupied. You can't use a PC for the same purpose.

But on the other hand, you can easily type text on the PC, hold many more tabs in your browser, code things, more easily adjust your photos, store more files, and play stuff like Battlefield 4, Civilization V, Starcraft II and such, which may not work that great on your phone.

          • I wrote basically the same response to this above - I'm thinking in terms of my kids. The television already does streaming video and DVDs, adding a gaming console will just increase the intensity of the fights over who gets to pick the current show. A tablet will cause a temporary increase in fights just because it's the newest toy when we first get it, but after the novelty wears off the kids will just take turns between the television and tablet.

            And there's also the long drives we take once a month
      • by aliquis (678370)

        But there's more to it.

        One could question what the profits are per chip of these vs the profits on each GTX 780. But then one would also have to consider the R&D and production cost of each.

For me, I'm currently buying my share of the Humble Bundles, but the Android titles seem so über-shitty, shallow and ugly compared to the PC titles that the Android games have little pull on me right now. Some portable titles may have more game content, but maybe those are for the PSP, PS Vita, DS and 3DS at the moment.

        • One could question what the profits are per chip of these vs the profits on each GTX 780. But then one would also have to consider the R&D and production cost of each.

          First of all a gaming console would not have a GTX780. It would have a customized, somewhat cheaper and less powerful chip. But from the R&D and production view, as a manufacturer, it might be worth it to sell cheaper mobile GPUs but many more of them than bigger console GPUs. Every year people buy hundreds of millions of smartphones. This is easily more than both the lifetime sales of PS3 and Xbox 360 combined.

Maybe some people play and like Candy Crush Saga. I would never want to play it. I guess it fits in well with the Humble Bundle Android titles I mentioned: pure complete shit, limited by the crap you hold in your hand (which includes the whole interface issue). That can be fixed by external analogue joysticks and such, but to put them to their best use they will have to be required by the game, so the game can be designed for them, and that will be more interesting for developers if more people have them. Then again, if you're going to snap your phone into something that makes it similar to a PSP, you lose some of that portability. It may still work at home, but there you've got more options.

This is pure gaming snobbery. Yes, your console or PC game is much more intricate and complex.

          • by aliquis (678370)

            Yeah, sorry about the GTX 780 vs console part. That was just me comparing portable gaming vs non-portable gaming and I'm not interested in a console so I picked a graphic card instead for my example.

In reality the R&D is connected, since Nvidia likely uses a lot of knowledge from its desktop GPU business in designing the new mobile chips; the same goes for the knowledge and equipment used in actual chip production, and if a similar process is used to make the chips, the same is of course true there.

Well, maybe, but there's also the trend that most people are playing games on smartphones, not consoles. For everyone who bought a new game for the PS4 or Xbox One, there are probably 10x as many consumers who bought Candy Crush Saga. Not everyone wants to spend hours in front of a TV or monitor playing games. Some people just want a bit of downtime between doing other things.

        Ehhhh. That's not the reason behind this.

        You're right, mobile gaming is huge. But Candy Crush Saga doesn't require a Kepler GPU. There aren't many popular mobile games, tablet or otherwise, crying out for more horsepower at this point.

NVidia has a more basic problem. As the grandparent post noted, their customers are drying up. The industry has pretty much agreed that the IGP is the future. IGPs deliver extremely fast compute performance and lower power usage. Both the Xbox One and PS4 use a single chip for graphics and CPU.

      • by exomondo (1725132)

Well, maybe, but there's also the trend that most people are playing games on smartphones, not consoles. For everyone who bought a new game for the PS4 or Xbox One, there are probably 10x as many consumers who bought Candy Crush Saga.

It really is a completely different type of game; smartphone games are more time-wasters than immersive experiences, and they typically don't need much in the way of computing power. People aren't going to play Candy Crush or Doodle Jump on their Xbox, just as they aren't going to play Skyrim or Metal Gear Solid or Diablo on their smartphone. There are a myriad of Android consoles that you could use to play all those Android games on your TV, but who wants to do that?

      • by twocows (1216842)
And some people don't. The game industry was plenty big before casual gaming was a thing, and it will still be around once that fad dies. Console/desktop PC gaming will be here for a long time to come, no matter what Nvidia's marketing department says.
    • Right, the Ouya proves his point.

      NOT!!!!!
      • I'm pretty sure the millions of phones and tablets people are using to play facebook games instead of PCs and consoles is proving his point.

There were also a fairly large number of Ouyas sold and shipped from their Kickstarter. I think its downfall is that now that it's out in the wild, there's a very small selection of games to play on it. I've bought a couple of games on mine and, entertainment-wise, it's just as good as my PS3 or Wii; there just isn't much on it I'm interested in.
        • Every time I get excited about a really nice game on my tablet or phone, I start wishing it was on a bigger screen with a real controller instead.
          There are very few exceptions.
          Sure, that doesn't eliminate something like an Android-powered set-top device but the games aren't usually made to work well in that configuration.

    • by tlhIngan (30335)

      Nvidia's just saying that because they lost the bid for all the consoles.

      (It doesn't mean it's not true, though.)

Maybe, but unlikely - I think they deliberately spurned consoles after what happened with the original Xbox and the PS3: they ended up getting screwed badly by Microsoft and Sony respectively, and they didn't want that happening again.

AMD, though, needs the business (it already supplied the GPUs for both the Wii and the Xbox 360).

  • Am I the only person that read the headline and thought CPU? Misled?

    • The CPU in this has four 32-bit 2.3GHz Cortex A15 cores. A model will come out later with two 64-bit 2.5GHz "Denver" cores -- a CPU of NVidia's own making which they haven't released many details about but their benchmarks show as significantly faster.

      When I saw them marketing it as 192 cores I let out a sigh... because these kind of dumb tactics are so expected now.

      • by Z00L00K (682162)

        It would have been a lot more interesting if it actually was 192 CPU cores in it. Of course it would be a bit of a challenge to code for it - and to get an efficient OS build for it. But on the other hand it's probably the way that we need to go in order to get more performance in the future.

32-bit, so games will be capped at about 2.5 GB of RAM and 1 GB of video RAM?

32-bit, so games will be capped at about 2.5 GB of RAM and 1 GB of video RAM?

          What phone has more than that?

        • by Bert64 (520050)

          Sure, why not?
          That's considerably more than an xbox 360 or ps3, and people are more than happy to play games on those.

        • by Anonymous Coward

32-bit means only that a single process sees a 32-bit virtual address space. The underlying A15 hardware supports 40-bit physical addresses, so the OS can deal with up to 1TB of RAM if it wanted to. The only restriction is that each process can only see 4GB at a time. The GPU frame-buffer is not mapped into the game process's address space (parts of it may be mapped into the graphics driver's address space), so your app/game can use 4GB of RAM (more or less).

          http://www.arm.com/files/downloads/ARMv8_white_pap
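The arithmetic behind those two limits is just powers of two; a quick sketch (the 32-bit and 40-bit widths are the figures from the comment above):

```python
# Address-space arithmetic: a 32-bit virtual address reaches 2**32 bytes
# per process, while the Cortex-A15's 40-bit physical addressing (LPAE)
# lets the OS manage up to 2**40 bytes of RAM.
def addressable_bytes(bits: int) -> int:
    """Bytes reachable with an address of the given width."""
    return 2 ** bits

GiB = 2 ** 30
TiB = 2 ** 40

virtual_32 = addressable_bytes(32)   # per-process virtual space
physical_40 = addressable_bytes(40)  # physical space with LPAE

print(virtual_32 // GiB, "GiB per 32-bit process")   # 4 GiB
print(physical_40 // TiB, "TiB physical with LPAE")  # 1 TiB
```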

      • by Anonymous Coward

Those 'cores' are far more powerful processors than anything we used in the pre-80486 age. They are NOT usually programmed like a traditional CPU, since such a paradigm would give LOUSY performance (see the world's most disastrous CPU/GPU project - Larrabee - for a textbook example of why you NEVER EVER simply build loads of what a very very dumb person considers 'proper' cores).

You, PhrostyMcByte, are the kind of cretin that causes Chinese companies to stuff EIGHT useless CPU cores into their newest phones.

    • I was misled too. The useless headline is using the same pointless info nVidia presented, because they can't use "CUDA cores".
  • Have to remember that one for the next time I present a design or an engineering proposition to some pointy-haired bosses. Ha !
I definitely thought the crop circle was manmade, given the design and the reports that a group of people had been in the area. I thought it was more of an independent attention-whore art prank, though; nor did any bells go off in my head that the Braille "192" meant a 192-core processor (though by appearance it was obviously a processor or circuit board).

    • by Threni (635302)

      "I definitely thought the crop circle was manmade, given...." ...that they all are?

  • ... Why would Sony and Microsoft be open? What interest would they have in using a competitor's OS? And creating a completely NEW platform would still be very hard...
  • by Anarchduke (1551707) on Monday January 06, 2014 @10:59AM (#45877771)
    Why didn't they give this the code name Roswell?
    • by Anonymous Coward

      Why didn't they give this the code name Roswell?

      Wasn't that the place where the thing made by aliens allegedly crashed and burned?

    • by Xest (935314)

      Wrong sort of aliens, they mean the illegal kind. You know, to cut costs.

  • Difficult to see. Always in motion is the future
  • by Anonymous Coward on Monday January 06, 2014 @11:06AM (#45877827)

    Better be careful with statements like "Practically built by aliens". Nvidia might be getting a visit from immigration control to make sure their aliens are not illegal.

  • When can I have it in my phone?
  • First? (Score:5, Insightful)

    by gman003 (1693318) on Monday January 06, 2014 @11:47AM (#45878237)

    To begin with, the summary and headline are being misleading - that's 192 GPU "cores" (really ALUs - there's only one scheduler on this entire GPU), so it's already inaccurate. But it's also hardly the first Nvidia chip with 192 "cores".

First Tegra with a 192-core GPU, but it's not their first 192-core GPU. Their first was the GeForce GTX 260, followed by the GeForce GTS 450, GTX 550 Ti, GT 630, and GT 635.

    In fact, this is basically a GT630 with a smaller memory interface (64-bit LPDDR3 instead of 128-bit DDR3) and a few power optimizations.

    The sad thing is, they don't have to make up bullshit for marketing - they're bringing a full-fledged, full-featured GPU to mobile products, with all the modern features that entails. And even with just one SMX at low clocks, that's still a lot of horsepower - I run Crysis at 1080p on high with just two SMX units (660M). Putting that amount of power into a tablet would be impressive on its own, no lying about "cores" necessary.

    • Re:First? (Score:4, Informative)

      by Nemyst (1383049) on Monday January 06, 2014 @12:55PM (#45879011) Homepage
      The problem is, you needed an entire paragraph to get your point across (and that's assuming people know what Crysis is, which is fine on PC but not so much on mobile). They need to make it sound impressive in five words or less, otherwise the fickle market has already turned their collective head elsewhere.
    • by aliquis (678370)

While that may sound cool to someone who doesn't know their stuff, where does that put it relative to the rest of the market?

      GT 630: 311 GFLOPS (is that the one you mean? http://en.wikipedia.org/wiki/GeForce_600_Series [wikipedia.org])

      Xbox 360: 240 GFLOPS (http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4/2)
      WiiU: 352 GFLOPS (? http://geekermagazine.com/xbox-one-vs-ps4-vs-wii-u/ [geekermagazine.com])
Xbox One: 1.23 TFLOPS (http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4/2)
Playstation 4: 1.84 TFLOPS
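For context, peak FP32 figures like these follow from the standard formula of cores × 2 FLOPs per cycle (one fused multiply-add) × shader clock; a quick sketch, where the clock values are approximations assumed here rather than figures from the thread:

```python
# Theoretical peak FP32 throughput: cores * 2 FLOPs/cycle (one FMA) * clock.
def peak_gflops(cores: int, clock_ghz: float) -> float:
    """Peak single-precision GFLOPS for a given core count and clock."""
    return cores * 2 * clock_ghz

# Approximate shader clocks (assumed values):
print(round(peak_gflops(96, 1.620)))   # GT 630 (GF108-style): ~311 GFLOPS
print(round(peak_gflops(192, 0.951)))  # Tegra K1: ~365 GFLOPS
print(round(peak_gflops(384, 0.725)))  # GeForce GT 730M: ~557 GFLOPS
```

The GT 630 result lines up with the 311 GFLOPS figure quoted above, which suggests the marketing numbers are just this formula applied to the CUDA-core count.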

  • I was about to ask the potential scrypt mining power of this thing relative to its cost and power requirements and then I realized it's nVidia.

    Forget about it.

A 5W chip producing 360 GFLOPS. To put that into perspective, a GeForce 730M, which has the same architecture but twice as many cores and is rated at 556.8 GFLOPS, is a 33W part.
So basically, Nvidia has made Kepler 4 times more efficient with no architecture changes. What magic did they use?
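The "4 times" figure is simple performance-per-watt arithmetic on the numbers quoted in that comment:

```python
# Performance per watt, using the GFLOPS and TDP figures quoted above.
def gflops_per_watt(gflops: float, watts: float) -> float:
    return gflops / watts

k1 = gflops_per_watt(360.0, 5.0)       # Tegra K1: 72.0 GFLOPS/W
gt730m = gflops_per_watt(556.8, 33.0)  # GeForce 730M: ~16.9 GFLOPS/W

print(round(k1 / gt730m, 1))  # ~4.3x efficiency ratio
```

Of course this assumes the rated TDPs and peak GFLOPS are directly comparable, which is exactly the claim the comment is questioning.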

    • by fredan (54788)

Butterfly Labs and Josh "Inaba" Zerlan.

He's the guy who can fix anything in just two weeks (tm).

BFL is known for their accurate power calculations when designing their products. That's why Nvidia hired them.

"Most of us, when all is said and done, like what we like and make up reasons for it afterwards." -- Soren F. Petersen

Working...