Nvidia Demos 'Kal-El' Quad-Core Tegra Mobile CPU

MojoKid writes "Nvidia just took the wraps off their first quad-core Tegra mobile processor design at Mobile World Congress today, and it's a sight to behold. Dubbed Kal-El, the new chip will be capable of outputting 1440p video content and will offer 300 DPI on devices with a 10.1" display. Nvidia is claiming that Kal-El will deliver 5x the performance of Tegra 2 and will ship with a 12-core GeForce GPU as well. The company has also posted two different videos of Kal-El in action."

  • Slated for the next four years are "Wayne", "Logan", "Powdered Toast Man", and "The Tick".
    • In true Tick fashion, even on Slashdot he doesn't get any respect. Here he gets listed after a hero that farts toast and burns the US Constitution for warmth.
    • by gl4ss ( 559668 )

      LOG LOG EVERYBODY NEEDS A LOG LOG LOG IT'S BETTER THAN BAD IT'S GOOD. Yes, log. All nations love Log. So, hurry now
      to your local store and be the first in your country to have the International Log.

      But really, what they're promising here is taking the old dual-core chip, doubling everything, and then selling those specs as the real deal.

      What happens to power use? Is this faster than building a system with two Tegras?

      • Can't build a system with two Tegras, as they're not CPUs, they're SoCs - it'd be two systems on the same motherboard. (Which would be interesting for some server applications, but still...)

        • You could put a fast interconnect between them and run some SSI cluster OS. As long as you were careful with the scheduling, they'd look like one NUMA machine with 8 cores and two GPUs.
          • I'm assuming we were talking about Tegra 2s, so it'd look like one NUMA machine with 4 cores and two 4-"core" GPUs, versus the Kal-El chip, which can run a normal OS, has 4 cores, and runs a 12-"core" GPU.

    • Slated for the next four years are ....

      Don't count your weasels before they pop, dink.

      -- The Tick

    • Let's hope this doesn't become a Blur
  • Finally, we get our high-resolution screens?

    Too bad it'll probably be at the cost of having to upgrade everything to Blu-ray 2.0 or something...

    • Re: (Score:2, Insightful)

      by chenjeru ( 916013 )

      1440 is a version of 1080p. It still has 1080 lines of horizontal resolution, but only 1440 vertical lines instead of the standard 1920. This format uses non-square pixels to fill a 16:9 aspect.

      • Re: (Score:3, Informative)

        by GerbilSoft ( 761537 )

        1440 is a version of 1080p. It still has 1080 lines of horizontal resolution, but only 1440 vertical lines instead of the standard 1920. This format uses non-square pixels to fill a 16:9 aspect.

        This right here is why "HD" is a joke. You've got 1366x768 "720p" displays that are only capable of showing 1280x720 signals, and now there are "1440p" displays that are non-square 1440x1080 instead of the expected 2560x1440. Either that, or you're mistaken, since the slides in TFA mention 2560x1600.

        • Almost anything with a dual-link DVI output can drive a 2560x1600 external display. I was hoping they meant it had accelerated rendering for above-1080p video, which actually would have been cool.

          Display resolutions seem to be going down. 1600x1200 laptops were common for a while. Even a year and a half ago I could get a 1080p monitor for under $100, but a couple of weeks ago I needed another one, and all the monitors around $100 are only 1600x900, if not some weird resolution slightly less than that.

        • by Kjella ( 173770 )

          No. HDV is 1440x1080i at maximum, but 1440p refers to the vertical resolution. 2560x1440 is not a very common resolution, but it's found in monitors like the Dell UltraSharp U2711, the Apple LED Cinema Display 27, and the NEC SpectraView Reference 271W. You'll have an easier time finding ice cream in the Sahara than native 1440p content, though.

      • 1440 is a version of 1080p. It still has 1080 lines of horizontal resolution, but only 1440 vertical lines instead of the standard 1920. This format uses non-square pixels to fill a 16:9 aspect.

        No, they mean 2560x1440 with progressive scan.

        • No, they mean 2560x1440 with progressive scan.

          Well that's a suck-tastic downgrade from their current and past video card lines.

          I'm sitting at a workstation with a pair of 2560x1600 resolution monitors right now, on an old Quadro FX 4600 NVidia card that runs them just fine. (Seriously - it's nowhere near their former top of the line, which is in the workstation at the other end of the table from me...). Since their old cards could do better, and they're now bragging about being able to do less, why should we be impressed?

          Or is this a sign that the H

          • by Rennt ( 582550 )

            Well that's a suck-tastic downgrade from their current and past video card lines.

            But a massive upgrade from current mobile SOCs. Honestly, these are designed for tablets... why the hell are you blathering about high res monitors and discrete chipsets?

            • Well that's a suck-tastic downgrade from their current and past video card lines.

              But a massive upgrade from current mobile SOCs. Honestly, these are designed for tablets... why the hell are you blathering about high res monitors and discrete chipsets?

              Presumably they missed the words 'Tegra' and 'mobile' in the title, and 'Mobile World Congress' in the summary, plus most of TFA.

          • by wisty ( 1335733 )

            Yeah, but it means that my crappy netbook with Intel graphics can drive some of the best monitors on the market. No more monitor jealousy, as everyone is brought down to the same level...

            except for 27-inch iMac users, who get 2560x1440.

          • The last monitor I bought for myself, right before HDTV came out and screwed us all, was 1920x1200 - you can't even find a monitor with that high of a resolution anymore.

            Dell UltraSharp U2410 - 24" IPS screen @ 1920x1200 native resolution.

            $600 MSRP

            $500 or so in practice though

      • Re:1440p? (Score:4, Informative)

        by beelsebob ( 529313 ) on Wednesday February 16, 2011 @08:37AM (#35220150)

        Not true. 1440p, like 720p and 1080p, refers to the number of rows. 1440p would be 1920 pixels wide at 4:3 or 2560 pixels wide at 16:9.
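
        A quick sanity check on that naming convention (a minimal Python sketch, assuming square pixels, which is what the "p" labels imply):

            # Width implied by an "Np" label: row count scaled by the display aspect ratio.
            def width_for(rows, aspect_w, aspect_h):
                return rows * aspect_w // aspect_h

            print(width_for(720, 16, 9))   # 1280 (720p)
            print(width_for(1080, 16, 9))  # 1920 (1080p)
            print(width_for(1440, 16, 9))  # 2560 (1440p at 16:9)
            print(width_for(1440, 4, 3))   # 1920 (1440p at 4:3)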

        • You may be correct in this case, since it does seem that 1440p is a different format from what I was describing. However, the format I referenced does indeed exist: it's the HDV 1080 standard, which is 1440x1080 with pixels at 1.33:1 (anamorphic).
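
          The anamorphic arithmetic is what separates the two: stored width times pixel aspect ratio gives displayed width (a minimal sketch, using the HDV figures above):

              # HDV stores 1440x1080 with 1.33:1 (4:3) pixels, stretched to a 16:9 raster.
              stored_w, rows = 1440, 1080
              pixel_aspect = 4 / 3
              print(round(stored_w * pixel_aspect), rows)  # 1920 1080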

      • by OrangeTide ( 124937 ) on Wednesday February 16, 2011 @09:30AM (#35220654) Homepage Journal

        Their demonstration showed 2560x1440 content.

        • I had assumed the lesser value of 1440 (HDV) versus the greater (1440p). Mark me as corrected, and pleasantly surprised.

      • http://en.wikipedia.org/wiki/Extreme_High_Definition [wikipedia.org]

        A related term is Extreme Definition (or XD). This is a term used on the Internet[citation needed], referring to the 1440p - 2560x1440 - resolution. The term was formulated with Extreme High Definition in mind, since both standards share the 2560 pixel horizontal resolution. To avoid confusion between the two resolutions, however, the word high was left out.

        For several months, the only device which output this as its native resolution was Apple's 27-inch

      • by RMingin ( 985478 )
        Incorrect. 1440p = double 720p in each dimension: 2560x1440. I believe you're thinking of 4:3 1080p, which was 1440x1080. When referring to HD resolutions by their single-number nicknames, it's always the vertical resolution that is named.
      • by dfghjk ( 711126 )

        This gets modded Insightful...

  • You'll be just as disappointed as I was. It doesn't fly at all.
  • by Tumbleweed ( 3706 ) * on Wednesday February 16, 2011 @08:17AM (#35220026)

    So performance similar to a Core 2 Duo (T7200) in a phone? Sa-weet! Gimme a dock to plug my 'phone' into so I can use my monitor/mouse/keyboard/internet connection, and that's all the computer I'll need for most purposes. I'll fire up the big boy when I need to use Photoshop or other intensive things.

  • by Anonymous Coward
    Geez, I'm made of kryptonite. This is unfair!
  • power consumption? (Score:5, Insightful)

    by crunzh ( 1082841 ) on Wednesday February 16, 2011 @08:44AM (#35220210) Homepage
    They write nothing about power consumption... I am disappointed. The most important benchmark of a mobile CPU is power consumption; I could stick an Atom in a cellphone to get a lot of CPU power, but the batteries would be toast in no time.
    • And since they're not crowing about how great it is (which they certainly would be if it were), we can probably fairly safely assume it's a relative battery-buster.

    • It looks like these aren't due to be released till the end of the year (with devices using them in early 2012). It's possible they simply have no good power consumption numbers yet.
      • by Khyber ( 864651 )

        "It's possible they simply have no good power consumption numbers yet."

        Sorry, even my company has the brains to hook the equipment up to a Kill A Watt during the various testing phases of product development, so we have power figures available immediately for given loads.

        If nVidia can't cough up $200 in measly American currency for ONE KAW tester, then nVidia is bound to be going the way of the dinosaur.

        Oh, wait, they've already begun emulating 3dfx, by selling their own cards. We all saw how well that work

        • by Guspaz ( 556486 )

          They've had actual silicon for 12 days. They may have been too busy showing it off to the press to sit down and plug it into a killawatt.

          Referencing the Anandtech article, nVidia claims that for the same workload, it is as efficient or more efficient than the Tegra 2, but if you increase the workload, it'll obviously use a bunch more power.

    • by CAIMLAS ( 41445 ) on Wednesday February 16, 2011 @11:23AM (#35221808)

      Incorrect. The latest Atoms (granted, not readily available for consumption) are fast and lower-power than some of the leading ARM smartphone CPUs/SoCs (or at least comparable on a perf/watt basis).

      Your biggest power drains in a smartphone will be:

      * Cellular and WiFi radios
      * Display
      * Crap software - poorly implemented drivers for the above, in addition to poorly implemented 3D/etc. drawing mechanisms that use the processor inefficiently, draw a lot on the screen, and so on.

      • Ummm, no? 9 watts is low for an Atom chip. 5 watts is unheard of, though AMD is planning something to that effect with Bobcat.

        The dual-core 1GHz Tegra 2, with its embedded graphics core and 720p H.264 video decode (actually 1080p, but 720p support is a LOT more comprehensive), is 2 watts. TWO. And that's from six months ago, when Nvidia's design was over power budget. It might be closer to 1 or 1.5 now. That's for the ENTIRE chipset, whereas Atom's motherboard adds another 10-20 watts.

        Even the AMD chip, whi

        • by steveha ( 103154 )

          Mod parent up. A Tegra 2 is a "system on a chip" and you don't need much else. An Atom needs support chips, and you have to look at the total power budget of the Atom plus its support chips.

          A Tegra is much more power-efficient than an Atom. It is not an accident that Android 3.0 tablets will be running on Tegra 2 chips, and not on Atom chips.
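
          Rough runtime arithmetic using the wattages quoted upthread (a sketch only; the 25 Wh battery is an assumed figure, in the ballpark of a 10" tablet battery of this era):

              # Hours of runtime = battery capacity (Wh) / sustained platform draw (W).
              battery_wh = 25
              platforms = {"Tegra 2 SoC (~2 W)": 2, "Atom + support chips (~12 W)": 12}
              for name, watts in platforms.items():
                  print(f"{name}: {battery_wh / watts:.1f} h")
              # Tegra 2 SoC (~2 W): 12.5 h
              # Atom + support chips (~12 W): 2.1 h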

          steveha

          • by CAIMLAS ( 41445 )

            You, and your parent, need to do a little research.

            An Atom CPU is a bit more than just those 330 whatevers you can still find. There are literally dozens of variants of the "Atom" chips now.

            There is indeed an Atom SoC that clocks in at under 4 watts TDP. And that says nothing about idle states, which are drastically improved.

            Now consider the fact that the LCD on a phone probably takes in the range of 15 watts, maybe a bit more. Then you've got the radio, which is going to reduce your battery life all the furth

            • by steveha ( 103154 )

              An Atom CPU is a bit more than just those 330 whatevers you can still find. There are literally dozens of variants of the "Atom" chips now.

              The Atom I am familiar with is in a netbook, and it blows lots of hot air out the side vent. Way too much power dissipation.

              After reading your comment, I Googled for "Atom system on a chip" and found:

              http://www.tomshardware.com/news/atom-soc-system-on-chip-e600-processor,11304.html [tomshardware.com]

              Looks like it is actually shipping, too, not just vapor. I haven't heard of any phones sh

      • Mod up. All the drain on most Android phones I've used/seen/fixed/rooted/CyanogenModded was from the display.

    • It will be harnessing the power of our yellow Sun, which will give it super-speed, super-strength, flight, x-ray vision, invulnerability and various other super-abilities and powers.

      So really, you don't have to worry about power consumption. But you DO have to worry about kryptonite exposure and Lex Luthor.

    • by Misagon ( 1135 )

      Yes, but the power consumption of the CPU is still dwarfed by the power consumption of sending and receiving radio signals, so nobody will care.

      • by crunzh ( 1082841 )
        No, incorrect: the only thing that rivals the CPU for power use in a modern smartphone/tablet is the screen.
  • They have essentially doubled the core count, from 2 cores to 4. They are essentially the same cores.

    Are they running at 2.5x the clock speeds as well? I seriously doubt it.

    Really, this goes beyond exaggeration; it is more like pure false advertising.

    • The math IS screwy - 2 times better CPU performance (the benchmarks THEY show even show this - it's the same clock speed, same CPU cores), 3 times better GPU performance.

      You don't get to add those numbers.

      Still, being able to dance with a Core 2 Duo is pretty damn good.
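
      To put numbers on "you don't get to add those": the combined speedup of a mixed workload is a time-weighted harmonic mean of the per-phase speedups, not their sum. A quick Python sketch (the 50/50 CPU/GPU time split is an assumption for illustration):

          # Amdahl-style combination of separate CPU and GPU speedups.
          def combined(frac_cpu, cpu_x, gpu_x):
              return 1 / (frac_cpu / cpu_x + (1 - frac_cpu) / gpu_x)

          print(f"{combined(0.5, 2.0, 3.0):.2f}x")  # 2.40x for a 50/50 split (the 5x sum is unreachable)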

  • Name (Score:2, Funny)

    by T.E.D. ( 34228 )
    ...named of course after terrorist mastermind Kalel Sheikh Mohammed.
    • by CAIMLAS ( 41445 )

      Are you kidding, or do you have something to back that up? It's as plausible as anything, I suppose, but I'm interested in hearing the reasoning if this is indeed true.

      • by theMAGE ( 51991 )

        He's just being silly - the guy's name is actually Khalid, not Khalel.

    • Named after Superman's Kryptonian name. [wikipedia.org]

      Kids these days ain't got no culture.
  • I can't wait for AMD/ATI to come out with their new GPU code named Kryptonite!

  • Quad-core ARM Cortex of some kind? Guessing more than an A9? Where are the real details about this chip?
    • If it's anything like the Tegra 2, it's going to be regular Cortex-A9 cores, an Nvidia GPU, and the usual dedicated hardware found on most ARM SoCs. Here's a picture of the Tegra 2 [anandtech.com], so I imagine the Tegra 3 will look similar, just with more cores and a beefier GPU.

      However, the Tegra 2 doesn't perform any better than the Exynos from Samsung or TI's newest OMAP based on AnandTech benchmarks [anandtech.com], so I don't expect Tegra 3 to be much different from other parts available at the time. Considering Sony has said t
  • If DC doesn't have IP there... well, I guess they'd have gone after Cage already if they did.

  • Amazing how fast the industry is ramping up the hardware capabilities of mobile devices. Among the critical problems Windows has on tablets, responsiveness might end up being solved for them. Of course, battery life and usability are still huge problems.

  • From the slide:

    2011 Kal-El
    2012 Wayne
    2013 Logan
    2014 Stark

    That's Superman, Batman, Wolverine, and Iron Man.

    There is a thread here claiming The Tick is in the list, but if so, he's not in the slide from TFA, he's not in the Wikipedia article [wikipedia.org], and Google search doesn't know about it. It's a joke or a troll.

    According to the graph, the performance to come is just crazy! Performance compared to the Tegra 2:

    Kal-El: 5x
    Wayne: 10x
    Logan: 50x
    Stark: 75x? 80x?

    I'm not sure how those numbers can be real, though. A Tegra
