Portables Hardware

nVidia Preview 'Tegra' MID Platform

wild_berry writes "nVidia have previewed their Mobile Internet Device platform, which will be officially unveiled at Computex in the next few days. The platform features CPUs named Tegra paired with nVidia chipset and graphics technology. Tegra is a system-on-a-chip featuring an ARM11 core and nVidia's graphics technologies, permitting 1080p HiDef television decode and OpenGL ES 2.0 3D graphics. Engadget's page has more details, such as the low expected price ($199-249), huge battery life (up to 130 hours audio/30 hours HD video) and enough graphics power to render Quake 3 anti-aliased at 40FPS."
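(As a rough sanity check on those battery claims: the summary gives no battery capacity, so the ~4.5 Wh figure in the sketch below is purely an assumed value typical of small handheld packs of the era. The implied average power draw comes out tiny either way.)

```python
# Back-of-envelope check on the claimed runtimes. The battery capacity is an
# assumption (the article doesn't give one); ~4.5 Wh is typical for a small
# handheld pack.
BATTERY_WH = 4.5

def avg_draw_mw(hours):
    """Average power draw in milliwatts needed to last `hours` on one charge."""
    return BATTERY_WH / hours * 1000.0

print(f"130 h of audio   -> ~{avg_draw_mw(130):.0f} mW average")  # ~35 mW
print(f"30 h of HD video -> ~{avg_draw_mw(30):.0f} mW average")   # ~150 mW
```

Even doubling the assumed pack size only doubles those figures, which is why the whole-platform power budget has to sit well under a watt.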
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Kokuyo ( 549451 ) on Monday June 02, 2008 @10:20AM (#23627307) Journal
But seriously, this sounds interesting. If they actually manage to pull it off, this might make TV on the go a real possibility (compared to straining your neck trying to watch Sex and the City on your phone...).

Now the only question is how heavy the battery has to be to allow for such a long-lasting device. You can't tell me it's actually this efficient if it boasts that kind of computational power.
    • by Svartalf ( 2997 )
Heh... Odds are, it's got the same performance profile in that regard as the OMAP3 devices. And they're delivering that sort of muscle in the prototypes we're seeing from those devices, with estimated 10-hour operating times on a charge.
    • by qoncept ( 599709 )
      "(compared to strain your neck trying to watch Sex and the City on your phone...)."

      Yeah, I hate it when that happens. Not that those are the two things I hate most in the world or anything.
    • by TheRaven64 ( 641858 ) on Monday June 02, 2008 @10:57AM (#23627779) Journal
      The CPU is one of the Cortex MPCores. Other devices with these IP cores at their heart use under 250mW (compared to 2-5W for Intel's 'low power' offerings). The GPU is likely to use more power when in heavy use, but I'd expect it to scale back well. For reference, the iPhone's GPU is almost identical to the 3D chip found in the Dreamcast, which got similar Quake 3 performance (note they don't specify a resolution for this).
Dreamcast was Renesas with a PowerVR chip; the iPhone is some bastard child Samsung ARM that I was told was doing CPU graphics... nope, looks to be PowerVR. OK, both PowerVR graphics, totally different CPUs though.
The LCD backlight or OLED probably consumes the biggest slice of power anyway. Playing a movie using hardware acceleration and flash media is not that much of a power hog (compared to the worst case).
      • No.

Close the lid. Does your computer last twice as long? No. I suggest before you shoot your mouth off you attempt to gain some grounding in the topic. For starters, try:

        watch -n1 cat /proc/acpi/battery/CMB1/state

With an absolute miser of a CPU, a Crusoe 800, I go from 850 mA to 550 mA from the display. The reason the backlight is a 50% power hit is because the CPU is sucking almost nothing in the first place.
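(For anyone who wants to repeat this kind of measurement, here is a minimal sketch. It assumes a Linux machine exposing the newer /sys/class/power_supply interface rather than the /proc/acpi path above; the battery name BAT0 and the current_now attribute are assumptions that vary by machine, and some batteries report power_now in microwatts instead.)

```python
#!/usr/bin/env python3
# Minimal sketch: poll the battery's reported discharge current once a second
# so you can watch the delta when you dim or switch off the backlight.
# Assumes /sys/class/power_supply/BAT0/current_now exists (it doesn't on all
# hardware; some batteries expose power_now in microwatts instead).
import time

BAT = "/sys/class/power_supply/BAT0"

def current_ma():
    with open(f"{BAT}/current_now") as f:   # value is in microamps
        return int(f.read()) / 1000.0

while True:
    print(f"{current_ma():.0f} mA")
    time.sleep(1)
```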

        With LED backlighting or OLED the effect would be less pronounced.

The reason you don't notice movie CPU drain is because you are used to x86 CPUs.
The reason you don't notice movie CPU drain is because you are used to x86 CPUs
          Actually I am an MX31 person (ARM1136).

          Your numbers don't match up with mine, sounds like you're shooting from the hip.

          I was at least polite enough to speak in very general terms and use qualifiers.
  • Yer! ARM laptop (Score:5, Insightful)

    by jabjoe ( 1042100 ) on Monday June 02, 2008 @10:23AM (#23627349)
I've been waiting for an ARM laptop thing. Real battery life! Why do I need x86 compatibility? Give me battery life every time.
    • Re:Yer! ARM laptop (Score:4, Insightful)

      by bsDaemon ( 87307 ) on Monday June 02, 2008 @10:42AM (#23627585)
x86 compat is less important for us Slashdotter types because we can compile the vast majority of the software that we use from source for whatever platform (BSD, Linux) and architecture (x86, ARM, SPARC) we're using.

      The people who expect to be able to buy software to run on hardware that they also bought -- they might care -- just a little bit -- I would imagine.
      • Re:Yer! ARM laptop (Score:4, Insightful)

        by LWATCDR ( 28044 ) on Monday June 02, 2008 @10:58AM (#23627797) Homepage Journal
        Maybe but maybe not.
Most smart phones don't use Windows XP (I don't know of any that use an x86) and people do buy software for those.
If they used a good Linux distro and then provided repositories, then you would have your software.
A software package system that worked like iTunes would be an ideal system.
Provide lots of free and paid software from an easy-to-use online store and you would have a great business model. Steam shows that it already works for games.
        It should work just fine for this as well.
        Of course this chipset could also be the heart of a new iPhone/iPod Touch as well.
      • Re:Yer! ARM laptop (Score:5, Informative)

        by TheRaven64 ( 641858 ) on Monday June 02, 2008 @10:59AM (#23627817) Journal
And yet close to 100% of these people are quite happy with their mobile phone, which probably has at least a 200MHz ARM CPU, gets upgraded four times as often as their PC, spends more time being interacted with than their PC, and doesn't contain an x86 chip or (for 93%) run Windows.
        • by bsDaemon ( 87307 )
But the OP specifically was talking about laptops, not phones. For phones, yes -- people understand you have to get whatever it is for that specific device.
Well, the difference is that they don't use the mobile phone as a PC. It's an appliance. Same as a fridge, or a DVD player, or a TV, or their fixed landline phone. If it does its job, why would you care if your fridge has an x86 in it? Most "normal" people I know don't really do much more than phone on their mobiles and send the occasional SMS. Very few even realize that they could run any other program on those, much less actually download one, so compatibility doesn't play a role. But I dare say that wi
          • Gimp,Anal rape?
That's one hell of a Freudian slip
            • Not really. I wouldn't mind if whoever is responsible for making sure that script-fu windows always pop up in front of all other windows (hint: they don't) got a little porcupine action to sharpen up his skills.
    • Re:Yer! ARM laptop (Score:5, Interesting)

      by zeromorph ( 1009305 ) on Monday June 02, 2008 @10:42AM (#23627591)

Yes, looks like a new round in the CISC (now represented by Intel Atom) vs. RISC (now represented by Tegra) flame war. Ars Technica had an interesting article [arstechnica.com] about the renewed relevance of the differences between the two architectures two weeks ago.

      • by jabjoe ( 1042100 )
I'm not sure it will be history repeating, because size and power matter more in these devices than in the desktop. It will be hard for x86 machines to win out on price, size and power over ARM. They are paying a heavy price for x86, all for compatibility I really don't think we need.
    • Re: (Score:2, Interesting)

      by Scootin159 ( 557129 )
Interesting that they say it has 'enough graphics power to render Quake 3 @ 40fps'... does Quake 3 actually run on any non-Windows/x86 platform?
      • Re: (Score:3, Informative)

        by hr.wien ( 986516 )
        The Quake 3 source is released under the GPL, so yes. :)
      • by Yvan256 ( 722131 )
        The real question is, at which resolution? Because 40 FPS @ 320x240 isn't that useful...

      • by dook43 ( 660162 )
Q3 has been out for Linux since close to day 1. Now that the source has been released, I'm sure people have ported it to other arches. However, you will not be able to connect to any servers because PunkBuster is a binary, x86-only piece of software.
      • Re: (Score:3, Funny)

        by Missing_dc ( 1074809 )
Never mind Quake 3, this sounds like it could run World of Warcraft (a special ARM version, of course). Combine this with an EVDO card and I'm set!!
Well, there was the Linux/x86 version, the Mac (OS and OS X) PPC versions, and the Dreamcast and PS2 ports. And as others have mentioned, the source for the game is GPL'd, so as long as you have the horsepower to run the game, a port could be made to just about any hardware.
    • by thue ( 121682 )
      You can already buy a cheap ARM laptop here [bestlinkeshop.com].
  • Worth waiting for... (Score:5, Interesting)

    by LinuxGeek ( 6139 ) * <djand.nc@NoSPam.gmail.com> on Monday June 02, 2008 @10:28AM (#23627411)
I almost bought an Asus Eee PC this weekend; this is worth waiting to see how it is implemented in consumer devices. Give me a small laptop type that can run Linux and I'll buy one or two. Heck, 30 or 40 hours would be enough battery time, I don't need 100.
    • by ooze ( 307871 )
Been waiting for something like this for years too. But considering nVidia's history, I don't think we will get proper Linux drivers for this.
      • If this thing isn't going to run Linux, then I don't know what it is going to run. Certainly not the 'nearly free', stripped down Windows XP for cheap portables.

        nVidia would be pretty silly to build this thing and not to provide a proper driver for the only OS it'll probably work under. Of course, if this thing takes off, Microsoft probably will come out with a 'mini XP for ARM-based cheap portables'. But nVidia's got to feed the Linux chicken in order to lay that particular egg...
        • by ooze ( 307871 )
They already said the bigger version is supposed to run Windows CE, with nothing else planned so far.
        • Microsoft probably will come out with a 'mini XP for ARM-based cheap portables'.
          You mean something like Windows CE?
    • by freedumb2000 ( 966222 ) on Monday June 02, 2008 @12:12PM (#23628675)
      Then have a look at this little machine: http://openpandora.org/ [openpandora.org]
    • Even 30 is far more than I need. As long as it lasts longer than I stay away, and charges in less time than I sleep, it's fine unless I'm away from the mains for more than a day, and it's rare for this to happen at times when I want a computer with me.
    • Heck, 30 or 40 hours would be enough battery time, don't need 100.
10 would be fine for most people. The most I have seen was a 9-cell Li-Po pack that got 5 on some brand-name machine (don't remember which -- probably HP or Sony).
When I heard that a company was making an inexpensive computer with great battery life, adequate performance and it was going to be 'ultra portable', I was so happy! ... then they released it and it was more expensive than originally planned, and not quite as robust.

    ... and now nVidia is going to do the same thing to me.
  • If it can run ffdshow or VLC at 1080p then we're talking something special.
    • If it can run ffdshow or VLC at 1080p then we're talking something special.

Read again. The chip is made by nVidia. You can pretty much be sure that the decoding capability will be handled in a BLOB.

      At best, maybe they'll put some hooks in ffmpeg's library (or directly in VLC as an alternate engine) to call their BLOB to handle the accelerated decoding.

At worst you'll have to use a binary-only nVidia-specific player. And given that the ARM+nVidia platform isn't going to be very popular, probably not a lot of people are going to reverse engineer it (à la the "Nouveau" project) - expect

  • I read an article about the Atom platform, which competes in this space. Apparently only the most powerful version of Atom would have enough oomph to run Vista, so can this nVidia MID handle it acceptably? (I know, the review mentioned it runs Windows Mobile, but I'm curious.)
    • Re:Vista (Score:5, Informative)

      by Khyber ( 864651 ) <techkitsune@gmail.com> on Monday June 02, 2008 @10:48AM (#23627661) Homepage Journal
      Short answer: no.

Atom is x86-based (I think) whereas this is ARM-based. Vista isn't even ARM compatible.
    • Re:Vista (Score:4, Interesting)

      by neokushan ( 932374 ) on Monday June 02, 2008 @10:55AM (#23627743)
Vista doesn't have an ARM version; you'll have to stick with Windows Mobile for now.
However, TFA states (that's right, I actually read it) that nVidia is open to running other platforms, not just Windows CE, so if enough interest is generated, they MIGHT actually have Linux running on it.
It's a chipset, though, not a device or anything, so ultimately it would be up to the mobile manufacturers to decide what happens, provided nVidia has support for it.
      • by Bert64 ( 520050 )
The biggest problem with Windows Mobile is that it isn't actually Windows... It is a completely different kernel, with a userland and interface that only have some similarities...
Thus, even if more Windows software had source code, you still couldn't recompile most of it to run. It's only Windows by name.
As phones become more powerful, they are more than capable of running software that would have run on a desktop system just a few years ago... And just look at how much has been ported to the iPhone, so qui
A large number of .NET programs will "just run" on Windows CE.
          • by znerk ( 1162519 )

A large number of .NET programs will "just run" on Windows CE.

            And an absurdly large number won't. To be frank, I'm wondering just which programs you're referring to, because damn near nothing runs on my ARM-based WindowsCE-running device without *major* tweaking (read that as "practically rewritten from the ground up"). Yeah, yeah, anecdotal whatsis, but my current office project is porting some old code to WinCE, via Visual Studio 2008. It's absolutely amazing how many things *aren't* supported "out of the box" under the .NET Compact Framework. Menus, for instance.

    • Re:Vista (Score:4, Informative)

      by TheRaven64 ( 641858 ) on Monday June 02, 2008 @11:02AM (#23627853) Journal
      They aren't even in close to the same space. This is not x86, so won't run Windows. It is in the (well) under 500mW power bracket, while the Atom uses 2W idle and needs a very power-hungry northbridge. Intel are trying to tell everyone that Atom is competitive with ARM, but it's still an entire order of magnitude more power hungry for similar performance at the moment. The 'ten times less power than our competition' dig on the nVidia site is aimed directly at Intel.
    • by dave420 ( 699308 )
Nope. You can run Vista on any Atom; obviously some of the features will have to be disabled when running on the lower-end CPUs, but it's more than possible.
  • I'd certainly be willing to offer my meager talents to the effort for THAT kind of battery life. Will an ITIL metrics slide help? :/
    • Re: (Score:1, Informative)

      by Anonymous Coward
We had them in for a demo, and they were pretty emphatic that they will not support Linux on this part. Compared to the similar parts from TI and Freescale, that pretty much made it a non-starter for us.
  • by saha ( 615847 )
iPhone 3.0. Actually, the current iPhone uses the PowerVR MBX and the new one is rumored to be using PowerVR SGX graphics. The PowerVR VXD video IP core supposedly "supports 1080p H.264 Main/High Profile decoding, as well as VC-1 and a variety of other standards" http://www.beyond3d.com/content/news/638 [beyond3d.com] http://www.appleinsider.com/articles/08/04/30/apples_bionic_arm_to_muscle_advanced_gaming_graphics_into_iphones.html [appleinsider.com]
    • Re: (Score:2, Funny)

      by Anonymous Coward
Congratulations, you are the first Apple fanboy to try to steal the thread with your MacRumors about the iPhone.
    • PowerVR vs. nVidia (Score:3, Interesting)

      by DrYak ( 748999 )
      The PowerVR vs. nVidia comparison is approximately the same as the ARM vs. Intel Atom.

nVidia are producing classical graphics cores.

PowerVR are employing specific techniques (Tile-Based Deferred Rendering) which enable them to cram in the same performance using far fewer transistors and running at lower clocks.
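(Very roughly, the binning idea behind tile-based rendering looks like the toy sketch below. It is only an illustration of the principle, not how any real PowerVR or nVidia part works; the 32-pixel tile size and the data structures are made up for the example.)

```python
# Toy illustration of tile binning: split the screen into small tiles, assign
# each triangle to every tile its bounding box overlaps, then rasterise and
# shade one tile at a time out of fast on-chip memory instead of streaming the
# whole framebuffer. Not representative of any real GPU's implementation.
TILE = 32  # assumed tile size in pixels

def bin_triangles(triangles, width, height):
    """Map (tile_x, tile_y) -> list of triangles whose bounding box touches that tile."""
    bins = {}
    for tri in triangles:                    # tri = ((x0, y0), (x1, y1), (x2, y2))
        xs = [p[0] for p in tri]
        ys = [p[1] for p in tri]
        x0, x1 = max(0, min(xs)), min(width - 1, max(xs))
        y0, y1 = max(0, min(ys)), min(height - 1, max(ys))
        for ty in range(y0 // TILE, y1 // TILE + 1):
            for tx in range(x0 // TILE, x1 // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins

# Example: two triangles on a 320x240 screen.
tris = [((0, 0), (60, 10), (10, 60)), ((300, 200), (310, 239), (280, 230))]
for tile, hit in sorted(bin_triangles(tris, 320, 240).items()):
    print(tile, len(hit), "triangle(s)")
```

Each tile is then shaded independently, which is why only the shaders actually visible in a tile need to be fetched for it.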

The nVidia SoC is probably more targeted toward sub-notebooks, big multimedia PDAs (as an example, the Tapwave Zodiac was based on an ARM and an ATI Imageon running PalmOS 5) and small internet-enabled ap
      • Re: (Score:1, Interesting)

        by Anonymous Coward
Tile-based architectures start running into instruction-fetch issues for long shaders on complex scenes (shaders can be kilobytes each, and each tile has to fetch all of the shaders that end up visible in that tile). A "classical" architecture with aggressive Z culling will beat the pants off a tiler (and CSAA drops the multisampling bandwidth down into the noise). I'm not sure where you think they're getting away with fewer transistors for a tiler either; there's a whole binning engine (and the associated ba
        • Smart Phone will probably use whatever is less power hungry and go for PowerVR's designs.

          Tile-based architectures start running into instruction-fetch issues for long shaders on complex scenes (shaders can be kilobytes each, and each tile has to fetch all of the shaders that end up visible in that tile).

Excuse me? On a *PHONE*?
I'm pretty sure that you can't play Crysis on a tiler, unless you make the tiler so over-complicated that it loses its advantage over a classical architecture...
BUT, come on, I'm speaking about *smartphones*. Nobody's going to play Crysis or anything that has more than a couple of kilobytes worth of shader code on a 320x240 screen that fits in your pocket.

Besides, this kind of situation isn't very likely to happen anyway because:

          - Even on the desktop you won't encounte

          • by LordMyren ( 15499 ) on Monday June 02, 2008 @07:14PM (#23633273) Homepage
            you manage to miss every relevant point in the book in a very long elaboration of the status quo.

nvidia and amd and every consumer electronics company in the world are doing their damnedest to break that status quo and make your phone and everything else a capable all-purpose platform. this nvidia chip can go in mobile phones, but it's got a video engine capable of 1680x1050. why is that? because ~~***YOUR PHONE***~~ needs that display? good god no. the point is, we're seeing new embedded devices we expect to function in dual roles of a) phone and b) computer replacement.

            long shaders let you do tasks like indirection in ways unfathomable for simpler setups. this in turn lets you run more application code in gpgpu land. this lets you save power. even if you disavow the use of it, i fail to understand how anyone could claim the lack of the feature is a good thing. it requires more advanced caching / buffering, but that should not be a dealbreaker. especially when we start loading our chips with massive onboard caches -- a secret well loved by the gamecube for example.
the point is, we're seeing new embedded devices we expect to function in dual roles of a) phone and b) computer replacement.

Then we simply have a definition problem, because those all-encompassing devices that are currently emerging are what I tend to file under the categories of sub-notebooks and beefed-up PDAs.

In which case, we actually both agree, given that a couple of posts ago I mentioned that this new chip will be perfect for sub-notebooks and PDAs. This makes even more sense if the latter is coupled with one of those laser-based embedded projectors. Then the hi-def resolution will definitely make sense.

              even if you disavow the use of it, i fail to understand how anyone could claim the lack of the feature is a good thing.

              I'm not saying that it's a

              • I did not address tiler architecture in my comment. But yeah, I think we're near the same page here.
          • Re: (Score:2, Insightful)

            by Anonymous Coward
            Wow, I'm picking up a serious "nobody needs more than 640K of memory" vibe from you... you're not a PowerVR designer by any chance, are you? Of course people are going to want to continue to play their desktop/console games on their portable devices, why would you design for anything less?

            A typical shader architecture can be viewed as a VLIW processor with an interpolator, texture unit, ALU and data store. Each "instruction" for all those units takes something like 512b, or about 64 bytes. 1KB is only ~16 i
  • Quake 3 (Score:5, Funny)

    by ELTaNiN ( 1297561 ) on Monday June 02, 2008 @10:59AM (#23627815)
Can it run Doom instead?
    • by bbk ( 33798 )
Considering that you could get 40 FPS at phone-class resolutions (640x480) in Quake 3 from a ~300 MHz PC with a Voodoo 2 around 10 years ago, it isn't all that impressive.
I think you'd have to use pretty mediocre quality settings to get that FPS on a Voodoo 2. I went straight to a GeForce 1 SDR, and my dual 566 MHz Celeron BP6 still got slowdown on reasonable settings at 800x600.
    • Nope, just Quake 3. Quake 3 is embedded in the hardware -- it's a dedicated Q3-ASIC. That's how they got that performance and power; it's simply not a general-purpose computer ;)
  • enough graphics power to render Quake3 anti-aliased at 40FPS

Sounds like an interesting toy, but aren't we twisting the measurements a bit here? Quake 3 came out in 1999. Any modern graphics chip has the graphics power to render Q4 at much faster than 40 FPS. Of course, there's the important question of "do you have the computing power behind the graphics power to make the game playable without lag or stutter on anything but a trivial map?", as is "do you have the system resources to get a new map sta

    • by eebra82 ( 907996 ) on Monday June 02, 2008 @11:36AM (#23628273) Homepage

Sounds like an interesting toy, but aren't we twisting the measurements a bit here? Quake 3 came out in 1999. Any modern graphics chip has the graphics power to render Q4 at much faster than 40 FPS.
      You are missing the point. It's not as much about how fast it can run Quake 3, but rather that it is capable of doing so reasonably well. You cannot compare it to modern graphics engines simply because this is a processor that promises to deliver reasonable performance at incredibly low voltages.

      As for the resolution, I agree that it's rather strange that they left out the details on this, but we can assume that it's going to be something like 640x400, which is still very impressive.
      • by Anonymous Coward on Monday June 02, 2008 @01:19PM (#23629433)
        It's at 800x480, and the Quake3 port was a quick hack to test the chip, not a serious performance-tuned effort (i.e. it isn't using the vertex shaders at all, and the pixel shaders are using a very crude translation scheme from Q3's shader language). I'm fairly sure I could get a tuned port to run 100's of frames/sec on the same hardware. More modern games (Doom3/Quake4) would actually run better, but we didn't have the source to play with (and the game datasets are probably a bit large for the platform).
        • Re: (Score:3, Interesting)

          by Bert64 ( 520050 )
A networkable Quake 3 that you can play over wifi with random people on the train would be fun.
In fact, a phone with enough power to play good multiplayer games, wifi, the ability to auto-detect other devices within range, and most importantly the ability to remote boot games from other users (so you don't need to rely on finding people with the same games) would be awesome...
Just imagine the commute to work, and finding random other people on the train to play games with.
...ability to remote boot games from other users (so you don't need to rely on finding people with the same games)...

            In a perfect world this might be interesting. In the real world, if you build such a platform, I can assure you that some script kiddy is going to play games with your system that you will not like.

  • Ob (Score:1, Funny)

    by Anonymous Coward
    Does it run linux, does it blend, in Soviet Russia a beowulf cluster of their new overlords welcomes ME!!!!
    • by znerk ( 1162519 )
      OMG ROFL

      I wish I had mod points, I don't care if it's an AC.

      I need a new keyboard, and a towel for my monitor.
  • More details (Score:5, Informative)

    by Meorah ( 308102 ) on Monday June 02, 2008 @12:03PM (#23628565)
For those who actually enjoy RTFA'ing and want a bit more comprehensive info than a BBC fluff piece, nVidia's marketing page, and some pretty vids on Engadget:

    http://www.tgdaily.com/content/view/37729/135 [tgdaily.com]

    The APX 2500 is far more interesting to me than the 600/650. Qualcomm and Broadcom better watch their backs.
I will definitely be getting one of these Tegra-powered devices.

Hopefully, a phone. And hopefully it won't cost $400.

Really, a smartphone with that chipset should only cost about $200.

With 1080p TV/video and gaming.
  • by Doc Ruby ( 173196 ) on Monday June 02, 2008 @02:40PM (#23630355) Homepage Journal
I don't need mobile TV. What I need is a few cheap, reliable, fanless, low-power media terminals to stream HD video data from my Gbps LAN server and convert it into 1080p HDMI/DVI for my big TVs.

So what I need is some Tegra PCs with minimal HW (maybe a DVD/Blu-ray player, but no floppy, modem, or really even an HD - just 8GB of Flash and PXE boot) that are mainly LAN and HDMI/DVI connections, running Linux with full-featured Linux drivers. Preferably open-source drivers that we can tweak to work right, but which get full performance from the HW.
  • Or... (Score:4, Insightful)

    by Bert64 ( 520050 ) <bert@NOSpaM.slashdot.firenzee.com> on Monday June 02, 2008 @03:30PM (#23631031) Homepage
Perhaps this technology could be used to produce a very small, quiet, low-power MythTV box... The noise of my current system can be annoying when trying to watch a movie, but I didn't want to skimp on the CPU because I wanted to play 1080p video on it.

"To take a significant step forward, you must make a series of finite improvements." -- Donald J. Atwood, General Motors

Working...