nVidia Preview 'Tegra' MID Platform 117

wild_berry writes "nVidia have previewed their Mobile Internet Device platform, which will be officially unveiled at Computex in the next few days. The platform features CPUs named Tegra paired with nVidia chipset and graphics technology. Tegra is a system-on-a-chip combining an ARM11 core with nVidia's graphics technologies, permitting 1080p HD video decode and OpenGL ES 2.0 3D graphics. Engadget's page has more details, such as the low expected price ($199-249), huge battery life (up to 130 hours of audio / 30 hours of HD video) and enough graphics power to render Quake 3 anti-aliased at 40FPS."
This discussion has been archived. No new comments can be posted.

  • Yer! ARM laptop (Score:5, Insightful)

    by jabjoe ( 1042100 ) on Monday June 02, 2008 @11:23AM (#23627349)
    I've been waiting for an ARM laptop. Real battery life! Why do I need x86 compatibility? Give me battery life every time.
  • Re:Yer! ARM laptop (Score:4, Insightful)

    by bsDaemon ( 87307 ) on Monday June 02, 2008 @11:42AM (#23627585)
    x86 compatibility is less important for us Slashdotter types because we can compile the vast majority of the software we use from source for whatever platform (BSD, Linux) and architecture (x86, ARM, SPARC) we're using.

    The people who expect to be able to buy software to run on hardware that they also bought -- they might care -- just a little bit -- I would imagine.
  • Re:Media player. (Score:5, Insightful)

    by Lord Ender ( 156273 ) on Monday June 02, 2008 @11:43AM (#23627601) Homepage
    Is that an auto-generated comment? Are you a bot?

    The article is about a new processor for mobile devices. Asking if it supports Ogg is like asking if your Ethernet cable supports MP3.
  • Re:Closed :( (Score:4, Insightful)

    by LuxMaker ( 996734 ) on Monday June 02, 2008 @11:53AM (#23627725) Journal

    "Who gives a shit?"

    Over half the Slashdotters here, maybe?
    Open source, of course, allows for more flexibility as well as review for vulnerabilities.
  • Re:Yer! ARM laptop (Score:4, Insightful)

    by LWATCDR ( 28044 ) on Monday June 02, 2008 @11:58AM (#23627797) Homepage Journal
    Maybe, but maybe not.
    Most smart phones don't use Windows XP (I don't know of any that use an x86), and people do buy software for those.
    If it used a good Linux distro and then provided repositories, you would have your software.
    A software package system that worked like iTunes would be an ideal setup.
    Provide lots of free and paid software from an easy-to-use online store and you would have a great business model. Steam shows that it already works for games.
    It should work just fine for this as well.
    Of course, this chipset could also be the heart of a new iPhone/iPod Touch.
  • by Doc Ruby ( 173196 ) on Monday June 02, 2008 @03:40PM (#23630355) Homepage Journal
    I don't need mobile TV. What I need is a few cheap, reliable, fanless, low-power media terminals to stream HD video data from my Gbps LAN server and convert it into 1080p over HDMI/DVI for my big TVs.

    So what I need is some Tegra PCs with minimal hardware (maybe a DVD/Blu-ray player, but no floppy, modem, or really even a hard drive -- just 8GB of Flash and PXE boot), mainly LAN and HDMI/DVI connections, running Linux with full-featured Linux drivers. Preferably open-source drivers that we can tweak to work right, but which get full performance from the hardware.
  • Or... (Score:4, Insightful)

    by Bert64 ( 520050 ) <bert AT slashdot DOT firenzee DOT com> on Monday June 02, 2008 @04:30PM (#23631031) Homepage
    Perhaps this technology could be used to produce a very small, quiet, low-power MythTV box... The noise of my current system can be annoying when trying to watch a movie, but I didn't want to skimp on the CPU because I wanted to play 1080p video on it.
  • by LordMyren ( 15499 ) on Monday June 02, 2008 @08:14PM (#23633273) Homepage
    you manage to miss every relevant point in a very long elaboration of the status quo.

    nvidia and amd and every consumer electronics company in the world are doing their damnedest to break that status quo and make your phone, and everything else, a capable all-purpose platform. this nvidia chip can go in mobile phones, but it's got a video engine capable of 1680x1050. why is that? because ~~***YOUR PHONE***~~ needs that display? good god no. the point is, we're seeing new embedded devices that we expect to function in the dual roles of a) phone and b) computer replacement.

    long shaders let you do tasks like indirection in ways unfathomable for simpler setups. this in turn lets you run more application code in gpgpu land, which lets you save power. even if you disavow the use of it, i fail to understand how anyone could claim the lack of the feature is a good thing. it requires more advanced caching/buffering, but that should not be a dealbreaker, especially when we start loading our chips with massive onboard caches -- a trick well loved by the gamecube, for example.
  • by Anonymous Coward on Tuesday June 03, 2008 @02:56AM (#23635363)
    Wow, I'm picking up a serious "nobody needs more than 640K of memory" vibe from you... you're not a PowerVR designer by any chance, are you? Of course people are going to want to continue to play their desktop/console games on their portable devices, why would you design for anything less?

    A typical shader architecture can be viewed as a VLIW processor with an interpolator, texture unit, ALU and data store. Each "instruction" for all those units takes something like 512 bits, or about 64 bytes. 1KB is only ~16 instructions, hardly a "long" shader (the Doom 3 shader is ~12 instructions, and it's very short compared to most modern games).
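    The commenter's instruction-size arithmetic is easy to check; a quick sketch (the 512-bit bundle width is the commenter's estimate, not a documented spec):

    ```python
    # Sanity-check the VLIW shader-size arithmetic from the comment.
    # Assumption: one "instruction" bundles control for the interpolator,
    # texture unit, ALU and data store in roughly 512 bits.
    INSTR_BITS = 512
    instr_bytes = INSTR_BITS // 8          # 64 bytes per instruction
    instrs_per_kb = 1024 // instr_bytes    # instructions that fit in 1 KB

    print(instr_bytes)    # 64
    print(instrs_per_kb)  # 16
    ```

    So a 1KB instruction store really does cap shaders at about 16 of these wide instructions, which supports the "hardly a long shader" point.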

    Caching does nothing for you if you can't fit all of the shaders used in a frame on-cache, because you have to reload different shaders for different tiles (whereas a classical architecture, with an app that sorts by mode, has to load each shader only once). Instant 1500x shader bandwidth hit for a 800x480 screen with 16x16 tiles...
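    The 1500x figure follows directly from the tile count; a minimal sketch of that arithmetic (screen and tile dimensions taken from the comment):

    ```python
    # Reload-factor arithmetic from the comment: on a tiler that cannot
    # keep every shader in the frame on-chip, each shader may be reloaded
    # once per tile instead of once per frame.
    width, height = 800, 480   # screen resolution from the comment
    tile = 16                  # 16x16 pixel tiles

    tiles = (width // tile) * (height // tile)  # 50 * 30 tiles
    print(tiles)  # 1500
    ```

    With 1500 tiles per frame, a worst-case shader reload per tile is a 1500x bandwidth multiplier versus loading each shader once, which is the hit the comment describes.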

    Binning on the driver side? On an ARM? Yeah, that'll work... right after the transform, setup and clip I assume (or were you thinking of some crazy-ass feedback mechanism to main memory, costing even MORE bandwidth and power?!?)

    And by the way, GPGPU is already running on smartphones (pretty useful to accelerate physics for games, for example...).

    For God's sake, stop researching for a while and begin to think!
