nVidia Previews 'Tegra' MID Platform
wild_berry writes "nVidia have previewed their Mobile Internet Device platform, which will be officially unveiled at Computex in the next few days. The platform features CPUs named Tegra, paired with nVidia chipset and graphics technology. Tegra is a system-on-a-chip featuring an ARM 11 core and nVidia's graphics technologies, permitting 1080p HiDef television decode and OpenGL ES 2.0 3D graphics. Engadget's page has more details, such as the low expected price ($199-249), huge battery life (up to 130 hours of audio/30 hours of HD video) and enough graphics power to render Quake 3 anti-aliased at 40FPS."
Yer! ARM laptop (Score:5, Insightful)
Re:Yer! ARM laptop (Score:4, Insightful)
The people who expect to be able to buy software to run on hardware that they also bought -- they might care, just a little bit, I would imagine.
Re:Media player. (Score:5, Insightful)
The article is about a new processor for mobile devices. Asking if it supports ogg is like asking if your ethernet cable supports MP3.
Re:Closed :( (Score:4, Insightful)
Over half the slashdotters here maybe?
Open source, of course, allows for more flexibility, as well as review for vulnerabilities.
Re:Yer! ARM laptop (Score:4, Insightful)
Most smartphones don't use Windows XP (I don't know of any that use an x86), and people do buy software for those.
If it used a good Linux distro and provided repositories, then you would have your software.
A software package system that worked like iTunes would be an ideal system.
Provide lots of free and paid software from an easy-to-use online store and you would have a great business model. Steam shows that it already works for games.
It should work just fine for this as well.
Of course, this chipset could also be the heart of a new iPhone/iPod Touch.
Stationary But With Linux Drivers (Score:4, Insightful)
So what I need is some Tegra PCs with minimal HW (maybe a DVD/Blu-Ray player, but no floppy, modem, or really even a HD -- just 8GB Flash and PXE boot), with mainly LAN and HDMI/DVI connections, running Linux with full-featured Linux drivers. Preferably open-source drivers that we can tweak to work right, but which get full performance from the HW.
Or... (Score:4, Insightful)
Re:...on a phone.... (Score:4, Insightful)
nvidia and amd and every consumer electronics company in the world are doing their damnedest to break that status quo and make your phone and everything else a capable all-purpose platform. this nvidia chip can go in mobile phones, but it's got a video engine capable of 1680x1050. why is that? because *YOUR PHONE* needs that display? good god no. the point is, we're seeing new embedded devices we expect to function in dual roles of a) phone and b) computer replacement.
long shaders let you do tasks like indirection in ways unfathomable for simpler setups. this in turn lets you run more application code in gpgpu land. this lets you save power. even if you disavow the use of it, i fail to understand how anyone could claim the lack of the feature is a good thing. it requires more advanced caching / buffering, but that should not be a dealbreaker, especially when we start loading our chips with massive onboard caches -- a secret well loved by the GameCube, for example.
Re:...on a phone.... (Score:2, Insightful)
A typical shader architecture can be viewed as a VLIW processor with an interpolator, texture unit, ALU and data store. Each "instruction" for all those units takes something like 512 bits, or 64 bytes. 1KB is only ~16 instructions, hardly a "long" shader (the Doom 3 shader is ~12 instructions, and it's very short compared to most modern games).
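As a quick sanity check on the parent's numbers (assuming the 512-bit VLIW instruction word described above, which is an illustrative figure, not a documented Tegra spec):

```python
# Back-of-the-envelope check of the parent comment's figures,
# assuming a 512-bit VLIW instruction word.
INSTR_BITS = 512
instr_bytes = INSTR_BITS // 8           # 512 bits -> 64 bytes per instruction
instrs_per_kb = 1024 // instr_bytes     # 1 KB holds only 16 such instructions

print(instr_bytes)    # 64
print(instrs_per_kb)  # 16
```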
Caching does nothing for you if you can't fit all of the shaders used in a frame on-cache, because you have to reload different shaders for different tiles (whereas a classical architecture, with an app that sorts by mode, has to load each shader only once). Instant 1500x shader bandwidth hit for an 800x480 screen with 16x16 tiles...
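The 1500x figure follows directly from the tile count, assuming the worst case where every tile has to reload its shaders once (a sketch of the parent's arithmetic, not a measured result):

```python
# Tile count for an 800x480 screen split into 16x16 tiles,
# assuming one shader reload per tile (worst case).
width, height = 800, 480
tile = 16
tiles = (width // tile) * (height // tile)  # 50 columns * 30 rows

print(tiles)  # 1500 reloads vs. 1 load on a sort-by-mode architecture
```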
Binning on the driver side? On an ARM? Yeah, that'll work... right after the transform, setup and clip I assume (or were you thinking of some crazy-ass feedback mechanism to main memory, costing even MORE bandwidth and power?!?)
And by the way, GPGPU is already running on smartphones (pretty useful to accelerate physics for games, for example...).