NVIDIA Launches Five New Mobile GPUs
Engadget is reporting that NVIDIA has released five new mobile GPUs to fill some imagined gap in the 200M series lineup. These new chips supposedly double the performance and halve the power consumption of the older chips, but still no word on why they think we need eight different GPU options. "The cards are SLI, HybridPower, CUDA, Windows 7 and DirectX 10.1 compatible, and all support PhysX other than the low-end G210M. Of course, with integrated graphics like the 9400M starting to obviate discrete graphics in the mid range -- even including Apple's latest low-end 15-inch MacBook Pro -- we're not sure what we'll do with eight different GPU options, but we suppose NVIDIA's yet-to-be-announced price sheet for these cards will make it all clear in time."
Re: (Score:2, Offtopic)
I realize I'm just posting in a troll thread, which is exactly what the troll wants, but I just couldn't resist ... it's like nerd honey ...
There aren't any racial levels in "Dwarf." You could have a level 5 fighter who happens to be a dwarf. There are racial levels for other races though, usually monstrous ones. You could have, for example, a level 5 doppelganger.
Although personally I would rather make my doppelganger a 10th level assassin, 10th level rogue. Oh yeah ... *drools*
P.S. I don't use Linux. I pl
Re: (Score:1)
Re: (Score:2)
There aren't any racial levels in "Dwarf."
Not to defend the GP, but I take it you never played Basic [wikipedia.org], eh? Those are the rules that an awful lot of people started off with, in the days when AD&D -- a.k.a. 1st edition -- was a bit beyond the financial means of many teenagers.
Re: (Score:1, Offtopic)
Finally (Score:5, Interesting)
Finally, news about low-power GPUs with decent capabilities.
I'm sure hardcore gamers prefer bleeding edge hardware news, but for the rest of us, heat dissipation and power requirements are beginning to be a nuisance more than anything else. I'm sure 99% of computer users would be fine with a dual-core Atom CPU and one of those new GPUs.
Re: (Score:3, Informative)
Finally, news about low-power GPUs with decent capabilities.
I'm sure hardcore gamers prefer bleeding edge hardware news, but for the rest of us, heat dissipation and power requirements are beginning to be a nuisance more than anything else. I'm sure 99% of computer users would be fine with a dual-core Atom CPU and one of those new GPUs.
I have a duel core atom, and it sucks for flash. It's really sad that you can pair the best video solution in the world with these chips and video still ends up being the thing that suffers the most.
Once we get HTML5 and web video migrates to a decoding path that isn't CPU-bound, that will be true, though.
Re: (Score:1)
Why doesn't flash do it already?
It's already been planned. From here [nvidia.com]:
NVIDIA, the inventor of the GPU, and Adobe Systems Incorporated announced that they are collaborating as part of the Open Screen Project to optimize and enable Adobe® Flash® Player, a key component of the Adobe Flash Platform, to leverage GPU video and graphics acceleration on a wide range of mobile Internet devices, including netbooks, tablets, mobile phones and other on-the-go media devices.
Re: (Score:2)
<onomatopoeia>SNERK</onomatopoeia>!
Say what now?
Yes NVIDIA, yes you do.
Never ascribe to malice ... (Score:2)
Re: (Score:2)
Maybe the NVIDIA writer should have written that NVIDIA invented the General Purpose GPU [wikipedia.org] ? From the wiki it seems like they might have been pioneers there.
I still don't think it's a valid claim. Try reading Myer & Sutherland's On the Design of Display Processors [stanford.edu], or Levinthal and Porter's Chap - a SIMD graphics processor [acm.org]. There were also the TI graphics chips, e.g. the 34010 [wikipedia.org]. It happens all the time. IIRC many of the SGI machines were done with programmable hardware, but I guess that wasn't exposed to the end user.
Re: (Score:2)
Re: (Score:2)
> You are assuming that the video decoders for your HTML-5 compliant browser will be capable of taking advantage of GPU-assist. Why doesn't flash do it already?
Yes that's DECODERS as opposed to a singular decoder owned by a company that may or may not care about your requirements.
There are varying levels of GPU-assist even on Linux. So it's clearly NOT going to be a problem if anyone who is interested in dealing with the problem decides to take a crack at it. It will inevitably lead to a better situation t
Re: (Score:2, Interesting)
I have a duel core atom, and it sucks for flash
Probably cuz it's tired from fighting in one-on-one combat with the GPU all the time. I recommend getting an Atom that works with its GPU [tomshardware.com].
Re: (Score:2)
I have a duel core atom, and it sucks for flash
Probably cuz it's tired from fighting in one-on-one combat with the GPU all the time. I recommend getting an Atom that works with its GPU [tomshardware.com].
Your link says nothing about the GPU in the Ion chipset (GeForce 9300) helping Flash video in any way (it doesn't). Yes, we all know Ion's GPU accelerates the codecs used in Blu-ray (H.264, VC-1, MPEG-2), but the Atom has to do all the work when it comes to Flash (and it sucks).
Here's a much better link that explains how the Atom (single and dual core) does with Flash on the Ion platform at different resolutions: Zotac's Ion: The Follow Up - Watching Flash Video on the Ion [anandtech.com]
Summary: single-core Atom on Io
Re: (Score:2)
I can stream HD video (movie trailers from Apple's site) on my netbook. There is a dual-core Atom? Where? It looks dual-core because of hyper-threading, but it is still a single-core CPU. Well, the 1.6GHz one in the netbooks that I have seen is, anyway. The trailers streamed and played fine under Linux (Ubuntu 8.10 and Ubuntu Remix), OS X 10.5, XP Pro, and Win 7. I needed to install the proper player to view them, but once installed, no issues.
The flash websites (which drive me crazy in a bad way) have no
Re: (Score:2)
Re: (Score:3, Interesting)
I think there's some misunderstanding between "hardcore gamers" and people who the Atom CPU is viable for. The Atom is a wonderfully efficient chip, and I'll concede that it's probably good enough for most "mundane" computing tasks. However, it's not good for ANY level of traditional (and by traditional I mean something that uses some level of 3D acceleration) PC gaming. I'd also question its usefulness for things like video encoding. That's not a high-end or odd application anymore. My mother (who is
Re:Finally (Score:5, Interesting)
The Atom is a wonderfully efficient chip
No, it's not. It's a wonderfully feature-less chip, with everything possible off-loaded into the northbridge. Which is why the NB looks like the real CPU, when you look at the board.
If you want wonderful efficiency, look at those new smartbooks that were shown in a recent /. article. They take 1-2 watts, and play full HD and hardware-accelerated Flash.
I'd rather stack 10 of those than buy one Atom chip (with the same power usage).
I just wish someone would offer bare-bones ARM modules that you could buy in whatever quantity you wanted and stick together to form a desktop computer. Maybe even have a special module that you could take out as a smartbook. Throw in some GPUs, and maybe an SPU (sound), or whatever you like.
Of course Windows would -- as usual -- just choke and die, but Windows and Smartbooks do not fit anyway (yet). It's all Linux in its many forms (including Android).
I, for one, would love to have a desktop system that is essentially a more tightly integrated blade rack with a fast backbone bus.
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Are you sure it isn't only about the NB not being as power-efficient yet? I wonder if there is anything more "off-loaded" than with any other CPU.
Re: (Score:1)
You can get a MIPS-based desktop system with 72 processors that consumes 300 watts, from SiCortex. They call it their Deskside Development System [sicortex.com] for their bigger parallel computers, and they say it does have a fast backbone bus.
It does run Linux, but at $23,695.00 (48 GB RAM) it's not, I suspect, what you were asking for. I would also like some cheap barebones I could just go on populating with CPUs as I wanted.
The GP might like SGI's Molecule [gizmodo.com] better though, it being Atom-based: 5000 chips, that's 10000 co
Re: (Score:2)
>>The Atom is a wonderfully efficient chip
> No, it's not. It's a wonderfully feature-less chip, with everything possible off-loaded into the northbridge.
Well, this depends on what you compare an Atom to; compared to many other x86 chips, it doesn't off-load anything more to the northbridge.
Compared to an SoC or an ARM, sure.
Re: (Score:1)
Re: (Score:2)
Atom + nVidia ION does full 1080p decoding and is capable of running a 3D desktop with any wiggly effects you might want. That covers a lot of ground in my book. Gaming and video editing are at the opposite end of the scale for me; having an HD cam, it's one of the things that really gives the machine a workout. But it's rather specific: either you've got it or you don't. If you don't, and most people I know only have digicams, then Atom will do you just fine. For gaming, go dual-core + fast GPU; for video encodi
Re: (Score:2)
As an owner of an Asus EEE BOX 206 with an ATI HD video card, I could only agree that it would suit the needs of most users if Adobe would get up off their butt and build decent GPU offloading capabilities into Flash.
The EEE has an Atom and draws 19W max. It plays DVDs just fine. Not being able to stream YouTube or Hulu really sucks, though.
I question whether producing 8 different chipsets is as cost-effective as producing three. The more quantity you can produce of a single chip, the cheaper man
Re: (Score:2)
We're still talking about two (maybe three) different ASICs, all packaged/fused into different products. Having multiple packages still costs some money, but being able to hit the sweet spot of every market segment is worth it.
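Purely as an illustration of that packaging/fusing idea (all die names and unit counts below are made up for the sketch, not actual NVIDIA data), a couple of physical ASICs can back many SKUs by disabling units:

```python
# Hypothetical dies and their physical shader-unit counts (made-up numbers).
DIES = {"die_A": 96, "die_B": 16}

# Hypothetical SKUs: each is one die with some units fused off (also made up).
SKUS = {
    "high_end":  ("die_A", 96),   # fully enabled die
    "mid_range": ("die_A", 48),   # half the units fused off
    "low_end":   ("die_B", 16),   # smaller die, fully enabled
}

def enabled_units(sku: str) -> int:
    die, units = SKUS[sku]
    # A SKU can never enable more units than its die physically has.
    assert units <= DIES[die]
    return units

print(enabled_units("mid_range"))  # -> 48
```

The point of the sketch: three retail products, two actual chips, so per-die volume stays high while each market segment still gets a distinct part.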
Re: (Score:3, Interesting)
Let's just hope they fixed the manufacturing problems that are still dogging them.
I work fixing PCs for business and the public, and we have seen over 120 HP laptops with nVidia chipsets that have failed in the past six months. Usual symptoms are no video output (but it otherwise boots), the wifi card dropping out, or the machine just completely dead and not POSTing.
HP will do anything to get out of fixing the problem, which they won't even admit exists on most affected models. There is a website (http://www.hplies.com/ [hplies.com]) org
Re: (Score:2)
Getting off topic, but I just got an HP replaced for that reason (dead nVidia chip). (I'm an nVidia snob, which is why that lappy had one of their chips to begin with.) If you have a bad HP, take the advice at that site [hplies.com], and get a case manager. Using regular support, we had to send it in 3 times to get a working (though down-specced) machine. But once we got a case manager, they sent a new machine.
Re: (Score:2)
We managed to get three machines fixed by HP.
1. Took 8 months, went back and forth several times because they initially installed the wrong motherboard (similar spec but lacking an HDMI port). It was a US model and that seemed to confuse them. So much for a worldwide warranty.
2. Was fixed the second time it went to them, but for some inexplicable reason came back with a cracked copy of Vista installed on the HDD. Luckily we imaged the drive before sending it off so we were able to restore it. No idea how or
low powered -- but better than standard integrated (Score:2)
Yes, these are nothing special in the big picture. But the price point could be extremely low for all we know. I'll bet this is an effort to put Nvidia chipsets in an entire generation of netbooks -- from which Nvidia has been excluded in favor of integrated graphics.
These are actually a new architecture of sorts (Score:5, Interesting)
This piece has more commentary on the release as opposed to regurgitating specs: http://www.pcper.com/article.php?aid=732 [pcper.com]
It looks like this new architecture is going to be quite different from its desktop counterpart.
Suicidal NVIDIA GPUs (Score:4, Interesting)
Re: (Score:3, Interesting)
Oh, wow. Thanks. I've never heard of this and just had my new laptop repaired with what appears to be an identical problem.
It was a Clevo with a 9300M on it and the symptoms sound exactly the same - 6 months in, the graphics starting playing up to the point that the computer just hung if you touched the keyboard or moved it in any way, always with graphical corruption, and sometimes Linux/Windows would just carry on regardless, but with corrupt graphics. Sometimes there'd be a kernel panic or freeze, but
Re: (Score:3, Interesting)
Well... they already killed themselves with their naming scheme changes. Re-labeling things so that you are pretty much guaranteed to feel ripped off when buying one of their cards, because it is just the same old shit with a new name, does not exactly make them trustworthy, or make me want to buy anything from them.
Unfortunately, ATi's current generation is completely incompatible with Linux (not compatible with current kernel interfaces [>=2.6.29], massive tons of things that make it crash, composite
Re: (Score:3, Informative)
the one (Score:2, Funny)
Re: (Score:2)
and in the darkness bind them
Are you saying they will come bundled with Doom 4?
Better hope they integrate all eight at the same time. Or you might end up with the set of your keyboard LEDs having a higher resolution (and being brighter anyway).
Sucky Summary, could have held the whole FA... (Score:4, Informative)
NVIDIA is filling in what it presumes to be holes in its next-generation GPU lineup, adding the 40nm G210M, GT 230M, GT 240M and GTS 250M, with GDDR3 memory ranging from 512MB to 1GB, to its existing GTX 280M, GTX 260M and GTS 160M laptop graphics cards. Apparently the new cards sport "double the performance" and "half the power consumption" over the last generation of discrete GPUs they're replacing. The cards are SLI, HybridPower, CUDA, Windows 7 and DirectX 10.1 compatible, and all support PhysX other than the low-end G210M. Of course, with integrated graphics like the 9400M starting to obviate discrete graphics in the mid range -- even including Apple's latest low-end 15-inch MacBook Pro -- we're not sure what we'll do with eight different GPU options, but we suppose NVIDIA's yet-to-be-announced price sheet for these cards will make it all clear in time.
Look at the words changed:
[what it presumes to be holes] becomes [some imagined gap]
[Apparently the new cards sport "double the performance" and "half the power consumption"] becomes [These new chips supposedly double the performance and halve the power consumption]
[we're not sure what we'll do with eight different GPU options] becomes [still no word on why they think we need eight different GPU options]
and [but we suppose NVIDIA's yet-to-be-announced price sheet for these cards will make it all clear in time] gets completely omitted...
WTF?
Re: (Score:2)
Also, what's up with calling the 9400M midrange in the same article that calls the faster 210M low end? And why is an article that mentions 4 new GPUs labeled as introducing 5 new GPUs in the title?
For those confused about the codenames... (Score:5, Informative)
So I was looking around after seeing this earlier to try to make sense of how the older-generation codenames map to the newer-generation ones, and found this: http://www.nvidia.com/object/geforce_m_series.html [nvidia.com] (scroll down).
Basically it goes GTX > GTS > GT > GS > G
The old 9400/8400 line has become the 210/110
The old 9600/8600 line has become the 230/130
The old 9800/8800 GT/GS has become the 250/150
And the old 9800/8800 GTX/GTS has become the 280
There are a few other cards that fall in the middle of categories, but that seems to be the basic gist of it as far as I can tell.
Here's another useful resource for comparing mobile GPUs: http://www.notebookcheck.net/Comparison-of-Graphic-Cards.130.0.html [notebookcheck.net]
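For what it's worth, the suffix ranking and the old-to-new mapping above can be jotted down as a small lookup table (this is just a sketch of the post's own list, not an official NVIDIA reference):

```python
# Suffix tiers from fastest to slowest, per the parent post: GTX > GTS > GT > GS > G.
SUFFIX_RANK = ["GTX", "GTS", "GT", "GS", "G"]

# Rough old-series -> new-series mapping, as listed above.
OLD_TO_NEW = {
    "9400/8400":         "210/110",
    "9600/8600":         "230/130",
    "9800/8800 GT/GS":   "250/150",
    "9800/8800 GTX/GTS": "280",
}

def faster_suffix(a: str, b: str) -> str:
    """Return whichever of two suffixes ranks higher (lower index = faster)."""
    return a if SUFFIX_RANK.index(a) < SUFFIX_RANK.index(b) else b

print(faster_suffix("GT", "GTS"))  # -> GTS
```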
Re: (Score:2)
I wonder if these codes promote fanboyism. You've learnt the code, you know the lingo, you buy the card that you know. Accepting that the other side might just have something better this iteration would require turning in your secret decoder ring.
Re: (Score:2)
Re: (Score:2)
The old 9400/8400 line has become the 210/110
The old 9600/8600 line has become the 230/130
The old 9800/8800 GT/GS has become the 250/150
And the old 9800/8800 GTX/GTS has become the 280
You mean the GTX 280M is not based on the desktop GTX 280, but the previous-generation 9800/8800? Death to NVIDIA!
I'm kidding, of course, but this is a long-time pet peeve of mine. The GeForce4 MX was based on GeForce2 technology. The Radeon 8000 was not a DirectX 8/OpenGL 1.4 GPU like the rest of the 8000-series. This shit continues today with these NVIDIA mobile GPUs.
Competition is so cool... (Score:4, Insightful)
- Intel threatening an all-in-one smartphone chipset
- ARM showing up everywhere, netbooks coming soon, hopefully big battery-life gains and HD playback
- Microsoft feeling left out of the smart- market. (I know, insert favorite pun here)
- Android liking its chances in the netbook market
- AMD looking at netbooks for growth
It's wonderful. I may yet get a netbook with 8+ hrs battery life, touchscreen, and I can settle for a Bluetooth headset profile connection to my smartphone in my pocket.
Now, gimme the 8" screen that folds out to 8"x14", and a swiveling keyboard. Woot. And that 700MHz thingie that is supposed to make broadband ubiquitous... for under $300, and less than $40/mo for the Interwebs.
I'll buy it.
Re: (Score:2)
Memo from NVidia CEO (Score:5, Funny)
Fuck Everything, We're Doing 5 GPUs [theonion.com]
(Hey, Slashcode, why won't you format <i> or <em> inside <blockquote>?)
Re: (Score:2)
(Hey, Slashcode, why won't you format <i> or <em> inside <blockquote>?)
Because you should be using <quote> instead, which does support that formatting.
how is 8 a lot? (Score:2)
we're not sure what we'll do with eight different GPU options
yeah, because there are hardly any options in the desktop market...
Laptop graphics cards - help needed (Score:1)
Re: (Score:2)
Re: (Score:3, Informative)
Sure, here's a link that'll send you in the right direction.
MXM [wikipedia.org]
Bleh, boring (Score:2)
If they aren't OpenCL compliant, ... (Score:2)
OpenCL available on all current nvidia products (Score:1, Informative)
AFAIK, Nvidia released OpenCL drivers that run on top of the Nvidia CUDA runtime:
http://www.nvidia.com/object/cuda_opencl.html [nvidia.com]
Since all recent Nvidia chips are CUDA enabled, they are by default also OpenCL enabled.