NVIDIA Unveils Lineup of GeForce 800M Series Mobile GPUs, Many With Maxwell 83
MojoKid writes "The power efficiency of NVIDIA's Maxwell architecture makes it ideal for mobile applications, so today's announcement by NVIDIA of a new top-to-bottom line-up of mobile GPUs—most of them featuring the Maxwell architecture—should come as no surprise. Though a couple of Kepler- and even Fermi-based GPUs still exist in NVIDIA's new line-up, the heart of the product stack leverages Maxwell. The entry-level parts in the GeForce 800M series consist of the GeForce GT 820M, 830M, and 840M. The 820M is a Fermi-based GPU, but the 830M and 840M are new chips that leverage Maxwell. The meat of the GeForce GTX 800M series consists of Kepler-based GPUs, though Maxwell is employed in the more mainstream parts. NVIDIA is claiming the GeForce GTX 880M will be the fastest mobile GPU available, and that the entire GTX line-up will offer significantly higher performance than any integrated graphics solution. The GeForce GTX 860M and 850M are essentially identical to the desktop GeForce GTX 750 Ti, save for different frequencies and memory configurations. A number of notebooks featuring NVIDIA's GeForce 800M series GPUs are coming down the pipeline from companies like Alienware, Asus, Gigabyte, Lenovo, MSI, and Razer, though others are sure to follow suit. Some of the machines will be available immediately."
Re: (Score:2)
I invented the colour Orange. Prove me wrong.
Re: (Score:1)
I invented the colour Orange. Prove me wrong.
The color "orange" used to be called red-yellow. The color is named after the fruit.
The fact that you claim to have invented the color yet refer to it by its adopted name instead of the original proves that you're a liar who has nothing to do with the color's invention/naming/use.
Re: (Score:2)
I refer to it by the name most recognised in today's society.
Re: How to Falsify Evolution (Score:2)
Re: (Score:2)
Light never existed in the specific wavelength of orange until I commanded it.
There is no evidence to suggest otherwise so it must be true.
Re: How to Falsify Evolution (Score:2)
You cannot invent that which pre-exists you
Re: (Score:2)
Where is your evidence that it existed before me?
You have no evidence that light of that wavelength existed yesterday.
Re: (Score:1)
Re: (Score:1)
To prove evolution false, look for but never find any proof of genetic mutations.
Oops, we found them, many times. Antibiotic-resistant mutations of bacteria that weren't resistant before are probably the best example of evolution and Darwinism.
Re: (Score:2)
tl, i actually skimmed it a bit. the only thing I could grab on to is this:
If evolution be not true, the only explanation for the appearance of varied life on the planet is intelligent design.
This is a logical leap, and creates a flaw in the remainder of the post. If the only two choices are evolution or ID, then an argument against evolution is an argument for ID (which is what the rest of the words are about, I think). But why can't there be other potential theories? I'm sure the world has thought of hundreds.
whatevs, not a good use of time.
Re: (Score:2)
Don't encourage the bastard.
Re: (Score:3)
TL;DR
Re: (Score:2)
I don't care about the architecture, how much RAM they have, how many pipelines they have, with graphics cards. Seriously, I don't know enough for it to be relevant to me. All I want to know is how fast it is (playing games), and how much it costs. Those are the only 2 things that are relevant to me. I don't care what die it was shipped on, I don't care about anything but price/performance. Some might care about power usage... I don't (within reasonable limits).
Re: (Score:2)
Not that you seem to care, but nvidia is launching precisely its SECOND Maxwell chip with that laptop announcement. The first was GM107, in the desktop GTX 750 and 750 Ti, and now in the 860M and 850M: it has five "SMM" units and a 128-bit bus. The second is GM108, in the GeForce 830M and 840M: a smaller GPU with fewer SMMs on a 64-bit bus, with DDR3 memory. That gives low performance, but it's clearly a low-power, low-budget part.
Maxwell... (Score:2)
That's the second biggest GPU I've ever seen.
Re: (Score:2)
You should stop playing with it then!
separate mobile GPUs is a declining market (Score:4, Interesting)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Now? Intel GPU support has been excellent under Linux even back when the crusty GMA chips were all we had.
Except for the bugs. I used Linux, including tracking the latest kernels, for over 6 years with my last laptop having an Intel 915GM.
Every version of the kernel during that time rendered occasional display glitches of one sort or another, such as a line or spray of random pixels every few weeks. Rare but not bug free.
And that's just using a terminal window. It couldn't even blit or render text with 100% reliability...
I investigated one of those bugs and it was a genuine bug in the kernel's tracking of ca
Re: (Score:1)
Section "Device"
    Identifier "Intel"
    Driver "intel"
    Option "DebugWait" "true"
EndSection
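If you want to try that workaround, one way to apply it is to drop the Device section into an X.Org config snippet and restart X. This is a sketch; the file name is hypothetical, and the exact path and restart procedure depend on your distribution:

```shell
# Hypothetical snippet name; any file under /etc/X11/xorg.conf.d/
# works as long as it ends in .conf. Run as root, then restart X.
cat > /etc/X11/xorg.conf.d/20-intel-debugwait.conf <<'EOF'
Section "Device"
    Identifier "Intel"
    Driver "intel"
    Option "DebugWait" "true"
EndSection
EOF
```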
Re: (Score:1)
Thanks! But too late. That machine died this time last year, after 6 years of excellent service. I moved on to new hardware.
Hopefully the xorg.conf is useful to someone else.
I've just looked up what people are saying about DebugWait, and I see the font corruption - that's just one of the types of corruption I saw!
But perhaps that was the only kind left by the time my laptop died.
Just a note to others, that DebugWait doesn't fix the font corruption for everyone according to reports. But, it's reported as fix
Re: (Score:2)
Even multi-GPU configurations with top-model GPUs don't do well at 3840x2400 resolutions.
Re: (Score:3)
Those of us doing more with computers than editing text documents and refreshing Facebook still need discrete GPUs.
Re: (Score:2)
According to the latest market statistics, 66% of PCs overall use embedded graphics. Even Steam shows a 16% Intel share, plus probably some AMD APUs that aren't separated out. I don't know about you, but anything "serious" I do, like work, doesn't push the GPU one bit; the only thing that does is gaming. And not everybody is a gamer, or their idea of gaming is more like Candy Crush. On that note, I loved The Walking Dead, here's the system requirements:
Windows Operating system: Windows XP / Vista / Windows 7
Processo
Re: (Score:2)
What are you doing - pray tell... (Score:2)
Other than professionally modeling or doing video editing, or playing 3D games - what use does an average person have for discrete graphics today?
Re: (Score:2)
Other than professionally modeling...
Look, this is Slashdot. You're not going to find professional models here.
Re: (Score:2)
The AMD and Intel integrated offerings while not amazing are more than adequate for the vast majority of purposes
Not only that, but the discrete graphics cards consume substantial amounts of power and generate more heat than the rest of the device combined.
Intel = bad drivers (Score:1)
Re:separate mobile GPUs is a declining market (Score:4, Interesting)
Re: (Score:1)
I would imagine Nvidia are very uncomfortable with the way their market has been contracting over the last couple of years.
At some point enough x86/x64 patents will expire that Nvidia will be able to license the remaining ones and so build an x64 chip of their own.
Or alternatively they could sell Arm+GPU SOCs instead - arguably Arm+GPU is a better bet than x64+GPU because the sales of phones and tablets will exceed the sales of x64 PCs. Of course the margins are likely to be thinner because there's a lot of competition in the Arm SOC market - Apple and Samsung have their own in house designs and outside that it looks like Qualcomm have
Re: (Score:2)
At some point enough x86/x64 patents will expire that Nvidia will be able to license the remaining ones and so build an x64 chip of their own.
But after x64 came SSE3, SSE4.x, AVX, AVX2, and soon AVX-512. Those are the wide SIMD instructions. This stuff isn't strictly needed - yet; SSE2 is already needed to run some 32-bit code, like some Flash versions and codecs, which annoys some current Athlon XP users. Maybe some other stuff, like hardware encryption, is "protected" too.
So the fullest x86/x64 support will be left to AMD and Intel only for the foreseeable future.
Nvidia is betting on ARMv8, with an ARMv8 + Kepler (of the GK208 variant) ch
Re: (Score:2)
Given the ridiculous prevalence of laptops with absolutely pathetic displays (1366x768 on a 15"? really?), "most" users aren't even going to need the integrated Intel 4th gen video. A dumb frame buffer would probably fit their needs.
Re: (Score:2)
Heck, we just
Re: (Score:2)
By ordering low-end GPUs, you annoy everyone -- the users have to put up with crappy chips, IT has to support more complex systems, and budgeting has to pay for chips no one wants. So instead, order most of the laptops without a discrete GPU to save a few bucks. Then order a few with high-end GPUs for the few people who want them.
Re: (Score:2)
Advert disguised as story (Score:1, Troll)
This takes the cake. I've never once complained about an obvious advert disguised as a story.
But pimping this, this CRAP company that has been so incredibly hostile to the free and open source community, is such bad judgement.
The new slashdot management seems determined to undermine the loyalty of their userbase. What a disgrace.
I don't need more powerful. I just need cooler! (Score:2)
I've toasted two laptop monitors because of trying to play too many high-needs video games on them. Both of the monitors theoretically were good enough for the games by specs, but both of them burnt out within two years of when I bought them (admittedly, they were both a couple years old when I purchased them). With the first laptop, I just thought it was an age thing and didn't think enough of it, but with the second one, I realized the sad pattern. Now, I play my games with an external fan running, blowing
Re: (Score:2)
Re: (Score:2)
The new Maxwell stuff quite possibly has higher performance per watt than the Intel GPU. This may make dedicated GPUs a bit more interesting again (and if the Intel GPU isn't running, more watts can be spent on CPU performance, which can allow better framerates), provided you stay modest enough on the wattage.
Re: (Score:2)
I'm with the others in not understanding what you're on about with monitors, but indeed additional cooling is useful. The thing is, no matter how efficient the CPU and GPU are, it's a product of the watt budget and how the laptop is designed: thickness/thinness, heatsinks and fans, build quality, etc., so it really varies laptop by laptop.
Modern stuff also throttles; it slows down when needed so the laptop won't melt itself, which cuts both ways: less chance of failure, but additional
Re: (Score:2)
My apologies for not being very clear. I don't have my tools here, nor do I have random spare parts to swap out and test which components have problems with my system at the moment. The two laptops could have had different problems, but the net result was that the screens on both were no longer functional. It was my presumption (as I said, I don't have my tools to verify) that excess heat was the cause of my computer issues. For that reason, I would prefer to have a cooler-running laptop, so I don't feel
Re: (Score:2)
No problem. Also my above reply was pessimistic, better to check some reviews after finding a nice model.
The new GTX 850M and GT 840M look nice (the latter being rather slow if you're into demanding games).