Intel Unveils 'Sandy Bridge' Architecture
njkobie writes "Intel has officially unveiled Sandy Bridge, its latest platform architecture, at the first day of IDF in San Francisco. The platform is the successor to the Nehalem/Westmere architecture and integrates graphics directly onto the CPU die. It also upgrades the Turbo Mode already seen in Core i5 and i7 processors to achieve even greater speed improvements. Turbo Mode on Sandy Bridge processors can now draw more than the chip's nominal TDP where the system is cool enough to do so safely, enabling even greater boosts in core speeds than those seen in Westmere. No details of specific products have been made available, but Intel has confirmed that processors built on the new architecture will be referred to as 'second generation Core processors,' and are expected to go on sale in early 2011. In 2012 it is due to be shrunk to a 22nm process, under the name Ivy Bridge."
Turbo Mode (Score:5, Funny)
Old news. My 386 has turbo mode. Wake me when they add math coprocessors to this beast.
Re: (Score:2)
Don't we call those "graphics cards"?
Re:Turbo Mode (Score:5, Insightful)
Don't we call those "graphics cards"?
Has Intel ever made a quality graphics coprocessor?
Re: (Score:2)
Define quality. If you mean stable and get the job done for most people, then I'd say yes. If you mean blazing speed and can run the latest games at a framerate that would melt your face, then no. ATi and nVidia have it covered and Intel is sticking with what they do best, general computer processors.
Also their Linux driver support is top-notch in this area.
Re: (Score:2)
Define quality. If you mean stable and get the job done for most people, then I'd say yes. If you mean blazing speed and can run the latest games at a framerate that would melt your face, then no. ATi and nVidia have it covered and Intel is sticking with what they do best, general computer processors.
Also their Linux driver support is top-notch in this area.
By quality I mean it's comparable to a current $200 video card.
I don't need to play Crysis at 60fps, but I sure would like to at least not have OpenGL 3.3 games be stereoscopic.
Re: (Score:2)
Exactly - why isn't AMD doing this first? They already have ATI Radeon / Firepro GPUs that are actually decent. All they have to do is drag and drop one of those designs onto their next CPU layout with the next process shrink :) OK, probably not that easy, but surely it's a huge leg up for AMD.
Re: (Score:2)
Re: (Score:2)
I'll actually give the Intel GPUs a lot of credit these days. They do not compete performance-wise with dedicated cards, of course, but then I wouldn't expect them to. What they do is get you an entire graphics feature set, the new ones are fully DirectX 10 compatible, with reasonable speed, video acceleration, and so on in a tiny, tiny power budget. There's a reason why switchable graphics laptops are popular. The dedicated GPU is great, but gobbles up battery life even when throttled down. That integrated
Re: (Score:2)
Intel graphics also have opensource drivers, including the necessary kernel bits in the vanilla Linux tree. They have been this way for years, and have exposed most of the functionality, unlike the open ATI/AMD drivers. Though it must be said that the latter are improving a lot; I'm currently typing on a Powerbook with Radeon graphics running Gentoo. For some reason, binary Linux PPC drivers are hard to find ;)
For my uses, Intel GPUs have been powerful enough for years, and this includes HD video playbac
Re: (Score:2)
With consoles each console has a distinct pool of games. Most games are either released for a single console or for a group of consoles with similar capabilities. Sometimes with a PC release as well, but the PC release is often either crap, late or both.
If you want to play recent Mario games you have to buy a Wii (or maybe screw around with emulation on a PC but most people won't bother to go to that much trouble). If you want to play GTA4 you have to buy a PS3, an Xbox 360 or a high end gaming PC. If you wa
Game geometry and branching (Score:2)
branching performance usually suffers [...] Video and image processing, game geometry, and 2d rendering really belong on a GPU-like architecture, not the CPU.
I thought game geometry involved a lot of branching, especially in the cases of potentially visible set construction methods (e.g. portal casting or BSP), collision detection, and path finding. Or have these problems been solved?
Re: (Score:2)
second generation core? (Score:2)
Re:second generation core? (Score:4, Informative)
Wow. The nonsense..it hurts my brain.
First, IA64 is not a "64-bit x86 extension", it's a new ISA. AMD released x86_64 and Intel did very shortly after.
Second, Intel has had integrated CPU/GPUs out for a while. And you're crazy if you think Intel chips (now, not back in the bad old P4 days) draw more power and run hotter than AMD chips.
Basically everything you said is either wrong or backwards, and you confuse me because of this.
Re: (Score:2)
They'd be in trouble. They have strong competition from ARM derivatives in mobile and embedded markets, and the desktop and server markets are almost entirely x86_64 now. Nobody's interested in 32-bit servers, and IA64 is a niche platform whose fate was sealed when Microsoft stopped developing for it.
This is after their announcement that (Score:3, Funny)
They're opening a new factory in Madison county.
Intel needs to dump the DMI bus and go all QPI (Score:2)
Intel needs to dump the DMI bus and go all QPI. The last thing you want is Intel video lock-in and only x16 PCI-e lanes.
Re: (Score:2)
Ultimately for laptops and low end desktops moving all the high speed logic (graphics, CPU, memory controller) into one chip makes a lot of sense from both a cost and a power point of view.
Yes it's annoying that the option of an nvidia chipset with integrated graphics that were better than intel's, while being cheaper and lower power than a dedicated graphics chip with its own memory, has been frozen out by this change.
Yes it's annoying that you can no longer use a low end CPU with a high end platform or vice
3rd party chip sets also apple. They can't stay co (Score:2)
3rd party chip sets also apple. They can't stay Core 2 forever on the mini / some of their laptops, and intel video does not fit in their GPU API.
And they don't like to put full PCI-e x16 video chips in their low end systems.
Re: (Score:2)
3rd party chip sets
Fewer high power high speed chips means less power spent on interfacing between them and less complex system cooling. For most systems that is probably worth sacrificing the ability to choose northbridges.
also apple. They can't stay Core 2 forever on the mini / some of their laptops, and intel video does not fit in their GPU API.
If true that sucks for them but ultimately I don't think in.
It appears (see http://slashdot.org/comments.pl?sid=1786182&cid=33570924 [slashdot.org] for caveats) that with san
Time to buy all new chipsets! (Score:2)
Re: (Score:2)
Because this is the primary motivation, as no one is even coming close to maxing out an i7.
Maxing it out at what? Maxing out an i7's CPU performance is trivial on a server that's doing CPU-intensive work.
Re:Time to buy all new chipsets! (Score:5, Interesting)
Re: (Score:3, Informative)
I'm not sure about the desktop side, but on the server side it is certainly not two DIMMs.
Each bank is composed of three DIMMs and there are multiple channels per proc.
While I don't have the details on me it's pretty easy to see that both camps have significantly increased their memory footprint and it's quite easy to build a system with 256GB of RAM or greater.
In a few instances there are system types which do tax the proc far more than others. For these types of systems and other instances where licensin
Re: (Score:2)
Currently with intel stuff (it's a while since I've looked at the AMD side) the laptop and low end desktop platforms have 2 channels, and at least with the boards I've seen the max configuration supported is 4x4GB for a total of 16GB.
The current intel high end desktop platform has 3 channels, and at least with the boards I've seen the max configuration supported is 6x4GB for a total of 24GB.
Workstation/server platforms go much higher, with the right board and a big enough budget you can get 18x8GB (maybe more
Re: (Score:2)
and the Nehalem-EX with 8 sockets go as high as 2TB of RAM per server.
IIRC in theory you get four memory buffers per processor, each with two channels. With two modules per channel, two channels per expander, four expanders and 8 processors you would get 128 modules. With 16GB modules you would indeed have 2TB of RAM.
But does anyone actually sell a board that will take that much RAM (the only Nehalem-EX board I've seen for sale was from Supermicro and only had four CPUs and one module per channel so only sup
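For anyone checking the arithmetic, here is how those numbers multiply out (topology as described in the parent; the 16GB module size is the assumption):

```python
# Module-count arithmetic for the hypothetical 8-socket Nehalem-EX box
# described above (topology as the parent states it; 16GB DIMMs assumed).
sockets = 8
expanders_per_socket = 4      # memory buffers/expanders per processor
channels_per_expander = 2
modules_per_channel = 2

modules = sockets * expanders_per_socket * channels_per_expander * modules_per_channel
print(modules)                # 128 modules
print(modules * 16, "GB")     # 2048 GB, i.e. 2TB with 16GB DIMMs
```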
minor correction: (Score:2)
s/only supported 16 modules/only supported 32 modules/
Re: (Score:2)
If it's really as low as $250K that doesn't actually sound like too bad a deal. It's "only" $125 per gigabyte.
Comparatively I built a 48GB dual quad box a while ago for about £3000 (including VAT but also including our discounts from the supplier) which at current exchange rates works out to just under $100 per gigabyte. I could have got that price lower with slower processors.
Not as large a difference as I expected really but I still think in a ram dominated virtualisation situation the dual socket b
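Rough dollars-per-gigabyte check on both boxes (the exchange rate below is my guess at a 2010-ish figure, not the poster's exact number):

```python
# Back-of-the-envelope $/GB for the two systems mentioned above.
big_box = 250_000 / 2048            # 2TB Nehalem-EX box: ~$122/GB ("only" ~$125/GB)
gbp_to_usd = 1.55                   # assumed exchange rate
small_box = 3000 * gbp_to_usd / 48  # 48GB dual-quad box: ~$97/GB ("just under $100/GB")
print(round(big_box), round(small_box))
```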
Re: (Score:2)
On the whole I'd concur, but it does depend very much on your workload - YMMV. For most "enterprisey" setups, you'll probably be running a million and one individual utility servers that spend 99% of their time doing nothing and, if it's anything like where I work, management will want every VM overspecced and treated as if it were a physical machine - "the lowest-end hardware we can currently buy for a domain controller is a quad core box with 4GB of RAM... so all VMs must have at least four processors and
Re: (Score:2)
Heh, no, I'm based in London. But incompetent management is as universal as shitty apps running in VMware because there's no other way to run nine hojillion 1U rackmounts :)
Re: (Score:2)
For virtualization workloads:
I run a major virtualization operation (>1000 vms). Dell M600 blades loaded with 32GB of RAM and 2x4 Nehalems run at about 25% CPU utilization when fully loaded down. You can do the memory-cpu math from there. In our operation we'd likely run out of storage throughput first, actually, but the SAN is its own design issue.
C//
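To make the memory-cpu math explicit (the 2GB-per-VM average is my assumption, not the parent's figure):

```python
# Illustrative capacity math for a blade like the one described above.
blade_ram_gb = 32
ram_per_vm_gb = 2                                  # assumed average VM size
total_vms = 1000

vms_per_blade = blade_ram_gb // ram_per_vm_gb      # 16 VMs fill a blade's RAM
blades_needed = -(-total_vms // vms_per_blade)     # ceiling division: ~63 blades
print(vms_per_blade, blades_needed)

# If a RAM-full blade only sits at ~25% CPU, the cluster runs out of memory
# (or, as the parent notes, SAN throughput) long before it runs out of CPU.
```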
Re: (Score:2)
But Sandy Bridge isn't really a server side chip.
virtualization workloads are not really important for this CPU unless you are running a Mac and Parallels.
Re: (Score:2)
An i7 is a desktop chip.
For a server you should be using a Xeon or one of AMD's new G34 CPUs. AMD makes an 8 core G34 server CPU that is under $300.
Hehehe (Score:4, Funny)
Re:Hehehe (Score:5, Funny)
Please let me push a button on the case to enable "turbo" mode.
Lol. Those were the days. I once worked in a computer shop in the mid 90's where we upgraded some guy's 386 to one of the new 486s (DX, I think) by swapping out the entire board, but we kept the case to save him some money.
He comes back in the shop and complains that the turbo mode doesn't work anymore. We tried to explain that the new model was way faster than the 386 even in turbo mode, but he didn't seem to understand.
So one of us takes it into the back and rigs the button to simply light up the turbo LED when you press it. He seemed pretty happy with the results.
Re: (Score:2)
He comes back in the shop and complains that the turbo mode doesn't work anymore. We tried to explain that the new model was way faster than the 386 even in turbo mode, but he didn't seem to understand.
So one of us takes it into the back and rigs the button to simply light up the turbo LED when you press it. He seemed pretty happy with the results.
You should have sold it as a 486 special edition. Would have been cool if you could have rigged it up with a speaker for some extra loud fan noise too ;-)
Re: (Score:2)
Fans... in a 486? Don't think so - most of the 486s (and a substantial number of early pentiums) I saw were passively cooled. I think the first HS+Fan combo I saw was on a P120 back in the day.
The early original 486s (50/66/75MHz) ran passively without any problems, but the later i486DX4:100 and AMD 5x86:133 models needed the fans if I remember correctly (especially if you overclocked them, as everyone did)...
Re: (Score:2)
I loved that little light. I won't buy a PC that doesn't give me that little light and a little button to activate it with. Add the ability for me to set the color of the light programmatically, and I'll be brand loyal.
Comment removed (Score:5, Interesting)
My turbo button really worked! :) (Score:2, Interesting)
I had an AMD 486 DX5 at 133MHz in a 386 case, after some upgrades...
I connected the turbo button to the bus speed jumpers, so when I pressed it, the bus jumped from 33MHz to 40MHz, overclocking the CPU to 160MHz... I ran at "full" speed when I was at home and put it back to normal speed when I left it idle.
To my surprise, it worked really well. The PCI bus accepted that speed, and the network and SCSI cards never gave any errors until I disconnected the computer about 6 years ago.
I also tried to up the bus to 50MHz and the CPU,
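For the curious, the multiplier math behind that mod (assuming the fixed 4x bus multiplier those DX5/5x86 parts used):

```python
# Bus clock times the (assumed) fixed 4x multiplier of a 133MHz DX5/5x86 part.
multiplier = 4
for bus_mhz in (33.3, 40, 50):
    print(f"{bus_mhz} MHz bus -> {bus_mhz * multiplier:.0f} MHz core")
# 33.3 -> 133 (stock), 40 -> 160 (the working "turbo"), 50 -> 200 (outcome cut off above)
```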
Re: (Score:2)
The ironic thing is that the "Turbo" speed was actually the native speed of the CPU. When you disabled turbo, you were actually underclocking it so that applications (games really) would run slower.
Yeah, what's crazier is that there was no particular "compatibility" speed; they were just slower by some random factor. The only time I ever functionally used turbo was on a 286 that would operate at 8088 speed - the CPU in the original IBM PC - when the turbo was off. Everything since then just assumed it could be run on different CPUs at different speeds. It made no sense to make a 33 MHz 386 into a 25 MHz 386 or whatever it was for anything.
Re: (Score:2)
> So one of us takes it into the back and rigs the button to simply light up the turbo LED when you press it. He seemed pretty happy with the results.
I bet his amp goes up to 11 as well !
Re: (Score:3, Funny)
Reminds me of a story too...
My Dad had a new-ish 386 PC which he loved, he especially loved how fast it was. One weekend I played some games on it, one of which (maybe Level 42?) needed the turbo off, as it was way too fast to play at the full nosebleed-speed of 33 MHz. I then went away for the week.
When I came back that Saturday lunchtime, he was literally waiting on the driveway for me, purple with fury. He'd been struggling for the whole week with an unusably slow PC, and he'd tried rebooting, and he'd
Laptops still have a turbo mode (Score:2)
Please let me push a button on the case to enable "turbo" mode.
It's not a button on the case, but several laptops give the user a taskbar control to change the power-management strategy. So you have a "turbo" setting and a "battery life" setting.
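On Linux you can do much the same thing from a script by switching the cpufreq governor; a minimal sketch (the sysfs path layout is an assumption about typical distros of that era, and it needs root):

```python
# Toggle between a "turbo"-ish and a battery-friendly CPU frequency governor
# by writing to the cpufreq sysfs nodes (path layout assumed; run as root).
import glob

def set_governor(governor):
    for path in glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_governor"):
        with open(path, "w") as f:
            f.write(governor)

set_governor("performance")    # the "turbo" setting
# set_governor("ondemand")     # the battery-life setting
```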
Re: (Score:2)
And in 2 years they'll go to 22.
Wow... (Score:4, Funny)
Any news for Apple in this (Score:2)
Re: (Score:2)
The Sandy Bridge GPU is still weak by Apple's standards, but they can't keep using Core 2 forever. As long as OS X can compile OpenCL into AVX code I think Sandy Bridge will work OK in future MacBooks.
Re: (Score:2, Flamebait)
Re: (Score:2)
How is that new intel integrated GPU compared to the nVidia 320M currently used by Apple?
Re: (Score:2)
Re: (Score:2)
I would say none.
No support for OpenCL which I feel is really stupid.
Maybe Apple can make a translator.
More coverage: videos of Intel demos (Score:2)
and Intel 2nd Generation Core Processor to specialise in media processing [goodgearguide.com.au]
Not a single word on Intel killing overclocking? (Score:3, Interesting)
Re:Not a single word on Intel killing overclocking (Score:5, Funny)
Not a single word on Intel killing overclocking, eh? According to Anand's article, the majority of new CPUs won't allow ANY kind of overclocking.
And 128 nerds cried themselves to sleep... :)
Really need to rationalise naming (Score:4, Insightful)
Thought the '2' in Core2 referred to the second generation already...
With the Core i3/5/7 being the third these are more like the fourth generation.
Might be time for people who make C(G)PUs to have a rethink on naming schemes... maybe even take a leaf out of the software industry, e.g
Core i .
Re: (Score:2)
Hm. SNAFU something grabbed rest of my post. :-(
Anyway
Core i[number of cores] [major design].[revision] would ease confusion... then just have a list of major design to code name mappings (or even append the name when listing CPUs on your product pages if you are a vendor) and the job is done; at least in terms of stopping the naming confusion.
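For what it's worth, a toy sketch of how that scheme would read (the core counts, design numbers and codenames here are made up for illustration):

```python
# Format a name under the proposed "Core i[cores] [major].[revision]" scheme.
def core_name(cores, major_design, revision, codename=None):
    name = f"Core i{cores} {major_design}.{revision}"
    return f"{name} ({codename})" if codename else name

print(core_name(4, 3, 0, "Sandy Bridge"))   # e.g. "Core i4 3.0 (Sandy Bridge)"
print(core_name(2, 3, 1))                   # e.g. "Core i2 3.1"
```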
Re: (Score:2)
Maybe it'll be Core ii X ? The i is roman numeral lower case?
Core ii9 here we come!
Re: (Score:2)
The Core Duo processor was basically a dual Pentium M with SSE3 on a single die. The Core microarchitecture, on the other hand, was found in Core 2 processors.
http://en.wikipedia.org/wiki/Core_(microarchitecture) [wikipedia.org]
This naming scheme will probably continue as long as processors have cores. Makes you wonder how processors worked at all, before the Cores were introduced.
Re: (Score:2)
Thought the '2' in Core2 referred to the second generation already...
With the Core i3/5/7 being the third these are more like the fourth generation.
Not really. Sandy Bridge seems to share more architectural similarities with Core/Core2 than with Core i# chips - they were obviously made by different, albeit likely related, internal development teams. SB doesn't support triple channel RAM, for starters.
Re: (Score:2)
maybe even take a leaf out of the software industry, e.g
Core i .
i++;
What is TDP? (Score:2)
From TFA:
Of course, we are left wondering what TDP means now, if exceeding it is standard.
Ironically, I was already wondering that. It never said what TDP is, and a Google define: search wasn’t terribly enlightening.
Re: (Score:3, Informative)
Thermal Design Power. Basically a measure of the amount of cooling required to prevent the chip frying.
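As a toy illustration of why exceeding it can still make sense (this is not Intel's actual turbo algorithm, just a sketch of the idea):

```python
# Toy model: turbo may spend more than TDP while the die is cool, because
# the real constraint is temperature, not the nominal power rating.
def turbo_allowed(package_watts, tdp_watts, die_temp_c, t_max_c=100):
    if die_temp_c >= t_max_c:
        return False                       # thermally limited: must back off
    # Illustrative rule: allow ~20% over TDP while there is plenty of headroom.
    budget = tdp_watts * (1.2 if (t_max_c - die_temp_c) > 20 else 1.0)
    return package_watts <= budget

print(turbo_allowed(105, 95, 60))   # True: cool chip, briefly exceeding TDP is fine
print(turbo_allowed(105, 95, 95))   # False: near the limit, stay at or below TDP
```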
Wikipedia is your friend (Score:2)
http://en.wikipedia.org/wiki/Thermal_design_power [wikipedia.org]
Just google for TDP to begin with.
Not worth upgrading really... (Score:2)
Sandy Bridge is only going to be like 20-30% faster than what's available right now; the last two generations after the Core 2 Duo have only had modest gains in the 20-30% range. I've not been impressed at all lately, it seems technological leaps have slowed right down, for CPUs at least.
Re: (Score:2)
Re: (Score:2)
Are you serious?
Core i5/i7 are significantly faster than the previous generations, even without the built-in overclocking. They're on par with AMD again (at least in terms of performance), and they've got triple channel memory support - a big win. They've also replaced the vastly inferior FSB, allowing for system throughput that rivals what AMD has, again.
If you're not seeing gains, then it is quite likely that you do not need the newer processors - that is, the ones with more than one or two cores.
Re: (Score:3, Insightful)
Yeah. No one ever buys a desktop, and they certainly don't ever want it to be faster.
Re: (Score:3, Interesting)
If I want to make my desktop faster, I can replace the graphics card or CPU independently - it's big enough that an integrated CPU/GPU solution doesn't really make that much sense yet.
Mobile devices, on the other hand, make a lot more sense; if you can integrate the CPU and GPU on one chip with a reasonable max TDP, that's significantly less complexity in the design with more computing power. You should see the heatsink arrangement in my HP laptop with a discrete CPU and GPU - it's insane, heat pipes and fa
Re: (Score:3, Insightful)
Re: (Score:2)
You do realize that a) Intel makes mobile chips as well that take power saving into consideration and b) TFA doesn't say it, but this feature will almost certainly be configurable via the BIOS and/or OS.
Indeed: in normal use while web-browsing and the like -- at least according to the Linux battery monitor -- my i5 laptop takes only slightly more power than my Atom netbook. But if I plug it into the wall I can play any modern game decently (with its Nvidia GPU, not whatever's integrated with the CPU).
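If anyone wants to reproduce that comparison, this is roughly how to read the draw off the battery on Linux (the sysfs layout varies between machines, so treat the paths as an assumption):

```python
# Read the instantaneous battery discharge rate in watts from sysfs.
def battery_watts(bat="BAT0"):
    base = f"/sys/class/power_supply/{bat}"
    try:
        with open(f"{base}/power_now") as f:
            return int(f.read()) / 1e6                     # microwatts -> watts
    except FileNotFoundError:
        # Some batteries report current/voltage instead of power.
        with open(f"{base}/current_now") as fc, open(f"{base}/voltage_now") as fv:
            return int(fc.read()) * int(fv.read()) / 1e12  # uA * uV -> watts

print(round(battery_watts(), 1), "W drawn from the battery")
```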
Re: (Score:2)
Meh, maybe I'm just an embedded person who treasures ARM above all else and thinks that 640k ought to be enough for anyone.
Re: (Score:3, Insightful)
I'm assuming that even if there are dead pins on the current socket, that can be used for the video portion, no existing boards will have this capability... so it wouldn't matter anyway, right?
thinkin' new socket.
Re: (Score:3, Informative)
If you went with 1156, which I did (P55 Classified + i7 860 @ 4.0 Ghz), then you're screwed, just earlier, since it's now Socket 1155, which isn't compatible even though it's just a 1 pin difference.
I wasn't very happy with Intel when I found this out, since they've recently switched sockets after holding on to 775 for so long, but from my understanding AMD has also don
Re: (Score:2)
Meh.. you had to know that *someone* would be left holding the bag when the next cycle came. That much was obvious. I'd be more pissed if I had gone with the 1366, rationally expecting the higher-end to have the longer life. As it turns out, everyone who bought an i-anything hoping for upgradeability is taking it in the socket.
Re: (Score:2)
And that is why I go for AMD. Identical socket with extremely gradual incompatibilities.
Re: (Score:2)
Yes it will need a new socket.
Re:I have first-ed this article... (Score:5, Interesting)
Re: (Score:2)
Anand had early samples and showed that the Intel integrated SB video was actually faster than a Radeon 5450 in most cases.
To put this in context for someone switching from console to PC gaming, is this equivalent to a Wii's Hollywood GPU, equivalent to an Xbox 360's Xenos GPU, or somewhere in between?
Re: (Score:2)
It is not as good as the GPU in the Xbox 360.
http://www.tomshardware.com/reviews/gaming-graphics-card-geforce-gtx-480,2598-6.html [tomshardware.com]
In this chart, the Xbox 360's GPU is about the same as the X1900.
Re: (Score:2)
Not to mention that developers could spend days or weeks optimizing for the consoles' 3 graphics processors and have it pay off with a smooth experience. But with nearly a hundred desktop GPUs in varying usage, with varying levels of DirectX and OpenGL support, there's not that "easy target" for optimization.
Re: (Score:2)
My question would be: are those wanting a laptop that is too small for a discrete GPU, and who care about graphics performance, going to be better off staying with a Core 2 Duo with an nvidia chipset, or will they be better off with Sandy Bridge?
Unfortunately anandtech ( http://www.anandtech.com/show/3871/the-sandy-bridge-preview-three-wins-in-a-row/7 [anandtech.com] ) didn't include nvidia integrated graphics in their comparison. Also they were using a desktop not a laptop chip afaict (though they don't seem to know for sure
Comment removed (Score:5, Interesting)
Re: (Score:2)
I've been out of the PC building rat race for several years now, and I'm diving back in. I don't know what AMD and ATI have to offer because Intel and NVIDIA are getting the stars with technologies like Turbo Mode, SLI, and low heat dissipation in the i7. All I've been reading about the new Geforce GT and GTS has me very excited for all the graphics power I'll have, although the lack of support for Starcraft II on my 7600 GT based iMac has me pissed. Do you think AMD and ATI have something worthwhi
Re: (Score:3, Interesting)
Re: (Score:2)
Thanks. As I just told one slashdotter, I've been leaning toward a laptop since maintaining an iMac (which I use for development and work) on my desk will be difficult if I have to add a new PC monitor. :D
I've always loved AMDs, but even being out of the game, I've still heard major news coming out from Intel. AMD always seemed silent to me. Basically, my story is that I had had enough with my Pentium IV PCs leaving my computer room frying in winter, so I became more conscious to issues like heat dissipatio
Re: (Score:2)
Re: (Score:2)
Hehe. That Acer does look good. I like that the 5650 in it is DirectX 11. The MSI looks good too, but I believe the 3200 is only DirectX 10. Of course, nothing I'm interested in that's here or on its way soon is DirectX 11, so perhaps DirectX 11 support early on is a bit unnecessary.
KVM switch won't work with my setup. The iMac is an all-in-one from '06, so any desktop monitor would need to share desk space with my iMac which I use for work.
Forget turbo mode, it's nearly 2011, the innovations we need are in
Re: (Score:2)
Re: (Score:2)
Here's the thing about Apple, and it gets back to the recent Slashdot article about how good software makes us stupid; when you have an all-in-one computer that just works, you can easily grow complacent. I think back to my PC building days. Much of the knowledge I had, as well as the technical skills, were the product of me chasing necessary upgrades for parts that either failed or were not powerful enough for the next generation round of games or operating systems. By using a reliable all-in-one, I lost t
Re: (Score:2)
I think the newer iMacs can handle video in and work as a monitor. Might be something to look in to.
D'oh! It has to be DisplayPort output from another system, and then the 27" iMac will work as a monitor.
Re: (Score:2)
Yes. Previously this was a problematic issue. I think you had to find a specific Belkin-branded part to get the iMac to act as a monitor for a non-Macintosh computer. Maybe it's all smooth now.
Re: (Score:2)
If you don't want to wait you can buy AMD now and thanks to socket compatibility drop in a bigger CPU later.
Bulldozer CPUs will require a new socket called AM3+.
Re: (Score:2)
Re: (Score:2)
Thanks for the encouragement. So far, I'm leaning toward a laptop since I don't have space to house an iMac plus a PC monitor, and on the laptop end, my options for NVIDIA seem limited to the Geforce GT 330M. I'm sure I can get a lower end GT 400 series NVIDIA in a laptop, but I'm guessing that would require I buy an Alienware or some other beastly looking laptop. I want sleek and Sony'ish like a proper laptop should be, so that's that. I haven't seen an ATI Radeon 5870 in a normal looking laptop. Of course, I have
Re: (Score:2)
Awesome, thanks a lot! I'm surprised to learn that the GT 330M is still a DirectX 10 processor whereas the current ATI is DirectX 11.
Re: (Score:2, Insightful)
I would suggest checking out ATI, for the last year and a half or so nVidia has been playing catch up to ATI,
Unfortunately, for the last decade, ATI has been playing catch up to nVidia for quality of drivers. While their quality has improved considerably over the last several years, they are still many years behind that of nVidia; especially for OpenGL drivers.
And like it or not, for Linux, you still have exactly one high end 3D solution - nVidia.
I'd rather be a few frames slower with nVidia than slightly faster and unstable or unplayable with ATI. ATI just has a horrible track record even on Microsoft platforms. J
Re: (Score:3, Informative)
gets shot down every single time
If it gets shot down at all, ignorance is prevailing. I've been reading several forums on lesser known games (Spring RTS, for example) and ATI drivers frequently cause problems. The situation I depicted RECENTLY happened, and if you search the archives, various problems constantly pop up. To imagine this is not a problem is to be delusional. Seriously.
Exactly as I said, if you don't care for OpenGL compatibility, ATI drivers will likely be a good experience for you. If OpenGL and/or alternate platforms are
Intel buy nVidia? Replace Intel CEO Otellini? (Score:3, Interesting)
Should Intel buy nVidia? Jen-Hsun Huang [wikipedia.org], who averages about $23.02 million per year [forbes.com], is not the sort of person who would easily integrate into Intel, and he is important to the leadership of nVidia. Intel's CEO, Paul Otellini [wikipedia.org], makes about $14 million. [computerworld.com]
Soon Intel's integrated graphics will have mid-range speed, leaving only the high range for nVidia. The high range of video adapters is mostly bought by teenagers who want to
Re: (Score:2)
Neo dual based netbooks too; they are getting around 5 hours on a charge and the graphics and video performance is awesome
I have one, and only get about 3 hours out of a 6 cell battery. It does have good video and raw CPU performance for a netbook though.