Intel Unveils 10-Watt Haswell Chip
adeelarshad82 writes "At IDF, Intel announced the company's fourth-generation Core processor, code-named Haswell. The chip is built on the same 22nm process used in the current third-generation Core products. What makes this chip remarkably different from the third-generation chips is its ability to produce twice the graphics capabilities at much lower power consumption, which Intel has achieved by making use of a number of tactics."
HotHardware has video of Haswell running a 3D benchmark.
Re:Compared to ARM (Score:5, Interesting)
When you consider that an x86 chip uses 3x the power but can run a benchmark such as multithreaded Linpack 1000x faster, it suddenly seems like we're getting ripped off by these ARM processors.
In reality, this processor consumes 20x less power (I assume that means 1/20th) than the current Ivy Bridge chips. I presume that's under normal use. It's a huge win for laptops.
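The math behind these comparisons is just energy per task: power draw times time to finish. A rough Python sketch using the thread's claimed ratios (the absolute wattages here are made-up placeholders, not measurements):

    # Energy per task (J) = power (W) x time (s).
    # Ratios from the comments above: x86 draws ~3x the power
    # but finishes ~1000x sooner; absolute numbers are invented.
    arm_power_w, x86_power_w = 2.0, 6.0
    arm_time_s = 1000.0                      # normalize: ARM takes 1000 s
    x86_time_s = arm_time_s / 1000.0         # claimed 1000x faster Linpack

    arm_energy_j = arm_power_w * arm_time_s  # 2000 J
    x86_energy_j = x86_power_w * x86_time_s  # 6 J
    print(arm_energy_j / x86_energy_j)       # ~333x less energy per task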
Re: (Score:2)
Perhaps you're talking about Medfield. I'm talking about i7s.
Re:Compared to ARM (Score:5, Informative)
Re: (Score:2)
So most of the time, the processor is idle. The rest of the time, it's processing 1000x faster than an ARM CPU. Given that current mobile CPUs use somewhere around 60-70W under full load, this bodes well, as the x86 processors are doing a lot more work for only about 10x the load power draw. When idle, the CPUs draw far less.
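The "race to idle" argument can be made concrete: over a fixed period, total energy is load power times busy time plus idle power times the rest. A sketch with illustrative numbers (only the 60-70W load figure comes from the comment above; the idle draws and times are assumptions):

    # Total energy (J) over a fixed period, "race to idle" style:
    # the faster chip finishes sooner and idles longer.
    PERIOD_S = 3600.0

    def total_energy_j(load_w, idle_w, busy_s):
        return load_w * busy_s + idle_w * (PERIOD_S - busy_s)

    # Assumed: ARM ~7W load / 0.5W idle; x86 ~70W load / 1W idle,
    # finishing the same job ~1000x sooner.
    arm = total_energy_j(load_w=7.0, idle_w=0.5, busy_s=1000.0)  # 8300 J
    x86 = total_energy_j(load_w=70.0, idle_w=1.0, busy_s=1.0)    # ~3669 J
    print(arm, x86)  # x86 can win despite 10x the peak draw --
                     # but only if its idle power is genuinely low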
Closing in on Atom (Score:5, Interesting)
Re: (Score:2)
Intel's top Atom chips have a 10W TDP. Of course the chipset/RAM are also a large factor, but still -- this is an amazingly frugal CPU.
At IDF, Intel also talked about upcoming 5W Atom chips that will be out at the same time as Haswell.
Re:Closing in on Atom (Score:5, Funny)
>> 5W
How the hell am I supposed to reheat pizza on that?
Re: (Score:2)
Re:Closing in on Atom (Score:4, Funny)
The part I was impressed with was how they did it...
"[...] which Intel has achieved by making use of a number of tactics."
+5 Informative!
Re:Closing in on Atom (Score:5, Funny)
The part I was impressed with was how they did it...
"[...] which Intel has achieved by making use of a number of tactics."
+5 Informative!
Welcome to the internet. We have these things called hyperlinks. Any time you see underlined text in a different color, you should consider clicking on it. If you had clicked on the phrase "number of tactics", you would have been taken to another article explaining many of these tactics.
Re: (Score:2)
So, what did you guys think of the Hot Hardware link?
Re:Closing in on Atom (Score:5, Interesting)
The best part is that, unlike Atom, these things are usably fast. I have a 2x1.3GHz Core 2 process shrink or something with a TDP of 12W (total system is about 26W... under full load). I mostly live in Emacs, but I like a compositing window manager (wobbly windows are fun, alright) and GL screen hacks... the thing does great and can handle my regular abuse of PostgreSQL/SBCL/mlton/... all while getting about 8-9 hours of realistic use (ok, closer to 7 now that the battery is down to 72Wh from its 84Wh theoretical max when it was new), and all under 10W generally. Sign me up for something that uses about the same power and is just a bit slower than the current Ivy Bridge processors... (come on laptop, don't die until next summer).
And it all Just Works (tm) with Debian testing (it even worked with Squeeze, but GL was a bit slow since it predated the existence of the graphics hardware and all). Now, if only someone would make one of these low voltage things with a danged Pixel Qi display or whatever Qualcomm has so that I can use it outside... not having to worry about finding power every 2 to 3 hours is great, but if you're still stuck indoors it's not half as great as it could be.
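The battery-life figures above are just capacity divided by average draw; a trivial sketch using the parent's numbers:

    # Runtime (h) = battery capacity (Wh) / average system draw (W).
    def runtime_h(capacity_wh, avg_draw_w):
        return capacity_wh / avg_draw_w

    print(runtime_h(84.0, 10.0))  # ~8.4 h on the battery's original capacity
    print(runtime_h(72.0, 10.0))  # ~7.2 h now that it has worn down to 72Wh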
Re:Closing in on Atom (Score:4, Informative)
Intel's top Atom chips have a 10W TDP. Of course the chipset/RAM are also a large factor, but still -- this is an amazingly frugal CPU.
You're thinking of the wrong Atom CPU there. You want to compare Intel's lowest-power Core architecture to...their lowest-power Atom.
Intel has placed an Atom Z2460 on a smartphone platform, complete with a 1.6 GHz core speed and sub-1W typical power consumption [anandtech.com], and they've done it on just the old 32nm process. The 10W parts you're thinking of are for desktops.
These 10W Haswell chips will also be the pick of the litter, but the power consumption will be nowhere near that of Atom (and neither will the price... expect to pay upwards of $300 for these exotic cores). The Lava Xolo X900 costs only $400 on the street [anandtech.com], so you can imagine Intel's charging around $25 for their chipset.
Graphic Capabilities (Score:3)
So wait, is this only about the graphics part inside the CPU or what?
Who cares about that graphics part inside the CPU. Useful for a laptop maybe, but for the real stuff you need an actual graphics card.
Re:Graphic Capabilities (Score:4, Insightful)
Most PCs sold use integrated graphics, which have traditionally been abysmal. In the last few years, seemingly pushed by AMD, Intel has been looking to correct that.
Re: (Score:1)
Wrong, the only reason you have better integrated graphics from Intel is because AMD has been pushing performance since they bought ATI.
Alternative history. (Score:2)
The only reason you have microprocessors of any kind is because Intel invented them.
Or Gilbert Hyatt [time.com] if you believe the story.
Re: (Score:2)
However, for those of us that live in the UK, there was no chance whatever of getting the funding to actually make one.
Re: (Score:1, Informative)
Pushed by Intel. AMD is following... still.
The GPU parts in AMD's "APUs" are miles beyond Intel's HD Graphics.
Re: (Score:2)
Re: (Score:2)
Re:Graphic Capabilities (Score:5, Insightful)
2) 1/3 the TDP is the difference between a battery with power and one without.
Re: (Score:2)
Using integrated graphics for gaming if you are concerned about framerates is just dumb.
So what 10" laptop has discrete graphics? Or should only turn-based games be played on a 10" laptop?
Underwater on a device (Score:2)
If the device doesn't meet your requirements, look for one that does.
Which is difficult if one is still making payments on the device that no longer meets one's expanded requirements, or if someone else controls the purse-strings for a household or business and fails to appreciate the expanded requirements. It's also difficult in a case where price, performance, and size are in a "pick two" relationship.
Re: (Score:2)
Which is difficult if one is still making payments on the device that no longer meets one's expanded requirements, or if someone else controls the purse-strings for a household or business and fails to appreciate the expanded requirements.
That sounds more like a personal problem than a technical one. Can't help you there.
It's also difficult in a case where price, performance, and size are in a "pick two" relationship.
Pretty much what I'm saying, except mine are price, performance, and power usage.
Re: (Score:3)
Re: (Score:2)
the difference between 40fps and 20fps
Is that average FPS or minimum FPS? For some reason benchmarkers tend to focus on the former, while the latter is what is really important. A game that plays at a solid 20fps would probably be tolerable (it's not much lower than movie framerates, after all); one that plays at 30fps most of the time but bogs down in intensive scenes would not.
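The average-versus-minimum point is easy to see from a frame-time log: a handful of long frames barely moves the average but wrecks the minimum. A small sketch (the frame times are invented for illustration):

    # Per-frame render times in milliseconds: mostly smooth, a few big hitches.
    frame_times_ms = [25.0] * 95 + [100.0] * 5

    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    min_fps = 1000.0 / max(frame_times_ms)

    print(round(avg_fps, 1))  # ~34.8 fps -- looks fine in a review bar chart
    print(round(min_fps, 1))  # 10.0 fps  -- what you actually feel in heavy scenes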
Re: (Score:2)
Good news! Haswell (the GT3 variants of it, anyway) should approximately double Intel's IGP performance. For example, they demoed it playing Skyrim on High settings at 1920x1080.
Plus, some other good stuff [arstechnica.com].
Re: (Score:3, Informative)
The integrated graphics are still crap.
The thermal overhead added to the CPU die limits the amount of computational power they can realistically add. Not to mention that on enthusiast systems it creates needless waste heat, eating a thermal budget that could be better spent on CPU cycles. (Supposedly we'll see some tightly integrated CPU+GPU systems with shared memory space and registers and whatnot... But we're far away from that, as it presents a large departure from traditional PC architecture, let alone x86 arch. AMD is way ahea
Re: (Score:2)
You could use integrated GPUs for vector/matrix math, something they're a lot better at than the x86 core, thus greatly increasing the efficiency of certain workloads.
You don't have to use them only for making pretty pictures.
Re: (Score:2)
Correct, and GPU/CPU combinations also gain the next-fastest memory tier below the registers, instead of going off-die like even discrete GPUs have to for some workloads.
Re:Graphic Capabilities (Score:4, Informative)
"The integrated graphics are still crap."
Depends what for, really... Office, web and HD video? Nope, they're pretty good at that - so good, in fact, that I don't buy machines with dedicated graphics cards unless I'm planning on playing games or running applications that specifically require a fast GPU.
Even the HD3000 or HD4000 (Sandy and Ivy Bridge, respectively) graphics included with the last and current generations of Intel Core iX CPUs are overkill for most people - even a 4500MHD (Core 2 Duo 2nd generation) had perfect support for 1080p acceleration and ran Windows 7 at full tilt with all the bells and whistles, if you wanted those. What more do you want from integrated graphics?
The fact that I can even play Starcraft II on low at 1080p on a Core i3 with the integrated HD3000 at acceptable framerates is just icing on the cake...
Oh and have I mentioned the sub-5W total system power consumption on a 15.6" laptop with a standard voltage CPU? THAT is what integrated graphics are for. If you're looking to do gaming or CAD or use the GPU for computationally intensive tasks, you're not in the target audience...
Re:Graphic Capabilities (Score:5, Informative)
Ivy Bridge's HD4000 comes very close to matching the performance of a mid-grade dedicated mobile GPU from a couple of years back [notebookcheck.net] while burning a helluva lot less power. So the delta between mid-grade dedicated video and integrated video performance is down to a little over 2 years now. Intel claims Haswell's 3D video is twice as fast as the HD4000's. If true, that would put it near the performance of the GT 640M, and lower the delta to a bit over 1 year.
This is all the more impressive if you remember that integrated video is hobbled by having to mooch off of system memory. If there were some way to give the HD4000 dedicated VRAM, then you'd have a fairer apples to apples comparison of just how good the chipset's engineering and design are compared to the dedicated offerings of nVidia and AMD.
I used to be a hardcore gamer in my youth, but life and work have caught up and I only game casually now. If Haswell pans out, its integrated 3D should be plenty enough for my needs. It may be "crap" to the hardcore gamer, but they haven't figured out yet that in the grand scheme of things, being able to play video games with all the graphics on max is a pretty low priority.
Re: (Score:2)
have I mentioned the sub-5W total system power consumption on a 15.6" laptop with a standard voltage CPU?
Obviously at least an LED backlight, and probably turned way down. Or what, is it OLED?
Re: (Score:3)
Laptop displays have been LED backlit for years now - you can't buy a CCFL backlit display except maybe as a standalone monitor in the clearance aisle of your local big box electronics store...
As for AMOLED... that's useless as a laptop display, because it uses 2-5x as much power as a decently efficient LED backlit display when displaying mainly-white content (such as Slashdot or other websites) - not to mention the fact that AMOLED displays at this size (15.6" diagonal in this case, but consider this sente
When your requirements grow (Score:2)
Depends what for, really... Office, web and HD video? Nope, they're pretty good at that - so good, in fact, that I don't buy machines with dedicated graphics cards unless I'm planning on playing games
So if someone buys a laptop for "Office, web and HD video" and later decides to try games, what should he do? Buy another computer? Whatever happened to buying something that will grow with your requirements?
Re: (Score:2)
The problem is that buying a laptop (or even desktop - although the problems are usually more pronounced on laptops) with a high-powered graphics card has very negative side-effects:
1. More heat - Fan noise, uncomfortable heat during use, significant reduction in longevity (ever seen a non-plastic-POS Intel-based laptop without dedicated graphics overheat? I haven't...)
2. Higher power consumption - the most efficient laptops with dedicated non-switchable graphics draw upwards of 10W idle... many draw 15 or 2
Re: (Score:2)
GDDR3 is an optimized type of DDR2 memory. The 700MHz GDDR3 in the Xbox 360 is less than half the speed of the DDR3-800 that Atoms have used for the last couple of years. Even if you fit them with four times the memory, they can't get close to the 360's graphics performance?
In the context of Haswell, you're talking about entry-level dual-channel DDR3-1600, or around 25GB/s, beating the GDDR5 in the bleeding-edge, top-of-the-line discrete cards from just 4 years ago.
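Those bandwidth figures fall out of transfer rate times bus width times channel count; a quick sketch (the dual-channel DDR3-1600 line reproduces the ~25GB/s figure above, the other is a common single-channel config):

    # Peak bandwidth (GB/s) = transfers/s x bus width (bytes) x channels.
    def bandwidth_gbs(mt_per_s, bus_bits, channels=1):
        return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

    print(bandwidth_gbs(800, 64))               # single-channel 800 MT/s: ~6.4 GB/s
    print(bandwidth_gbs(1600, 64, channels=2))  # dual-channel DDR3-1600: ~25.6 GB/s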
Re: (Score:1)
Not true. Firstly, the memory channel to the iGPU is somewhat more sophisticated than just tacking it onto the main memory bus. Secondly, the iGPU has much less processing power than a top-end dGPU, so it needs much less memory bandwidth. Increasing bandwidth would be of no value.
iGPUs are mainly for budget laptops, where there is going to be no dGPU installed. Ivy Bridge and Trinity iGPUs are powerful enough to run Crysis (on low settings). Something not to be scoffed at, considering low-budget buyers never
Comment removed (Score:5, Insightful)
Re: (Score:2)
nVidia knows this too. As you can see, they've been focusing on advanced 3D gaming and supercomputing.
And mobile stuff (Tegra) where they have their own (licensed) CPU and GPU.
Re: (Score:1)
It's official. Intel on-board video is all you'll ever need for home and general office use.
Agreed, but don't confuse this with "you should recommend integrated graphics to home users". Your examples are perfectly tuned for Intel graphics because many people have them. For a business, that's fine; you only need a fixed set of applications. For a home user, it's likely to be some Flash game, Google Earth, or some software that works much better with a dedicated graphics card. Good ones are quite cheap now, and if you're looking at an "i5", spending some extra on a GPU gives more bang for the b
Re: (Score:2)
So what Intel has is "good enough" for 99.9% of users, but AMD delivers the same thing for less money?
Re: (Score:2)
Useful for a laptop maybe
Hmm, I wonder where these ultra low-power chips are intended to go...
Re: (Score:2)
Re: (Score:3)
Laptops make up something like 50% of the consumer market. Integrated graphics are what go in most Dells for corporate users. An HD4000 has no problem pushing a dual- or triple-screen setup. The triple-head displays at my work choke on anything more than word processing. Dragging a YouTube video across all three makes things very choppy. Also, the HD4000 is an actually usable chipset. It's nothing like the integrated graphics of old, like the GMA950, which couldn't even load TF2. HD4000 will do TF2 at 250-
Re: (Score:1)
Actually, I benchmark most machines by seeing how fast the cards bounce after winning a game of Solitaire, you insensitive clod.
Re: (Score:2)
Re: (Score:2)
Actually, I find the integrated GPU interesting - not for graphics, but for additional GPGPU power. Those things are fully OpenCL/DX11.1 compliant, so you can probably run some fluid simulation or n-body on them while at the same time doing some different crunching on the CPU, all being rendered extra pretty by a powerful discrete GPU.
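For the curious, here is a minimal sketch of that kind of GPGPU offload, assuming PyOpenCL is installed and an OpenCL driver exposes the integrated GPU (the kernel and variable names are ours, not anything Intel ships):

    import numpy as np
    import pyopencl as cl

    a = np.random.rand(1 << 20).astype(np.float32)
    b = np.random.rand(1 << 20).astype(np.float32)

    ctx = cl.create_some_context()   # pick the integrated GPU if prompted
    queue = cl.CommandQueue(ctx)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # Trivial element-wise add; real uses would be fluid/n-body kernels.
    prg = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] + b[gid];
    }
    """).build()

    prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

    out = np.empty_like(a)
    cl.enqueue_copy(queue, out, out_buf)
    assert np.allclose(out, a + b)   # CPU stays free for other crunching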
Re: (Score:2)
You do know that laptops outsell desktops? As for "real stuff", if you mean work, then boy, are you wrong. For anything outside of the sciences, CAD, CAM, and video and audio production, these will work very well. For all the home users who run Quicken and go on the web to use Facebook and such, this will do very well.
If these chips can get good enough performance on a 1080p monitor, then they will be a giant boon for gaming. Most people use a single 1920x1080 monitor; if this allows for a lot of games t
Re: (Score:2)
Why can't you have the integrated graphics render most things, and your games/CAD software use a discrete card when they need it?
NVIDIA Optimus finally coming to Linux (Score:2)
Why can't you have the integrated graphics render most things, and your games/CAD software use a discrete card when they need it?
Because until a couple weeks ago [phoronix.com], NVIDIA refused to make that technique (which it calls Optimus [wikipedia.org]) possible on a GNU/Linux operating system.
Re: (Score:1)
People who care about open source drivers to the point where they won't use ATI or NVidia on their personal machines (me).
Are you aware that there are open-source drivers for ATi and nVidia (the ATi ones even have 3D)?
While I can understand avoiding nVidia if you don't want to install a proprietary graphics stack, why avoid ATi/AMD when there are serviceable open-source drivers for all but the latest cards?
Re: (Score:2)
Who cares about that graphics part inside the CPU. Useful for a laptop maybe, but for the real stuff you need an actual graphics card.
I have seen it asserted a couple of times now that the current Intel integrated graphics are acceptable for light gaming. If the new stuff is twice as powerful (I'm confused by the summary, but don't care enough to RTFA, as I have no plans for new machines in that class any time soon), then it will be entirely useful for everyone but gamers who must play the latest and greatest at high resolution.
Incorrect summary is incorrect (Score:3, Informative)
Intel's statement was that it could produce similar results to Ivy Bridge at half the power consumption, OR around twice the graphics performance at the same power consumption as Ivy Bridge's built-in chip.
Which is still pretty good, all things considered.
Re: (Score:2)
Sounds awesome to me... I'll take half the power consumption, thanks. I wonder if that goes for total idle power consumption... I'm already seeing less than 5W idle (depending on which LCD panel is installed - add a Watt for the enormously inefficient AUOptronics 1080p panel Lenovo uses) on my Sandy Bridge laptop (and that power consumption includes the SSD), so Haswell should hopefully be able to drop that to 3-4W... hopefully that'll also average out to ~2W less in actual use - meaning a 94Wh battery woul
Re: (Score:2)
Haha, what do you have, a T520? I was excited about the T530 until I saw the keyboard. It's not even the chiclet thing, it's the missing row and consequently fucked up layout!
Aaanyway, I've been getting extremely impressive battery life out of the Sandy based laptops, so the future looks bright :)
Re: (Score:2)
AWESOME. I hope to $Deity that you're right... thanks for the info!
Selling function when all one can see is the form. (Score:2)
Welcome to the world of the supersmall. As real as software, and just as hard to impress anyone with by going, "see this".
Good direction (Score:1)
Intel has laid its share of rotten eggs, but for the past few years they seem to "get it" relative to the technology market. Consumers want lower power consumption, small form factor, and hardware acceleration for mobile access to Internet services. Companies want higher core density per physical chip, lower power consumption, and virtualization to better deliver services to the Internet. If Intel delivers the best products for each segment of that ecosystem, they have a bright future ahead of them.
i7 powered coke machine (Score:1)
I accidentally went into the article, and near the bottom they mention an i7-powered Coke machine. Now that's bloat.
C++ on ARM vs. C++ on x86 (Score:2)
If you go with Intel instead of some other embedded processor [for a vending machine], and for many units, I'm sure they'd cut you a deal. Your programmers will be cheaper too
How exactly are C++ programmers for ARM on something like a Raspberry Pi board cheaper than C++ programmers for x86 platforms?
Re: (Score:1)
That's just the code name for the Charlie Sheen bot they've got in skunkworks.
Re: (Score:2)
A lot of such devices are built around PCs (some use special embedded form-factor PCs, others just have a normal PC tower sitting inside them) despite them being overkill and not particularly reliable. I guess it's because Windows devs are easier to find than devs who can handle an ARM Linux board.
I do wonder why an i7, though; a Celeron would be more than sufficient.
Upgrade path (Score:1)
Didn't showcase Itanic III? (Score:2)
Still not good enough for me(just my opinion) (Score:2)
What I want for my ultimate mobile computing device:
1. Small, lightweight, and has a physical keyboard
I walk a lot, so I want a small device that fits comfortably in my backpack (so that's below 7'') and weighs less than 1.5 (preferably 1) pounds. I'm not an all-day mobile warrior, so I can live with a cramped keyboard, but after testing my wife's Galaxy S2 touch keyboard I decided I DO NEED a physical keyboard for typing documents/playing games (like NetHack and old DOSBox-compatible games).
2. MS application/IE comp
Wow - must remember bay trail (Score:1)
Happy to know that the Bay Trail platform finally drops the PVR graphics core. Hope some manufacturer produces the small-form-factor platform I want in 2013.
GMA in Atom netbooks (Score:2)
MS IE only internet banking
Other banks exist.
I need a graphics core that supports Linux well and plays Angry Birds. The PVR core in Atom doesn't support either.
Since when are PC makers still using GMA 500 (the PowerVR core) in new Atom netbooks? I thought they had all switched to four-digit GMAs, which have working drivers in Ubuntu.
Remember the GMA500 (Score:3)
Re: (Score:2, Informative)
The GMA500 was for embedded devices anyway, and not a real Intel chipset. Intel knows of the problem and is actively working on replacing those PowerVR chips with their own. ARM chips have the same or even worse problems as the GMA500: you don't have working drivers for those either; maybe some for Android, but not for Xorg.
Re:Remember the GMA500 (Score:4, Informative)
The GMA500 disaster showed how much Intel cares for end users after selling them the hardware.
GMA500 = rebranded PowerVR SGX 535. The graphics Intel develops itself isn't for serious gamers, but it's improved leaps and bounds over the last couple of years. You're of course free to be unhappy about Poulsbo, and with good reason, but most people with a recent Intel IGP are very happy, and sales of discrete cards are only going one way: down.
Re: (Score:2)
I know the Poulsbo chipset is a re-branded PowerVR. But that isn't the main problem here; I don't care whether it was re-branded or developed in-house. The problem is that Intel released that crap, then abandoned it. They had half-decent psb drivers, which were great for watching films without stressing the underpowered Atom CPU, but they just dropped any development (or even basic maintenance) for them. Then, after a huge outcry, promised Gallium drivers for it, had them almost finished, and never released the
Re: (Score:2)
The GMA500 is not an Intel chipset. It's a rebranded PowerVR SGX something or other.