Intel's Skylake Architecture Reviewed
Vigile writes: The Intel Skylake architecture has been on our radar for quite a long time as Intel's next big step in CPU design. We know at least a handful of details: DDR4 memory support, 14nm process technology, modest IPC gains and impressive GPU improvements. But how the "tock" of Skylake on the 14nm process will actually differ from Broadwell and Haswell has remained a mystery. That changes today with the official release of the "K" SKUs of Skylake — the unlocked, enthusiast-class parts for DIY PC builders. PC Perspective has a full review of the Core i7-6700K with benchmarks as well as discrete GPU and gaming testing that shows Skylake is an impressive part. IPC gains on Skylake over Haswell are modest but noticeable, and IGP performance is as much as 50% higher than on Devil's Canyon. Based on that discrete GPU testing, all those users still on Nehalem and Sandy Bridge might finally have a reason to upgrade to Skylake.
Other reviews available at Anandtech, Hot Hardware, [H]ard|OCP, and TechSpot.
Much ado about relatively little yet. (Score:1)
The performance increase is going to be negligible until the "new instructions" on Skylake are utilized more in daily software use. Buy today, pay a premium for basically no bump.
Re:Much ado about relatively little yet. (Score:5, Interesting)
Of course, the biggest monster jump didn't make it in:
http://wccftech.com/mainstream... [wccftech.com]
So there might not be very dramatic bumps to be had even with updated libraries/compilers/etc.
Re: (Score:2)
The performance increase is going to be negligible until the "new instructions" on Skylake are utilized more in daily software use. Buy today, pay a premium for basically no bump.
I don't think there are any new instructions in the consumer version of Skylake. As far as I know they are reserving the new instructions for the Xeon models only.
Re: (Score:2)
It's been almost eight years. I think a $30 video card could probably do a decent job of running Crysis at this point.
Re:Wow! (Score:5, Insightful)
Are you kidding? As an owner of several Sandy Bridge systems, I see no reason to upgrade.
Heck. I still have two Core i7-920 systems at work and I'm not touching them; they work just fine running Windows 10.
Intel hasn't had competition from AMD in years and this is the result.
Re: (Score:2)
I own Sandy Bridge systems too, and there are some reasons why I am seriously looking at upgrading. A noticeable performance increase, with quite a bit of power saving, is always nice. A Swedish hardware site has come up with another metric for it: the total number of joules used to render the scene Island in Blender at a set resolution and quality level, minus the rounded-off idle energy use of the 980 Ti used in their tests. All results are in joules; lower is better:
http://cdn.sweclockers.com/ar [sweclockers.com]
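Back-of-the-envelope, their metric works out to something like this (a minimal sketch in Python; the wattages and render times below are made-up placeholders, not sweclockers' measurements):

    # Sketch of the "joules per render" metric: total energy drawn during the
    # render, minus the GPU's idle draw over the same time, so only the
    # CPU-attributable energy gets compared across platforms.
    def render_energy_joules(avg_system_watts, render_seconds, idle_gpu_watts):
        total = avg_system_watts * render_seconds
        idle_baseline = idle_gpu_watts * render_seconds
        return total - idle_baseline

    # Hypothetical example: an older quad-core vs. a newer part on the same scene.
    old = render_energy_joules(avg_system_watts=180, render_seconds=600, idle_gpu_watts=10)
    new = render_energy_joules(avg_system_watts=150, render_seconds=450, idle_gpu_watts=10)
    print(old, new)  # 102000 vs 63000 J; lower is better, and the faster chip wins twice over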
Re: (Score:2)
Thanks for sharing... Always interesting to see things put another way...
I don't doubt that the performance per watt is improving; that has been one of Intel's main focus points over the past few years.
It would be interesting to put that difference in power into a dollar figure, because while it saves power, how much does the upgrade from an i7 Sandy Bridge to an i7 Skylake cost in terms of dollars (and downtime and labor, which aren't free for anyone doing actual work)?
-----------
However, the above point aside...
If you
Re: (Score:2)
Who says it's only about work? I do things like this for a hobby, and I'm not going to waste money by buying YET another system when it's not necessary. Besides, the 8+ core Xeons tend to sacrifice clock speed instead, making them less useful when you need single-threaded performance for some tasks.
As for the link, keep in mind, that's the energy just for a single still image. Multiply for every frame in an animation as necessary, and you can see where the savings pile up.
Re: (Score:3)
Who says it's only about work?
I was simply pointing out that the 25% jump in performance over Sandy Bridge is only really going to matter to people who use the computer for that type of work. If it is a hobby, then your time isn't worth money. :)
I do things like this for a hobby
Fair enough... But if you're doing it for a hobby, are you going to spend $800 to upgrade from Sandy Bridge to Skylake for a 25% boost to performance?
As for the link, keep in mind, that's the energy just for a single still image. Multiply for every frame in an animation as necessary, and you can see where the savings pile up.
Sure, except what do those savings translate into? $5 a year, $5 a month, or $5 a day?
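For a rough ballpark, the conversion is straightforward (a sketch; the wattage delta, hours of use, and electricity price are assumptions, not numbers from the linked test):

    # Convert a wattage difference into a yearly cost difference.
    watts_saved = 40        # assumed average savings under load
    hours_per_day = 4       # assumed hours of heavy use per day
    price_per_kwh = 0.12    # assumed electricity price in USD

    kwh_per_year = watts_saved * hours_per_day * 365 / 1000
    dollars_per_year = kwh_per_year * price_per_kwh
    print(kwh_per_year, dollars_per_year)  # ~58 kWh and ~$7 per year for this workload

So for a hobbyist duty cycle it's a lot closer to "$5 a year" than "$5 a month"; the savings only start to matter for machines rendering around the clock.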
I'm all for saving power, it does cost money after all, but
Re: (Score:2)
I need CPU power to rip video, at home. Xeon is ridiculously overpriced. My current Sandy Bridge has 4 cores, OCed to 4GHz, which is fine for DVD but "run overnight" slow for BluRay. Haswell offered 8 cores, and is 20% faster clock-for-clock, but I didn't want to replace my system so soon when it was new. Skylake looks to be faster still clock-for-clock, and can be OCed to 4.8 GHz, so a likely 80% improvement per core. That's a big deal to me. I'll likely wait a bit for the 8-core, as I want a 3x improvement
Re: (Score:2)
Don't forget it isn't just raw watt savings. You also have to dissipate that heat, and then there's the additional AC load and its inefficiency. I'd multiply the power savings by 3-4 if you want an accurate figure for the amount saved.
Fans running at a slightly higher RPM do not increase energy consumption in a noticeable way. A typical PC fan's power consumption is about 2W. If you are water cooling without fans then there will not be any difference at all.
If you need air conditioning at all then you must realize that these systems have a COP of about 4. That means that increasing the heat dissipation of your computer by X watts will increase air-conditioning consumption by about X/4 watts. That means that your power multiplier of 3-4 is very wrong.
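The arithmetic behind that (a sketch; the 100 W figure is just an assumed example of extra dissipation):

    # Extra air-conditioning load for extra heat dissipated, with a COP of ~4.
    extra_heat_watts = 100                 # assumed extra dissipation from the hotter CPU
    cop = 4.0                              # typical coefficient of performance for AC
    extra_ac_watts = extra_heat_watts / cop

    multiplier = (extra_heat_watts + extra_ac_watts) / extra_heat_watts
    print(extra_ac_watts, multiplier)      # 25.0 W of extra AC draw, i.e. a ~1.25x multiplier, not 3-4x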
Re: (Score:3)
Intel hasn't had competition from AMD in years and this is the result.
I'm not sure this follows. Intel's primary focus for the last 5-10 years (some would say longer) has been process technology, and I don't think lack of competition from AMD has hurt that at all. You might make the argument that the previous generation's failure of desktop parts to materialize in real quantity, and this generation's tick-tock-THUNK cycle, are signs that they're coasting because of a lack of competition, but simply saying "my 32nm chip and my 14nm chip are fairly close in performance, damn you
Re: (Score:2)
Intel will always invest in process tech as that lowers their cost. They won't lower their price until they get competition though.
Sandy Bridge: 216 mm^2 @ 32nm, $317 list price, $336 inflation adjusted
Ivy Bridge: 160 mm^2 @ 22nm
Haswell: 177 mm^2 @ 22nm, $339 list price
Broadwell: 133 mm^2 @ 14nm
Skylake: ??? mm^2?, $350 list price
Intel can probably make around 5 Skylake CPUs in the same die space as 3 Sandy Bridge CPUs and still sell them for the same price, essentially two more CPUs of pure profit. Make no mistake.
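A rough sanity check of that five-for-three claim, using the die sizes listed above (a sketch that ignores yield and edge-die effects, and assumes the Skylake die is roughly Broadwell-sized since the real figure is listed as unknown):

    # How many Skylake-sized dies fit in the area of 3 Sandy Bridge dies?
    sandy_bridge_mm2 = 216
    skylake_mm2 = 133                      # assumption: close to Broadwell's 133 mm^2

    dies = 3 * sandy_bridge_mm2 / skylake_mm2
    print(dies)                            # ~4.9, i.e. roughly 5 Skylake dies per 3 Sandy Bridge dies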
Re: (Score:2)
That's fair enough--while the costs of building new fabs are astronomical (and growing), so chip prices should not necessarily fall linearly with die size, it's fair to say that they've definitely been padding their margins rather than cutting prices (and I'll agree that it's due to a lack of serious competition), so I think you win the point.
That said, the treadmill here is about to stop spinning (I think that I've read that 7nm (maybe 4nm?) is basically the last possible process node, because at that
Re: (Score:2)
If Intel had any real competition, we'd be seeing an 8 core Skylake for $250 minus the IGP.
It would fit in about the same number of transistors as what we are getting, but Intel has no need to provide it and cut into Xeon sales because AMD's 8 core chip is not competitive.
In fact, AMD's 8 core chip is slower than Intel's 4 core chip in most tasks... that right there is the problem...
Re: (Score:2)
Fair enough... But since I can't recompile most programs, I have to take what exists...
Many games simply run faster on Intel. Regardless of the reason, they just do...
If AMD has a case, they should file a complaint with the proper government dept, but it is beyond my ability to do anything about it...
Then there is power consumption, AMD uses more power per unit of performance, makes more heat, so that is something as well...
Re: (Score:2)
the FX8350s trading blows with i7s (despite those being nearly 2 and a half times more expensive)
You show your bias...
The i7 is not 2.5 times more expensive, it isn't even 2 times more expensive.
The FX8350 is $175 at NewEgg right now, and the i7 Haswell refresh is $309... but the reality is that the i5 beats the FX8350 in most cases; the hyperthreading of the i7 is simply not required.
The i5 is $185, or only $10 more than the FX; it consumes half the power and is faster in single-threaded and dual-threaded applications. Only in applications that can use 8 cores does the AMD have any chance.
So you're an AM
Re: (Score:3)
So it's being a "fanboy" to not support corruption?
No, the fanboy part is where you keep comparing the i7 to the AMD chip, when the i5 is a better comparison...
For games and most users, the i7 offers nothing useful, the i5 is just as fast.
And the i5 is the same price, give or take $10, over the AMD chip.
Re: (Score:2)
You can recompile if you use Linux with OSS.
Re: (Score:2)
You can recompile if you use Linux with OSS.
Yes, but that doesn't help since the topic is games, and the fact that most games are simply faster on Intel.
The why is beside the point...
A $185 Intel i5 chip runs most games faster than a $175 AMD FX chip does...
Linux is not a viable gaming platform, regardless of how much some people wish it were. Even if it were, you can't recompile any of the games, so it completely doesn't matter. Battlefield is not open source, so the OS being open source wouldn't matter.
Re: (Score:1)
That's why the Raspberry Pi 2 is a much better solution. Since its SoC is made in China, you know there's no NSA backdoor in it!
Re: (Score:3, Insightful)
Yeah, I thought it was really suspicious when Microsoft heavily promoted the new version of their operating system. Then when hardware manufacturers kept on including wifi and bluetooth in their hardware, without the need for an external card, I knew the only possible explanation was a massive snooping campaign by the NSA.
Re:NSA & Windows 10 (Score:5, Interesting)
Sure, because there's nothing here that could be explained by market trends.
1. Microsoft's monopoly is cracking badly, though perhaps not in the way most of us imagined. If you look at StatCounter's platform stats, it's now 55% desktop, 39% mobile, 6% tablets, and the desktop has been losing about 10% per year for the last few years. And people expect apps for their platform; if you're only on Windows, or even on Mac/Linux too, you're now a dinosaur unless your product absolutely requires a traditional desktop.
2. The OS is going to become a commodity; they saw what happened with Android once it hit critical mass, and Chromebooks are an early warning. Add to that the fact that XP and Win7 work "too well," so users aren't interested in upgrading, even though it's an expense maybe twice a decade. That MS Office - their stranglehold on the business market - is now on mobile and tablets is clear proof Microsoft knows this.
3. So their old strongholds are breaking down; where do they want to go next? They want to be the middleman between app developers and consumers, like Apple's App Store pioneered and Google Play mimicked. To do that you need Win10 everywhere. They must get the snowball rolling so that, to make money, you must be on the MS Shop, the same way you can install apps from other sources on Android but the vast majority don't. If you're not on Google Play, you "don't exist".
As for Intel:
1. Mobile, tablets, convertibles, and laptops all need wireless connectivity, and it's basically just an expected feature today, like network and sound are on desktops; those used to be add-in cards once but were integrated long ago. And fewer and fewer people want the hassle of running cables as WiFi speeds go to hundreds of megabits. It's also a simple way for Intel to steal market share by vertical integration, squeezing out third-party chips.
2. And here's the kicker people don't seem to understand: Intel doesn't really make desktop chips anymore. Their mainstream chips are laptop spin-offs which get a higher TDP and a few other modifications, the same way their high-end chips are Xeon spin-offs. That is also why they sell grossly overpriced desktop chips with a better IGP, even though you can do it much more cheaply with a dGPU. They're just laptop spin-offs that happen to sell well enough to make a desktop version of.
3. So what's the combined effect? Well, you get the laptop features for "free", whether you want them or not. The same way Intel putting an IGP in every chip kills off much of the second-hand GPU market: before, you had machines that needed any old graphics card, and now you don't. Less resale/reuse value means gaming cards cost more in net. It's an indirect way of using their dominance in the CPU business to expand without running into antitrust problems, at least so far.
Or maybe I'm just an NSA disinformation agent out to discredit the revealing of our secret master plan. But you have to admit the cover story is pretty credible, yes?
I Wish (Score:3, Insightful)
Re: (Score:1)
...and a pony!
Re: (Score:1)
That would be a huge leap in progress for the CPU side of things for people who run dedicated graphics cards already. In gaming benchmarks the amount of difference at usable resolutions like 1080P and higher there
Re: (Score:1)
Because AMD is in a heap of trouble. Why should Intel push the performance envelope when you need to buy an 8-core, 200W TDP AMD CPU to beat a 90W 4-core Haswell i5 in terms of performance?
Until AMD returns to the performance race with a good, competitive CPU lineup, you won't see major jumps from Intel. There is just no money to be had.
Re: (Score:1)
There are jumps because Intel's primary competition is ARM, not AMD.
Re: (Score:3)
Yeah, I would love to see a real high end enthusiast processor with 8 cores, hyperthreading, a 4+ GHz clock speed, and no integrated graphics.
I'm thinking THAT processor would have more than a puny 30% performance increase over a 4 year old Sandy Bridge part.
Maybe they could brand it with something new like "Core i9 Extreme Edition" to make it sound even more badass to the l33t gamer types.
Re: (Score:2)
Yeah, I would love to see a real high end enthusiast processor with 8 cores, hyperthreading, a 4+ GHz clock speed, and no integrated graphics.
So would I... but they aren't doing that to avoid hurting Xeon sales...
And in fairness, they DO have such a CPU... It is the Haswell-E line of chips, but it'll cost you a thousand bucks...
Re: (Score:3)
Let the IGP do physics, AI, or whatever, and leave the heavy crunching to the big boy. Newer game engines are looking into better ways to allow the scre
Re: (Score:2)
The new graphics APIs will allow IGPs to be faster for some work loads.
The question is for how much extra work. As I understand it, from the DirectX/OpenGL side you don't really "notice" SLI/CF; the cards just take turns. That is why you effectively only get half the memory: they must mirror all the assets. With the low-level APIs all the details are exposed, but if you want to take advantage of the special cases, well, you have to write special-case code. And the bulk of your market will not have whatever fancy new feature you introduced, so in a world with limited time and resour
Re:I Wish (Score:5, Informative)
What I'm wishing for is essentially an i5 and i7 'max core' edition that removes the IGP but has a mirror of the existing cores in its place so they each are essentially like 2 K edition chips stuck together.
It already exists, it's called the i7-5960x and costs $999 and needs an expensive X99 motherboard. Eight cores, no IGP, fully unlocked and uses standard DDR4 UDIMMs which are now almost at price parity with DDR3. You just don't like the price.
Re: (Score:2)
They do, the i7-5960X is a consumer, unlocked i7 chip. Loads of cache, no integrated graphics and a whacking great price because there's zero competition.
(Of course, the X99 chips are only Haswell, but as the IPC improvements are minimal with Skylake they're still worth considering - especially the 6-core i7-5820K, which is actually cheaper than the new quad-core Skylake i7 here in the UK. The X99 chips are essentially Xeons with some bits turned off and overclocking enabled. They have vt-d enabled, amongst
Re: (Score:2)
The best part of the IGP that I've found: when I lucked into a free 3770K, I was able to repurpose the 2600K into my Linux box, and suddenly that IGP that I was "never ever" going to use was a well-supported, perfectly adequate solution for my Linux desktop without having to deal with binary blobs and whatnot.
But seriously, what gamer doesn't have a stack of old GPU's around? I'm not even a gamer and the number of old graphics cards I have is impressive.
K & Virtualization features (Score:1)
Still a deal breaker for me.
And My i7-920 @ 3.8 GHz Lives On..... (Score:1)
Re: (Score:3)
My next CPU is sooooo going to be an AMD.
I just built a system with an AMD CPU, but the latest Haswell i5 with four cores is faster than it is. Skylake should beat it into a corner.
Re: (Score:2)
Next year's Zen should be competitive; it is said to have 40% better IPC than Excavator, and will be on 14nm FinFET. Being stuck on 28nm has really limited their ability to compete.
Re: (Score:2)
They said the same for their Bulldozer CPU, remember that?
Sure, we all do, but we also remember the K6/2+ and /3+ and the K7, which were all fantastic. I wouldn't bet a lot on 'em, but I wouldn't count AMD out yet, either.
If only they could get their graphics drivers in hand, I would really be a massive fan. As it is, I did just build an FX-8350-based machine, because it hit the right price point for me.
Re: (Score:2)
Ewwww, I remember the K6. While it was not as horrible as the K5, if you did anything other than integer work it was a pile of junk. The K7, OTOH, was overall great (the Barton revision was fantastic), with some fails (the Thoroughbred revisions, for example...), and the K8 was fantastic for a while.
Re: (Score:2)
Ewwww, I remember K6. While not as horrible as the K5, if you did anything other than integer work, it was a pile of junk.
The K6/3+ had an adequate FPU, although yes, the P2 had more. But it was a seriously fast little processor, and while the P2 was only available in a slot package, the K6/3+ was still pinned. Aside from fp, it was actually faster than a P2, for less money. And the fp wasn't really all that bad.
Re: (Score:2)
I did a crapton of 3D rendering and such back then, and AMD K6's were basically dead in the water as far as anyone doing it on a hobby level beyond mere dabbling or professionally was concerned, especially when you also had the issues with AMD's AGP support(which persisted even partially into late K7 revisions).
Re: (Score:2)
I still have two Core i7-920 systems running as well and I won't replace them with this either.
But I'm not going AMD, too much power use. Over a five year lifespan, the difference in power use and AC to cool the rooms adds up.
Not Making Me Want to Ditch 4-1/2 Year Old 2600k (Score:2)
30% faster, perhaps, with the wind behind it, and if I don't overclock my rig.
Cinebench: 931 vs. 694 multicore.
No, to be fair--it's only 25.4564983888292% faster.
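For what it's worth, the exact percentage depends on which score you treat as the baseline; both readings of the numbers above:

    skylake, sandy_bridge = 931, 694                      # the Cinebench multicore scores quoted above
    gain = (skylake - sandy_bridge) / sandy_bridge * 100  # Skylake's gain over the 2600K: ~34.2%
    deficit = (skylake - sandy_bridge) / skylake * 100    # the 2600K's shortfall vs Skylake: ~25.5%
    print(round(gain, 1), round(deficit, 1))

Either way, it's not a number that makes me want to ditch the 2600K.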
If you're on Sandy Bridge or newer, don't bother.. (Score:5, Insightful)
If you're on Sandy Bridge or newer, don't bother unless you really need the new chipset features.
Benchmarks of course show a small gain, but in the real world I suspect you could do a blind test of Sandy Bridge next to Skylake and you couldn't tell the difference.
Anyone who needs the performance difference shouldn't be on either chip; if you do serious image/video editing and make a living doing such work, you should be on a Xeon with 8+ cores anyway. The cost of such a system is trivial compared to the cost of the employee doing such things.
I have several systems in my office, ranging from a single Q6600 machine and two Core i7-920 machines all the way up to a Haswell Refresh i7-4790k. The difference in general Windows performance between all those machines is minor. Games play, more or less, the same in anything Sandy Bridge or newer, and we don't do anything so intensive to require more power.
Come on AMD, get back in the game so Intel has some real competition. Since Core2Duo came out, you haven't been coming to the party.
DDR4 is nice, but where is ECC support? (Score:1)
It's 2015. I want ECC support on all chipsets. Don't make me buy a Xeon just to get ECC.
Now featuring binary GPU blobs (Score:2)
https://01.org/linuxgraphics/i... [01.org]
"No reverse engineering, decompilation, or disassembly of this software is permitted."
AnandTech makes a bold statement! (Score:5, Insightful)
Reading AnandTech's review, they make a bold statement at the end:
"Sandy Bridge, Your Time Is Up."
That is an interesting thought, but is it really?
If you need USB 3, if you want some of the other newer chipset features, perhaps. But for performance?
In benchmarks, Skylake appears to be about 25% faster than Sandy Bridge. Sure, if you're doing video encoding all day or other CPU-intensive applications, that matters... (and if you ARE doing that stuff, why aren't you on a Xeon?)
But for most desktop computer uses, you likely won't see any difference between the two. What is worse is that most of the above gains came from Haswell, not Skylake.
http://www.anandtech.com/show/... [anandtech.com]
Look at the "Gains over Sandy Bridge" chart on that page. Look at the red lines, then the purple lines. The red lines are the Haswell gain over Sandy Bridge, then the purple lines are the Skylake gains over Sandy Bridge.
Re: (Score:1)
I am currently on a Sandy Bridge CPU. I am considering a newer computer. It will probably be of the Skylake generation. Maybe mid next year.
Mostly because of the power reductions achieved in Haswell. Plus the 50 other things that have improved in the ~3 years since I bought it, such as more laptops having better mini-PCI slots, built-in video being better, etc., etc.
640K (Score:1)
640K ought to be enough for anybody.
Lynnfield (Score:2)