Intel's 14-nm Broadwell CPU Primed For Slim Tablets
crookedvulture writes Intel's next-gen Broadwell processor has entered production, and we now know a lot more about what it entails. The chip is built using 14-nm process technology, enabling it to squeeze into half the power envelope and half the physical footprint of last year's Haswell processors. Even the thickness of the CPU package has been reduced to better fit inside slim tablets. There are new power-saving measures, too, including a duty cycle control mechanism that shuts down sections of the chip during some clock cycles. The onboard GPU has also been upgraded with more functional units and hardware-assisted H.265 decoding for 4K video. Intel expects the initial Broadwell variant, otherwise known as the Core M, to slip into tablets as thin as the iPad Air. We can expect to see the first systems on shelves in time for the holidays.
Thank GOD (Score:5, Funny)
Because what I was missing from a tablet was 4K movies!
Re:Thank GOD (Score:5, Funny)
Bow down to my 27" tablet!!!
Re: (Score:3, Insightful)
It's about future-proofing. Plus, H.265 applies to all resolutions, not just 4K, so you might be able to download a 720p video at 50-70% of the current file size (rough numbers sketched below).
I haven't touched Ivy Bridge or Haswell. I want to hold out for Broadwell or Skylake for a nice, even lower-power notebook. That or a future AMD offering.
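A quick back-of-the-envelope on that file-size claim (the bitrates are purely illustrative, and the ~45% saving is just the figure commonly quoted for H.265 at comparable quality, not a measurement):

```python
# Rough file-size comparison for a 2-hour 720p stream.
# Bitrates are illustrative assumptions, not measurements.
h264_mbps = 4.0                    # a plausible 720p H.264 streaming bitrate
h265_mbps = h264_mbps * 0.55       # assume ~45% bitrate saving at similar quality
hours = 2

def size_gb(mbps, hours):
    """Convert an average bitrate and duration into a file size in GB."""
    return mbps / 8 * 3600 * hours / 1000   # Mbit/s -> MB/s -> MB -> GB

print(f"H.264: {size_gb(h264_mbps, hours):.1f} GB, H.265: {size_gb(h265_mbps, hours):.1f} GB")
# Roughly 3.6 GB vs 2.0 GB for this made-up example.
```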
Re: (Score:2)
I went from AMD Bulldozer (FX-8120) to Intel Ivy Bridge (i5-3570K) and couldn't have been happier with the upgrade. I didn't see a need to buy Haswell, and in all honesty I'll probably skip Broadwell as well (maybe; BOINC could always use more computing power... The only game that maxes it out is War of the Vikings, on max settings).
Re: (Score:3)
Re: (Score:2)
Re:Thank GOD (Score:5, Informative)
Re: (Score:2)
Don't get me wrong, I know the nuance of the change; I just had to laugh that 4K video was the selling feature of a tablet. I'd be hard pressed to see the difference between 1080p and 4K on my 52" TV, and my vision is 20/20; forget it on a tablet screen, where the pixel density (real or perceived) is already far higher.
Re: (Score:3)
Yeah, but you can also play video from your tablet to your TV, through HDMI out if you have it, or else streaming to a set-top box. It may not be an extremely common use for tablets, but I've done it before. And a 13" tablet running a "retina" resolution (~300 dpi) would run over 1080p, for whatever that's worth.
I mean, I'm not sure I care about 4k right now, since 1080p seems to be doing just fine for my purposes. Still, it's not as though the idea is completely stupid.
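For a rough sense of that "13-inch at ~300 dpi" figure (a back-of-the-envelope sketch assuming a 16:9 panel; real tablets vary in size and aspect ratio):

```python
import math

# Horizontal pixel count of a hypothetical 13" 16:9 panel at ~300 ppi.
diagonal_in = 13.0
ppi = 300
aspect_w, aspect_h = 16, 9

width_in = diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)  # panel width
horizontal_px = width_in * ppi

print(f"~{horizontal_px:.0f} px wide")   # ~3400 px, comfortably above 1080p's 1920
```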
Re: (Score:2)
Why would I want to connect my tablet to my TV via HDMI so it's a PITA to use it while watching TV? If anything, I like AirPlay to my Apple TV from my iPads to stream cartoons from the Nickelodeon app.
If I'm going to stream to my TV, I'll just buy a better Apple TV or Roku, because ARM processors with hardware H.265 are probably on the horizon as well.
Re: (Score:3)
I use HDMI from my tablet to TVs in hotel rooms when traveling.
Re: (Score:2)
Why would I want to connect my tablet to my TV via HDMI so it's a PITA to use it while watching TV?
Well, for example, I've used AirPlay to stream a movie from my iPad to my parents' Apple TV when I came to visit. It let us all watch the movie on TV instead of a little iPad, and I just didn't use the iPad while it was streaming.
If anything, I like AirPlay to my Apple TV from my iPads to stream cartoons from the Nickelodeon app.
Can you now do other things on an iPad while streaming video? Last I checked, if you were using AirPlay, you couldn't switch applications without it stopping the stream, which would negate your previous objection, "Why would I want to connect my tablet to my TV... so it's a PITA t
Re: (Score:2)
Re: (Score:2)
You'll still need the hardware H.265 decoding to do it via AirPlay, unless you want to watch your iPad suck through its battery before the movie finishes.
Re: (Score:2)
Re: (Score:2)
I'm not sure how it works out on Apple, as I don't have an HDMI adapter for my iPad 2, but I've tried it before with some of my Samsung Android tablets, and typically what I found is that on stock firmware some apps will not display on HDMI out, or even let you take screenshots of the app, in the name of copy protection.
Of course, if you're using a non-standard firmware like the incredible CyanogenMod, then these copy-protection mechanisms seem to be completely ignored, but as far as the other side of the fence goe
Re: (Score:2)
This is so true... the comment "oh, you can't see the difference between 4K/1080p" is just as much nonsense as the one 10 years ago saying, "oh, you can't see the difference between 720p/1080p"
It is people talking out of their rear ends mostly...
Having seen them both in person, 4k blows away 1080p on the proper equipment.
Re: (Score:2)
Which for 99% of the population is never going to happen.
Because most people sit WAY too far away from their TVs - even 720p is "retina" resolution - increasing resolution does absolutely zi
Re: (Score:2)
Because most people sit WAY too far away from their TVs - even 720p is "retina" resolution - increasing resolution does absolutely zip because they can't even resolve the added resolution.
A rough guide is about 1:1 screen size for 1080p
Way too far away from their TVs for what? If your criterion for deciding the correct sitting distance is whether or not you can tell 720p from 1080p then perhaps you have a point, but if the object of the exercise is to watch television in comfort then 1:1 is just silly.
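One way to sanity-check the "retina at couch distance" claim is to work out how far away the pixels stop being resolvable, assuming roughly 1 arcminute of acuity for 20/20 vision (a sketch for a 55" 16:9 panel; individual eyesight and content vary):

```python
import math

# Farthest distance (feet) at which individual pixels are still resolvable,
# assuming ~1 arcminute of visual acuity (a common 20/20 rule of thumb).
def max_resolvable_distance_ft(diagonal_in, horizontal_px, acuity_arcmin=1.0):
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # panel width from diagonal
    pixel_pitch_in = width_in / horizontal_px         # size of one pixel
    acuity_rad = math.radians(acuity_arcmin / 60)     # acuity angle in radians
    return pixel_pitch_in / acuity_rad / 12           # small-angle approximation

for label, px in [("720p", 1280), ("1080p", 1920), ("4K", 3840)]:
    print(label, round(max_resolvable_distance_ft(55, px), 1), "ft")
# Roughly 11 ft (720p), 7 ft (1080p), and 3.6 ft (4K) for a 55" panel,
# by this crude model.
```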
Re: (Score:2)
Maybe because it's a nod to a chip-design pivot that saved the company, much like this one might do if it's successful.
Back in the day AMD was KILLING Intel with the AMD64 design in price/performance over the Pentium 4 line. Intel scrapped the Pentium 4 design and went back to the Pentium M, the mobile version of the Pentium III, calling it the Core Duo/Solo. So "Core M" makes a lot of sense: a pivot to meet a competitor (this time ARM).
Re: (Score:1)
... and what they never seem to mention is that it gets those smaller files at the cost of several times the CPU load of decoding an H.264 or VP8 stream. There's really nothing that groundbreaking as far as the algorithm goes; it's just choosing a different compromise point. This is why hardware support for H.265 and VP9 is required: you really don't want to view those streams on older devices. Or should I say general-purpose devices which haven't signed the papers?
the way it was meant... (Score:2)
You just haven't seen a movie the way the director intended until you've seen it on a 10-inch tablet at 800 ppi at an airport. Now, how do I get this 160-gig movie on there?
Re:Thank GOD (Score:5, Insightful)
Funny, but actually what it means is that you get a Sandy Bridge-class Core CPU in an iPad Air form factor, which dramatically alters the usage scenarios. A nice port replicator or docking station will make for a clean and minimalist work area. One more generation and graphics will be pretty capable of mainstream gaming. Even with Core M, many games will be playable with medium/low settings.
Currently I'm looking for an excuse to dump my still-capable Lenovo T400s.
Re: (Score:2)
I can see an x86 (well, more accurately x86_64, since it uses the AMD 64-bit extensions) tablet taking the role of a main desktop, similar to how the Microsoft Surface Pro is starting to do.
I would like to see five things on it to make it a serious contender for a desktop replacement role:
1: Two Thunderbolt connectors on a port replicator or docking station. These would work for video out, as well as provide 8 (in TB 1) or 16 (in the TB 2 spec) PCI lanes. I wonder if this would be enough for an external v
Re: (Score:1)
If you're running a game you will typically have the GPU and CPU maxed out, so basically all the clever power gating and duty cycle stuff is switched off. The battery isn't going to last much longer than with previous-gen CPUs.
Re: (Score:2)
If you're running a game you will typically have the GPU and CPU maxed out, so basically all the clever power gating and duty cycle stuff is switched off. The battery isn't going to last much longer than with previous-gen CPUs.
If they are iPad/iPhone-class games then I'm okay with that. But real gaming would be docked anyway, with a keyboard and mouse.
Re: (Score:2)
That is until the thermal protection kicks in and the game starts to crawl.
The Ouya found this: they had room to add a small heatsink to the otherwise standard mobile SoC, and were able to get a lot more performance out of it because it wasn't hitting the thermal limits.
Re: (Score:3)
One more generation and graphics will be pretty capable of mainstream gaming.
I'm not sure whether I should disagree with you because there's plenty of gaming on phones/tablets today, or because the bar for what's mainstream keeps going up, but I don't agree. Every time they do a better tablet, they also release a new generation of graphics cards, and a new generation of games comes out to use it. We no longer run Quake, and Crysis is no longer all it's cracked up to be, so next generation I expect the situation to be exactly the same - many games will be playable with medium/low settings. And the
Re: (Score:2)
I meant desktop-class gaming.
As far as the 150 W vs. 15 W argument, I disagree. Desktop components are typically less efficient than mobile components of the same generation, and when you start comparing across 2-3 generations, mobile components can easily perform as well as desktop components 2-3 generations behind. Desktop gaming usually targets multiple hardware generations, so you have to factor that in, as game devs always do. Today more game devs are targeting specific hardware chips for better optimization.
The xb
Re: Thank GOD (Score:1)
Marketing (Score:2)
That was my first thought. What does a tablet need 4K compatibility for!?
Though I guess technically, rather than having a 50" tablet, it might allow someone to use the tablet as a media device for the TV.
I used my Samsung phone that way in a pinch, for example, when both my Netflix and my media computer were on the fritz.
However, that said, they'd better start offering some much larger storage configurations if they plan on people carting around a bunch of movies that don't look like garbage at 4K.
Re: (Score:2)
Re: (Score:2)
Don't have 4K or anything so I'm not sure, but I suspect you may run into bandwidth issues. I guess it is really like offloading the cost of internal storage onto your ISP download cap.
Re: (Score:2)
Re: (Score:2)
No, but when referring to online streaming you will run into both.
Also, I'm not sure if you can use your tablet to access your NAS, stream from the NAS to your tablet for rendering, and then stream it again to your TV. I think you might find that you run into at least network issues when trying that.
Re: (Score:2)
The only thing that makes 4K on a tablet less desirable than 1080p to me is that a tablet would need a much faster, and presumably more power-hungry, graphics subsystem to drive all the pixels in a 4K display, especially for gaming.
Re: (Score:2)
The latest generation tablets already have resolutions above Full HD and would therefore benefit from 4K video.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I guess you need your eyes checked, because I can easily tell the difference between a 1080p tablet and higher-resolution tablets; the pixels are visible even at standard viewing distance.
Anyway, looking at Intel's published die-area cost, adding a 4K decoder probably adds a few pennies to the cost of the CPU. Also, the 4K decoder algorithm didn't have to be developed now; it was designed years ago. Once the algorithm is designed, most of the process-shrink work is done automagically in software. I
Re: (Score:2)
Are you sure that you are average? Perhaps you should not entirely discount the idea that you are in the 50% of the population with better than average vision.
I have no trouble seeing the difference between 720p and 1080p on a 55" screen at 5 m (15'); what I find strange is noticing that many other people do. I always thought the figures for average vision must be underestimates, but other people seem to roll with them.
Re: (Score:2)
Dude, the electronics industry needs 4K to sell us 4K panels for our living rooms. Right now, everyone is happy with an el cheapo 1080p. Time to step it up to 4K. Personally, I am happy with my 720p plasma TV. I am sad to see plasma go in favor of LCD, LED, OLED, or whatever over-saturated color technology is being pushed out cheaply.
Mobile-only article; snort (Score:2)
I am MUCH more interested in Broadwell DESKTOP chips. I'm using a Haswell Xeon E3-1245v3 in a server now, and it speedsteps all the way from 3.4 GHz down to 100 MHz under light load. Ivy Bridge only stepped down to 800 MHz, and Sandy Bridge only stepped down to 1.6 GHz.
Re: (Score:2)
Since Broadwell-K is not going to launch until half-way through 2015 and Skylake was still on the 2015 roadmap last time I remember seeing one, I would not be surprised if Intel canned Broadwell-K altogether - no point in flooding the market with parts that only have a few months of marketable life in front of them. If Broadwell-K does launch beyond OEMs, it may end up being one of Intel's shortest-lived retail CPUs ever.
In the first Broadwell roadmaps, there were no plans for socketed desktop parts; all mo
Re: (Score:2)
But Intel have been bringing out a new CPU every year for years now.
Cedar Mill came out 6 months before Conroe
Re: (Score:2)
The P4 was getting destroyed by AMD in benchmarks, the 65 nm die shrink failed to translate into significant clock gains, and interest in power-efficient desktop CPUs was starting to soar, so Intel had little choice but to execute their backup plan to save face: bring their newer and better-performing next-gen Core 2 mobile CPU design to the desktop.
Broadwell only brings minor performance improvements to desktops and shaves a few watts along the way. If Intel decided to scrap Broadwell-K, or perhaps produce the
Re: (Score:2)
"a few watts" is 30%, which means a few hours more battery life in an ultrabook.
With a little more CPU power and more GPU too.
They're also talking 18 cores for the Broadwell Xeons, and desktop chips aren't coming out till Q2 2015.
Skylake won't be here in 2015.
Re: (Score:2)
But the comment I was replying to was about Broadwell-K, which is the desktop variant. Shaving a few watts on a desktop CPU is not going to get you much battery life even if you have a UPS. Most people who buy Broadwell-K will be using it with a discrete GPU too.
Re: (Score:2)
Where did you get Broadwell-K from? Apparently the desktop versions are going to be Broadwell-H.
They're getting the GT3e GPU, which comes with a bunch of eDRAM and hardware support for VP8 and H.265, and the GPU can be used for encoding and decoding at the same time.
Re: (Score:2)
Broadwell-H might be Intel's shipping name but the roadmap name has been Broadwell-K for about a year. That's why you see Broadwell-K used everywhere.
The fact that K-series chips (the enthusiast unlocked chips) will be from the Broadwell-K lineup likely contributed to most computer enthusiast sites choosing to stick with the old roadmap name instead of adopting Intel's new production codenames.
Re: (Score:2)
I think what will be interesting and compelling for Broadwell desktop is Iris Pro graphics on LGA parts (not just BGA mobile parts, as with Haswell). Certainly it won't be capable of competing with high-end cards, but you can probably expect mid-range discrete-graphics performance built into the CPU.
For your standard desktop tower gaming rig it doesn't matter much, since you will likely be using discrete graphics there anyway; what excites me more is mid-range discrete-graphics performance without the added
Re: (Score:2)
While Iris Pro performs quite well when you turn down graphics low enough to fit most of the resources in the 128MB Crystalwell L4 cache, nobody interested in mid-range graphics would be willing to give up this much quality for decent frame rates. Once you exceed that 128MB, even low-end discrete GPUs with GDDR5 take the lead. Broadwell's four extra units are not going to change this by much.
If Intel released chips with an upgraded 512MB Crystalwell and twice the L4 bandwidth, then that would nuke low-end G
Re: (Score:1)
Re: (Score:2)
Going from one third-party chip to another is fine and dandy, but why the hell would Apple -- especially Apple! -- dump their own design?
Re: (Score:2)
Re: (Score:2)
They've done it before--ADB [wikipedia.org].
One issue, though, is how long would Apple maintain both? Yeah, Apple dumped 68K for PowerPC, but they also stopped developing for 68K. When Apple switched to Intel, they stopped doing things for PowerPC.
They could probably switch the iPad Air to Intel. But what about the iPhone? Apple--and 3rd party developers--would probably end up supporting both ARM and Intel for several more years in iOS.
14 nanometers? (Score:5, Insightful)
And I thought 65nm (~300 silicon atoms across) was impressive five years ago.
Re: (Score:2)
I will be highly surprised if quantum effects ever let them get close to one atom thick; at that scale electrons start doing very weird things that are hard to compensate for. We've pushed process technology just about to its limit.
Re: (Score:2)
Real-world Moore's Law is toast... (Score:2)
The transistor budget may still be scaling according to Moore's law, but that's failing to translate into real-world speed increases. The 5% increase in single-core IPC is weak sauce. And an annoying number of apps don't scale to multiple processors, or scale badly (Amdahl's law is unforgiving; a worked example follows below...).
You can add more cores, add more compute units to your GPU, or add a DSP (Broadwell) or FPGA (Xeon), but that has an ever-decreasing marginal impact on real-world speed.
We're probably stuck in a "5% IPC increase pe
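To make the Amdahl's law point concrete, here is a quick worked example (the 10% serial fraction is purely an illustrative assumption):

```python
# Amdahl's law: speedup from n cores when a fraction s of the work is serial.
def amdahl_speedup(n_cores, serial_fraction):
    return 1 / (serial_fraction + (1 - serial_fraction) / n_cores)

for n in (2, 4, 8, 16, 64):
    print(n, "cores ->", round(amdahl_speedup(n, 0.10), 2), "x")
# 2 -> 1.82x, 4 -> 3.08x, 8 -> 4.71x, 16 -> 6.4x, 64 -> 8.77x:
# extra cores stop paying off once the serial part dominates.
```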
Re: (Score:1)
So, no more .NET and Java...back to the bare metal!
Less power?? (Score:2)
Dynamic power is governed by the number of state changes per second: it scales linearly with frequency and with the square of the supply voltage (see the quick sketch below). There's only so much saving from reducing voltage, too, as you run into thermal issues and electron-tunnelling errors.
You are much, much better off by saying "bugger that for a lark", exploiting tunnelling to the limit, switching to a lower-resistance interconnect, cooling the silicon below 0°C, and ramping up clock speeds. And switching to 128-bit logic and implementing BLAS and FFT in silicon.
True,
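For reference, the usual first-order model for dynamic CMOS power is P ≈ α·C·V²·f (activity factor × switched capacitance × voltage squared × frequency). A tiny sketch with made-up numbers shows why voltage scaling dominates the savings:

```python
# First-order dynamic CMOS power model: P = alpha * C * V^2 * f.
# Capacitance and activity values below are made up purely for illustration.
def dynamic_power_watts(alpha, cap_farads, volts, freq_hz):
    return alpha * cap_farads * volts**2 * freq_hz

baseline = dynamic_power_watts(0.1, 1e-9, 1.0, 3.0e9)   # 1.0 V at 3 GHz
scaled   = dynamic_power_watts(0.1, 1e-9, 0.8, 3.0e9)   # drop to 0.8 V, same clock

print(f"Dropping 1.0 V -> 0.8 V cuts dynamic power by {1 - scaled / baseline:.0%}")
# ~36%, from the V^2 term alone; frequency only scales power linearly.
```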
Re: (Score:1)
They have the tech to make the best low power processor(s) on the planet...if only they would make an ARM one!
Re: (Score:2)
They did, at one point. They bought the rights to StrongARM and sold it for some time, then abandoned it completely.
Re: (Score:1)
StrongARM was before iDevices...
I mean a 14 nm ARM SoC for mobile devices; in fact, why wouldn't they do it now, when they're pushing x86 into this market space?
Can't be profit margins, and they do not care about Windows anymore there, do they?
Thinkpad 8 (Score:2)
I have a Thinkpad 8 and a Miix 2 8. The Thinkpad 8 is a desktop replacement: I use Bluetooth for keyboard and mouse, run an HDMI monitor, and feed power through the USB port. It works well, but not perfectly. I'll upgrade to a good Broadwell or Cherry Trail. Anyway, the future looks awesome.
I will enjoy having a slimmer tablet ... (Score:1)
Makes me want to buy Intel stock (Score:1)