Intel's 14-nm Broadwell CPU Primed For Slim Tablets

crookedvulture writes: Intel's next-gen Broadwell processor has entered production, and we now know a lot more about what it entails. The chip is built on a 14-nm process, enabling it to squeeze into half the power envelope and half the physical footprint of last year's Haswell processors. Even the thickness of the CPU package has been reduced, to better fit inside slim tablets. There are new power-saving measures, too, including a duty cycle control mechanism that shuts down sections of the chip during some clock cycles. The onboard GPU has also been upgraded with more functional units and hardware-assisted H.265 decoding for 4K video. Intel expects the initial Broadwell variant, otherwise known as the Core M, to slip into tablets as thin as the iPad Air. We can expect to see the first systems on shelves in time for the holidays.
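
To make the duty-cycle idea concrete, here is a minimal Python sketch; the wattages and the linear model are illustrative assumptions, not Intel's published figures:

    # Hedged sketch (not Intel's actual mechanism): duty cycling approximates
    # a lower clock by gating a block on and off, so average power scales
    # roughly with the fraction of cycles the block is active.
    ACTIVE_POWER_W = 4.0  # hypothetical draw while the block is running
    GATED_POWER_W = 0.3   # hypothetical leakage while the block is gated off

    def average_power(duty_fraction):
        """Average power when the block is active for duty_fraction of cycles."""
        return duty_fraction * ACTIVE_POWER_W + (1 - duty_fraction) * GATED_POWER_W

    for duty in (1.0, 0.5, 0.25):
        print("%.0f%% duty -> %.2f W" % (duty * 100, average_power(duty)))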
  • Thank GOD (Score:5, Funny)

    by ADRA ( 37398 ) on Monday August 11, 2014 @02:53PM (#47650197)

    Because what I was missing from a tablet was 4K movies!

    • by Lab Rat Jason ( 2495638 ) on Monday August 11, 2014 @03:02PM (#47650247)

      Bow down to my 27" tablet!!!

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      It's about future-proofing. Plus, H.265 applies to all resolutions, not just 4K. So you might be able to download a 720p video at half to 70% of the current file size.

      I haven't touched Ivy Bridge or Haswell. I want to hold out for Broadwell or Skylake for a nice, even lower-power notebook. That, or a future AMD offering.

      • by armanox ( 826486 )

        I went from AMD Bulldozer (FX-8120) to Intel Ivy Bridge (i5-3570K) and couldn't have been happier with the upgrade. I didn't see a need to buy Haswell, and in all honesty I'll probably skip Broadwell as well (maybe; BOINC could always use more compute power... The only game that maxes mine out is War of the Vikings, on max settings).

    • I do have a 4K display on my Mac Pro, but I don't have a tablet, because I like having one device that can do it all. A Surface Pro with this new chip might end up being that device.
    • Re:Thank GOD (Score:5, Informative)

      by CastrTroy ( 595695 ) on Monday August 11, 2014 @03:09PM (#47650305)
      You're missing the biggest point. It has hardware H.265 support (not to be confused with H.264), a newer compression algorithm that allows for even smaller files at the same video quality, or better quality at the same bitrate.
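
      To see what that means for file sizes, here is a quick back-of-the-envelope in Python; the bitrates and the ~40% savings figure are illustrative assumptions, not measured codec results:

          def file_size_gb(bitrate_mbps, minutes):
              # bits -> bytes -> gigabytes for a constant-bitrate stream
              return bitrate_mbps * 1e6 / 8 * minutes * 60 / 1e9

          H264_1080P_MBPS = 8.0                    # assumed H.264 1080p bitrate
          H265_1080P_MBPS = H264_1080P_MBPS * 0.6  # assuming ~40% bitrate savings

          for label, mbps in (("H.264", H264_1080P_MBPS), ("H.265", H265_1080P_MBPS)):
              print("%s: %.1f GB for a 2-hour movie" % (label, file_size_gb(mbps, 120)))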
      • by ADRA ( 37398 )

        Don't get me wrong, I know the nuances of the change; I just had to laugh that 4K video was the selling feature of a tablet. I'd be hard pressed to see the difference between 1080p and 4K on my 52" TV, and my vision is 20/20; forget a much smaller screen, whatever its pixel density rating or even its perceived pixel density.

        • Yeah, but you can also play video from your tablet on your TV, through HDMI out if you have it, or else by streaming to a set-top box. It may not be an extremely common use for tablets, but I've done it before. And a 13" tablet running a "retina" resolution (~300 dpi) would be above 1080p, for whatever that's worth.

          I mean, I'm not sure I care about 4K right now, since 1080p seems to be doing just fine for my purposes. Still, it's not as though the idea is completely stupid.
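
          For what it's worth, here is the arithmetic behind that 13" claim, as a small Python sketch (the 16:9 aspect ratio is an assumption):

              import math

              def panel_resolution(diagonal_in, aspect_w, aspect_h, dpi):
                  # Split the diagonal into width/height via the aspect ratio,
                  # then convert inches to pixels at the given density.
                  diag_units = math.hypot(aspect_w, aspect_h)
                  return (round(diagonal_in * aspect_w / diag_units * dpi),
                          round(diagonal_in * aspect_h / diag_units * dpi))

              print(panel_resolution(13, 16, 9, 300))  # ~(3399, 1912), well above 1920x1080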

          • by alen ( 225700 )

            Why would I want to connect my tablet to my TV via HDMI so it's a PITA to use it while watching TV? If anything, I like AirPlay to my Apple TV from my iPads to stream cartoons from the Nickelodeon app.

            If I'm going to stream to my TV, I'll just buy a better Apple TV or Roku, because ARM processors with hardware H.265 are probably on the horizon as well.

            • by afidel ( 530433 )

              I use HDMI from my tablet to TVs in hotel rooms when traveling.

            • Why would I want to connect my tablet to my TV via HDMI so it's a PITA to use it while watching TV?

              Well, for example, I've used AirPlay to stream a movie from my iPad to my parents' Apple TV when I came to visit. It let us all watch the movie on the TV instead of on a little iPad, and I just didn't use the iPad while it was streaming.

              If anything, I like AirPlay to my Apple TV from my iPads to stream cartoons from the Nickelodeon app.

              Can you do other things on an iPad now while streaming video? Last I checked, if you were using AirPlay, you couldn't switch applications without stopping the stream, which would negate your previous objection, "why would I want to connect my tablet to my TV... so it's a PITA to use it while watching TV?"

            • AirPlay is buggy and not nearly as reliable as a wire. I have had 3 Apple TV boxes for years now. They don't work consistently.
            • You'll still need the hardware H.265 decoding to do it via AirPlay, unless you want to watch your iPad suck through its battery before the movie finishes.

          • I'm not sure how it works out on Apple, as I don't have an HDMI adapter for my iPad 2, but I've tried it with some of my Samsung Android tablets, and typically what I found is that on stock firmware some apps will not display over HDMI out, or even let you take screenshots, in the name of copy protection.

            Of course, if you're using a non-standard firmware like the incredible CyanogenMod, these copy-protection mechanisms seem to be completely ignored, but as far as the other side of the fence goes…

      • by Anonymous Coward

        ... and what they never seem to mention is that it gets those smaller files at the cost of many times the CPU requirement to decode the stream, compared to H.264 or VP8. There's really nothing groundbreaking as far as the algorithm goes; it's just choosing a different compromise point. This is why hardware support for H.265 and VP9 is required: you really don't want to decode those streams on older devices. Or should I say, general-purpose devices that haven't signed the papers?

    • You just haven't seen a movie the way the director intended until you've seen it on a 10-inch tablet at 800 ppi in an airport. Now, how do I get this 160-gig movie on there?

    • Re:Thank GOD (Score:5, Insightful)

      by VTBlue ( 600055 ) on Monday August 11, 2014 @03:10PM (#47650319)

      Funny, but actually what it means is that you get a Sandy Bridge-class CPU in an iPad Air form factor, which dramatically alters the usage scenarios. A nice port replicator or docking station will make for a clean and minimalist work area. One more generation and graphics will be pretty capable of mainstream gaming. Even with Core M, many games will be playable at medium/low settings.

      Currently I'm looking for an excuse to dump my still-capable Lenovo T400s.

      • by mlts ( 1038732 )

        I can see an x86 (well, more accurately x86_64, since it uses the AMD 64-bit extensions) tablet taking on the role of a main desktop, similar to what the Microsoft Surface Pro is starting to do.

        I would like to see five things on it to make it a serious contender for a desktop replacement role:

        1: Two Thunderbolt connectors on a port replicator or docking station. These would work for video out, as well as providing 8 (in TB 1) or 16 (in the TB 2 spec) PCIe lanes. I wonder if this would be enough for an external v…

      • If you're running a game, you'll typically have the GPU and CPU maxed out, so basically all the clever power gating and duty-cycle stuff is switched off. Basically, the battery isn't going to last much longer than with previous-gen CPUs.

        • by VTBlue ( 600055 )

          If you're running a game, you'll typically have the GPU and CPU maxed out, so basically all the clever power gating and duty-cycle stuff is switched off. Basically, the battery isn't going to last much longer than with previous-gen CPUs.

          If they are iPad/iPhone-class games, then I'm okay with that. But real gaming would be docked anyway, with a keyboard and mouse.

        • by mr_exit ( 216086 )

          That is, until the thermal protection kicks in and the game starts to crawl.

          The Ouya's makers found this: they had room to add a small heatsink to an otherwise standard mobile SoC, and were able to get a lot more performance out of it because it wasn't hitting its thermal limits.

      • by Kjella ( 173770 )

        One more generation and graphics will be pretty capable of mainstream gaming.

        I'm not sure if I should disagree with you because there's plenty of gaming on phones/tablets today, or because the bar for what's mainstream keeps going up, but I don't agree. Every time they make a better tablet, they also release a new generation of graphics cards, and a new generation of games comes out to use them. We no longer run Quake, and Crysis is no longer all it's cracked up to be, so next generation I expect the situation to be exactly the same: many games will be playable at medium/low settings. And the…

        • by VTBlue ( 600055 )

          I meant desktop-class gaming.

          As for the 150 W vs. 15 W argument, I disagree. Desktop components are typically less efficient than mobile components of the same generation, and when you compare across 2-3 generations, mobile components can easily perform as well as desktop components 2-3 generations behind. Desktop games usually target multiple hardware generations, so you have to factor that in, as game devs always do. Today more game devs are targeting specific hardware for better optimization.

          The Xbox…

    • All good news; now they just need to lower their prices so they get used more, lol.
    • That was my first thought: what does a tablet need 4K compatibility for!?

      Though I guess, technically, rather than having a 50" tablet, it might allow someone to use the tablet as a media device for the TV.

      I used my Samsung phone that way in a pinch, for example, when both my Netflix device and my media computer were on the fritz.

      That said, however, they had better start offering some much larger storage configurations if they plan on people carting around a bunch of movies that don't look like garbage in 4K.

      • by Xenx ( 2211586 )
        Storage capacity is where streaming comes in handy. Not just online streaming, but NAS and the like.
        • I don't have 4K or anything, so I'm not sure, but I suspect you may run into bandwidth issues. I guess it really amounts to offloading the cost of internal storage onto your ISP download cap.

          • by Xenx ( 2211586 )
            ... You're not likely to run into bandwidth issues OR issues with a download cap when it comes to local network storage.
            • No, but with online streaming you will run into both.

              Also, I'm not sure you can use your tablet to access your NAS, stream from the NAS to the tablet for rendering, and then stream it again to your TV. I think you might find you run into at least network issues when trying that.
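
              A rough feasibility check of that relay scenario in Python; the bitrates and link speeds below are assumptions, and relaying through the tablet counts the stream twice because it crosses the same link coming and going:

                  STREAM_MBPS = {"1080p H.264": 8, "4K H.265": 25}  # assumed bitrates
                  LINK_MBPS = {"802.11n, real-world": 50, "Gigabit Ethernet": 940}

                  for stream, need in STREAM_MBPS.items():
                      for link, have in LINK_MBPS.items():
                          relayed = need * 2  # NAS -> tablet plus tablet -> TV
                          verdict = "fits" if relayed < have * 0.8 else "tight or worse"
                          print("%s relayed over %s: %s" % (stream, link, verdict))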

    • 4K video and pictures on a tablet would look amazing, and a 4K display could render text much more clearly than a 1080p one.

      The only thing that makes 4K on a tablet less desirable than 1080p to me is that a tablet would need a much faster, and presumably more power-hungry, graphics subsystem to drive all the pixels of a 4K display, especially for gaming.
    • by Ichijo ( 607641 )

      The latest generation of tablets already has resolutions above full HD and would therefore benefit from 4K video.

    • Comment removed based on user account deletion
      • by Xenx ( 2211586 )
        Because your anecdotal evidence shows what everyone's experience is? Out of an office of 11 employees, 8 have tablets, and 7 of us use them regularly. The 8th person is waiting for the next Nexus tablet, as his old one's USB port isn't working correctly and won't charge reliably. Personally I think we might be an above-average sample, but I somehow think you might be a below-average one.
        • Comment removed based on user account deletion
          • by Xenx ( 2211586 )
            I used my office as an extreme, one whose bounds I knew. Sure, I support thousands of people at work as well. Lots of them use iPads or Android tablets. I don't have numbers for them. But a lot of the ones I actually talk to prefer them over their computers. It's people like you who actually make it worse: you intentionally recommend that people buy inferior products, worsening their opinion of the form factor as a whole. Also, making fun of people tends to piss them off regardless of whether you're right…
    • I guess you need your eyes checked, because I can easily tell the difference between a 1080p tablet and higher-resolution tablets: the pixels are visible even at a standard viewing distance.

      Anyway, looking at Intel's published die-area costs, a 4K decoder probably adds a few pennies to the cost of the CPU. Also, the 4K decoding algorithm didn't have to be developed now; it was designed years ago, and once an algorithm is designed, most of the process-shrink work is done automagically in software. I…

      • Are you sure that you are average? Perhaps you should not entirely discount the idea that you are in the 50% of the population with better-than-average vision.

        I have no trouble seeing the difference between 720p and 1080p on a 55" screen at 5 m (15'); what I find strange is noticing how many other people do. I always thought the figures for average vision must be underestimates, but other people seem to roll with them.

    • Dude, the electronics industry needs 4K to sell us 4K panels for our living rooms. Right now, everyone is happy with an el cheapo 1080p set. Time to step it up to 4K. Personally, I am happy with my 720p plasma TV. I am sad to see plasma go in favor of LCD, LED, OLED, or whatever over-saturated color technology is being pushed out cheaply.

  • I am MUCH more interested in Broadwell DESKTOP chips. I'm using a Haswell Xeon E3-1245v3 in a server now, and it speedsteps all the way from 3.4 GHz down to 100 MHz under light load. Ivy Bridge only stepped down to 800 MHz, and Sandy Bridge only stepped down to 1.6 GHz.
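
    If you want to watch that stepping yourself on Linux, here is a small Python sketch using the cpufreq sysfs interface (whether it's present depends on your kernel and driver):

        from pathlib import Path

        def current_freqs_mhz():
            # Each sysfs file reports a core's current frequency in kHz.
            base = Path("/sys/devices/system/cpu")
            return [int(f.read_text()) / 1000
                    for f in sorted(base.glob("cpu[0-9]*/cpufreq/scaling_cur_freq"))]

        print(current_freqs_mhz())  # e.g. [100.0, 3400.0, ...] under light load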

    • Since Broadwell-K is not going to launch until halfway through 2015, and Skylake was still on the 2015 roadmap the last time I remember seeing one, I would not be surprised if Intel canned Broadwell-K altogether: there's no point in flooding the market with parts that have only a few months of marketable life in front of them. If Broadwell-K does launch beyond OEMs, it may end up being one of Intel's shortest-lived retail CPUs ever.

      In the first Broadwell roadmaps, there were no plans for socketed desktop parts; all mo…

      • But Intel has been bringing out a new CPU every year for years now.
        Cedar Mill came out 6 months before Conroe.

        • The P4 was getting destroyed by AMD in benchmarks, the 65-nm die shrink failed to translate into significant clock gains, and interest in power-efficient desktop CPUs was starting to soar, so Intel had little choice but to execute its backup plan to save face: bring its newer and better-performing next-gen Core 2 mobile CPU design to the desktop.

          Broadwell only brings minor performance improvements to desktops and shaves a few watts along the way. If Intel decided to scrap Broadwell-K, or perhaps produce the…

          • "a few watts" is 30%, which means a few hours more battery life in an ultrabook.
            With a little more CPU power and more GPU too.

            They're also talking 18-cores for the broadwell xeons and desktop chips not coming out till Q2 2015

            Skylake won't be here in 2015.
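
            The battery-life arithmetic, as a minimal sketch with assumed numbers (capacity and platform draw are illustrative, not measured):

                BATTERY_WH = 50.0           # assumed ultrabook battery capacity
                BASELINE_PLATFORM_W = 10.0  # assumed average platform draw
                IMPROVED_PLATFORM_W = BASELINE_PLATFORM_W * 0.7  # 30% fewer watts

                for name, watts in (("baseline", BASELINE_PLATFORM_W),
                                    ("30% less", IMPROVED_PLATFORM_W)):
                    print("%s: %.1f hours" % (name, BATTERY_WH / watts))
                # 5.0 h vs ~7.1 h: a few watts really can mean a couple more hours.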

            • But the comment I was replying to was about Broadwell-K, which is the desktop variant. Shaving a few watts on a desktop CPU is not going to get you much battery life even if you have a UPS. Most people who buy Broadwell-K will be using it with a discrete GPU, too.

              • Where did you get Broadwell-K from? Apparently the desktop versions are going to be Broadwell-H.
                They're getting the GT3e GPU, which comes with a bunch of eDRAM and hardware support for VP8 and H.265, and the GPU can be used for encoding and decoding at the same time.

                • Broadwell-H might be Intel's shipping name, but the roadmap name has been Broadwell-K for about a year. That's why you see Broadwell-K used everywhere.

                  The fact that the K-series chips (the enthusiast unlocked chips) will be from the Broadwell-K lineup likely contributed to most computer-enthusiast sites sticking with the old roadmap name instead of adopting Intel's new production codenames.

      • I think what will be interesting and compelling about Broadwell desktop is Iris Pro graphics on LGA parts (not just BGA mobile parts, as with Haswell). It certainly won't be capable of competing with high-end cards, but you can probably expect mid-range discrete-graphics performance built into the CPU.

        For your standard desktop tower gaming rig it doesn't matter much, since you'll likely be using discrete graphics there anyway; what excites me more is mid-range discrete-graphics performance without the added…

        • While Iris Pro performs quite well when you turn graphics settings down low enough to fit most of the resources in the 128 MB Crystalwell L4 cache, nobody interested in mid-range graphics would be willing to give up that much quality for decent frame rates. Once you exceed that 128 MB, even low-end discrete GPUs with GDDR5 take the lead. Broadwell's four extra units are not going to change this by much.

          If Intel released chips with an upgraded 512 MB Crystalwell and twice the L4 bandwidth, then that would nuke low-end G…

  • 14 nanometers? (Score:5, Insightful)

    by ChipMonk ( 711367 ) on Monday August 11, 2014 @03:51PM (#47650633) Journal
    Given that the covalent radius of silicon is 111 picometers, that comes to a channel that's 63 silicon atoms across.

    And I thought 65nm (~300 silicon atoms across) was impressive five years ago.
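
    The arithmetic checks out, taking one atom's width as twice the covalent radius:

        SI_COVALENT_RADIUS_PM = 111  # covalent radius of silicon, in picometers

        def atoms_across(feature_nm):
            # Feature size divided by an atomic diameter (2 * radius)
            return feature_nm * 1000 / (2 * SI_COVALENT_RADIUS_PM)

        print("%.0f atoms at 14 nm" % atoms_across(14))  # ~63
        print("%.0f atoms at 65 nm" % atoms_across(65))  # ~293, roughly 300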
  • The transistor budget may still be scaling according to Moore's law, but that's failing to translate into real-world speed increases. The 5% increase in single-core IPC is weak sauce. And an annoying number of apps don't scale to multiple processors, or scale badly (Amdahl's law is unforgiving...)

    You can add more cores, add more compute units to your GPU, or add a DSP (Broadwell) or an FPGA (Xeon), but each has an ever-decreasing marginal impact on real-world speed.

    We're probably stuck in a "5% IPC increase per…
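
    Amdahl's law, as invoked above, in a few lines of Python: with a parallel fraction p of the work, N cores give a speedup of 1 / ((1 - p) + p / N), which flattens out quickly:

        def amdahl_speedup(p, n):
            # p: fraction of the work that parallelizes; n: core count
            return 1.0 / ((1.0 - p) + p / n)

        for p in (0.5, 0.9, 0.95):
            print("p=%.2f: 4 cores -> %.2fx, 16 cores -> %.2fx"
                  % (p, amdahl_speedup(p, 4), amdahl_speedup(p, 16)))
        # Even at p=0.90, 16 cores only deliver ~6.4x.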

  • Power is governed by state changes per second. Dynamic power scales linearly with frequency but with the square of the supply voltage. There's only so much saving from reducing voltage, too, as you run into thermal issues and electron-tunnelling errors.

    You are much, much better off saying "bugger that for a lark", exploiting tunnelling to the limit, switching to a lower-resistance interconnect, cooling the silicon below 0°C, and ramping up clock speeds. And switching to 128-bit logic and implementing BLAS and FFT in silicon.

    True,…
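
    The relation behind that claim is the standard CMOS dynamic-power formula, P ≈ C·V²·f; here is a small sketch with hypothetical values:

        def dynamic_power_w(c_farads, v_volts, f_hz):
            # Classic switching-power model: capacitance * voltage^2 * frequency
            return c_farads * v_volts ** 2 * f_hz

        base = dynamic_power_w(1e-9, 1.0, 3.0e9)   # hypothetical: 1 nF, 1.0 V, 3 GHz
        lower = dynamic_power_w(1e-9, 0.8, 3.0e9)  # same chip at 0.8 V
        print("0.8 V draws %.0f%% of the 1.0 V power" % (100 * lower / base))  # 64%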

  • I have a ThinkPad 8 and a Miix 2 8. The ThinkPad 8 is a desktop replacement: I use Bluetooth for keyboard and mouse, run an HDMI monitor, and feed power in through USB. It works well, but not perfectly. I'll upgrade to a good Broadwell or Cherry Trail. Anyway, the future looks awesome.

  • ... that is 12 nm slimmer than it might otherwise be without this new technology.
  • These innovations make me want to buy Intel stock.
