Intel Demos Kaby Lake 7th Gen Core Series Running Overwatch At IDF (hothardware.com)

Reader MojoKid writes: Intel unveiled a number of new product innovations at IDF last week, but the company also stuck to its core product march by teasing its next-gen Core series processor. Kaby Lake is the follow-up to the current, 6th Generation Skylake-based Core processors. With Kaby Lake, Intel is adding native support for USB 3.1 Gen 2, along with a more powerful graphics architecture for improved 3D performance and 4K video processing. Kaby Lake will also bring native HDCP 2.2 support and hardware acceleration for HEVC Main10 (10-bit) and VP9 10-bit video decoding. To drive some of those points home, Intel showed off Overwatch running on a next-gen Dell XPS 13 built around a 7th Gen ULV Core i5 processor, in addition to an HP notebook smoothly playing back 4K HDR video. Kaby Lake 7th Generation Core-based products should start arriving on the market in the fall.
Comments:
  • "HP notebook smoothly playing back 4K HDR video"

    Is this not possible at the moment?

    • Re:4K HDR video (Score:5, Insightful)

      by CajunArson ( 465943 ) on Monday August 22, 2016 @10:00AM (#52748159) Journal

      HDR HEVC video on 4K is not trivial to process on a very low power CPU. That's where the hardware acceleration comes in as being important.

      As a point of reference, my desktop 4770K that's overclocked to 4.7GHz can have problems with playback of 60 Hz 4K HEVC video when in software mode, and that's with a software decoder that's using all available cores too.

      With mpv set up properly, my GTX 1080 can show the same videos perfectly smoothly with single-digit CPU usage, and the GPU doesn't even really heat up much either, since only a relatively small part of the GPU actually does the video acceleration. The hardware-accelerated paths for video decoding are quite important.
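      For the curious, "set up properly" boils down to a couple of lines of mpv.conf. This is only a sketch: `hwdec=auto` asks mpv to pick whatever decode API the platform offers (VDPAU/VAAPI on Linux, DXVA2 on Windows), and the option names assume a reasonably recent (2016-era) mpv build.

```
# mpv.conf -- minimal hardware-decode sketch (option names assume a 2016-era mpv)
hwdec=auto    # use the platform's hardware decoder (vdpau/vaapi/dxva2/...)
vo=opengl     # GPU-based video output path
```

      You can verify which decoder is in use by watching mpv's terminal output for a "Using hardware decoding" line during playback.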

      • You know, I never have problems running 1080p or 4K content in a browser (Netflix or YouTube) with my Intel i7-3770 / GeForce GTX 275 desktop. However, CPU usage goes up and frames start dropping the moment I rent an HD movie from iTunes. Why, for the love of all that's holy, is iTunes still the biggest pile-o-shit on a PC??!!

        • by darkain ( 749283 )

          The bitrate is probably substantially higher. Netflix and YouTube aren't exactly known for having stellar bitrates on their videos...

  • Nice but... (Score:1, Redundant)

    ....the digital revolution is coming to an end. Moore's Law has already ended, and as a corollary the processing power of digital computers will improve only incrementally. This is a big deal, because it throws future developments into doubt. Will we ever be able to handle the ever-increasing processor needs of applications? A lot of people are depending on seemingly infinite processing power to get real AI. Is this ever going to be possible? It seems unlikely, since we are seeing processor improvements of only 30% per generation.
    • We may be reaching the limits of current design, but there is a path forward to continue with performance leaps: https://en.wikipedia.org/wiki/... [wikipedia.org]
    • ....the digital revolution is coming to an end.

      So Intel's next processor will be 'Swan Lake'?

      • I think Intel has been dancing Swan Lake for quite a while now. There hasn't been much in the way of raw processor improvement for several years.
    • by Z80a ( 971949 )

      I think one of the possible ways they will go is following that HBM memory idea.
      The idea would be to manufacture the CPUs/GPUs etc. as physically separate parts, and then assemble them, "megazord"-style, into a single package. This lets you get around the yield problems of manufacturing "giant" chips, and even offer more fine-grained price/performance setups.
      Of course, this idea brings its own problems, like cooling the thing, having machines precise enough to get a good yield when aligning the chips, the reduce

      • by Z80a ( 971949 )

        To be more specific, since it wasn't very clear: those parts are not just "cores and GPUs" but caches, integer/FPU pipelines, branch predictors, instruction decoders, etc.

    • by Black.Shuck ( 704538 ) on Monday August 22, 2016 @10:37AM (#52748391)

      Too bad, but it was nice while it lasted!

      You mean we'll have to start work on optimising our software?

      Shit.

    • A lot of people are depending on seemingly infinite processing power to get real AI.

      Actually, pretty much everybody in the AI world is convinced that running AI on general-purpose computing hardware is very inefficient, and that artificial-neuron-like hardware is the future for fast AI.

  • by ledow ( 319597 ) on Monday August 22, 2016 @10:02AM (#52748171) Homepage

    The most pointless, short, useless and under-described "demo" I've ever seen.

    I'm not familiar with Overwatch's specs, but they pretty much show one short-range view of two static robots turning a corner to walk up some stairs past some skyboxes, then cut away, and that's IT. Nothing there is performance-related. And we know why: compared to a real graphics card, it can't compete.

    All the other stuff was pretty meh too. Oh look, it's faster than previous generations. Cool. I should hope so otherwise it's pointless trying to sell it.

    • Don't be so mean! Shouldn't it be totally exciting that Intel's coming-real-soon-now-we-promise integrated GPUs are capable of running a game aimed at the low cost AMD GPUs of three years ago that power today's consoles?

      I know I'm excited!
  • And that is where Intel needs to step up, and no, 8-12 lanes out of the PCH that are fed by an x4 DMI link do not count.

  • WTF is USB3.1 Gen2?

    Wasn't USB 3.1 a colossal clusterfuck already? Did they really have to make it even *more* complicated?

    I wish Apple would pull their thumbs out and just license their lightning connector. Simple, clean design, and none of this 50 Shades of USB nonsense.

    • by swb ( 14022 )

      It's too bad the USB consortium can't get their marketing speak right.

      As I understand it, what we think of as USB 3 is really USB 3.1 Gen 1. Gen 2 adds 10 Gbit/s as the maximum speed.

      It's too bad their marketing speak is so brain-damaged, because the widespread USB 3 has managed to produce useful high speeds with negligible CPU overhead.

      Getting 10 Gbit/s out of that port is pretty decent, and I wish there were better vendor support for devices traditionally connected via SAS to use 3.1 Gen 2 ports.
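      For the arithmetic behind those speeds: Gen 1 signals at 5 Gbit/s with 8b/10b line encoding, while Gen 2 signals at 10 Gbit/s with the leaner 128b/132b encoding, so the usable payload rates work out roughly as below. This is a back-of-the-envelope sketch that ignores protocol overhead above the line encoding, so real-world figures come in lower.

```python
# Effective payload bandwidth after line encoding.
# Ignores protocol overhead above the encoding layer, so these are upper bounds.
def effective_mbytes_per_sec(signal_gbit, encoded_bits, raw_bits):
    usable_bits = signal_gbit * 1e9 * raw_bits / encoded_bits  # bits/s surviving encoding
    return usable_bits / 8 / 1e6                               # convert to MB/s

gen1 = effective_mbytes_per_sec(5, 10, 8)      # USB 3.1 Gen 1: 8b/10b encoding
gen2 = effective_mbytes_per_sec(10, 132, 128)  # USB 3.1 Gen 2: 128b/132b encoding

print(round(gen1), round(gen2))  # 500 vs 1212 MB/s
```

      So Gen 2 is a bit better than 2x Gen 1, since the encoding overhead drops from 20% to about 3% on top of the doubled signaling rate.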

    • They had their chance, and blew it. Now USB-C is the de facto standard, mandated by the EU for universal charging. Why else do you think the MacBooks are going this route? Lowest common denominator. And yes, Lightning is a superior connector.

    • https://en.wikipedia.org/wiki/... [wikipedia.org]

      It is that newfangled connector that adds more power, reversibility (of the connector, and of the host/device relationship), and some Thunderbolt integration into the USB spec.

  • Integrating a GPU is nice, but can somebody show me a benchmark that proves Kaby Lake is actually any faster than Skylake? Anybody that cares about gaming is going to disable the built-in GPU and run a GTX 1080 anyway, so what's the point? More cores? No current games use all the cores.
    • If 8 cores were standard, I think you would see game engines putting a lot of effort into making use of them. I wish the dead silicon of the GPU in my Skylake were 2 more cores; that would be more value than a disabled crappy GPU in a highish-end machine.

      The current pricing for a 6 or 8 core CPU is obscene, highway robbery at its worst.

  • Maybe I missed something, but the FX-8300 says Copyright 2011 on it, and besides the 220W joke they played on everyone, they haven't come out with anything since. What exactly are they doing over there at AMD? SOMEONE needs to put some pressure on Intel to make them lower their utterly ridiculous prices.
    • I can't tell if you are serious or not, but their Zen architecture should be dropping soon, and in theory they have at least caught up with the Intel CPUs of a generation or two ago.

      If they have a good price point, they might start actually giving Intel some competition, which is good, since Intel has done next to nothing very interesting since the Ivy/Sandy Bridge days.

  • by Anonymous Coward

    Kaby Lake will also bring with it native HDCP 2.2 support

    Woooo! That sounds like an awesome feature. Will it definitely stop me doing what I want with the output of my graphics card, for realsies this time? And native! I'm so glad I can do away with the HDCP 2.2 add-on dongle I've had strapped to my PC to restrict me in the meantime.
    1) If I've completely misunderstood the purpose of modern HDCP, oops, sorry, but who's going to RTFA or Google for a spec?

  • Really. (Score:4, Interesting)

    by John Smith ( 4340437 ) on Monday August 22, 2016 @01:59PM (#52750161)
    Kaby Lake is Skylake Refresh. No new wafer, just slightly improved clocks, slightly lower prices and a new chipset.
  • Is that what they're saying? Doesn't that go on the Southbridge?
