Intel Core i7-7700K Kaby Lake Review By Ars Technica: Is the Desktop CPU Dead? (arstechnica.co.uk)

Reader joshtops writes: Ars Technica has reviewed the much-anticipated Intel Core i7-7700K Kaby Lake, the recently launched desktop processor from the giant chipmaker. And it's anything but a good sign for enthusiasts who were hoping to see significant improvements in performance. From the review: "The Intel Core i7-7700K is what happens when a chip company stops trying. The i7-7700K is the first desktop Intel chip in the brave new post-'tick-tock' world -- which means that instead of major improvements to architecture, process, and instructions per clock (IPC), we get slightly higher clock speeds and a way to decode DRM-laden 4K streaming video. [...] If you're still rocking an older Ivy Bridge or Haswell processor and weren't convinced to upgrade to Skylake, there's little reason to upgrade to Kaby Lake. Even Sandy Bridge users may want to consider other upgrades first, such as a new SSD or graphics card. The first Sandy Bridge parts were released six years ago, in January 2011. [...] As it stands, what we have with Kaby Lake desktop is effectively Sandy Bridge polished to within an inch of its life, a once-groundbreaking CPU architecture hacked, and tweaked, and mangled into ever smaller manufacturing processes and power envelopes. Where the next major leap in desktop computing power comes from is still up for debate -- but if Kaby Lake is any indication, it won't be coming from Intel."

While Ars Technica complains about the minimal upgrades, AnandTech looks at the positive side: "The Core i7-7700K sits at the top of the stack, and performs like it. A number of enthusiasts complained when Intel launched the Skylake Core i7-6700K with a 4.0/4.2 GHz rating, as this was below the 4.0/4.4 GHz rating of the older Core i7-4790K. At this level, 200-400 MHz has been roughly the difference of a generational IPC upgrade, so users ended up with similar-performing chips and the difference was more in the overclocking. However, given that the Core i7-7700K comes out of the box with a 4.2/4.5 GHz arrangement, and support for Speed Shift v2, it handily mops the floor with the Devil's Canyon part, consigning it to history."
  • by Chris Katko ( 2923353 ) on Tuesday January 03, 2017 @03:23PM (#53599461)

    If the article ends with a question mark, the answer is "No". Because if they had evidence to say it, they would have just put a period.

    • by DamonHD ( 794830 ) <d@hd.org> on Tuesday January 03, 2017 @03:30PM (#53599515) Homepage

      I quote:

      Betteridge's law of headlines is one name for an adage that states: "Any headline that ends in a question mark can be answered by the word no." It is named after Ian Betteridge, a British technology journalist, although the principle is much older.

      https://en.wikipedia.org/wiki/... [wikipedia.org]

      Rgds

      Damon

    • If the article ends with a question mark, the answer is "No". Because if they had evidence to say it, they would have just put a period.

      The articles linked end with periods. The headline ends with a clickbait, troll, sensationalist shit-up-the-internet question. Ars used to be better than that.

      • by pla ( 258480 )
        "Used to".

        Ars hasn't been better than that since... Well, since the days when Slashdot was better than having clickbait make it to the FP. :)
      • Ars hasn't been worth reading since Hannibal left. His were the only articles where I'd read something in a field that I knew about and not only fail to spot any glaring errors, but also learn something new. None of the other Ars authors seem to have even a vague clue about what they're writing about.
    • by gweihir ( 88907 )

      Well, any tech article that proclaims something "dead" or asks whether it is "dead" usually is just a sign of a brain-dead writer. Also, anybody that expects any real speed-ups from Intel in the next 2-3 years has no clue how long it takes to fundamentally improve a CPU.

      • by Gr8Apes ( 679165 ) on Tuesday January 03, 2017 @05:15PM (#53600203)

        Also, anybody that expects any real speed-ups from Intel in the next 2-3 years has no clue how long it takes to fundamentally improve a CPU.

        Since the 980X's release in 2010, we've only moved up the charts maybe 50% on a per-core basis. Note that a 980X is unlocked and can be pushed significantly above its stock clocks. A 4790K (the fastest single-core performer) can only be overclocked a little, so the real performance difference may actually be significantly less than 50%. And that's just sad given that it's now 7 years later.

        As a final insult, to actually double the performance from 7 years ago, you'll be spending $1,500 or more on a 10-core 6950X, and that's before exercising the 980X's considerable overclocking headroom over that of the 6950X.

        • by gweihir ( 88907 )

          Indeed. My take is that AMD will now catch Intel and maybe move a tiny bit ahead (10-20%) in the years to follow. Intel will eventually find those 10-20% as well, but that is basically it for the AMD64 architecture. Not that I am complaining; I think the raw computing power is pretty awesome. Software wastes most of it, though, and frameworks, interpreted languages, and clueless coders are the main reasons.

          The only real option, barring some fundamental breakthrough (not even on the horizon, caches, pipelining

          • by JanneM ( 7445 )

            The only real option, barring some fundamental breakthrough [...] is massively more and simpler cores

            The problem with that approach is that most problems are not infinitely parallelisable, and some important problems fundamentally do not parallelise at all. You rapidly hit diminishing returns from adding cores (see the sketch below), and that's before you consider that shared-memory architectures stop scaling beyond a dozen cores or so.

            The newest generation of supercomputers already have big problems finding jobs that actu
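
            A rough sketch of that ceiling via Amdahl's law, in Python; the 5% serial fraction below is an illustrative assumption, not a measured figure:

              # Amdahl's law: overall speedup is capped by the serial fraction.
              # The 5% serial fraction here is an illustrative assumption.
              def amdahl_speedup(serial_fraction, cores):
                  return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

              for cores in (2, 4, 8, 16, 64, 1024):
                  print(f"{cores:5d} cores -> {amdahl_speedup(0.05, cores):5.2f}x")
              # 2 -> 1.90x, 16 -> 9.14x, 1024 -> 19.64x: even a thousand cores
              # cannot push a 5%-serial job past a 20x speedup.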

            • by gweihir ( 88907 )

              I am aware of that. Even parallelising software well will not give us much more, but there may still be some real gains to be had in areas like gaming, simulation, and classifiers (often misnamed "learning"). One of the nice things about ARM, though, is that you can have different cores and mix them, and that it generally draws much less power. But yes, for many tasks that have no speed-up, or a really bad speed-up when parallelized, we are now possibly seeing close-to-endgame performance. This is not re

    • by sconeu ( 64226 )

      I refuse to believe that the Desktop computer is dead until Netcraft confirms it!

    • by Z00L00K ( 682162 )

      Desktop computers are dead; tower computers aren't.

      But improvement in CPU technology seems to have slowed to a snail's pace over the last few years. Lack of competition combined with less pressure from the market seems to be the cause; computers seem to have reached a flat spot in the push for improved performance for many applications.

        But improvement in CPU technology seems to have slowed to a snail's pace over the last few years. Lack of competition combined with less pressure from the market seems to be the cause; computers seem to have reached a flat spot in the push for improved performance for many applications.

        Thing is, even for people who actually use a computer to do work - how many tasks are really CPU-limited anymore? Obviously there are some niches where a faster CPU will improve the efficiency of their workflow... but that's a small percentage of an ever-declining overall percentage.

        • I've just been using some advanced image sharpening software. Waiting an hour between runs to see if I should adjust some input parameter is tedious. I could use a 1000X speedup, and I don't see that happening in my lifetime.
    • by kuzb ( 724081 )
      ...which is why ars is no longer worth reading. Everything is clickbait with them.
  • No. (Score:4, Insightful)

    by Hylandr ( 813770 ) on Tuesday January 03, 2017 @03:29PM (#53599497)

    A story like this comes out at least twice a year. The harsh/glorious reality hasn't changed: if you want to get real work done, it's going to be on a desktop. Even laptops get docked with a proper keyboard, mouse, and at least one extra monitor when it's time for heavy lifting.

    Then again, one has to wonder at the headline: a tech update ("NEW CPU!") combined with the leading question "Is the desktop dead?". Will the new Slashdot owners please stop treating these message boards like the alphabet channels and focus on geek culture? Sure, it's yours, but can you at least pretend it hasn't been subjugated by the mainstream entertainment industry?

    Also, any headline that asks a question can be answered with 'No'.

    • Re:No. (Score:5, Interesting)

      by DamonHD ( 794830 ) <d@hd.org> on Tuesday January 03, 2017 @03:36PM (#53599559) Homepage

      The harsh/glorious reality hasn't changed: if you want to get real work done, it's going to be on a desktop.

      Depends what you mean by "real". Yes, I got paid megabuck(s) in banking to optimise quant algos across cores, CPUs and servers in (eg) the Credit dept at Lehman's, but I find my nominally underpowered MacBook Air (the saleswoman was slightly reluctant to sell it to me when I said I was a dev) to generally be damn good for what I need, including some decent data driven models and analysis, wrapped in not-even-optimised C++ unit tests, and running within a Java-based IDE!

      So, horses for courses.

      Also, I am the happy owner of an RPi that does all the work a Sun server farm used to do for me:

      http://www.earth.org.uk/note-o... [earth.org.uk]

      and I target my primary code to 8-bit MCUs similar in power to a Z80A from 30Y ago, running some nice slim highly-optimised distributed code.

      Cut your suit to fit your cloth.

      Rgds

      Damon

      • by Jawnn ( 445279 )
        I'd say it depends more on how you define "get it done". I can perform any of my regular chores on my laptop, but not as quickly nor as easily. My desktop has more grunt per dollar spent on the basic platform, and has three displays and a much friendlier mouse and keyboard. For my workload, looking from one monitor to another is a lot more effective than alt-tab, repeat.
        • I can perform any of my regular chores on my laptop, but not as quickly nor as easily.

          I guess it depends on whether your job lets you ride a bus or train as opposed to driving or cycling. Transit users can make productive use of commute time, for which a laptop is more efficient than not having a suitable computer at all.

          • by DamonHD ( 794830 )

            Well, there's a good point, yes.

            So I can get my work done at my desk or kitchen table or on the sofa or in bed, as well as on the train and when hot-desking (since my company doesn't have 'an' office, so we meet up ~1/week). Having a desktop in all of those places would be ... impractical.

            And as to the GP point about "getting it done", yes bigger more ergonomic displays would be good sometimes, but impractical in many of the places I work, as before.

            And in terms of keeping me waiting: I often find that the

      • by Hylandr ( 813770 )

        and I target my primary code to 8-bit MCUs similar in power to a Z80A from 30Y ago, running some nice slim highly-optimised distributed code.

        You forgot to add 'up and down hills for 100 miles in the snow on a bicycle backwards while on your way to school.' :) It sounds like what you're working on is mostly text. While a tablet, phone, or laptop can certainly host a terminal window, typing speed is still much faster with a proper keyboard, imho.

        Cut your suit to fit your cloth.

        Very interesting quote. Following that comparison, I wonder how much our smartphones clothe us today?

        If the amount and quality of clothing were expressed in computing power then the first astronauts to land

        • by DamonHD ( 794830 )

          and I target my primary code to 8-bit MCUs similar in power to a Z80A from 30Y ago, running some nice slim highly-optimised distributed code.

          You forgot to add 'up and down hills for 100 miles in the snow on a bicycle backwards while on your way to school.' :) It sounds like what you're working on is mostly text. While a tablet, phone, or laptop can certainly host a terminal window, typing speed is still much faster with a proper keyboard, imho.

          I am actually from Yorkshire and resemble that remark! We fought over our holes in the ground...

          But again, my MBAir keyboard is one of the better ones I've used, and I do a lot of typing (including code and words, for a living). Laptop ergonomics are not great, but in any case, to come back to the original point of the fine article, that hardly has a very strong connection with the CPU type. Or am I misunderstanding you?

          I do love my terminal windows and vi though!

          Cut your suit to fit your cloth.

          Very interesting quote. Following that comparison, I wonder how much our smartphones clothe us today?

          If the amount and quality of clothing were expressed in computing power, then the first astronauts to land on the moon did so wearing loincloths. Now there's a mental visual!

          Very very scanty string thongs.

          The first (Cray

    • I can't imagine buying a laptop for a machine that sits in my basement to hold a lot of files for my family to use. Or a machine to back up that machine.
    • I think you misread the headline. It asks, "Is the desktop CPU dead," not, "Is the desktop dead." This is nothing to do with keyboards and mice. It's about whether there are CPUs that are specifically designed for desktops, ones that are a lot more powerful than the ones in laptops.

      • by Hylandr ( 813770 )

        Looks like another headline edit.

        Going to have to start making screenshots. CNN does the same thing to its headlines.

  • [sic] (Score:2, Informative)

    by Anonymous Coward

    [sic] does not mean [wikipedia.org] what you think it means.

  • by randomErr ( 172078 ) <.ervin.kosch. .at. .gmail.com.> on Tuesday January 03, 2017 @03:32PM (#53599531) Journal
    It seems like the major development effort is switching to portable devices. Will ARM or one of the new RISC designs become the new standard on desktops? Raspberry Pis are good enough for most people's needs.
    • This ARM board looks promising:
      64-bit
      4 GiB of RAM
      32 GB of eMMC flash
      802.11ac WiFi
      Ethernet
      2x M.2 PCIe
      USB 3
      $199
      https://www.kickstarter.com/pr... [kickstarter.com]

      or the higher-end, much more expensive AMD A1100-series Opteron:
      http://softiron.com/products/o... [softiron.com]

      • by fnj ( 64210 )

        That's about the best Arm64 board I've seen, but still comes up short despite being overpriced. No SATA, and no DIMM slots for real RAM expandability.

        • SATA could be fixed: the CPU doesn't support SATA, but it does have limited PCI Express, so motherboards could just add a controller chip. The RAM limit can't be solved that easily; most ARM chips are made with mobile phones in mind and are generally limited to 4GB or so, the most I've seen being 6GB. Motherboard makers just solder on the max amount supported.

          I do love that these small boards have neat things that desktop computers just don't have, like 4k camera support and hardware x264 vi
    • by Luthair ( 847766 )

      If we're talking Windows running x86 software on ARM, I doubt it. I have a hard time believing we're not going to be seeing Netbook 2.0 here. The top ARM processors aren't as powerful as the commonly used x86 processors (and incidentally, people claim the base MacBook isn't powerful enough); then add a translation penalty.

      The second half of this equation: if the manufacturer goes for the cheaper, slower CPU, they will also do the same for every other part, and end up with a slow piece of junk.

      • by tepples ( 727027 )

        I have a hard time believing we're not going to be seeing Netbook 2.0 here.

        If it's Netbook 2.0, count me in, because a 10-inch laptop was small enough not to be as obvious a target for thieves as a 13-17 inch laptop.

        The top ARM processors aren't as powerful as the commonly used x86 processors

        But are they more powerful than the 1-core, 2-thread Atom CPUs from the netbook era, especially when skipping the translation layer by running software recompiled by its publisher or (in the case of free software) by the user's copy of GCC or Microsoft C++? One reason Surface RT 1 and 2 failed was that Microsoft deliberately locked out publishers and

    • by caseih ( 160668 )

      I have my doubts. Like I've said before, I have a drawer full of various ARM devices that all turned out to be less useful in real life than they looked on paper. The main problem is that there is just no standard for ARM SoCs. Each one requires a custom kernel and distribution. They don't have common hardware trees, and most importantly they lack a common, open boot loader. So you're always fighting with some custom U-Boot. I'd far rather have a normal EFI BIOS in there and the ability to boot of

      • by e r ( 2847683 )
        Agreed. The thing that is holding ARM back is just how standardized x86 is and how fly-by-night and slipshod ARM's infrastructure is.
        • slipshod ARM's infrastructure is.

          That and the quasi-open status of most chips is what kills it for me. I had a SheevaPlug [wikipedia.org] years ago that was great. It ran my house's HVAC and web server and did some light downloading. The U-Boot was fairly straightforward and the Kirkwood chipset made its way into Debian. I never had a reason to replace it, so I didn't.

          Recently I got a CubieBoard [cubieboard.org] since it billed itself as "Open Source Hardware". It was shit. Nothing on it was open. U-Boot was a mess. It only ran specific versions of Ubuntu that didn't have a

      • I have my doubts. Like I've said before, I have a drawer full of various ARM devices that all turned out to be less useful in real life than they looked on paper. The main problem is that there is just no standard for ARM SoCs. Each one requires a custom kernel and distribution. They don't have common hardware trees, and most importantly they lack a common, open boot loader.

        ^^^ This.

        Folks that can't understand why it's such an effort for their carrier to update the Android OS on their device, or why they can't just compile AOSP and flash it onto their phones should read this.

    • Plain old RISC was a fad. On general-purpose hardware, a CISC instruction set with a RISC execution pipeline is hands-down better w.r.t. performance, and that's what (for example) both Intel and AMD are already doing.

      As far as ARM penetrating the desktop space, I think it's a certainty that the x86 line will eventually fall due to licensing. Intel is losing the fab edge (they are now arguably just keeping up), and if all these other fabs can't produce x86, they will still produce something. Maybe ARM takes ov
    • Yes, they are absolutely coming to... servers and laptops and eventually desktops. Remember, all we need is the right major crisis and the nations will accept the GNU World Order (many think David Rockefeller said "New World Order", but GNU is actually pronounced "new").

      Today we have something called Remix OS, which is an Android distribution for x86/x86-64 computers. I have an older laptop that I put Fedora and Remix OS on in dual-boot, and this let me do something interesting: benchmark Android on said laptop
  • by ErichTheRed ( 39327 ) on Tuesday January 03, 2017 @03:38PM (#53599571)

    For me, docking stations and big monitors allow me to use my laptop in a reasonably comfortable work environment. But there are still use cases for desktop PCs, especially those that aren't shoved into the back of an all-in-one monitor. You're not going to let a call center employee in a regulated, locked-down environment pull out his iPad or laptop to work, for example. A cash register is likely going to be some sort of PC, same thing with a kiosk or ATM. And at the high end, workstations are meant for "real" work, though most have Xeon processors in them. It's an interesting time; desktops and thin clients are sort of merging, and tablet use is demanding more of CPU manufacturers' attention. And this makes sense: mobile stuff has the constant pressure to be squeezed into smaller spaces, run cooler, and provide more on-chipset functionality, all at the same time. I'm still surprised when I see a Surface Pro or other convertible tablet and remember that there's a full-fat Intel processor crammed inside that tiny case without melting through the bottom!

    I just think the desktop market is maturing and there's less and less that Intel processors and chipsets don't natively provide. PC processors are already insanely fast and powerful for what typical users throw at them. Desktops aren't dead, they're just a niche market these days, but one that is still there. The pundits want to claim that no one wants a powerful client device and just wants all their stuff streamed from the cloud onto a tablet or phone they don't control. I think that's true in the consumer space, but businesses still have use cases for desktops.

    • by barc0001 ( 173002 ) on Tuesday January 03, 2017 @03:59PM (#53599725)

      That's not really what the "question" in the article was implying, though. I completely agree that desktops are going to be a thing for ages to come yet (and I have 2), but the question was lazily trying to point out that performance increases on the desktop are seemingly coming to a halt for newer chips. This isn't really a surprise for me, as I've got a 5-year-old i5-2500K in my home machine that is keeping pace with even the newer games just fine, as long as I spend a couple hundred bucks every 2-3 years on a new video card. Same at the office: when we assessed our 3-year upgrade cycle for workstations, we realized we'd only get a 20-25% boost in peak processing power by spending our full per-person budget on new machines. Instead we decided to keep what we have, switch all boot OS drives to SSD, max out the RAM, and get 32" monitors, and we STILL have money left over.

      I'm not sure if AMD's got anything in the pipeline that can shake things up, but if they do, this is their chance (again).

      • by Bigjeff5 ( 1143585 ) on Tuesday January 03, 2017 @04:11PM (#53599807)

        I'm not sure if AMD's got anything in the pipeline that can shake things up, but if they do, this is their chance (again).

        Some of the official stuff released about Ryzen looks pretty spectacular. It's still not clear whether it will be able to beat Intel in total performance, but it's looking damn close, which is really encouraging to me. Furthermore, they are actually introducing new technologies in the chip, rather than slightly polishing old ones.

        I have my doubts that AMD will fully match Intel this cycle, let alone beat them, but it gives me hope for the future. It's pretty clear right now who is resting on their laurels and who is driving to be the future of CPUs.

    • by Kjella ( 173770 )

      For me, docking stations and big monitors allow me to use my laptop in a reasonably comfortable work environment. But there are still use cases for desktop PCs, especially those that aren't shoved into the back of an all-in-one monitor. You're not going to let a call center employee in a regulated, locked-down environment pull out his iPad or laptop to work, for example.

      No, but neither is he likely to use a proper desktop. Thin clients and virtualization are so much easier to deal with if you consider fixed locations and centralized control to be a feature. Sure, there are those with particular workstation or input/output device needs, but not the average corporate desktop. If it wasn't for gaming, I think the desktop would be relegated to a small, small niche.

      • A call center running thin clients with dual screens takes a lot of network bandwidth, though. In some settings you may need dual screens / one big screen to be able to show lots of info at the same time.

  • Kaby Lake desktop is effectively Sandy Bridge polished to within an inch of its life, a once-groundbreaking CPU architecture hacked, and tweaked, and mangled into ever smaller manufacturing processes and power envelopes.

    Disasters like Netburst aside, is this not the usual pattern?

    1) Invest millions in designing a new architecture. Incorporating everything learned about CPU design in the past, try to open as big an advantage over your nearest competitors as possible.
    2) Profit.
    3) Make minor revisions to protect your advantage and create an excuse for the high-performance market, where your biggest margins are, to buy new parts.
    4) Profit some more, and with greater margin.
    5) ... repeat for as long as competition / the existing design allows.

  • For CPUs, there's really not a lot left to do. Stream video? Load Facebook? I'm pretty sure the older chips do that just fine.

    The real action is in GPUs.

  • Two trivial benchmarks, really?
  • Lets compare (Score:5, Informative)

    by Anonymous Coward on Tuesday January 03, 2017 @03:50PM (#53599657)

    Top Kaby Lake: Intel Core i7-7700K @ 4.2GHz has a Passmark score of 12800, for $350, at 95W, released Q4 2016.
    Top Sandy Bridge: Intel Core i7-3970X @ 3.5GHz has a Passmark score of 12651, for $770, at 150W, released Q1 2012.

    So yes, it looks like almost five years got us 1/3 less power and half the price for the same performance as the top Extreme Sandy Bridge.

    http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-3970X+%40+3.50GHz&id=1799
    http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-7700K+%40+4.20GHz&id=2874
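
    Working those quoted figures into points per watt and points per dollar (a quick sanity-check sketch in Python; the framing is illustrative, the numbers are the poster's):

      # Passmark points per watt and per dollar, from the figures quoted above.
      chips = {
          "i7-7700K (Kaby Lake)":    {"passmark": 12800, "price": 350, "tdp": 95},
          "i7-3970X (Sandy Bridge)": {"passmark": 12651, "price": 770, "tdp": 150},
      }
      for name, c in chips.items():
          print(f"{name}: {c['passmark'] / c['tdp']:.0f} pts/W, "
                f"{c['passmark'] / c['price']:.1f} pts/$")
      # i7-7700K: ~135 pts/W, ~36.6 pts/$
      # i7-3970X:  ~84 pts/W, ~16.4 pts/$

    Same performance, roughly 60% better performance per watt, and more than double the performance per dollar.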

    • Re: (Score:2, Informative)

      by Anonymous Coward

      Yep, my own 7-year-old PC gets 11,800 on Passmark. I recently got an old, cheap Xeon W3680 from eBay, which is 7 years old; but overclocked to 4.3 GHz it's still able to mix it with the latest high-end desktop CPUs. It's remarkable to see how little difference 7 years has made. In 2010, a 7-year-old PC would be obsolete, not still up there with the latest kit.

    • by Kjella ( 173770 )

      Of course the "top extreme" of anything will always be ridiculously poor value for money. A i7-2600k @ 3.4GHz has a Passmark score of 8488 for $317 at 95W released Q1 2011, correcting for inflation it's pretty much same price, same power, +50% performance increase in six years or about 7% annually. That's ten years to double performance, the next generation will have eight times the performance in 30 years. Granted if it was anything other than computers it wouldn't be that bad, but if you compare the 2010s

      • I see it as pretty obvious that x86/x86-64 CPUs stalled out because Intel decided to milk the market due to the lack of competition (from AMD). Look at Intel's offerings in the Xeon E7 family for examples of exactly why I say this is obvious. It's not like chips that are far better than what is currently offered to average consumers do not exist; they do, they are just priced outrageously.

        If AMD delivers with Ryzen and offers something with a good IPC and lots of cores at half the price of Intel then per
        • by Kjella ( 173770 )

          Hardware x264 and x265 video decoding and x264 video encoding has been standard in ARM chips for years. Intel just got x264 decoding. They don't have 265 decoding and they don't have any hardware video encoding. I could go on but my point is that Intel has fallen way behind because they figured they didn't have competition.

          Not sure what you're smoking; H.264 encode and decode support has been there since Sandy Bridge [wikipedia.org] 6 years ago, and Kaby Lake does H.265 Main10 decoding at UHD resolution as well as 8-bit encoding. Maybe you've used a poor media player?

    • by Malc ( 1751 )

      How's the hardware-accelerated QSV HEVC encoding at 4K on that Sandy Bridge system? Oh yeah: it's not possible.

      • Works just fine on my i7-2600k system because I've got an AMD video card that handles it way better than Intel's Quick Sync or whatever they call it ever did or will in the foreseeable future.

    • Top Sandy Bridge is actually a Xeon E5-2690, which has a Passmark score of 20699 at 135W TDP.
      I bought an E5-2665 a few months ago for only 70 euros. Not bad for a CPU with a Passmark score of 12084. Finding a motherboard was a bit difficult, though. Then again, used ECC RAM is far cheaper than desktop RAM.
      Top Ivy Bridge, by the way, would also be a Xeon E5-2690, but this time the V2. Passmark says 16546. That costs some serious money, though, whereas the still-very-fast V1 can be bought for about 300 euros.

  • Hyperbolic? NEVER! (Score:5, Insightful)

    by Gravis Zero ( 934156 ) on Tuesday January 03, 2017 @03:53PM (#53599673)

    How many times do we need people to declare "the desktop is dead!" or some other equally preposterous hyperbolic statement? Does someone feel like /. doesn't have enough hyperbole? Because I will just die if there is someone like that. -_-

    • Yo moron, the title said "Is the desktop CPU dead?", not the desktop computer. The article was talking about the performance of the desktop CPU being at a standstill. Try reading the article sometime.
    • As long as Intel is top dog in the CPU market, the Desktop CPU is indeed dead.

      However, only an idiot would think that means the Desktop PC is dead.

      This is the perfect opportunity for another party to sweep in and innovate in the CPU market. Whether it will be a traditional x86-style CPU manufacturer like AMD, or an ARM or RISC-style contender, who knows. But the longer Intel stagnates, the larger the opportunity will grow.

  • Maybe dying? (Score:5, Interesting)

    by duke_cheetah2003 ( 862933 ) on Tuesday January 03, 2017 @04:09PM (#53599787) Homepage

    I am definitely a bit underwhelmed by the release of the new CPUs from Intel. They're not really all that much better than the Sandy Bridge i7s, which is what I have (2 of them).

    Is the desktop computer dead? Nah. But it may be dying. The improvements we've come to expect over the years have definitely slowed down quite a bit compared to previous jumps in performance.

    Have we reached some kind of 'peak' in designing faster and faster CPUs? This underwhelming release definitely looks like a kick to Intel's pocketbook. If Intel and/or other manufacturers cannot convince users to upgrade their computers, it could definitely be trouble for the desktop computer. I certainly don't feel like I need to upgrade; my i7-2600 based PC seems to run anything and everything I throw at it quite well.

    Lackluster performance in a new generation of computers isn't very wise, because you're going to need a bigger jump to convince people to upgrade. It doesn't help that the older Core series (and Core 2s, for that matter) are STILL running today's browsers, operating systems, and various software quite well. It should be noted that AMD Turion X2s are also about on par with Core 2s, still running today's stuff pretty handily. That hurts the manufacturers a lot: it used to be that you had to upgrade; now it's more like "might be nice to upgrade, but not really necessary." The more times they release something new and lackluster, the more it hurts, cuz people will be in the mindset, like me, of "That's not a big improvement, I'll wait for the next big thing." I certainly feel no compelling reason to jump to this new CPU. 600 MHz of performance, for the price of basically replacing my entire PC? Nah, pass.

    One could get the impression the desktop is a dying breed of computer, I suppose. It certainly seems like things are headed in a different direction (mobile computing, tablets, etc.) for mainstream consumers. But I definitely feel like the industry can and will cater to whichever group of people will earn them the most profit. That seems to be mobile computing right now, and the news reflects this: we're seeing much bigger jumps in performance in the mobile CPU offerings (Qualcomm's Snapdragon CPUs are darn impressive!).

    • Are people reading-challenged? The article was not talking about the end of desktop computers. It was talking about CPU performance, and whether there will be performance increases like those in the past is doubtful without adequate competition. The latest line of Intel processors is proof of that. That's all.
      • by DamonHD ( 794830 )

        Um, I don't think it's about CPU performance in general (server and supercomputer CPUs continue to exist and generally outperform desktops on many metrics) but about the dynamics of one *segment* of the CPU market, one that used to be dominant.

        Or am I reading it wrong, too?

        Rgds

        Damon

      • Are people reading-challenged?

        You certainly are. The blasted TITLE of this article reads: Is the desktop CPU dead?

        Think before you post, eh?

    • by AHuxley ( 892839 )
      A single GPU can only just manage 4K without SLI. The next 4K GPU generation for games will need a bit more CPU power.
      Until 4K at the very best settings is single-GPU ready and needs a new CPU, CPU profit-taking will fill in the release gap.
      Solutions exist for the very best in art, photography, movies, and broadcast media.
      So games are pushing for 4K, but that's a GPU and LCD generation away from being perfect at the max quality settings and top frame rates.
  • Seems like a low threshold for describing something as "dead". I stumbled on the stairs the other day; luckily, journalists didn't start writing my obituary.

  • Any of the last several generations of Intel CPUs can run any modern application just fine. Up to and including top-end gaming.

    There is no incentive to innovate, so there is no innovation. Desktop CPUs will remain in a holding pattern until something happens to force their hand.

  • by Joe_Dragon ( 2206452 ) on Tuesday January 03, 2017 @05:20PM (#53600239)

    ZEN ZEN ZEN, with more PCIe lanes at the same price or less.

    Intel may need to go back to their old tricks again to lock out AMD.

  • Big bucks and still 4 lousy cores. A huge amount of R&D went into a 10% overall performance increase compared to Skylake (or really anything semi-recent). I want more damn cores, and drop the useless GPU that is wasting silicon area. 6-8 kickass cores should be the norm these days, but Intel wants a massive premium for that.

    Yes, I know that most software only uses up to 4 cores today. But I don't care. More cores being common will be a big incentive for software developers to find ways to use that unta

    • I recently upgraded my 5-year-old 3770K (quad-core) desktop to a new 6800K (hex-core). I am working on a data management system that can utilize as many cores as are available. It not only can do more things at once (e.g. multiple separate queries) with more cores, but it can also break up a single query so that parts of it run in parallel on separate cores. By doing this, a single large query can often complete faster on 6 cores than it can on 4. Even though each core on the 6800K had a lower clo
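
      A minimal sketch of the fan-out described above, assuming simple shared-nothing partitioning (the function names and the toy predicate are illustrative, not the poster's actual system):

        # Hypothetical sketch: split one large scan across worker processes
        # and merge the partial results, as a parallel query executor might.
        from concurrent.futures import ProcessPoolExecutor

        def scan_partition(rows):
            # Stand-in for one shard of a query: count rows matching a predicate.
            return sum(1 for r in rows if r % 7 == 0)

        def parallel_count(rows, workers=6):
            shards = [rows[i::workers] for i in range(workers)]
            with ProcessPoolExecutor(max_workers=workers) as pool:
                return sum(pool.map(scan_partition, shards))

        if __name__ == "__main__":
            print(parallel_count(range(1_000_000)))

      Whether six slower cores beat four faster ones then comes down to shard size, the serial merge cost, and how evenly the work divides.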
  • by ffkom ( 3519199 ) on Tuesday January 03, 2017 @07:19PM (#53601115)
    ... at either company. Right now, Intel just has no financial incentive to innovate. Maybe that is going to change in 2017.
  • by King_TJ ( 85913 ) on Tuesday January 03, 2017 @07:41PM (#53601275) Journal

    The impression I'm getting in recent years is that we're transitioning towards a computing world where individual consumers primarily want portables, or alternately, "all in one" or super small form-factor desktops which just use mobile motherboards and CPUs anyway.

    The high-end "power users" who tell you they still need a desktop machine for the work they do are best served by a "workstation" class system, vs. a regular desktop PC. The primary differentiation between a "desktop" and a "workstation"? Seems to be the inclusion of a Xeon class processor, originally intended to go into servers. Secondarily, workstations tend to offer the highly costly video cards optimized for use with CAD/CAM and other graphics design packages.

    • Workstation GPUs are garbage. Speaking only from what I see Dell stick into the machines in my design center: they are crap. I got a brand new workstation when I joined 3 years ago; the GPU listed for about $500 and did have 4 Mini DisplayPort outputs, but it could not drive 4K screens and had major issues driving four 1920x1080 screens. My GTX 750 Ti at home was more powerful, drove 4K no problem, and cost about 1/3 as much. The only saving grace for my workstation is that it can take up t

  • http://techreport.com/review/3... [techreport.com] Conclusion: "If time is money for your work, and your work can take advantages of lots of threads, the i7-6950X is the fastest high-end desktop CPU we've ever tested, full stop. If you don't need all of its cores and threads, however, the Core i7-7700K arguably delivers the best gaming performance on the market for about a fifth of the price. Intel's Extreme Edition CPUs have never been good values, but the i7-6950X takes the definition of "halo product" to eye-watering ne
  • Because, according to /. editors, DJT will cause the end of the world on January 21, 2017.
