Intel Core i7-7700K Kaby Lake Review by Ars Technica: Is the Desktop CPU Dead? (arstechnica.co.uk)
Reader joshtops writes: Ars Technica has reviewed the much-anticipated Intel Core i7-7700K Kaby Lake, the recently launched desktop processor from the giant chipmaker. And it's anything but a good sign for enthusiasts who were hoping to see significant improvements in performance. From the review: "The Intel Core i7-7700K is what happens when a chip company stops trying. The i7-7700K is the first desktop Intel chip in brave new post-'tick-tock' world -- which means that instead of major improvements to architecture, process, and instructions per clock (IPC), we get slightly higher clock speeds and a way to decode DRM-laden 4K streaming video. [...] If you're still rocking an older Ivy Bridge or Haswell processor and weren't convinced to upgrade to Skylake, there's little reason to upgrade to Kaby Lake. Even Sandy Bridge users may want to consider other upgrades first, such as a new SSD or graphics card. The first Sandy Bridge parts were released six years ago, in January 2011. [...] As it stands, what we have with Kaby Lake desktop is effectively Sandy Bridge polished to within an inch of its life, a once-groundbreaking CPU architecture hacked, and tweaked, and mangled into ever smaller manufacturing processes and power envelopes. Where the next major leap in desktop computing power comes from is still up for debate -- but if Kaby Lake is any indication, it won't be coming from Intel."

While Ars Technica has complained about the minimal upgrades, AnandTech looks at the positive side: "The Core i7-7700K sits at the top of the stack, and performs like it. A number of enthusiasts complained when they launched the Skylake Core i7-6700K with a 4.0/4.2 GHz rating, as this was below the 4.0/4.4 GHz rating of the older Core i7-4790K. At this level, 200-400 MHz has been roughly the difference of a generational IPC upgrade, so users ended up with similar performing chips and the difference was more in the overclocking. However, given the Core i7-7700K comes out of the box with a 4.2/4.5 GHz arrangement, and support for Speed Shift v2, it handily mops the floor with the Devil's Canyon part, resigning it to history."
First rule of journalism. (Score:5, Insightful)
If the article ends with a question mark, the answer is "No". Because if they had evidence to say it, they would have just put a period.
Re:First rule of journalism. (Score:5, Informative)
I quote:
Betteridge's law of headlines is one name for an adage that states: "Any headline that ends in a question mark can be answered by the word no." It is named after Ian Betteridge, a British technology journalist, although the principle is much older.
https://en.wikipedia.org/wiki/... [wikipedia.org]
Rgds
Damon
Re: (Score:2)
Somehow, this headline looks like the exception that proves the rule
Re: (Score:2)
SIC!
[sic]
Re: (Score:2)
If the article ends with a question mark, the answer is "No". Because if they had evidence to say it, they would have just put a period.
The articles linked end with periods. The headline ends with a clickbait, troll, sensationalist shit-up-the-internet question. Ars used to be better than that.
Re: (Score:2)
Ars hasn't been better than that since... Well, since the days when Slashdot was better than having clickbait make it to the FP.
Re: (Score:2)
Well, any tech article that proclaims something "dead" or asks whether it is "dead" usually is just a sign of a brain-dead writer. Also, anybody that expects any real speed-ups from Intel in the next 2-3 years has no clue how long it takes to fundamentally improve a CPU.
Re:First rule of journalism. (Score:5, Insightful)
Also, anybody that expects any real speed-ups from Intel in the next 2-3 years has no clue how long it takes to fundamentally improve a CPU.
Since the 980X's release in 2010, we've only moved up the charts maybe 50% on a per-core basis. Note that a 980X is unlocked and can be pushed significantly past its stock clocks. A 4790K (the fastest single-core performer) can only be overclocked a little, so the real-world difference may be significantly less than 50%. And that's just sad given that it's now 7 years later.
As a final insult, to actually double the performance from 7 years ago, you'll be spending north of $1500 for a 10-core 6950X, and that's before exercising the considerable overclocking headroom the 980X has over the 6950X.
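A rough sanity check of that claim, using the parent's own figures (the effective per-core gain and core counts here are assumptions for illustration, not measurements):

    #include <stdio.h>

    int main(void) {
        /* Parent's figures: ~50% per-core gain since the 6-core i7-980X (2010),
           less once the 980X's overclocking headroom is factored in. */
        double percore_gain = 1.25;  /* assumed effective per-core gain after OC */
        double cores_980x  = 6.0;
        double cores_6950x = 10.0;   /* 10-core i7-6950X, ~$1500+ */

        double total_gain = (cores_6950x / cores_980x) * percore_gain;
        printf("Estimated total throughput gain: %.2fx\n", total_gain);
        /* Prints ~2.08x: doubling aggregate performance in 7 years takes both
           more cores and a four-figure price tag. */
        return 0;
    }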
Re: (Score:2)
Indeed. My take is that AMD will now catch Intel and maybe move a tiny bit ahead (10-20%) in the years to follow. Intel will find those 10-20% as well eventually, but that is basically it for the AMD64 architecture. Not that I am complaining; I think the raw computing power is pretty awesome. Software wastes most of it though, and frameworks, interpreted languages and clueless coders are the main reasons.
The only real option, barring some fundamental breakthrough (not even on the horizon, caches, pipelining
Re: (Score:3)
The problem with that approach is that most problems are not infinitely parallelisable, and some important problems fundamentally do not parallelise at all. You rapidly hit diminishing returns with more cores, and once you get beyond a dozen cores or so you also need to move past a shared-memory architecture.
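A quick sketch of why the returns diminish, as plain Amdahl's law (the 10% serial fraction is an assumed illustration, not a measurement):

    #include <stdio.h>

    int main(void) {
        double serial = 0.10;  /* assumed serial fraction of the workload */
        int cores[] = {1, 2, 4, 8, 16, 64, 1024};

        for (int i = 0; i < (int)(sizeof cores / sizeof cores[0]); i++) {
            double speedup = 1.0 / (serial + (1.0 - serial) / cores[i]);
            printf("%5d cores: %5.2fx speedup\n", cores[i], speedup);
        }
        /* Even at just 10% serial work the curve flattens hard:
           8 cores give ~4.7x, and 1024 cores still stay under 10x. */
        return 0;
    }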
The newest generation of supercomputers already have big problems finding jobs that actu
Re: (Score:2)
I am aware of that. Even well-parallelized software will not give us much more, but there may still be some real gains to be had in areas like gaming, simulation and classifiers (often misnamed "learning"). One of the nice things about ARM though is that you can have different cores and mix them, and that it generally draws much less power. But yes, for many tasks that have no speed-up or really bad speed-up when parallelized, we are now possibly seeing close-to endgame performance. This is not re
Re: (Score:3)
I refuse to believe that the Desktop computer is dead until Netcraft confirms it!
Re: (Score:2)
Desktop computers are dead, tower computers aren't dead.
But the improvement in CPU technology seems to have slowed to a snail's pace the last few years. Lack of competition combined with less pressure from the market seems to be the cause - computers seem to have reached a flat spot in the push for improved performance for many applications.
Re: (Score:2)
But the improvement in CPU technology seems to have slowed to a snail's pace the last few years. Lack of competition combined with less pressure from the market seems to be the cause - computers seem to have reached a flat spot in the push for improved performance for many applications.
Thing is, even for people who actually use a computer to do work - how many tasks are really CPU-limited anymore? Obviously there are some niches where a faster CPU will improve the efficiency of their workflow... but that's a small percentage of an ever-declining overall percentage.
No. (Score:4, Insightful)
A story comes out like this at least twice a year. The harsh / glorious reality hasn't changed. If you want to get real work done it's going to be on a desktop. Even laptops get docked with a proper keyboard, mouse and at least 1 extra monitor when it's time for heavy lifting.
Then again one has to wonder at the headline. Tech update 'NEW Cpu!' Combined with the leading question, 'Is the desktop dead'. Will the new Slashdot owners please stop treating these message boards like the alphabet channels and focus on the geek culture? Sure it's yours but can you at least pretend it's not been subjugated by the mainstream entertainment industry?
Also, any headline that asks a question can be answered with 'No'.
Re:No. (Score:5, Interesting)
The harsh / glorious reality hasn't changed. If you want to get real work done it's going to be on a desktop.
Depends what you mean by "real". Yes, I got paid megabuck(s) in banking to optimise quant algos across cores, CPUs and servers in (eg) the Credit dept at Lehman's, but I find my nominally underpowered MacBook Air (the saleswoman was slightly reluctant to sell it to me when I said I was a dev) to generally be damn good for what I need, including some decent data driven models and analysis, wrapped in not-even-optimised C++ unit tests, and running within a Java-based IDE!
So, horses for courses.
Also, I am the happy owner of an RPi that does all the work a Sun server farm used to do for me:
http://www.earth.org.uk/note-o... [earth.org.uk]
and I target my primary code to 8-bit MCUs similar to a Z80A from 30Y ago in power, running some nice slim highly-optimised distributed coding.
Cut your suit to fit your cloth.
Rgds
Damon
Laptops are good for transit users (Score:2)
I can perform any of my regular chores on my laptop, but not as quickly nor as easily.
I guess it depends on whether your job lets you ride a bus or train as opposed to driving or cycling. Transit users can make productive use of commute time, for which a laptop is more efficient than not having a suitable computer at all.
Re: (Score:2)
Well, there's a good point, yes.
So I can get my work done at my desk or kitchen table or on the sofa or in bed, as well as on the train and when hot-desking (since my company doesn't have 'an' office, so we meet up ~1/week). Having a desktop in all of those places would be ... impractical.
And as to the GP point about "getting it done", yes bigger more ergonomic displays would be good sometimes, but impractical in many of the places I work, as before.
And in terms of keeping me waiting: I often find that the
Re: (Score:2)
and I target my primary code to 8-bit MCUs similar to a Z80A from 30Y ago in power, running some nice slim highly-optimised distributed coding.
You forgot to add 'up and down hills for 100 miles in the snow on a bicycle backwards while on your way to school.' :) It sounds like what you're working on is mostly text. While a tablet, phone or laptop can certainly host a terminal window, typing speed is still much faster with a proper keyboard, IMHO.
Cut your suit to fit your cloth.
Very interesting quote. Following that comparison, I wonder how much our smartphones clothe us today?
If the amount and quality of clothing were expressed in computing power then the first astronauts to land
Re: (Score:3)
and I target my primary code to 8-bit MCUs similar to a Z80A from 30Y ago in power, running some nice slim highly-optimised distributed coding.
You forgot to add 'up and down hills for 100 miles in the snow on a bicycle backwards while on your way to school.' :) It sounds like what you're working on is mostly text. While a tablet, phone or laptop can certainly host a terminal window, typing speed is still much faster with a proper keyboard, IMHO.
I am actually from Yorkshire and resemble that remark! We fought over our holes in ground...
But again, my MBAir keyboard is one of the better ones I've used, and I do a lot of typing (including code and words for a living). Laptop ergonomics are not great, but in any case to come back to the original point of the fine article, that hardly has a very strong connection with the CPU type. Or am I misunderstanding you?
I do love my terminal windows and vi though!
Cut your suit to fit your cloth.
Very interesting quote. Following that comparison, I wonder how much our smartphones clothe us today?
If the amount and quality of clothing were expressed in computing power then the first astronauts to land on the moon did so wearing loincloths. Now there's a mental visual!
Very very scanty string thongs.
The first (Cray
Re: (Score:2)
I think you misread the headline. It asks, "Is the desktop CPU dead," not, "Is the desktop dead." This is nothing to do with keyboards and mice. It's about whether there are CPUs that are specifically designed for desktops, ones that are a lot more powerful than the ones in laptops.
Re: (Score:2)
Looks like another headline edit.
Going to have to start taking screenshots. CNN does the same thing to its headlines.
Re:No. (Score:5, Interesting)
So many people without clues these days. Around here, I expect more understanding, even if the reader's needs don't fit the niche.
If your primary interest is in performance, especially when overclocking, a laptop chassis isn't going to have the thermal dissipation capacity. Hell, never mind that, just try a 5-hour video encode on most laptops. I wish you luck. They'll hit max temp and throttle big time. I've seen some with warped boards from excessive heat damage.
[sic] (Score:2, Informative)
[sic] does not mean [wikipedia.org] what you think it means.
ARM Processors coming to Desktops? (Score:5, Interesting)
Firefly-RK3399 (Score:2)
This ARM board looks promising:
64-bit
4 GiB of RAM
32GB of eMMC flash
802.11ac WiFi
Ethernet
2x M.2 PCIe
USB 3
$199
https://www.kickstarter.com/pr... [kickstarter.com]
or the higher-end, and much more expensive, AMD A1100-series Opteron:
http://softiron.com/products/o... [softiron.com]
Re: (Score:2)
That's about the best Arm64 board I've seen, but still comes up short despite being overpriced. No SATA, and no DIMM slots for real RAM expandability.
Re: No SATA and no RAM expandability (Score:2)
I do love that these small boards have neat things that desktop computers just don't have, like 4k camera support and hardware x264 vi
Re: (Score:3)
ARM doesn't have anywhere near the IPC of Intel chips.
But that's on purpose. They are targeting different markets. Intel tried to shoulder in on ARM's power-efficiency market and hoped that its greater performance would make up for not being better at power efficiency, but all they ended up with was an under-performing x86 that nobody really wanted. They have since backed off on that push and have instead re-focused on keeping ARM from making a big dent in the server space.
Re: (Score:2)
If we're talking Windows running x86 software on ARM, I doubt it. I have a hard time believing we're not going to be seeing Netbook 2.0 here. The top ARM processors aren't as powerful as the commonly used x86 processors (which, incidentally, people claim aren't powerful enough even in the base MacBook), and then add a translation penalty on top.
The second half of this equation: if the manufacturer goes for the cheaper, slower CPU, they will also do the same for every other part and end up with a slow piece of junk.
Re: (Score:2)
I have a hard time believing we're not going to be seeing Netbook 2.0 here.
If it's Netbook 2.0, count me in, because a 10 inch laptop was small enough not to be quite as obvious of a target for thieves compared to a 13-17 inch laptop.
The top ARM processors aren't as powerful as the commonly used x86 processors
But are they more powerful than the 1-core, 2-thread Atom CPUs from the netbook era, especially when skipping the translation layer by running software recompiled by its publisher or (in the case of free software) by the user's copy of GCC or Microsoft C++? One reason Surface RT 1 and 2 failed was that Microsoft deliberately locked out publishers and
Re: (Score:2)
I have my doubts. Like I've said before, I have a drawer full of various ARM devices that all turned out to be less useful in real life than they looked on paper. The main problem is that there is just no standard for ARM SoCs. Each one requires a custom kernel and distribution. They don't have common hardware trees, and most importantly they lack a common, open boot loader. So you're always fighting with some custom U-Boot. I'd far rather have a normal EFI BIOS in there and have the ability to boot of
Re: (Score:2)
slipshod ARM's infrastructure is.
That and the quasi-open status of most chips is what kills it for me. I had a SheevaPlug [wikipedia.org] years ago that was great. It ran my house's HVAC and web server and did some light downloading. The U-Boot was fairly straightforward and the Kirkwood chipset made its way into Debian. I never had a reason to replace it so I didn't.
Recently I got a CubieBoard [cubieboard.org] since it billed itself as "Open Source Hardware". It was shit. Nothing on it was open. U-Boot was a mess. It only ran specific versions of Ubuntu that didn't have a
Re: (Score:2)
I have my doubts. Like I've said before, I have a drawer full of various ARM devices that all turned out to be less useful in real life than they looked on paper. The main problem is that there is just no standard for ARM SoCs. Each one requires a custom kernel and distribution. They don't have common hardware trees, and most importantly they lack a common, open boot loader.
^^^ This.
Folks that can't understand why it's such an effort for their carrier to update the Android OS on their device, or why they can't just compile AOSP and flash it onto their phones should read this.
Re: (Score:3)
As far as ARM penetrating the desktop space goes, I think it's a certainty that the x86 line will eventually fall due to licensing. Intel is losing the fab edge (they are now arguably just keeping up), and if all these other fabs can't produce x86, they will still produce something. Maybe ARM takes ov
Re: ARM Processors coming to Desktops? (Score:3)
Today we have something called Android-x86, which is an Android distribution for x86/x86-64 computers. I have an older laptop that I put Fedora and Android-x86 on in dual-boot, and this let me do something interesting: benchmark Android on said laptop
Desktops aren't dead (Score:5, Insightful)
For me, docking stations and big monitors allow me to use my laptop in a reasonably comfortable work environment. But, there are still use cases for desktop PCs, especially those that aren't shoved into the back of an all-in-one monitor. You're not going to let a call center employee in a regulated, locked down environment pull out his iPad or laptop to work, for example. A cash register is likely going to be some sort of PC, same thing with a kiosk or ATM. And at the high end, workstations are meant for "real" work - though most have the Xeon processors in them. It's an interesting time; desktops and thin clients are sort of merging and tablet use is demanding more of CPU manufacturers' attention. And this makes sense - mobile stuff has the constant pressure to be squeezed into smaller spaces, produce less heat, provide more on-chipset functionality and run cooler at the same time. I'm still surprised when I see a Surface Pro or other convertible tablet and remember that there's a full-fat Intel processor crammed inside that tiny case without melting through the bottom!
I just think the desktop market is maturing and there's less and less that Intel processors and chipsets don't natively provide. PC processors are already insanely fast and powerful for what typical users throw at them. Desktops aren't dead, they're just a niche market these days, but one that is still there. The pundits want to claim that no one wants a powerful client device and just wants all their stuff streamed from the cloud onto a tablet or phone they don't control. I think that's true in the consumer space, but businesses still have use cases for desktops.
Re:Desktops aren't dead (Score:5, Insightful)
That's not really what the "question" in the article was implying, though. I completely agree that desktops are going to be a thing for ages to come yet (and I have 2), but the question was lazily trying to point out that performance increases on the desktop are seemingly coming to a halt for newer chips. This isn't really a surprise for me, as I've got a 5-year-old i5-2500K in my home machine that is keeping pace with even the newer games just fine as long as I spend a couple hundred bucks every 2-3 years on a new video card. Same at the office. We went to assess our 3-year upgrade cycle for workstations and realized we'd only get a 20-25% boost in peak processing power by spending our full per-person budget on new machines, and instead decided to keep what we have, switch all boot OS drives to SSD, max out the RAM and get 32" monitors, and we STILL have money left over.
I'm not sure if AMD's got anything in the pipeline that can shake things up, but if they do, this is their chance (again).
Re:Desktops aren't dead (Score:4, Interesting)
I'm not sure if AMD's got anything in the pipeline that can shake things up, but if they do, this is their chance (again).
Some of the official stuff released about Ryzen looks pretty spectacular. It's still not clear whether it will be able to beat Intel in total performance, but it's looking damn close, which is really encouraging to me. Furthermore, they are actually introducing new technologies in the chip, rather than slightly polishing old ones.
I have my doubts that AMD will fully match Intel this cycle, let alone beat them, but it gives me hope for the future. It's pretty clear right now who is resting on their laurels and who is driving to be the future of CPUs.
Re: (Score:2)
AMD: We're now in second place! Second place!!
Re: (Score:2)
For me, docking stations and big monitors allow me to use my laptop in a reasonably comfortable work environment. But, there are still use cases for desktop PCs, especially those that aren't shoved into the back of an all-in-one monitor. You're not going to let a call center employee in a regulated, locked down environment pull out his iPad or laptop to work, for example.
No, but neither is he likely to use a proper desktop. Thin clients and virtualization are so much easier to deal with if you consider fixed locations and centralized control to be features. Sure, there are those with particular workstation or input/output device needs, but not the average corporate desktop. If it wasn't for gaming, I think the desktop would be relegated to a small, small niche.
Re: (Score:2)
Say, a call center running thin clients with dual screens uses a lot of network bandwidth. In some settings you may need dual screens / one big screen to be able to show lots of info at the same time.
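For a sense of scale, the uncompressed worst case (the frame rate and colour depth here are assumptions; real thin-client protocols compress heavily and only send changed regions):

    #include <stdio.h>

    int main(void) {
        /* Two 1920x1080 screens, 24-bit colour, 30 fps, sent raw. */
        double pixels = 2.0 * 1920 * 1080;
        double bps = pixels * 24 * 30;
        printf("Raw video: %.2f Gbit/s\n", bps / 1e9);
        /* ~2.99 Gbit/s uncompressed -- which is why RDP/ICA-style
           compression matters so much in a call-center deployment. */
        return 0;
    }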
Isn't this always the way? (Score:2)
Kaby Lake desktop is effectively Sandy Bridge polished to within an inch of its life, a once-groundbreaking CPU architecture hacked, and tweaked, and mangled into ever smaller manufacturing processes and power envelopes.
Disasters like NetBurst aside, is this not the usual pattern?
1) Invest millions in designing a new architecture. Incorporating everything learned about CPU design in the past, try to open as big an advantage over your nearest competitors as possible.
2) profit.
3) Make minor revisions to protect your advantage and create an excuse for the high-performance market, where your biggest margins are, to buy new parts.
4) profit some more, and with greater margin
5)... repeat as long as competition / existing design allow
There's nowhere to go (Score:2)
For CPUs, there's really not a lot left to do. Stream video? Load Facebook? I'm pretty sure the older chips do that just fine.
The real action is in GPUs.
Re: (Score:2)
For anyone other than gamers and CAD operators? My old computer does movies just fine....
That's a Review Now? (Score:2)
Let's compare (Score:5, Informative)
Top Kaby Lake: Intel Core i7-7700K @ 4.2GHz has a Passmark score of 12800, for $350, at 95W, released Q4 2016.
Top Sandy Bridge: Intel Core i7-3970X @ 3.5GHz has a Passmark score of 12651, for $770, at 150W, released Q4 2012.
So yes, it looks like four years got us 1/3 less power and half the price for the same performance as the top Extreme Sandy Bridge:
http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-3970X+%40+3.50GHz&id=1799
http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-7700K+%40+4.20GHz&id=2874
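Running the numbers quoted above (scores, prices and TDPs are as given in the parent post):

    #include <stdio.h>

    int main(void) {
        /* Figures quoted in the parent post. */
        double kaby_score = 12800, kaby_price = 350, kaby_tdp = 95;  /* i7-7700K */
        double sb_score   = 12651, sb_price   = 770, sb_tdp  = 150;  /* i7-3970X */

        printf("Perf per dollar: 7700K %.1f vs 3970X %.1f\n",
               kaby_score / kaby_price, sb_score / sb_price);
        printf("Perf per watt:   7700K %.1f vs 3970X %.1f\n",
               kaby_score / kaby_tdp, sb_score / sb_tdp);
        /* ~36.6 vs ~16.4 points/$ and ~134.7 vs ~84.3 points/W:
           the efficiency gains are real even where raw speed is flat. */
        return 0;
    }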
Re: (Score:2, Informative)
Yep, my own 7-year-old PC gets 11,800 on Passmark. I recently got an old, cheap Xeon W3680 from eBay. Overclocked to 4.3 GHz it's still able to mix it with the latest high-end desktop CPUs. It's remarkable how little difference 7 years has made. In 2010, a 7-year-old PC would be obsolete, not still up there with the latest kit.
Re: (Score:3)
Of course the "top extreme" of anything will always be ridiculously poor value for money. An i7-2600K @ 3.4GHz has a Passmark score of 8488 for $317 at 95W, released Q1 2011; correcting for inflation it's pretty much the same price, same power, +50% performance in six years, or about 7% annually. At that rate it takes ten years to double performance, and thirty years from now we'd have only eight times today's performance. Granted, if it was anything other than computers it wouldn't be that bad, but if you compare the 2010s
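The annual-rate arithmetic checks out (Passmark scores as quoted in the parent posts; compile with -lm):

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double score_2011 = 8488;   /* i7-2600K, Q1 2011 */
        double score_2017 = 12800;  /* i7-7700K, Q4 2016 */
        double years = 6.0;

        double annual = pow(score_2017 / score_2011, 1.0 / years) - 1.0;
        printf("Annual improvement: %.1f%%\n", annual * 100);
        printf("Years to double:    %.1f\n", log(2.0) / log(1.0 + annual));
        printf("Gain over 30 years: %.1fx\n", pow(1.0 + annual, 30.0));
        /* ~7.1% per year, ~10.1 years to double, ~7.8x over 30 years --
           matching the parent's "eight times the performance" figure. */
        return 0;
    }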
Re: It's because AMD quit (Score:2)
If AMD delivers with Ryzen and offers something with a good IPC and lots of cores at half the price of Intel then per
Re: (Score:2)
Hardware H.264 and H.265 video decoding and H.264 video encoding have been standard in ARM chips for years. Intel just got H.264 decoding. They don't have H.265 decoding and they don't have any hardware video encoding. I could go on, but my point is that Intel has fallen way behind because they figured they didn't have competition.
Not sure what you're smoking; H.264 encode and decode support has been there since Sandy Bridge [wikipedia.org] 6 years ago, and Kaby Lake does H.265 Main10 decoding at UHD resolution as well as 8-bit encoding. Maybe you've used a poor media player?
Re: (Score:3)
How's the hardware-accelerated Intel QSV HEVC encoding at 4K on that Sandy Bridge system? Oh yeah: it's not possible.
Re: (Score:2)
Works just fine on my i7-2600k system because I've got an AMD video card that handles it way better than Intel's Quick Sync or whatever they call it ever did or will in the foreseeable future.
Re: (Score:3)
Top Sandy Bridge is a Xeon E5-2690, which has a Passmark score of 20699 at 135W TDP.
I personally bought an E5-2665 a few months ago for only 70 Euros. Not bad for a CPU with a Passmark score of 12084. Finding a motherboard was a bit difficult, though. Then again, used ECC RAM is far cheaper than desktop RAM.
Top Ivy Bridge, by the way, would also be a Xeon E5-2690, but this time the V2. Passmark says 16546. It costs some serious money, though, whereas the still very fast V1 can be bought for about 300 Euros.
Hyperbolic? NEVER! (Score:5, Insightful)
How many times do we need people to declare "the desktop is dead!" or make some other equally preposterous hyperbolic statement? Does someone feel like /. doesn't have enough hyperbole? Because I will just die if there is someone like that. -_-
Re: (Score:2)
As long as Intel is top dog in the CPU market, the Desktop CPU is indeed dead.
However, only an idiot would think that means the Desktop PC is dead.
This is the perfect opportunity for another party to sweep in and innovate in the CPU market. Whether it will be a traditional x86-style CPU manufacturer like AMD, or an ARM or RISC-style contender, who knows. But the longer Intel stagnates, the larger the opportunity will grow.
Maybe dying? (Score:5, Interesting)
I am definitely a bit underwhelmed by the release of the new CPUs from Intel. They're not really all that much better than Sandy Bridge i7s, which is what I have (2 of them).
Is the desktop computer dead? Na. But it may be dying. The improvements we've come to expect over the years have definitely slowed down quite a bit compared to previous jumps in performance.
Have we reached some kind of 'peak' in designing faster and faster CPUs? This underwhelming release definitely feels like a kick to Intel's pocketbook. If Intel and/or other manufacturers cannot convince users to upgrade their computers, it could definitely be trouble for the desktop computer. I certainly don't feel like I need to upgrade; my i7-2600 based PC seems to run anything/everything I throw at it quite well. Lackluster performance in a new generation of computers isn't very wise, because you're going to need a bigger jump to convince people to upgrade. It's of course not helping that the older Core series (and Core 2s, for that matter) are STILL running today's browsers, operating systems and various software quite well. Should be noted, AMD Turion X2s are also about on par with Core 2s, still running today's stuff pretty handily. That hurts the manufacturers a lot; it used to be that you had to upgrade, now it's more like, "might be nice to upgrade, but not really necessary." The more times they release something new and it's lackluster, the more it hurts, cuz people will be in the mindset, like me, of "That's not a big improvement, I'll wait for the next big thing." I certainly feel no compelling reason to jump to this new CPU. 600 MHz of performance, for the price of basically replacing my entire PC? Na, pass.
One could get the impression the desktop is a dying breed of computer, I suppose. Certainly seems like things are headed in a different direction (mobile computing, tablets, etc.) for mainstream consumers. But I definitely feel like the industry can and will cater to whichever group of people will earn them the most profit. That seems to be mobile computing right now, and the news reflects this: we're seeing much bigger jumps in performance in the mobile CPU offerings (Qualcomm's Snapdragon CPUs are darn impressive!)
Re: (Score:2)
Um, I don't think it's about CPU performance in general (server and supercomputer CPUs continue to exist and generally outperform desktops in many metrics) but about the dynamics of one *segment* of the CPU market, one that used to be dominant.
Or am I reading it wrong, too?
Rgds
Damon
Re: (Score:2)
Are people reading challenged?
You certainly are. The blasted TITLE of this article reads: Is the desktop CPU dead?
Think before you post, eh?
Re: (Score:2)
Until 4K at the very best settings is single-GPU ready and needs a new CPU, CPU profit-taking will fill in the release gap.
Solutions exist for the very best in art, photography, movies, broadcast media.
So the games are pushing for 4K, but that's a GPU and LCD generation away from being perfect at the max quality settings and top frame rates.
Re: (Score:3)
It's not important whether software still runs on your computer; the problem is that after 5-6 years, I don't see a need to upgrade, because the new chip is hardly faster. I would expect calculations to be 2x as fast. So I would love to buy a new PC; yes, everything works, but if I had to wait less time for things I would love to upgrade... The only thing happening here is that I pay $1000 and probably won't notice any difference...
This was the entire gist of my post. If new offerings are only a marginal improvement over what I have now, and I'm likely not to notice much of a change in performance, why should I upgrade? And this is a self-perpetuating problem. The more times they release lackluster improvements, the more times we opt not to upgrade; they lose more profit and decide against developing better, cuz better isn't selling.
Hyperbolic (Score:2)
Seems like a low threshold to describe something as "dead". I stumbled on the stairs the other day; luckily, journalists didn't start writing my obituary.
Not dead, just plateaued. (Score:2)
Any of the last several generations of Intel CPUs can run any modern application just fine. Up to and including top-end gaming.
There is no incentive to innovate, so there is no innovation. Desktop CPUs will remain in a holding pattern until something happens to force their hand.
ZEN ZEN ZEN with more PCIe at the same price or (Score:3)
ZEN ZEN ZEN with more PCIe at the same price or less.
Intel may need to go back to their old tricks again to lock out AMD.
4 cores, yet again (Score:2)
Big bucks and still 4 lousy cores. A huge amount of R&D went into a 10% overall performance increase compared to Skylake (or really anything semi-recent). I want more damn cores, and drop the useless GPU that is wasting silicon area. 6-8 kickass cores should be the norm these days, but Intel wants a massive premium for that.
Yes, I know that most software only uses up to 4 cores today. But I don't care. More cores being common will be a big incentive for software developers to find ways to use that unta
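For what it's worth, spreading even a dumb data-parallel loop across cores is not much code these days. A minimal pthreads sketch (array size and thread count are arbitrary; compile with -pthread):

    #include <pthread.h>
    #include <stdio.h>

    #define N 8000000
    #define THREADS 8  /* the "6-8 kickass cores" case */

    static double data[N];
    static double partial[THREADS];

    static void *sum_chunk(void *arg) {
        long t = (long)arg;
        long lo = t * (N / THREADS), hi = lo + N / THREADS;
        double s = 0;
        for (long i = lo; i < hi; i++)
            s += data[i];
        partial[t] = s;  /* one slot per thread: no locking needed */
        return NULL;
    }

    int main(void) {
        for (long i = 0; i < N; i++)
            data[i] = 1.0;

        pthread_t tid[THREADS];
        for (long t = 0; t < THREADS; t++)
            pthread_create(&tid[t], NULL, sum_chunk, (void *)t);

        double total = 0;
        for (long t = 0; t < THREADS; t++) {
            pthread_join(tid[t], NULL);
            total += partial[t];
        }
        printf("sum = %.0f\n", total);  /* expect 8000000 */
        return 0;
    }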
Let's hope AMD's RyZen will cause some progress... (Score:5, Insightful)
But.... Xeon? (Score:3)
The impression I'm getting in recent years is that we're transitioning towards a computing world where individual consumers primarily want portables, or alternately, "all in one" or super small form-factor desktops which just use mobile motherboards and CPUs anyway.
The high-end "power users" who tell you they still need a desktop machine for the work they do are best served by a "workstation" class system, vs. a regular desktop PC. The primary differentiation between a "desktop" and a "workstation"? Seems to be the inclusion of a Xeon class processor, originally intended to go into servers. Secondarily, workstations tend to offer the highly costly video cards optimized for use with CAD/CAM and other graphics design packages.
Re: (Score:2)
Workstation GPUs are garbage. I'm only speaking from what I see Dell stick into the machines we have in my design center, but they are crap. I got a brand new workstation when I joined 3 years ago; the GPU was listed for about $500, and did have 4 mini-DisplayPort outputs, but it could not drive 4K screens and had major issues driving four 1920x1080 screens. My GTX 750 Ti at home was more powerful, drove 4K no problem, and cost about 1/3 as much. The only saving grace for my workstation is that it can take up t
TechReport Review is Favorable (Score:2)
Of course the answer is YES (Score:2)
Because according to /. editors, DJT will cause the end of the world on January 21, 2017.
joshtops doesn't know about ellipses? (Score:3, Informative)
I think that this "joshtops" character may not know about ellipses, and is wrongly using "[sic]" where any sensible person would use "..." to indicate that some text was removed from the quoted material. Even then, given how much text is omitted, any reasonable person would probably just use several separate quotations instead of trying to cram it all into one big and mangled quotation.
Re: (Score:3)
So kind of like a hiccup, but misspelled.
Re: (Score:2)
Just because Huxley chose to drop the article doesn't mean every subsequent use of that phrase needs to as well.
Dropping articles makes you sound like Indian. (Score:2)
There should be a "the" before "brave".
Re: (Score:2)
The book's title is Brave New World. No article, definite or otherwise.
Re: (Score:3)
I would like to see the CPU-RAM bottleneck get as much attention as the CPU itself has. Unless we get that addressed, I really don't see faster chips doing much good.
Re: (Score:2)
Quite a lot of what I paid attention to in banking work was that bottleneck, and it has been an issue since the earliest days of computing (my old prof would roll his eyes and talk about data stalls on the MU5...).
So one job of making stuff run well is to cut bloat and make more of it fit in cache, have fewer branches/misses in inner loops, and reduce data flows generally where possible.
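A minimal illustration of that cache point (the matrix size is arbitrary; exact timings vary by machine and compiler flags):

    #include <stdio.h>
    #include <time.h>

    #define N 4096

    static double m[N][N];

    int main(void) {
        double s = 0;
        clock_t t0 = clock();
        for (int i = 0; i < N; i++)      /* row-major: walks memory in order, */
            for (int j = 0; j < N; j++)  /* so the prefetcher keeps up        */
                s += m[i][j];
        clock_t t1 = clock();
        for (int j = 0; j < N; j++)      /* column-major: every access jumps  */
            for (int i = 0; i < N; i++)  /* N*8 bytes, missing cache lines    */
                s += m[i][j];
        clock_t t2 = clock();

        printf("row-major:    %.2fs\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
        printf("column-major: %.2fs\n", (double)(t2 - t1) / CLOCKS_PER_SEC);
        printf("(sum = %g)\n", s);  /* keeps the loops from being optimised away */
        return 0;
    }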
Actually, I'm enjoying the ATMega328P with NO caches and a whole 2k of RAM! B^>
Rgds
Damon
Re: (Score:2)
Games, all the time. I'd love it if a turn of Civ6 took less time. I'd pay more for that than I would for a network or disk speedup.
Re: (Score:2)
When is the raw CPU speed a bottleneck anymore?
Whenever the user waits for a computation to complete. Lots of filters in Photoshop, for example, leave the user sitting and waiting while they are applied.
Re: (Score:3)
Professional software development typically has compilation steps that can use all processors at 100% for minutes at a time. I can easily use 100% of all CPUs for 20 minutes straight when compiling 5 million lines of source code scattered across 25,000 files. Which I do several times per day typically.
Of course, not all of compilation is embarrassingly parallel; there is usually a link step at the end which cannot be multithreaded (at least not by current tools) and which just sits there adding another 5 m
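Plugging the parent's numbers into Amdahl's law (reading the truncated figure as a 5-minute serial link step, and assuming 8 cores for illustration):

    #include <stdio.h>

    int main(void) {
        double parallel_min = 20.0;  /* compile phase, all cores at 100% */
        double serial_min = 5.0;     /* single-threaded link step (assumed) */
        double cores = 8.0;          /* assumed core count */

        double one_core = parallel_min * cores + serial_min;  /* 165 min */
        double n_core   = parallel_min + serial_min;          /*  25 min */
        printf("Speedup on %.0f cores: %.1fx\n", cores, one_core / n_core);
        printf("Floor with infinite cores: %.0f min (the link step)\n", serial_min);
        /* 6.6x rather than 8x, and no number of cores gets under 5 minutes. */
        return 0;
    }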
Re: (Score:2)
The future is already here, in the form of ARM CPUs. They are pretty much without a whole lot of the baggage the Intel processors have, even if some of it is "emulated" at this point. There were a few good break points during the long run of x86 where Intel could have dropped legacy support, but it failed to do so.
Re: (Score:2)
That's because dropping what you call "legacy support" would have erased their main advantage over ARM -- raw balls-to-the-wall no-compromise performance. In car terms, an i7 is like a maxed out Tesla Model S P90D with "ludicrous mode" switched on... by comparison, an ARM is like a Chevy Volt.
Re: (Score:2)
Take a look at Cavium ... 48 cores x 64-bit at 2.5 GHz.
http://www.cavium.com/ThunderX... [cavium.com]
Re: (Score:2)
I know transistor counts are way up, but Elbrus is VLIW, and I don't understand how the limitations of Itanium or the Transmeta chip won't still be a factor. It's not as if the Russians are the first to try VLIW.
Re: (Score:3)
Over time, the best way to get the most work into the pipeline per cycle changes. Right now Intel CPUs can pull at most 4 instructions per cycle into the pipeline (unless that has changed with the latest update).
Re: (Score:2)
Much recent bottom line performance can be attributed to faster RAM and a faster chipset to support it.
CPUs have been RAM pipe constrained for decades.
Intel boards don't have the PCIe lanes for that (Score:2)
Intel boards don't have the PCIe lanes for that, and even with the Intel 200-series boards you are still pushing a lot over the PCIe 3.0 x4 DMI link, with the other x16 going to video.
With AMD, to crush Intel they need x16 for video plus USB, PCIe storage, etc. on their own lanes. And that's the low end; the higher end seems to be x16 + x16, with the chipset stuff on its own lanes.