Intel's Upcoming Coffee Lake CPUs Won't Work With Today's Motherboards (pcworld.com) 240
Intel's upcoming Coffee Lake CPUs won't work with existing 200-series motherboards that support Kaby Lake, a manufacturer confirmed on Wednesday. Asrock confirmed the news in a Twitter post last Saturday, when asked whether "the Z270 Supercarrier [will] get support for the upcoming @intel Coffee Lake CPUs." The company's response: "No, Coffee Lake CPU is not compatible with 200-series motherboards." PCWorld reports: According to at least one reliable source outside of Intel, the new Coffee Lake CPU will indeed not be compatible with Z270 boards, even though the chipset on the upcoming Z370 appears to be the same, PCWorld was told. The source added that there are hopes in the industry that Intel will change its mind on compatibility. Tomshardware.com said it had independently confirmed the news with Asrock officials as well.
Why this matters: The vast majority of new CPU sales are in new systems, and they likely won't be impacted by the incompatibility. However, there's also a very large and very vocal crowd of builders and upgraders who still swap out older, slower CPUs for newer, faster CPUs to maximize their investment. An upgrade-in-place doesn't sell an Intel chipset, but it at least keeps them on the Intel platform. If consumers are forced to dump an existing Z270 motherboard for a newer Z370 to get a six-core Coffee Lake CPU, Intel risks driving them into the arms of AMD and its Ryzen CPUs.
Why is this news? (Score:5, Insightful)
Re:Why is this news? (Score:4, Insightful)
No shit. They do this *all* the time.
Re: (Score:3)
that, and to do things like introducing the Core "i" series with an integrated GPU (775 -> 1156), shits and giggles (1156 -> 1155), adding integrated voltage regulators into the chips (1155 -> 1150), and switching from DDR3 to DDR4 (1150 -> 1151)
Re: (Score:2)
Re: (Score:2)
Because it is a "bait and switch" situation.
Intel had previously announced that Coffee Lake would use the 1151 socket, and so people assumed that that meant compatibility with previous motherboards that have that socket.
I would bet that there are quite a few consumers out there who got 1151-socket motherboards and a CPU with 4 cores or fewer (which is all that is available), with the intention of upgrading to a Coffee Lake 6-core CPU in the future.
Right now, even a 2-core CPU with high clock is considered a good
Re: (Score:2)
It's not bait-and-switch if new information invalidates people's incorrect assumptions that are based on zero relevant information from the manufacturers. Anyone who purchased a motherboard that didn't explicitly state compatibility with the upcoming "Coffee Lake" and expected compatibility was taking a gamble, and is personally liable for the result of that gamble.
It's bait-and-switch if Intel advertised that the new CPU would work in Kaby Lake motherboards, and then it didn't work. They didn't do that,
Re: (Score:2)
Was going to say - anyone remember the "Slot 1" debacle? And then after that turned out to suck and cause significant challenges, back to ZIF sockets halfway through the Pentium-3 products. They didn't even wait for a new architecture.
How is this news? They've literally been doing this for longer than 15 years.
Re: (Score:2)
Take socket 2011. There are actually several electrically incompatible versions of that for no good reason.
Re: (Score:2)
Intel, for as long as I remember, needlessly changed sockets.
Goes back to the 90s when the Pentium was first introduced: Pentium sockets were nothing like Pentium Pro or II or III or 4. AMD would try to leverage the existing infrastructure by designing CPUs that were drop-ins for Pentium sockets, but Intel created everything from scratch.
Having said that, though, today's CPUs - if one wants to upgrade them, one has to closely match them w/ the chipset & everything else, or suffer a performance hit. Anyone who wants to upgrade won't notice a performance boost: th
Re: (Score:2)
Re: (Score:2)
"Which is not a thing that violates antitrust law."
But could actually violate Magnusson-Moss anti-tying provisions.
Re: (Score:2)
Re: (Score:2)
That's probably because supporting chipsets didn't come even close to 200+MHz CPUs, so they probably had frequency dividers between the CPU and chipset that took care of this particular issue
Re: (Score:2)
And AMD was sticking with Socket-7 after Intel started using various different sockets and slots because reasons almost two decades ago. How is this news, again?
Is anyone surprised? (Score:5, Insightful)
For the last 10 years, since Intel gained complete monopoly control over chipsets for Intel CPUs, they have gone out of their way to make minor changes that force new motherboards, to feed their income from chipsets. They add a pin or two or make some other minor change that makes it impossible to use new CPUs with older motherboards, even if the chipset is identical in features.
This is SOP at Intel these days: use that monopoly power to extract maximum revenue. Hell, the new Platinum Xeon chips have MSRPs of up to $13,000. Something that would not be possible with legitimate competition.
Re:Is anyone surprised? (Score:5, Funny)
Are you upset that a Ferrari costs 10x the price of a Honda?
Yes.
Re: (Score:2)
If these chips were so overpriced then nobody would be buying them. If you need that much performance in a single CPU then $13K is trivial relative to the rest of the costs.
Re: (Score:2)
A complete and utter non-sequitur. It may or may not be trivial relative to total cost. I'm actually pretty goddam sure a $13K CPU is likely to dwarf the cost of the rest of the system - unless it has some outlandish amount of installed RAM, like terabytes.
Re: (Score:2)
You've lost track of time buddy. AMD hasn't been in the lead for more than a decade, last time they were on top GW Bush had just been elected.
Upgrading CPUs? (Score:5, Insightful)
Outside of the gimmicky super-shredder-killer-fps-man-slayer motherboards, it's not like they have been the most expensive part of a computer build for a long time. Introducing a new video card incompatibility like the transition from PCI -> AGP -> PCI Express would be a whole different story.
Re:Upgrading CPUs? (Score:5, Interesting)
No. When I upgrade, I get a new motherboard and CPU. And often, new memory for the MB. I have built systems for maybe 25 years, and I don't remember doing a simple CPU upgrade. But I did swap out a Cyrix CPU because it kept crashing Win95.
Re: (Score:3, Interesting)
I've built systems for about that long and I did a simple CPU upgrade about 2 years ago. About 6 years ago I built a dual Xeon E5645 workstation for myself ($500 per CPU at build time) and two years ago I upgraded them to Xeon X5690 CPUs. The X5690 CPUs were about $2000 each when I built the machine but only $200 each used on eBay 4 years later. I've also piecemeal upgraded a bunch of other parts like RAM, disks, etc.
The end result is a 6 year old workstation with shockingly good performance when compared t
Re: (Score:2)
Re: (Score:2)
Builders that lean towards AMD upgrade their CPUs
Have you? I had a Phenom II on AM3 and kind of wanted an upgrade. No option for that, since the FX line used AM3+. But hey, if I bought a brand new system with an AM3+ mobo, at least I could downgrade to a shitty Phenom! This is just the nature of technology.
Re: Upgrading CPUs? (Score:3)
I finally did this for the first time recently. My almost 5 year old build with a Sandy Bridge i3 and Radeon 6850 was barely acceptable for Overwatch...playable at the lowest settings. Got an Ivy Bridge i5 off eBay (dual to quad core was huge but also better clock) and a new 1060 and now I've got even Doom and Gears 4 running max settings at 1080p with 60fps.
Re: (Score:2)
Do people really "upgrade" processors?
Sure... on AMD systems. On Intel systems? No, they really don't, because Intel changes their CPU socket at the drop of a hat. Any hat.
My desktop PC started as a Phenom II X3 720, then it was a Phenom II X6 1045T, and now it's a FX-8350. All in the same socket. I built other PCs to take the hand-me-down processors, and do other jobs. (One Linux box, one test bench.)
Re: (Score:2)
I'm one of those guys who likes to have a top-end gaming PC. Spend quite a lot of money on it. Put a lot of thought into when to do part-swaps and when to go for a whole new system.
I'm not really sure this is going to matter all that much for most people, even for people like me. Even for high-end gaming (i.e. trying to hold 60fps with high/ultra graphics settings at 4k), the CPU upgrade cycle isn't particularly intense. If you're using a decent Skylake or Kaby Lake (e.g. a 6700K or 7700K) you should be goo
Re: (Score:2)
That's the thing - we're not talking about people who buy a complete system, or even who hack together a system because they simply prefer picking and choosing components (like me) over getting whatever vendors decided to put in. I build my own system so that I can do things like getting quiet fans and power supplies; I also know the graphics card installed on most systems (usually some built in crap) is not going to be good enough. I'm not a big gamer, but I do play, and I do use some graphics apps profe
Re: (Score:2)
I upgraded an entire engineering computer lab's worth of CPUs once... in 1996. lol. Those Pentiums swapped out pretty easily, and Cyrix P166+ processors fit nicely into the same sockets as the old Pentium 60s (though even then, the motherboards were a bit of a bottleneck at times). A roughly 2.5x jump in CPU speed back then meant a lot for NT4 boxes running AutoCAD and other engineering software.
Today... I can't imagine why anyone would bother (though maybe it's my lack of imagination's fault).
CPUs aren't us
Re: (Score:2)
Yes. Duh.
Re: (Score:3)
The last Intel socket change was for DDR4 support, so there goes the "add more RAM" reason...
These days the year-on-year improvements in performance are getting less and less significant in terms of actually noticing it.
Every few years, though, something else ends up being upgraded, like DDR technology, PCIe generations, Thunderbolt, USB3...
These things usually end up getting implemented (except USB3?) in the CPU, which then needs to be passed via the socket and chipset to a connector somewhere.
Even if the n
It's always this way (Score:2)
I keep my systems at least 3 years. Although the theory is that you can swap in a better CPU, I've only done this one time. Most of the time Intel deliberately continues evolving the sockets, not for any real technical reason AFAICT, but to keep you buying those motherboards. This is one of the reasons that I don't upgrade processors very often (I skip a few generations): the gains are small enough that it's just not worth it for the cost and hassle.
Re: (Score:2)
The last time I upgraded my CPU without replacing the motherboard would have been when I upgraded from a Pentium 166 MMX to a 300MHz Cyrix part and even then I ended up regretting it and wishing I had moved to a Pentium II or something instead.
they're trying to help AMD (Score:5, Insightful)
Re:they're trying to help AMD (Score:5, Informative)
NEITHER company is a true champion of DIYers; both companies have forced numerous socket changes and have had very short-lived platforms.
Seldom have I seen a more disingenuous statement on Slashdot. AMD has never had a short-run platform, and has never forced a socket change just to force people to buy new motherboards. AMD has always kept support for their old platforms going well after their creation. Check out, for example, the lifespan of GEODE compared to single-core Atom; you will apparently be surprised. Meanwhile, Intel has clearly made several changes designed specifically to sell more motherboards, which means selling more overpriced chipsets. AMD CPUs are cheaper per FLOP and AMD chipsets are cheaper per GB/sec or per PCI-E lane. No matter how you slice it, Intel are bigger assholes than AMD.
Already switched to AMD (Score:5, Insightful)
Re: (Score:2)
Intel rocks for gaming. They make faster CPUs per core by a good margin still as not everything can be run in parallel
Re: (Score:2)
Of course some developers can't wrap their heads around more than one thread - even ones born after multiple CPUs were in desktop computers!
Re: (Score:3)
Gaming is one of the classic examples of embarrassingly parallel, as shown by all those processing units in video cards.
Graphics, yes. Gaming? No. From what I've understood, most games have divided threads by task: this thread does AI, this thread does rendering, and so on. Which is why so many games still do well on dual cores; there's one core running the main loop and one running everything else. Not even Civilization VI, the kind of game that possibly could use lots of cores for the computer's AI, manages to use 8 cores.
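For readers who haven't seen the task-split model in code, here is a minimal sketch of the idea: one long-lived thread per subsystem, plus a shared flag for shutdown. The names and 16ms sleeps are placeholders for illustration, not code from any actual engine:

    // One thread per subsystem ("this thread does AI, this thread does
    // rendering"), coordinated only by an atomic shutdown flag.
    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <thread>

    std::atomic<bool> running{true};

    void ai_loop() {            // placeholder game-logic/AI thread
        while (running) {
            // ... update AI and world simulation here ...
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
    }

    void render_loop() {        // placeholder rendering thread
        while (running) {
            // ... draw the current frame here ...
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
    }

    int main() {
        std::thread ai(ai_loop);
        std::thread render(render_loop);
        std::this_thread::sleep_for(std::chrono::seconds(1)); // "play" briefly
        running = false;        // signal both loops to stop
        ai.join();
        render.join();
        std::puts("clean shutdown");
    }

The catch, as this thread points out, is that the scheme tops out at one busy core per subsystem no matter how many cores the machine has.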
Re: (Score:2)
That's a good illustration of that problem with developers isn't it?
The number of times I've seen something struggling on one core when there are seven free is maddening.
Games, especially very graphical ones with a simulated 3D environment and sound sources located in 3D have a lot of things they could be doing at once. The sensible thing is to divide threads by task (as said above) but there isn't a lot of that going on.
It's all very well offering armchair advice but the task of actually thinking through and making reliable logic in a parallel environment is beyond most humans, we just don't think that way and can't think that way. Until we come up with new programming paradigms that make this stuff easier & more reliable, we're always going to have this problem for general computing. Our languages and methods are designed around a single-threaded world.
Re: (Score:2)
That is insulting.
We are not supposed to be "most humans", we are supposed to be the ones that get those collections of silicon, copper etc running well. We're supposed to use all those years since high school to pick up the tricky stuff instead of stagnating.
Stuff was most likely written in FORTRAN to run in parallel before yo
Re: (Score:2)
That is insulting.
Wow, you're really easy to insult! I would say it's insulting to just assume that everybody is lazy rather than try to think of why the task might be just too hard. When software is still released buggy, when as an industry we still haven't worked out how to 'over-engineer' for safety without still throwing up catastrophic security holes or system crashes in important systems, when games are written by over-worked programmers in crunch mode for months on end... the whole industry is still horribly immature. But sure, it's laziness.
The world needs hundreds of thousands of programmers, and they're not all going to be at your level. The process of making software needs to be resilient enough to handle them, you can't simply wish everyone was as proficient because they're never going to be.
It was a little belittling. But back at you: why should a game developer, when the CEO of Rockstar Games wants the unfinished piece of crap shipped before Christmas whether it is finished or not, waste time optimizing for what 1% of gamers have?
According to Steam, the majority still had 2-core CPUs until 18 months ago, thanks to the Core 2 Duos and the cheap crappy laptops parents bought their kids. Today it is now just approaching 4 cores with no hyperthreading. Sure Linus Tech Tips and Gamers Nexus on youtube a
Re: (Score:2)
The reason Civ6 performs so poorly and doesn't use a lot of threads is because the entire fucking game is an interpreted script built on top of shitty XML garbage.
It's why those loading times are enormous even on an SSD.
These Civ games have not gotten more advanced since Civ3. They have just been pushed into easier to maintain scripting so that their lead developers can hand off the maintenance to junior developers sooner, freeing up the lead developers for another title.
Re: (Score:2)
Civ6 on Steam is the only game I have. My laptop has a Xeon w/ 64GB of ECC RAM. The first disk is 256GB of NVMe, then I have a 2TB disk I use for holding VMware images, basically.
After about 200 turns, it starts crawling. The other day, I finished a full 500 turn game after a few days poking at it here and there, and when I was done I tried to "exit to desktop" and the whole thing fucking crashed. Of course, Windows 10 Pro wouldn't let me start task manager on the monitor hooked in over the DVI, and I coul
Civ6 issues (Score:2)
My biggest issue w/ Civ6 is how constrained it has been. In previous versions of the game (aside from I & II), you could name your leader, your civilization, your cities. Civ 4 was the best - they had a scenario editor where you could start all the players you wanted in certain spots, preload them w/ whatever units, money, cities & resources you wanted, including renaming anything right from the base game, and then play. In Civ V, there never was a scenario editor: the closest to it was a mod cal
Re: (Score:2)
Well - no - /imperative/ languages and methods are designed around a single threaded world and have been graunched to fit the multithreaded world.
However, there are languages and methods that work very well in a parallel world. Erlang to give one example - the only problem is no one's going to be writing games in Erlang any time soon, even though it's quite easy to write a highly reliable Erlang application that spawns tens of thousands of threads.
Re: (Score:2)
It's all very well offering armchair advice but the task of actually thinking through and making reliable logic in a parallel environment is beyond most humans
Having done this many times, it's not nearly as hard as you portray.
It does require more thought than single-threaded, and it does require letting go of the mindset that you know exactly what every single bit is doing at any moment.
But it's well within the grasp of the vast majority of programmers. Especially with modern languages making it easier and easier to do.
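As one concrete example of "modern languages making it easier and easier": since C++11, std::async hands back a future and hides thread creation and joining entirely. A small sketch, with an invented summing workload standing in for real work:

    // Two halves of an array summed concurrently; get() joins the worker
    // and fetches its result, so no explicit thread management is needed.
    #include <cstdio>
    #include <future>
    #include <numeric>
    #include <vector>

    long long sum_range(const std::vector<int>& v, size_t lo, size_t hi) {
        return std::accumulate(v.begin() + lo, v.begin() + hi, 0LL);
    }

    int main() {
        std::vector<int> data(1'000'000, 1);
        const size_t mid = data.size() / 2;
        auto a = std::async(std::launch::async, sum_range, std::cref(data), size_t{0}, mid);
        auto b = std::async(std::launch::async, sum_range, std::cref(data), mid, data.size());
        std::printf("total = %lld\n", a.get() + b.get());  // prints 1000000
    }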
Re:Already switched to AMD (Score:5, Insightful)
As a professional game developer, I can assure you that videogames are the exact opposite of an "embarrassingly parallel" problem. There's a LOT more to videogames than graphical processing: AI, pathfinding, physics, audio, resource and memory management, animation, world updates / occlusion systems, etc. These are all CPU-intensive subsystems, and many of them are inherently bound to global data (a virtual world simulation), which makes it extremely tricky to split off into independent threads (probably the only exception being audio).
That means that each subsystem in the game must be carefully and painstakingly optimized for threaded performance. It's not possible to trivially split up all these subsystems by thread either. Many of these systems tend to interact with each other and the global world database, and that means the gains tend to be smallish and non-scalable in nature.
Generally speaking, it's an extremely difficult problem, and one which I don't think the industry has really cracked yet. Believe me, if it were trivial to do, we'd have done it a long time ago.
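One widely known pattern for loosening that global-data coupling - offered as a generic illustration, not as anything the parent says his studio uses - is double-buffering the world state: every system reads last frame's immutable snapshot in parallel, and each writer owns a disjoint slice of the next frame, so the update itself needs no locks:

    // Double-buffered world: frame N is read-only for everyone while
    // workers write disjoint slices of frame N+1. Entity fields invented.
    #include <cstdio>
    #include <thread>
    #include <vector>

    struct Entity { float x; float vx; };

    int main() {
        std::vector<Entity> frames[2];
        frames[0].assign(8, Entity{0.0f, 1.0f});
        frames[1].resize(8);

        int cur = 0;
        for (int step = 0; step < 3; ++step) {
            const auto& prev = frames[cur];   // snapshot: read-only
            auto& next = frames[cur ^ 1];     // each worker owns a slice

            auto worker = [&](size_t lo, size_t hi) {
                for (size_t i = lo; i < hi; ++i)
                    next[i] = Entity{prev[i].x + prev[i].vx, prev[i].vx};
            };
            std::thread a(worker, size_t{0}, prev.size() / 2);
            std::thread b(worker, prev.size() / 2, prev.size());
            a.join();
            b.join();
            cur ^= 1;                         // flip buffers for the next frame
        }
        std::printf("entity 0 at x=%.1f after 3 steps\n", frames[cur][0].x);
    }

The cost is a frame of latency and double the memory for the world, which is part of why, as the parent says, the gains tend to be smallish and hard-won.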
That probably came off as teaching an old dog (Score:2)
Now that the average software developer has finally grasped 64-bit and is starting to get a feel for multiple threads, that difficult problem will be chipped away at a
Re: (Score:2)
I seem to have had this discussion about just about everything in computing since the mid 1990s, the difference being most people found a situation where more than a single thread worked. You've given some examples yourself about multi-threading.
I know that games are pretty well all about state but that doesn't mean some tasks can't be done with in parallel. You mentioned physics - consider how that's simulated in the numerical computing world and how some game engines are dealing with it in a similar way. Many problems in a simulation do seem to be highly parallel if more than a tiny area is modelled.
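To make the physics comparison concrete: when each particle's integration step touches only that particle's own state, the loop is data-parallel, and C++17's parallel algorithms can spread it across cores. A sketch under that assumption (needs a toolchain with parallel-algorithm support, e.g. GCC linked against TBB; the constants are arbitrary):

    // Each particle is updated independently, so std::execution::par is
    // safe here: no element reads or writes any other element's state.
    #include <algorithm>
    #include <cstdio>
    #include <execution>
    #include <vector>

    struct Particle { float x, y, vx, vy; };

    int main() {
        std::vector<Particle> world(1'000'000, Particle{0, 0, 1, 2});
        const float dt = 1.0f / 60.0f;
        std::for_each(std::execution::par, world.begin(), world.end(),
                      [dt](Particle& p) {
                          p.vy += -9.8f * dt;   // gravity
                          p.x  += p.vx * dt;
                          p.y  += p.vy * dt;
                      });
        std::printf("p[0] = (%f, %f)\n", world[0].x, world[0].y);
    }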
Did you even read his post? He is a game developer. What I wonder, and perhaps grandparent could answer, is if the problem is not just difficulty but rather that most gamers still used 2-core CPUs just 18 months ago per the Steam survey. If some still have Core 2 Duos and cheap AMD dual-core Walmart-special laptops up through i5s, then is it economical to painstakingly parallelize code anyway?
Until Ryzen takes over and 6 to 8 cores and hyper-threaded goodness become standard then why bother optimizing?
Re: (Score:2)
Until Ryzen takes over and 6 to 8 cores and hyper-threaded goodness become standard then why bother optimizing?
It's important not to conflate average numbers on Steam with what your game's particular demographic tends to use. When you say "Steam gamers", for instance, that's pretty much meaningless, as it's just an average across a *very* wide swath of gamers and gaming hardware.
The latest AAA games simply will not run well on a dual-core laptop with 4GB of RAM and an integrated Intel GPU. The target audience for those types of games tends to be someone who has a moderately powerful desktop machine or a *very* beefy laptop. Those machines are likely to have 4 cores and eight hardware threads at *minimum*. So, threading optimization is obviously very much worthwhile for these types of games.
Now that I've gone indie (I left my professional job about four years ago), my own game's target hardware is much more modest. So, in that case, I don't have to put in the same degree of optimization effort in my own engine as large studios do, since I'm a one man studio and can't put in the resources they do. There are plenty of older or lightweight indie games that those WalMart laptops will run just fine.
In short, it really all depends on your game's minimum requirements and intended target audience.
Cool, thanks for answering my question. Curious if you own an AMD Ryzen for your work, or plan to, if you don't mind me asking :-)
I want AMD to succeed and wish they had better per-core gaming performance. But hey, for running VMs at my job they are great lol.
Re: (Score:2)
Ehh, depending on your needs, Intel can still make a lot of sense. If single or low-threaded performance is more important to you (e.g. a lot of gaming, sadly), Intel still has the lead in terms of per-core performance. And from what I've gathered (admittedly, I haven't investigated either side much), AMD's integrated video performance lags behind Intel's, so if you're forgoing the video card, Intel may make more sense (again, take that with a massive grain of salt). But if you're going to be using it for w
Re: (Score:2)
And from what I've gathered (admittedly, I haven't investigated either side much), AMD's integrated video performance lags behind Intel's
AMD has destroyed Intel on integrated graphics since the first AMD APUs, which was over half a decade ago. It's not even close. The only parts Intel has that can compete are the "Iris Pro" chips, which have extremely expensive eDRAM bolted onto them, such that a similar-performing (in every way: CPU, GPU, I/O, etc.) AMD APU is literally 50% of the cost of Intel's "Iris Pro" solutions.
So "what you gather" is complete crap. "what you gather" is apparently so sketchy that you need to question the sources you "
Re: (Score:2)
I was thinking specifically of Iris Pro, but thanks for calling me on that. I always welcome corrections when I get things wrong.
Not a big deal? (Score:5, Insightful)
Intel has hardly ever had a usable CPU upgrade on the same motherboard; generally they have kept compatibility for two consecutive generations. That's only like one year in between, and has probably been for the OEMs' sake, not the consumers'. Maybe that's up to two years now that they've switched from tick-tock to process-architecture-optimization, but in any case the year-over-year improvements have been minimal, so why bother? If you so desperately want to replace last year's Z270+CPU, sell them as a package deal and buy a new Z370+CPU combo. Though if you're doing it for the six-core, do yourself a favor and buy a Ryzen, or if you must buy Intel, then an X299. Doing it just for the two extra cores is stupid. Except for the fanbois who'll take any chance to trash talk the opposing team, is there anyone here who'll stand up and say they'll miss this upgrade path? I expect crickets...
Re: (Score:2)
Intel has hardly ever had usable CPU upgrade on the same motherboard
Single core > multiple core > generation upgrade.
There have been plenty of "usable" CPU upgrades on the same motherboard, especially on the lower end of the spectrum where the motherboard upgrade triples the cost of the total upgrade.
Dont forget the early LGA 2011-v1 adopters too. (Score:2)
I needed to upgrade, and LGA 2011 came out, so I figured it would be a good platform to go with. Then Intel moved to v3, and no more CPU upgrades for v1.
So I'm stuck with 2x 2011-v1 systems. I don't trust Intel, and this just proves it.
Bigger dies thanks to AMD (Score:2)
Coffee Lake got shaken up by AMD Ryzen. Intel, freaking out, is loading more cores onto their CPUs: the newer i7s will go from 4 cores/4 threads to 8 cores/16 threads. The newer i5s are rumored to go from 4 cores to anywhere from 4 to 8 cores, with no hyperthreading.
My guess is Intel quickly glued 2 CPUs together like they did with the i9, and now the socket has doubled in size :-)
Re: (Score:2)
The plan is apparently i3 goes from 2 to 4 cores, i5 goes from 4 to 6 and mainstream i7 goes from 4 to 6 with HT
Re: (Score:2)
The K series of both the i5 and i7 will have 2 more cores, so the i5 8670K will have 6 cores and the i7 8770K will have 8 to match Ryzen. Another rumor is that the i5 series will now have hyperthreading, so the i5 will still have 4 cores (with the exception of the K, with 6) but will be hyperthreaded.
These are just rumors. But it would make sense, if Intel had to quickly double the cores, that the die and pins would change. Of course, wouldn't this double the cost of CPU production, with 2 die sizes?
IF they added more pci-e in a new socket then it's (Score:2)
If they added more PCIe lanes in a new socket, then it's not so bad; more like it's about time they moved off of LGA 1151/1150, which are just about the same in number of PCIe lanes and RAM channels.
Dumping a socket standard is nothing new (Score:3)
Given that Intel has abused its industry dominance to first create and then abandon de facto socket standards perhaps two dozen times - who's keeping count now? - over its history, this is hardly a shocking maneuver. Rather it is entirely expected. They like to force people to buy all new hardware sooner rather than later, considering they're collecting royalties for much of it that doesn't have its brand name on it. Back in the Good Olde Days when there were actually other manufacturers competing to populate those same de facto standard sockets, Intel would abandon sockets just to shake up those little guys and drain their resources trying to retool and keep up. Having fully succeeded in eliminating ALL competition for their own de facto socket standards, they now do it just for grins and giggles (and perhaps for those licensing fees).
It's all a horrific mess for nerds. (Score:5, Interesting)
We are in a bad bad timeline for hardcore and even regular PC enthusiasts, the technological leaps have stagnated significantly, where people with 7 year old PCs need only double their memory and add an SSD (if they didn't already have one) and almost all tasks are fast enough.
The delay in the shift from 14nm to 10nm has been pretty bad across the industry. In fact, considering the performance improvements for processors and GPUs over the past 7 years, it seems quite apparent that the manufacturing process still plays a very heavy part in the performance boost between generations, just as much as the architectural design of the processor.
I have a fairly specific use case, similar to but not quite the same as gamers' (I want a ridiculously fast PC for general use; I'm an extreme browser, exceeding 100-400 tabs at a time, but I don't game anymore, so I like mid to small ITX, quiet, professional-looking machines).
I almost always have from 8 to 25 applications of varying kinds open. I really like a very responsive system at sub-$5000 expense (a 64GB, quad-channel, DDR4-4000 machine with 12 cores, liquid cooled, would be great, but the cost would be insane, and honestly a complete top-of-the-line but non-HEDT machine would likely do what I need at easily 30 to 50% savings).
Unfortunately, Intel is all over the place with product varieties. When you look around the Intel ARK site (the new one is awful; great job web developers, great job, another unnecessary redesign) you can see just how many processors they make, from 6W to 150W, across all kinds of segments.
Sadly, the days of a "preemo desktop" CPU being their primary bread and butter are over, and that's why we see ridiculous things like what this article is stating; they are diversified everywhere and the complexity seems beneficial to their bottom line.
The rumor is the Coffee Lake 6-core desktop processor won't work in the existing Z170/Z270 chipset, despite the fact it's basically the same family as the last 2 CPUs for those boards (i7-6700 / i7-7700, etc.), just with 2 more cores 'glued on'.
We also don't know if this new processor was ever intended to come out at 14nm or if it was originally 10nm.
There's talk that the new chipset, Z370, isn't even any more than a re-badge of the Z270! Which makes forcing people to use it even more ridiculous.
There's a "Z390" (?) which is a Cannon Lake chipset, or "PCH", coming out next year - but that chipset is only for Cannon Lake processors, except there are (apparently) none of those planned for desktop.
So, do you buy an i7-8700k now and put it on a z370, knowing that you might be missing out on some new features in 2018, like bluetooth 5 and wifi ac being built into the chipset itself?
The whole thing is messy and awkward to follow, it's only gotten worse the past few years.
Honestly, I think the best thing to do, if you're capable, is to stop reading the news about this stuff and just buy what's best when you need a new machine. It's endlessly time consuming and confusing to be an educated consumer with PC stuff. (I should know; I've wasted possibly years of my life googling / reading this rubbish since I first started building my own machines 20 years ago.)
But the long and short of it is, stuff just isn't improving at a fantastic rate anymore. Even if you're silly rich, you can't buy a machine that utterly decimates other machines easily. People can get 60 to 80% of your performance for 1/4 of the cost or less.
Re: (Score:2)
The delay in the shift from 14nm to 10nm has been pretty bad across the industry. In fact, considering the performance improvements for processors and GPUs over the past 7 years, it seems quite apparent that the manufacturing process still plays a very heavy part in the performance boost between generations, just as much as the architectural design of the processor.
AMD went from 22nm everything, to both 14nm and 10nm ... this year.
The only company delayed on 10nm is Intel. Their specialized 3D "Tri-Gate" transistors apparently can't be produced economically at 10nm. Intel has just back-pedaled to working with regular old FinFETs like the rest of the industry. They are late to the 14nm FinFET game, so it's still going to be another year before Intel figures out how to do 10nm FinFETs like the 3 other companies that have already beaten them to 10nm production.
Re: (Score:2)
It seems improving means better phones with better battery life. PCs are boring, like how kids in the 1980s turned their noses up at mainframes and DECs, even though nerds at the time debated big iron and mainframes being the wave of the future and cool, etc.
It seems crappy no-name phones can last for many days without a recharge. Not so with heavy smartphones.
More transistors do jack for x86, which is why Intel and AMD are trying more cores. Intel gave up on this as they are afraid once people upgrade to a
Re: (Score:2)
So, do you buy an i7-8700k now and put it on a z370, knowing that you might be missing out on some new features in 2018, like bluetooth 5 and wifi ac being built into the chipset itself?
Yes. Dongles are cheap and effective.
Honestly, I think the best thing to do, if you're capable is to stop reading the news about this stuff and just buy what's best when you need a new machine.
I agree completely.
I don't care (Score:2)
I'm still waiting for quad-core Arduino ATmega328P.
I upgrade DRAM generations.. BANDWIDTH (Score:3)
I upgrade for each new generation of memory. I will soon upgrade to a DDR4 based system.
Wait for the new standard to hit price parity, then grab whatever CPU is at the best price/performance point. New faster PCI or whatever, sure. Give me the new fast RAM!
All computing comes down to bandwidth. Memory bandwidth is always the first roadblock. Then disk, and later network.
Yes, about every 4-5 years. Shrug, works for me!
SSD was my only upgrade in about 4 years!
Re: (Score:2)
4K and new CPU needs will be the next question.
Why same socket then? (Score:2)
Why same socket then?
They need to add more PCIe lanes / boost the DMI link speed. Just going to 6 cores at the top end seems like another Kaby Lake-X joke.
AMD is killing them, and AMD has more PCIe lanes at all levels (other than maybe a 4-CPU Intel system, whose cost will be way higher - at least 2x or more than a good AMD server system).
Intel has lost its way (Score:2)
Intel is thrashing around. They've moved from "you want our stuff because it's the best" to "you're going to buy our stuff because we'll make deals with the people you buy computers from". That's not well described, and I'm not an expert (my computing needs are modest), but I've seen this happen before with other tech and non-tech companies. They get big and powerful, and they forget it was willing buyers who made them that way. As far as I'm concerned, you can put me down as a default AMD customer for my n
Re: (Score:2)
They get big and powerful, and they forget it was willing buyers who made them that way.
This is typical of most large businesses. They get big and powerful, and most of the people who got them there cash out and move on to other things or retire. A bunch of MBAs who have never learned how to actually make money are hired. Nepotism and cronyism take hold, and management goes from a staff of competent people to a staff of sycophants, brown-nosers and yes-men/women. And eventually the golden goose is dead and butchered
Large corporations are part of the death cycle of business. Usually some of t
Re: (Score:2)
Intel is still preferred: gamers rejected Ryzen for i7s and even the older FX series [steampowered.com]! AMD's marketshare plummets as soon as Ryzen comes out :-(
So I think Intel must be doing something right, as gamers feel AMD sucks for games and has a bad brand name attached compared to Intel/Nvidia.
Re: (Score:2)
Great analysis. Wish I could disagree, but I really can't.
Well done Intel. (Score:2)
Nothing new (Score:2)
Intel has always been doing this. However, I'm not so sure the upgrade problem is that much of a problem in real life.
I always build my own PCs and typically go for the best performance-per-euro solution. I have often looked into upgrades, but hardly ever were such impossible-due-to-socket-changes-upgrades really worth it from a performance-per-euro point of view. It's almost always a better idea to save your money and buy a new cpu+ram+mobo combo a year later than to upgrade now.
Upgrading might be interest
Serious mistake (Score:2)
The people they are talking about are the more technically proficient users among consumers, their perceptions of the superiority of one platform vs another are what drive the decisions everyone else makes and repeats to those who think they are the "technical guy" and it spreads from th
Re:They probably will work. (Score:4, Funny)
As opposed to AMD (Score:2)
They're saying it's not compatible. What this likely means is a change in pin layout.
This is as opposed to AMD's "AM#" motherboards, which more or less have compatible pinouts,
and are generally within a firmware upgrade of supporting the next generation's CPUs on the previous generation's motherboards (though lacking support for the features introduced with the newest "AM#" platform).
Re:They probably will work. (Score:5, Informative)
Re: (Score:3, Interesting)
Yes, it is known that Intel usually changes their sockets; the problem they have now is AMD has finally stepped up to the plate with a competitive, if not better, product.
If enthusiasts are going to have to replace their motherboard to jump onto Intel's latest and greatest, you can guarantee that enthusiasts will give AMD a good look over, since they are having to sink money into a whole new motherboard platform no matter which side they go with.
With this new competition Intel would have probably been better of
Re:They probably will work. (Score:5, Insightful)
if it's a different socket, okay.
but " Z270 boards, even though the chipsets with the upcoming Z370 appear to be the same" .. that matters a few years down the line when you're building a kit from some parts. if the socket is different then it's not that much of a problem.
a more interesting thing would be just.. is it faster in any sort of meaningful way?
different socket is okay Only with more pci-e or b (Score:3)
A different socket is okay only with more PCIe or better DMI, not just an "1152" or "1151B" that just locks out the older boards.
Re: (Score:3)
They have basically given enthusiasts a reason to look at the competition rather than just dropping in a new CPU upgrade.
Why should Intel care? How many people replace the CPU on their motherboard? Is this even 1% of the market?
Re: (Score:3)
I do, and I built my Skylake with the intention of buying a better CPU down the road. I am an Intel guy through and through, but AMD has my attention and might have my next purchase.
Re:They probably will work. (Score:5, Interesting)
Why should Intel care? How many people replace the CPU on their motherboard? Is this even 1% of the market?
Intel should care not because of the home builders. Intel should care because of the big box builders.
..'cept now they can't... they have to RAISE their prices in order to screw down these latest more expensive motherboards.
Those older motherboards have come down in price over their period of compatibility, meaning that even big name system builders could offer lower prices.
Re: (Score:2)
It's not the home builder, it's the 'buzz' it creates. If these CPUs work that well, the builders will tell their friends and family to go to BestBuy and pick up a Ryzen. I basically tell all of my extended family what to purchase, if not actually build it for them.
Re: (Score:3)
"The Builders". How many of us do you think there are? Probably 99%+ of sales are people going to Dell.com or BestBuy and picking something off the shelf. (Well, these days the Apple store, I guess, which further negates processor choice.) I doubt that very many people know the difference between Intel or AMD, other than the fact that Intel advertises, so the name might be vaguely familiar. FFS, I bet not many people actually understand the difference between Intel and IBM for that matter. "They're both com
Re: (Score:2)
Re: (Score:2)
Intel and AMD don't realize, apparently, that allowing backdoor spyware means the eventual end of their companies.
No, they realise very well that there's no other competitor besides them, especially if you go down the Windows route.
Consumers have no other choice, and especially coupled with the fact very few people appear to truly care about their privacy... you only have to look at the countless Android and Google users [huffingtonpost.com] to realise how little people (even geeks) seem to care!
* I didn't mention Apple, because they are the only major company who appear to care about the privacy of their users [apple.com] and fight against the Gov [theinquirer.net]
Re: (Score:2)
Exactly. The last time I upgraded a CPU on a motherboard was to move from a 25MHz '486 to a 66MHz one. Anything less than 5 years old is fast enough for most cases. Spend the money on an SSD or more RAM.
Re: (Score:2)
That I buy AMD.
Ryzen. Released February 2017. Socket: AM4
Threadripper Released July 2017. Socket: TR4
They didn't even make it 6 months without requiring you to change sockets to get the latest CPU.
Re: (Score:2)
This is not an incremental update, though.
Re: (Score:2)
Re: (Score:2)
I built a system with a Phenom II X3 720. Then I upgraded it to a Phenom II X6 1045T. Then I upgraded my motherboard, and built another system with the X3. Then I upgraded my CPU again to FX-8350, and the X3 system became an X6.
Re: (Score:2)
"very very few DDR2 machines"
Well, yeah, it was practically impossible to find DDR2 in any size larger than 2GB for desktop systems and laptops, which made any machine running DDR2 effectively garbage for upgrading to modern-ish standards, since a huge chunk of them only ever shipped with two memory slots on the motherboard.