Intel's First 10nm Cannon Lake CPU Sees the Light of Day (anandtech.com) 184
Artem Tashkinov writes: A Chinese retailer has started selling a laptop featuring Intel's first 10nm CPU, the Intel Core i3-8121U. Intel promised to start producing 10nm CPUs in 2016, but the rollout has been postponed until almost the second half of 2018. It's worth noting that this CPU does not have its integrated graphics enabled and features only two cores.
AnandTech opines: "This machine listed online means that we can confirm that Intel is indeed shipping 10nm components into the consumer market. Shipping a low-end dual core processor with disabled graphics doesn't inspire confidence, especially as it is labelled under the 8th gen designation, and not something new and shiny under the 9th gen -- although Intel did state in a recent earnings call that serious 10nm volume and revenue is now a 2019 target. These parts are, for better or worse, helping Intel generate some systems with the new technology. We've never before seen Intel commercially use low-end processors to introduce a new manufacturing process, although this might be the norm from now on."
Not everyone needs $1900 Core i9 (Score:3, Insightful)
Not everyone needs to cough up $1900 for a CPU to have a computer that is usable to them.
I absolutely hate this notion today that only the most expensive modern things are usable, and that anything else will not work properly.
Re: (Score:2)
No, you are dead right.
However, it is less than stellar when Intel launches a new process with a CPU that is slower and less capable than the previous generation.
They know this, and would not be doing this unless there were problems...
You may not launch with the best CPU a process will ever support - that takes time - but you normally shoot a bit higher than 'lowest end possible'.
It looks like it is slow and hot... not a good sign.
Re: (Score:3)
However, it is less than stellar when Intel launches a new process with a CPU that is slower and less capable than the previous generation.
Well, if it's slower because they took out some of the insecure tricks they've been using to get good performance from their antiquated architecture, maybe that's a good thing.
"If".
Re:Not everyone needs $1900 Core i9 (Score:4, Informative)
They didn't. It's still vulnerable to Spectre and Meltdown. There simply wasn't enough time to modify the hardware. They are targeting Icelake for the hardware fixes.
Re: (Score:2)
It's slower because the majority of what they produced got binned. There are more low-end, half-disabled CPUs coming off the line than fully functional ones.
Re: (Score:2)
The indicators point at process issues. Each shrink is a crapshoot, engineering-wise, and after many years of boxcars, Intel finally rolled snake eyes. In other words, Intel bet on some process technology that didn't perform to expectations, requiring expensive and time-consuming backtracking. Meanwhile, TSMC, GloFo and Samsung are moving more cautiously. AMD made a great call by focusing on multi-die SoCs.
Maybe this means the era of Intel developing its own process tech and running its own fabs is coming to an end.
Re: (Score:1)
Well, does this generation perform better or worse than the previous generation, pre- or post-Spectre/Meltdown patches?
A Chinese retailer claims (Score:2)
A Chinese retailer is selling 10nm Intel processors cheap.
A Chinese retailer is also selling OtterBox cases for $1.50, and Genuine Applé McBook chargers for $12.
Re: (Score:2)
Re: (Score:2)
Not everyone needs to cough up $1900 for a CPU to have a computer that is usable to them
How right you are, when a Ryzen Threadripper [newegg.com] will blow it away in throughput for half the price. :-)
Re: (Score:2)
In 2 years a $400 computer will be better than yours.
Just sayin'.
Re: (Score:2)
In 2 years a $400 computer will be better than yours.
Just sayin'.
Well... we are at a point where physical limitations have already killed Moore's Law... The reason we see little to no improvement from one CPU generation to the next is that Intel, AMD and others are trying to stretch the last possible speed increases as long as possible to ensure income until something is ready to take over from silicon. My computer is 2 years old; it is not a particularly expensive computer but was still a top-of-the-line gamer computer for its time... it packs 32 GB of memory, GTX1080 and an i6700K.
Re: (Score:2)
My computer is 2 years old, it is not a particularly expensive computer... it packs 32 GB of memory, GTX1080 and an i6700K
If your computer is 2 years old and packs a GTX 1080, you must have bought the card practically on launch day (May 27th 2016), which must have cost a small fortune.
Re: (Score:3)
If your computer is 2 years old and packs a GTX 1080, you must have bought the card practically on launch day (May 27th 2016), which must have cost a small fortune.
Well, I guess it depends on what a fortune is to you. I bought a 1080Ti at launch; sure, it was $700, but it's well over a year later and apart from a few ridiculously overpriced Titan cards it's head and shoulders above the pack. I expect it'll be faster than a 1170 but slightly slower than a 1180; that's usually been the case. Two years after that it'll probably be behind, but not so terribly far behind the 1270 that I'll replace it. So I'm thinking the 13xx generation would be a likely replacement time. That's ov
Re: Not everyone wants to be obsolete or broken so (Score:2)
Well, I guess it depends on what a fortune is to you.
It's a fortune compared to most video cards.
Re: (Score:3)
Sounds like the desktop I'm running right now. I was briefly fishing around to replace it but the performance way outclasses what I need. I imagine that I will be keeping this desktop for another 2 years, at least.
Sitting behind me is my Linux server. It's running an FX-8350 from 2012. Every year for the past 3 years I've been thinking about replacing it. The only excuse I have to replace it is that it's over six years old. Other than that I don't have any. In its role, its performance also exceeds what I need.
Re: (Score:3)
This hasn't been the case for years as processor development has slowed. My desktop is coming up on two years old, and it would still crush a new $400 computer. Heck, a 5-year-old high-end computer would still be competitive with a $400 computer in most tasks.
Re: (Score:3)
This hasn't been the case for years as processor development has slowed. My desktop is coming up on two years old, and it would still crush a new $400 computer. Heck, a 5-year-old high-end computer would still be competitive with a $400 computer in most tasks.
My PC will be three years old this summer... I looked at replacing it, and had a very hard time building something that would have a noticeable performance gain at ANY price (at least given that I need single-core performance for gaming). The only reason I was looking at all was because I need the hardware to replace my old Linux server (which is a 6-7 year old i5/16GB box that's also still perfectly fine, but I need to do a complete OS refresh/rebuild, so I might as well update the hardware at the same time).
Re: (Score:2)
Re: (Score:2)
In 2 years a $400 computer will be better than yours.
Just sayin'.
Doesn't happen quite that fast these days, which has the side effect of making it actually worth the effort to build a performance machine, hence the rise of the enthusiast sector.
Re: (Score:2)
In some ways, a self-built machine isn't as good of a deal as it used to be. 15+ years ago, it was often possible to buy low-speed binned chips and overclock them beyond the top binned chips. Since the only difference between the cheap processor and the expensive one was clockspeed (and maybe a small amount of cache), you ended up with a processor faster than the fastest one available OEM (and OEM motherboards typically did not allow overclocking) for a fraction of the cost. Today, there are important physi
Re: (Score:2)
Your comments on processor upgrades make sense if you only think about Intel. The enthusiast sector has massively shifted towards AMD recently, with good reason. Unlike Intel, AMD designed their high-end desktop socket to be stable for several generations, and they have already delivered on that with the 12nm Ryzen refresh. Re binned chips: they are binned for a reason, usually because they failed tests at high clocks, unless the manufacturer is running into price resistance at the higher price points, then
Re: (Score:2)
It's true that I've been out of the loop for a while, but mostly because progress is sufficiently slow that I don't really feel the need to do things like upgrade from Gen 1 Ryzen to Gen 2; the difference just isn't material enough to bother. AAA gaming titles run fine on a couple-of-years-old hardware, and I can't tell the difference between 80 FPS and 90. My current setup is Intel, but I bought before Ryzen came out and the AMD options weren't really competitive outside the low end at the time. Funny thing is
Re: (Score:2)
There isn't a lot of difference between a new computer and a 2-year-old computer nowadays. Or even older than that. I built my i7 system in 2012, and the only real reason a new i7 system would beat it is that the newer i7s have more than 4 cores. Put it up against an i5 (or whatever the quad-core option is today) and it might be 50% faster than my 6-year-old computer.
Re: Not everyone wants to be obsolete or broken so (Score:1)
Because we are talking about computers, while you are talking about jewelry.
Re:Not everyone needs $1900 Core i9 (Score:5, Informative)
"Nobody needs more than 640k of RAM" ~Steve Jobs~ -
Nope, that quote is from Bill Gates, not Steve Jobs. It was the IBM PC running MSDOS/PCDOS in regular memory on x86 processors that had that limitation.
Re: (Score:1)
Re: (Score:1)
And inventing the telephone.
Re: (Score:2)
And for inventing the smart phone.
Re: (Score:2)
And for inventing the smart phone.
and inventing the mp3 player
Re: Not everyone needs $1900 Core i9 (Score:2)
He didn't invent any of those things. He is more like Henry Ford, then. Just made them good enough for everyone to want one, and feel like they needed it.
Re: (Score:2)
Re: (Score:1)
The quote was probably made well before poster Armonk existed.
That is quite a bold assumption. You do not know the first thing about this user.
Re: (Score:2)
That is quite a bold assumption.
Not at all; it was a statistically-likely guess. Now go change your diaper.
Re: (Score:1)
That is quite a bold assumption.
Not at all; it was a statistically-likely guess. Now go change your diaper.
I'll do yours if you do mine? :-) Don't you worry, I have changed many diapers ;-)
Re: (Score:2, Informative)
Nope, that quote is from Bill Gates, not Steve Jobs.
Neither Bill Gates nor Steve Jobs ever said that. It is normally misattributed to Bill, but there is no evidence that he ever said it.
Re:Not everyone needs $1900 Core i9 (Score:4, Informative)
Re: (Score:3, Funny)
Intel had some ad for the 286 saying it was too much for a single user..
It is true that married men make more money than single men, but that's still an odd way to market it.
Re: (Score:2)
Neither Bill Gates nor Steve Jobs ever said that. It is normally misattributed to Bill, but there is no evidence that he ever said it.
You should quote the right legend anyway; otherwise it's Chaos! Anarchy!
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Note that they didn't actually *believe* that, they simply didn't want to admit that their current offering might be lacking.
Re: (Score:1)
Re: (Score:3)
And that is why sales reps are not engineers.
Sadly, sometimes they *are* engineers. A large increase in income will occasionally draw them to the dark side.
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Oh shit (Score:5, Insightful)
The free ride is over, software retards. You may actually have to start programming again, instead of creating multi-gigabyte copy-and-paste monsters that can't even keep up with typing at the keyboard, yet use 100% CPU on quad core machines.
I am waiting for the fanless version (Score:1)
This Core i3-8121U chip has a TDP of 15 watts. I don't know if it can be fanless or not.
But if Intel can come up with a fanless version (preferably with a GPU) and an even lower TDP, I will be willing to design a mini-ITX mobo for it.
Re: (Score:2)
This Core i3-8121U chip has a TDP of 15 watts.
So... a bit like the Atom CPUs then.
Move along, nothing new here.
Re: (Score:2)
Re: (Score:2)
Except for not sucking like Atom. Core arch (basically, P3) is just better than Atom. They should kill Atom, it only exists for political reasons. There is no niche where Atom outperforms core arch in performance/watt except by market manipulation.
Re: (Score:3)
Re: (Score:2)
In my day, we hadda fit everything into a 512-bit cyclic mercury vibration pipe which cost as much as a medium apartment building. We didn't have no stinking 64k of RAM. Wheeeee what luxury you spoiled brats get offa my lawn.
Re: (Score:2)
I'd rather have software well designed and programmed in classic compiled languages
Meltdown&co fixed? (Score:4, Insightful)
Why this is news (Score:4, Interesting)
This CPU is nearly 3 years late.
Intel is having immense difficulty with the move to 10nm, down from 14nm (which was 'refreshed' twice).
What this means is that other manufacturers are now genuinely catching up to Intel. As much as I didn't believe it, it does seem that TSMC is now just about ready to start putting out 7nm products.
(Note, they all bloody lie about the figures; TSMC's 7nm is basically about Intel's 10nm.)
That does mean that AMD may soon be producing CPUs with a similar transistor density and voltage requirements to Intel's, meaning the only advantage available is processor design, not manufacturing process.
Regardless of AMD's improved competitive potential here, though, the concern is that the move from 22nm to 14nm to 10nm has been AWFULLY slow, and it's one of the driving factors in why computer processing hasn't really improved hugely in the past 4 to 10 years. It's improved, but nothing at all like the previous decade.
If you're an enthusiast dying for top-of-the-line performance with a deep budget, this has been painful, as you upgrade every 18 months to get 20% faster instead of 70+% faster. If you're a homelab server nerd who wants to run a great little VM cluster on some mid-range, low-power chips, the chips you could've bought 3 years ago are probably still fairly viable compared to today's options.
Intel has delayed the rest of their 10nm processors, I think, until next year. That means the Intel 8700K six-core and the rumoured 8750 / 8900 (?) eight-core model (soonish) will probably be the best you can buy for the next 18 months. If you've been holding off upgrading, it may be worth considering.
It kind of sucks. I'm in the 'want a nice, low-power server, but still kinda powerful' camp and I don't want 85W of CPU in my cupboard, but I would like at least 6 half-decent threads. It's possible, but it would've been much more likely with the shrinks being on time.
Re: (Score:2)
I imagine the day when Intel goes fabless, perhaps spinning it off like AMD did with Global Foundries. Doing all the die-shrink R&D just for x86 isn't going to stay profitable, and it will probably stop being worth it before ARM shrinks do (smartphone and PC shipments are both effectively flat now). We might go back to the days of the 8086, where a certain clock speed is effectively standard, and all that differs is how much RAM your system has. I imagine there will be some DRAM-only shrinks, since it's easier to get r
Re: (Score:1)
Intel has bought Altera so the die shrinks are not only for x86. Top FPGAs are high-performance and very, very high margin parts. They are also selling manufacturing capabilities for older processes and making some networking gear themselves (cellular modems). Churning out new chipsets all the time is consuming the fab capabilities nicely as well.
DRAM will follow the way of 3D NAND in a few years. Either that or just stacking of planar dies. We've already had experiments with HBM, but they turned out to be
Re: (Score:1)
The "85w" is just the TDP which is only usually achieved with full core load and/or AVX2. You actually want a newer CPU instead of the older ones because Intel has made great strides in *idle* power optimizations. A home server running on Coffee Lake will be way less wasteful than a Sandy Bridge for example. Almost all the other chips on the motherboard are made in lower power processes as well (chipset, NIC, super-IO, etc). Those things add up.
Re: (Score:1)
If your CPUs run idle, you have wasted money.
Well, it really depends on what you want to compare with... did a guy who owns like 10 supercars waste money because he can only drive one car at a time?
Re: Why this is news (Score:2)
did a guy who owns like 10 supercars waste money because he can only drive one car at a time?
That's easy: did those ten cars produce the desired effect of enlarging his cock, improving his physique and making him smarter/more charismatic?
Re: (Score:2)
Another car analogy: A sports car might have the same legal top speed as a truck but it sure accelerates faster and may be more enjoyable to drive.
Re:Why this is news (Score:4, Insightful)
I believe it's more about the limits of current technology, and the fact that the CPU frequency depends on the voltage: since power consumption and dissipation vary with the square of the DC supply voltage, you just cannot raise the voltage arbitrarily unless you want your CPU to consume hundreds of watts. There's also the speed of light at play - you cannot raise the CPU frequency arbitrarily because electrons will not have enough time to traverse the chip. Another issue is that the x86-64 instruction set is very difficult to optimize because the architecture is so old.
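For reference, the first-order model behind the "square of the supply voltage" point is the standard CMOS dynamic-power relation (the numbers plugged in below are purely illustrative):

P_{\mathrm{dyn}} \approx \alpha \, C \, V_{\mathrm{dd}}^{2} \, f

where \alpha is the switching activity factor, C the switched capacitance, V_dd the supply voltage and f the clock frequency. For example, pushing V_dd from 1.0 V to 1.2 V at the same clock already multiplies dynamic power by (1.2/1.0)^2 = 1.44, and since higher clocks generally demand higher voltage as well, power rises much faster than linearly with frequency.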
Re: (Score:2)
Wonder if it is similar to the "ridiculous" McDonald's lawsuit thing, or the truth-through-repetition thing.
It's a load of bollocks. x86, x86_64, or what have you is just a decoder stuck on the front of the CPU. x86 processors have been internally RISCy since the Am586 and Pentium.
Re: (Score:2)
just a decoder stuck on the front of the CPU
Which costs power and chip real estate, even if throughput is the same and latency only increases slightly. But AMD has exactly the same issue.
ARMs do front-end translation as well, especially if running the Thumb instruction set, but it is much less, hence a modest power-efficiency and cost advantage.
Re: (Score:2)
just a decoder stuck on the front of the CPU
Which costs power and chip real estate, even if throughput is the same and latency only increases slightly. But AMD has exactly the same issue.
That was an issue way back when. Today, the x86 decoder is a minuscule portion of the chip.
Re: (Score:2)
Today, the x86 decoder is a minuscule portion of the chip.
Dunno. The last time I saw the decoder outlined on a mask was in Opteron days and it looked like a pretty big chunk of chip to me. But too lazy to go hunting for photos now.
Re: (Score:2)
Yep, still a pretty big chunk of chip. [pcper.com]
Re: (Score:2)
I can think of multiple reasons for this, but there isn't anything substantial in the public domain to
Re: (Score:1)
Re: (Score:2)
the amount of effort required to get to the next die shrink is high enough that there's considerable room for others to appear to catch up
It's not just that, it is also that they all buy their lithography equipment from the same supplier. [wikipedia.org] Nobody gets ahead of that.
How interesting that the Dutch still dominate printing technology [wikipedia.org] 600 years down the road.
Re: (Score:2)
In the end Intel was able to stay competitive
Re: (Score:2)
CPUs are fast enough. We don't need radical improvements every year. We need price cuts and security fixes. We don't need process shrinkage that urgently.
Re: (Score:2)
Speak for yourself. I need as much fast, cool and cheap as I can get.
Re: (Score:2)
Fast enough for you, not me, not many.
High speed does not need to imply less security.
Process shrinkage is the number one way to improve performance.
Re: (Score:2)
the move from 22nm to 14nm to 10nm has been AWFULLY slow
It's because production costs increase exponentially with the number of multi-patterning steps needed to work around the resolution issue, and EUV [wikipedia.org] is really nasty stuff; it won't go through a lens, for one thing.
Re: (Score:2)
I have no idea how they intend to solve it. Yet TSMC claims they are doing "7nm" (roughly 10?) and I am pretty sure they have fewer resources than Intel.
Re: (Score:2)
It's all done with mirrors. :)
Re: (Score:2)
btw, TSMC now has more resources than Intel, because the smartphone market is much bigger than the PC market.
Re: (Score:2)
Ryzen
Won't work; he said he wants some half-decent threads.
Re: (Score:2)
Ryzen single-core performance is more than decent, especially with the 12nm refresh.
Re: (Score:2)
https://bugzilla.kernel.org/sh... [kernel.org]
Also, while the single-core performance is getting better, it's not amazing.
Re: (Score:2)
AMD put out a BIOS fix for the soft lockup at idle, which seems to have fixed the issue for the minority of users that see it. Best guess is it really is a power supply issue where some power supplies freak out when current draw is too low. AMD has not made any statement yet. I had the issue and the BIOS fix seems good. Now with 5 weeks of uptime, but then that is not really special for a Linux box. When it gets to a year I will let you know. Best I ever had was a year and a half or so, and then only s
Re: (Score:2)
Literal quote from my post.
"(Note, they all bloody lie about the figures, TMSC 7nm is basically about Intels 10nm)"
Also (Score:5, Interesting)
Most likely by mistake, Intel released Z390 chipset information [guru3d.com] last Sunday. The page has since been pulled down [intel.com] because this chipset was rumored to be accompanied by octa-core Coffee Lake CPUs, which are yet to be announced.
Next time I'm gonna web-archive their mistakes ;-)
I always get the feeling (Score:2)
Re:I always get the feeling (Score:4, Interesting)
Have you seen their R&D expenditures?
Designing a 14nm tech process in the '70s/'80s would have been impossible because it has taken billions of dollars of investment and new technologies (some of which weren't invented at Intel) to get there. Also, considering that they've rehashed their 14nm tech process twice and their first 10nm part is a castrated two-core CPU minus the iGPU, it surely looks like 10nm is extremely difficult/costly to get right.
Re:I always get the feeling (Score:4, Funny)
Have you seen their R&D expenditures?
Designing a 14nm tech process in the '70s/'80s would have been impossible because it has taken billions of dollars of investment and new technologies (some of which weren't invented at Intel) to get there. Also, considering that they've rehashed their 14nm tech process twice and their first 10nm part is a castrated two-core CPU minus the iGPU, it surely looks like 10nm is extremely difficult/costly to get right.
Well, that's exactly what they want you to believe. When they say "R&D expenditure", they really mean "R&R expenditure".
Did Intel confirm it? (Score:4, Funny)
We have a Chinese retailer claiming to sell a 10nm CPU that has the features (and probably speed) of a 5-year-old low-budget processor. And since Chinese companies have a spotless track record of never trying to sell counterfeit products, we should readily believe that this seemingly ancient CPU is bleeding edge.
I ... erh... well... how do you put it nicely...
What bugs does it have? (Score:1)
Given the recent rash of Intel architecture bugs, and that this is not a new architecture, which of the bugs that we already know about are in this "new" CPU?
This makes sense... (Score:1)
...yields in the 10nm fabrication are apparently too sketchy for a high-end/up-market release. Solution: disable the cores and features of the chip that don't work, and sell it cheap.
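As a rough illustration of that logic, here's a sketch (Python) using the simple Poisson yield model Y = exp(-die_area * defect_density); the defect densities and die areas are invented numbers for illustration, not Intel's figures:

import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    # Expected fraction of dies with zero defects under a Poisson defect model.
    return math.exp(-die_area_cm2 * defects_per_cm2)

mature_node = 0.1    # assumed defects per cm^2 on a mature process
new_node = 1.0       # assumed defects per cm^2 on a brand-new process

for name, area in [("small dual-core die", 0.7), ("bigger quad-core + GPU die", 1.5)]:
    print("%s: mature %.0f%%, new node %.0f%% defect-free"
          % (name, 100 * poisson_yield(area, mature_node), 100 * poisson_yield(area, new_node)))

On those made-up numbers, the small die still comes out roughly half defect-free on the immature node while the bigger die drops to around a fifth, and dies with a defect only in the GPU or one core can still be sold with that block fused off, which is exactly the kind of cheap, graphics-disabled dual-core we're seeing.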
Wut? (Score:2)
>We've never before seen Intel commercially use low-end processors to introduce a new manufacturing process.
Yes we have. However, if you're only paying attention to the desktop CPUs, you might get that impression.
doesn’t care (Score:1)
Re: (Score:2)
I use a VIA C3, you insensitive clod!
Re: (Score:2)
Most on-board graphics suck, so you disable them and buy a good graphics card to play modern games.
AMD's integrated Vega doesn't suck and will be the right solution for many laptops and midrange desktops. It so doesn't suck that Intel is licensing it.
Re: (Score:2)
Thanks!
However, could you maybe delve a little bit more into that, please? Other fabs have seemingly had very few problems transitioning from 12nm (which is close to Intel's 14nm) to 7nm (which basically matches your 10nm). And unlike other fabs, you started working on the 10nm node several years earlier, so you had quite an advantage.
Re: (Score:2)