
Intel's First 10nm Cannon Lake CPU Sees the Light of Day (anandtech.com)

Artem Tashkinov writes: A Chinese retailer has started selling a laptop featuring Intel's first 10nm CPU, the Intel Core i3-8121U. Intel promised to start producing 10nm CPUs in 2016, but the rollout has been postponed until almost the second half of 2018. It's worth noting that this CPU does not have its integrated graphics enabled and features only two cores.

AnandTech opines: "This machine listed online means that we can confirm that Intel is indeed shipping 10nm components into the consumer market. Shipping a low-end dual core processor with disabled graphics doesn't inspire confidence, especially as it is labelled under the 8th gen designation, and not something new and shiny under the 9th gen -- although Intel did state in a recent earnings call that serious 10nm volume and revenue is now a 2019 target. These parts are, for better or worse, helping Intel generate some systems with the new technology. We've never before seen Intel commercially use low-end processors to introduce a new manufacturing process, although this might be the norm from now on."

  • by Anonymous Coward on Tuesday May 15, 2018 @02:06AM (#56613234)

    Not everyone needs to cough up $1900 for a CPU to have a computer that is usable to them.

    I absolutely hate this notion today that only the most expensive modern things are usable, and that anything else will not work properly.

    • No, you are dead right.

      However it is less than stellar when Intel launch a new process with a CPU that is slower and less capable than the previous generation.
      They know this, and would not be doing this unless there were problems...
      You may not launch with the best CPU a process will ever support - that takes time - but you usually aim a bit higher than 'lowest end possible'.
      It looks like it is slow and hot... not a good sign.

      • by nagora ( 177841 )

        However it is less than stellar when Intel launch a new process with a CPU that is slower and less capable than the previous generation.

        Well, if it's slower because they took out some of the insecure tricks they've been using to get good performance from their antiquated architecture, maybe that's a good thing.

        "If".

        • by Anonymous Coward on Tuesday May 15, 2018 @03:00AM (#56613388)

          They didn't. It's still vulnerable to Spectre and Meltdown. There simply wasn't enough time to modify the hardware. They are targeting Ice Lake for the hardware fixes.
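          For anyone wanting to check their own machine: recent Linux kernels (roughly 4.15 and later, plus distro backports) expose the mitigation status under sysfs. A minimal sketch that just reads those files, assuming a kernel new enough to have them:

```python
# Minimal sketch: report Meltdown/Spectre mitigation status on Linux.
# Assumes /sys/devices/system/cpu/vulnerabilities/ exists (kernel ~4.15+
# or a distro backport); older kernels simply won't have these files.
import glob
import os

for path in sorted(glob.glob("/sys/devices/system/cpu/vulnerabilities/*")):
    name = os.path.basename(path)
    try:
        with open(path) as f:
            status = f.read().strip()
    except OSError as e:
        status = f"unreadable ({e})"
    print(f"{name}: {status}")
```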

        • It's slower because the majority of what they produced got binned: there are more low-end, half-disabled CPUs coming off the line than fully functional ones.
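          To illustrate the binning argument, here's a toy Monte Carlo sketch; the defect rates are made-up assumptions for illustration, not anything known about Intel's 10nm line:

```python
# Toy illustration of binning: with a low-yield process, most dies come out
# partially broken and get sold as cut-down parts. All rates are invented.
import random

random.seed(1)
P_CORE_BAD = 0.25    # assumed chance any given core is defective
P_GPU_BAD = 0.60     # assumed chance the iGPU block is defective
DIES = 100_000

full, cut_down, scrap = 0, 0, 0
for _ in range(DIES):
    good_cores = sum(random.random() > P_CORE_BAD for _ in range(4))
    gpu_ok = random.random() > P_GPU_BAD
    if good_cores == 4 and gpu_ok:
        full += 1            # sellable as the flagship SKU
    elif good_cores >= 2:
        cut_down += 1        # e.g. dual core with the graphics fused off
    else:
        scrap += 1

print(f"full: {full/DIES:.1%}, cut-down: {cut_down/DIES:.1%}, scrap: {scrap/DIES:.1%}")
```

          Under these made-up numbers the line produces far more cut-down parts than flagship ones, which is the parent's point.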

        • The indicators point to process issues. Each shrink is a crapshoot, engineering-wise, and after many years of boxcars, Intel finally rolled snake eyes. In other words, Intel bet on some process technology that didn't perform to expectations, requiring expensive and time-consuming backtracking. Meanwhile, TSMC, Glofo and Samsung are moving more cautiously. AMD made a great call by focusing on multi-die SoCs.

          Maybe this means the era of Intel developing its own process tech and running its own fabs is coming to an end.

      • by bn-7bc ( 909819 )

        Well, does this generation perform better or worse than the previous one, pre or post the Spectre/Meltdown patches?

      • A Chinese retailer is selling 10nm Intel processors cheap.
        A Chinese retailer is also selling OtterBox cases for $1.50, and Genuine Applé McBook chargers for $12.

    • You're underestimating software bloat.
    • Not everyone needs to cough up $1900 for a CPU to have a computer that is usable to them

      How right you are, when a Ryzen Threadripper [newegg.com] will blow it away in throughput for half the price. :-)

  • Oh shit (Score:5, Insightful)

    by Anonymous Coward on Tuesday May 15, 2018 @02:07AM (#56613238)

    The free ride is over, software retards. You may actually have to start programming again, instead of creating multi-gigabyte copy-and-paste monsters that can't even keep up with typing at the keyboard, yet use 100% CPU on quad core machines.

    • by Anonymous Coward

      This Core i3 8121U chip has a TDP of 15Watts. I don't know if it can be fanless or not.

      But if Intel can come up with a fanless version (preferably with a GPU) with an even lower TDP, I would be willing to design a mini-ITX mobo for it.

      • This Core i3 8121U chip has a TDP of 15Watts.

        So... a bit like the Atom CPUs then.

        Move along, nothing new here.

        • Their current 15w quad cores are as fast as my 80-something watt 4th gen desktop i5. Wattage != performance
        • Except for not sucking like Atom. Core arch (basically, P3) is just better than Atom. They should kill Atom, it only exists for political reasons. There is no niche where Atom outperforms core arch in performance/watt except by market manipulation.

    • This.... Oh boy, this. I develop primarily in Java these days, mind you, so I know it's possible to make lean applications using Java (an entire multi-user government system using 80MB of RAM? No problem). And knowing this, I get to cry with anger when I see freaks using gigabytes of RAM to do more or less the same fucking job, simply because they throw everything and the kitchen sink (frameworks) at the thing just to have ONE function that they could simply write themselves.
    • Yep, who thought using JavaScript to develop editors and such was a good idea? It eats your CPU alive.
      I'd rather have software well designed and programmed in classic compiled languages.
  • Meltdown&co fixed? (Score:4, Insightful)

    by NuclearCat ( 899738 ) on Tuesday May 15, 2018 @02:29AM (#56613302) Journal
    It's one thing to produce performance-impacting patches for existing processors; it's quite another to manufacture and sell a defective processor with a vulnerability known before launch.
  • Why this is news (Score:4, Interesting)

    by AbRASiON ( 589899 ) * on Tuesday May 15, 2018 @02:30AM (#56613310) Journal

    This CPU is nearly 3 years late.
    Intel are having immense difficulty with the 10nm move down from 14nm (which was 'refreshed' twice).
    What this means is that other manufacturers are now genuinely catching up to Intel. As much as I didn't believe it, it does seem that TSMC (I think?) is now just about ready to start putting out 7nm products.
    (Note, they all bloody lie about the figures; TSMC's 7nm is basically about Intel's 10nm.)

    That does mean that AMD may soon be producing CPUs with transistor density and voltage requirements similar to Intel's, meaning the only advantage left is processor design, not manufacturing process.

    Regardless of AMD's improved competitive potential here, though, the concern is that the move from 22nm to 14nm to 10nm has been AWFULLY slow and it's one of the driving factors in why computer processing hasn't really improved hugely in the past 4 to 10 years. It's improved, but nothing at all like in the previous decade.

    If you're an enthusiast with a deep budget dying for top-of-the-line performance, this has been painful, as you upgrade every 18 months for 20% more speed instead of 70+%. If you're a homelab server nerd who wants to run a great little VM cluster on some mid-range, low-power chips, the chips you could've bought 3 years ago are probably still fairly viable compared to today's options.

    Intel has delayed the rest of their 10nm processors, I think until next year. That means the Intel 8700K six-core and the rumoured 8750 / 8900 (?) eight-core model (coming soonish) will probably be the best you can buy for the next 18 months. If you've been holding off upgrading, it may be worth considering.

    It kind of sucks. I'm in the 'want a nice, low power server, but still kinda powerful' camp: I don't want 85w of CPU in my cupboard, but I would like at least 6 half-decent threads. It's possible, but it would've been much more likely with the shrinks arriving on time.

    • by mentil ( 1748130 )

      I imagine the day when Intel goes fabless, perhaps spinning off its fabs like AMD did with GlobalFoundries. Doing all the die-shrink R&D just for x86 isn't going to stay profitable, and it will probably stop being profitable before ARM shrinks do (smartphone and PC shipments are both effectively flat now). We might go back to the days of the 8086, where a certain clock speed is effectively standard, and all that differs is how much RAM your system has. I imagine there will be some DRAM-only shrinks, since it's easier to get r

      • by Anonymous Coward

        Intel bought Altera, so the die shrinks are not only for x86. Top FPGAs are high-performance and very, very high-margin parts. They are also selling manufacturing capacity on older processes and making some networking gear themselves (cellular modems). Churning out new chipsets all the time consumes fab capacity nicely as well.

        DRAM will follow the way of 3D NAND in a few years - either that or just stacking of planar dies. We've already had experiments with HBM, but they turned out to be

    • by Anonymous Coward

      The "85w" is just the TDP which is only usually achieved with full core load and/or AVX2. You actually want a newer CPU instead of the older ones because Intel has made great strides in *idle* power optimizations. A home server running on Coffee Lake will be way less wasteful than a Sandy Bridge for example. Almost all the other chips on the motherboard are made in lower power processes as well (chipset, NIC, super-IO, etc). Those things add up.

    • by Artem S. Tashkinov ( 764309 ) on Tuesday May 15, 2018 @05:35AM (#56613728) Homepage

      the move from 22nm to 14nm to 10nm has been AWFULLY slow and it's one of the driving factors in why computer processing hasn't really improved hugely in the past 4 to 10 years

      I believe it's more about the limits of current technology. CPU frequency depends on the supply voltage, and since dynamic power consumption and dissipation scale with the square of that voltage, you cannot raise it arbitrarily unless you want your CPU to consume hundreds of watts. There's also the speed of light at play: you cannot raise the clock arbitrarily, because signals will not have enough time to traverse the chip within a cycle. Another issue is that the x86-64 instruction set is very difficult to optimize because the architecture is so old.
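      The voltage-squared point can be sketched with the usual dynamic-power relation P ≈ α·C·V²·f; the activity factor, capacitance and the assumption that attainable frequency scales roughly with voltage are illustrative guesses, not Cannon Lake figures:

```python
# Minimal sketch of the dynamic-power argument: P_dyn ~ alpha * C * V^2 * f.
# ALPHA and C_EFF are made-up ballpark values, not real chip parameters.
ALPHA = 0.2        # assumed average switching activity factor
C_EFF = 5e-8       # assumed effective switched capacitance, farads

def dynamic_power(voltage: float, freq_hz: float) -> float:
    """Dynamic (switching) power in watts for a given supply voltage and clock."""
    return ALPHA * C_EFF * voltage**2 * freq_hz

# If hitting a higher clock requires a proportionally higher voltage,
# power grows roughly with the cube of the frequency ratio.
base_v, base_f = 1.0, 3.0e9
for scale in (1.0, 1.2, 1.5):
    v, f = base_v * scale, base_f * scale
    print(f"{f/1e9:.1f} GHz @ {v:.2f} V -> {dynamic_power(v, f):.1f} W")
```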

    • Intel dropping the ball like this, to the point where TSMC, GF and Samsung seem to be genuinely catching up, causes me to scratch my head wondering how it's even possible. We're talking about the company with the best foundry facilities and some of the most talented people, so I can't imagine this is just plain incompetence or Intel not bothering to invest sufficiently.

      I can think of multiple reasons for this, but there isn't anything substantial in the public domain to
      • by Gr8Apes ( 679165 )
        You missed the base reason: physical constraints are likely limiting them, and the amount of effort required to get to the next die shrink is high enough that there's considerable room for others to appear to catch up. That's the rosy scenario. Given Intel's history of not having the smartest people in the room leading the charge (recall the P4 super-scalar pipeline and AMD64 incidents as cases in point), it's pretty easy to believe that someone's focus on being *right* may have yet again sent Intel down
        • the amount of effort required to get to the next die shrink is high enough that there's considerable room for others to appear to catch up

          It's not just that, it is also that they all buy their lithography equipment from the same supplier. [wikipedia.org] Nobody gets ahead of that.

          How interesting that the Dutch still dominate printing technology [wikipedia.org] 600 years down the road.

        • I intentionally left out the diminishing returns in manufacturing-related R&D because this applies to their competition just as much as it applies to Intel. As for the dead end that was the NetBurst microarchitecture, that was first and foremost them thinking they could build a really inefficient architecture (a massive 20-something-stage pipeline running at a frequency way higher than anyone else's) and make it work by having a way better manufacturing process.

          In the end Intel was able to stay competitiv
    • by I4ko ( 695382 )

      CPUs are fast enough. We don't need radical improvements every year. We need price cuts and security fixes. We don't need process shrinkage that urgently.

      • Speak for yourself. I need as much fast, cool and cheap as I can get.

      • Fast enough for you, not me, not many.

        High speed does not need to imply less security.

        Process shrinkage is the number one way to improve performance.

    • the move from 22nm to 14nm to 10nm has been AWFULLY slow

      It's because production costs increase exponentially with the number of multi-patterning steps needed to work around the resolution limit, and EUV [wikipedia.org] is really nasty stuff; it won't go through lenses, for one thing.
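      The compounding cost the parent describes can be seen with a toy model where each extra patterning pass adds wafer cost and multiplies in a yield penalty; every number below is made up for illustration, not real fab economics:

```python
# Toy model of multi-patterning cost: more passes mean a pricier wafer
# and compounding yield loss, so cost per good die climbs quickly.
BASE_WAFER_COST = 3000.0     # assumed non-litho wafer cost, $
COST_PER_PASS = 400.0        # assumed cost per extra litho/etch pass, $
YIELD_PER_PASS = 0.97        # assumed yield multiplier per extra pass
GOOD_DIES_BASE = 500         # assumed good dies per wafer with ideal yield

def cost_per_good_die(extra_passes: int) -> float:
    wafer_cost = BASE_WAFER_COST + COST_PER_PASS * extra_passes
    good_dies = GOOD_DIES_BASE * (YIELD_PER_PASS ** extra_passes)
    return wafer_cost / good_dies

for passes in (0, 2, 4, 8):   # e.g. single, double, quad patterning on key layers
    print(f"{passes} extra passes: ${cost_per_good_die(passes):.2f} per good die")
```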

  • Also (Score:5, Interesting)

    by Artem S. Tashkinov ( 764309 ) on Tuesday May 15, 2018 @03:11AM (#56613436) Homepage

    Most likely by mistake, last Sunday Intel released Z390 chipset information [guru3d.com]. The page has since been pulled down [intel.com], because this chipset was rumored to be accompanied by octa-core Coffee Lake CPUs which are yet to be announced.

    Next time I'm gonna web-archive their mistakes ;-)

  • That Intel invented all these different xx-nanometer manufacturing processes back in the 70's and 80's and has been steadily drip-feeding them to us in order to make the most profit. When the pace of their "progress" is so steady, they simply have to be drip-feeding. They could have released this processor back in the 386 days if they had wanted to. Imagine all the e-waste that would have been saved if they hadn't bothered with this tactic.
    • by Artem S. Tashkinov ( 764309 ) on Tuesday May 15, 2018 @05:01AM (#56613660) Homepage

      Have you seen their R&D expenditures?

      Designing a 14nm tech process in the 70's/80's was impossible because it has taken billions of dollars of investments and new technologies (some of which weren't invented at Intel) to get there. Also, considering that they've rehashed their 14nm tech process twice and their first 10nm part is a castrated 2core CPU minus iGPU, it surely looks like 10nm is extremely difficult/costly to get right.

      • by TeknoHog ( 164938 ) on Tuesday May 15, 2018 @05:15AM (#56613682) Homepage Journal

        Have you seen their R&D expenditures?

        Designing a 14nm tech process in the 70's/80's was impossible because it has taken billions of dollars of investments and new technologies (some of which weren't invented at Intel) to get there. Also, considering that they've rehashed their 14nm tech process twice and their first 10nm part is a castrated 2core CPU minus iGPU, it surely looks like 10nm is extremely difficult/costly to get right.

        Well, that's exactly what they want you to believe. When they say "R&D expenditure", they really mean "R&R expenditure".

  • by Opportunist ( 166417 ) on Tuesday May 15, 2018 @05:36AM (#56613730)

    We have a Chinese retailer claiming to sell a 10nm CPU that has the features (and probably the speed) of a 5-year-old low-budget processor. And since Chinese companies have a spotless track record of never trying to sell counterfeit products, we should readily believe that this seemingly ancient CPU is bleeding edge.

    I ... erh... well... how do you put it nicely...

  • by Anonymous Coward

    Given the recent rash of Intel architecture bugs, and that this is not a new architecture, which of the bugs we already know about are present in this "new" CPU?

  • ...yields in the 10nm fabrication are apparently too sketchy for a high-end/up-market release. Solution: disable the cores and features of the chip that don't work, and sell it cheap.

  • >We've never before seen Intel commercially use low-end processors to introduce a new manufacturing process.

    Yes we have. However, if you're only paying attention to the desktop CPUs, you might get that impression.

  • Hah, sure, Intel doesn't care about Bitcoin and mining difficulty. They just want to get some money from all those insane miners. Look, Intel offers wonderful new hardware with a carefully planned name - for mining. These words raise the cost of the processor automatically. Then miners all over the world buy the novelty, raising its cost again. It's clear profit for Intel. But I think that physical mining now is ineffective; we can't get enough power to compete with other players. Cloud mining s
