
Intel CEO Blames Company's Obsessive Focus on Capturing 90% CPU Market Share For Missing Out on Other Opportunities (wccftech.com) 101

Intel chief executive Bob Swan says he's willing to let go of the company's traditional dominance of the CPU market in order to meet rising demand for newer, more specialized silicon for applications such as AI and autonomous cars. From a report: Intel's Bob Swan blames the company's focus on 90% CPU market share for missed opportunities and transitions, and envisions Intel holding 30% of the all-silicon TAM instead of a majority of the CPU TAM. Just a few years ago, Intel owned more than 90% of the market share in the x86 CPU market. Many financial models used Intel's revenue as a proxy for the Total Available Market (TAM) of the CPU sector; with full-year revenue of $59.4 billion in 2017, you can estimate the total TAM of the CPU side of things at roughly $66 billion (2017 est.). Bob Swan believes that this mindset of protecting a majority share on the CPU side has led to Intel becoming complacent and missing out on major opportunities. He even went as far as to say that he is trying to "destroy" the thinking of having a 90% market share in CPUs, and instead wants people to come into the office thinking Intel has 30% market share in "all silicon."

Swan on how Intel got to the place where it is now: "How we got here is really kind of threefold. One, we grew a lot faster than we expected, and the demand for CPUs and servers grew much faster than we expected in 2018. You'll remember we came into 2018 projecting 10% growth and we grew by 21%, so the good-news problem is that demand for our products in our transformation to a data-centric company was much higher than we expected. Secondly, we took on 100% market share for smartphone modems and we decided that we would build them in our fabs, so we took on even more demand. And third, to exacerbate that, we slipped on bringing our 10nm to life, and when that happens you build more and more performance into your last generation -- for us, 14nm -- which means a higher core count and larger die size. So those three -- growing much faster than we thought, bringing modems inside, and delaying 10nm -- resulted in a position where we didn't have flexible capacity."
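The TAM estimate above is simple arithmetic: if $59.4 billion of revenue corresponds to roughly 90% of the market, the whole market is about $59.4B / 0.9, i.e. ~$66B. A minimal sketch of that back-of-the-envelope calculation in C (the two inputs are the figures quoted in the summary; nothing else is assumed):

#include <stdio.h>

int main(void) {
    /* Figures quoted in the summary above. */
    double intel_revenue_bn = 59.4; /* Intel full-year 2017 revenue, USD billions */
    double market_share     = 0.90; /* Intel's approximate x86 CPU market share   */

    /* Using revenue as a proxy for share: TAM = revenue / share. */
    printf("Estimated 2017 CPU TAM: ~$%.0f billion\n",
           intel_revenue_bn / market_share);
    return 0;
}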
  • Go AMD! (Score:5, Interesting)

    by war4peace ( 1628283 ) on Monday December 09, 2019 @11:19AM (#59501158)

    AMD had been lagging behind in the x86 CPU area until a couple of years ago, when they started churning out glorious CPUs. All my machines are slowly switching to AMD CPUs. My fiancée's mITX PC has a Ryzen 3600, my sister's build will have either a 3950X or a 2950X (whichever becomes available first when I start buying the components), and my main PC will get a Threadripper 3xxx (I'll upgrade sometime during fall next year). My freshly built NAS has a Ryzen 3 1200 in it as well. All my friends who upgraded their PCs during the last couple years have chosen AMD.

    Intel's dominance in the CPU market has two reasons, neither of which is related to performance. The first is their multi-year exclusive deals with companies such as Lenovo, Dell, HP, and Toshiba (using Intel CPUs in laptops), and the second is capacity. They can churn out a lot more CPUs.

    • by twocows ( 1216842 ) on Monday December 09, 2019 @12:03PM (#59501324)
      From what I've seen, Intel still seems to do better with single-threaded workloads, though thankfully AMD's finally competitive here with Zen 2, where Intel was the clear winner in years past. The bulk of PC games tend to be single-threaded workloads (exceptions: emulation, some MMOs), so it's definitely a case that matters.

      I'll probably buy AMD for my next system anyway because Intel honestly deserves what they're getting right now and I like that they don't change sockets every ten seconds. Maybe if Intel gets crushed hard enough, they'll start doing things the right way like AMD has been all along and we'll have two legitimate competitors.
      • Oops, I accidentally over-edited my post. I meant to say "the bulk of PC games are single-threaded, but GPU-limited anyway (exceptions: emulation, some MMOs)."
      • by Targon ( 17348 )

        Intel only has an edge in clock speeds at the very top end of their product line; for the rest of their chips, Intel is behind in terms of performance. Think about it: you hear about the 9900k and the 9700k, but for the rest of the Intel products, for the price, you end up with performance that isn't all that great. Reduced overall performance from security mitigations (due to all the security problems with Intel chips) means that in another two years, even the 9900k may end up being slower than the Ryzen 7 3800X.

        • Reduced overall performance from security mitigations (due to all the security problems with Intel chips) means that in another two years, even the 9900k may end up being slower than the Ryzen 7 3800X.

          BZZZZT! AMD's high-performance, out-of-order (OoO) chips have the same class of Spectre bugs that every other OoO design baked into their CPUs: ARM, IBM (both mainframe and POWER), MIPS, and SPARC. That was official in the very first Spectre paper; a sketch of the canonical gadget follows below.

          No one has really cared that much about AMD's chips be…
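          For reference, the class of bug in question is the bounds-check-bypass gadget (Spectre variant 1) from the original paper, and it is vendor-agnostic. A minimal sketch in C; the array names and sizes mirror the paper's illustrative example rather than any real codebase:

          #include <stddef.h>
          #include <stdint.h>

          uint8_t array1[16];
          size_t  array1_size = 16;
          uint8_t array2[256 * 512];

          void victim(size_t x) {
              if (x < array1_size) {
                  /* With the branch predictor mistrained to predict "taken",
                     an out-of-order core executes this load speculatively even
                     for out-of-bounds x; the secret byte array1[x] selects
                     which cache line of array2 gets warmed, and a later timing
                     probe can recover it. */
                  volatile uint8_t t = array2[array1[x] * 512];
                  (void)t;
              }
          }

          Any core that speculates past the bounds check -- Intel, AMD, ARM, POWER, or otherwise -- leaves that secret-dependent cache footprint.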

      • by AmiMoJo ( 196126 )

        Most PC games are multithreaded now, since all the consoles have been multicore since the 2000s. It's just that there tends to be one thread that is the limiting factor (if not the GPU), so single-threaded performance can help... for peak FPS. (That's essentially Amdahl's law; see the sketch below.)

        If you look at average FPS and the 99th percentile, AMD chips are very competitive. Also, when you look at bang-for-buck metrics, you can't beat AMD.

        For most people the best thing would probably be to get an AMD CPU and spend the savings on a better GPU, then a few years later…
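        To make the "one limiting thread" point concrete, here is a minimal Amdahl's-law sketch in C; the 30% serial fraction is a made-up illustration, not a measurement of any actual game:

        #include <stdio.h>

        int main(void) {
            /* Amdahl's law: speedup = 1 / (s + (1 - s) / n), where s is the
               fraction of each frame's work stuck on one thread and n is the
               core count. s = 0.30 is purely illustrative. */
            double s = 0.30;
            for (int n = 1; n <= 16; n *= 2) {
                double speedup = 1.0 / (s + (1.0 - s) / n);
                printf("%2d cores: %.2fx frame-rate speedup\n", n, speedup);
            }
            return 0;
        }

        The returns flatten fast (16 cores buy only ~2.9x at that serial fraction), which is why per-core speed still decides peak FPS.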

    • by Kjella ( 173770 )

      Intel's dominance in the CPU market has two reasons (...) the second is capacity. They can churn out a lot more CPUs.

      Not in the last decade. During the bad years between Bulldozer and Zen, AMD had to pay several penalties because they couldn't sell enough chips to buy all the wafers they had committed to. They had the capacity but not the customers. Intel played dirty too, but from 2006 (Core 2) to 2011 (Sandy Bridge) they made huge [anandtech.com] strides technologically; it's only after they saw that Bulldozer was a dud that they started taking their foot off the gas pedal. I loved the Athlons of the early 2000s as much as anybody, but th…

      • The inability to command fab capacity is one of the implications of their decision to go fabless

        • Why? Contracting external manufacturers brings much more flexibility regarding production volumes. If they used their own fabs they would always have the same capacity, whether they made a good or a bad CPU.
          • Contracting external manufacturers brings much more flexibility regarding production volumes.

            Until, as a hypothetical, the external manufacturer says something like, sorry, Apple has already reserved the extra capacity you now desire, we'll pencil you in for half of what you want next year.

            AMD's most recent really big success period was due to superior high-level architecture decisions (HyperTransport plus direct memory attachment to CPUs vs. a FSB, and the AMD64 macroarchitecture when Itanium AKA IA-64 failed)…

            • Well, in the past Intel was ahead of everyone because they made the most money with their quasi-monopoly. Only because of this were their own fabs an advantage.
              Before their good chips, AMD would just have wasted lots of money on their own fabs because they would have been underutilized. The fine they paid is probably not so much compared to that cost. And they would have been unable to finance the upgrade to "7nm", and then be stuck with a good processor design but not enough production capacity to make money…
              • Well, in the past Intel was ahead of everyone because they made the most money with their quasi-monopoly. Only because of this were their own fabs an advantage.

                Please "show your work", how does being the most technologically advanced happen only because of a financial advantage, which you're absolutely sure no one else was able to muster during this history (once upon a time VCs funded hardware as well as software companies)? Intel is showing right now that all the money in the world can't make their "10 n

                • Wow. Where do you see an "Intel evil, AMD virtuous" worldview?

                  Upgrading to new nodes has become more and more expensive over time. Can you show me a company which jumped ahead of everyone with a new node without investing huge amounts of money? And why should that suddenly happen for AMD? Fab upgrades are not really their area of expertise.

                  The money to pay for these upgrades must come from somewhere. It requires production volume, or a really revolutionary chip. Having investors does not change that; AMD w…
            • by Kjella ( 173770 )

              Many things could make this a temporary advantage, like Intel succeeding in their "7 nm" node (but I'm not betting on that one)

              They could strike back, but I'm pretty sure the window of opportunity where Intel could get so far ahead as to drive AMD out of business has closed. There aren't enough silicon generations left before both Intel and TSMC hit some kind of physical limit (a silicon lattice is about 0.5 nm wide; it simply can't go on), and then I suspect they'll end up more like Airbus and Boeing: both making airplanes, but neither so revolutionary that it takes the whole market.
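              As a rough sanity check on "not enough silicon generations left", here is a sketch in C. It naively takes the "7 nm" marketing name at face value; real feature sizes are larger, so if anything this overstates the remaining runway:

              #include <math.h>
              #include <stdio.h>

              int main(void) {
                  /* Node names no longer track physical feature sizes, but
                     even taken literally there are only a few halvings left
                     before the ~0.5 nm lattice scale cited above. */
                  double node_nm    = 7.0; /* leading-edge "7 nm" class node */
                  double lattice_nm = 0.5; /* silicon lattice width, per the comment */

                  printf("~%.1f halvings from \"7 nm\" to the lattice limit\n",
                         log2(node_nm / lattice_nm));
                  return 0;
              }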

        • The inability to command fab capacity is one of the implications of [AMD's] decision to go fabless

          I think they were sort of forced into it by very bad decisions and mistakes they made right after beating Intel across the board. But whatever the reasons, their ability to bid against others for TSMC capacity will be a ceiling on what they'll be able to accomplish. Of course, it's their competitor being entirely unable to use its equivalent process node that gives them something of an advantage for now…

    • Intel did shit like charging for RAID keys and capping PCIe lanes on high-end CPUs.

      BS like: last gen the $500 CPU had full lanes; next gen the $500 one was capped and you had to get the $900 one to get all the lanes.

    • Re:Go AMD! (Score:5, Interesting)

      by UnknownSoldier ( 67820 ) on Monday December 09, 2019 @12:46PM (#59501568)

      > All my friends who upgraded their PCs during the last couple years have chosen AMD.

      Same. Everyone I talk to is super excited about Ryzen and Threadripper; none of my computer friends are talking about Intel chips. All we have seen from Intel is incremental upgrades -- quite a few of us are still on the 3770K / 4770K / 5770K era and have seen ZERO reason to upgrade -- until now! I predict the R9 3900X is going to become the new octa-core -- similar to how the i7 4770K was an extremely popular quad-core.

      I just bought another Threadripper 1920X (12C/24T for $200!!!) and have a TR 3960X on (pre)order.

      > Intel's dominance in the CPU market has two reasons

      I would humbly add Lies of Marketing as reason #3.

      Intel's shenanigans amount to outright lying [youtube.com] (Intel's Disgraceful Marketing Gets Worse). When even Linus flames Intel [youtube.com] over the Core i9 10980XE, you know it's bad. The best YT comment was this:

      You know you seriously messed up when a Canadian is mad at you…

      • by sinij ( 911942 )

        quite a few of us are still on the 3770K / 4770K / 5770K era and have seen ZERO reason to upgrade -- until now!

        God damn it! I was hoping for another couple of years of not paying for CPU/mobo updates. Still on DDR3 here, so a full build will probably be necessary.

          • Even though the i7 3770K / 4770K / 5770K are getting a little old, they are still decent chips. It REALLY depends on what you're doing.

            The biggest problem with the R9 3950X, TR 3960X, and TR 3970X is availability! (Some might argue the high price(s), but these are HEDT -- they are NOT meant for casual users / gamers. I will admit the price of 3rd-gen Threadripper mobos being north of $500 is annoying. People eBaying them is just outright robbery.)

            Basically, bookmark this price summary and availability [nowinstock.net] meta…

          • by sinij ( 911942 )
            Thank you for the info dump. I am so used to the "nothing happens, thanks to Intel" state of things that I stopped keeping up with changes.
          • But yeah, the trinity of hardware upgrades is CPU, RAM, and motherboard IMHO

            PSU as well, in most cases, because you really don't want your new hardware running on a 7-year-old 450W PSU, which might or might not provide all the specialized pinouts, such as the 8-pin CPU power connector. Add a new case too if you cheaped out in the past.

            • A new case is definitely a nice QoL upgrade!

              From my experience I haven't had any issues with PSUs, as I tend to buy 750W - 1,000W at 80+ Gold or better. E.g., I just ordered a 1,000W Titanium PSU for the TR 3960X.

              Corsair has an older but good PSU guide on efficiency [corsair.com].

    • I have never found a fanless Mini-ITX board with an AMD CPU.

    • Intel's dominance in the CPU market has two reasons, neither of which is related to performance.

      Sorry, but that is just false. Intel's dominance has a lot to do with performance as well. CPUs and computers aren't just tossed out like iPhones once a year. AMD being "back in the game" is a recent phenomenon, and there's a shitton of Intel CPUs from the days when AMD had no viable alternative that will keep ticking for many years. And as it stands, Intel still holds the single-threaded performance crown, though that is becoming less and less relevant.

      Capacity also has little to do with it. You say…

  • Stagnation (Score:5, Interesting)

    by nbritton ( 823086 ) on Monday December 09, 2019 @11:27AM (#59501196)

    Yeah, he is saying that because they are getting massacred by AMD’s EPYC 2nd Gen Rome processors and they have no viable way to compete with them for the foreseeable future. For instance, one of the road maps I saw said that Intel wasn’t even contemplating a migration from PCIe 3.0 until 2021 at the earliest. AMD already has PCIe 4.0, which is 252 gigabit/s for x16 lanes. This is a problem for people like me because I’m already deploying 200 GbE cards.
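    The 252 gigabit/s figure checks out: PCIe 4.0 signals at 16 GT/s per lane with 128b/130b encoding, so an x16 link carries about 16 x 16 x 128/130 = 252 Gbit/s of payload. A quick sketch of the arithmetic in C:

    #include <stdio.h>

    int main(void) {
        /* PCIe 4.0: 16 GT/s per lane, 128b/130b line encoding. */
        double gt_per_lane = 16.0;
        double encoding    = 128.0 / 130.0;
        int    lanes       = 16;

        double gbit = gt_per_lane * lanes * encoding; /* payload, Gbit/s */
        printf("PCIe 4.0 x16: ~%.0f Gbit/s (~%.1f GB/s)\n", gbit, gbit / 8.0);
        return 0;
    }

    That is comfortably more than a 200 GbE card needs, while PCIe 3.0 x16 tops out at roughly half of it.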

  • These single-focus programs turn big companies into idiots. When Google made overpowering Facebook their main goal, their applications started cross-leaking personal info like the diaper of a 50 lb baby.

    When Microsoft became "tablet UI or bust", they made the Frankenstein OS Windows 8, which did neither finger nor mouse well: "Minger".

    And once IBM made revenue streams their primary goal, customer service took a shot through the heart as a result. Sure, they got a short-term revenue boost, but customers star…

  • They're still selling a microarch from 6 years ago because they are still dealing with supply issues on 14nm. What is going on? First they had problems making 10nm viable and slipped schedules. Now the working 14nm node is unable to keep up with demand, which has actually gone down in some segments due to AMD's competition and remained stagnant in others. They can supply fewer 14nm chips today than they could in 2017. It's almost as if their fabs are being sabotaged.

    • They can supply fewer 14nm chips today than they could in 2017. It's almost as if their fabs are being sabotaged.

      See my first comment [slashdot.org]: Intel has a very strong self-sabotage program with stack ranking. And there are more reasons, which can't be discussed on a forum like Slashdot.

      But the main reason would be Intel inside not admitting that their "10 nm" node just doesn't work at all, and it's now clear won't ever, and converting their "14 nm" fab lines, either to produce support and other chips, since "10 nm" was supposed to replace them for CPUs, or maybe to switch to "10 nm" or "7 nm".

      • But the main reason would be Intel inside not admitting that their "10 nm" node just doesn't work at all, and it's now clear won't ever, and converting their "14 nm" fab lines, either to produce support and other chips, since "10 nm" was supposed to replace them for CPUs, or maybe to switch to "10 nm" or "7 nm".

        I'll have to review ASML's press releases, but I didn't think they had shipped many EUV scanners to Intel yet. TSMC and Samsung have bought over 50 between the two of them, though. Still, an overzealous conversion to 10nm sounds consistent with the habits of a company that has rapidly changed nodes for the last 40 years.

        See my first comment [slashdot.org]: Intel has a very strong self-sabotage program with stack ranking. And there are more reasons, which can't be discussed on a forum like Slashdot.

        Not surprising. They've had this problem for decades. Vinod Dham didn't stick around long after Pentium.

      • But the main reason would be Intel inside not admitting that their "10 nm" node just doesn't work at all, and it's now clear won't ever

        To be clear, Intel's failure on 10nm had to do with blowing 5+ years trying to get their 3D Trigates to be practical at 10nm. Even at 14nm their yields were bad with Trigates. One can only imagine how bad they were at 10nm.

        To be even more clear, Intel's 14nm no longer uses 3D Trigates either. After wasting 5+ years trying to make Trigates something other than a dead end, they had to suck it up and are now desperately playing catch-up, as more than one rent-a-fab has better FinFETs than Intel now.

        The…

        • What's inherent in Intel's "10 nm" that forced them to use 3D Trigates and prevented them from using FinFETs? What's inherent in their "7 nm" process node that ... I don't even get your point: you're saying that at the same time Intel is desperate for 14 nm CPU fab capacity and is using every bit it can scrounge up, since they can't make their "10 nm" work at all, it's guaranteed to build too many "7 nm" EUV-based fab lines?
          • A "3D Trigate" is a finFET. OP has no idea what they are talking about.

            Specifically, trigate means the gate runs up both sides and also across the top of the fin, all connected together. The other producers of finFET processes do that as well, and their fins look roughly the same. Theoretically you could skip running the gate across the top; then the two sides could be electrically isolated and you could even drive them separately, but no one really does that at volume.

    • It's almost as if their fabs are being sabotaged.

      Meredith probably went behind Bob's back and changed the specs on the Malaysian production lines.

  • by mangastudent ( 718064 ) on Monday December 09, 2019 @11:33AM (#59501220)

    Nothing will save you if you completely blow a transition to the next process node, as Intel has done with their "10 nm" node (all 193nm lithography, but more aggressive than TSMC's initial all 193nm "7 nm" node).

    Nothing will save you if you're completely incompetent at doing other types of chips, as I've read Intel has been with cellular modems. They've never been able to meet their promises to customers, and have finally given up on the market.

    Nothing will save your efforts to broaden your product lines if you earn a reputation for giving up on non-core ones before they have a chance to make it. Intel's done this at least a few times; I don't know if they've lost enough trust in the process.

    Nothing will save you if you adopt stack ranking to fire 10% or more of your employees every year; we're seeing the end game of this with GE, and we know about its sad history at Microsoft after Ballmer removed all the safety mechanisms Gates had built into the system. It means employees can't plan on having a career at the company, for one bad relationship with a new manager means you're out. If the granularity is small enough, it means you can't assemble teams of superstars, because no matter what their manager wants, he will have to fire 10% of them every year. All in all, it makes employees focus on the politics of survival instead of doing what the company ostensibly is supposed to do.

    • by Junta ( 36770 ) on Monday December 09, 2019 @11:40AM (#59501240)

      Agreed. They only retroactively decided they were too obsessed with CPU market share when AMD came along to disrupt it.

      The problem is that despite earnest attempts to diversify, they always failed: IoT chips, mobile devices, networking chips, complete systems, wireless modems, Omni-Path, Optane, McAfee, HPC software. I would in fact say I've seen more evidence of the opposite: that they were so distracted by trying to diversify, secure in their unassailable PC market position, that they failed to keep up with CPU technology.

      The truth is that their non-CPU efforts have sucked. Generally speaking it's not from lack of dedication or investment; it's because they simply lack the competencies and they don't even have a way of knowing how to get the competencies. Their failures have led to a reputation for giving up, and that has hamstrung any future legitimate advances, but the efforts failed in the first place because of more fundamental limitations than obsession with CPUs.

      • The truth is that their non-CPU efforts have sucked. Generally speaking it's not from lack of dedication or investment; it's because they simply lack the competencies and they don't even have a way of knowing how to get the competencies.

        I have to wonder about their embedded efforts, which they abruptly exited not long ago, the Galileo and Edison. They are conceptually enough like their high-end CPUs and support chips that they should have had the competencies necessary to make them worthwhile to use, but they failed to create enough of an ecosystem, like sufficient documentation or something essential at that level, or to follow up to fix the inevitable problems found in what they delivered. This would seem to be a lack of dedication and/or investment, instead of competency.

        • by timholman ( 71886 ) on Monday December 09, 2019 @12:11PM (#59501368)

          I have to wonder about their embedded efforts, which they abruptly exited not long ago, the Galileo and Edison. They are conceptually enough like their high-end CPUs and support chips that they should have had the competencies necessary to make them worthwhile to use, but they failed to create enough of an ecosystem, like sufficient documentation or something essential at that level, or to follow up to fix the inevitable problems found in what they delivered. This would seem to be a lack of dedication and/or investment, instead of competency.

          We were given several Galileo boards for educational evaluation. Over a year's time, I repeatedly emailed Intel to try to obtain long-promised add-on boards that never showed up. Every time I did, the person I had been corresponding with previously had moved on to another group, and a new person said the boards would be there soon. And on and on ...

          Apparently the number one priority of every engineer in the Galileo product group was to get out of it as quickly as possible. It didn't surprise me in the least when Intel abandoned the entire product line.

      • I would point out Intel have a long, very long line of failed new CPU architectures, Itanium being the most prominent. They got lucky with x86 and used the profits to move ahead of everyone else in fab process, making up for the rubbish CPU architecture in the meantime. Note that even some of their x86 architectures were rubbish too (here's looking at you, NetBurst). Intel are and have always been a one-trick pony, and that pony is x86 processors.

        • You're confounding macro with microarchitectures. Not counting microcontrollers, their macroarchitectures are the 4004 through AMD64, iAPX 432 (first 32 bit in 1981, failed), i860 and i960 (RISC, first failed, second successful), and Itanium (VLIW, another concept that failed due to the "a sufficiently smart compiler" conceit).

          Lots of these succeeded entirely on their own merits, like the early microprocessors, which among other things launched the personal computer with the 8080, and the i960, which was ver…

    • Clearly something went terribly wrong, especially considering they didn't manage to at least make new archs on 14nm. They must have kept thinking "Next year it is certain 10nm will work!" because had they known how things would turn out, surely they could have made something more exciting than Skylake on their 14nm+++ node.

      One thing is that Moore's law is slowing, but another is that absolutely nothing is happening on the arch/feature side. True for both AMD and Intel. Buy a CPU today and get the sa…
      • Clearly something went terribly wrong, especially considering they didn't manage to at least make new archs on 14nm. They must have kept thinking "Next year it is certain 10nm will work!" because had they known how things would turn out, surely they could have made something more exciting than Skylake on their 14nm+++ node.

        There's clearly some very nasty internal politics going on, including the unwarranted optimism you guess at, preventing the microarchitecture people from putting the next Ice Lake major m…

  • by 140Mandak262Jamuna ( 970587 ) on Monday December 09, 2019 @11:37AM (#59501234) Journal
    Now, dear candidate, what are your negative points?

    Sir, I work too hard and neglect my family. I often forget to file expense reports. I have this nasty habit of immediately doing stuff the boss orders without first worrying about ethical guidelines and other such issues.

  • They are already dominating Intel, but if they steer more focus, AMD will take over indeed.
  • Keep your eye on the ball and not the scoreboard.
  • by timholman ( 71886 ) on Monday December 09, 2019 @11:53AM (#59501278)

    It's one thing for Intel to declare that CPU dominance doesn't matter so much, and another thing entirely to translate those words into actions.

    Intel lives to sell high-margin chips. Their forays into low-cost computing (e.g. Galileo) have been an unmitigated disaster. The entire corporate culture revolves around rewarding those who design and sell high-end products, not commodity components.

    For Intel to give up their obsession with CPU dominance in favor of low-margin products makes as much sense as Mercedes-Benz declaring that they will compete in the economy car market. It's not going to happen.

    • For Intel to give up their obsession with CPU dominance in favor of low-margin products makes as much sense as Mercedes-Benz declaring that they will compete in the economy car market. It's not going to happen.

      Yes, the A series never happened.

      • Yes, the A series never happened.

        With an entry-level MSRP of nearly $33K, the A series subcompacts are not economy cars, except in comparison to other Mercedes-Benz models.

        Let me know when Mercedes-Benz starts competing in the same market as the Honda Fit and the Toyota Yaris.

        • The 2001 A series did; the new version knows not to tread too low again, especially after the spectacular failure of the moose test.
      • The A series is the least-expensive Mercedes, sure.
        And you can get a Honda or Toyota for half the price.

        I don't think Mercedes is focused on the economy car market.

  • specialized silicon chips

    The Microsoft philosophy, which Intel adopted, was to implement everything with (proprietary) O/S drivers. All you need is the bare minimum A/D hardware and everything else will be taken care of by the CPU.

  • "Willing to let go"? (Score:5, Interesting)

    by gweihir ( 88907 ) on Monday December 09, 2019 @12:20PM (#59501420)

    More like desperately outclassed, and knows it. It is absolutely astonishing that AMD, with a far smaller budget, fewer people, and no fabs of its own, can humiliate Intel in this way. (And they just did it for the 2nd time, too.) Intel is fat, lazy, and stupid, and does not care one bit about its customers. There is no other explanation.

    Also, Intel is traditionally a memory company, while AMD grew with signal processors. May explain why AMD engineering is so often better, for example with a CPU-integrated memory controller years ahead of Intel. The only edge Intel ever had was a better manufacturing process and some fundamental dishonesty that allowed them to cut corners that should not be cut.

    Now, I do not hope Intel dies; that would probably be too much incentive for AMD to get lazy as well. But cutting them down to the size they deserve, around 30% of the CPU market, with another 30% for AMD and the rest from others, would be an excellent long-term situation. That is, if they ever manage to create a competitive, secure CPU design again.

    • May explain why AMD engineering is so often better, for example with a CPU-integrated memory controller years ahead of Intel. The only edge Intel ever had was a better manufacturing process and some fundamental dishonesty that allowed them to cut corners that should not be cut.

      There are different levels of engineering that are relevant here. Intel for a very long time has done very poorly at the highest levels of engineering architecture decisions, like their long obsession that they would have trouble att…

    • by leonbev ( 111395 )

      Yeah, this sounds like CEO-speak for "We screwed up and let AMD leapfrog us on desktop and server processors. Now we need to greatly increase our R&D budget to catch up, and that's going to hurt our quarterly earnings for the next year or so."

    • It is absolutely astonishing that AMD, with a far smaller budget, fewer people, and no fabs of its own, can humiliate Intel in this way.

      It's not astonishing at all. If you outsource a large part of production and R&D and then simply exclude those numbers, the picture may look bad, but the reality is that unless you combine AMD and TSMC together, you are comparing apples to the entire bloody apple tree.

      Also, Intel is traditionally a memory company, while AMD grew with signal processors. May explain why AMD engineering is so often better, for example with a CPU-integrated memory controller years ahead of Intel.

      It's debatable just how "ahead" they actually are. It was only a few weeks ago that AMD actually managed to surpass Intel in memory bandwidth at the upper end, and as for memory latency, even their latest Ryzens are still well behind Intel on that front…

      • by gweihir ( 88907 )

        You really have no clue what you are talking about. That is not "calling my bullshit"; that is just disgracing yourself publicly.

  • Holy hell, I was afraid the CPU market might become a monopoly, but I didn't think it would be AMD winning out.

    Nor did I think that I would wish for Intel to make a comeback to keep the market from that.

    • Intel has more than enough resources to develop a new architecture and return to the fight. We might not see them for 2 years, but they'll be back. The desktop market is too important to just cede.

      • We might not see them for 2 years, but they'll be back.

        With turnaround times of about 5 years between initial design and production, you are right about 2 years, but only for the transition to a smaller node. Intel realized its failure to solve the yield problem about 3 years ago, as evidenced by their massive layoffs and P.R. about their "New Cloud Strategy."

        Yes, Intel knew it before AMD released their first chiplet designs. These companies can't keep secrets from each other. AMD was ~4 years into chiplets before Intel's first major layoffs, where the ~4…

  • Otellini missed the opportunity to be the chip in iPhones "because their data said there wasn't a large market." That meant they missed not just the mobile market, but the market for low-power, low-price CPUs. Purchases like McAfee diagnosed their problem, the need to differentiate, correctly, but the solution wasn't a match. Then, the move to the cloud. No one cares if there's "Intel inside", unless it's a worry about Spectre. They are now one of those suns that will emit light for a long period of time, but they're effectively at the end of their lives.
    • No one cares if there's "Intel inside", unless it's a worry about Spectre. They are now one of those suns that will emit light for a long period of time, but they're effectively at the end of their lives.

      Every single out-of-order (OoO) design has Spectre bugs, and that name was chosen because these bugs will be haunting us for a very long time. (Even the Meltdown mistake was shared by ARM and IBM, both mainframe and POWER.)

      For cloud vendors, there are plenty of differences between AMD and Intel, and AMD isn't…

    • At least they got will.i.am to be Director of Creative Innovation...

  • Intel deserves scorn (Score:4, Interesting)

    by presearch ( 214913 ) on Monday December 09, 2019 @02:06PM (#59501914)

    When I was at Intel, I was working on the Infotainment project, getting Intel chips into cars for GPS, radio, streaming, etc. I found a bug in the FM hardware that would only tune in every 3rd station. I got proof of the domain of the bug and reported it to the manager (one of the old-line good old boys). He told me to go up to the top floor, find any station that worked, forget the bug, and send the dev kits off to the customers. Although Intel culture says to get things right, the reality was that my small pushback got me on the shit list and out of the group in short order. The project apparently died anyway, as most all non-CPU things do there. Intel is a pit of backstabbing vipers and marketing weasels. The firehose of CPU money is the only thing that keeps them alive.
    There was a digital signage project whose only purpose was to dump overstocked low-binned Core2 CPUs on an unsuspecting market. The dev kit -never- worked. It was so slow, it couldn't get out of its own way. Yet Intel told customers they were custom-built chips made for signage. Lies.

    On the other hand, I was in the lab with a guy tasked with bringing up the Atom. I got to see the first time one ever booted. He used Doom as the first test suite.

  • They spent over $2 mil to have Orange County Choppers build an Intel bike.
    They kept it behind glass at the Chandler AZ facility. The electronics were all empty fakes.

    It did have a nice paint job. I wonder where it ended up?
    Paul Otellini was a buffoon.

  • So the message here is: get into other businesses, but sell them off before they can deliver, and stick to CPUs. Got it. I think...

  • They also "forgot" to mention that their market share has shrunk to mere 18% in some segments, such as: https://wccftech.com/amd-decim... [wccftech.com]. The OEMs are slower in their moves than people making their own builds, but one can see more and more AMD-based computers on the market.

    When going to pretty much any dealer and checking the most popular CPUs, the top 2-5 spots are almost always AMD's. So not only did they forget goals other than 90% CPU market share, they also slipped on that goal, and it looks like their…
