Intel Vows Better Communication With Partners About CPU Shortage (crn.com) 34
Intel's channel organization is vowing increased communication and transparency with partners on issues such as the current CPU shortage, which has caused delays, price hikes and other challenges this year. From a report: In an exclusive interview with CRN, Todd Garrigues, director of partner sales programs at Intel, said better transparency about supply issues, new business opportunities and new technologies is one of the company's top priorities for partners heading into 2019. "We got some feedback -- some critical feedback if I'm honest -- from some partners through our advisory boards, and we're working hard to make sure we do better at that," he said. "The request, bluntly, was just to work harder at being transparent as close to real time as possible. And we took that to heart -- a lot of internal discussions on how we enable that."
One of the challenges, Garrigues said, has been engaging with Intel's broader base of partners that the company may not have one-on-one relationships with. To mitigate the issue, the Santa Clara, Calif.-based company is investing more in its relationships with distributors to boost Intel's signal. "One of the big priorities I've placed on this year is really working very close with our distribution partners who do serve that broad channel base more directly," said Jason Kimrey, Intel's U.S. channel chief. "I would tell you that we are having much more direct, open transparent dialogue with them to help them plan and help our mutual customers plan to roadmaps and plan around the supply."
LOLZ, typical empty suit-think (Score:3)
If you can't supply the components, your competitor will get more business.
You can't put "transparency" on a motherboard and ship it in a system. Talk in lieu of product means nothing.
Re: LOLZ, typical empty suit-think (Score:1)
What does it mean to have internal discussions about transparency?
Re:LOLZ, typical empty suit-think (Score:4)
Not necessary. In the corporate world, for 98% of people it's Intel or nothing. Corporate managers and bean counters are more than happy to line up in the Intel queue like cattle, and wait. To them, AMD means second-rate crap. And those of us who know better recommend Intel anyway, because "nobody ever got fired for recommending Intel."
Same thing in the consumer market. People who know nothing about computers but just know they need one will pick Intel over AMD. They do so because they know the name.
Re: (Score:2)
I want to buy AMD but Apple only uses Intel for their Macs!
At least I know I'll be able to dump Intel in a few years when Apple switches to their own laptop/desktop-grade ARM processors.
Re: (Score:3)
I'll believe this rumored switch away from Intel when I see it. Apple has switched CPU architectures twice already, and it hurt them pretty badly. The switch to Intel wasn't as bad as the clusterfuck that was the switch to PowerPC.
But if they do switch, I doubt it will be to an ARM-based processor. Apple has a pretty good chip in the iPhone. If they were to scale it up to desktop specs, it would be a pretty nice CPU for Macs.
Re: (Score:2)
Bean counters can be swayed by certain arguments that can now be made for some of the AMD chips. Money talks, and I think we'll be seeing more AMD in the corporate space.
intel is melting down time to go AMD! (Score:2)
intel is melting down time to go AMD!
Re: (Score:2)
AMD isn't completely secure either, I'm building a beowulf cluster of ATmega328p!
Re: (Score:2)
The model lineup is simply too large, designed to inflate prices for the newer high-end CPUs, and that is crippling production.
"vows" = does not give a shit and lies about it (Score:2)
Just the usual dishonesty you can expect from Intel. In addition, you can expect their CPUs to be overpriced, backdoored and full of critical security problems. You know, the usual things corporations with a dominant market position do because they have long since stopped caring about their customers.
Re: (Score:2)
I don't see how transparency is compatible with total omerta regarding Cannon Lake.
Here We Go Again... (Score:3)
Graphic Cards were in short supply because of crypto-coin mining. They just couldn't make enough cards to keep prices from zooming up. A bit after that it was RAM. Oh, all the fabs switched to NAND flash, so no RAM for you and it's going to cost. Now Intel hasn't enough 14nm capacity to keep up with processor demand?
All market manipulation. All of it.
Re: (Score:2)
Intel bet everything on their 10nm process, and now they are faltering. Many mocked AMD for getting out of fabbing their own CPUs, but now it looks like they made the correct decision after all. Intel needs to either get out of the fab business ASAP, or get their 10nm process fixed stat. Otherwise they're going to fall behind AMD bigly.
Re: (Score:1)
Intel manufactures lots of chips, and the shortage will be much worse if they close their fabs. Just type lspci (in a terminal under Linux) on most Dell products and you'll see what I mean. Most of these chips (non-CPU) are probably made on older processes, where the cost of 14nm cannot be justified.
Their 10nm process is a failure; it's too late to fix it before the next process is ready, and the investment to transform enough production lines to make their current 10nm process mainstream would never be reco
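The lspci suggestion above can be sketched as a one-liner. The device listing below is a hypothetical sample of what lspci might print on a typical Dell machine (not output from real hardware), fed through the same grep filter you'd use on the live command (`lspci | grep -ci intel`):

```shell
# Count Intel devices in a hypothetical lspci listing.
# On a real Linux box, replace the heredoc with the live command: lspci
cat <<'EOF' | grep -ci 'intel'
00:00.0 Host bridge: Intel Corporation Host Bridge/DRAM Registers
00:02.0 VGA compatible controller: Intel Corporation HD Graphics 630
00:14.0 USB controller: Intel Corporation Sunrise Point-H USB 3.0 xHCI Controller
01:00.0 Network controller: Qualcomm Atheros QCA6174 802.11ac Wireless Adapter
EOF
# prints 3
```

Even in this small made-up sample, three of the four PCI devices are Intel silicon, which is the point: the CPU is only one of many Intel chips on a typical board.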
Too little, too late (Score:3)
You can announce better communication and publish roadmaps, but that doesn't mean you'll be able to stick to them. The past is proof enough.
Take Apple, for example. There's been no official announcement (and there never will be one, right up until the machines launch), but we all know Apple is working on breaking away from Intel CPUs.
Take Microsoft as another example. They're already pushing a transition to ARM CPUs: they have Windows running on ARM and are already selling hardware that doesn't use Intel CPUs.
If there's one thing you can be sure of it's that Intel's days are numbered*.
* I mean, Intel uses calendars just like the rest of us, right?
Re: (Score:1)
And good riddance. Don't forget that Intel was never able to win or create a market on its own; it was handed to them by IBM on a silver platter. Every attempt Intel has made to move away from x86 has failed: iAPX432, i860, and Itanium. The last one has been an EPIC fail (EPIC being the acronym for Explicitly Parallel Instruction Computing in Itanium's documentation).
Besides that the 16 bit segmented addressing model was a software developer nightmare for the first 15 to 20 years of the PC, something that I
While you're at it (Score:2)
While you're at it, please explain exactly what went wrong with the 10nm process.
Re: (Score:2)
While you're at it, please explain exactly what went wrong with the 10nm process.
They pretty much have. Just not to mainstream media.
Intel was trying to use multiple masks and multiple exposures, but couldn't work out mask registration accurately enough for the multiple exposures to line up correctly.
Re: (Score:2)
While you're at it, please explain exactly what went wrong with the 10nm process.
They pretty much have. Just not to mainstream media.
Why not? And citation please. I know about the speculation, I don't know anything about an actual statement from Intel, to technical media or otherwise.
The biggest challenge for Intel was clearly (Score:2)
Coming up with such vacuous words that pretty much scream, "We are not saying anything to anyone, and you'll maybe get your chips, maybe."