Former Intel Engineer Claims Skylake QA Drove Apple Away (pcgamer.com) 252
UnknowingFool writes: A former Intel engineer claims that the QA process around Skylake was so bad that it may have finally driven Apple to use its own processors in upcoming Macs. Apple would likely have made this move eventually, but Francois Piednoel says Skylake was abnormally buggy, with Apple finding nearly as many bugs in the architecture as Intel itself. That led Apple to reconsider staying on the architecture and to hasten its plans to migrate to its own chips. "The quality assurance of Skylake was more than a problem," says Piednoel. "It was abnormally bad. We were getting way too much citing for little things inside Skylake. Basically our buddies at Apple became the number one filer of problems in the architecture. And that went really, really bad. When your customer starts finding almost as much bugs as you found yourself, you're not leading into the right place."
"For me this is the inflection point," added Piednoel. "This is where the Apple guys who were always contemplating to switch, they went and looked at it and said: 'Well, we've probably got to do it.' Basically the bad quality assurance of Skylake is responsible for them to actually go away from the platform."
Apple made the switch official at its developer conference on Monday, announcing that it will introduce Macs featuring Apple-designed, ARM-based processors later this year.
"For me this is the inflection point," added Piednoel. "This is where the Apple guys who were always contemplating to switch, they went and looked at it and said: 'Well, we've probably got to do it.' Basically the bad quality assurance of Skylake is responsible for them to actually go away from the platform."
Apple made the switch official at its developer conference on Monday, announcing that it will introduce Macs featuring Apple-designed, ARM-based processors later this year.
Piednol full of it (Score:2, Informative)
Piednol is not the best source for . . . well, anything, really. Apple may have found numerous bugs in Skylake, but now that Intel is finally starting to roll out mobile CPUs that are not Skylake-based (Icelake-U, Tiger Lake-U), Apple turns on Intel? Nah. It most likely has more to do with Intel's lackluster product lineup.
Re:Piednol full of it (Score:5, Insightful)
Apple may have found numerous bugs in Skylake, but now that Intel is finally starting to roll out mobile CPUs that are not Skylake-based (Icelake-U, Tiger Lake-U), Apple turns on Intel? Nah.
How long ago do you think Apple made the strategic decision to move desktop-class computers to ARM? They'd need the chip designs, production capacity, software support, all sorts of plans for the transition. This is not a decision made in the past year; my guess is 3-4 years ago. Once they committed the resources it's unlikely Intel could do anything to change their mind.
Re:Piednol full of it (Score:4, Informative)
Apple has an ARM license that allows them to design their own parts. They've been putting their own chips in mobile devices for about a decade now.
Re:Piednol full of it (Score:4, Informative)
Yes, that's how ARM works. They don't make CPUs at all; they have no manufacturing. All they do is design the architecture and provide support to companies that make actual CPUs and to developers who write software for them.
So Apple licences the ARM architecture and then develops a CPU that implements the precisely defined instruction set, guaranteeing it will run ARM code in the expected way. There is room to optimize and innovate around that core functionality, as long as the end result passes the ARM test suite. Passing the tests is part of the contract, to prevent ARM fragmenting.
Re:Piednol full of it (Score:5, Interesting)
Apple has had an incentive to move to their own ARM-based silicon for much longer than a change like that takes, and it's clear they've taken it pretty slow, building up the know-how and IP building blocks necessary by over-investing in the stagnant tablet market up until fairly recently. The issues with Skylake probably undermined their internal x86 diehards' position so badly that they completely lost ground, allowing the project to move ahead at full speed.
Re: (Score:2)
Intel are also tied down by Microsoft. Microsoft thrives on legacy and this pushes Intel to optimize for old codebases.
Apple on the other hand says jump and Mac developers jump or get fucked, so Apple has much more room for processor innovation.
Comment removed (Score:5, Insightful)
Re:The writing was on the wall in 2005. (Score:4, Funny)
They have people from DEC, Transmeta, MIPS, Motorola, PA-Semi, HP, Sun, you name it.
People who worked on a bunch of failed CPU architectures? I'm joking, a little bit.
Re:The writing was on the wall in 2005. (Score:4, Insightful)
The chips Apple made for their devices two years ago are still better than many of the cutting edge chips in the flagship devices of their competitors. It would seem that Apple hired a lot of very competent engineers.
Re: (Score:2)
68k was great, in fact I think it could have remained competitive with x86 were it not for the fact that PCs became dominant and Motorola decided the future was PowerPC.
Apple's CPUs are no better than the competition, just different. Most other designs emphasize multi-core performance while Apple goes for single-threaded performance. The result is that Apple looks good in certain benchmarks but falls far behind in others, and for real applications it depends heavily on how multi-threaded the workload is.
Apple are q
Re: (Score:2)
Yes, it's called a System on Chip and ARM provide those too. They define the interfaces for peripherals to the ARM core for example.
Manufacturers can then add their own peripheral devices and sub-processors. Common ones include cellular modems, wifi/Bluetooth/ethernet, USB controllers and MEMS sensors like accelerometers.
Apple only provides 2 high-performance cores on its latest phone chips. In the Android world only the lowest-end, cheapest devices have two high-performance cores; many budget phones have 4
Modded funny? Seriously? (Score:5, Interesting)
Anyone who thinks the DEC Alpha, Sun Sparc, 68K, or MIPS in SGI kit was unsuccessful has either just graduated from wearing short trousers at school or is willfully ignorant. x86 didn't win the CPU contest because it was the best but because by a twist of fate IBM used it in the PC. If it hadn't been for that, it's highly likely x86 and quite possibly Intel would have had their last rites in the 90s.
Turns out that x86 internally is now such a dog's dinner from adding hack on top of hack to keep this antiquated architecture relevant that it's a bug-ridden security nightmare, and Apple have had enough.
Re: (Score:2)
Yep, IBM's decision had huge ramifications. They nearly went with 68k which is a much better architecture.
Having said that x86 turned out to be superior to other architectures in many ways, especially RISC ones. It acts more like an intermediate language now, describing functionality that the CPU is able to optimize on the fly to make best use of available resources.
RISC has many advantages but for raw performance CISC seems to be superior.
The cruft in x86 is largely irrelevant now, CPUs implement some of i
Re: (Score:3)
Re: (Score:2)
Depends on what you deem as failure. Certainly the companies failed, but that doesn't mean the CPU architectures failed. For example, PA-Semi had a number of CPU designs that were commercially available. When Apple bought them, they closed down the entire business as they wanted the engineers to work on Apple CPUs. Whether PA-Semi's CPUs would have succeeded in the industry will never be known now.
Re: (Score:2)
I wouldn't say IBM dropped the ball. The G5 was power-hungry and hot; they couldn't make a mobile version. The POWER architecture is IBM's own design for their in-house products, high-end servers where power draw and cooling are low on the priority list.
Re: (Score:2)
Well, IBM certainly dropped the ball in context. They could not or would not make something competitive for the desktop space. Of course, they might not have cared about playing in that space at all anymore; letting the business dissolve through apathy might have been deliberate.
POWER has been in trouble for a while even in the high end. The best thing they did of late was support NVLINK, give up on the real work, and just let nVidia do it, accepting being relegated to the role of mostly supervi
Re: (Score:2)
I suspect Intrinsity made up the core of their processor development team, while the other acquisitions failed and what remained was made to fit in with them.
Comment removed (Score:5, Insightful)
Re: (Score:3)
Re: (Score:2)
"Everybody knows if you throw enough money at something and hire in a big enough staff, you'll succeed at a huge technical project."
Yes, just look at how well it's working for Intel.
Except wait, Apple is not Intel. Maybe they're smart enough to hire the right people. Intel obviously isn't.
It's not even just the right people (Score:2)
Except wait, Apple is not Intel. Maybe they're smart enough to hire the right people. Intel obviously isn't.
Intel might even have the right people. What they have for sure is the wrong management... which will stop even a team of great people from succeeding.
What Apple does not get very much acknowledgment for is that they have a very stable team committed to long-term goals. On a podcast with John Gruber, a well-known Apple journalist, he made the excellent point that the people who are doing Rosetta 2
Re: (Score:2)
End of the Mac? (Score:5, Interesting)
To me this signals the end of Apple desktop computing.
Using an ARM CPU locks you into the platform in the same way the 68000 series Macs locked you in. Or worse: the PPC chips with Carbon- the 68k emulator that Apple supported for a short time during the PowerPC years.
And back then they were selling PCs on an expansion board to bridge the gap created by being a platform island.
At this point closing the platform further is a mistake. Everyone has to buy new software now. The emulation used for legacy x86 apps will be painfully slow. Developers will fall by the wayside further narrowing the scope of software for the platform.
Support options will plummet.... IT professionals can't build a Hackintosh on the cheap to learn the system. We certainly are not going to buy an ARM Mac from Apple.
They should have moved over to the AMD Ryzen chips. This requires minimal R&D, is high performance, doesn't obsolete all the software, and can still run Windows (which is a selling point).
In the 90s, when I worked for Apple, closing the platform up was a huge mistake. Closing it up further... even worse.
The problem with a walled garden is that eventually everyone leaves except for the person who owns the garden. No one tries to get back into jail once they are out- no matter how pretty the murals on the walls are.
end of Apple Desktop computing? (Score:2)
Well, to me this happened when Apple started the walled garden with apps.
At least, this is when I finalized my switch to Linux machines, and I must say, today, with a recent Lenovo running Debian testing, with full support for the GPU and a calibrated screen for instance, I consider my setup robust enough.
But I still see the graphics guys, in my factory, working with Macs... Maybe they'll continue even when Apple finally succeeds in turning Macintoshes into actual tablets, with closed hardware as well as sof
Re: (Score:2)
You know the difference between you and the graphics guys? They don't care what OS or CPU their machine has. It's a tool they use to get work done.
Re:End of the Mac? (Score:5, Insightful)
IT professionals can't build a Hackintosh on the cheap to learn the system
IT professionals don't build hackintoshes to "learn the system"; they might have the skills and decide to build one because they want a cheap Mac at home, but it ain't about learning. No "professional" would be maneuvered into supporting Apple equipment in the enterprise without at least insisting they have some hardware of their own for learning / testing.
Re: (Score:2)
Re: (Score:2)
I'm an IT professional. I built a hackintosh for learning purposes.
Just saying.
Re: (Score:2)
That's about open source, not some proprietary stuff. Like, I sometimes receive bug reports about some build stuff on Fruits -- issues that look trivial to fix -- but I have no way to fix them as there's no way I'd put my own money into buying massively overpriced inferior hardware just to support people. On the other hand, Windows is not hardware-locked, and you can get licenses for peanuts or even gratis, so even though I don't care about Windows myself testing stuff there isn't a big hurdle.
Re:End of the Mac? (Score:5, Insightful)
I'm wondering how an ARM CPU "locks you into the platform" more than an Intel one? It's not like you could run (say) a Windows Intel binary on a Mac before, and you won't be able to afterwards either. You'll be able to run Mac apps and nothing else - just like now. By now, Apple have already talked to a lot of the major app vendors about the move to ARM, and presumably the biggest ones are all okay with it - so you'll get ARM versions for at least the most used apps just as you can get them for Intel today.
Other news about Apple trying to make an "ecosystem" of products and to bring phone apps to laptops might well be the "lock in" you're talking about, and unless Apple are careful, making that work at the exclusion of all the other apps in the world would indeed be lock-in and possibly the "end of Apple desktop". But the CPU architecture has nothing to do with any of that.
Lastly, I'm pretty certain Apple considered using AMD chips - it certainly would be the most obvious "get away from Intel" move. I suspect ARM won out because they can make cores that are "fast enough", but critically make loads of them in one chip. Thus you get lots of multi-tasking, and probably for less money than a single AMD (or Intel). Then, you can turn off cores to save battery when all you're doing is watching a cat video. But Apple may have reasons beyond this that we're not being made aware of - time will tell.
Re: (Score:3)
Because right now if you buy a Mac you are able to run Windows on it. If you buy an ARM Mac you'll no longer be able to do that. You probably won't have a good Linux offering for quite a while either. Linux is more likely to come around, but I'm sure it will be a bit before they actually get everything up and running properly.
Re: (Score:2)
Uh, Windows 10 on ARM [microsoft.com]. If and when Apple or a third party supports it remains to be seen, but it can't be dismissed out of hand.
Re: (Score:2)
And.... uh....
Exactly what ARM software is there available?
Re: (Score:2)
Re: (Score:2)
It's not like you could run (say) a Windows Intel binary on a Mac before, and you won't be able to afterwards either. You'll be able to run Mac apps and nothing else - just like now.
You need to tell my coworker that then. I see him doing it every day.
Re: (Score:2)
But the CPU architecture has nothing to do with any of that.
Eh, it has something to do with it. I think the end game is to steer developers away from traditional desktop software distribution and toward the App Store. Presumably, with some enhanced iDevice platform capabilities, this lets a 'desktop' software vendor make a low incremental investment for their app to 'scale down' and get bought by the much larger iDevice market, while the same app scales up to full function when used on a more appropriate laptop/desktop form factor. This *coincidentally* means that the
Re: (Score:2)
ARM CPU in and of itself doesn't lock anyone into a platform as you pointed out. Heck, how many of us have Raspberry Pi's hanging around our houses doing half a dozen different tasks on our networks... ARM CPU's... no lock in.
But here's my thought on it, and call it a conspiracy theory if you want. Apple wants a lock in. They want the complete vertical integration approach they had back when it was the 68k and then the PPC CPU's. As a result, they're going to lock up the platform starting at the top; at the
Re: (Score:2)
I think it's not the ARM architecture itself but the commonality with iOS that will drive the lock-in. Apple's already making ungodly amounts of money from the App Store, and now they'll be all "heeey check this out, you can get all your favorite fart apps from the same store on your Mac!". And soon enough, if your app isn't on the App Store (and giving Apple a 30% cut), it might as well not exist. Maybe they'll even make it the only way to install software (without jumping through some hoops), and everyone
Re: (Score:2)
If Apple's emulation of x86 (translation, recompiling, whatever) is as good as they claim, then it may not matter. Existing Mac apps will continue to run well for the next 5-10 years, and by that point all existing software will be recompiled. If their emulator/translator works with VMs (it appears it does), then users can continue to run Windows in a VM. Sure there's a performance and resource penalty, but the ARM chips are getting faster and faster and eventually it will just be enough. Maybe they cou
Re: (Score:2)
Apple has already done this. Rosetta was good enough that most people noticed their old apps were actually PowerPC binaries when OS X 10.5 started popping up notices that Rosetta would be discontinued in 10.7.
Re:End of the Mac? (Score:5, Informative)
Apple has built iPhones since 2007, and in April 2010 they released the first ARM processor that can be called "Apple designed". That's ten years ago. Over these ten years, Apple's ARM processors have improved and improved and improved. About three years ago there was a quote: "Apple's iPad has a more powerful processor than 90% of laptops sold". Today the iPad processor is powerful enough to hold its own against quad-core Intel chips.
And that's a chip designed for a mobile device. Now there are two possibilities: Either Apple has gone completely loopy, or Apple has ARM-based processors with the clock speed of a desktop processor and at least twice the number of cores of their tablets, ready or close to ready. I'd bet on the second possibility. Which means by the end of the year they will likely have a 3-core chip beating Intel's quad core, a six-core chip beating Intel's 8-core processors, and possibly another chip that blows everything but the high-end iMac Pro and Mac Pro out of the water. Maybe they have that even now.
There is no emulation for legacy code; it gets translated (effectively recompiled) ahead of time. Apple is to a huge extent responsible for LLVM; ARM has twice the registers of x86, so I wouldn't be surprised at all if some Intel code actually ends up running faster. And recompiling for ARM is one switch in the compiler (see the sketch after this comment).
Nobody has to buy new software. Old Intel-based software will run just fine. Recompiled software will run significantly faster. For MacOS developers, no change. For iOS developers a huge change: Their apps will instantly run on a Mac.
What about AMD? You can feel free to buy a computer with a Ryzen processor. It would be of no benefit whatsoever to Apple.
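A minimal sketch of the "one switch in the compiler" point above, assuming Xcode's clang with its -target flag, the x86_64-apple-macos / arm64-apple-macos target triples, and lipo to glue the two builds into a universal binary (those exact names are my assumption, not something stated in the comment):

    /*
     * hello.c - the same source builds for either ISA; only the target changes.
     * Assumed build commands (not from the comment):
     *   clang -O2 -target x86_64-apple-macos10.15 hello.c -o hello_x86_64
     *   clang -O2 -target arm64-apple-macos11     hello.c -o hello_arm64
     *   lipo -create hello_x86_64 hello_arm64 -output hello_universal
     */
    #include <stdio.h>

    int main(void) {
    #if defined(__aarch64__)
        puts("Built for arm64");
    #elif defined(__x86_64__)
        puts("Built for x86_64 (would run under Rosetta 2 translation on an ARM Mac)");
    #else
        puts("Built for some other architecture");
    #endif
        return 0;
    }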
Re: (Score:3)
Using an ARM CPU locks you into the platform in the same way the 68000 series Macs locked you in. Or worse: the PPC chips with Carbon- the 68k emulator that Apple supported for a short time during the PowerPC years.
I wouldn't quite equate ARM with PowerPC or 68000. The total number of ARM CPUs probably exceeds the total number of x86 CPUs at this point. In some sense the lock-in would be staying with x86. They will, at least for a while, be shipping both ARM and x86 Macintoshes. If Intel manages to get their act together, it may be that they keep Apple's business for the high end and ARM becomes Apple's low-end line.
In the 90s, when I worked for Apple, closing the platform up was a huge mistake. Closing it up further... even worse.
This one I find a bit funny. The Macintosh was basically always closed until Apple started allowing
ARM lock-in? Not. (Score:3)
"Using an ARM CPU locks you into the platform in the same way the 68000 series Macs locked you in. Or worse: the PPC chips with Carbon- the 68k emulator that Apple supported for a short time during the PowerPC years"
Exactly, the same kind of lock-in that the 6502, 68k, and PPC had. Oh wait.
People are locked into the MacOS, not into a specific ISA. I've been using a Mac since the 512KE. It's not lock-in, it's a personal choice. To me, every other OS sucks.
Apple use AMD? Why trade one dependency for another?
Not good dev machines anymore (Score:5, Insightful)
With MacOS closing down it has been getting harder and harder to use Macs as dev machines - Catalina gave us quite some challenges at work. Now going to ARM will probably make the x86 VMs we need quite a bit slower, so we'll most likely have to hang on to our old machines and consider switching to something else...
Before someone says Spectre (Score:5, Interesting)
Just remember we're not talking about security vulnerabilities here. We're talking about actual errata against the specification of the architecture. The 6th and 7th Gen errata documents run to a combined 130 pages, with the most recently published erratum carrying the designation SKL190.
There's no expectation for a modern CPU to be bug-free, but 190 errata against a specification is a hell of a lot.
For comparison, AMD is harder to tally since they label all their bugs sequentially and don't associate them with a product generation, but in total I count 46 for family 17h 00-0Fh, which is the designation for Ryzen 1000 and the original Zen architecture. This is especially significant considering how new and unique Zen was compared to AMD's previous architecture, while Skylake wasn't that much of a deviation from Intel's past architectures.
Re: (Score:2)
I'm no CPU developer - I'm assuming "erratum" means a known issue / behaviour contrary to the specification / something like that?
So are errata things they are working on or intend to fix before release, or are they things that the people implementing the boards and compilers are just expected to work around in the final product?
Like with the venerable Pentium math bug: "our chip has problems with math in this situation; you'd better do the math yourself instead of using our opcode if it's in this range so
Re: (Score:2)
Virtually anyone who is a user of Intel products with a support account can submit errata. Errata are what Intel calls design defects or errors.
https://www.intel.com/content/... [intel.com]
If someone submits an erratum after the processor is released, there's almost always no fix (as you can imagine). It is published, however, after review - for the good of the customers.
I'm not convinced that the number of errata is the primary reason that Apple decided to switch to a different CPU. There are so many other reasons
Re: (Score:2)
As the article clearly states this was likely the tipping point for Apple. They probably had plans to move before but this pushed them to go ahead with those plans.
I can imagine Apple expressed their unhappiness with Intel directly. Skylake wasn’t supposed to be a huge architectural change like when AMD started Ryzen. While the Intel engineer can’t know for sure, there were probably signs from Apple like them not ordering as many CPUs in the future. Also Apple likely ramped up hiring more engine
Re: (Score:3)
Errata are identified issues, that basically document the actual behavior rather than what was documented before. Sometimes these can be fixed through microcode updates to work around the defect, sometimes it's a "Yeah, this is screwed up and you're going to have to deal".
What a Troll! (Score:2)
Who needs glasses when 2020 is the year of perfect hindsight?! Only Apple isn't switching. They're bridging the gap between their platforms. They've been with Arm for quite a while now and they're keeping the Intel line very much alive for a good while yet. Intel isn't the only x86 manufacturer either, and AMD makes rather attractive CPUs, and Apple could just, you know, use AMD CPUs?! Oh wait, no, they couldn't - they already are! Doh.
There is no switch, no running away here. This is a disgruntled employee throwin
I'm skeptical.. (Score:5, Insightful)
Looking at Mac sales versus iDevice sales, Apple made the call that it was better for the Mac ecosystem to be in support of the iDevice ecosystem, simple as that. Now their Macs will easily run iPhone/iPad software, of which there is far more to choose from and which makes Apple more money (they tried the OSX store and likely found it wanting, as software largely sidestepped it).
It may well be true that Apple found nearly as many problems as Intel, however in my experience the big name customers sometimes demand to see/test/evaluate the product concurrently with your own testing organization. I would assume Apple demanded this treatment and would hope they were told to expect it to be rocky as it was a pre-QA product and Intel presumably relented. I know from experience that a lot of QA engineers are offended by the thought of an external party seeing things before they get the chance and would be eager to jump at an opportunity for 'I told you so' for a move like this, but it's hard to imagine Apple not demanding the sort of access that would lead to them seeing a lot of problems, but also for them to know exactly what they are getting into.
Re: (Score:2)
Re: (Score:3)
My point is that a computer vendor has access to early steppings and everyone knows those will not be the final stepping. That's true even when they defer to the chip vendor and stay out of it until the initial samples are worked out and the chip vendor sends internally QAed product.
That is not necessarily 100% true. A vendor may be given samples that are at the end of testing where the stepping has been finalized. At that point the vendor may be more concerned about physical dimensions and manufacturing processes.
I suppose I can't be sure, but based on my interactions with certain companies, it would not surprise me a bit if Apple (with mobile chip design under their belt) said to Intel "we want to be part of those steppings you don't even normally ship to partners, we want to start early". Whereas before they might have been more high-level in their early engineering engagements with Intel and waited for Intel QA to have their first pass.
There’s no need to speculate, as I don’t doubt Apple already had early access. Here’s the main reason: Apple is responsible for the OS software too when they launch a new model. Apple has to make sure that MacOS works just as well on this Intel generation as the
Wilson, Furber, and Hauser Deserve Draper Prize (Score:5, Informative)
Sophie Wilson, the architect of the first ARM processor, was inspired by the engineers working on the successor to the 6502 at the Western Design Center. They proved that a small team can build a simple microprocessor which is competitive with a microprocessor (like the x86 processors) built by an army of H-1B-visa engineers with a budget of billions of dollars.
So, Wilson and the other 2 engineers (Steve Furber and Hermann Hauser) on her team designed, built, and tested the first ARM processor. Its simplicity gave it 2 additional characteristics: low power consumption and ease of testing. These 2 characteristics would, decades later, pave the way for ARM to enter the market for laptops, desktops, and supercomputers.
ARM will appear in Apple laptops and desktops in late 2020.
As of today, Fugaku [cnet.com], a supercomputer built by Fujitsu, is powered by ARM processors and is the fastest supercomputer in the world.
Wilson and her 2 British colleagues, Steve Furber and Hermann Hauser, deserve the Charles Stark Draper Prize for Engineering. This prize is the engineering equivalent of the Nobel Prize. These engineers have done for computing systems what Claude Shannon did for communication systems.
They also researched SPARC. (Score:2)
https://hackaday.com/2018/05/08/sophie-wilson-arm-and-how-making-things-simpler-made-them-faster-more-efficient/ [hackaday.com]
Re: (Score:3)
Re: (Score:3)
It was commonplace for computer manufacturers to design their own processors, and RISC had its start directly in that space. Why people think they need to now make up a history of ARM in ignorance of actual history is remarkable, but it is /. That's not a criticism of your post but rather of the attempt to create a myth around ARM superiority. ARM had its start as a modest processor, of low cost, for use in a low-cost personal computer (that failed). It found a niche for non-technical reasons and took off
Re:Wilson, Furber, and Hauser Deserve Draper Prize (Score:4, Informative)
Better on all metrics (Score:3)
I expect Apple's new Mac CPU to be faster, use much less power, and be cheaper than the Intel CPU it replaces. It will also have more features and much better built-in graphics than Intel's comparable integrated graphics.
Bugs in Intel chips may have played a role in the decision, but when you can significantly improve on every metric at the same time, it's not like you need an extra annoyance to push you into the decision.
Re: (Score:3)
I expect Apple's initial ARM release Macbook to be comparable to an Intel ULV part in performance with better power characteristics. I do not expect Apple's GPU efforts to be better than AMD or Nvidia, ever - there's way too much institutional expertise and technology patenting going on there.
So that basically comes down to the first generation of low-end to mid-grade ARM Mac being on par with an Intel-Mac with no discrete GPU. However, you get into the upper end and things are likely to be different. Th
so, Intel has learned *nothing* in nearly 20 years (Score:5, Interesting)
I have a friend who was a chip designer for HP.
He worked on the team that built the chipset for their superdome computer.
When the IA-64 came out, their team regularly shipped rev-1 silicon. When their chipset first shipped, it had exactly 1 known bug--which happened to be in a debug path.
When Intel partnered with HP to build the IA-64 chips, HP asked if they should build one also. Intel said, "no, we got this.".
Intel took a long time to produce their 1st-gen IA-64 (Merced) chip. They didn't ship 1st-rev silicon. It was slow. It had a huge list of errata.
HP didn't take no for an answer; they decided to build their own anyway. In nearly the same time as it took Intel to build their first one, HP built theirs.
HP's design became the 2nd gen IA-64 CPU as it worked *much* better, faster, and had very few bugs. Go figure.
HP's team was considered one of the most advanced chip design teams on the planet. They had one guy that had built the most advanced software simulator in the world so they could easily test everything pre-silicon. The entire team was part of the design process, so that all of them had input--this is essential.
Soon after, Intel decided to buy this team, since it produced such good results. Did they learn anything about how this team operated differently? Apparently not.
My friend soon after told me how Intel did things in their shop:
- everything was siloed
- recent grads were used up and burnt out, given only tiny amounts of info about their little piece of the puzzle, since they had high turnover from being overworked
- only the "Intel Fellows" architects created the actual design
- the actual developers had no real input
- the "Fellows" would create a design and essentially throw it over the wall and say "here, implement this", with no regard to practical feasibility
The resulting products were essentially knee-capped. The testers wouldn't always have what they actually needed to test the design. If they were able to give feedback, their requests for new or different test features wouldn't make it in until the *next* architecture--and by then may not be applicable!
*GO FIGURE* they can't test worth shit.
He described to me how their process was fundamentally broken, and how his team was essentially instructed to do things *their way* as soon as they arrived.
You'd think Intel would have been smart enough to recognize that the team they purchased wasn't so successful merely because of the particular butts in seats, but because of *how they did things differently* than Intel did. I guess not.
Is this guy just bitter, or... (Score:2)
Does Intel suck that much currently?
Oh the irony. (Score:2)
The market works even among bad actors. (Score:2)
Imagine that, make shitty flawed products, and your customers start looking elsewhere. Continue making shitty flawed products, and your customers go out of their way to fire you.
Re:not likely only reason (Score:5, Insightful)
Apple is always looking for ways to make their devices more proprietary. Using an in-house CPU means they can crack down even more on who is allowed to do repairs. On top of that, they can now make the hardware cheaper but still charge the same price for it.
Sure, it's not like it's even remotely possible Intel would ever f**k up their QA and piss off their customers to the point that the customers realise that if you want something done properly you'd better do it yourself. Oh, and nobody is holding a gun to your head and forcing you to buy Apple products. So why did you even come here and start whining about this when you could be using your time more productively sourcing your devices from a bargain bin at Walmart?
Re: (Score:2, Informative)
Indeed, he seems to completely ignore the argument in the story. IMHO, QA is very important and poor QA ends up driving your customers and audience away.
Take this guy who has a YouTube channel for example, his last video is a review about a very popular movie. In it, he pronounces the title of the movie at least 20 times but he can't manage to pronounce it right even once! How does his seriousness look then? His lack of QA caused his videos to get a total of only 9 views in only one week and a bunch of thum
Comment removed (Score:4, Interesting)
Re:not likely only reason (Score:5, Insightful)
Re: (Score:3)
The most important QA engineer in the company is the CEO.
You can have all the processes in the world, but unless the process is actually bloated enough to crush productivity, corners can be cut. When the CEO wants the corners cut, the corners will be cut. He or she does not have to say it out loud. The message will be heard, through the right kind of prodding about the schedule.
Re:not likely only reason (Score:5, Insightful)
So true!
CEO to the Public: I want to make sure our product is ROCK SOLID!
CEO to Management: Why is this product not out yet? We can fix bugs post release, we have investors to keep happy, the quarterlies are almost here!
Re:not likely only reason (Score:5, Funny)
The last Windows "update" wiped out printing again at my work on the big HP printers.
Oh man, I'd be all on the screw Microsoft bandwagon, but ... if they managed to break HP printers where can I sign up to help donate to those people introducing bugs that screw HP? I want to help the cause.
Re: (Score:3, Interesting)
Luck of the draw perhaps, but for someone like me who cut his teeth on 6502 stuff 40 years ago and learnt to type at a preposterous speed decades back, I've found it to be the fastest keyboard I've ev
Re: (Score:2)
Took me a year to get used to the keyboard. Now I can't handle anything less shallow than their magic keyboards. Meh, just something to get used to. Dust can be an issue. But a can of air has solved my return key a few times now. Our climate/environment isn't particularly dusty and I tap the keys lightly. Now as for the display cable being too thin internally and gradually wearing out, THAT was bad, but warranty fixed it, thankfully. Generally, Apple makes mistakes, but generally, their design culture seems
Re: (Score:2, Informative)
That's not true. Apple used to be quite serious about quality. Of course, they weren't as good as HP, which was also serious about quality, but they were better than most. I wouldn't have called them beautiful, though. (And I didn't. But reliable, yes.)
I think the change came about around the time they released OSX, though it could have been before that as I had moved away from using Apple for a few years. But the Apple ][+, Mac, Fat Mac, LC, LC3, and Mac 2 were generally quite reliable. (By the sta
Re: (Score:2)
You're right of course, I was mostly saying Apple since the advent of the Intel machines... that was my fault for not being clear about that. The older hardware pre-Intel was generally really solid and well engineered. As soon as they went "generic Intel" things went to crap pretty quickly.
Thanks for the correction :)
Re:not likely only reason (Score:4, Informative)
I'd say I've handled about 300 Macs. Most are fine. Occasionally there's one that's crap.
Their design work is like that too. Remotes and mice which you cannot tell up from down. But the trackpad on the laptops is pretty good.
But some of this is personal preference. Complaining that Apple is merely shiny appeal forgets that laptops used to come covered in shitty stickers which are just advertising. Apple set the example and put an end to that particular nonsense.
Re: (Score:2)
The standard for pro-grade systems always seems to be one major defect per product line. Dell has been hit by this: they had failing power supplies for years (OptiPlex 960-990), then finally fixed the power supplies but sourced a CPU fan that suffers bearing failures everywhere after a year of service life (9010).
Apple tends to go beyond these problems though since they have the same inherent in-built problems plus atrocious design choices causing failures. Everyone focuses on the touchb
It's a myth that any company is consistent (Score:3)
Apple's hardware has always been second-rate to the higher-tier hardware from the big manufacturers like HP, Lenovo and Dell.
Let's get real here. Apple has strengths and weaknesses. I am not a fan of their devices, but their quality is the best I've seen. However, their keyboards, Apple TV, networking gear...were far less impressive. The mouse has TERRIBLE design and quality
I would say the iPad is built better than any tablet in the android world. My extended family has owned 10. None died from natural causes. Probably 20 iPhones among my family. Most still work from a physical perspective. A few got dropped and broke
Re: (Score:2)
... sourcing your devices from a bargain bin at Walmart?
Which is where most Apple devices belong
(eg. Remember the keyboard problems? It was only a few months ago... or was it because we were typing them wrong?)
Re: not likely only reason (Score:5, Interesting)
Re: not likely only reason (Score:2)
Re: (Score:2)
It's also not the first time that Apple has had this problem. Back when they were still on PowerPC, they also had limited control over IBM's chip production and design given the low volume of Apple products. I seem to recall this was one of the justifications offered for jumping to x86 after the G5 PowerBook failed to materialize. Perhaps they've simply decided enough is enough, especially now that there are a lot more options for fabrication available.
Re:not likely only reason (Score:4, Interesting)
Nah. Investors were tired of being wrong about Apple switching to ARM for 10 years in a row.
As for why, it was always more a matter of "when" than "if", but I had my doubts it would ever happen before Apple put out a non-mobile ARM device (e.g. the Apple TV 4K). But since Apple never put out an actual TV, and basically got their app onto other TVs, that seemed like a product that was eventually going to be discontinued. So the next logical thing was the low-end iMac and the already-rubbish MacBook Air / MacBook Pro 13" models, since those devices use Intel's horrible iGPUs; it just made sense when the iPad Pro already ran circles around Intel's U parts, and Intel's Y parts are even worse.
With that said, don't count on seeing the ARM parts in the Mac Mini, MacBook Pro 16", iMac 24"/iMac Pro or Mac Pro for a while. Anything with a dGPU has to have vendor support for it, otherwise Apple will have to start building dGPUs themselves, and that means designing more chips. Sure they can do it, but since only one product (the Mac Pro) uses upgradeable dGPU parts, it would be a hugely painful effort to create a dGPU, and the SoC design doesn't scale very well to large chip dies, since that just makes them ultra expensive (which is why Intel has bins for something like 30 different CPUs from the same die, and AMD/nVidia have 8 or so bins for the same GPU die). The bigger the chip, the more there is to fail in validation. AMD's chiplet design at least lets them salvage the good dies, but that design only works for their CPU-only chips; their APU SoC uses the second chiplet space for the iGPU. Likewise Intel has used AMD's GPU parts before, so it seems to me that Apple would have to do the same if they're going to create anything more than an entry-level Mac with an ARM CPU.
Re:not likely only reason (Score:5, Informative)
1. The first ARM-based Mac is already available: it's the macOS Big Sur developer kit, which is a Mac mini equipped with the same A12Z as the iPad Pro, along with 16GB RAM and a 512GB SSD.
2. Apple announced at WWDC, a few days ago, that the first ARM-based Macs will be available in 2020.
3. A lot of Macs don't have dedicated GPUs and only have Intel's built-in GPU, which is weaker than the built-in GPU of the iPad Pro.
4. Why couldn't Apple simply add nVidia or AMD GPUs to their ARM-equipped Macs?
Re:not likely only reason (Score:4, Insightful)
Using an in house chip means they can optimize it for their devices and use cases while at the same time preventing security flaws beyond their control. This will mean a better experience for their users.
See? It is a matter of perspective.
Re: (Score:2)
Their use cases are the same as everyone else's. And it's doubtful that they will get compelling single thread performance out of ARM any time soon. It will be fine for highly parallel apps like Photoshop, but it's going to present some real performance limitations that they may eventually solve... But not quickly.
They also aren't going to save any money any time soon.
I suspect in the end they'll wish they went with AMD. But I guess we'll find out.
Re: (Score:2)
Good points, but they certainly seem to have an advantage in the mobile arena around chip performance. I agree that until we see actual specs it is all conjecture :)
Re:not likely only reason (Score:4, Interesting)
Their use cases are the same as everyone else's. And it's doubtful that they will get compelling single thread performance out of ARM any time soon.
What the f*** are you taking? I've written some benchmarks just for fun, and the same code (purely FPU code), using all available cores, ran significantly faster on an iPhone XR with two fast cores plus four slow cores giving it an extra 20%, than on a quad-core iMac using all four cores. So two cores vs four, it beat Intel. Core for core, it beat the shit out of it.
Just compare the number of registers. Compare L1 and L2 cache sizes. Notice another 16MB cache between processor and memory. Take the seven-wide instruction decode; Intel can't even dream of that.
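For reference, the kind of "purely FPU code, using all available cores" micro-benchmark described above might look roughly like the sketch below. This is an illustration of mine under stated assumptions (pthreads, one dependent floating-point loop per online core), not the poster's actual code:

    /*
     * fpu_bench.c - spawn one FPU-bound thread per online core and time them.
     * Assumed build command: clang -O2 -pthread fpu_bench.c -o fpu_bench
     */
    #include <pthread.h>
    #include <stdio.h>
    #include <time.h>
    #include <unistd.h>

    #define ITERATIONS 200000000L
    #define MAX_THREADS 64

    static void *fpu_work(void *arg) {
        double x = 1.0;
        /* Dependent multiply-add chain so the loop stays FPU-bound. */
        for (long i = 0; i < ITERATIONS; i++)
            x = x * 1.0000001 + 1e-9;
        *(double *)arg = x;  /* keep the result live so it isn't optimized away */
        return NULL;
    }

    int main(void) {
        long ncores = sysconf(_SC_NPROCESSORS_ONLN);  /* all available cores */
        if (ncores < 1) ncores = 1;
        if (ncores > MAX_THREADS) ncores = MAX_THREADS;

        pthread_t threads[MAX_THREADS];
        double results[MAX_THREADS];
        struct timespec t0, t1;

        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (long i = 0; i < ncores; i++)
            pthread_create(&threads[i], NULL, fpu_work, &results[i]);
        for (long i = 0; i < ncores; i++)
            pthread_join(threads[i], NULL);
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        printf("%ld threads, %.2f s wall time\n", ncores, secs);
        return 0;
    }

The absolute numbers mean nothing on their own; the comparison the poster describes is the wall time for the same fixed per-thread workload with every core loaded on each machine.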
Re: (Score:2)
By reading this comment, there is no doubt you are an industry-leading floating point expert. ;). As we all know, single thread performance is determined entirely by floating point performance as measured by experts like you. Indeed, if only Intel could dream of "seven wide instruction decode" maybe their floating point performance would be better, as measured by you, an industry expert.
Re: (Score:2)
While your points are valid, Apple isn't using ARM's cores, only the ARM instruction set. They have a lot to prove, though, and a technical mountain to climb.
Re:not likely only reason (Score:5, Insightful)
One example is that now you can get HD audio out of the Lightning port with a variety of options to choose from.
Why can't I have both?
Apple loves airpods - they're soooo easy to drop and lose (leading to replacements). It's now one of their biggest earners:
https://www.google.com/search?... [google.com]
They also have removed a port that can allow water into the device.
Rubbish. Waterproof headphone sockets are a done deal.
eg. Here's some: https://www.aliexpress.com/ite... [aliexpress.com]
But ... keep drinking the Apple kool-aid.
Re: (Score:2)
In my company we never had a single issue with a pro keyboard
Well that proves it then. Everybody else is just hallucinating and Apple's special "keyboard service program" is a figment of our imaginations.
https://www.google.com/search?... [google.com]
... every coder is on a MacBook Pro.
Ouch.
I'm hoping that's down to Apple not letting you develop iOS apps on anything other than Macs.
Re: (Score:2)
"Nope, our devs choose MacBooks..."
That's certainly consistent with the expertise you project.
"What do you do exactly? Since you have such strong opinions I guess you have a ton of industry experience. What are you working on?"
Same can be asked of you. You're clearly not a developer, nor do you understand what the things you say actually mean.
Re: (Score:2)
How does the change of processor affect repairability at all? Zero people are getting the Intel processor replaced on their Mac motherboard: They're replacing the whole system board. This will be an identical process for both chips.
Re: (Score:2)
Using a in house made cpu means they can crack down on repairs
Really? The lucrative CPU repair market? How about faster, cheaper, and more efficient?
Re:not likely only reason (Score:5, Interesting)
Apple accounts for >5% of Intel's revenue (on a purely processor basis the number is a good deal higher). That actually makes them a very big fish if not the biggest fish. Indeed, Apple has been rolling out Intel chips before anyone else as Intel gives them first dibs.
But yes, Apple is a "control freak". They don't like their product roadmap being dictated by Intel's delays, including that Intel is still at 14nm while everyone else is at 7nm and already pushing down to 5nm. And if they can provide similar or better performance and efficiency with their own designs, in what universe would they choose otherwise?
"In a few years (or less) Apple will have some flimsy excuse for switching away from ARM."
What does this even mean? Ignoring the hilarious "or less" bit, in the future the world will be different and hopefully the company can adapt as they have before. That's life. Arguing otherwise would be incredibly stupid.
Re: (Score:2)
Apple accounts for >5% of Intel's revenue (on a purely processor basis the number is a good deal higher). That actually makes them a very big fish if not the biggest fish. Indeed, Apple has been rolling out Intel chips before anyone else as Intel gives them first dibs.
I don't think it is that high. They are certainly not remotely the biggest fish, with HP, Dell, and Lenovo each having more than 2x Apple's sales in the segment, and Dell and Lenovo also getting at least some of the datacenter processor revenue, where Apple sells none. More challenging is the software front, where satisfying Microsoft takes care of more than 90% of Intel's desktop business across all the myriad vendors, and OSX is just 5%. Apple has also been notoriously sluggish about being time to marke
Re: (Score:2)
Do you have a citation for that dubious >5% statistic? Most of the analysts I can find say it's more like 2-3% of Intel's revenue and I can't find any first-party (i.e. from Intel or Apple) info that says anything about it... though granted I didn't do a very deep dive into the data because frankly I don't think it matters.
2-3% is still somewhat notable and that loss of revenue is significant for Intel, but it's not like Intel won't be able to pivot easily enough to other business lines or other
Re:not likely only reason (Score:5, Interesting)
Apple accounts for 6% of the PC market, was 100% Intel in that market, and used almost entirely premium products. Neither company reports these specific numbers, but there is no analysis that has them as a minor customer.
What is Intel going to pivot to?
-Intel is simply dead in mobile and connected/IoT devices. They tried and failed miserably
-the PC market is shrinking even before you slice 6% off
-AMD is going absolutely gangbusters, and in many estimates is now outselling Intel to retail customers
-Intel's sole remaining cash cow, data centers, is seeing Google making their own processor, Amazon making their own processor (the Graviton 2 is looking stellar), and now the top supercomputer in the world is an ARM-based machine. AMD is back in play in the supercomputer market as well with a new Epyc entrant.
-All of this has caused Intel to slash their top tier product pricing
Intel might do okay with Optane as they scale it up and out, though their window of opportunity for that is closing fast. Their other major cash cow is chipsets and motherboards, though that's contingent on you using their CPU.
I always expected Intel to dominate whatever realm they entered, but things are looking pretty grim.