
Former Intel Engineer Claims Skylake QA Drove Apple Away (pcgamer.com) 252

UnknowingFool writes: A former Intel engineer claims that the QA process around Skylake was so poor it may have finally driven Apple to use its own processors in upcoming Macs. Apple would likely have made this move eventually, but Francois Piednoel says Skylake was abnormally bad, with Apple finding nearly as many bugs in the architecture as Intel itself. That led Apple to reconsider staying on the architecture and to hasten its plans to migrate to its own chips. "The quality assurance of Skylake was more than a problem," says Piednoel. "It was abnormally bad. We were getting way too much citing for little things inside Skylake. Basically our buddies at Apple became the number one filer of problems in the architecture. And that went really, really bad. When your customer starts finding almost as much bugs as you found yourself, you're not leading into the right place."

"For me this is the inflection point," added Piednoel. "This is where the Apple guys who were always contemplating to switch, they went and looked at it and said: 'Well, we've probably got to do it.' Basically the bad quality assurance of Skylake is responsible for them to actually go away from the platform."

Apple made the switch official at its developer conference on Monday, announcing that it will introduce Macs featuring Apple-designed, ARM-based processors later this year.
This discussion has been archived. No new comments can be posted.

  • Piednoel full of it (Score:2, Informative)

    by DrMrLordX ( 559371 )

    Piednoel is not the best source for . . . well, anything, really. Apple may have found numerous bugs in Skylake, but now that Intel is finally starting to roll out mobile CPUs that are not Skylake-based (Icelake-U, Tiger Lake-U), Apple turns on Intel? Nah. It most likely has more to do with Intel's lackluster product lineup.

    • by Kjella ( 173770 ) on Thursday June 25, 2020 @05:52AM (#60225736) Homepage

      Apple may have found numerous bugs in Skylake, but now that Intel is finally starting to roll out mobile CPUs that are not Skylake-based (Icelake-U, Tiger Lake-U), Apple turns on Intel? Nah.

      How long ago do you think Apple made the strategic decision to move desktop-class computers to ARM? They'd need the chip designs, production capacity, software support, all sorts of plans for the transition. This is not a decision made in the past year; my guess is 3-4 years ago. Once they committed the resources, it's unlikely Intel could do anything to change their mind.

    • by The Cynical Critic ( 1294574 ) on Thursday June 25, 2020 @06:19AM (#60225774)
      The thing about big engineering changes like moving a whole ecosystem like MacOS to another architecture is that you don't do it on a dime. A change like that takes years of planning and work before you even get to the initial customer rollout, and it'll be years after that before it's complete.

      Apple has had an incentive to move to their own ARM-based silicon for much longer than the years a change like that will take, and it's clear they've taken it pretty slow, building up know-how and the necessary IP building blocks by over-investing in the stagnant tablet market until fairly recently. The issues with Skylake probably undermined their internal x86 diehards' position so badly that they completely lost ground, allowing the project to move ahead at full speed.
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Thursday June 25, 2020 @05:41AM (#60225722)
    Comment removed based on user account deletion
    • by AmiMoJo ( 196126 ) on Thursday June 25, 2020 @06:27AM (#60225798) Homepage Journal

      They have people from DEC, Transmeta, MIPS, Motorola, PA-Semi, HP, Sun, you name it.

      People who worked on a bunch of failed CPU architectures? I'm joking, a little bit.

      • by alvinrod ( 889928 ) on Thursday June 25, 2020 @08:47AM (#60226190)
        Were all of them bad or just not commercially successful? There's also more to be learned from a failure than a success and not every engineer on a team of hundreds is equally competent.

        The chips Apple made for their devices two years ago are still better than many of the cutting edge chips in the flagship devices of their competitors. It would seem that Apple hired a lot of very competent engineers.
        • by AmiMoJo ( 196126 )

          68k was great, in fact I think it could have remained competitive with x86 were it not for the fact that PCs became dominant and Motorola decided the future was PowerPC.

          Apple's CPUs are no better than the competition, just different. Most other designs emphasize multi-core performance while Apple goes for single-threaded performance. The result is that Apple looks good in certain benchmarks but falls far behind in others, and for real applications it depends heavily on how multi-threaded the workload is.

          Apple are q

      • by Viol8 ( 599362 ) on Thursday June 25, 2020 @09:28AM (#60226326) Homepage

        Anyone who thinks the DEC Alpha, Sun SPARC, 68k, or MIPS in SGI kit was unsuccessful has either just graduated from wearing short trousers at school or is willfully ignorant. x86 didn't win the CPU contest because it was the best, but because, by a twist of fate, IBM used it in the PC. If it hadn't been for that, it's highly likely x86, and quite possibly Intel, would have had their last rites in the 90s.

        It turns out that x86 internally is now such a dog's dinner from adding hack on top of hack to keep this antiquated architecture relevant that it's a bug-ridden security nightmare, and Apple have had enough.

        • by AmiMoJo ( 196126 )

          Yep, IBM's decision had huge ramifications. They nearly went with 68k which is a much better architecture.

          Having said that x86 turned out to be superior to other architectures in many ways, especially RISC ones. It acts more like an intermediate language now, describing functionality that the CPU is able to optimize on the fly to make best use of available resources.

          RISC has many advantages but for raw performance CISC seems to be superior.

          The cruft in x86 is largely irrelevant now, CPUs implement some of i

      • Depends on what you deem as failure. Certainly the companies failed, but that doesn't mean the CPU architectures failed. For example, PA Semi had a number of CPU designs that were commercially available. When Apple bought them, they closed down the entire business because they wanted the engineers to work on Apple CPUs. Whether PA Semi's CPUs would have been industry leaders will never be known now.

    • I wouldn't say IBM dropped the ball. The G5 was power hungry and hot; they couldn't make a mobile version. The POWER architecture is IBM's own design for their in-house products, high-end servers where power draw and cooling are low on the priority list.

      • by Junta ( 36770 )

        Well, IBM certainly dropped the ball in context. They could not or would not make something competitive for the desktop space. Of course, they might not have cared about playing in the space at all anymore and wanted the business to dissolve through apathy; it might have been deliberate.

        POWER has been in trouble for a while even in the high end. The best thing they did as of late was support NVLINK and give up on the real work and just let nVidia do it, accepting being relegated to the role of mostly supervi

    • I suspect Intrinsity made up the core of their processor development team; the other acquisitions failed, and what remained was made to fit in with them.

  • End of the Mac? (Score:5, Interesting)

    by beheaderaswp ( 549877 ) * on Thursday June 25, 2020 @06:25AM (#60225792)

    To me this signals the end of Apple desktop computing.

    Using an ARM CPU locks you into the platform in the same way the 68000 series Macs locked you in. Or worse: the PPC chips with Carbon- the 68k emulator that Apple supported for a short time during the PowerPC years.

    And back then they were selling PCs on an expansion board to bridge the gap created by being a platform island.

    At this point closing the platform further is a mistake. Everyone has to buy new software now. The emulation used for legacy x86 apps will be painfully slow. Developers will fall by the wayside further narrowing the scope of software for the platform.

    Support options will plummet.... IT professionals can't build a Hackintosh on the cheap to learn the system. We certainly are not going to buy an ARM Mac from Apple.

    They should have moved over to AMD's Ryzen chips. That would have required minimal R&D, offered high performance, kept existing software working, and still allowed running Windows (which is a selling point).

    In the 90s, when I worked for Apple, closing the platform up was a huge mistake. Closing it up further... even worse.

    The problem with a walled garden is that eventually everyone leaves except for the person who owns the garden. No one tries to get back into jail once they are out- no matter how pretty the murals on the walls are.

    • Well, to me this happened when Apple started the walled garden with apps.
      At least, this is when I finalized my switch to Linux machines, and I must say, today with a recent Lenovo running Debian testing, with full support for the GPU and a calibrated screen for instance, I consider my setup robust enough.
      But I still see the graphics guys in my factory working with Macs... Maybe they'll continue even when Apple finally succeeds in turning Macintoshes into actual tablets, with closed hardware as well as sof

      • You know the difference between you and the graphics guys? They don't care what OS or CPU their machine has. It's a tool they use to get work done.

    • Re:End of the Mac? (Score:5, Insightful)

      by DarkOx ( 621550 ) on Thursday June 25, 2020 @06:56AM (#60225854) Journal

      IT professionals can't build a Hackintosh on the cheap to learn the system

      IT professionals don't build hackintoshes to "learn the system." They might have the skills and decide to build one because they want a cheap Mac at home, but it ain't about learning. No "professional" would be maneuvered into supporting Apple equipment in the enterprise without at least insisting they have some hardware of their own for learning/testing.

      • It's the other way around. People who build hackintoshes at home and learn the system become the IT professionals who are comfortable with using and recommending Apple equipment.
      • I'm an IT professional. I built a hackintosh for learning purposes.

        Just saying.

      • That's about open source, not some proprietary stuff. Like, I sometimes receive bug reports about some build stuff on Fruits -- issues that look trivial to fix -- but I have no way to fix them, as there's no way I'd put my own money into buying massively overpriced, inferior hardware just to support people. On the other hand, Windows is not hardware-locked, and you can get licenses for peanuts or even gratis, so even though I don't care about Windows myself, testing stuff there isn't a big hurdle.

    • Re:End of the Mac? (Score:5, Insightful)

      by coofercat ( 719737 ) on Thursday June 25, 2020 @07:20AM (#60225906) Homepage Journal

      I'm wondering how an ARM CPU "locks you into the platform" more than an Intel one? It's not like you could run (say) a Windows Intel binary on a Mac before, and you won't be able to afterwards either. You'll be able to run Mac apps and nothing else - just like now. By now, Apple have already talked to a lot of the major app vendors about the move to ARM, and presumably the biggest ones are all okay with it - so you'll get ARM versions for at least the most used apps just as you can get them for Intel today.

      Other news about Apple trying to make an "ecosystem" of products and to bring phone apps to laptops might well be the "lock in" you're talking about, and unless Apple are careful, making that work at the exclusion of all the other apps in the world would indeed be lock-in and possibly the "end of Apple desktop". But the CPU architecture has nothing to do with any of that.

      Lastly, I'm pretty certain Apple considered using AMD chips - it certainly would be the most obvious "get away from Intel" move. I suspect ARM won out because they can make cores that are "fast enough", but critically make loads of them in one chip. Thus you get lots of multi-tasking, and probably for less money than a single AMD (or Intel). Then, you can turn off cores to save battery when all you're doing is watching a cat video. But Apple may have reasons beyond this that we're not being made aware of - time will tell.

      • Because right now if you buy a Mac you are able to run Windows on it. If you buy an ARM Mac you'll no longer be able to do that. You probably won't have a good Linux offering for quite a while either. Linux is more likely to come around, but I'm sure it will be a while before they actually get everything up and running properly.

        • by msauve ( 701917 )
          "right now if you buy a Mac you are able to run Windows on it. If you buy an ARM mac you'll no longer be able to do that."

          Uh, Windows 10 on ARM [microsoft.com]. If and when Apple or a third party supports it remains to be seen, but it can't be dismissed out of hand.
      • It's not like you could run (say) a Windows Intel binary on a Mac before, and you won't be able to afterwards either. You'll be able to run Mac apps and nothing else - just like now.

        You need to tell my coworker that then. I see him doing it every day.

      • by Junta ( 36770 )

        But the CPU architecture has nothing to do with any of that.

        Eh, it has some to do with it. I think the end game is to steer developers away from traditional desktop software distribution and toward the App Store. Presumably with some enhanced iDevice platform capabilities, this enables a 'desktop' software vendor to have low incremental investment for their app to 'scale-down' and get bought by the much larger iDevice market, but the same app scaling up to full function when used on a more appropriate laptop/desktop form factor. This *coincidentally* means that the

      • An ARM CPU in and of itself doesn't lock anyone into a platform, as you pointed out. Heck, how many of us have Raspberry Pis hanging around our houses doing half a dozen different tasks on our networks... ARM CPUs... no lock-in.

        But here's my thought on it, and call it a conspiracy theory if you want. Apple wants a lock-in. They want the complete vertical integration approach they had back when it was the 68k and then the PPC CPUs. As a result, they're going to lock up the platform starting at the top; at the

        • I think it's not the ARM architecture itself but the commonality with iOS that will drive the lock-in. Apple's already making ungodly amounts of money from the App Store, and now they'll be all "heeey check this out you can get all your favorite fart apps from the same store on your Mac!". And soon enough, if your app isn't on the App Store (and giving Apple a 30% cut), it might as well not exist. Maybe they'll even make it the only way to install software (without jumping through some hoops), and everyone

    • by caseih ( 160668 )

      If Apple's emulation of x86 (translation, recompiling, whatever) is as good as they claim, then it may not matter. Existing Mac apps will continue to run well for the next 5-10 years, and by that point all existing software will be recompiled. If their emulator/translator works with VMs (it appears it does), then users can continue to run Windows in a VM. Sure, there's a performance and resource penalty, but the ARM chips are getting faster and faster, and eventually it will just be fast enough. Maybe they cou

      • by ceoyoyo ( 59147 )

        Apple has already done this. Rosetta was good enough that most people only noticed their old apps were actually PowerPC binaries when OS X 10.5 started popping up notices that Rosetta would be discontinued in 10.7.
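        The Rosetta-style approach can be illustrated with a toy sketch. This is purely illustrative (the "source ISA" and its opcodes are invented, and Apple's actual translator is vastly more sophisticated): translate each source instruction once into host-native callables, then execute the translated program repeatedly without any per-instruction decode cost.

```python
# Toy illustration of binary translation (the idea behind Rosetta-style
# translators, not Apple's actual implementation). A "source ISA" program
# is translated once into host callables, then executed with no
# per-instruction decode overhead.

def translate(program):
    """Translate (op, arg) source instructions into host closures, ahead of time."""
    ops = {
        "LOAD": lambda arg: lambda st: st.__setitem__("acc", arg),
        "ADD":  lambda arg: lambda st: st.__setitem__("acc", st["acc"] + arg),
        "MUL":  lambda arg: lambda st: st.__setitem__("acc", st["acc"] * arg),
    }
    return [ops[op](arg) for op, arg in program]

def run(translated):
    """Execute an already-translated program against fresh machine state."""
    state = {"acc": 0}
    for instr in translated:
        instr(state)
    return state["acc"]

prog = [("LOAD", 2), ("ADD", 3), ("MUL", 4)]
native = translate(prog)   # one-time translation cost
print(run(native))         # -> 20
```

        The point of the design is that the translation cost is paid once (at install time, in Rosetta 2's case), after which the translated code runs at host speed.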

    • Re:End of the Mac? (Score:5, Informative)

      by gnasher719 ( 869701 ) on Thursday June 25, 2020 @08:38AM (#60226138)
      Unfortunately, there is no tag for "uninformative", "you are wrong" etc. But you are fundamentally wrong.

      Apple has built iPhones since 2007, and in April 2010 they released the first ARM processor that can be called "Apple designed." That's ten years ago. Over these ten years, Apple's ARM processors have improved and improved and improved. About three years ago there was a quote: "Apple's iPad has a more powerful processor than 90% of laptops sold." Today the iPad processor is powerful enough to hold its own against quad-core Intel chips.

      And that's a chip designed for a mobile device. Now there are two possibilities: either Apple has gone completely loopy, or Apple has ARM-based processors with the clock speed of desktop processors and at least twice the number of cores of their tablets, ready or close to ready. I'd bet on the second possibility. Which means by the end of the year they will likely have a 3-core chip beating Intel's quad-core, a six-core chip beating Intel's 8-core processors, and possibly another chip that blows everything but the high-end iMac Pro and Mac Pro out of the water. Maybe they have that even now.

      There is no emulation for legacy code; it is recompiled. Apple is in huge part responsible for LLVM; ARM has twice the registers of x86, so I wouldn't be surprised at all if some Intel code actually ends up running faster. And recompiling for ARM is one switch in the compiler.

      Nobody has to buy new software. Old Intel-based software will run just fine. Recompiled software will run significantly faster. For MacOS developers, no change. For iOS developers a huge change: Their apps will instantly run on a Mac.

      What about AMD? You can feel free to buy a computer with a Ryzen processor. It would be of no benefit whatsoever to Apple.
    • Using an ARM CPU locks you into the platform in the same way the 68000 series Macs locked you in. Or worse: the PPC chips with Carbon- the 68k emulator that Apple supported for a short time during the PowerPC years.

      I wouldn't quite equate ARM with PowerPC or 68000. The total number of ARM CPUs probably exceeds the total number of x86 CPUs at this point. The lock-in, in some sense, is if they remained with x86. They will, at least for a while, be shipping both ARM and x86 Macintoshes. If Intel manages to get their act together, it may be that they keep Apple's business for the high end and the ARM will be Apple's low-end line.

      In the 90s, when I worked for Apple, closing the platform up was a huge mistake. Closing it up further... even worse.

      This one I find a bit funny. The Macintosh was basically always closed until Apple started allowing

    • "Using an ARM CPU locks you into the platform in the same way the 68000 series Macs locked you in. Or worse: the PPC chips with Carbon- the 68k emulator that Apple supported for a short time during the PowerPC years"

      Exactly, the same kind of lock-in that the 6502, 68k, and PPC had. Oh wait.

      People are locked into the MacOS, not into a specific ISA. I've been using a Mac since the 512KE. It's not lock-in, it's a personal choice. To me, every other OS sucks.

      Apple use AMD? Why trade one dependency for another?

  • by Ecuador ( 740021 ) on Thursday June 25, 2020 @06:27AM (#60225794) Homepage

    With MacOS closing down, it has been getting harder and harder to use Macs as dev machines -- Catalina gave us quite a few challenges at work. Now going to ARM will probably make the x86 VMs we need quite a bit slower, so we'll most likely have to hang on to our old machines and consider switching to something else...

  • by thegarbz ( 1787294 ) on Thursday June 25, 2020 @06:48AM (#60225832)

    Just remember we're not talking about security vulnerabilities here. We're talking about actual errata against the specification of the architecture. The 6th and 7th Gen errata documents together total 130 pages, with the most recently published erratum designated SKL190.

    There's no expectation for a modern CPU to be bug-free, but 190 errata against a specification is a hell of a lot.

    For comparison, AMD is harder to tally since they label all their bugs sequentially and don't associate them with a product generation, but in total I count 46 for family 17h, models 00-0Fh, which is the designation for Ryzen 1000 and the original Zen architecture. This is especially significant considering how new and unique Zen was compared to AMD's previous architecture, and how Skylake wasn't that much of a deviation from Intel's past architectures.

    • by v1 ( 525388 )

      I'm no CPU developer -- I'm assuming "erratum" means a known issue / behavior contrary to specifications / something like that?

      So are errata things they are working on or intend to fix before release, or are they things that the people implementing the boards and compilers are just expected to work around in the final product?

      Like with the venerable Pentium math bug: "our chip has problems with math in this situation; you'd better do the math yourself instead of using our opcode if it's in this range" so

      • Virtually anyone who is a user of Intel products with a support account can submit an erratum. An erratum is what Intel calls a design defect or error.

        https://www.intel.com/content/... [intel.com]

        If someone submits an erratum after the processor is released, there's almost always no fix (as you can imagine). It is published, however, after review -- for the good of the customers.

        I'm not convinced that the number of errata is the primary reason that Apple decided to switch to a different CPU. There are so many other reasons

        • As the article clearly states, this was likely the tipping point for Apple. They probably had plans to move before, but this pushed them to go ahead with those plans.

          I can imagine Apple expressed their unhappiness with Intel directly. Skylake wasn't supposed to be a huge architectural change like when AMD started Ryzen. While the Intel engineer can't know for sure, there were probably signs from Apple, like them not ordering as many CPUs in the future. Also, Apple likely ramped up hiring more engine

      • Errata are identified issues that basically document the actual behavior rather than what was documented before. Sometimes these can be worked around through microcode updates; sometimes it's a "Yeah, this is screwed up and you're going to have to deal."

  • Who needs glasses when 2020 is the year of perfect hindsight?! Only Apple isn't switching. They're bridging the gap between their platforms. They've been with ARM for quite a while now, and they're keeping the Intel line very much alive for a good while yet. Intel is also not the only x86 manufacturer; AMD makes rather attractive CPUs, and Apple could just, you know, use AMD CPUs?! Oh wait, no, they couldn't -- they already are! Doh.

    There is no switch, no running away here. This is a disgruntled employee throwin

  • I'm skeptical.. (Score:5, Insightful)

    by Junta ( 36770 ) on Thursday June 25, 2020 @07:46AM (#60225978)

    Looking at Mac sales versus iDevice sales, Apple made the call that it was better for the Mac ecosystem to be in support of the iDevice ecosystem, simple as that. Now their Macs will easily run iPhone/iPad software, of which there is more to choose from, and which makes Apple more money (they tried the OS X app store, and likely found it wanting as software largely sidestepped the store).

    It may well be true that Apple found nearly as many problems as Intel. However, in my experience the big-name customers sometimes demand to see/test/evaluate the product concurrently with your own testing organization. I would assume Apple demanded this treatment, that Intel presumably relented, and that Apple was told to expect things to be rocky since it was a pre-QA product. I know from experience that a lot of QA engineers are offended by the thought of an external party seeing things before they get the chance, and would be eager to jump at an opportunity for an "I told you so" over a move like this. But it's hard to imagine Apple not demanding the sort of access that would lead to them seeing a lot of problems, and equally hard to imagine them not knowing exactly what they were getting into.

    • But Skylake wasn’t Apple’s first Intel CPU. Apple had been officially on Intel for 10 years at that point and at times collaborated with Intel on designs. For example, Apple convinced Intel at one point to use smaller packaging for some laptop CPUs. There’s no doubt that Apple had tested other Intel CPUs before Skylake. The engineer points out that Skylake was particularly bad, especially since it wasn’t supposed to be a large change in architecture from the previous CPUs. One thing
  • by reporter ( 666905 ) on Thursday June 25, 2020 @07:47AM (#60225986) Homepage

    Sophie Wilson, the architect of the first ARM processor, was inspired by the engineers working on the successor to the 6502 at the Western Design Center. They proved that a small team can build a simple microprocessor which is competitive with a microprocessor (like the x86 processors) built by an army of H-1B-visa engineers with a budget of billions of dollars.

    So, Wilson and the other 2 engineers (Steve Furber and Hermann Hauser) on her team designed, built, and tested the first ARM processor. Its simplicity gave it 2 additional characteristics: low power consumption and ease of testing. These 2 characteristics would, decades later, pave the way for ARM to enter the market for laptops, desktops, and supercomputers.

    ARM will appear in Apple laptops and desktops in late 2020.

    As of today, Fugaku [cnet.com], a supercomputer built by Fujitsu, is powered by ARM processors and is the fastest supercomputer in the world.

    Wilson and her 2 British colleagues, Steve Furber and Hermann Hauser, deserve the Charles Stark Draper Prize for Engineering. This prize is the engineering equivalent of the Nobel Prize. These engineers have done for computing systems what Claude Shannon did for communication systems.

    • https://hackaday.com/2018/05/08/sophie-wilson-arm-and-how-making-things-simpler-made-them-faster-more-efficient/ [hackaday.com]

      "The second was a project called the Berkeley RISC project, which stood for Reduced Instruction Set Coding. The idea behind this was that if a CPU was built to only run a very small set of instructions, it could run faster and more efficiently. Rather than add more instructions to the processor itself, the operating system running on top of the processor would break tasks down into the simpler i

    • Motivation for ARM was mostly cost. The lower power CPU could utilize a plastic package in place of the standard ceramic package used by other manufacturers of the time. So lower power was a desired design property, but the underlying reason was reduced manufacturing cost.
      • by dfghjk ( 711126 )

        It was commonplace for computer manufacturers to design their own processors, and RISC had its start directly in that space. Why people think they need to now make up a history of ARM in ignorance of actual history is remarkable, but it is /. That's not a criticism of your post but rather of the attempt to create a myth around ARM superiority. ARM had its start as a modest processor, of low cost, for use in a low-cost personal computer (that failed). It found a niche for non-technical reasons and took off

      • by ChrisMaple ( 607946 ) on Thursday June 25, 2020 @01:03PM (#60227326)
        8080 in plastic. 6502 in plastic. 6800 in plastic. Z80 in plastic. 68000 in plastic.
  • by Kohath ( 38547 ) on Thursday June 25, 2020 @08:36AM (#60226130)

    I expect Apple's new Mac CPU to be faster, use much less power, and be cheaper than the Intel CPU it replaces. It will also have more features and much better built-in graphics than Intel's comparable integrated graphics.

    Bugs in Intel chips may have played a role in the decision, but when you can significantly improve on every metric at the same time, it's not like you need an extra annoyance to push you into the decision.

    • I expect Apple's initial ARM release Macbook to be comparable to an Intel ULV part in performance with better power characteristics. I do not expect Apple's GPU efforts to be better than AMD or Nvidia, ever - there's way too much institutional expertise and technology patenting going on there.

      So that basically comes down to the first generation of low-end to mid-grade ARM Mac being on par with an Intel-Mac with no discrete GPU. However, you get into the upper end and things are likely to be different. Th

  • by Anonymous Coward on Thursday June 25, 2020 @09:17AM (#60226298)

    I have a friend who was a chip designer for HP.
    He worked on the team that built the chipset for their superdome computer.
    When the IA-64 came out, their team regularly shipped rev-1 silicon. When their chipset first shipped, it had exactly 1 known bug--which happened to be in a debug path.

    When Intel partnered with HP to build the IA-64 chips, HP asked if they should build one also. Intel said, "no, we got this.".
    Intel took a long time to produce their 1st-gen IA-64 (Merced) chip. They didn't ship 1st-rev silicon. It was slow. It had a huge list of errata.
    HP didn't take no for an answer; they decided to build their own anyway. In nearly the same time as it took Intel to build their first one, HP built theirs.
    HP's design became the 2nd gen IA-64 CPU as it worked *much* better, faster, and had very few bugs. Go figure.

    HP's team was considered one of the most advanced chip design teams on the planet. They had one guy that had built the most advanced software simulator in the world so they could easily test everything pre-silicon. The entire team was part of the design process, so that all of them had input--this is essential.

    Soon after, Intel decided to buy this team, since it produced such good results. Did they learn anything about how this team operated differently? Apparently not.
    My friend soon after told me how Intel did things in their shop:

    - everything was siloed
    - recent grads were used up and burnt out, given only tiny amounts of info about their little piece of the puzzle, since the shop had high turnover from people being overworked
    - only the "Intel Fellows" architects created the actual design
    - the actual developers had no real input
    - the "Fellows" would create a design and essentially throw it over the wall and say "here, implement this", with no regard to practical feasibility

    The resulting products were essentially knee-capped. The testers wouldn't always have what they actually needed to test the design. If they were able to give feedback, their requests for new or different test features wouldn't make it in until the *next* architecture--and by then may not be applicable!

    *GO FIGURE* they can't test worth shit.

    He described to me how their process was fundamentally broken, and how his team was essentially instructed to do things *their way* as soon as they arrived.
    You'd think Intel would have been smart enough to recognize that the team they purchased wasn't so successful merely because of the particular butts in seats, but because of *how they did things differently* than Intel did. I guess not.

  • Does Intel suck that much currently?

  • The irony of it: a company that made the screen cable so short that opening the screen 90 degrees or more killed your display, and that put the backlight supply voltage on a header pin notorious for corrosion right next to the pin supplying the CPU (regularly ensuring a 1.5-3V CPU got a healthy 12V), has the gall to call out someone else on their shitty QA?
  • Imagine that, make shitty flawed products, and your customers start looking elsewhere. Continue making shitty flawed products, and your customers go out of their way to fire you.

"All the people are so happy now, their heads are caving in. I'm glad they are a snowman with protective rubber skin" -- They Might Be Giants

Working...