Microsoft Games Hardware

The Truth About Last Year's Xbox 360 Recall (255 comments)

chrplace forwards an article in which Gartner's Brian Lewis offers his perspective on what led to last year's Xbox 360 recall. Lewis says it happened because Microsoft wanted to avoid an ASIC vendor. "Microsoft designed the graphics chip on its own, cut a traditional ASIC vendor out of the process, and went straight to Taiwan Semiconductor Manufacturing Co. Ltd., he explained. But in the end, by going cheap, hoping to save tens of millions of dollars in ASIC design costs, Microsoft ended up paying more than $1 billion for its Xbox 360 recall. To fix the problem, Microsoft went back to an unnamed ASIC vendor based in the United States and redesigned the chip, Lewis added. (Based on a previous report, the ASIC vendor is most likely the former ATI Technologies, now part of AMD.)"
Comments:
  • Bleh... (Score:4, Insightful)

    by Anonymous Coward on Tuesday June 10, 2008 @07:08PM (#23737927)

    ...hoping to save tens of millions of dollars in ASIC design costs, Microsoft ended up paying more than $1 billion for its Xbox 360 recall.
    I'm glad that I am not wealthy enough to be able to afford to be that incompetent.
  • by boner ( 27505 ) on Tuesday June 10, 2008 @07:13PM (#23738003)
    well, it's the difference between an MBA making a business call based on cost/profit analysis and an experienced chip designer looking at the actual risks involved....

    MBAs are good at cutting corners in traditional businesses, but generally have no understanding of technology risks....
  • by Anonymous Coward on Tuesday June 10, 2008 @07:15PM (#23738047)
    next up rumour and hearsay

    that is all
  • by Fearless Freep ( 94727 ) on Tuesday June 10, 2008 @07:17PM (#23738073)
    Please don't equate Steve Ballmer with Bender
  • by dfsmith ( 960400 ) on Tuesday June 10, 2008 @07:29PM (#23738213) Homepage Journal

    Consider: would you rather spend $10M on a platform that may flop and not make a dime

    OR

    Spend $1B on a platform that has made multi-billions.

  • by Dhar ( 19056 ) on Tuesday June 10, 2008 @07:31PM (#23738257) Homepage

    Never, and I say NEVER, let a bunch of software engineers try to design a hardware chip.
    I've worked with software written by a hardware company, and I can say the same thing from my side of the fence...never let a bunch of hardware guys write software!

    I suppose if we can all agree to stay out of the other guy's yard, we can get along. You do hardware, I'll do software. :)

    -g.
  • Vote parent up (Score:5, Insightful)

    by imsabbel ( 611519 ) on Tuesday June 10, 2008 @07:39PM (#23738351)
    The article is COMPLETE, UTTER bullshit.

    Years before the Xbox 360 was released, ATI had already been announced as the system partner for the GPU. No "secret unnamed ASIC vendor" anywhere.
    The recall, again, was caused by thermal problems.

    Do you really think a completely different GPU from a completely different company could have been designed in a year _and_ be totally compatible with the original one?
  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Tuesday June 10, 2008 @07:48PM (#23738469)
    Comment removed based on user account deletion
  • by TripHammer ( 668315 ) on Tuesday June 10, 2008 @08:15PM (#23738939)

    well, it's the difference between an MBA making a business call based on cost/profit analysis
    All profit-seeking companies do this. This is not an inherently bad thing - you wouldn't have a job otherwise.

    MBAs are good at cutting corners in traditional businesses, but generally have no understanding of technology risks....
    So if you have business savvy you can't possibly understand technology risks? Oh please.
  • xbox 360 turns a profit - JUST

    still got a while to go to pay back the original Xbox sinkhole
  • by PJ1216 ( 1063738 ) * on Tuesday June 10, 2008 @08:21PM (#23739057)
    I'm not sure of the numbers, but finally turning a profit one quarter does not mean you've made up all the money you lost in past quarters from selling the systems at a loss. It just means they're no longer selling them at a loss. They had already dug a hole, but finally started to climb out instead of going deeper. It doesn't necessarily mean they are out of it, though. They *could* be at this point, but that article says nothing about the platform making up the billions it had already lost. Eventually they will, and maybe they have at this point in time, but that article is a red herring. It just means they stopped losing money. (A toy illustration of the arithmetic follows below.)
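    A minimal sketch of that arithmetic, with entirely made-up quarterly figures (the real Xbox numbers aren't in the article):

        # Hypothetical quarterly results in millions of dollars; illustrative
        # only, not actual Xbox division figures.
        quarterly_profit = [-300, -250, -200, -100, 50, 120]

        cumulative = 0
        for q, p in enumerate(quarterly_profit, start=1):
            cumulative += p
            print(f"Q{q}: quarterly {p:+}M, cumulative {cumulative:+}M")

        # The last two quarters are profitable, yet the cumulative position
        # is still -680M: "turning a profit" is not "paid back".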
  • by HiVizDiver ( 640486 ) on Tuesday June 10, 2008 @08:32PM (#23739263)
    Not so sure about that... I would argue that very often when something breaks, it is because they used a cheap vendor, but that the logic doesn't necessarily apply backwards - that using a cheap vendor means it WILL break. I bet there are loads of examples of people doing things on the cheap, where it DIDN'T fail. You just don't hear about those.
  • Indeed. (Score:3, Insightful)

    by Xest ( 935314 ) on Tuesday June 10, 2008 @08:43PM (#23739439)
    Quite why an article would be titled "The truth about..." when it's, well, not actually the truth but mere speculation is beyond me.

    It's speculation that is well known to be false, and could have been shown up as such with a quick look at the Xbox 360 specs, which are available in many places a Google search would readily turn up.

    The issue has already been outed as being down to cheap solder, IIRC, which simply couldn't stay put under the heat of the system over extended periods of time.
  • by Anonymous Coward on Tuesday June 10, 2008 @08:45PM (#23739489)
    Actually, this is a more specific case of "experience matters." Just because you can intellectually understand the idea of designing circuits does not mean you are aware of the obstacles, workarounds, traditions, and general practices of the field.

    Engineers aren't stupid, and you can certainly cross-train one to do another's job. But you aren't going to do that overnight. If your product is over budget, behind schedule, etc., you don't want to turn it into a very educational failure by regularly having people operate outside their expertise.

  • by Anonymous Coward on Tuesday June 10, 2008 @08:55PM (#23739639)
    I'm going to give you the benefit of the doubt and assume that I'm missing something in your math, but wouldn't one *subtract* the loss from the income to get the spread?
  • by Anonymous Coward on Tuesday June 10, 2008 @08:58PM (#23739683)

    So if you have business savvy you can't possibly understand technology risks? Oh please.
    Strawman. The problem is that MBA degrees are churned out as "one size fits all" managers, suitable (pun intended) for any industry by virtue of having no specific training for any of them.

    You can have business savvy and technological expertise, but it's a roundabout path through today's educational system if you're not teaching yourself at least one. And I think we all know the proportion of people who are capable of serious self-education.

  • by CastrTroy ( 595695 ) on Tuesday June 10, 2008 @09:09PM (#23739871)
    Not really. I wouldn't have a mechanical engineer design a chip either. I also wouldn't have a hardware/mechanical engineer designing a software system. Let people do what they are good at, and stop trying to cut corners by substituting in people where they have no skills.
  • by Anonymous Coward on Tuesday June 10, 2008 @09:23PM (#23740075)

    All profit-seeking companies do this. This is not an inherently bad thing - you wouldn't have a job otherwise.

    I don't think you're getting it. Cutting costs is one thing. Cutting corners is another. Cutting costs is fine, but cutting corners implies the product is worse off because of it. Few engineers would say "It'd be cheaper to roll our own graphics chip," because they realize the immense technical challenges involved. Few MBAs are likely to understand that, however.

    So if you have business savvy you can't possibly understand technology risks? Oh please.

    There's a big difference between what you just said and what the OP said. Nobody said MBAs can't be tech savvy. However, the fact of the matter is, most of them aren't.

    Also, just to be pedantic, having an MBA has little to do with having business savvy.

  • by aliquis ( 678370 ) on Tuesday June 10, 2008 @09:36PM (#23740293)
    It has indeed lost multiple billions; just because it has recently started to earn them some money doesn't make all the old losses go away.
  • by Anonymous Coward on Tuesday June 10, 2008 @10:00PM (#23740653)
    I've seen it too. Obviously, you have only been with good managers who make sound decisions.

    Only when problems hit critical mass do they start asking the important questions and gathering accurate data.

    I've seen groups crushed and the politicians move out before the axe falls. The problem is, they led the charge straight to hell and only got off the bus at the toll charge.

    The trick is to yell louder than anyone else when problems are brought up. Scapegoats are quite handy too.

    I recommend keeping a portfolio of dirt so when it is time to jump ship you point out all of the other rats. This is of course only useful to the guy who is jumping ship. (please take note MBAs as this information is critical to your continued success)
  • by Anonymous Coward on Tuesday June 10, 2008 @10:12PM (#23740843)
    You missed Bunnie's point completely. When the motherboards roll off the assembly line, they are tested to ensure the chips are properly soldered. The manufacturer has ovens that monitor the temperature of the CPU/motherboard while the BGA solder is melting. You have to make sure the solder melts all the way, yet you don't want to damage the CPU. After that, they inspect the motherboard with X-rays to ensure the soldered components are properly aligned and the solder has fully melted.

    Bunnie's point is that the BGA joints cracked over time. Different materials expand at different rates when they heat up; this is the coefficient of thermal expansion [wikipedia.org]. The fiberglass motherboard expands at a different rate than the silicon/epoxy CPU and GPU. Since the Xbox overheated, and was poorly engineered (so Microsoft could beat the PS3 to market), the motherboard warmed up and expanded at a different rate than the CPU soldered to it. As a result, the solder joints were under stress and cracked. You can see this in the red dye that leaked between the solder pads when Bunnie pried off the CPU. (A back-of-envelope estimate of the mismatch is sketched below.)
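    A back-of-envelope sketch of that mismatch; the CTE values are typical published figures for FR-4 and silicon, while the package size and temperature swing are assumptions, not Xbox 360 measurements:

        # Rough thermal-expansion mismatch between an FR-4 board and a
        # silicon BGA package over one heat-up cycle.
        CTE_FR4 = 16.0e-6      # FR-4 in-plane CTE, roughly 14-17 ppm/degC
        CTE_SILICON = 2.6e-6   # silicon CTE, ppm/degC
        PACKAGE_MM = 30.0      # assumed BGA package edge length, mm
        DELTA_T = 60.0         # assumed temperature swing, degC

        board = PACKAGE_MM * CTE_FR4 * DELTA_T      # board expansion, mm
        chip = PACKAGE_MM * CTE_SILICON * DELTA_T   # package expansion, mm
        mismatch_um = (board - chip) * 1000         # mm -> micrometres

        print(f"board {board*1000:.1f} um, chip {chip*1000:.1f} um, "
              f"mismatch {mismatch_um:.1f} um per cycle")
        # Roughly 24 um of shear across the outermost balls per cycle is
        # what fatigues and eventually cracks the solder joints.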
  • by MBraynard ( 653724 ) on Tuesday June 10, 2008 @10:28PM (#23741065) Journal
    Your fallacy is that you think this example of MS's bad decision is indicative of MBAs being bad decision makers.

    Are you telling me that Intel, AMD, ATI, NV, etc. have never released a flawed chip?

    Were the people at MS who made the chip really incompetent, or did MS just hire them from another ASIC company? There is no guarantee this wouldn't have happened if they did go to an ASIC vendor.

  • Truth vs Opinion (Score:1, Insightful)

    by Anonymous Coward on Tuesday June 10, 2008 @10:46PM (#23741265)
    No wonder this place is turning less and less serious and more and more left wing, so to speak...

    The title is "The Truth About Last Year's Xbox 360 Recall," whereas the editor's note says "an article in which Gartner's Brian Lewis offers his perspective on..."

    I know this is slashdot, but do we have to stoop to White House levels of spin?
  • by quanticle ( 843097 ) on Tuesday June 10, 2008 @11:10PM (#23741527) Homepage

    That's true, but if they did go to an ASIC vendor they could have got a contract indemnifying them against losses when the chip turned out to be flawed. By doing the chip design themselves, they saved a little on costs, but also took on all the risk of having a bad design.

    That's what the parent poster is alluding to. A manager with experience in technology would have understood that, while designing your own chip might have been cheaper, it would also have introduced significant downside risk, which ought to have been factored into the equation. Farming the chip design out to a third party, while more expensive in the short term, would have entailed less long-term risk. (A toy expected-value sketch of that trade-off follows below.)
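    A toy expected-value sketch of that trade-off; the savings and recall figures loosely follow the article, and the failure probability is a pure assumption for illustration:

        # Expected-cost comparison: in-house chip design vs. an ASIC vendor.
        SAVINGS = 10e6       # design cost saved by skipping the vendor (~tens of millions, per the article)
        RECALL_COST = 1e9    # what the recall cost (over $1 billion, per the article)
        P_FLAWED = 0.05      # assumed probability an inexperienced team ships a flawed chip

        expected_in_house = P_FLAWED * RECALL_COST   # recall risk carried in-house
        expected_vendor = SAVINGS                    # premium paid to the vendor,
                                                     # who (per the parent) carries
                                                     # the design-flaw risk by contract

        print(f"in-house expected cost: ${expected_in_house/1e6:.0f}M")
        print(f"vendor expected cost:   ${expected_vendor/1e6:.0f}M")
        # Even a 5% chance of a $1B recall ($50M expected) dwarfs the
        # ~$10M saved by skipping the vendor.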

  • by Hal_Porter ( 817932 ) on Tuesday June 10, 2008 @11:41PM (#23741895)
    I dunno, the problem I have with MBA types as managers is that it's easier to learn the business stuff yourself than the technology.

    And for balance, the problem I have with engineers as managers is that it's possible to learn the people-skills stuff, but you have to understand why it's important and want to learn it. It's all too easy to stay in the comfort zone where you basically sit in a dark corner somewhere and write code, if that's what you enjoy, rather than forcing yourself to talk to people.
  • by Tsaot ( 859424 ) on Tuesday June 10, 2008 @11:49PM (#23741967) Homepage
    It's more that people who go for an MBA want to avoid computers. I cannot tell you how many people I've tutored at my college who are going for business degrees on how to create a table in Access, let alone how many have asked for help formatting an Excel spreadsheet when the teacher has provided step-by-step instructions. It's like they're afraid of using a computer. I die a little inside each time they ask which button bolds the text.
  • by vux984 ( 928602 ) on Wednesday June 11, 2008 @12:27AM (#23742403)
    So... hardware design is "real" engineering (it deals with the whole range of nastiness physical reality slaps you with), unlike the hack that software "engineering" is... Is that what you're saying? :-)

    Well... there's "real" software engineering too: stuff involving resource deadlock, race conditions, critical section synchronization, in applications like virtual memory management, network protocols, time sync, file systems, security, fault tolerance, etc., all subject to their own sorts of 'physical reality nastiness'.

    It's not all wizards and automatic code completion, you know. :-) (A tiny race-condition sketch follows below.)
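    A tiny sketch of what one of those hazards looks like in practice; the sleep(0) just forces the unlucky interleaving that a real workload would hit at random:

        # Two threads perform an unsynchronized read-modify-write on a
        # shared counter and lose updates: a textbook race condition.
        import threading
        import time

        counter = 0

        def bump(n):
            global counter
            for _ in range(n):
                tmp = counter      # read
                time.sleep(0)      # yield so the other thread can interleave
                counter = tmp + 1  # write back a possibly stale value

        threads = [threading.Thread(target=bump, args=(1000,)) for _ in range(2)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

        print(f"expected 2000, got {counter}")  # usually far less: lost updates
        # Guarding the read-modify-write with a threading.Lock fixes it.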
  • by Anonymous Coward on Wednesday June 11, 2008 @02:44AM (#23743495)
    >The hardware designer may opt for a much more asynchronous approach, that minimizes the number of clocked registers.

    Personally, based on my experience teaching VHDL/Verilog to students, I believe it is much more likely that a software engineer would choose a much more asynchronous approach, and get it totally wrong :)

    Most hardware designers would tend to avoid async logic as much as possible to reduce the verification time. Sometimes you have to do it when doing clock domain crossings, etc, but it is usually not something you are looking forward to.
  • by tftp ( 111690 ) on Wednesday June 11, 2008 @03:50AM (#23743899) Homepage
    Basically he's trying to create business for ASIC design houses by telling people that putting a bunch of licensed IP onto a chip is rocket science and they shouldn't try to do it in house. Is it really? I honestly don't know. I suspect it depends a lot on the quality of the in house people and the quality of the ASIC design house.

    It is true. You should not unnecessarily muck with VHDL/Verilog and third-party cores, even if you work with an FPGA. This will not kill you, but it will make you poorer. HDLs are notoriously kludgy, and it takes a lot of effort to do them right. Proprietary cores rarely work as documented, and you have no visibility into them. When multiple cores are used, it's one large finger-pointing game between vendors. You also need good, experienced HDL coders, and you need all the tools, which cost big bucks.

    But that's with mere FPGAs, where you can update your design whenever you wish. Here, however, they are talking about ASICs, where all the wiring is done with masks when the IC is made. You'd have to be certifiably mad to even think about a casual design like this. ASIC designs are done by very competent teams, using a "10% coding / 90% verification" time allocation, because you can't afford /any/ mistakes. Even then you make mistakes; but experienced teams with good tools make those mistakes smaller, and they call them "errata": something that is not right but can be worked around. When you make the F0 0F bug, though, you trash the whole run.

    So Microsoft risked a lot when it went for an in-house design. I am not surprised that they failed. They should have counted all the successful 3D video companies on the market and asked themselves why there are so few, and why top gaming cards cost so much.

    But if you're a cash rich company then the bias will be to try to do as much as possible in house, because that gives you more freedom to value engineer later.

    I am not MS, but I don't really see much business value in rolling your own video controller. More likely the NIH syndrome kicked in, or some people were overly concerned about their job security.

  • by goose-incarnated ( 1145029 ) on Wednesday June 11, 2008 @05:05AM (#23744345) Journal
    He didn't say "profit" above, he said "income".
  • by Anonymous Coward on Wednesday June 11, 2008 @07:16AM (#23745119)

    As a sysadmin I reckon I could do a pretty good job at brain surgery.
    I'm not sure "turn it off and on again" works in brain surgery >:)
    He said sysadmin, not MCSE.
  • by umghhh ( 965931 ) on Wednesday June 11, 2008 @08:24AM (#23745705)
    It is maybe easier to learn the business stuff as it is presented in MBA courses, but not as it is in reality. In reality you must be good at what you do, whatever schools you finished and whether you hold a management position or do things yourself. Managing is not as bad a thing as some would like it to be; we need organizers and leaders. It's only when these leaders and organizers have no f.ing clue, fail to ask for technical advice, and fail to use it properly that things go wrong. OTOH, if I as an engineer screw up, there are others to correct me and possibly prevent the shit hitting the fan (as long as QA is in place); if managers screw up, there are a lot of people who will pay for their mistake. This is of course only a statistical truth, but it still holds some validity. I personally do not see organizing and managing as a bad thing; I just hate constantly arguing with people too much to do it properly myself.

  • by randyest ( 589159 ) on Wednesday June 11, 2008 @01:25PM (#23750713) Homepage

    To fix the problem, Microsoft went back to an unnamed ASIC vendor based in the United States and redesigned the chip, Lewis added. (Based on a previous report, the ASIC vendor is most likely the former ATI Technologies, now part of AMD.)
    The funny thing about this is: ATI is not an ASIC vendor! ATI does chip design and, since they're fabless (or were until AMD bought them), they get their chips made at TSMC or sometimes Chartered, or, most often, use NEC Electronics America as their ASIC vendor. ATI partnered with Microsoft to make an ASIC (first at TSMC, then later with NEC) but, in general, you don't go to ATI to get an ASIC made. You go to NEC, or IBM, or Toshiba, or LSI Logic, etc. Those are ASIC vendors. ATI is a fabless design house specializing in graphics. Big difference.

    I know companies that are a lot smaller than Microsoft who've done ASICs and it has worked.
    Without an ASIC vendor? As in, taped out GDS2 directly to a fab like TSMC? What process node? If you say 90nm or lower (which is the kind of ASIC we're talking about here) I'd have to call bullshit and ask for the names of these companies.
