Microsoft Games Hardware

The Truth About Last Year's Xbox 360 Recall 255

chrplace forwards an article in which Gartner's Brian Lewis offers his perspective on what led to last year's Xbox 360 recall. Lewis says it happened because Microsoft wanted to avoid an ASIC vendor. "Microsoft designed the graphic chip on its own, cut a traditional ASIC vendor out of the process, and went straight to Taiwan Semiconductor Manufacturing Co. Ltd., he explained. But in the end, by going cheap, hoping to save tens of millions of dollars in ASIC design costs, Microsoft ended up paying more than $1 billion for its Xbox 360 recall. To fix the problem, Microsoft went back to an unnamed ASIC vendor based in the United States and redesigned the chip, Lewis added. (Based on a previous report, the ASIC vendor is most likely the former ATI Technologies, now part of AMD.)"
This discussion has been archived. No new comments can be posted.
  • Ridiculous (Score:5, Informative)

    by smackenzie ( 912024 ) on Tuesday June 10, 2008 @07:17PM (#23738079)
    ATI and Microsoft developed this chip together over a period of two years. The XBOX 360 GPU has been known since conception as an ATI GPU.

    Furthermore, the recall was for overheating in general which -- though unquestionably affected by the GPU -- is a more comprehensive system design failure, not just a single component. (Look at the stability success they have had simply by reducing the size of the CPU.)

    I'm looking forward to "Jasper", the code name for the next XBOX 360 motherboard that will include a 65 nanometer graphics chip, smaller memory chips and HOPEFULLY a price reduction.
  • What's going on..... (Score:5, Informative)

    by ryszards ( 451448 ) * on Tuesday June 10, 2008 @07:25PM (#23738161) Homepage
    Microsoft didn't design the GPU, ATI did, and everyone knows ATI have always been fabless. TSMC are the manufacturer of the larger of the two dice that make up the Xenos/C1 design, and while that die has since been revised for a process node change, it doesn't even appear that the new revision has been used yet (despite it being finished by ATI a long time ago).

    Lewis seems to be just plain wrong, which is kind of upsetting for "chief researcher" at a firm like Gartner, especially when the correct information is freely available.

    While the cooling solution for the GPU is the likely cause of most of the failures, that's not necessarily the GPU's fault, or ATI's, especially for a fault so widespread.
  • by Anonymous Coward on Tuesday June 10, 2008 @07:36PM (#23738317)
    Why not? They're both played by the same actor.
    http://www.imdb.com/name/nm0224007/
  • by Anonymous Coward on Tuesday June 10, 2008 @07:49PM (#23738473)
    What makes you think that it was designed by only software engineers exactly?

    I can tell you first hand that a lot of the people on the Xbox hardware team are extremely talented HARDWARE specialists. The way you talk, you would think MS locked a bunch of IE developers in a room and didn't let them out until they had designed the chip.

    And as for the argument of 'well, if they are so talented, why is the chip such a POS?': talented hardware specialists design shitty hardware too. Look at AMD with the TLB defect in the Phenom chips; is that the fault of the software engineers?

    This response may be overkill, but somehow you were modded +5 Interesting while completely missing the point.
  • by afidel ( 530433 ) on Tuesday June 10, 2008 @08:06PM (#23738757)
    Now a scaler chip makes a LOT more sense to me than the GPU. Everyone knows ATI was the partner for the GPU, and few people in the industry would call a GPU an ASIC. A scaler chip is very much an ASIC, and I can see where MS might decide to do their own scaler chip, but they had no chance of doing their own modern GPU without a partner.
  • Re:More info please (Score:4, Informative)

    by Vectronic ( 1221470 ) on Tuesday June 10, 2008 @08:11PM (#23738845)
    http://en.wikipedia.org/wiki/Xbox_360_technical_problems [wikipedia.org]

    When a Microsoft Xbox 360 console experiences a "general hardware" failure or "Core Digital" failure, three flashing red lights appear on the power switch's "Ring of Light" in the front of the console. This is commonly referred to as the "Red Ring of Death" ...

    The General Hardware Failure error could be caused by cold soldering. The added mass of the CSP chips (including the GPU and CPU) resists heat flow that allows proper soldering of the lead-free solders underneath the motherboard. ...

    Another General Hardware Failure is shown by the ring of light flashing one red light, and an error code E 74. This too renders the Xbox unusable. ...

    The Nyko Intercooler has also been reported to have caused a general hardware failure in a number of consoles, as well as scorching of the power AC input. ...

    An update patch released on November 1, 2006 was reported to "brick" consoles, rendering them useless. ...

    In June 2008, the EE Times reported that the problems may have started in a graphics chip.
    The last one is what this article is (mostly) about...
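    To make the failure taxonomy above easier to scan, here is a minimal sketch of the symptom-to-failure mapping as plain data (a hypothetical Python illustration paraphrasing the excerpt above; the strings are not official Microsoft error codes):

        # Hypothetical mapping of reported Xbox 360 symptoms to failure
        # categories, paraphrased from the Wikipedia excerpt above.
        FAILURES = {
            "three flashing red lights": "general hardware failure ('Red Ring of Death')",
            "one red light, error E 74": "general hardware failure; console unusable",
            "scorched power AC input": "damage associated with the Nyko Intercooler",
            "dead after the Nov 1, 2006 patch": "console 'bricked' by the update",
        }

        def diagnose(symptom):
            """Look up the failure category for a reported symptom."""
            return FAILURES.get(symptom, "unknown; see the wiki page above")

        print(diagnose("one red light, error E 74"))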
  • by Ucklak ( 755284 ) on Tuesday June 10, 2008 @08:27PM (#23739175)
    Income of $524 million
    Loss of $423 million
    Equals a spread of $947 million, almost $1 billion.

    Add the $1 billion recall and it still looks like Vista and Office are paying for the XBOX 360.
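    Spelling out that arithmetic (assuming, as the figures suggest, a swing from the prior period's loss to the current period's income):

        # Swing from a $423M loss to $524M of income, in millions of USD.
        income = 524
        prior_loss = -423
        swing = income - prior_loss  # 524 - (-423) = 947, almost $1 billion
        print(swing)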
  • by CycoChuck ( 102607 ) on Tuesday June 10, 2008 @08:34PM (#23739303) Journal
    Ballmer isn't Bender? I could have sworn that somewhere Ballmer was quoted as saying "Cheese it!" and then running out of the room when someone asked him about the Xbox failures.
  • by Anonymous Coward on Tuesday June 10, 2008 @08:46PM (#23739499)
    Also, for those who don't know who Bunny is: he was critical to the hacking of the original XBox. He was the first to discover that (an old version of) the Secret ROM was written to the boot flash, and the first to sniff HyperTransport to read the Secret ROM. If I remember correctly, he was also the one to discover where the Secret ROM actually was (though after the copy in boot flash was found to be non-functional, another of Bunny's discoveries, it was pretty obvious).
  • Re:Vote parent up (Score:5, Informative)

    by i.of.the.storm ( 907783 ) on Tuesday June 10, 2008 @09:14PM (#23739933) Homepage
    Seriously, I thought it was common knowledge that the GPU was ATI. Don't know how this article even got here.
  • Re:Vote parent up (Score:4, Informative)

    by Pete Brubaker ( 35550 ) <pbman96@@@hotmail...com> on Tuesday June 10, 2008 @10:37PM (#23741171) Homepage Journal
    Both of you are completely right. The alpha dev-stations were Mac G5s with a special ATI reference board in them. Basically a PS_3_0 chip with more registers and instructions. Plus there wasn't even a recall. They only replaced/repaired units that had problems.
  • Some Facts... (Score:3, Informative)

    by TheNetAvenger ( 624455 ) on Tuesday June 10, 2008 @10:50PM (#23741317)
    1) ATI is NOT in the United States. (Yes, I know, AMD/ATI, blah blah.) The main point here is the fab plant and who owns it.

    2) Microsoft did design the GPU in concept, but worked with some bright people from ATI and other GPU gurus on the specifics. People can make fun of MS designing a GPU, but this isn't their first time around the block, and it also gave them the intimate chance of pairing GPU hardware and OS technologies.

    Look at the PS3: it went from the 'Cell' processor that supposedly 'didn't need' a GPU to the shipping PS3 with the 'Cell' and a full GeForce 7800 in it, and yet between the two technologies it still can't hold framerates or do anti-aliasing like the Microsoft-designed XBox 360. (See recent games like GTAIV, which runs at a lower resolution on the PS3.) (And I won't even go into how much Blu-ray slows the device down as a game player, being significantly slower than DVD, and why MS refused to put HD-DVD or Blu-ray in the console as the primary drive. Gamers hate load times and crap framerates.)

    3) The 3 Rings of Death is about the thermal sensor plate flexing due to high heat, 99.9% of the time. (Also, the 3 Rings do not always mean death; most units continue to work once they cool down, etc.) (Google it.)

    4) As for MS saving money by using a non-US fab plant and then having to move back to one: sure, this is possible, but technically there would be little to no difference UNLESS Microsoft also changed the specification of the chip during the move. I don't care if the fab plant has donkeys and a mule pulling carts out front; the silicon is created according to specification, and you don't get much more exact than this level of specification.

    The real story here would more likely be the plastic/plate fab company that was creating the inner X plate/case holder that was warping and causing the 3 Ring problem: a) it was the real problem, not the chip, and b) it would fail specs more easily than silicon would.

  • by Hal_Porter ( 817932 ) on Wednesday June 11, 2008 @12:05AM (#23742151)

    I don't think you're getting it. Cutting costs is one thing. Cutting corners is another. Cutting costs is fine, but cutting corners implies the product is worse off because of it. Few engineers would say "It'd be cheaper to roll our own graphics chip," because they realize the immense technical challenges involved.
    They didn't "roll their own graphics chip" from what I can tell. They licensed the IP (the VHDL code or a synthesized core) from someone else. The plan from the start with the XBox360 was that they would do this and try to integrate it all eventually onto one chip. That's the reason they moved from x86 to PPC, because neither Intel or AMD would license their IP and let Microsoft make their own chips. Actually this is the difference between Risc and x86 these days - x86 vendors don't license their IP but Risc vendors do. Since consoles are sold at a loss initially and subsidized by games it's really important to reduce the build costs by doing this. Back in the XBox days most people thought that Microsoft lost out because they couldn't integrate the design into once chip in the way that Sony did with their console. And that was because they didn't own the IP for the processor.

    The mistake seemed to be to let Microsoft's in house group do this rather than outsourcing.

    But you've got to remember this is an article in EE Times from an analyst with an agenda:
    http://www.eetimes.com/news/latest/showArticle.jhtml;jsessionid=51TYZYXYRWUZUQSNDLSCKHA?articleID=208403010 [eetimes.com]
    "System OEMs have no business designing ASICs any longer," said Lewis. The reality is that system companies are finding it hard to do enough ASIC designs to keep in-house design teams employed.

    Basically he's trying to create business for ASIC design houses by telling people that putting a bunch of licensed IP onto a chip is rocket science and they shouldn't try to do it in house.

    Is it really? I honestly don't know. I suspect it depends a lot on the quality of the in house people and the quality of the ASIC design house.

    And it depends on what you're trying to do. In the embedded area, lots of companies much smaller than Microsoft put a processor and a bunch of their own peripherals onto a chip and it works. I guess console or PC graphics cores use a lot more power than that. But I don't know if "an ASIC design house" would have done a better job than Microsoft's ASIC group.

    Or more to the point, maybe a $1B recall is the price you pay for learning about this stuff. Microsoft can obviously afford it, and it will influence how the successor to the XBox360 is done. Whether they hire more engineers and do it in house or outsource it is, it seems, a business decision. I guess the in-house people and the design house will both argue for the best option from their point of view and some manager will decide.

    But if you're a cash-rich company, the bias will be to do as much as possible in house, because that gives you more freedom to value-engineer later.
  • by vigmeister ( 1112659 ) on Wednesday June 11, 2008 @02:08AM (#23743225)
    Apologies for the double post... but this irked me as well:

    I also wouldn't have a hardware/mechanical engineer designing a software system.
    Are you sure? Lockheed Martin has mechanical/aeronautical engineers designing the software systems for their military aircraft.

    Programming and software development are not the exclusive domain of 'software engineers'. You really think software engineers can develop a CAD system on their own? Mechanical engineers have done that, BTW. Ever heard of MATLAB or Simulink?

    Sure, software engineers design software. It is juvenile to boost your ego by claiming that no one else can do it. Software engineering is relatively new. New enough that people in other domains have mastered it through necessity.

    Cheers!

  • Re:I'm Shocked.... (Score:3, Informative)

    by default luser ( 529332 ) on Wednesday June 11, 2008 @01:12PM (#23750465) Journal
    The part was designed by ATI; that is not in dispute.

    What the article insinuates is that the fabricated part (the MASK) was designed by Microsoft.

    Microsoft bought the IP for Xenos from ATI. They did this because of the poor relationship they had buying GPUs directly from Nvidia for the original Xbox. Microsoft saw how Sony bought the IP for graphics parts for the PS2 (and now the PS3) and created its own ASIC layouts. Microsoft figured they could do the same.

    The only problem with that logic: Microsoft has never done a chip design as complex as Xenos, and they do so few chip designs that it's hard to hang on to personnel. The tricks you need to make sure a high-performance chip doesn't bleed power are exactly the honed skills a newbie chip design house lacks.
