The Truth About Last Year's Xbox 360 Recall
chrplace forwards an article in which Gartner's Brian Lewis offers his perspective on what led to last year's Xbox 360 recall. Lewis says it happened because Microsoft wanted to avoid an ASIC vendor. "Microsoft designed the graphic chip on its own, cut a traditional ASIC vendor out of the process, and went straight to Taiwan Semiconductor Manufacturing Co. Ltd.," he explained. But in the end, by going cheap, hoping to save tens of millions of dollars in ASIC design costs, Microsoft ended up paying more than $1 billion for its Xbox 360 recall. To fix the problem, Microsoft went back to an unnamed ASIC vendor based in the United States and redesigned the chip, Lewis added. (Based on a previous report, the ASIC vendor is most likely the former ATI Technologies, now part of AMD.)
yes, go cheap, that's the way (Score:4, Interesting)
Chickens are coming home to roost... (Score:5, Interesting)
Or is it all just a hoax? [fugue.com]
Hope not.
Another Talisman CF (Score:5, Interesting)
Never, and I say NEVER, let a bunch of software engineers try to design a hardware chip. This was the biggest CF I'd seen in all my years (30+) as a chip designer. That they did it again, and with such stupidity, is no friggin' surprise.
It is not that software engineers shouldn't be involved; of course they should. But when they drive the architecture in a complete void of any practical chip design constraints, and continually refuse to listen to any reason from the hardware designers... well, as they say: garbage in, garbage out.
Re:Another Talisman CF (Score:1, Interesting)
Well, as a software engineer, I think I would do a pretty good job in designing a hardware chip. Could you please disillusion me in a more detailed manner?
Re:What's going on..... (Score:1, Interesting)
Given that lead-free solder doesn't have a forgiving (nearly idiot-proof) eutectic, and the companies producing the 360 were unfamiliar with lead-free solder, it's easy to see how massive defects like this might happen. In retrospect I wonder how the environmental impact of all the extra waste caused by the design choice of lead-free solder would have stacked up against just using leaded solder and avoiding the whole mess.
this doesn't seem accurate, it was solderability (Score:5, Interesting)
The problem wasn't any chip at all. It wasn't even heat. The problem was the chips were not soldered to the board.
http://www.bunniestudios.com/blog/?p=223 [bunniestudios.com]
Doesn't matter who designed or made the chips. If they aren't soldered down, they won't work. And that's what the problem was. That's why X-clamps (mostly) work.
Heat is semi-tangential. If the chip is soldered down, heat won't pop it off and if it isn't soldered, any kind of movement will break it loose, even when cold. This is how MS could ship you replacement units that were RRoD out of the box. They were fine before they were shipped and were broken loose during shipping.
Most of the problem appears to be solderability problems, not a problem with chip design or manufacturing.
Re:Another Talisman CF (Score:5, Interesting)
I am a person who designs both hardware and software, but not chips. At the risk of talking outside my expertise, I will have a go at answering your question.
Firstly, there are things that software people really like but that are often better not done in hardware. This category contains things like read/write I/O registers. From a software point of view they are nice, but they can double your gate count and increase your capacitive bus loading. DAC and ADC designs can be affected the same way. A software person might use a proper ADC and expect proper registered ADC results. A hardware person might pick a resistor, a capacitor, a voltage comparator, and a couple of spare I/O pins (there's a rough sketch of that trick below). The cheesy R/C approach may save the hardware design from a whole slew of problems, cost included. Similarly, a software person may opt for a synchronous logic approach with every register clocked every cycle, while the hardware designer may opt for a much more asynchronous approach that minimizes the number of clocked registers, which reduces power consumption. The hardware designer will routinely consider thermal, cost, and electrical layout issues as part of his design process; the software person will not be as familiar with how to design a good circuit board or chip cost-effectively. A good software engineer can learn all of this with time, but hardware engineers do it naturally.
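To make that R/C trick concrete, here's a toy, self-contained simulation in C. Every name and constant is invented for illustration; on real hardware, read_comparator() would be a GPIO poll and the charge curve would be physics, not math.h.

    #include <math.h>
    #include <stdio.h>

    #define THRESHOLD_V 1.65   /* comparator trip point, e.g. half of 3.3 V */
    #define RC_SECONDS  0.001  /* R*C time constant of the charging network */
    #define TICK_S      1e-6   /* one timer tick = 1 microsecond */

    /* Stands in for the comparator/GPIO read: has the capacitor,
     * charging toward vin, crossed the threshold after t seconds? */
    static int read_comparator(double vin, double t)
    {
        double vcap = vin * (1.0 - exp(-t / RC_SECONDS));
        return vcap >= THRESHOLD_V;
    }

    /* Count timer ticks until the comparator trips.
     * More ticks = lower input voltage. */
    static long rc_adc_ticks(double vin)
    {
        long tick = 0;
        while (!read_comparator(vin, tick * TICK_S)) {
            if (++tick > 100000L)
                return -1;  /* timeout: vin never crosses the threshold */
        }
        return tick;
    }

    int main(void)
    {
        for (double vin = 2.0; vin <= 3.3; vin += 0.4)
            printf("vin = %.1f V -> %ld ticks\n", vin, rc_adc_ticks(vin));
        return 0;
    }

You time how long the capacitor takes to charge past a fixed threshold; more ticks means a lower input voltage, and a small calibration table turns ticks back into volts. Two spare pins and two passives instead of an ADC block.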
The second category of problems is tools. The modern chip designer works with a fairly advanced set of tools that the software person is likely to be quite unfamiliar with. This starts with the IC design tools, which are quite specialized, and ends with the hardware engineering tools. Have you ever X-rayed a circuit board to analyze the cracks where the ball grid array bonds to the board? Are you familiar with thermal issues, and thermal images? How about EMI test results? Modern IC package design limitations? A good team of engineers will be familiar with these tools, and know how to use them to get good results.
The third category of problems is mistakes from inexperience, or lack of experience in the correct field. I work with industrial electronics, so I think from an industrial point of view. What happens when someone attaches 600 VAC to the ground wire of the computer? What happens to the remote sensors when the plant gets hit by lightning? In IC design there are some known gray areas too. Does the chip reset properly on power-up? Do metastable, astable, or self-oscillating states exist in the IC design? Can the chip survive with no cooling? Does the chip have an overtemp shutdown function? What happens if someone starts the chip up in sub-zero weather? Do the analog electronics have sufficient electrical separation from the digital electronics, while avoiding nasty things like ESD-induced latchup?
I've completed chip design courses before, but have never had to do a modern production gate array design. As a person who has done both software and hardware, I know that my skills are not good enough for the most modern IC design processes. My limit is FPGA work, and my preference is clever opto-isolation, power semiconductor, TTL, and microprocessor-based circuits. In analog, my expertise is industrial sensing and survivability. You have to know where your field of expertise is, and what your limits are.
Re:Another Talisman CF (Score:5, Interesting)
I testify, Brother, I TESTIFY!
30 Years ago, I ended up in therapy (literally) after dealing with an assembly program written by a hardware guy. The program emulated a CDC communications protocol that was originally done in hardware. This was on a Cincinnati Milacron 2200B, a machine that had both variable instruction length and variable data length. The hardware guy had implemented the protocol state changes by putting a label *on the address portion* of jump statements (he did this in 50 different places in the program) and then in some other area of the code he would change where the jump branched to next time through. It bordered on an implementation of the mythical COME FROM instruction. Of course, there was zero documentation and almost zero comments.
After one marathon debugging session I was so frustrated I was in tears. My manager came in and wanted to know what the problem was. I gave him the listing and left to walk around the building a few times. When I came back, he told me that it was, hands down, the worst piece of crap he had seen in 20 years. He had me rewrite it from scratch, which I did over a long weekend.
The program's name was RIP/TIP (Receive Interrupt Processor/Transmit Interrupt Processor) and I was in therapy for most of a year. (There were a few other issues, but this was the bale of hay that made me snap.)
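For anyone who never saw the sane alternative spelled out: protocol state changes belong in one explicit state variable or function pointer, changed in exactly one place, not in fifty patched jump instructions. Here's a minimal sketch in modern C, with invented states that have nothing to do with the actual CDC protocol:

    #include <stdio.h>

    typedef enum { EV_SYN, EV_DATA, EV_EOT } event_t;

    /* A state is just "which handler runs next"; wrapping the function
     * pointer in a struct lets a handler return the next state directly. */
    struct state;
    typedef struct state (*handler_fn)(event_t ev);
    struct state { handler_fn fn; };

    static struct state st_idle(event_t ev);
    static struct state st_receiving(event_t ev);

    static struct state st_idle(event_t ev)
    {
        if (ev == EV_SYN) {
            puts("idle: sync seen, start receiving");
            return (struct state){ st_receiving };
        }
        puts("idle: ignoring event");
        return (struct state){ st_idle };
    }

    static struct state st_receiving(event_t ev)
    {
        switch (ev) {
        case EV_DATA:
            puts("receiving: data byte");
            return (struct state){ st_receiving };
        case EV_EOT:
            puts("receiving: end of transmission");
            return (struct state){ st_idle };
        default:
            return (struct state){ st_receiving };
        }
    }

    int main(void)
    {
        event_t script[] = { EV_DATA, EV_SYN, EV_DATA, EV_DATA, EV_EOT };
        struct state s = { st_idle };

        /* "Where we jump next" is plain data, updated in one place. */
        for (unsigned i = 0; i < sizeof script / sizeof script[0]; i++)
            s = s.fn(script[i]);
        return 0;
    }

The whole point is that "where we jump next" is ordinary data you can inspect in a debugger, not a hole punched into the instruction stream.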
Re:What's going on..... (Score:5, Interesting)
Everything I've ever heard as a "Gartner opinion" got one of two reactions from me:
1. Well duh.
2. No, that's obviously wrong.
Looks like this is #2.
Re:Some Facts... (Score:4, Interesting)
The PS3 COULD run it at 360 resolution, but it might have to sacrifice some of those filters and special effects. I'd rather have a special-effects-laden game run at slightly lower resolution myself, as long as it's hard to notice.
Re:Some Facts... (Score:3, Interesting)
Um, this is what PS3 owners like to tell themselves before they start crying at bedtime, maybe...
However, the PS3 is using a virtually off-the-shelf GeForce 7800 GPU core. The Xbox 360 is using a variant of an off-the-shelf ATI 2600 (from before the 2600 GPU ever existed as a retail part).
The Xbox 360 GPU is a unified-shader GPU and handles all the DX10 features and effects; the 7800 GPU DOES NOT. (See, DX10 and the specifications for Vista came from the Xbox 360 team; this is why Vista can push some serious frame rates for games and still be a general consumer OS.)
Sure, the PS3 could run at the Xbox 360's resolution. However, it would lose FPS and load times would get even longer.
Don't forget your precious Blu-ray, which is so freaking slow the game has to be copied to the PS3 hard drive to keep up with the Xbox 360's DVD drive. (Microsoft even kindly gave Sony a heads-up that the slow nature of both HD DVD and Blu-ray would be a serious issue for fast-playing games that stream large worlds. Most games have 'load screens', which are just a hell of a lot longer on PS3; GTA IV doesn't have that luxury.)
The 'blur' effect you are referring to is what they used on the PS3 title to help 'reduce' how noticeable it was that there was no anti-aliasing. (See, the Xbox 360 is not only rendering at HD resolutions, but anti-aliasing the scene as well.)
The same 'blur' effect has been used in many games for a long time, especially PC games, back when video cards couldn't handle anti-aliasing. Take City of Heroes on the PC: nice game, and it has two direct blur settings for distant objects, because they artifact REALLY BADLY when there is no anti-aliasing. So if your card can't do it the right way, you flip on the distance blur and the non-anti-aliased distance artifacts get smudged on the screen. Almost anti-aliased quality, but it only works well on distant scenes, or where detail can be smudged away.
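If you want to see how cheap that trick really is, here's a toy sketch in C. A real engine does this per pixel in a shader, and every name here is made up; the idea is just to blend a pre-blurred sample toward the sharp one as depth grows:

    #include <stdio.h>

    typedef struct { float r, g, b; } color_t;

    static float clamp01(float x)
    {
        return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x);
    }

    /* blur_start/blur_end set where the smudging ramps from 0% to 100%. */
    static color_t distance_blur(color_t sharp, color_t blurred, float depth,
                                 float blur_start, float blur_end)
    {
        float t = clamp01((depth - blur_start) / (blur_end - blur_start));
        color_t out = {
            sharp.r + (blurred.r - sharp.r) * t,
            sharp.g + (blurred.g - sharp.g) * t,
            sharp.b + (blurred.b - sharp.b) * t,
        };
        return out;  /* t = 0 near the camera (sharp), t = 1 far (blurred) */
    }

    int main(void)
    {
        color_t sharp = { 1.0f, 0.2f, 0.2f }, blurred = { 0.6f, 0.4f, 0.4f };
        for (float d = 0.0f; d <= 100.0f; d += 25.0f) {
            color_t c = distance_blur(sharp, blurred, d, 20.0f, 80.0f);
            printf("depth %5.1f -> r = %.2f\n", d, c.r);
        }
        return 0;
    }

Near geometry stays sharp; far geometry, where the shimmer lives, gets pulled toward the blurred buffer. That's exactly why it only works well on distant scenes.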
Now if you really want to try to argue the 'blur' effects are something the XBox 360 can't do, I suggest you go grab a whitepaper on the GPU differences between the 360 and the PS3, and even pick up the whitepapers on the consumer counterparts, the NVidia 7800 and ATI 2600 - trust me when I say there are more than a 'few' features the XBox 360 GPU will do that the older NVidia chip just can't handle.
PS: I'm a fan of NVidia; I run them in every laptop and most desktops I own. Even my old beat-around traveling laptop from 2005, a simple early dual-core P4 w/HT, has a 7950GTX mobile GPU... The funny thing is, that 2005 laptop can run games at a higher FPS than the PS3, and even do it at full 1920x1200, since even though it is a mobile GPU, the 7950GTX w/512MB is FASTER THAN THE GPU in the PS3. Hope this makes you sleep better at night...