
IBM's Plans For the Cell Processor

Posted by Soulskill
from the breeding-a-better-hamster dept.
angry tapir writes "Development around the original Cell processor hasn't stalled, and IBM will continue to develop chips and supply hardware for future gaming consoles, a company executive said. IBM is working with gaming machine vendors including Nintendo and Sony, said Jai Menon, CTO of IBM's Systems and Technology Group, during an interview Thursday. 'We want to stay in the business, we intend to stay in the business,' he said. IBM confirmed in a statement that it continues to manufacture the Cell processor for use by Sony in its PlayStation 3. IBM also will continue to invest in Cell as part of its hybrid and multicore chip strategy, Menon said."


  • by KugelKurt (908765) on Tuesday October 12, 2010 @01:58AM (#33866510)

    The story is not that IBM continues to manufacture chips but that the Cell design is not dead. This contradicts earlier stories to some degree.
    In all fairness, it contradicts them only on the surface: in the older story IBM only stated that Cell as a separate design will end and that its co-processor-heavy design will merge with future POWER iterations.

    There were also rumors that IBM won't manufacture PS3 Cell CPUs any longer, leaving it to contractors.

  • by Anonymous Coward on Tuesday October 12, 2010 @02:05AM (#33866558)

    The Atari Transputer Workstation [wikipedia.org] already did that in the 80s. Coolest real-time raytracing ever!

  • Re:affordable (Score:2, Informative)

    by Nursie (632944) on Tuesday October 12, 2010 @02:12AM (#33866586)

    Get PS3 with 3.41 or earlier firmware -> Jailbreak -> install linux.

    Profit?

    (Actually, Linux for jailbroken PS3s is in the very early stages, but I'm sure it'll get there.)

  • by Johnno74 (252399) on Tuesday October 12, 2010 @02:33AM (#33866666)

    Can't buy ARM outside a cellphone? Are you kidding?

    Check this out - this is just one I found in about 5 seconds:

    http://www.makershed.com/ProductDetails.asp?ProductCode=MKND01 [makershed.com]

    There are dozens of ARM boards out there suitable for DIY/embedded systems

  • by Anonymous Coward on Tuesday October 12, 2010 @03:41AM (#33866914)
    No; no you don't recall correctly, not even a little bit. Not a jot, not a tittle. Cell was designed specifically for the PS3, and maybe for other kinds of (repetitive streaming type) work that is mostly done by GPUs and/or CUDA in this day and age.
  • by TonyMillion (545370) on Tuesday October 12, 2010 @03:59AM (#33866980) Homepage

    Odd; when we were working with Cell we went straight to Matrix Vision and they LOANED us the hardware for about a year. Nothing sleazy at all. IBM also loaned us a server, as did Sony (a beautiful rack-mount job which will never see the light of day).

    http://www.matrix-vision.com/products/cell.php?lang=en [matrix-vision.com]

    Bottom line: the PPC part of the Cell is rubbish, with terrible I/O, and generally 'weak' by today's standards. The SPEs are great, but 256 KB isn't enough memory on them for the algorithms plus the tables we needed to process the data.

    In the end, optimizing for Intel and SSE3 and making the algorithms multi-core capable gave a better pain-to-performance ratio than working on the Cell, which would have required all the additional work of managing DMA to and from the meagre memory on the SPEs.

  • by l33t gambler (739436) on Tuesday October 12, 2010 @09:35AM (#33868768) Homepage

    I found this article interesting. It covers Valve's approach to multi-core CPUs and game engines.

    The programmers at Valve considered three different models to solve their problem. The first was called "coarse threading" and was the easiest to implement. Many companies are already using coarse threading to improve their games for multiple core systems. The idea is to put whole subsystems on separate cores; for example, graphics rendering on one, AI on another, sound on a third, and so on. The problem with this approach is that some subsystems are less demanding on CPU time than others. Giving sound, for example, a whole core to itself would often leave up to 80 percent of that core sitting unused.
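    The coarse-threading idea above can be sketched like this (illustrative Python; the subsystem names and workloads are made up, not Valve's code). Note how the audio thread finishes almost immediately while rendering dominates, which is exactly the load-imbalance problem described:

    ```python
    import threading

    results = {}

    # Hypothetical subsystem workloads of very different sizes.
    def render(frames):  results["render"] = sum(i * i for i in range(frames))
    def run_ai(ticks):   results["ai"] = sum(range(ticks))
    def mix_audio(n):    results["audio"] = n

    # Coarse threading: one thread per subsystem, regardless of how much
    # work each actually has. The audio "core" sits mostly idle.
    threads = [
        threading.Thread(target=render, args=(10_000,)),
        threading.Thread(target=run_ai, args=(1_000,)),
        threading.Thread(target=mix_audio, args=(1,)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    ```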

    The second approach was fine-grained threading, which separates tasks into many discrete elements and then distributes them among as many cores as are available. For example, a loop that updates the position of 1,000 objects based on their velocity can be divided among, say, four cores, with each core handling 250 objects apiece. The drawback with this approach is that not all tasks divide neatly into discrete components that can operate independently. Also, if some entries in the list take longer to update than others, it becomes harder to scale the tasks evenly across multiple cores. Finally, the issue of memory bandwidth quickly becomes a limitation with this method. For certain specialized tasks, such as compiling, fine-grained threading works really well. Valve has already implemented a system whereby every computer in their offices automatically acts as a compiler node. When the programmers were getting ready to demonstrate their results on the conference room computer with the big screen, they had to quickly deactivate this feature first!
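    The 1,000-objects-across-four-cores example translates roughly to this (a sketch, assuming a simple position += velocity * dt update; none of it is from Valve's engine):

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical data: 1,000 objects as (position, velocity) pairs.
    objects = [(float(i), 1.0) for i in range(1000)]
    DT = 0.1

    def update_chunk(chunk):
        # Each worker integrates its 250-object slice independently.
        return [(pos + vel * DT, vel) for pos, vel in chunk]

    # Fine-grained threading: split the loop into 4 equal chunks,
    # one per core. If some objects were costlier to update than
    # others, these chunks would finish at different times.
    chunks = [objects[i * 250:(i + 1) * 250] for i in range(4)]
    with ThreadPoolExecutor(max_workers=4) as pool:
        updated = [obj for part in pool.map(update_chunk, chunks) for obj in part]
    ```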

    The approach that Valve finally chose was a combination of the coarse and fine-grained, with some extra enhancements thrown in. Some systems were split on multiple cores using coarse threading. Other tasks, such as VVIS (the calculations of what objects are visible to the player from their point of view) were split up using fine-grained threading. Lastly, whenever part of a core is idle, work that can be precalculated without lagging or adversely affecting the game experience (such as AI calculations or pathfinding) was queued up to be delivered to the game engine later.
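    The "queue up precomputable work for idle time" part of the hybrid scheme can be sketched as a deferred-work queue drained under a per-frame budget (all names here are illustrative stand-ins, not Valve's API):

    ```python
    import queue

    # Deferrable work (e.g. pathfinding) is queued and only drained when
    # a core has spare time; the engine picks up results on a later frame.
    deferred = queue.Queue()
    completed = {}

    def precompute_path(start, goal):
        return [start, goal]  # stand-in for a real pathfinding search

    deferred.put(("path-1", precompute_path, (0, 9)))

    def drain_idle_work(budget):
        # Run at most `budget` deferred tasks so we never lag the frame.
        done = 0
        while done < budget and not deferred.empty():
            key, fn, args = deferred.get()
            completed[key] = fn(*args)
            done += 1

    drain_idle_work(budget=4)
    ```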

    Valve's approach was the most difficult of all possible methods for utilizing multiple cores, but if they could pull it off, it would deliver the maximum possible benefits on systems like Intel's new quad-core Kentsfield chips.

    To deliver this hybrid threading platform, Valve made use of expert programmers like Tom Leonard, who was writing multithreaded code as early as 1991 when he worked on C++ development tools for companies like Zortech and Symantec. Tom walked us through the thought process behind Valve's new threading model.

    http://arstechnica.com/gaming/news/2006/11/valve-multicore.ars [arstechnica.com]

  • by Jesus_666 (702802) on Tuesday October 12, 2010 @10:52AM (#33869878)
    I don't particularly care about the XBox. The last non-portable console I actually was interested in was the PS2.

    This comes from the point of view of a casual gamer who is not concerned with having the latest and greatest but has a brother who is. I've seen the X360 perform on a large HDTV set and I've seen the PS3 perform on the same set. Both look good. The Cell may outperform the X360 by a large margin if given enough time but that remains to be seen. Right now I'd put them as reasonably close (= to someone who isn't an expert on console graphics the PS3 is not obviously superior, which is exactly what I wrote).

    Don't get me wrong, the Cell is powerful. Nobody would use the X360 for a scientific cluster but the PS3 was popular for that until Sony killed Other OS. However, that power is not easy to work with. They had a very impressive realtime raytracing demo on the GCDC with the SPEs doing the raytracing work and the PPE coordinating and compositing everything. Very nice.

    But at the same time there were a lot of workshops (and at the GC proper, a lot of developers) pointing out that getting an engine running on the PS3 takes much more effort than on more traditional systems because it's a completely different programming model. Treat the SPEs as small CPUs and watch your framerate drop to the low single digits; ignore them and you're wasting most of the system's power. The SPEs have a tiny amount of RAM, and you're expected to code in such a way that you deliver data to them in a single DMA operation. If your data set is too big for the SPE, or your packet size doesn't align with what the Cell can do in a single DMA operation, you plug up the bus and all the SPEs starve.
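    The chunk-streaming constraint described above looks roughly like this (a Python analogy only, not real SPE code; on the hardware the "copy" would be an asynchronous DMA transfer overlapped with compute, and the sizes below are illustrative):

    ```python
    # The 256 KB local store forces streaming in DMA-sized chunks
    # instead of random access into main memory.
    LOCAL_STORE = 256 * 1024                 # bytes of SPE local store
    ELEM_SIZE = 16                           # a 128-bit vector element
    CHUNK = LOCAL_STORE // 2 // ELEM_SIZE    # half the store per buffer
                                             # (leaving room to double-buffer)

    main_memory = list(range(100_000))

    def process(buf):
        return [x * 2 for x in buf]  # stand-in for the real kernel

    out = []
    # Stream through main memory one local-store-sized chunk at a time.
    for i in range(0, len(main_memory), CHUNK):
        local_buffer = main_memory[i:i + CHUNK]  # stand-in for DMA-in
        out.extend(process(local_buffer))        # compute on local data
    ```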

    It may very well be that the PS3 is a late bloomer and that we will see more and more graphics optimized for the Cell. Then again, Microsoft might be able to afford to just release a new Xbox sooner than Sony can replace the PS3, as (if I remember correctly) the PS3 was really expensive to develop.

    The big question is whether the PS3's approach of having a really powerful but hard to use processor is viable in the marketplace. If Microsoft can just toss out consoles at lower development cost and Nintendo outsells both of them by delivering cheap systems to casual gamers, Sony might be facing trouble.
