
IBM PlayStation (Games) Hardware

IBM's Plans For the Cell Processor

Posted by Soulskill
from the breeding-a-better-hamster dept.
angry tapir writes "Development around the original Cell processor hasn't stalled, and IBM will continue to develop chips and supply hardware for future gaming consoles, a company executive said. IBM is working with gaming machine vendors including Nintendo and Sony, said Jai Menon, CTO of IBM's Systems and Technology Group, during an interview Thursday. 'We want to stay in the business, we intend to stay in the business,' he said. IBM confirmed in a statement that it continues to manufacture the Cell processor for use by Sony in its PlayStation 3. IBM also will continue to invest in Cell as part of its hybrid and multicore chip strategy, Menon said."
This discussion has been archived. No new comments can be posted.

  • by Animats (122034) on Tuesday October 12, 2010 @01:54AM (#33866498) Homepage

    The basic problem with the Cell processor is that it has 256KB (not MB, KB) per processor, plus a bulk transfer mechanism to main memory. Given that model, it has to be programmed like a DSP - very little state, processing works on data streams. For games, this sucks. No CPU has enough memory for a full frame, or for the geometry, or a level map. Trying to hammer programs into that model is painful. (Except for audio. It's great for audio.) In many PS3 games, the main MIPS machine is doing most of the work, with the Cell CPUs handling audio, networking, and I/O. And, of course, Sony had to put an NVidia graphics processor in the thing late in the development cycle, once people finally realized that the Cell CPUs couldn't handle the rendering.

    But if each Cell CPU had, say, 16MB, the Cell machines could be treated more like a cluster. Programming for clusters is well understood, and not too tough.

    It's probably too late, though. Multi-core shared memory cache-consistent machines are now too good. It's not necessary to use an architecture as painful as the Cell. It's probably destined for the graveyard of weird architectures, along with data flow machines, hypercubes, SIMD machines, systolic processors, semi-shared-memory multiprocessors, and similar hardware that's straightforward to build but tough to program.
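
    The constraint described above can be sketched in plain C. This is a hypothetical stand-in, not real Cell code: the `memcpy` calls play the role of the SPE's DMA engine, and `LOCAL_STORE_BYTES` models the 256KB local store, so a large array in main memory must be processed in tiles that are explicitly copied in, transformed, and copied back out.

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical sketch of the DSP-style model the comment describes:
 * main memory is large, but the worker sees only a small local store
 * (256 KB on a real SPE), so big arrays must be processed in tiles.
 * memcpy stands in for the Cell's bulk DMA transfer mechanism. */

#define LOCAL_STORE_BYTES (256 * 1024)

/* Apply a simple transform (scale by 2) to n floats in main memory,
 * never holding more than one local-store-sized tile at a time. */
void process_stream(float *main_mem, size_t n)
{
    static float local[LOCAL_STORE_BYTES / sizeof(float)];
    const size_t tile = LOCAL_STORE_BYTES / sizeof(float);

    for (size_t off = 0; off < n; off += tile) {
        size_t count = (n - off < tile) ? (n - off) : tile;
        memcpy(local, main_mem + off, count * sizeof(float)); /* "DMA in"  */
        for (size_t i = 0; i < count; i++)
            local[i] *= 2.0f;                                 /* compute   */
        memcpy(main_mem + off, local, count * sizeof(float)); /* "DMA out" */
    }
}
```

    The pain the comment describes is exactly this: any state larger than one tile (a frame, a level map) has to be chopped up and marshalled through the small local store by hand.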

  • by KingFrog (1888802) on Tuesday October 12, 2010 @01:59AM (#33866516)
    I would not want to bet against IBM in this market. Their Cell chip, an asymmetric multi-core CPU architecture, seemed bizarre when announced, but has proven to be quite good for these workloads. If IBM is looking to leverage its regular POWER chipset for the console market, it will probably build some screamers with them. Cell and POWER both have Unix and Linux adaptations running on them, so having the capability seems trivial; whether vendors will want you using their hardware that way is another matter entirely. After all, the chief reason console games cost so much is that for every copy sold, the developer pays the console hardware manufacturer a licensing fee -- unlike the PC arena, where the architecture is published and you can develop for it at effectively no additional cost.
  • by dbIII (701233) on Tuesday October 12, 2010 @02:00AM (#33866522)
    A while back I was looking for one or two Cell-based machines as development boxes for in-house geophysical software - basically to see if it was worth moving onto that platform. The three-week process between contacting what appeared to be the only vendor of Cell-based workstations and getting a price for an entry-level machine was frustrating. It involved daily calls to a slimy bastard who appeared to just want to waste time trying to become my friend while he carefully finished weighing my company's wallet.
    In the end the time window had come and gone (the developers got bored or gave up on the idea of using the Cell) before I could get even a hint at the price, but I kept going for the sake of future projects. The price for one workstation with one processor was roughly that of six of our cluster nodes. You would need some sort of black-ops budget, where any accountants coming close are shot on sight, before paying that sort of price. An entry-level machine not much different from a PlayStation with more memory cost a truly insane and unjustifiable price.
  • by Nursie (632944) on Tuesday October 12, 2010 @02:09AM (#33866576)

    You've really missed hearing about Cell?

    It's a new processor architecture that IBM and Sony (and possibly others) had a hand in. Effectively two "Power" cores and a bunch of vector processing units. It's supposed to be very, very good for vector operations. For a while (a few years back now) the world's most powerful supercomputer was a machine composed of nodes containing two Cell processors and an Opteron each.

    It's different to other parallelisation strategies in that the vector units (SPUs/SPEs) allow you to parallelise work at an operation level, unlike just stuffing more cores into the box, which is the Intel/PC strategy. For games and graphics this is thought to be good, hence its inclusion in the PlayStation 3. It's also supposed to be good for scientific computing.

    I guess you could think of it as somewhere between a CPU and a GPU, or a hybrid of the two approaches.
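
    The "operation level" parallelism mentioned above can be illustrated with a plain-C stand-in for a 4-wide vector unit. This is only a sketch; a real SPE executes the four lanes of `vec4_madd` as a single instruction rather than a loop, and the type and function names here are invented for illustration.

```c
/* Sketch of operation-level (SIMD) parallelism: a vector unit applies
 * one instruction to several floats at once. vec4_madd is a plain-C
 * stand-in for the fused multiply-add that such units provide. */

typedef struct { float v[4]; } vec4;

/* Per-lane a*b + c; on real vector hardware this is one instruction. */
static vec4 vec4_madd(vec4 a, vec4 b, vec4 c)
{
    vec4 r;
    for (int i = 0; i < 4; i++)
        r.v[i] = a.v[i] * b.v[i] + c.v[i];
    return r;
}

/* A dot product built from lane-wise ops, the way vectorized
 * graphics and scientific kernels typically do it. */
static float vec4_dot(vec4 a, vec4 b)
{
    vec4 zero = {{0.0f, 0.0f, 0.0f, 0.0f}};
    vec4 p = vec4_madd(a, b, zero);
    return p.v[0] + p.v[1] + p.v[2] + p.v[3];
}
```

    Vertex transforms, lighting, and physics are dominated by exactly this kind of lane-parallel arithmetic, which is why the approach was attractive for a games console.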

  • what would be cool (Score:3, Interesting)

    by AVryhof (142320) <avryhof.gawab@com> on Tuesday October 12, 2010 @06:50AM (#33867610) Homepage

    What would be a pretty cool chip would be an 8-core chip with 4 x86_64 cores, two graphics cores, and two Cell cores. (perhaps IBM + AMD working together)

    After that, build a custom Linux with MeeGo as the front end / launcher. It would be cool if game console makers embraced open source for everything up to launching the games. And if they don't want their SDK open source, that's fine; just make the operating system so it can launch the games, then get out of the way. Run it on two cores (for better functionality with multimedia capabilities, ebook reading, etc.) and use the rest of the cores (2 x86_64, 2 graphics and 2 Cell) for gaming.

    As for the other hardware: composite, component, HDMI, VGA, WiFi, Ethernet, and a headphone jack (maybe Bluetooth for wireless controllers and the ability to use Bluetooth headsets), plus Blu-ray, a card reader, and USB.

    This is all off the top of my head, and would be a pretty cool gaming console, which would truly capture the home entertainment medium and make most people looking for gadgets, consoles, or HTPCs drool appropriately.

  • by hairyfeet (841228) <`moc.liamg' `ta' `8691tsaebssab'> on Tuesday October 12, 2010 @07:09AM (#33867688) Journal

    I have been wondering just how long it will take for the "ooohhh shiny!" factor to wear thin. Hell, I fire up Far Cry or Wolfenstein on my $36 HD4650 and people stand around and go "oooohhh". You really don't need anything higher-end to have decent immersion in a game, and especially with an FPS, if the game is worth a damn you are too busy dodging fire to just stand around and look at the shiny. Then add in the spiraling costs and schedule delays that piling on lots of "ooohhh shiny" brings, and it quickly becomes "get a hit, and on time, or we're all out of business" - and that simply isn't sustainable long term.

    That is why I wouldn't be surprised if the next-gen gaming consoles do something similar to the original Xbox, which I thought was a damned good idea at the time. You could take a cheap ULV Phenom II quad, add a 5xxx Radeon GPU and some decent controllers, and have the average Joe drooling at the "ooohhh shiny" for a long time; the combination of cheap hardware, the ability for developers to easily code with tools they already have, and the quick time to market would probably make it a hit.

    I just don't see how the incredible amounts of money required to bring out a new generation of consoles could avoid seriously hurting a company's bottom line. With a more off-the-shelf approach, all they have to do is cook up the DRM and a close-to-bare-metal OS, then let economies of scale keep the price low out of the gate and drive it even lower as time goes on. While MSFT could blow the cash simply because they have twin cash cows in Office and Windows, I doubt Sony will be able to afford the needed capital, and Nintendo has made it pretty clear they aren't gonna play the "ooohhh shiny!" game at all, targeting the Wii at casual gamers instead.

    I just don't see a never-ending "ooohhh shiny" arms race being good for anybody. Just look at how ATI is using Eyefinity to push new GPUs and Nvidia is looking at HPC with CUDA; even they know the "ooohhh shiny" can only go so far. Hell, I figured when I got the HD4650 it would just be a stopgap until I could get a $150+ GPU, but now? It plays Bioshock II and everything else I throw at it with plenty of ooohh shiny and doesn't turn my apartment into a sauna, so why bother? I used to be a serious graphics whore, but even I got tired of the ooohh shiny and now prefer games that are actually... what's the word?... oh yeah, FUN. I'm starting to wonder if the whole graphics race is hitting a dead end.

  • Laughable Drivel (Score:2, Interesting)

    by RingBus (1912660) on Tuesday October 12, 2010 @07:27AM (#33867782)

    "Sony had to put an NVidia graphics processor in the thing late in the development cycle, once people finally realized that the Cell CPUs couldn't handle the rendering."

    My god. You are repeating that Beyond3d forum lie in late 2010???

    "For games, this sucks"
    "Trying to hammer programs into that model is painful. (Except for audio. It's great for audio."
    "In many PS3 games, the main MIPS machine is doing most of the work, with the Cell CPUs handling audio, networking, and I/O."
    "It's not necessary to use an architecture as painful as the Cell."
    "tough to program."

    It's like you tried to parrot every Beyond3d x86 fanboy talking point you could remember.

  • by wowbagger (69688) on Tuesday October 12, 2010 @07:30AM (#33867796) Homepage Journal

    I can go one better: I do signal processing for a living - chewing on multi-hundred-megasample-per-second streams of data in real time. The Cell looked like a perfect fit. We were looking at thousands of units per year. We contacted IBM - sorry, not enough zeros on that number for them to sell us the chips. OK, are there any vendors targeting the uTCA form factor (which the telecoms folks are all over, so they would not have been targeting just us)? Nope, just large blades for mainframes.

    I assert that IBM doesn't want to be in the chip business - at least, not "selling chips to anybody else". They don't mind making chips for their own use, but they really don't have the infrastructure to sell to anybody else.

    Sony and Toshiba don't want to be in the high-end CPU market, they want to be in the mass-market stuff.

    Had IBM licensed the Cell design to somebody like Freescale, they might have gone somewhere.

    Sorry, but I RTFA - and what I came away with was "We will continue to support Sony for as long as Sony wants to make PS3's". I saw nothing that really said "We are going to be going someplace else with this."

  • by MediaStreams (1461187) on Tuesday October 12, 2010 @08:19AM (#33868030)

    Back in the early PS2 days we would talk about what a next-generation PS2 would look like. Those whiteboard diagrams looked almost identical to what Sony and IBM came up with.

    The parallels between the PS2/EE/GS and the PS3/Cell/RSX are striking:

    Execution starts on the EE/PPU
    Heavy/parallel computation task is spawned off to the VUs/SPUs
    Light control code runs in parallel on the EE/PPU
    As graphical elements become ready to be rasterized, they are handed off to the GS/RSX

    In a well running PS2/PS3 engine all three major areas are running full speed in parallel. Split memory architecture lets each area of the machine run at full speed without interfering with the rest of the system.
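
    That spawn-and-overlap pattern can be sketched with ordinary threads. This is a stand-in, not engine code: `pthread_create` plays the role of kicking a job off to a VU/SPU, the invented `heavy_kernel` stands in for the heavy computation, and the control thread keeps running in parallel until the join.

```c
#include <pthread.h>
#include <stddef.h>

typedef struct {
    const float *data;
    size_t n;
    double result;
} job_t;

/* Heavy numeric kernel - the kind of work handed off to a VU/SPU. */
static void *heavy_kernel(void *arg)
{
    job_t *job = arg;
    double sum = 0.0;
    for (size_t i = 0; i < job->n; i++)
        sum += (double)job->data[i] * job->data[i];
    job->result = sum;
    return NULL;
}

/* "EE/PPU side": spawn the kernel, keep doing light control work in
 * parallel, then collect the result. */
double run_frame(const float *data, size_t n)
{
    pthread_t worker;
    job_t job = { data, n, 0.0 };
    pthread_create(&worker, NULL, heavy_kernel, &job);
    /* ...light control code would run here, in parallel... */
    pthread_join(worker, NULL);
    return job.result;
}
```

    On the real hardware the third stage - feeding finished geometry to the GS/RSX - would also be running concurrently, so all three areas stay busy at once as the comment describes.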

    Kutaragi and IBM did a masterful job. It was an obvious choice to build off the model of the most successful console architecture in history, and the one all console developers had intimate knowledge of: the 145-million-selling PS2.

  • by CronoCloud (590650) <cronocloudauron@@@gmail...com> on Tuesday October 12, 2010 @08:53AM (#33868234)

    The SPEs aren't full CPUs; they're essentially enhanced versions of the PS2's VUs.

    Given that model, it has to be programmed like a DSP - very little state, processing works on data streams.

    Yep, stream data, just like on the PS2.

    For games, this sucks. No CPU has enough memory for a full frame, or for the geometry, or a level map.

    You're not supposed to keep a full frame or map in there, you're supposed to stream it in and out on the fly, as the Kami intended, just like on the PS2.
    "Fat Pipes (bandwidth), small pans (VU/SPE RAM)" http://arstechnica.com/old/content/2000/04/ps2vspc.ars/1 [arstechnica.com]
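    The "fat pipes, small pans" idea is double buffering: fetch the next chunk while computing on the current one. Here is a hedged plain-C sketch with invented names; `memcpy` stands in for an asynchronous DMA, so on real hardware the fetch and the compute in each iteration would genuinely overlap instead of running back to back.

```c
#include <stddef.h>
#include <string.h>

#define TILE 4096  /* floats per "pan"; small relative to the stream */

/* Sum a large stream using two small buffers: while tile t is being
 * processed, tile t+1 is already being fetched into the other buffer.
 * memcpy stands in for an async DMA kicked off ahead of the compute. */
double process_double_buffered(const float *src, size_t n)
{
    static float buf[2][TILE];
    size_t tiles = (n + TILE - 1) / TILE;
    double acc = 0.0;

    /* prefetch tile 0 into buffer 0 */
    size_t first = (n < TILE) ? n : TILE;
    memcpy(buf[0], src, first * sizeof(float));

    for (size_t t = 0; t < tiles; t++) {
        int cur = t & 1;
        size_t off = t * TILE;
        size_t count = (n - off < TILE) ? (n - off) : TILE;

        /* kick off the fetch of the NEXT tile into the other buffer */
        if (t + 1 < tiles) {
            size_t noff = (t + 1) * TILE;
            size_t ncount = (n - noff < TILE) ? (n - noff) : TILE;
            memcpy(buf[!cur], src + noff, ncount * sizeof(float));
        }

        /* compute on the current tile */
        for (size_t i = 0; i < count; i++)
            acc += buf[cur][i];
    }
    return acc;
}
```

    Keep the pipe full and the small pans never stall the compute - which is exactly the discipline PS2 VU programming demanded and the SPEs inherited.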

    In many PS3 games, the main MIPS machine is doing most of the work, with the Cell CPUs handling audio, networking, and I/O

    The Cell isn't MIPS, it's PPC; the PS2 (and PS1) were the MIPS machines. The SPEs are supposed to handle things like audio and networking - that's their job. Apparently you can also assign an SPE to tasks like very fast bzip decompression.

  • I remember reading somewhere that one of the goals in PS2 programming was keeping that DMAC running full tilt streaming data. Ah, found it, Ars Technica:

    http://arstechnica.com/old/content/2000/04/ps2vspc.ars/4 [arstechnica.com]
