The Mystery of Cell Processors

LucidBeast writes "With consumer appliances requiring ever more computing power, Sony, IBM and Toshiba began developing the "Cell" processor in 2001; it comprises multiple processor cores and should deliver ten times the performance of conventional processors. Now CNN Money reports that details of the processor will be released Feb. 6-10 at the International Solid State Circuits Conference in San Francisco. Also reported by EE Times. Rumor has it that Sony's PS3 development platform, equipped with the Cell processor, has already been shipped to some developers."
This discussion has been archived. No new comments can be posted.


  • Article text (Score:4, Informative)

    by mrhandstand ( 233183 ) on Monday November 29, 2004 @08:05AM (#10941833) Journal
    Chip power, times 10
    Sony, IBM, Toshiba disclose details of new processor that will run next-generation electronics.
    November 29, 2004: 6:13 AM EST

    TOKYO (Reuters) - IBM, Sony Corp. and Toshiba Corp. on Monday unveiled some key details on the powerful new "Cell" processor the three are jointly producing to run next-generation computers, game consoles and TVs.

    Cloaked in secrecy and the object of much speculation since the three conglomerates announced the project in 2001, Cell will be 10 times more powerful than conventional chips and able to shepherd large chunks of data over broadband networks.

    In a joint release, the three firms gave a glimpse of their respective plans for Cell-powered products, but were mum on technical details, which will be revealed Feb. 6-10 at the International Solid State Circuits Conference in San Francisco.

    IBM (Research), Sony (Research) and Toshiba are investing billions of dollars to develop and prepare for mass production of Cell, which is a multicore semiconductor composed of several processors that work together to handle multiple tasks at the same time.

    "In the future, all forms of digital content will be converged and fused onto the broadband network," Ken Kutaragi, executive deputy president and COO of Sony, said in the release. "Current PC architecture is nearing its limits."

    IBM said it would start pilot production of the microprocessor at its plant in East Fishkill, N.Y., in the first half of 2005. It will use advanced 300 millimeter silicon wafers, which yield more chips per wafer than the 200 mm kind.

    It also announced plans to first use the chip in a workstation it is developing with Sony, targeting the digital content and entertainment industries.

    Sony said it would launch home servers and high-definition televisions powered by Cell in 2006, and reiterated plans to use the microchip to power the next-generation PlayStation game console, a working version of which will be unveiled in May.

    Toshiba said it planned to launch a high-definition TV using Cell in 2006.
    • When PS2 was launched, incredible specs were also touted; on delivery it ended up cheaper but not more powerful than a high-spec PC with a good video card one year later. I am afraid we might end up with another mediocre product at a reasonable price point. Sony should concentrate on portable systems integration which is where its real expertise lies.

      • by jmcmunn ( 307798 ) on Monday November 29, 2004 @08:20AM (#10941887)
        I think Sony also has some expertise in the console market; after all, they do have two of the best-selling consoles ever. And of course they are the current king of the console market, so I would think that should stand for something as far as "expertise" goes.

        But yes, we will likely be underwhelmed by the PS3 when it comes out. But all of the "non-geeks" out there who never heard the five versions of the inflated specs that we were promised will still love the machine for what it is: a good game console.

        So it won't ever have the most teraflops on the world's supercomputer list...who cares?
      • Everyone laughed at Sony when they announced the PS1, yet they delivered exactly what they promised. The day the PS1 came out, Sony announced the PS2 specs. At the time, the specs were beyond amazing, but Sony, five years later, delivered. I seriously doubt they would hype up "10x the power of 'normal' processors," especially given the amount of money spent on research.
      • by Octagon Most ( 522688 ) on Monday November 29, 2004 @09:35AM (#10942473)
        "When PS2 was launched, incredible specs were also touted; on delivery it ended up cheaper but not more powerful than a high-spec PC with a good video card one year later. I am afraid we might end up with another mediocre product at a reasonable price point."

        Frankly I like the idea of delivering power comparable to a high-end PC in a less expensive console. Those that want the most possible power will pay the price for the PC anyway so they can keep it updated. The console buyer wants simplicity and low price. As a reformed geek myself I never want to touch the guts of a computer again. My two favorite electronic devices are my iMac and iPod. When I buy another game console I will be much more concerned with the quality of the games and the ease of use than the raw specs. I'd certainly like to see what all this power could deliver, but I'd rather it be US$199 than "incredible."
      • it ended up cheaper but not more powerful than a high-spec PC with a good video card one year later.

        Cheaper is the main thing. The PS2 is a decent machine with a DVD player for under $200, now. I'd be hard pressed to get a general purpose PC with a DVD player, TV/svideo output, and a remote control for that cost. I'm missing out on Doom 3, but that certainly isn't causing me much stress.
      • The Cell processor is not ONLY for the PS3. It's actually a very flexible system where the cores can assume different functionality depending on what is needed at that millisecond. For the PS3, most of the cells would be working on graphics most of the time. IBM is also planning to use the chips for workstations, where presumably most of the cells would be working on MPU functionality most of the time.

        I would not be surprised to see Apple use the chips if they get the OS ported to it.

        So yes PS3 probably won't
    • by choas ( 102419 )
      Wow, Thanks!

      To think we almost slashdotted CNN!

      • Re:Article text (Score:3, Insightful)

        by mrhandstand ( 233183 )
        Hey...you never know. Seriously, I post article text when appropriate because most readers/posters can't be bothered to actually read the damn thing unless it's in front of them.
    • by worst_name_ever ( 633374 ) on Monday November 29, 2004 @09:47AM (#10942561)
      "In the future, all forms of digital content will be converged and fused onto the broadband network"

      And all restaurants will be Taco Bell...

        • In addition to the obvious absurdity of saying "all", the naive optimism of the broadband-convergence prophets puts them in denial about the fact that many people often don't want gaming to be online. I've seen some of the sneaky things companies do to glean marketing data off of their paying customers, and it is rather annoying. I don't want my PC phoning home every time it boots (my ISP tries this), nor do I opt to plug a DirecTV unit into the phone jack, for example. Already, people report their lif
  • by Anonymous Coward on Monday November 29, 2004 @08:07AM (#10941836)
    "Consumer appliances requiring more computing power Sony, IBM and Toshiba started 2001 developing "Cell"-processor that comprises of multiple processor cores and should give performance ten times of conventional processors."

    What in the hell does that sentence mean? I can handle a couple of spelling or grammatical problems, but seriously! What the fuck does that mean? Are 3 companies working together to create this Cell processor, or are there three different Cell processors...

    • Someone's created a Cell processor which has become sentient and submitted a rather garbled story about itself to slashdot.
    • It means LSD is alive and well. It means retards have "hacked" their way into the submission queue. It means Cmdr. Taco is submitting stories under pseudonyms. It means you didn't recognize the plot of the Jennifer Lopez blockbuster, "The Cell". It means...I don't know what it means.
    • Re:Please Help! (Score:3, Insightful)

      by BenjyD ( 316700 )
      I always find it odd that so many "Nerds", people who spend their time programming in languages that demand incredibly exact syntax, can't get basic "natural language" syntax right.
      • by Jeff DeMaagd ( 2015 ) on Monday November 29, 2004 @08:53AM (#10942136) Homepage Journal
        I've seen a quote saying that once we get a natural language compiler, we'll find that geeks can't write.
      • Most of our parsers are written in C not C++; we don't throw exceptions.
      • Re:Please Help! (Score:2, Insightful)

        by Anonymous Coward
        Trust me, most of those people don't make much sense in programming languages, either, even if it is syntactically correct...
      • Maybe the poster doesn't speak English as a first language? I doubt you would pick on a lifetime C developer if he mangled FORTRAN, especially if he'd only had a few hours' exposure to it.

        Personally, I'm glad that *most* people aren't as picky as a compiler.

      • Re:Please Help! (Score:5, Insightful)

        by pla ( 258480 ) on Monday November 29, 2004 @09:34AM (#10942455) Journal
        I always find it odd that so many "Nerds", people who spend their time programming in languages that demand incredibly exact syntax, can't get basic "natural language" syntax right.

        We can. The problem arises in that other people cannot (or rather, do not, since most adults can form grammatically correct sentences if you force them to).

        Another, humorous, response to the parent post nicely illustrates the problem... The only way to parse it such that it remains (almost) grammatically correct runs along the lines of "three consumer appliances named Sony, IBM and Toshiba that are in need of more computing power".

        Now, you can say that any human reader would get the correct meaning. And in this situation, I'll grant that as most likely true. But if people use sloppy grammar in "obvious" sentences, they most likely will carry that into more subtle sentences as well.

        So when a geek chides someone for misuse of a natural language, insisting on an exactness bordering on formal logic, they/we do so because it improves comprehension.

        A non-geek might feel comfortable trying to divine a sloppy author's intended meaning. But we realize the consequences... Do that in a programming language, and at best you'll get buggy code. Do that in real life, and you get ambiguities such as (no political commentary intended) whether or not Bush said/implied a link exists between Saddam and Osama.
      • Perl is for those who like to choose their own syntax. :)

        Why have one or two ways to do things, when you can have eleven?
      • Why is that odd? Most programming languages demand exact syntax or else there is no meaning. Natural languages are much more tolerant, and there is no need for exact syntax. In the absence of an exact syntax requirement, most people, including nerds, are lazy.
        • English is not more tolerant: humans are. The post as written has either no meaning or a very strange one.

          I guess you're right, though: people are incredibly lazy in the absence of compilers.
      • "I always find it odd that so many "Nerds", people who spend their time programming in languages that demand incredibly exact syntax, can't get basic "natural language" syntax right."

        Most computer geeks seem to treat natural language as a form of pseudocode. It doesn't need to work, just as long as it approximates some idea of functionality.

      • If the number of typos and bugs in the code I debug is any measure, simply working in a picky environment is no guarantee of infallibility.

        Most geeks can type 40-80 words a minute. If you remove the backspace key, that rate drops tremendously.

    • by uradu ( 10768 )
      The only meaningful way I could parse it is that three consumer appliances named Sony, IBM and Toshiba, in need of more computing power, got together and started developing this "Cell" processor. If they're sentient enough to do that, what more do they need?!
  • Well.. (Score:5, Funny)

    by oexeo ( 816786 ) on Monday November 29, 2004 @08:08AM (#10941839)
    multiple processor cores and should give performance ten times of conventional processors.

    About 10 processor cores, right?

    They should have enough power to divide by zero by now, right? Or is that still too "difficult"?

  • by account_deleted ( 4530225 ) on Monday November 29, 2004 @08:09AM (#10941846)
    Comment removed based on user account deletion
  • by nick-less ( 307628 ) on Monday November 29, 2004 @08:11AM (#10941855)
    details of the processor will be released Feb. 6-10

    it gives a 10-times performance gain over a normal processor from the year 2001, of course, which would be something like a 1.3 GHz P4 or an 800 MHz Celeron, both introduced in January 2001 ;-)
    • So, it'd run at 10.3 GHz is what you're saying, or perform at an equivalent thereof? That's a fairly significant jump over the 4.7GHz Pentiums available now.
      • That's a fairly significant jump over the 4.7GHz Pentiums available now

        Yes, outperforming a current CPU by a factor of two or three isn't bad at all. But with several dual-core CPUs on the horizon, it's not as impressive as it was back in 2001 either...
      • The jump (if real, etc.) is significant in these terms if and only if it can compete at Pentium-level prices.

        If not, then it should be compared to other high-end chips, or more fairly to multi-processor implementations. We're not talking Ma's Dell here, but pricey and powerful workstations.

    • Put ten Pentium-like cores on a die or two, clock it at 1 GHz, and there you go. Given that current top-of-the-line CPUs have two ultra-modern cores at higher clock speeds, more, simpler cores should be feasible. Even Sun has eight cores on a single Niagara chip that'll run faster than 1 GHz, so why not IBM? Actually, I'd be surprised if, between Sun and IBM, Sun were the first to do it.

  • OK...so according to some marketdroid, "Current PC architecture is nearing its limits". I bet he owns stock in the company that is trying to sell you the new stuff! Last I checked, AMD and a few other BILLION-dollar companies were still in business.

    Now it is true that multi-core chips seem to be where everyone is headed. Even so, I'm not sure how these magical chips will "converge and fuse" digital content. Remember that this article is A) light on details, and B) put together by a person who is vying for

    • by Kjella ( 173770 ) on Monday November 29, 2004 @08:36AM (#10941996) Homepage
      ...the current computer architecture is nearing its limits, yes, but that has no relationship to the content. A modern processor is perfectly capable of decoding HDTV content, and probably of encoding it too if you can accept less-than-super compression.

      Of course, I see where this is going: I assume these Cell chips will be used to control hardware encoders/decoders with hard real-time limits (i.e. no frame skips and such crap), taking the best of the "dumb" hardware players of today combined with the multitasking and flexibility of general-purpose computers.

      But it is still a computer in drag. If anything, this seems more like a "retro" trend: in the past you had active NICs/HDD controllers/whatnot with processors of their own. Now that idea is back, with Cells instead. Just like terminals, we're coming full circle.

      Kjella
    • by TheRaven64 ( 641858 ) on Monday November 29, 2004 @09:21AM (#10942365) Journal
      The article on El Reg [theregister.com] has a bit more information content. The chip is POWER-based and supports multiple cores, each of which can run a separate OS. This is the first POWER chip to be produced in volume (I'm not counting workstation/server chips as volume). This, combined with the PowerPC-based XBox2, may mean that the unit cost of POWER/PowerPC chips drops enough to make beige-box POWER/PowerPC systems cheap enough to be a viable alternative to x86.
      • What is perhaps most interesting is the ability to have an OS written for cooperation, so that processors could be introduced into the network, or to have different OSes cooperate on a task should the OS designers wish to follow open standards. The Cell is designed for networking and cooperation; this is what is important.
      • The Gekko in the Gamecube is not produced in volume? Millions of G4s and G5s have been sold in Macs, but I guess that's not volume either. And I don't see how high volume on Cell will make regular PowerPCs cheaper.
    • OK...so according to some marketdroid "Current PC architecture is nearing it's limits". I bet he owns stock in the company that it trying to sell you the new stuff! Last I checked AMD and a few other BILLION dollar companies were still in business.

      I think there's little doubt that in the performance arena, PC CPUs have leveled off in the last two years. Instead of across the board performance boosts, everyone is talking multi-core and 64-bit. Two years ago, the 3GHz P4 was king. Today it's still more o
    • Ken Kutaragi isn't a marketdroid. He was an engineer on the original PS and now heads the PS business unit.
  • by Realistic_Dragon ( 655151 ) on Monday November 29, 2004 @08:28AM (#10941921) Homepage
    These multi-core and multi-processor systems can be a bugger to program for, because handling concurrency in a way that doesn't cause deadlocking is a major pain in the ass.

    One of the better ways is to model the program in CSP (or a variant thereof) and then write it in a specially designed language like Occam (developed for the original transputer, but since ported to x86). These give you code that cannot deadlock, livelock, or suffer from resource starvation, without needing any of the complex and buggy hacks you see in things like the Linux kernel. And the Linux kernel only has to deal with a few processors... scaling to a few thousand processors in C would require a programmer of insane genius, or the implementation of effectively a new language on top of C to handle the problems caused.

    So, what language do developers use to target this? Is it something elegant designed for the problem at hand?
    • Small problem (Score:5, Informative)

      by Craig Ringer ( 302899 ) on Monday November 29, 2004 @09:41AM (#10942509) Homepage Journal
      There's only one small problem with your contention - Linux /does/ scale out to at least 512 processors - hardly 'a few' - and is heading up to multiple thousands with SGI's current work.

      Of course, one could argue that the Linux folks have more than one insane genius among them...
    • I imagine that with a fast enough processor and more memory at their disposal, some smaller developers will be able to bring simple games to market faster by using high-level or even interpreted languages.

      Your hard-core push-the-limits groups will still use machine language to develop their engines of course.

      I'm sure the API will be beautiful no matter what, though -- Sony wouldn't risk losing developers to DirectX.
      • Bwahaaha!

        Either you've never seen a Sony API, or you're the most brilliant cynic I've ever seen. I almost fell off my chair laughing. "I'm sure the API will be beautiful" - yeah, right.

    • A buddy of mine made tools to do this using FORTRAN, although they did still have their limits.

      Seriously - I don't know the intricacies of compiler design, but I do know he won the Obfuscated C Contest [ioccc.org] several years ago and now works on multiprocessing tools for some very high-end uses (like rocket motor simulations for NASA) - all in C. Last time I asked about the project, he wasn't using gcc for it because gcc lacks certain libraries he needs (or something like that) - but it is still C.

      I would say linux or no
    • Linux Insider is running a couple of editorials speculating [linuxinsider.com] about running Linux [linuxinsider.com] on the 'Cell' processor. The bold prediction? 'the Linux developer community will, virtually en masse, abandon the x86 in favor of the new machine.'
      • First, a few points regarding both this post and the grandparent.
        1. Multicore processors don't require processes/threads to cooperate/communicate via shared memory. What they do is permit a cheap version of SMP by packaging multiple processor cores in a single chip. Given the complexities of dynamic scheduling (I think an exponential number of gates may be required per stalled instruction that is tolerated without stalling the instruction stream), they can allow these cores to stall while cores that are no
  • by grungeman ( 590547 ) on Monday November 29, 2004 @08:33AM (#10941961)
    Sounds like Playstation3 vs. XBox2 will look like a battle between a Terminator T1000 and Clippy.

    • by TheRaven64 ( 641858 ) on Monday November 29, 2004 @09:23AM (#10942385) Journal
      XBox2: PowerPC-based CPU made by IBM.

      PS3: POWER-based CPU made by IBM.

      Looks like a good time to own IBM stock...

      • Question though: can program code written for POWER CPUs be used on the new Cell CPU? Is there even a remote chance that Mac OS X could be ported to run on the Cell CPU architecture in a pretty straightforward fashion?

        The latter could be of great interest to Apple Computer because it means the potential for substantial increases in the performance of future Macintosh models.
        • can program code written for the POWER CPU's be used on the new Cell CPU? Is there even the remote chance that MacOS X could be ported to run on the Cell CPU architecture in a pretty straightforward fashion?

          Given that their ISAs are close but not identical, I'd imagine a re-compile is necessary.
          I doubt that customers are demanding true binary compatibility among a Playstation, a Mac, and a POWER-based server.
      • The PS3 will be Cell-based, not POWER-based. I assume that Cell will in some manner be based around POWER/PowerPC cores, but in essence it'll be a new architecture. And this is an unconfirmed fact.

        POWER and PowerPC are essentially the same. IBM is marketing them both under the same brand, Power. The difference between POWER and PowerPC is very small, quite comparable to Athlon and Pentium. The majority of code is binary compatible, but some instructions must be adapted for each platform for optimal performa
      • The GameCube is currently PPC based, and I think I read some speculation that the next one would be also.

        So unless another big console maker comes into the picture, no matter who wins the console wars, IBM wins.
  • A bit more on PS3 (Score:5, Informative)

    by Sai Babu ( 827212 ) on Monday November 29, 2004 @08:33AM (#10941962) Homepage
    But UNC's Zimmons has his doubts. "I believe that while theoretically having a large number of transistors enables teraflops-class performance, the PS3 [Playstation 3] will not be able to deliver this kind of power to the consumer," quoted from the article referenced by /.

    Zimmons talks the details [pcvsconsole.com].

  • by TommyBear ( 317561 ) <tommybear2@gmail.com> on Monday November 29, 2004 @08:33AM (#10941966) Homepage
    I currently work at a game studio here in Melbourne, Australia, and we're looking at next-gen stuff (currently we develop Xbox, PS2 and PC games). Anyway, at a meeting today, one of the senior developers told our group that four people had been selected to go to a little show-and-tell by IBM/Sony in Melbourne, where some of the secrets of the "Cell" processor would be demonstrated/explained to the group. Apparently we were only able to get four spots at this event.

    So I'm excited; it looks like the tech is just around the corner, and so are the multi-core platforms (like XBOX2 and PS3).... yay!
  • In 2000, Sony ran a series of TV spots trumpeting the "PlayStation 9" for the year 2078, with things like "electronic spores that tapped directly into a person's adrenal gland, improved retinal scanning, a mind control system, holographic surround vision, and telepathic personal music" - Wikipedia blurb [wikipedia.org]

    Here's a link to the video of the ad [methodstudios.com]

    Well, with the exorbitant processing demands of the PS3 that this article suggests, it's almost like they are on track to deliver what they promise!
  • Cell in TV ? (Score:3, Interesting)

    by andymar ( 690982 ) on Monday November 29, 2004 @08:42AM (#10942045)
    The article mentions that the Cell CPU will be included in an HDTV in 2006. Anyone know what such a powerful CPU is doing in a TV?
    • Re:Cell in TV ? (Score:2, Interesting)

      by ocelotbob ( 173602 )
      Most likely, it'll be a scaled-down version and/or the TV will have built-in extras, like a PVR or the ability to download web content without a computer.
    • Re:Cell in TV ? (Score:4, Informative)

      by bhima ( 46039 ) <Bhima,Pandava&gmail,com> on Monday November 29, 2004 @09:07AM (#10942267) Journal
      IBM is planning to market many different types of Cell CPUs for mobile phones, TVs, workstations, and supercomputers.
    • HDTV w/ built in PS3 and all forms of Sony media to purchase! Cha-Ching!
    • Re:Cell in TV ? (Score:2, Interesting)

      by hattig ( 47930 )
      Well, if Sony is making 10 to 50 million of these processors a year, the cost will be lower. Especially if the PS3 does have four of them, as has been previously rumoured. By putting the processor (or multiples thereof) into other devices, the cost of the processor goes down even more, as more are made (assuming that enough of the things can be made!). At some point it is probably cheaper to use a Cell processor for all decode/encode operations (TV with built-in PVR?) than whatever else is on the market.
    • Re:Cell in TV ? (Score:3, Interesting)

      by spleck ( 312109 )
      Anyone know what such a powerful CPU is doing in a TV ?


      Decoding a 19 Mbps MPEG-2 program stream with multiple SDTV subchannels, surround sound, etc. ???

      Maybe the channels will change faster too.
  • by S3D ( 745318 ) on Monday November 29, 2004 @08:45AM (#10942070)
    Cell Processor-Based Workstation Prototype [physorg.com]
    The companies expect that a one-rack Cell processor-based workstation will reach a performance of 16 teraflops, or 16 trillion floating-point calculations per second.
    Cell Processor Unveiled [physorg.com]
    IBM, Sony Corporation, and Toshiba Corporation today unveiled for the first time some of the key concepts of the highly-anticipated advanced microprocessor, code-named Cell, they are jointly developing for next-generation computing applications, as well as digital consumer electronics.
    Specifically, the companies confirmed that Cell is a multicore chip comprising a 64-bit Power processor core and multiple synergistic processor cores capable of massive floating point processing. Cell is optimized for compute-intensive workloads and broadband rich media applications, including computer entertainment, movies and other forms of digital content.
    Other highlights of the Cell processor design include:
    -- Multi-thread, multicore architecture.
    -- Supports multiple operating systems at the same time.
    -- Substantial bus bandwidth to/from main memory, as well as companion chips.
    -- Flexible on-chip I/O (input/output) interface.
    -- Real-time resource management system for real-time applications.
    -- On-chip hardware in support of security system for intellectual property protection.
    -- Implemented in 90 nanometer (nm) silicon-on-insulator (SOI) technology.
    Additionally, Cell uses custom circuit design to increase overall performance, while supporting precise processor clock control to enable power savings.
    IBM, Sony Group and Toshiba will disclose more details about Cell in four technical papers scheduled for presentation at the International Solid State Circuits Conference.

    "Less than four years ago, we embarked on an ambitious collaborative effort with Sony Group and Toshiba to create a highly-integrated microprocessor designed to overcome imminent transistor scaling, power and performance limitations in conventional technologies," said Dr. John E. Kelly III, senior vice president, IBM. "Today, we're revealing just a sampling of what we believe makes the innovative Cell processor a premiere open platform for next-generation computing and entertainment products."

    "Massive and rich content, like multi-channel HD broadcasting programs as well as mega-pixel digital still/movie images captured by high-resolution CCD/CMOS imagers, requires a huge amount of media processing in real-time. In the future, all forms of digital content will be converged and fused onto the broadband network, and will start to explode," said Ken Kutaragi, executive deputy president and COO, Sony Corporation, and president and Group CEO, Sony Computer Entertainment Inc. "To access and/or browse a sea of content freely in real-time, a more sophisticated GUI within the 3D world will become the 'key' in the future. Current PC architecture is nearing its limits, in both processing power and bus bandwidth, for handling such rich applications."

    "The progressive breakdown of barriers between personal computers and digital consumer electronics requires dramatic enhancements in the capabilities and performance of consumer electronics. The Cell processor meets these requirements with a multi-processor architecture/design and a structure able to support high-level media processing. Development of this unsurpassed, high-performance processor is well under way, carried forward by dedicated teamwork and state-of-the-art expertise from Toshiba, Sony Group and IBM," said Mr. Masashi Muromachi, Corporate Vice President of Toshiba Corporation and President & CEO of Toshiba's Semiconductor Company. "Today's announcement shows the substantial progress that has been made in this joint program. Cell will substantially enhance the performance of broadband-empowered consumer applications, raise the user-friendliness of services realized through these applications, and facilitate the use of information-rich media and comm
    • DRM For the Masses (Score:5, Interesting)

      by rsmith-mac ( 639075 ) on Monday November 29, 2004 @10:42AM (#10942974)
      After reading that press release (and correct me if I'm wrong), I'm not sure what's really "new" about the Cell other than the on-chip hardware in support of a security system for intellectual property protection. There are other Power designs already that do multicore, do high performance, and do vector ops (AltiVec), so the only thing I haven't heard about an existing design for is their security system.

      Considering the companies involved, and the devices that they want to put the chip in, I'm really tempted to say that the Cell is nothing more than the biggest effort we've ever seen to get a DRM (trusted computing) CPU and associated parts onto the market. Obviously, this scares the bejesus out of me, since it would mean that these Cell devices would effectively be mod-proof; systems like Xbox Live already keep cheaters away, so this seems to be an attempt to stop modding altogether. So, I have to ask: how is this going to benefit me, the consumer? If Live already gets rid of possible cheaters, how does stopping me from modding my box altogether help me?

      If these assumptions are right, I don't like where this is going.

  • by master_p ( 608214 ) on Monday November 29, 2004 @08:53AM (#10942134)
    I don't think a game console needs such a sophisticated and powerful CPU, for several reasons:

    -Real-time 3d graphics of cinematic quality will always be too slow for general purpose CPUs.

    -Developing a game with AI that needs ten times the power of today's CPUs will take many man-years, and may not be welcomed by the console audience.

    -It's very difficult to do multithreaded apps, and the difficulty rises exponentially with the number of threads.

    So what exactly would the role of the Cell processor be in the PS3?

    It would make much more sense if:

    -Sony developed a platform that can move an insanely large amount of graphics data around, with the ability to do real-time raytracing, rather than providing so much general-purpose processing power.

    -Sony developed a graphics architecture that could really be parallelised, so that instead of bringing out a totally new console, they could just up the graphics spec by adding more chips. They could save millions of dollars in developing and advertising a new console.
    • First thing that a game console should get is a mouse and keyboard standard with the joystick. That would be a better improvement than any graphics, AI or memory expansion.

      After playing games on a PC with a mouse, playing on a console is frustrating, not fun. And I cannot imagine playing a game that puts console-based players up against PC players (unless you handicap the PC's hardware); the console players would be out of luck.

      InnerWeb

      • After playing games on a PC with a mouse, playing on a console is frustrating, not fun. And, I can not imagine playing a game that puts console based players up against PC players (unless you handicap the PC's hardware). The console players would be out of luck.

        Playing first-person shooters on a console is frustrating, since the control is designed for a keyboard and mouse. Similarly, playing a typical console 3D platformer like Wind Waker on a PC will be equally frustrating since the control is design

    • -Real-time 3d graphics of cinematic quality will always be too slow for general purpose CPUs.

      But Sony hasn't said that the Cell is going to be used for graphics. It's the CPU, not the GPU.

      -developing a game with AI that needs ten times the power of todays CPUs will take many man years and may not be that welcomed by the console audience.

      Irrelevant. But CPU power is used by other expensive things, like inverse kinematics and physics. And let's not ignore the benefits of letting developers use langu
    • by n3k5 ( 606163 ) on Monday November 29, 2004 @12:18PM (#10943671) Journal
      Yes indeed, a console doesn't need a powerful CPU any more than a bathtub needs a good amount of warm water. But people who use it will want it anyway.
      Real-time 3d graphics of cinematic quality will always be too slow for general purpose CPUs.
      This statement is so silly, it's not even wrong.
      developing a game with AI that needs ten times the power of todays CPUs will take many man years and may not be that welcomed by the console audience.
      A typical PS3 game takes many person-years to develop, regardless of whether it uses any AI. For many games, it's a matter of days to develop an AI that needs ten times the power of today's CPUs; making it use only a fraction of the power of a current CPU is the difficult and time-intensive task. Console gamers play far more single-player games than PC users do, so they in particular welcome sophisticated AI.
      It's very difficult to do multithreaded apps, and the difficulty rises exponentially with the number of threads.
      It's very easy to make multimedia-processing apps multi-threaded, and rendering scales particularly well over multiple CPUs. If the engine uses an API like OpenGL or D3D, it doesn't even have to know how many threads are used to render the visuals; the programmer doesn't have to do anything. Many AI algorithms also scale pretty well across multiple threads and/or closely coupled CPUs.
      [Sony should concentrate on graphics chips instead of general purpose CPUs.]
      They could save millions of dollars from developing and advertising the new console. [Instead of re-using the old one, just with more GFX chips.]
      These CPUs aren't 'general purpose' in the sense that a 486 is general purpose. They are specifically optimised for parallel operations, floating-point calculations, vector math... So they can save even more money the way they're doing it, because they can re-use the same architecture in lots and lots of media processing devices, not just gaming consoles, and they can scale not just pure graphics performance, but also audio performance, video performance, whatever is suited to the Cell architecture.
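      To illustrate how well rendering parallelises, here's a toy sketch (the shade() function is a hypothetical stand-in for real per-pixel work, not any actual engine) that splits scanlines across a worker pool and produces the identical image no matter how many workers it uses:

```python
from concurrent.futures import ThreadPoolExecutor

def shade(x, y):
    # Hypothetical stand-in for whatever per-pixel work a real renderer does.
    return (x * 31 + y * 17) & 0xFFFFFF

def render(width, height, workers):
    # Each task owns a disjoint band of scanlines, so no locking is needed;
    # the caller never has to care how the pool schedules the bands.
    def band(y0, y1):
        return [[shade(x, y) for x in range(width)] for y in range(y0, y1)]
    step = -(-height // workers)  # ceiling division
    starts = list(range(0, height, step))
    stops = [min(height, y + step) for y in starts]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map preserves order, so the bands reassemble correctly.
        return [row for b in pool.map(band, starts, stops) for row in b]

one = render(64, 48, 1)    # single worker
four = render(64, 48, 4)   # e.g. four Cell-style cores
assert one == four         # identical image regardless of worker count
```

      The same frame comes out either way; only the wall-clock time changes, which is exactly why rendering is the easy case for a multicore chip.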
  • by Anonymous Coward on Monday November 29, 2004 @09:22AM (#10942374)
    The Cell processor is going to rule!

    After all, look how accurate Sony's hype about the PS2 was:

    The PS2 will be able to render 75 million lit, shaded polygons per second!

    The PS2 will be able to run games at HDTV resolution (1280x960) out of the box with no performance loss!

    We will build professional workstations out of 32 Emotion Engine chips which will be able to render movies in realtime and take over the professional graphics industry!

    Since all the hype turned out to be completely 100% accurate, I'm sure we can expect the same for the PS3 / Cell Processor.

    I suppose it's also possible that it will be another massively over-hyped disappointment with builtin Sony patented lameness that sucks even harder than ATRAC. But you'd have to be a real fucking cynic to believe that!
    • Believe the truth (Score:3, Informative)

      by Ideaphile ( 678292 )
      Sony originally promised the PS2 could render 75M simple polygons per second, but also said the geometry engine's limit was 36M polygons per second. This figure is accurate, but like all such numbers in the graphics industry, it is achievable only in a single-function demo app. Such figures are useful only for comparing the raw performance of different designs.

      Sony never claimed the PS2 could support HDTV resolution. The company was very clear about the limited frame-buffer memory on the Graphics Synthes
  • Comment removed (Score:4, Interesting)

    by account_deleted ( 4530225 ) on Monday November 29, 2004 @09:23AM (#10942382)
    Comment removed based on user account deletion
    • Will a software implementation of OpenGL running on a Cell system top the performance of whatever NVidia and ATI are selling by then?

      What about just feeding the model descriptions to a software ray-tracer? No further API needed, unless the raytracer needs it.
  • should give performance ten times of conventional processors.

    It's all about the bandwidth now. The cache(s) and path(s) to memory should be the most fascinating aspects to this processor. Speed is nothing without data to process.

  • by G4from128k ( 686170 ) on Monday November 29, 2004 @10:36AM (#10942924)
    This does not surprise me in the least. A Prescott processor has 125 million transistors, while a Motorola 68000 had about 68,000 transistors. Yet the Prescott is not 1,838 times more productive on a per-clock-cycle basis. Admittedly, some of those Prescott transistors go to cache, superscalar magic, creating long fast pipes to achieve the GHz, and implementing nifty MMX features. Even so, fabbing a 68k at 90 nm would create a tight little processor that is not 1,800 times slower than the Prescott.

    Thus, one can imagine creating a tighter core processor design with a budget of a million transistors each (15 times the original 68k budget) with a few million for L1 cache and another million for glue and then place 20 of them on a single die. Add optical interconnects and that new optical-to-silicon technology invented recently (for multiple channels of GHz I/O to feed all those cores) and you have yourself a powerful little processor.

    The point is that with a budget of 125 million transistors, designers can do more than create a bloated single-core CISC processor.
  • I am just wondering: will multicore increase the performance of games? Does it work like NVIDIA's SLI? I am not sure whether games today can be fully multithreaded (if so, please do point it out). Will they run, let's say, geometric calculations on one core and the AI engine on another, etc.? Just a thought, because there may be a reduction in performance for single-threaded apps if increasing the chip's total performance means decreasing per-core performance.
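    For instance, here is a toy sketch of the scheme I mean, one subsystem per core, joined each frame (purely hypothetical stand-in functions, nothing to do with Sony's actual design):

```python
import threading

# Hypothetical per-frame subsystem work; each function touches
# disjoint keys of the shared state, so no locking is needed.
def physics(s):
    s["pos"] += s["vel"]

def ai(s):
    s["decision"] = "chase" if s["enemy_near"] else "patrol"

def audio(s):
    s["mixed"] = True

def run_frame(state):
    # Run independent subsystems on separate threads, then join so the
    # renderer only consumes results once every subsystem has finished.
    threads = [threading.Thread(target=f, args=(state,))
               for f in (physics, ai, audio)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # per-frame barrier

state = {"pos": 0, "vel": 2, "enemy_near": False}
for _ in range(3):
    run_frame(state)
assert state["pos"] == 6  # three frames of physics at velocity 2
```

    The catch is exactly the one I'm worried about: the frame can only be as fast as its slowest subsystem, so a chip that trades per-core speed for core count could still lose on poorly balanced workloads.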
  • What people seem to be neglecting is the fact that the thrust of the cell processors is to speed up floating point -- but leaves integer performance alone. Thus, "normal" processing tasks won't be any faster at all but those that require floating point (digital media, scientific) will indeed improve. Mostly, I'd think this affects video and audio card makers since it makes much of what they do redundant. So the potential I see is bringing down overall system costs -- not exceeding mainstream performance; th
  • I just wonder how much heat these cell processors will produce. Will they require heatsinks 10 times larger than we currently have?

    "Hey look at my brand new computer."
    "Man! This thing is gigantic! I thought it used very tiny CPUs"
    "They are. 90% of the case is for the heatsink."
    "Oh..."
