Cell Workstations in 2005

yerdaddie writes "The Cell processor will be introduced in graphics workstations before its release in the PlayStation 3, according to press releases by IBM and Sony. As previously discussed, IBM will be releasing more details in February 2005. However, prototype workstations have apparently already been "powered on" and will be available in 2005. Since Windows on PPC was scrapped back in 1997, this leads to speculation that Linux, AIX, or BSD will be the operating system for Cell workstations."
  • I may be wrong... (Score:3, Interesting)

    by wcitechnologies ( 836709 ) on Monday December 06, 2004 @03:07AM (#11005989)
    I may be wrong, but to me this sounds like hyper threading with a new name. Can anybody enlighten me?
    • Re:I may be wrong... (Score:3, Informative)

      by Anonymous Coward
      Sounds more like some kind of multi-core processor where the number of cores can vary greatly.
      • you are right (Score:3, Informative)

        by Henriok ( 6762 )
        In essence Cell is just that, but it doesn't stop there. Cell technology can distribute its load to other Cell processors nearby. It's built from the ground up to use grid technology transparently. Quite revolutionary.
    • Actually it sounds more like parallel processing to me, where many CPUs are connected together to form one larger CPU. Perhaps you can remove CPUs from the network while active? Or maybe it is just easier to expand. Their page seems to be full of hype (in my opinion), but has no description of concrete benefits from this technology. Also, why is this in the games section? It seems more like hardware to me.
    • by Halcyon-X ( 217968 ) on Monday December 06, 2004 @03:14AM (#11006011)
      It has been stated before [psreporter.com] that the PlayStation 3 is expected to be capable of distributed processing [webopedia.com] due to the capabilities of the Cell architecture. Whether or not that will indeed be the case remains to be seen; it is certainly a lofty goal given the current market penetration (not to mention speeds) of broadband in the home. Does Sony expect these PS3s to cooperate with their Cell-based television sets?
      • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Monday December 06, 2004 @03:53AM (#11006105) Homepage Journal
        The stated goal is for some future PlayStation (maybe the fourth generation) to use the Cell processor and, yes, to cooperate with Cell devices in televisions, DVD players, et cetera. If we end up with Cell PCs they'll be candidates too. They could run Linux, of course. To be honest, that's the Xbox, if it were clustering, and it could have been if there were any reason for it to be. Sony will probably use some kind of IEEE 1394 (i.Link in Sony's parlance), possibly including 800Mbps, in order to connect Cell devices. 1394 allows significant cable lengths and near-gigabit speeds today; it is intended to support 1.6Gbps and later even 3.2Gbps (over fiber).
      • by TommyBear ( 317561 ) <tommybear2@gmail.com> on Monday December 06, 2004 @05:17AM (#11006275) Homepage
        IBM's primary goal for the Cell processor was that it first be scalable as a rack-mounted solution. Therefore the Cell Processor Workstation (CPW) will initially be available as individual form-factor boards, containing a CPP, several DPPs and other small components for I/O, etc.
      • Anything with the capability to run complex software and a communications connection of some sort is 'capable of distributed processing'.

      • by badboy_tw2002 ( 524611 ) on Monday December 06, 2004 @12:56PM (#11008886)
        Do people actually believe this (tr)hype? Were you the same people actually getting giddy about the awfully named "Emotion Engine" allowing realistic hair or somehow providing better human reactions to characters in 1999?

        Console games work and develop well because of one thing: standardization of the platform. If you put your game in any console of the same type, it will run the same (besides various regional differences (PAL, NTSC) and maybe some hardware changes later in a production run, a la the Xbox's two DVD drives).

        You do not design for "potential extra processing" from someone's TV, toaster, aibo, or whatever. You design for the LCD, which is the unit that everyone buys. You might be able to take advantage of extra hardware like voice headsets or harddrives, but even then your game has to work well without it. (Example: Xbox allows you to precache data from the DVD on the harddrive, but you still need to be able to meet loading time standards without it. i.e. you can do better than 15 seconds with the harddrive, but no worse than without).

        Can you imagine the testing nightmare of "better AI" if someone has a Sony DVD player nearby? Do you test every level with every combination of chip configuration out there?

        This of course has been written with the thought that this is at all possible. Well, sorry, it isn't, and the super IBM Cell processor isn't going to make it so. Console games work off extremely hard deadlines, and that's the refresh rate on your TV. Every 16 or 32 ms you need to have a new frame rendered and ready to go. You can't schedule a few frames for processing on the microwave and ask for them back whenever. What you're drawing depends on the real state of user input, AI, physics, lighting, scripted events, etc. The state of the game at any point in the future is unknown, and thus in those 16 ms you have to figure out what needs to be updated, how the world should change, and finally render that to the screen. The actual rendering time might not even be half of the time you have for a frame. Do you have the bandwidth to send that data out and expect it back in the same frame? If so, let me know so I can get some of that!
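
        To put rough numbers on that budget, here's a toy frame loop (nothing PS3- or Cell-specific; the update/render calls are just placeholder comments):

        #include <stdio.h>
        #include <time.h>

        static double ms_between(struct timespec a, struct timespec b)
        {
            return (b.tv_sec - a.tv_sec) * 1000.0 + (b.tv_nsec - a.tv_nsec) / 1e6;
        }

        int main(void)
        {
            const double budget_ms = 1000.0 / 60.0;  /* ~16.6 ms per frame at 60 Hz */
            struct timespec t0, t1;

            for (int frame = 0; frame < 5; frame++) {
                clock_gettime(CLOCK_MONOTONIC, &t0);
                /* poll_input(); run_ai(); run_physics(); render(); -- all of it,
                 * including any round-trip to a remote processor, must finish here */
                clock_gettime(CLOCK_MONOTONIC, &t1);

                double spent = ms_between(t0, t1);
                if (spent > budget_ms)
                    printf("frame %d blew its %.1f ms budget (%.2f ms)\n",
                           frame, budget_ms, spent);
            }
            return 0;
        }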

        I could see remote AI processing, MAYBE, but that still has to be able to be done on the console anyways for the LCD case. AI is one of the worst things to debug in game development, as a lot of the time it can be non-deterministic. You do not want to throw another variable into the testing, especially not when it's hardware.

        Sony has a very good marketing department for continuing to push this crap. They've said "we will use this cell technology in other products besides the PS2" and "in the future the PS platform will interact with other Sony brand components", meaning that maybe your PS2 can start popping popcorn or something, but that has nothing to do with processing, it's just networking. But somehow the two get combined on fan sites to mean "OMG, buy 28 PS3s and Jaxter and Dax runs at 6000FPS!!!"

        What you will see with Cell processing is a continuation of the multiprocessor platform the PS2 had, but in a more generic sense. This should allow very interesting stuff to be done, and while games will initially be harder to develop, there's going to be some really cool stuff coming out of this. But don't believe you're going to suddenly see a sentient household that's drawing a few extra pixels in GTA VI: The Quest for More Money.
    • It's a lot more advanced than that; the cores don't need to be on the same board, they can be separated by a network connection.

      It's a system-on-a-chip architecture, and it's a lot more elegant than anything Intel or AMD will come up with, simply because it is free of x86 compatibility.
    • Re:I may be wrong... (Score:5, Informative)

      by ponos ( 122721 ) on Monday December 06, 2004 @07:07AM (#11006533)
      I may be wrong, but to me this sounds like hyper threading with a new name. Can anybody enlighten me?

      It's not the same. Hyper-threading shares processor units (e.g. a multiplier or an adder) between threads in order to keep most units of the single core busy. This matters because Intel processors have very long processing pipelines (thus the very high frequency compared to AMD), so stalling them can be quite costly. To avoid this, Intel simply keeps track of two "virtual" processor states, essentially two copies of all registers, and schedules instructions from either of these two execution threads in ways that keep most units busy. By choosing from two threads instead of one, it has a greater chance of finding an instruction that can be computed by a unit that is idle at that time.

      Cell architecture, on the other hand, seems to rely on multiple simple cores, each of which is complete. A central Power processor core keeps them working together. I assume (but I do not know!) that the benefits of this architecture are: (a) adding multiple cores is easy and increases cost linearly; (b) software that works for a 16-core chip will also work on a 2-core chip, just slower (therefore the same processor can be adapted to different needs, just like multi-unit video cards, without expensive redesign); (c) an inherent understanding of parallelism (on the chip) allows chaining chips together easily. Maybe we will start counting cores instead of MHz in a few years, when all CPUs have peaked at some--obscenely high--MHz limit. Details on the Cell chip are very vague and riddled with marketing buzzwords, but it appears it will be able to execute many more parallel threads than an Intel HT processor (2 threads maximum in parallel).
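
      To illustrate point (b), here's a minimal pthreads sketch -- generic SMP code, nothing Cell-specific -- where the same binary splits its work over however many cores it finds at run time:

      /* compile with: cc -pthread scale.c */
      #include <pthread.h>
      #include <stdio.h>
      #include <unistd.h>

      #define N 1000000
      #define MAX_CORES 64

      static float data[N];
      struct slice { int begin, end; };

      static void *worker(void *arg)
      {
          struct slice *s = arg;
          for (int i = s->begin; i < s->end; i++)
              data[i] = data[i] * 2.0f + 1.0f;      /* stand-in for real work */
          return NULL;
      }

      int main(void)
      {
          long cores = sysconf(_SC_NPROCESSORS_ONLN);  /* 2, 8, 16... */
          if (cores < 1) cores = 1;
          if (cores > MAX_CORES) cores = MAX_CORES;

          pthread_t tid[MAX_CORES];
          struct slice sl[MAX_CORES];
          int chunk = N / (int)cores;

          for (long c = 0; c < cores; c++) {
              sl[c].begin = (int)c * chunk;
              sl[c].end   = (c == cores - 1) ? N : ((int)c + 1) * chunk;
              pthread_create(&tid[c], NULL, worker, &sl[c]);
          }
          for (long c = 0; c < cores; c++)
              pthread_join(tid[c], NULL);

          printf("processed %d elements on %ld core(s)\n", N, cores);
          return 0;
      }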

      What worries me most is the fact that Sony (which also sells music, movies, etc.) says it'll have on-chip capability to protect copyrighted works. I don't know what this will mean for the GNU/Linux crowd.

      Disclaimer: All the above is wild speculation. I am not an engineer.

      P.

      • all cpus will have peaked at some--obscenely high--MHz limit

        Speaking as someone who started out with a 1.774 MHz processor [discover-net.net], current CPU speeds are already obscenely high. Hell, my disk drive has more memory (2MB vs 16K) than my first computer...
  • by Sanity ( 1431 ) on Monday December 06, 2004 @03:11AM (#11006003) Homepage Journal
    This article [linuxinsider.com] provides some background.
    • The article says that each chip is running its own kernel. That seems like a lot of wasted energy to me. I agree that it could give a serious boost to performance. However, what about the memory requirements (RAM specifically)? It seems to me that each micro-kernel is going to need some RAM of its own, and to get the promised performance you would need many of these micro-kernels. This technology may end up more limited by memory requirements than by the speed of the chips.
      • by nacturation ( 646836 ) <nacturation&gmail,com> on Monday December 06, 2004 @04:29AM (#11006181) Journal
        It seems to me that each micro-kernel is going to need some RAM of its own, and to get the promised performance you would need many of these micro-kernels.

        Keeping in mind that there are various distros which fit on a 1.44 MB floppy disk *with* userland utilities, I don't think the size of the kernel will prove to be the limiting factor on a modern workstation.
        • by Gopal.V ( 532678 )
          > I don't think the size of the kernel

          The old UNIX SYSV kernel took a whopping 54KB of memory! I'm now running the same kernel [southern-storm.com.au] in user space and playing around with it.

          Hehe, it's a fun project for CS Majors to play around with.
      • by TheRaven64 ( 641858 ) on Monday December 06, 2004 @05:35AM (#11006319) Journal
        From what I've read about the Cell line, each core can run its own kernel (i.e. it doesn't have to). This provides some interesting possibilities, for example a general purpose kernel running on one, while a real-time kernel runs on another and handles things like sound. Current systems have to make a choice when it comes to scheduling algorithms:
        1. Make one that works for all (or, at least, most) cases but is hideously complicated, or
        2. Make one that focuses on one class of application (e.g. throughput-oriented, realtime, etc.).
        Most monolithic kernels choose 1. Several micro-kernels implement the scheduling algorithms in user-space, allowing them to be swapped easily. Having a large number of cores available to the system would allow this to be dynamically tweaked.

        This approach seems more in line with the exokernel project than any microkernel I've looked at. If you've got some spare time, exokernel is well worth a look.
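
        On the swappable-scheduler point, the idea is basically policy-as-data. A toy user-space sketch (hypothetical names, not any real kernel's API):

        #include <stdio.h>

        struct task { const char *name; int priority; };

        /* a scheduling policy is just a function that picks the next task */
        typedef int (*sched_pick)(const struct task *tasks, int n);

        static int pick_round_robin(const struct task *tasks, int n)
        {
            static int last = -1;
            (void)tasks;
            last = (last + 1) % n;
            return last;
        }

        static int pick_highest_priority(const struct task *tasks, int n)
        {
            int best = 0;
            for (int i = 1; i < n; i++)
                if (tasks[i].priority > tasks[best].priority)
                    best = i;
            return best;
        }

        int main(void)
        {
            struct task tasks[] = { {"audio", 9}, {"compile", 2}, {"ui", 5} };
            sched_pick policy = pick_highest_priority;  /* swap for pick_round_robin */

            for (int i = 0; i < 3; i++)
                printf("run: %s\n", tasks[policy(tasks, 3)].name);
            return 0;
        }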

      • But it will be a micro-kernel that will be rather small. You'll have cell processors for doing processing work, then other cell processors acting as I/O controllers with their own kernel.

        Think outside the box; equating the Cell design to existing PC architecture is silly.

        Besides, you said it was wasteful? Aren't many clusters built of entire computers where you have display hardware, floppy drives, hard disks, RAM, etc.?
  • Maybe... (Score:4, Interesting)

    by Spruitje ( 15331 ) <ansonr&spruitje,org> on Monday December 06, 2004 @03:11AM (#11006006) Homepage
    Well, knowing IBM and Sony, there is a great chance that it will run Linux.
    At the moment it seems that Linux is the choice for development on the PS2, and I think it will be for the PS3 as well.
    • Re:Maybe... (Score:3, Interesting)

      by Build6 ( 164888 )
      i'm more curious as to whether there'll be two separate chassis/machines (one from IBM, one from Sony... or even more per Cell-partner?), or if it's just going to be one basic machine that may/may not have different corporate logos slapped on it?

      (i'd think it'd almost certainly be linux, no uncertainty there :-)

      hrm. actually, an even bigger question... will there be blinkenlights! *memories of the BeBox*

      • Probably OEM (Score:3, Insightful)

        by Henriok ( 6762 )
        I think IBM, Toshiba and Sony will eventually license and sell Cell technology to those who are interested. One of the core ideas is that they want to spread this technology as far as they can, since every Cell-based machine can tap into the computational power of all other Cell-based appliances in its vicinity. The more the merrier!

        Cell isn't one processor, it's a class of processors. The one that will go into the workstation is more powerful than one that will fit into a PDA or an HDTV. I think that IBM will
    • Platform showdown? (Score:4, Interesting)

      by Halcyon-X ( 217968 ) on Monday December 06, 2004 @03:25AM (#11006036)
      What's interesting is that how Sony and Microsoft handle their product launches may have an impact on the number of games we see for these systems. With Linux gaining ground on the desktop (bear with me here), it is conceivable that it might become a larger target for games, if not for game development on a 64-bit workstation. Epic have already committed to an Unreal Tournament development platform on Linux (Windows 64-bit taking its time is probably also a factor).

      The most interesting part, however, is that MS may be putting up .NET as the development environment for the X-Box 2. It makes sense that MS would try to leverage their gaming platform to lure developers onto the .NET platform and commit their engines to that API.

      On another note, could Linux and Mono play much of a role in this if the Cell does indeed provide a Linux environment for development? If Sony is able to provide a less expensive development environment, development costs may ultimately go down and the consumer would benefit.

      This could be either through increased choice, since the bar of entry would be lowered for smaller software houses, or through cost, if the games are indeed cheaper as a result; existing engines and software could be ported or would be compatible, thanks to the ease of coding on a familiar platform.

      • The same processor powering cell phones, PDAs, handheld gaming devices, gaming consoles and general-purpose workstations could be a way out of the porting/emulator hell that handheld development is in right now. However, there will probably still be different OSes for handhelds - for example, Nokia is unlikely to drop Symbian in favor of Linux...
      • Linux is not the OS for the PS3, but the OS for the workstation. The PS3 will run a real-time kernel.
  • by 9-bits.tk ( 751823 ) on Monday December 06, 2004 @03:14AM (#11006010)
    then we probably would be seeing Linux for Cell or similar. Reading that reminds me of the XBOX-Linux and the GameCube Linux projects.

    I wonder what the average speed of the processors would be? And if they'd include HyperThreading?

  • my favorite quotes (Score:5, Interesting)

    by mxpengin ( 516866 ) on Monday December 06, 2004 @03:15AM (#11006012) Homepage
    For all practical purposes, the PowerPC has been relegated to a Mac-only solution while high performance NT users have turned to Digital's Alpha....

    This move puts Apple Computer in another awkward position: the company had been planning on using Windows NT in its Web servers.

    And my favorite actual fact is that Microsoft is going back to PowerPC with the new Xbox. But I'm sorry that Alpha has been erased from the map.
    • by hypnotik ( 11190 )
      But I'm sorry that Alpha has been erased from the map.

      As am I. I've always thought Alphas were some of the cooler architectures out there. And it's rather amusing to think that Microsoft had NT ported to a 64bit processor a long time prior to the introduction of the Opteron. Granted, there are a lot of architectural differences between the Opteron and Alpha, but that's why the HAL existed. Too bad that Microsoft did away with a lot of the HAL to gain video speed. I bet they're regretting that now.

      Anywa
      • "I've always thought Alphas were some of the cooler architectures out there"... We used to have a dual Alpha at work, I remember the heatsinks being about the size of my head and the fans sounding like jet engines!

        Still, it is a very interesting architecture, and it sounds like you have one for the same reason I have my Cobalt Qube 2.

      • Alpha was plenty better than the Motorola 68K architectures out at the time (and better than Intel's, but that should be a given), but DEC just didn't know how to do anything else except design the hardware. When they finished and showed off the Alpha at conventions, other companies had equally complex hardware and software to demonstrate with. DEC had a much better hardware design but maybe 2 half-assed programs to show the thing turns on and does a couple of things. Then Compaq bought them but didn't take the l
      • And it's rather amusing to think that Microsoft had NT ported to a 64bit processor a long time prior to the introduction of the Opteron.

        They never did port MS-Windows to the 64-bit Alpha; it only ran in 32-bit mode. Compaq was involved in the 64-bit port, but announced in 1999 that it was forgoing 64-bit development in favor of IA64.

        Dave Cutler *did* get some early versions of 64-bit Win2k to boot on an AlphaServer, but since Compaq lost interest in developing Win2k for the Alpha (both 32-bit and 64-bit ve
  • by amigoro ( 761348 ) on Monday December 06, 2004 @03:16AM (#11006015) Homepage Journal
    The graphic [mithuro.com]
  • Running a standardised, obtainable kernel (Windows, Linux, etc.), with hardware-level access after decryption (if it's even encrypted) and an exploit

    .....

    Am I the only one here thinking "bad fucking idea" or what? And let's not even mention the latency for distributed supercomputing applications. Everyone is now on wireless, unsecured, and sending signals all over the place. Hell, I should support that; free internet at the touch of a button after hijacking someone's toaster. w00t.
  • by hussar ( 87373 ) on Monday December 06, 2004 @03:26AM (#11006038) Homepage
    From one of TFAs: The Cell workstation is designed to deliver tremendous computational power, helping digital entertainment content creators generate higher quality content with richer and more dynamic scenes, much faster than current development systems.

    This points at more than just game consoles. It looks like Sony is anticipating a future in which they can dispense with actors entirely and rely on realistic computer-generated characters. There should be a good bit of money to be saved if you don't have to pay an actor millions to star in your film. There could be other applications too: animated news announcers with features finely tuned to inspire trust in the viewer, human-like avatars in intelligent appliances, human-like answering machines and customer-service-line responders, etc.

    So, how far are we from the footage a la William Gibson's Pattern Recognition and the "live" entertainment a la Neal Stephenson's Diamond Age?
    • Movies and video games are growing closer together all the time anyway. Spider Man 2 the video game made almost as much as Spider Man 2 the movie. More and more video games are turning into movies, and sooner or later that's going to become a regular driving force behind a whole genre of moviemaking. Video games are finally getting the recognition they deserve... anything that sucks up that much time from the world deserves recognition :)
      • Just because they both make money doesn't mean they're comparable mediums. Even simple action movies rely on heroes accomplishing one-time impossible tasks with neat resolution. This simply doesn't work in the context of a game, where mistakes and repeated content are a necessary part of the experience. And there are so few points of comparison between an art movie and a video game, why even bother?

        If they put a screen capture of "Half Life 2" on a movie screen for two hours, of course the audience would be bored

        • Also, Spider Man 2 the film costs £5 in the cinema or £15-£20 on DVD; Spider Man 2 the game costs upwards of £30. They may well be making similar amounts of money, but far fewer people are playing the game than watching the movie.
        • They're comparable mediums because both are moving towards being two things at once: computer generated, and photorealistic. Neither genre has yet achieved completion in both at once, but both are sneaking up on it.

          Games are going to reach the point where in terms of visual quality you will not be able to tell them apart from movies. In general some types of camera angle will not work in games, while others will not really work in movies. However, you do sometimes see movies with scenes in the first per

    • An actor (as opposed to a 'star') can create subtleties of expression that may be beyond CGI. Think about it - intelligence, experience and talent, directly controlling facial muscles. As opposed to a CGI-jockey with a mouse shifting polygons around. Our brains are hard-wired to decrypt those facial signals and quickly notice when they are 'off' in some way. So, yes, this might replace some actors, but only the bad ones! Oh - and porn of course.
      • So test audiences instead become screeners/raters for parametric computer beings. "Does this one seem happier, or sadder? 1 or 2?" blah blah blah, just like getting fitted for a pair of eyeglasses. Get 100 people from a certain demographic pigeonhole and let them rip. Or maybe it will be even more meta than that? A website, a la "Hot or Not" (whatever it's called), where people sort of generate character-appeal parameters without knowing they are doing it.

        The trick, if I remember reading correctly, is to not try t
  • by CaptainPinko ( 753849 ) on Monday December 06, 2004 @03:35AM (#11006058)
    I've always wondered this -- I mean, it's so obvious that since it's not done it must mean it's flawed -- why doesn't Transmeta release a mobo with its chip and blank code for emulating a processor? Hobbyists would emerge and write multiple emulators.

    You'd boot into something like Grub and choose your processor. That way you could run a UltraSPARC workstation, MIPS, Itanium, or something as small as a PIC. It'd be great for cross-platform development especially for embedded users.

    I'm sure processor hobbyists would spring up to fill every niche of emulator. Probably be a great proving ground for design theory.

    Considering the low heat output you could have a dual/quad-processor box.

    Maybe someone would figure out how to run multiple translators at the same time so you could run x86 and PPC and 68K at damn-near native speeds.

    To me that'd be the ultimate workstation.

    • by Steveftoth ( 78419 ) on Monday December 06, 2004 @04:03AM (#11006126) Homepage
      You do realize that the whole reason Transmeta's processor works well at all is that it's hopelessly optimized for emulating x86 instructions. Their software took years to write and it still is not 100% correct (they still have some bugs in the x86 emulation). It's not going to be easy to do such a thing, and at the end of the day, what would be the advantage of running emulation at that level when you can just run a user-level process to emulate a PIC, or an UltraSPARC, or whatever you want?

      I don't see the point of being able to boot into a random chip because you also have to emulate the entire computer, not just the cpu.

      Even if you could emulate an UltraSPARC CPU, you can't just throw it into a PC case and boot Solaris; you have to use an actual Sun computer that has the right video, network and IDE cards in it, otherwise you'll have a broken machine. There are lots of little things that will cause the machine to break. The CPU is the heart of a computer, but it's not the only piece. They all have to fit together or it won't work. It's just like how you can't go and install a copy of OS X on a motherboard meant for MorphOS (you can, but it's through an emulation layer, Mac-on-Linux); it's not at the kernel level.
    • I've been wondering why Transmeta has not released a Mini-ITX board aimed at the Hobbyist crowd.

      i.e. not more than 500 USD, not the 1000-1500 USD they are asking for the reference platform.

  • The more of these you have in your house, the faster the game/app you're playing/using will run as it will automatically use spare capacity on the other machines networked together in your house... I for one am most certainly looking forward to getting my hands dirty coding for these beauties... Bring on the Cell Processing Overlords... I'm ready.
  • by DCstewieG ( 824956 ) on Monday December 06, 2004 @03:58AM (#11006111)
    I'm still wondering about the real-time uses of this, i.e. PS3. Latency becomes a huge issue when you're trying to render a frame every 16ms.
    • They managed that with the PS1 and 2; I don't see how this can be any worse.
    • by Anonymous Coward
      I may be completely wrong, but I would like to think that IBM and Sony have already thought about this. I very much doubt that they'd design this chip, release it, and then find out if the chip to chip latency will cause timing problems in games.

      I'm going to make a wild guess here: I think that, generally speaking, one local dedicated Cell processor will be used for rendering. Any extra distributed processors (in toasters and whatnot) will be used for the AI's threaded/asynchronous world domination plan
  • Windows (Score:3, Interesting)

    by MustEatYemen ( 810379 ) on Monday December 06, 2004 @04:07AM (#11006134)
    While PPC support was dropped, if I recall correctly back in the Win NT 4.0 days, NT was amazing because it was designed from the ground up so that it could basically be compiled for any endianness and any architecture.

    Since it is the core of the current and future lines of Windows, the Windows base should be portable to a Cell-based system; basically it requires some new drivers and probably tweaking the HAL a bit. The problem is that all the applications (which we all consider part of the Windows OS but are really just applications running on top) would need to be redone.

    Microsoft would have one of these machines in house by now for their Windows teams to work on supporting. Of that I have no doubt; what I do doubt is whether Microsoft will consider this important/the future, and whether they'll support it at the initial release (with Longhorn maybe?) or come late and lose a large section of the market as we all jump and have to use a *nix as the desktop.

    If this whole Cell thing is more than hype, and is the wave of the future, Microsoft will support it.
    • Re:Windows (Score:3, Informative)

      by TheRaven64 ( 641858 )
      NT was amazing because it was designed from the ground up so that it could basically be compiled for any endianness and any architecture

      I don't think NT supported any big-endian platforms. Even on PowerPC it ran in little-endian mode. Porting to a new platform was not quite a straight recompile, but it did only require porting the HAL, not the entire system. OS X works in a similar way - the Mach microkernel is used as a HAL (which is how NeXTStep ran on so many architectures with such relative ease).

      Sinc

  • by erice ( 13380 ) on Monday December 06, 2004 @04:10AM (#11006143) Homepage
    What I'd like to know is what IBM's solution to the software problem is. Software has always been the Achilles heel of multiprocessor systems. Most existing programs, and even most existing programmers, can't use the resources efficiently. That's why we have gargantuan superscalar, out-of-order processors: expensive in terms of hardware, but it suits the software better.

    So, why is Cell going to be easy to program when other parallel systems aren't? The bits I've seen about the architecture suggest that programming might be an absolute bear.
    • So, why is Cell going to be easy to program when other parallel systems aren't? The bits I've seen about the architecture suggest that programming might be an absolute bear.

      That's likely *the* key to success of this architecture. As far as I can tell, it isn't really new in a fundamental sense, parallel/distributed architectures have been around for some time. What IS new, is that this would be the first time that a) this new architecture and b) associated computing potential, hits the mass marke

      • Of course we may be looking at a whole new way of programming. Just as objects have largely replaced structured programming as the preferred metaphor, maybe IBM is going to create a new programming language, maybe c++plusP. You create several objects that all run in parallel with messages flying back and forth. Cell could be the System/360 of the 21st century: a system that can scale from a PDA up to mainframes, all running one OS and all talking to each other. Microsoft should be very afraid.
    • while programming in a multithreaded/multiprocessor environment takes a bit more thought than programming otherwise, it's not nearly as hard as it used to be - or rather, it needn't be so. many modern languages (like my favorite, Limbo [vitanuova.com]) can give you multithreading support (with or without multiprocessors) effectively for free. as long as that goes with light-weight threads (like Inferno [vitanuova.com] and Plan 9 [bell-labs.com] give, or with the stupid "special light-weight process" junk present in many unixes), you've got most of the ba
    • two quick points:
      1) The PS2 is surely worse because it has two different processors, which makes it extremely difficult to program. Cell will be an improvement here, if only for the fact that you have to deal with only one kind of processor.

      2) you can make parallelization easy by making it simple for tasks that are suited for it. think AltiVec vector instructions - very easy to use. graphics-intensive apps are almost always easy to parallelize. you are going to run the logic in one thread, and spread graphi
  • by dan_sylveste ( 701563 ) on Monday December 06, 2004 @04:55AM (#11006229)
    The development kit for Xbox 2 is Windows NT4 for PPC with Xbox 2 extras.
  • Computer components that talk by wireless..

    That must be the wet dreams of NSA employees ;-)
  • Cell workstations will typically be 8-way, which many programs (like GCC) are able to use. If the claims of Power5/Cell performance are true, it means it will compile the Linux kernel in under 5 seconds (8-way). A whole system, including KDE/GNOME and a standard set of apps, will take less than an hour. Sounds too cool to be true.
    • * Sounds too cool to be true.*

      and you know what they say about things that sound too good to be true.

      (btw.. if you wanted.. i'm sure ibm could build you a machine today to do at least almost just that.. the catch would be that it would be friggin expensive!)
  • On-chip DRM worries (Score:5, Interesting)

    by avocade ( 608333 ) on Monday December 06, 2004 @05:29AM (#11006307) Homepage
    I'm still a bit worried that I've not heard much about the seemingly built-in DRM of this new platform (which seems able to spread to all facets of technology, including toasters). According to a clause in the press release by IBM and Sony from Nov. 29 [ibm.com], the Cell processor will have:

    - On-chip hardware in support of security system for intellectual property protection.

    Is this the end of tampering-capable hardware (e.g. machines where you can modify the kernel, bypass DRM-systems etc) that some people have long foreseen? Anyone more into the meat of the technical details care to elaborate on this?
    • This has struck me as possibly being more like the VIA PadLock feature: specialized hardware which requires specialized software to function. When the two are present and functioning, it does the job much faster than the general CPU would have. However, if you were running an alternate OS without the software support, that part of the chip doesn't really get used.

      Remember... Most IP owners are concentrating on the Windows owners of the world. What really hacks them off is that a windows user can violate the

    • Is this the end of tampering-capable hardware (e.g. machines where you can modify the kernel, bypass DRM-systems etc) that some people have long foreseen?

      Not necessarily. There is no indication of what is meant by "hardware in support of security". It could be instructions to speed up asymmetric encryption, a processor serial number, a special unit that must be cryptographically activated for certain instructions to function, or something else entirely. It does not imply that only signed bootloaders/kern
    • I'm guessing they're not releasing details because they don't want egg on their face when it is revealed that the restrictive hardware is either flawed because it can be bypassed (from the point of view of many companies holding copyrights), or flawed because it works (from the point of view of the users).

      Personally, I make my picks (PS2 in this case) based on the ease of getting free (as in beer) software for the machine. I bought a PS2, but didn't buy a Gamecube or an Xbox. As a side note, I wouldn't min
  • by zippity8 ( 446412 ) on Monday December 06, 2004 @05:37AM (#11006327)
    Nothing's official just yet, but this is WAY more interesting than studying for finals, so here we go:

    Processor work is broken into 'apulets', each of which contains data as well as the code to perform an operation. This is probably why it's claimed that if more processing power is needed, it's a simple task to add a new workstation and offload the work.

    A cursory read suggests that it's like creating a cluster of highly efficient yet simple nodes.
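
    If that's roughly right, an 'apulet' might amount to something like this -- pure guesswork on my part, with made-up types rather than any real Cell API:

    #include <stdio.h>
    #include <stddef.h>

    typedef void (*apulet_fn)(void *data, size_t len);

    /* the whole job: a code pointer plus the data it operates on */
    struct apulet {
        apulet_fn entry;
        void     *data;
        size_t    len;
    };

    static void double_floats(void *data, size_t len)
    {
        float *f = data;
        for (size_t i = 0; i < len; i++)
            f[i] *= 2.0f;
    }

    /* a scheduler could hand the bundle to whichever core -- or whichever
     * networked Cell device -- is idle, since everything the job needs
     * travels with it */
    static void dispatch(struct apulet *a) { a->entry(a->data, a->len); }

    int main(void)
    {
        float v[4] = {1, 2, 3, 4};
        struct apulet job = { double_floats, v, 4 };
        dispatch(&job);
        printf("%g %g %g %g\n", v[0], v[1], v[2], v[3]);
        return 0;
    }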

    Corrections are welcome.

    Reference: EETimes [eet.com]
    • by master_p ( 608214 ) on Monday December 06, 2004 @08:14AM (#11006736)
      And how are apulets going to be extracted from the serially executed code produced by a C compiler? Will applications need to be written explicitly for Cell?

      The idea behind the Cell processor is a good one... it is not entirely different from what the Transputer did 15 years ago. Transputer CPUs could be connected into a grid, and the processing power multiplied accordingly, but with one assumption:

      code had to be written in a special programming language that allowed easy parallelization.

      The idea of Transputers failed because it is highly difficult to extract parallelism from code. Special development tools were not available.
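
      For flavour only -- this isn't occam, but plain C with an OpenMP pragma shows the same "state the parallelism explicitly" idea (compile with -fopenmp; the loop body is a made-up stand-in for real work):

      #include <stdio.h>

      #define N 1000000

      int main(void)
      {
          static double a[N];

          /* each iteration is independent, so we say so explicitly and
           * let the runtime spread the loop over the available cores */
          #pragma omp parallel for
          for (int i = 0; i < N; i++)
              a[i] = i * 0.5;

          printf("a[N-1] = %f\n", a[N - 1]);
          return 0;
      }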

      The PowerVR architecture also promised 'infinite' 3D graphics speed by just adding new GPUs, since it used tile rendering, but that failed, too.
  • What I've been wondering is why not just make it a PCI-X card that goes into a current Power-Mac?

    If I recall correctly, the Sony PlayStation 2 workstation (the one with the Emotion Engine) was over 15,000 USD. That puts it well beyond the "that would be interesting" price range, and most likely beyond the aspiring-game-producer-just-out-of-college types.

    Whereas, I would hope, a PCI-X based card could probably be priced much lower.

    Now that I've said all of that, the old workstation would make an interesting additio

  • by Anonymous Coward

    The POWER train seems to be in full motion. No more wondering why IBM is canning its x86 desktop crap.

    I infer this means a full shift to Power-based architecture from IBM; they will only retain x86 server products because customers may want them, but those will not play a large role in their roadmap.

    And that could be a Very Good Thing. The Power architecture is superior to all x86 implementations, including AMD64, in every way. The sooner we can break out into full uncrippled 64 bit computing the better.
  • by ezavada ( 91752 ) on Monday December 06, 2004 @07:27AM (#11006577)
    This seems like an excellent opportunity for Apple to license Mac OS X.

    I'm assuming the instruction set for the Cell processor is a superset of the existing PowerPC processors', or that the missing instructions could easily be emulated. If so, that would make this a graphics workstation that could run Photoshop, Final Cut Pro, Shake, and other top-notch professional software immediately. The existing user base wouldn't have to buy new versions -- their old versions would run.

    As discussed many times on slashdot and elsewhere, Apple won't license their OS unless they believe they can do it without cannibalizing their existing user base. Doubtless there would be some cannibalization of the high end, but if it makes OS X the clear platform for high-end graphics workstations it could still be an overall boost to Apple. I don't really know how the current high-end graphics market sees OS X. My impression is that a surprising amount of it is on Windows, and that Apple is just holding on to its market share in this area.

    Anyone with more current knowledge of the high-end graphics market care to comment?
  • by ngyahloon ( 655557 ) on Monday December 06, 2004 @08:14AM (#11006737) Homepage
    Wife: Honey, can you turn down the TV volume? You're stealing too much processing from the microwave and my chicken won't bake nicely.

    Husband (sniggers): Yah, as if it'll make it taste better
  • It is being used to program games for the Xbox Next, via dual G5 machines from Apple. I doubt the games are programmed on Mac OS X; plus, more than one [theinquirer.net] source [xbitlabs.com] seems to agree...
  • STI Cell (Score:3, Funny)

    by bitswapper ( 805265 ) on Monday December 06, 2004 @08:24AM (#11006782)

    Too bad 3M didn't get involved.
    Then it would have been the STIM Cell processor.

  • by doctor_no ( 214917 ) on Monday December 06, 2004 @09:02AM (#11006966)
    Here is a PowerPoint presentation and an article describing more about the Cell chip that will be shown at the ISSCC (International Solid-State Circuits Conference) next February in San Francisco.


    Technological features for "first-generation" Cell chips:
    4.6 GHz clock speed
    1.3 V operation
    85°C operation with heatsink
    6.4 Gb/s off-chip communication

    from the article:
    eight cores on a single chip
    90 nm SOI process

    Link to Powerpoint [mycom.co.jp]

    Link to Original Article in Japanese [mycom.co.jp]
