


Intel To Design PlayStation 4 GPU

madhatter256 writes "According to the Inquirer, it looks like Intel will be designing Sony's next-gen console GPU. It will most likely be an off-shoot of the Larrabee GPU architecture. It is also unknown as yet whether Intel will also take part in the CPU design of the console. Due to the current economic times, it was a no-brainer for Sony to go with Intel." The article also mentions rumors of ATI getting the Xbox3 GPU and, if history is any judge, the Wii2 as well.


  • Xbox3 and Wii2? (Score:3, Interesting)

    by wjh31 ( 1372867 ) on Friday February 06, 2009 @04:47PM (#26757371) Homepage
    Are these confirmed names or assumed names?
  • by Bruce Perens ( 3872 ) * <bruce@perens.com> on Friday February 06, 2009 @04:54PM (#26757457) Homepage Journal
    Remember that Intel has been willing to open its drivers and specifications. Any improvements they make to their graphics core (which, yes, is in need of improvement) will probably make their way to motherboard and laptop chipsets, and will have nice Open Source Linux drivers made for them. So, this is a good thing.
  • Cell? (Score:4, Interesting)

    by Major Blud ( 789630 ) on Friday February 06, 2009 @04:55PM (#26757469) Homepage
    So after all the smack talking that Sony did about the power of Cell being untapped....they've decided to abandon it for their next console? But if you listen to Sony, the PS3 is a "10 year platform" anyway. That means we wouldn't see the PS4 until at least 5-6 years from now. There is no telling what kind of processors would be available in that time frame. Do we really know if Larrabee would still be available by then? I think it's still way too early for Sony to start talking about specs for the PS4. Some people are bound to stop buying PS3s because it would mean that the PS4 is right around the corner, and Sony knows this. They really can't afford to halt sales of the PS3 at their current selling rate.
  • Re:Xbox3 and Wii2? (Score:5, Interesting)

    by eln ( 21727 ) on Friday February 06, 2009 @05:01PM (#26757581)

    The obvious follow-up to the Wii would be the Super Wii.

  • by Midnight Thunder ( 17205 ) on Friday February 06, 2009 @05:05PM (#26757643) Homepage Journal

    Yes, Larrabee [anandtech.com]. It's a massively-multicored x86 processor designed to act as a GPU (it has some fixed-function GPU stuff tacked on).

    So they are taking a general-purpose CPU and using it as a GPU? Now why do I get the feeling that this is going to be suboptimal compared to a specialised processor? Also, I would hate to see the power consumption for the graphics capability you get out of it.

    Based on what Intel has released thus far, I will not believe it until I see it.

  • Re:Inquirer bullshit (Score:4, Interesting)

    by qoncept ( 599709 ) on Friday February 06, 2009 @05:09PM (#26757699) Homepage
  • by ByOhTek ( 1181381 ) on Friday February 06, 2009 @05:14PM (#26757765) Journal

    wouldn't that be a Wii^2?

    Anyway, I think part of the reason Intel's offerings suck is that they are going for the majority of the graphics market -- integrated. They aren't really trying for powerhouse GPUs.

    I'm not saying this is a sure-fire thing, but as a rough indication, look at their primary competitor that does try for powerhouse GPUs and CPUs: compare how they perform on the former against their GPU competitors, and on the latter against Intel.

    If Intel decides to make a performance GPU, it might actually work.

    Add to that the fact that, with a fixed architecture, consoles don't /need/ the raw memory, CPU, or GPU power that a gaming computer needs, and I think Intel has the potential to provide a very reasonable GPU for a console.

  • by __aamnbm3774 ( 989827 ) on Friday February 06, 2009 @05:38PM (#26758073)
    and you're going to let them 'beta test' their high-end GPU (with absolutely zero track record) in your flagship gaming console?
  • by CopaceticOpus ( 965603 ) on Friday February 06, 2009 @06:00PM (#26758395)

    I'm expecting to see the PS4 come out at least 3 years after the Xbox3. For one thing, the PS3 has technical advantages over the 360 which should give it more legs. Sony designed the PS3 with a target of a ten year lifespan.

    Also, Sony is really stinging from the cost of trying to compete with a same-generation Xbox. They should be able to hit a sweet spot by spacing their machine half a generation away. When the Xbox3 is released, Sony can drop PS3 prices to very low levels and capture the large, budget-minded segment of the market. After 3 years, once the Xbox3's newness has worn off, Sony can release a system which is technically unmatched, and which Microsoft won't be able to respond to for another 2-3 years.

    Anyway, that's what I'd do if I ran Sony. It will be interesting to see how it plays out.

  • by ASBands ( 1087159 ) on Friday February 06, 2009 @06:01PM (#26758409) Homepage

    Larrabee is expected to at least be competitive with nVidia/AMD's stuff, although it might not be until the second generation product before they're on equal footing.

    Competitiveness is not a function of generation number. Still: what statistics have you seen that compare Larrabee with something people use right now (ATI/nVidia)? There is this presentation [intel.com] (PDF) they made at SIGGRAPH, which shows that performance increases as you add more Larrabee cores. Here's a graph [wikipedia.org] which may mean something. The y-axis is "scaled performance." What might that mean?

    Graphs show how many 1 GHz Larrabee cores are required to maintain 60 FPS at 1600x1200 resolution in several popular games. Roughly 25 cores are required for Gears of War with no antialiasing, 25 cores for F.E.A.R with 4x antialiasing, and 10 cores for Half-Life 2: Episode 2 with 4x antialiasing.

    Sounds neat. I guess that's why they're going to promote the 32-core Larrabee [google.com]. How much will something that runs these cost, and how much power will it consume? They're still developing this thing, so why do I keep hearing that it will BLOW MY MIND? I have no doubt that Intel has an army of capable engineers who could build something that renders graphics beautifully, but if it costs more than the consumer can possibly pay, there's no real point. Intel is gunning for 2 TFLOPS. I'm pretty sure the Radeon HD 4870 passes that mark already (and you can purchase it for less than $500). Sure, it's a cool technology, but I'd like to see some more facts and figures.

    What have I heard? Power usage/heat: 300W TDP [fudzilla.com]. That's pretty horrific. Cost: a 12-layer PCB [fudzilla.com]. That's twice that of a typical graphics card [xbitlabs.com] and four layers more than the high-end Radeon and nForce cards. That doesn't directly translate into cost, but generally more complicated means more expensive.

    But back to the PS4 -- Sony's real mistake with the PS3 was expecting the Cell processor to be the most incredible computing device ever. Original plans for the PS3 included 2 Cell processors, but they changed to the RSX when they realized the Cell wasn't capable of rendering graphics like they wanted (whereas the Xbox 360's architecture [hotchips.org] was designed with the GPU and CPU co-existing from the start). You can't build a bunch of fast parts and stick them together; you have to build a fast system. Perhaps Sony has learned their lesson.
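The numbers quoted in this thread can be sanity-checked with some back-of-envelope arithmetic. Here is a sketch in Python; the per-core configuration (16-lane vector unit, one fused multiply-add per lane per cycle) and the 2 GHz clock are assumptions chosen to match Intel's stated 2 TFLOPS target, not figures from the article.

```python
# Back-of-envelope check on the Larrabee figures quoted in the thread.
# ASSUMPTIONS (not from the article): 16-lane vector unit per core,
# one fused multiply-add (2 FLOPs) per lane per cycle, 2 GHz clock.

CORES = 32
VECTOR_LANES = 16
FLOPS_PER_LANE_PER_CYCLE = 2   # one fused multiply-add = 2 FLOPs
CLOCK_HZ = 2e9                 # 2 GHz, picked to hit the 2 TFLOPS target

peak_flops = CORES * VECTOR_LANES * FLOPS_PER_LANE_PER_CYCLE * CLOCK_HZ
print(f"Peak throughput: {peak_flops / 1e12:.2f} TFLOPS")  # → 2.05 TFLOPS

# Pixel throughput implied by "25 cores sustain 60 FPS at 1600x1200":
pixels_per_second = 1600 * 1200 * 60
per_core = pixels_per_second / 25
print(f"~{per_core / 1e6:.1f} Mpixel/s per 1 GHz core")    # → ~4.6 Mpixel/s
```

So the quoted core counts and the 2 TFLOPS goal are at least internally consistent, assuming a clock around 2 GHz.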

  • by bluefoxlucid ( 723572 ) on Friday February 06, 2009 @07:35PM (#26759517) Homepage Journal
    It's designed to be used as a GPU, but the GPU functionality is bolted on. It's a set of general-purpose CPU cores with a GPU interface on top -- in other words, it's a software-emulated GPU, with lots and lots of multi-threaded CPUs, and the software is stored in hardware (firmware, microcode, whatever). Instead of using hardware capable of tackling the problem quickly in minimal die space and power consumption, they used general-purpose hardware and threw lots of it at the problem. This is the concern.
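To make the "software-emulated GPU" point concrete, here is a toy sketch (purely illustrative, not Intel's actual pipeline) of the core job a rasterizer does: the edge-function half-space test that fixed-function hardware wires into silicon, written as the plain loops a general-purpose core would have to execute instead.

```python
# Toy software rasterizer: determine which pixels a triangle covers
# using edge functions (signed-area half-space tests). This is the
# work a fixed-function rasterizer does in dedicated silicon.

def edge(ax, ay, bx, by, px, py):
    # Signed area of the parallelogram (b-a) x (p-a); >= 0 when the
    # point (px, py) lies on the inside of the directed edge a -> b.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(tri, width, height):
    # tri: three (x, y) vertices in counter-clockwise order.
    (ax, ay), (bx, by), (cx, cy) = tri
    covered = []
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5   # sample at pixel centers
            if (edge(ax, ay, bx, by, px, py) >= 0 and
                edge(bx, by, cx, cy, px, py) >= 0 and
                edge(cx, cy, ax, ay, px, py) >= 0):
                covered.append((x, y))
    return covered

# A triangle covering the lower-left half of an 8x8 pixel grid:
pixels = rasterize([(0, 0), (8, 0), (0, 8)], 8, 8)
print(len(pixels))  # → 36
```

A hardware rasterizer evaluates many of these tests per clock in parallel; Larrabee's bet is that enough wide-vector x86 cores running loops like this can keep up.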
  • by FatherOfONe ( 515801 ) on Friday February 06, 2009 @11:56PM (#26761487)

    This could be a long post, but to sum it up as best I can...

    When Sony started to think about the PS3, they talked to their TOP developers and asked them what it needed to be able to do. We can assume those companies were their internal development staff, Konami, Square, whoever makes GT. Those companies wanted a TON more performance, and as such Sony couldn't just deliver a beefed-up PS2. Sony mentioned in one of their articles that these developers demanded close to 100X the performance of the PS2. Sony did their best (obviously not 100X). Also remember that Sony was sitting on the absolutely dominant console at the time and was used to forcing developers to learn new stuff. The PS2 was and is not fun to program for, BUT given that there are a ton of them out there, companies adapted. They were arrogant in thinking that whatever they made, the developers would be forced to learn. Another point is that the economy wasn't too bad when Sony was designing the PS3. They didn't foresee today's economic meltdown.

    Now the whole Blu-ray thing. In short, Sony wanted to cement Blu-ray as the de facto standard, and they used the PS3 to do it. This probably cost them this console round, but they sold enough to kill off HD-DVD. This makes it difficult to impossible for the next Xbox or Wii to use anything but Blu-ray. Yes, yes, yes -- downloadable content... It is YEARS off from the mainstream for games, and nobody wants to start a nuclear war with Walmart and Gamestop. Add to this the bandwidth caps that the Internet providers are starting to play with, and we are decades away from a console with no disc media. However, Sony could have ditched Blu-ray and added another 256MB of system RAM and another 256MB of video RAM, all while lowering the price by around $100. In my opinion they could have launched that system a year earlier and been way farther ahead, BUT they wanted Blu-ray to win, and it did. Was it worth it? Time will tell, but there are currently around 20 million PS3s out the door, so almost all 3rd-party companies will at least have to support it this generation.

    Sony has the largest 1st- and 2nd-party support for their console and quite a few great games (more this year), so their largest problem now is price. They have to focus like a laser beam on getting the price down as fast as possible. My guess (and others') is that a $50 price drop will happen this summer. If Sony can get things under control they may lower it $100 by next Christmas season, but I wouldn't hold your breath. Having 20 million consoles sold gives them a lot of room now to just focus on being profitable. Microsoft just fired a bunch of people on their Xbox team (and Zune), and Sony just lost BILLIONS this year. I don't expect anything huge from either of them this year.

    Having said all this, I do agree with your post above, but it will be for the PS4. I believe the PS4 will be nothing but a slightly better PS3. The difference is that it will do great 1080p graphics on most games AND it will have a faster Blu-ray drive. I could see some weird controller, just because the Wii was so successful with Wii Sports. It sure won't cost $600 or $500. If we are lucky it will ditch their weird OS and go with Linux. Knowing Sony, it will be their own brand, but Ubuntu would rock on it.
