
Intel To Design PlayStation 4 GPU

madhatter256 writes "According to the Inquirer, it looks like Intel will be designing the GPU for Sony's next-gen console, most likely an off-shoot of the Larrabee architecture. It is as yet unknown whether Intel will also take part in the console's CPU design. Given the current economic climate, going with Intel was a no-brainer for Sony." The article also mentions rumors of ATI getting the Xbox3 GPU and, if history is any guide, the Wii2 as well.

Comments Filter:
  • by RyanFenton ( 230700 ) on Friday February 06, 2009 @03:47PM (#26757367)

    >> Wii2

    Sheesh - The proper term is WiiAlso.

  • Xbox3 and Wii2? (Score:3, Interesting)

    by wjh31 ( 1372867 ) on Friday February 06, 2009 @03:47PM (#26757371) Homepage
    are these confirmed names or assumed names?
  • by scubamage ( 727538 ) on Friday February 06, 2009 @03:48PM (#26757393)
    Seriously - about the only thing Intel graphics offers is raytracing. Their graphics chipsets are notoriously subpar, even the very best of them. Why would Sony send it their way? ATI makes sense for the Wii2, since they've been working with the GameCube platform since its inception... but Intel? Can someone clear this up? Do they have some magically awesome chipset that has never graced the consumer market?
    • by Kneo24 ( 688412 ) on Friday February 06, 2009 @03:54PM (#26757455)
      Which is precisely why I think this story is bullshit. No gaming machine, whether it be console or PC, will want an Intel GPU as its workhorse for graphics. It just isn't possible. Not today. Probably not in the near future either. Unless, however, they plan on making the PS4 some super casual console that doesn't need a lot of oomph for their up and coming stick figure games.
      • by LordKaT ( 619540 ) on Friday February 06, 2009 @03:58PM (#26757523) Homepage Journal

        Unless, however, they plan on making the PS4 some super casual console that doesn't need a lot of oomph for their up and coming stick figure games.

        Which wouldn't surprise me in the least, since Sony is more than willing to follow the pack leader to grab more marketshare and force their ill-conceived DRM laden formats on the masses.

        • Re: (Score:3, Informative)

          by grantek ( 979387 )

          Well, the Cell was a bit out there when it was conceived, and Larrabee's sort of in that position now. I guess Sony is trying to take the bad press that came from the Cell being "too difficult to code for" and run with it, still maintaining that multicore is the way to scale up performance. Good on 'em, I say (despite my overall negative feelings toward the company).

          • Re: (Score:3, Informative)

            by LordKaT ( 619540 )

            The whole "cell is too hard to program for" bullshit was just a symptom of a larger industry-wide problem: education simply doesn't cover multi-threaded resource sharing nearly as well as it needs to.

            I was speaking more about everything Sony has done since the walkman:

            CD burners are too expensive and complicated, you say? Use our MD players, they record like a tape deck, but have the capacity of a CD in our proprietary format!

            That digital camera too complicated? Use our sleek (if poorly engineered) alternat

        • by theaceoffire ( 1053556 ) on Friday February 06, 2009 @06:41PM (#26759587) Homepage
          Out of the three consoles, Sony is the only one who lets you use a browser with Flash, use standardized cords and hard drives, use generic keyboards/mice/tablets/printers/cameras/etc., and play almost any video format off the disc.

          They even allow you to install another OS on their system. Compared to this, it is MS and Nintendo who are "Forcing their ill-conceived DRM laden formats" on the masses.

          Unless you are talking strictly about Blu-ray instead of hardware. I don't know why that one would bother anyone, since DVDs and CDs also have DRM but no one seems worried about that.
      • by ByOhTek ( 1181381 ) on Friday February 06, 2009 @04:14PM (#26757765) Journal

        wouldn't that be a Wii^2?

        Anyway, I think part of the reason Intel's offerings suck is that they are going for the majority of the graphics market - integrated. They aren't really trying for powerhouse GPUs.

        I'm not saying this is a sure-fire thing, but as a rough indication, consider their primary competitor, which does try for both powerhouse GPUs and CPUs: look at how it performs against its rivals on the former, and against Intel on the latter.

        If Intel decides to make a performance GPU, it might actually work.

        Add to that the fact that, with a fixed architecture, consoles don't /need/ the raw memory, CPU, or GPU power that a gaming computer needs, and I think Intel has the potential to provide a very reasonable GPU for a console.

        • Re: (Score:2, Interesting)

          and you're going to let them 'beta test' their high-end GPU (with absolutely zero track-record) in your flagship gaming console?
        • wouldn't that be a Wii^2?

          So, what, the one after that would be a Wii Cube? Talk about re-using names...

      • Re: (Score:3, Insightful)

        by afidel ( 530433 )
        Larrabee is for raytraced graphics, which requires lots of processors more powerful than those found on today's GPUs, plus more complex interactions between functional units - both strengths of Intel's. Beyond that, Intel is all but guaranteed to come through this depression, whereas the other two GPU houses are very questionable. Finally, Intel has the best fab processes in the world, so they can pack more units into a given die budget than anyone else.
      • by nschubach ( 922175 ) on Friday February 06, 2009 @04:32PM (#26757991) Journal

        Which is precisely why I think this story is bullshit.

        This helps too: http://www.techradar.com/news/gaming/sony-shoots-down-intel-gpu-in-ps4-rumours-525563 [techradar.com]

      Intel can make a graphics chipset. They have the capital to do it, and the equipment. The best nVidia GPU uses a 65nm process; Intel already has a ton of 45nm parts out there and has successfully tested 32nm. They just don't want to pay for the R&D to do it. Blowing a ton of dough to build a graphics chip just wasn't worth it for a market that isn't big enough for them - at least until Sony came along.

        Now, they will partner with Sony, get extensive experience in graphics, and can leverage their own extensive design experience

      • Maybe they just want to be the cheaper console next time around.

    • by Guspaz ( 556486 ) on Friday February 06, 2009 @03:55PM (#26757475)

      Do they have some magically awesome chipset that has never graced the consumer market?

      Yes, Larrabee [anandtech.com]. It's a massively-multicored x86 processor designed to act as a GPU (it has some fixed-function GPU stuff tacked on).

      In effect, Intel intends to build a GPU powerful enough to get software rendering (and all the flexibility and power that brings) up to the same speed as hardware-accelerated rendering. Intel is also going to be providing OpenGL/Direct3D abstraction layers so that existing games can work.

      Larrabee is expected to at least be competitive with nVidia/AMD's stuff, although it might not be until the second generation product before they're on equal footing.
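      To make the "software rendering spread across many x86 cores" idea concrete, here's a rough sketch (my own assumption of the general approach, in modern C++ with made-up names, not Intel's actual Larrabee API): split the framebuffer into tiles and let each core shade its tiles independently.

      ```cpp
      // Hypothetical sketch of tile-parallel software rendering (made-up names,
      // not Larrabee's real API): the framebuffer is split into tiles and each
      // hardware thread shades its own subset, so work scales with core count.
      #include <algorithm>
      #include <cstdint>
      #include <thread>
      #include <vector>

      constexpr int kWidth = 1280, kHeight = 720, kTile = 64;

      // Placeholder per-pixel "shader", computed entirely in software.
      uint32_t ShadePixel(int x, int y) {
          return (uint32_t(x * 255 / kWidth) << 16) | (uint32_t(y * 255 / kHeight) << 8);
      }

      void ShadeTile(std::vector<uint32_t>& fb, int tx, int ty) {
          for (int y = ty; y < std::min(ty + kTile, kHeight); ++y)
              for (int x = tx; x < std::min(tx + kTile, kWidth); ++x)
                  fb[y * kWidth + x] = ShadePixel(x, y);
      }

      int main() {
          std::vector<uint32_t> framebuffer(kWidth * kHeight);
          unsigned cores = std::max(1u, std::thread::hardware_concurrency());

          std::vector<std::thread> workers;
          for (unsigned c = 0; c < cores; ++c) {
              workers.emplace_back([&, c] {
                  unsigned index = 0;
                  // Round-robin the tiles over the cores; each pixel is written by
                  // exactly one thread, so no synchronization is needed.
                  for (int ty = 0; ty < kHeight; ty += kTile)
                      for (int tx = 0; tx < kWidth; tx += kTile, ++index)
                          if (index % cores == c) ShadeTile(framebuffer, tx, ty);
              });
          }
          for (auto& w : workers) w.join();
          return 0;
      }
      ```

      The appeal of that model is that the whole pipeline is ordinary code, so anything a fixed-function GPU can't express can simply be written by hand.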

      • Re: (Score:2, Interesting)

        Yes, Larrabee [anandtech.com]. It's a massively-multicored x86 processor designed to act as a GPU (it has some fixed-function GPU stuff tacked on).

        So they are taking a general-purpose CPU and using it as a GPU? Now why do I get the feeling that this is going to be suboptimal compared to a specialised processor? Also, I would hate to see the power consumption for the graphics capability you get out of it.

        Based on what Intel has released thus far, I will not believe it until I see it.

      • by Creepy ( 93888 )

        Sony is taking a huge gamble here - first of all, without MS as a partner, they will need to develop their own graphics drivers or use whatever proprietary drivers Intel develops for them, which will make porting to other platforms a pain (if it's possible at all). MS is the only platform that has announced it is writing real-time raytracing drivers (in the DX11 API due in March).

        Raytracing requires high memory bandwidth because it needs to be scene aware - that means Sony will likely be using the newest

        • Developers will need to rewrite core libraries or purchase them. Want soft shadows? Buy it or re-develop in house because it isn't a default ray tracing feature and requires casting more (expensive) rays.

          Don't make the assumption that Larrabee is only a ray-tracing engine. It should be able to do traditional polygon-based rasterization as well. That being said, the entire point is that developers are not locked into using a pre-packaged graphics library. Building the system out of programmable components is meant to let developers write their own pipelines and graphics engines. This is the direction GPUs are moving in anyway, with shader languages that run on, in essence, many small, specialized C
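          For concreteness, here's why soft shadows multiply the ray count (a toy sketch with made-up names, not code from Larrabee or any real engine): a hard shadow needs one occlusion ray per shaded point, while a soft shadow averages many rays aimed at different spots on an area light, so the cost grows with the sample count.

          ```cpp
          // Toy sketch: hard vs. soft shadow queries against a single hard-coded
          // occluding sphere. All names here are hypothetical, not a real API.
          #include <cmath>
          #include <cstdlib>
          #include <iostream>

          struct Vec3 { float x, y, z; };

          static Vec3 Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
          static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

          // Does the segment from p to lightPoint hit the unit sphere at the origin?
          bool TraceOcclusion(Vec3 p, Vec3 lightPoint) {
              Vec3 d = Sub(lightPoint, p);
              float a = Dot(d, d), b = 2.0f * Dot(p, d), c = Dot(p, p) - 1.0f;
              float disc = b * b - 4.0f * a * c;
              if (disc < 0.0f) return false;
              float t = (-b - std::sqrt(disc)) / (2.0f * a);
              return t > 0.0f && t < 1.0f;
          }

          // Hard shadow: one ray, so the point is either fully lit or fully dark.
          float HardShadow(Vec3 p, Vec3 lightCenter) {
              return TraceOcclusion(p, lightCenter) ? 0.0f : 1.0f;
          }

          // Soft shadow: N rays toward jittered points on an area light, averaged.
          // The cost grows linearly with the sample count.
          float SoftShadow(Vec3 p, Vec3 lightCenter, float radius, int samples) {
              float visible = 0.0f;
              for (int i = 0; i < samples; ++i) {
                  Vec3 jittered = {lightCenter.x + radius * (float(std::rand()) / RAND_MAX - 0.5f),
                                   lightCenter.y + radius * (float(std::rand()) / RAND_MAX - 0.5f),
                                   lightCenter.z};
                  if (!TraceOcclusion(p, jittered)) visible += 1.0f;
              }
              return visible / float(samples);
          }

          int main() {
              Vec3 point = {0.0f, -2.0f, 0.0f}, light = {0.0f, 3.0f, 0.0f};
              std::cout << "hard: " << HardShadow(point, light)
                        << "  soft (16 rays): " << SoftShadow(point, light, 1.5f, 16) << "\n";
              return 0;
          }
          ```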

      • Re: (Score:2, Interesting)

        by ASBands ( 1087159 )

        Larrabee is expected to at least be competitive with nVidia/AMD's stuff, although it might not be until the second generation product before they're on equal footing.

        Competitiveness isn't determined by generation number. Still: what statistics have you seen that compare Larrabee to something people actually use right now (ATI/nVidia)? There is this presentation [intel.com] (PDF) they gave at SIGGRAPH, which shows that performance increases as you add more Larrabee cores. Here's a graph [wikipedia.org] which may mean something. The y-axis

    • What a lot of people fail to realize is that Intel GPUs are made with power consumption and heat generation in mind, not playing the latest and greatest 3D engine. If they oriented themselves towards pursuing high-end 3D gaming, who knows what would happen?
      • by jandrese ( 485 )
        My guess is that if they tried to go for the high-end GPU market they would release a couple of powerful but flawed chips before they finally got it right. Unfortunately, flaws that get into a console can't be corrected until the next generation of the console 5-7 years later.

        That said, console developers are accustomed to having to work around hardware flaws, sometimes quite severe, to get their games working. One thing seems certain: Sony is going to skimp on the memory again (for the fourth time) and
    • One of the reasons mentioned in the article is that Sony views Intel as more financially stable than nVidia. Another reason is that there is no bad blood between Intel and Sony, whereas there seem to be issues between Sony and nVidia. But I agree with your sentiment that technically, Intel has not shown any real prowess in this area.
      • It seems to me nVidia is in fine shape. Besides, the PS3 currently uses OpenGL ES for its 3D graphics, and nVidia would seem to be the perfect choice
    • The problem is that console producers must already have at least a broad idea of what the next gen will look like. We can assume that 5 years from now not only Intel but also AMD and nVidia will have ray-tracing cards. But Sony needs something *right now* that they can build their HW and SW around. Intel is currently the only one with not just an idea of how to make a ray-tracing hw/sw engine but also working prototypes of Larrabee that they can provide to Sony soon. They might go with a 2-chip solution: Larrabee for graphics + a normal 4-way

  • Inquirer bullshit (Score:5, Insightful)

    by Trepidity ( 597 ) <[delirium-slashdot] [at] [hackish.org]> on Friday February 06, 2009 @03:49PM (#26757405)

    I know Slashdot isn't known for its high standards of editorial oversight, but do you really have to waste our time parroting Inquirer bullshit?

    Note that the only media source claiming that Intel is designing the PS4 GPU is The Inquirer, which is wrong more often than it's right. And Sony has explicitly denied [techradar.com] the rumors.

    Intel might end up designing the PS4 GPU, who knows. This story doesn't give us any more information than we had before either way, and the headline is certainly wrong in concluding otherwise.

  • by Bruce Perens ( 3872 ) * <bruce@perens.com> on Friday February 06, 2009 @03:54PM (#26757457) Homepage Journal
    Remember that Intel has been willing to open its drivers and specifications. Any improvements they make to their graphics core (which, yes, is in need of improvement) will probably make their way to motherboard and laptop chipsets, and will have nice Open Source Linux drivers made for them. So, this is a good thing.
  • Cell? (Score:4, Interesting)

    by Major Blud ( 789630 ) on Friday February 06, 2009 @03:55PM (#26757469) Homepage
    So after all the smack talking Sony did about the power of Cell being untapped... they've decided to abandon it for their next console? But if you listen to Sony, the PS3 is a "10 year platform" anyway. That means we wouldn't see the PS4 for at least another 5-6 years, and there's no telling what kind of processors will be available in that time frame. Do we really know if Larrabee would still be around by then? I think it's still way too early for Sony to start talking about specs for the PS4. Some people are bound to stop buying PS3s because it would just mean the PS4 is right around the corner, and Sony knows this. They really can't afford to halt sales of the PS3 at its current selling rate.
    • Well, the power of Cell being untapped was a problem, wasn't it? Parallel programming is difficult.
    • Cell is being tapped right now. Have you seen Motorstorm 2? Killzone 2? The Cell is finally being properly programmed, and it looks great. It's finally doing things the Xbox 360 can't do (about time!). It will definitely be tapped out by the time the PS4 comes out.

      You don't understand the 10-year platform thing. It doesn't mean 10 years between consoles; it means there are two active consoles at a time. The PS1 overlapped with the PS2 for 4 years, and the PS2 still gets game releases weekly.

      Sony isn't talking spe

    • I have a feeling that this is more FUD spread by someone in the anti-PlayStation camp, just like all the rumors about price drops and such. If you spread news that Sony is abandoning the current-gen system for a whole new platform, why would anyone buy one?

      Sony wouldn't announce that. It smells like some kind of marketing scam.

  • It'll be interesting to see what the new consoles come up with (I have a hunch it'll entail more hard drive space, better graphics and processors). If the console makers actually use full keyboard and mouse support for their new consoles, they could do a lot of damage to the PC gaming market. I mean A LOT. Really, I mean it.
    • by Zakabog ( 603757 )

      If the console makers actually use full keyboard and mouse support for their new consoles, they could do a lot of damage to the PC gaming market. I mean A LOT. Really, I mean it.

      You do realize that a lot more software is sold for consoles than for PCs, right? The console gaming market is huge, mostly because you just buy a console and any game out for that console will work out of the box. I haven't bought a PC game in a long time, but I'm always buying new games for my PS3 or Wii. Plus, I'm the only person I know who has a computer capable of playing the latest games (most people I know have a cheap Dell with integrated video that couldn't handle something as simple and old as Quake II

  • by NullProg ( 70833 ) on Friday February 06, 2009 @04:10PM (#26757705) Homepage Journal

    From the article:
    How Sony inadvertently helped a competitor and lost position in the videogame market.

    Read here: http://online.wsj.com/article/SB123069467545545011.html [wsj.com]

    Enjoy,

    • Re: (Score:2, Informative)

      by weffew... ( 954080 )

      The point of that WSJ piece is that the Xbox 360 CPU and the PS3 CPU are the same because they both come from IBM?

      It's bull. The Xbox 360 CPU is a totally different PowerPC architecture. I'm amazed you posted that.

      C

  • "...also take part in the CPU design of the console"

    Wait... Why did Sony spend all that time, money, and research making their supposedly super-scaling, awesomely powerful Cell processor if they're thinking of creating a new CPU for their next console? Am I missing something here?

  • Lucky for nVidia (Score:3, Insightful)

    by argent ( 18001 ) <peter@NOsPAm.slashdot.2006.taronga.com> on Friday February 06, 2009 @04:35PM (#26758035) Homepage Journal

    Gee, over the past few years the news has been all about how doing a GPU for a console not only loses you money but also pulls resources away from the profitable PC market, and the last few times ATI and nVidia have traded first place in that market have been attributed to this.

    Intel needs any kind of GPU win, badly, and they're big enough and rich enough they can afford to lose money on each chip and make it up on volume.

    It's Sony I worry about, given how utterly appalling Intel GPUs have traditionally been.

    So I gotta wonder, why do you think nVidia is worried about this?

  • Just a guess, but it looks like backward compatibility just got hosed if they go all-Intel. There was such a gap in power that the PS3 was able to emulate PS2 games on a completely different architecture. I'm guessing that emulating the Cell Broadband Engine will be much harder, if possible at all, on Larrabee. This might be the final nail in Sony's coffin after recently having a 95% drop in profits. I have a PS3, and a 60" Sony SXRD display, but damn do I hate Sony. It is interesting that they went wit
  • Intel is going to design their GPU... that's nice. I suppose they're aiming to find a way to waste more money and bomb harder than the PS3... if Sony can outdo themselves this time, they'll never have to make another console again!

  • Glad to see Sony continues to make bad decisions.

  • by CopaceticOpus ( 965603 ) on Friday February 06, 2009 @05:00PM (#26758395)

    I'm expecting to see the PS4 come out at least 3 years after the Xbox3. For one thing, the PS3 has technical advantages over the 360 which should give it more legs. Sony designed the PS3 with a target of a ten year lifespan.

    Also, Sony is really stinging from the cost of trying to compete with a same-generation Xbox. They should be able to hit a sweet spot by spacing their machine half a generation away. When the Xbox3 is released, Sony can drop PS3 prices to very low levels and capture the large, budget-minded segment of the market. After 3 years, once the Xbox3's newness has worn off, Sony can release a system which is technically unmatched, and which Microsoft won't be able to respond to for another 2-3 years.

    Anyway, that's what I'd do if I ran Sony. It will be interesting to see how it plays out.

  • by n3tcat ( 664243 )
    I thought PS3 was supposed to be a "10 year" console...

    I really hope they aren't already working on the PS4; spending money on R&D of this nature in this economic climate almost seems brutally wasteful.
  • What IDIOCY! (Score:5, Informative)

    by seebs ( 15766 ) on Friday February 06, 2009 @05:57PM (#26759063) Homepage

    No, not the article.

    The editor.

    First off, Sony denied this already -- yesterday [techradar.com]. So this isn't news, and it's already-rejected news.

    Secondly, what kind of idiot links to the Inquirer as a source? Remember, they're the ones who posted the article claiming the PS3 was "slow and broken" because they didn't understand a memory bandwidth chart.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...