Intel To Design PlayStation 4 GPU

madhatter256 writes "According to The Inquirer, it looks like Intel will be designing Sony's next-gen console GPU, most likely an offshoot of the Larrabee GPU architecture. It is not yet known whether Intel will also take part in the CPU design of the console. Given the current economic climate, going with Intel was a no-brainer for Sony." The article also mentions rumors of ATI getting the Xbox3 GPU and, if history is any judge, the Wii2 as well.
  • by scubamage ( 727538 ) on Friday February 06, 2009 @04:48PM (#26757393)
    Seriously - about the only thing Intel graphics offers is raytracing. Their graphics chipsets are notoriously subpar, even the very best of them. Why would Sony send its business their way? ATI makes sense for the Wii2, since they've been working with the GameCube platform since its inception... but Intel? Can someone clear this up? Do they have some magically awesome chipset that has never graced the consumer market?
  • Inquirer bullshit (Score:5, Insightful)

    by Trepidity ( 597 ) <delirium-slashdot@@@hackish...org> on Friday February 06, 2009 @04:49PM (#26757405)

    I know Slashdot isn't known for its high standards of editorial oversight, but do you really have to waste our time parroting Inquirer bullshit?

    Note that the only media source claiming that Intel is designing the PS4 GPU is The Inquirer, which is wrong more often than it's right. And Sony has explicitly denied [techradar.com] the rumors.

    Intel might end up designing the PS4 GPU, who knows. Either way, this story gives us no more information than we had before, and the headline is certainly wrong to present the rumor as settled.

  • by Kneo24 ( 688412 ) on Friday February 06, 2009 @04:54PM (#26757455)
    Which is precisely why I think this story is bullshit. No gaming machine, whether console or PC, will want an Intel GPU as its workhorse for graphics. It just isn't possible. Not today. Probably not in the near future either. Unless, that is, they plan on making the PS4 some super-casual console that doesn't need a lot of oomph for its up-and-coming stick-figure games.
  • by LordKaT ( 619540 ) on Friday February 06, 2009 @04:58PM (#26757523) Homepage Journal

    Unless, that is, they plan on making the PS4 some super-casual console that doesn't need a lot of oomph for its up-and-coming stick-figure games.

    Which wouldn't surprise me in the least, since Sony is more than willing to follow the pack leader to grab more market share and force its ill-conceived, DRM-laden formats on the masses.

  • by Gizzmonic ( 412910 ) on Friday February 06, 2009 @05:02PM (#26757597) Homepage Journal

    History: I put on my robe and judge's wig.

  • by Anonymous Coward on Friday February 06, 2009 @05:08PM (#26757687)

    Gotta post AC, as I'm on too many NDAs :-(

    One distinct advantage Larrabee brings to a console is that it can serve as both GPU and CPU -- Sony could build a two-chip console: Larrabee on one chip and I/O on the other. That would keep costs under control. They would probably use Larrabee dies where one to four cores fail testing -- the console version would then advertise 28 cores (a rough sketch of that binning arithmetic follows below). They'd probably also underclock it a bit to keep it cooler.

    It would give them huge economies of scale and get people writing software for the architecture. And it will likely be competitive (at least in raw number-crunching and bandwidth) with whatever NVIDIA or ATI is offering at the same time. The devil is in the details of their software stack, but they do have some good people working on that. Abrash, for one.

    -- A.C.
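
    To make the core-harvesting idea above concrete, here is a minimal C++ sketch. It assumes, as the comment speculates, a 32-core die binned down to a 28-core console part; the names and figures are purely illustrative, not anything Intel has announced.

        #include <bitset>
        #include <cstdio>

        // Speculative figures from the comment above: a 32-core die,
        // with up to 4 dead cores tolerated for a 28-core console SKU.
        constexpr int kDieCores = 32;
        constexpr int kSkuCores = 28;

        // testResults: bit i is set if core i passed its functional test.
        bool binsAsConsolePart(const std::bitset<kDieCores>& testResults) {
            return static_cast<int>(testResults.count()) >= kSkuCores;
        }

        int main() {
            std::bitset<kDieCores> die;
            die.set();      // every core passes...
            die.reset(3);   // ...except cores 3 and 17, which failed test
            die.reset(17);
            std::printf("usable as a 28-core console part: %s\n",
                        binsAsConsolePart(die) ? "yes" : "no");
        }
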

  • by afidel ( 530433 ) on Friday February 06, 2009 @05:14PM (#26757771)
    Larrabee is for raytraced graphics, which requires processors more powerful than those found in today's GPUs and more complex interactions between functional units -- both of which play to Intel's strengths. Beyond that, Intel is all but guaranteed to come through this depression, whereas the other two GPU houses are on much shakier footing. Finally, Intel has the best fab processes in the world, so they can pack more units into a given die budget than anyone else.
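
    For anyone wondering why raytracing favors general-purpose cores like Larrabee's: the inner loop of a raytracer is branchy scalar math, as in the hypothetical ray-sphere test below. This is only an illustration of the workload, not Larrabee code; every name in it is made up.

        #include <cmath>
        #include <cstdio>

        struct Vec3 { double x, y, z; };

        double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
        Vec3 sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }

        // Distance along the ray to the nearest hit on a sphere, or -1.0 on a
        // miss. Solves the quadratic |origin + t*dir - center|^2 = radius^2.
        double raySphere(Vec3 origin, Vec3 dir, Vec3 center, double radius) {
            Vec3 oc = sub(origin, center);
            double a = dot(dir, dir);
            double b = 2.0 * dot(oc, dir);
            double c = dot(oc, oc) - radius * radius;
            double disc = b * b - 4.0 * a * c;
            if (disc < 0.0) return -1.0;           // miss: the early-out branch
            double t = (-b - std::sqrt(disc)) / (2.0 * a);
            return (t >= 0.0) ? t : -1.0;          // hits behind the eye count as misses
        }

        int main() {
            // Unit ray down +z from the origin toward a sphere centered at z=5.
            double t = raySphere({0,0,0}, {0,0,1}, {0,0,5}, 1.0);
            std::printf("hit at t = %f\n", t);     // expect t = 4
        }
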
  • by rcuhljr ( 1132713 ) on Friday February 06, 2009 @05:26PM (#26757915)
    Shouldn't they put out games for the PS3 before they make the PS4?
  • Lucky for nVidia (Score:3, Insightful)

    by argent ( 18001 ) <peter@slashdot . ... t a r o nga.com> on Friday February 06, 2009 @05:35PM (#26758035) Homepage Journal

    Gee, over the past few years the news has been all about how doing a GPU for a console not only loses you money but also pulls resources away from the profitable PC market; the last few times ATI and nVidia traded first place in that market have been attributed to exactly that.

    Intel needs any kind of GPU win, badly, and they're big enough and rich enough they can afford to lose money on each chip and make it up on volume.

    It's Sony I worry about, given how utterly appalling Intel GPUs have traditionally been.

    So I gotta wonder, why do you think nVidia is worried about this?

  • by Trepidity ( 597 ) <delirium-slashdot@@@hackish...org> on Friday February 06, 2009 @05:47PM (#26758197)

    It's kind of weird to take Larrabee as evidence of Intel having successfully produced a GPU, since they still haven't shipped it, despite years of hype. It might turn out to be as excellent as they claim. Or it might turn out to be as "revolutionary" as their last revolutionary new architecture, Itanium.

  • by Joce640k ( 829181 ) on Friday February 06, 2009 @06:07PM (#26758481) Homepage

    Larrabee has 32 cores, so that's all right then.

    There's no reason Intel can't make a high-end graphics chip; their fabrication processes alone would give them a huge advantage over ATI/NVIDIA.

    If they haven't made one so far, it's because they're not really interested. They already sell more graphics chips than the competition, so why bother?

    The market for top-of-the-range graphics cards is pretty small. ATI/NVIDIA make way more money from their $50 cards than their $500 cards.

  • by wamerocity ( 1106155 ) on Friday February 06, 2009 @07:47PM (#26759645) Journal
    What I really hope they do is what they should have done with the PS3 in the first place.

    I think they made a huge mistake by not keeping the same architecture for the PS3 as for the PS2. If they had kept the same architecture while only increasing the processor speed and the graphics chip (and still adding new OpenGL-style lighting and shading effects, etc.), they could have easily made PS2 games FORWARD compatible, the way many Xbox games are on the 360. Not only that, but every developer on earth knows how to program for the PS2. That would have made it easy for studios to continue designing games in the PS2 fashion while simply increasing polygon counts and texture detail. I think that familiarity with the system would have put Sony on equal footing with the 360, which almost all developers say is easier to develop for because of Visual Studio and the rest of the developer tools MS provides. Never underestimate familiarity.

    Example: you can render PS1 games on an old PC (hell, my old Dell Axim PocketPC could render PS1 games at near full speed) at higher resolutions (like 1280x960), getting rid of the boxiness of the textures, and the games look much better. The PS2 emulator for the PC (which only runs a handful of games at full speed) can make some PS2 games look better than they did on the PS2, because they're rendered at a higher resolution. If the PS3 had kept the same architecture as the PS2, it could have EASILY upscaled EVERY PS2 game made this way (roughly as sketched below), breathing new life into people's libraries of old games. Who wouldn't want to replay God of War with the resolution sharpened to 720p? Seriously, the upscaling the PS3 does now (simple image anti-aliasing) doesn't really improve picture quality much, if at all (some games actually look worse).

    Anyway, they didn't do that, and I believe it was a huge mistake. I hope they rectify it with the next generation and make the PS4 FORWARD compatible with my PS3 games. Let me play my PS3 games upscaled to 1080p! That would be great. I realize the textures don't get sharper and the effects don't improve, but it would be easier to attract new customers and keep existing ones if they knew all their old games would keep working forever, and would even look better than before!

    My 2 cents. If someone who really knows programming can explain why this idea isn't feasible, cost-effective, or otherwise a good one, please do.
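
    The upscaling trick described in the comment above is feasible because a console game submits geometry, not pixels. Here is a rough, hypothetical C++ sketch of what such an emulator does, with purely illustrative names and figures: scale the render target and the post-transform vertex coordinates together, so polygon edges sharpen while textures keep only their native detail.

        #include <cstdio>

        // Illustrative figures: a PS1-era 320x240 target rendered at 4x,
        // i.e. 1280x960, as in the comment's example.
        constexpr int kNativeW = 320, kNativeH = 240;
        constexpr int kScale = 4;

        struct Vertex { float x, y; };

        // The game's geometry is resolution-independent, so the emulator can
        // rasterize the same vertices into a larger framebuffer. Polygon edges
        // become sharper; only pre-baked textures keep their original detail.
        Vertex toUpscaled(Vertex v) {
            return { v.x * kScale, v.y * kScale };
        }

        int main() {
            Vertex v = {100.0f, 50.0f};
            Vertex u = toUpscaled(v);
            std::printf("target %dx%d -> %dx%d; vertex (%.0f,%.0f) -> (%.0f,%.0f)\n",
                        kNativeW, kNativeH, kNativeW * kScale, kNativeH * kScale,
                        v.x, v.y, u.x, u.y);
        }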
