Intel To Design PlayStation 4 GPU
madhatter256 writes "According to the Inquirer, it looks like Intel will be designing Sony's next-gen console GPU. It will most likely be an off-shoot of the Larrabee GPU architecture. It is also unknown as yet whether Intel will take part in the CPU design of the console as well. Given the current economic times, it was a no-brainer for Sony to go with Intel." The article also mentions rumors of ATI getting the Xbox3 GPU and, if history is any judge, the Wii2 as well.
Because when I think graphics, I think intel (Score:5, Insightful)
Inquirer bullshit (Score:5, Insightful)
I know Slashdot isn't known for its high standards of editorial oversight, but do you really have to waste our time parroting Inquirer bullshit?
Note that the only media source claiming that Intel is designing the PS4 GPU is The Inquirer, which is wrong more often than it's right. And Sony has explicitly denied [techradar.com] the rumors.
Intel might end up designing the PS4 GPU, who knows. This story doesn't give us any more information than we had before either way, and the headline is certainly wrong in concluding otherwise.
Re:Because when I think graphics, I think intel (Score:4, Insightful)
Unless, however, they plan on making the PS4 some super-casual console that doesn't need a lot of oomph for their up-and-coming stick-figure games.
Which wouldn't surprise me in the least, since Sony is more than willing to follow the pack leader to grab more market share and force their ill-conceived, DRM-laden formats on the masses.
Re:Bizarre metaphor (Score:3, Insightful)
History: I put on my robe and judge's wig.
Larrabee as Console GPU (Score:1, Insightful)
Gotta post AC, as I'm on too many NDAs :-(
One distinct advantage Larrabee brings to the console is that it can be both GPU and CPU -- they could make a two-chip console: Larrabee on one, and I/O on the other. That would keep costs under control. They would probably use Larrabee chips in which one to four cores fail tests -- the console version would then advertise 28 cores. They'd probably also underclock it a bit to keep it cooler.
It would give them huge economies of scale, and get people writing software for it. And it will likely be competitive (at least in raw number crunching/bandwidth) with NVDA or ATI offerings of the same generation. The devil is hiding in the details of their software stack. But they do have some good guys working on that. Abrash, for one.
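The salvage-binning idea above is easy to sanity-check with a toy simulation: if a 32-core die can ship as a 28-core part when up to four cores fail, effective yield rises sharply. All numbers here (per-core defect rate, die count) are illustrative assumptions, not real Intel figures.

```python
# Toy model of salvage binning: dies with a few bad cores still ship
# as a reduced-core console SKU. Numbers are made up for illustration.
import random

random.seed(42)

CORES = 32          # cores on a full Larrabee die (per the comment)
SHIP_CORES = 28     # hypothetical console SKU: up to 4 cores fused off
P_CORE_FAIL = 0.05  # assumed independent per-core defect probability

def good_cores() -> int:
    """Simulate one die: count the cores that pass test."""
    return sum(1 for _ in range(CORES) if random.random() > P_CORE_FAIL)

samples = [good_cores() for _ in range(10_000)]
full_yield = sum(g == CORES for g in samples) / len(samples)
salvage_yield = sum(g >= SHIP_CORES for g in samples) / len(samples)

# Perfect-die yield is low (0.95^32 is about 0.19), but almost every
# die clears the 28-core bar, so salvage yield is far higher.
print(f"perfect-die yield: {full_yield:.1%}")
print(f"28-core-SKU yield: {salvage_yield:.1%}")
```

With these assumed numbers, tolerating four dead cores turns a yield under 20% into one near 100%, which is the whole economic point of fusing off cores.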
-- A.C.
Lucky for nVidia (Score:3, Insightful)
Gee, over the past few years the news has been all about how doing a GPU for a console not only lost you money, but also pulled resources away from the profitable PC market; the last few times ATI and nVidia traded first place in that market have been attributed to this.
Intel needs any kind of GPU win, badly, and they're big enough and rich enough they can afford to lose money on each chip and make it up on volume.
It's Sony I worry about, given how utterly appalling Intel GPUs have traditionally been.
So I gotta wonder, why do you think nVidia is worried about this?
Larrabee doesn't exist, yet (Score:3, Insightful)
It's kind of weird to take Larrabee as evidence of Intel having successfully produced a GPU, since they still haven't produced it, despite years of hype. It might turn out to be as excellent as they claim. Or it might turn out to be as revolutionary as their last revolutionary new architecture, Itanium.
Re:Because when I think graphics, I think intel (Score:3, Insightful)
Larrabee has 32 cores, so that's all right then.
There's no reason Intel can't make a high-end graphics chip, their fabrication processes alone would give them a huge advantage over ATI/NVIDIA.
If they haven't made one so far it's because they're not really interested. They already sell more graphics chips than the competition so why bother?
The market for top-of-the-range graphics cards is pretty small. ATI/NVIDIA make way more money from their $50 cards than their $500 cards.
Re:im full of jokes today (Score:3, Insightful)
I think that they made a huge mistake by not keeping the same architecture for the PS3 as the PS2. If they had kept the same architecture, while only increasing the processor speed and the graphics chip (and still adding new OpenGL lighting and shading effects, etc.), they could easily have made the PS3 FORWARD compatible, like many of the Xbox games are. Not only that, but every developer on earth knows how to program for the PS2. That would have made it so easy for studios to continue designing games in the PS2 fashion, while simply increasing polygon count and textures. I think that familiarity with the system would really have given Sony parity with the 360, which almost all developers say is easier to develop for because of Visual Studio and the rest of the developer tools that MS provides. Never underestimate familiarity.
Example: you can render PS1 games on an old PC (hell, my old Dell Axim PocketPC could render PS1 games at near-full speed) at higher resolutions (like 1280x960) while getting rid of the blockiness of the textures, and the games look much better. The PS2 emulator for the PC (which only runs a handful of games at full speed) can make some PS2 games look better than they did on the PS2, because they're rendered at a higher resolution. The PS3, if it had kept the same architecture as the PS2, could have EASILY upscaled EVERY PS2 game made, breathing new life into people's libraries of old games. Who wouldn't want to replay God of War with the resolution sharpened to 720p? Seriously, the upscaling the PS3 does (simple image anti-aliasing) doesn't really increase the picture quality much, if at all (some games actually look worse).
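The difference the comment describes comes down to sample counts: rendering at a higher internal resolution shades genuinely new samples along polygon edges, while post-hoc upscaling only interpolates a finished low-res frame. A quick back-of-the-envelope calculation, using a common PS2 render-target size (512x448 is an assumption; actual games varied) and the 1280x960 figure from the comment:

```python
# Pixel-count comparison: native PS2-era render target vs. the higher
# internal resolution an emulator (or a hypothetical PS3 renderer)
# could use. 512x448 is an assumed typical PS2 target, not a spec.
ps2_native = (512, 448)
upscaled = (1280, 960)

def pixels(res: tuple[int, int]) -> int:
    w, h = res
    return w * h

# Each frame gets this many times more shaded samples, so geometry
# edges sharpen -- though texture detail stays whatever was authored.
ratio = pixels(upscaled) / pixels(ps2_native)
print(f"{ratio:.1f}x more shaded samples per frame")  # -> 5.4x
```

Roughly 5.4x the samples per frame explains why emulated PS1/PS2 games look sharper at high resolution, and also why textures (a fixed asset) don't improve the same way.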
Anyway, they didn't do that, and I believe it was a huge mistake. However, I hope they rectify that mistake with the next generation and make the PS4 FORWARD compatible with my PS3 games. Let me play my PS3 games upscaled to 1080p! That would be great. I realize that the textures don't get sharper and that the effects don't improve, but I feel it would be easier to attract new customers and keep existing ones if they knew all their old games would continue to work forever, and would even look better than they did before!
My 2 cents. If someone who really knows about programming could explain how this idea isn't feasible/cost effective/a good idea, please explain.