Intel To Design PlayStation 4 GPU 288
madhatter256 writes "According to the Inquirer it looks like Intel will be designing Sony's next-gen console GPU. It will most likely be an off-shoot of the Larrabee GPU architecture. It is also unknown as yet whether Intel will also take part in the CPU design of the console. Given the current economic climate, it was a no-brainer for Sony to go with Intel." The article also mentions rumors of ATI getting the Xbox3 GPU and, if history is any judge, the Wii2 as well.
Grammar Junta, attack! (Score:5, Funny)
>> Wii2
Sheesh - The proper term is WiiAlso.
Re:Grammar Junta, attack! (Score:5, Funny)
>> Wii2
Sheesh - The proper term is WiiAlso.
Not the WiiWii?
Re:Grammar Junta, attack! (Score:4, Funny)
Not the WiiWii?
That's a limited European marketing name. And it's spelled OuiiOuii.
Re:Grammar Junta, attack! (Score:4, Funny)
That's a limited European marketing name. And it's spelled OuiiOuii.
And the version for playstation fanbois will be the ennui.
Re: (Score:3, Funny)
ennuii.
Re:Grammar Junta, attack! (Score:5, Funny)
Re: (Score:2)
>> Wii2
Sheesh - The proper term is WiiAlso.
And the Xbox3? It's also kind of confusing that they're moving back 357 versions of the Xbox in the naming convention. Makes one wonder if they are still working on those other 357 prototypes. Fail early, fail often I guess.
Re:Grammar Junta, attack! (Score:4, Funny)
I believe it will be:
Xbox 3 Basic
Xbox 3 Home
Xbox 3 Media Center
Xbox 3 Premium
Xbox 3 Business
Xbox 3 Ultimate
Xbox 3 Ultimate - Halo Edition
Xbox 3 Ultimate XTreme Turbo Black
They will all do exactly the same thing.
Re: (Score:3, Funny)
I believe it will be:
They will all do exactly the same thing.
You must be referring to a red ring of sorts.
Re: (Score:3, Funny)
Is that like the MacBook Pro Black edition, now only $200 for the privilege of us changing the pigment color in the injection mould?
No, that's only $200 for the privilege of you owning a computer that has the same color as The Lord's Own Turtlenecks. In addition, highly scientific studies have shown that having a black Mac, instead of a regular color Mac, will result in you attracting 150% more chicks when you sit sipping $15 soymilkchocolatelattefrappuchinomochaorangemintlattes while writing the next great American novel at your local coffeeshop.
Re:Grammar Junta, attack! (Score:5, Funny)
And the Xbox3
They meant the XBOX 129600. The correct formula for Xbox naming is XBOX 360^(N-1) where N is the generation number.
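(For the pedants, a throwaway Python check of the joke formula - mine, obviously nothing official:)

# Evaluate the naming formula XBOX 360^(N-1) for a few generations.
for n in range(1, 5):
    print(f"Generation {n}: XBOX {360 ** (n - 1)}")
# -> XBOX 1, XBOX 360, XBOX 129600, XBOX 46656000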
Re: (Score:2)
Re: (Score:2)
Perhaps one could use Roman numerals: WiiII?
Re: (Score:2)
WiiAgain
Re:Grammar Junta, attack! (Score:5, Funny)
After the Wii2 comes the Wii3... which comes in a special "R" edition with a digital video recorder, and also comes packaged with the ever popular game, Kings of Orient, making it the Wii3/Kings of Orient/R.
Re: (Score:2)
After the Wii2 comes the Wii3... which comes in a special "R" edition with a digital video recorder, and also comes packaged with the ever popular game, Kings of Orient, making it the Wii3/Kings of Orient/R.
That would've been funnier about a month and a half ago. =P
Re: (Score:3, Funny)
Re: (Score:3, Funny)
Xbox3 and Wii2? (Score:3, Interesting)
Re:Xbox3 and Wii2? (Score:5, Funny)
Yes.
Re: (Score:2)
Re: (Score:2)
They are not official names - but they indicate to all readers which consoles are being referred to.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re:Xbox3 and Wii2? (Score:5, Interesting)
The obvious follow-up to the Wii would be the Super Wii.
Re: (Score:2)
The obvious follow-up to the Wii would be the Super Wii.
Is that what happens after you've drunk a lot of beer?
(I know I'm bringing back the jokes from 3 years ago, but I can't resist).
Re: (Score:2)
I love that game, you insensitive clod!
Played it all the time. I finally acquired the original Zelda game less than 10 years ago.
Because when I think graphics, I think intel (Score:5, Insightful)
Re:Because when I think graphics, I think intel (Score:5, Insightful)
Re:Because when I think graphics, I think intel (Score:4, Insightful)
Unless, of course, they plan on making the PS4 some super-casual console that doesn't need a lot of oomph for their up-and-coming stick-figure games.
Which wouldn't surprise me in the least, since Sony is more than willing to follow the pack leader to grab more market share and force their ill-conceived, DRM-laden formats on the masses.
Re: (Score:3, Informative)
Well, the Cell was a bit out there when it was conceived, and Larrabee's sort of in that position now. I guess Sony is trying to take the bad press that came from the Cell being "too difficult to code for" and run with it, still maintaining that multicore is the way to scale up performance. Good on 'em, I say (despite my overall negative feelings toward the company).
Re: (Score:3, Informative)
The whole "cell is too hard to program for" bullshit was just a symptom of a larger industry-wide problem: education simply doesn't cover multi-threaded resource sharing nearly as well as it needs to.
I was speaking more about everything Sony has done since the Walkman:
CD burners are too expensive and complicated, you say? Use our MD players - they record like a tape deck but have the capacity of a CD, in our proprietary format!
That digital camera too complicated? Use our sleek (if poorly engineered) alternat
Re:Because when I think graphics, I think intel (Score:5, Informative)
They even allow you to install another OS on their system. Compared to this, it is MS and Nintendo who are "forcing their ill-conceived DRM-laden formats" on the masses.
Unless you are talking strictly about Blu-ray instead of the hardware. Don't know why that one would bother anyone, since DVDs and CDs also have DRM, but no one seems worried about that.
Re:Because when I think graphics, I think intel (Score:4, Interesting)
wouldn't that be a Wii^2?
Anyway, I think part of the reason Intel's offerings suck is that they're going for the majority of the graphics market - integrated. They aren't really trying for powerhouse GPUs.
I'm not saying this is a surefire thing, but as a rough indication, look at their primary competitor, which does try for powerhouse GPUs and CPUs: compare how it performs on the former against its graphics rivals, and on the latter against Intel.
If Intel decides to make a performance GPU, it might actually work.
Add to that the fact that, with their fixed architecture, consoles don't /need/ the raw memory, CPU, or GPU power that a gaming computer needs, and I think Intel has the potential to provide a very reasonable GPU for a console.
Re: (Score:2, Interesting)
Re: (Score:2)
wouldn't that be a Wii^2?
So, what, the one after that would be a Wii Cube? Talk about re-using names...
Re: (Score:3, Insightful)
Re:Because when I think graphics, I think intel (Score:5, Informative)
Which is precisely why I think this story is bullshit.
This helps too: http://www.techradar.com/news/gaming/sony-shoots-down-intel-gpu-in-ps4-rumours-525563 [techradar.com]
Re:Because when I think graphics, I think intel (Score:5, Informative)
It's that process size. (Score:2)
Intel can make a graphics chipset. They have the capital to do it, and the equipment. The best nVidia GPU uses a 65nm process; Intel already has a ton of 45nm parts out there and has successfully tested 32nm. They just don't want to pay for the R&D to do it. Blowing a ton of dough on a graphics chip just wasn't a big enough market for them, at least until Sony came along.
Now, they will partner with Sony, get extensive experience in graphics, and can leverage their own extensive design experience
Re: (Score:2)
Maybe they just want to be the cheaper console next time around.
Re:Because when I think graphics, I think intel (Score:5, Informative)
Do they have some magically awesome chipset that has never graced the consumer market?
Yes, Larrabee [anandtech.com]. It's a massively multicore x86 processor designed to act as a GPU (with some fixed-function GPU hardware tacked on).
In effect, Intel intends to build a GPU powerful enough to get software rendering (and all the flexibility and power that brings) up to the same speed as hardware-accelerated rendering. Intel is also going to be providing OpenGL/Direct3D abstraction layers so that existing games can work.
Larrabee is expected to at least be competitive with nVidia/AMD's stuff, although it might take until the second-generation product before they're on equal footing.
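To make the software-rendering idea concrete, here is a toy sketch (my own Python illustration, not anything Intel ships; a real Larrabee pipeline would be vectorized native code on the chip itself): carve the framebuffer into tiles and let a pool of general-purpose cores shade them in parallel.

from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT, TILE = 256, 256, 64

def shade_tile(origin):
    # Trivial placeholder "shader" for one 64x64 tile; a real renderer would
    # rasterize triangles or trace rays here, all in ordinary CPU code.
    ox, oy = origin
    return [(ox + x, oy + y, (x ^ y) & 0xFF)
            for y in range(TILE) for x in range(TILE)]

if __name__ == "__main__":
    tiles = [(x, y) for y in range(0, HEIGHT, TILE)
                    for x in range(0, WIDTH, TILE)]
    with ProcessPoolExecutor() as pool:  # roughly one worker per core
        shaded = list(pool.map(shade_tile, tiles))
    print("shaded", sum(len(t) for t in shaded), "pixels across", len(tiles), "tiles")

The shading itself is deliberately silly; the point is that the whole pipeline is ordinary software, so nothing stops a developer from swapping the rasterizer for a ray tracer or anything else.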
Re: (Score:2, Interesting)
Yes, Larrabee [anandtech.com]. It's a massively multicore x86 processor designed to act as a GPU (with some fixed-function GPU hardware tacked on).
So they are taking a general-purpose CPU and using it as a GPU? Now why do I get the feeling that this is going to be suboptimal compared to a specialised processor? Also, I would hate to see the power consumption for the graphics capability you get out of it.
Based on what Intel has released thus far, I will not believe it until I see it.
Re: (Score:2)
I don't think performance will be suboptimal, but my worries are parallel to yours on power.
One of the major reasons for using a specialized chip is that you don't waste energy on the GP stuff you don't need.
Re:Because when I think graphics, I think intel (Score:4, Informative)
No.
It's designed to be used as a GPU:
http://en.wikipedia.org/wiki/Larrabee_(GPU) [wikipedia.org]
Looks really nice. I am looking forward to seeing what the finished product can do. The graphics market could use another competitor at the high consumer end.
Re: (Score:2)
Sony is taking a huge gamble here - first of all, without MS as a partner, they will need to develop their own graphics drivers or use whatever proprietary drivers Intel develops for them, which will make porting to other platforms a pain (if it's possible at all). MS is the only platform vendor that has announced they are writing real-time ray-tracing drivers (in the DX11 API due in March).
Ray tracing requires high memory bandwidth because it needs to be scene-aware - that means Sony will likely be using the newest
Re: (Score:2)
Developers will need to rewrite core libraries or purchase them. Want soft shadows? Buy it or re-develop in house because it isn't a default ray tracing feature and requires casting more (expensive) rays.
Don't make the assumption that Larrabee is only a ray-tracing engine. It should be able to do traditional polygon-based rasterization as well. That being said, the entire point is that developers are not locked into a pre-packaged graphics stack. Building the system out of programmable components lets developers write their own pipelines and graphics engines. This is the direction GPUs are moving toward anyway, with shader languages that run on, in essence, many small, specialized C
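To put a rough number on the parent's point that soft shadows cost extra rays, here is a toy visibility estimate (my own Python sketch, not taken from any real engine): each shading point fires several shadow rays at an area light instead of one, so the ray budget grows with the sample count.

import math, random

def ray_hits_sphere(origin, direction, center, radius):
    # Standard ray/sphere intersection; direction is assumed normalized.
    oc = [origin[i] - center[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return False
    root = math.sqrt(disc)
    return (-b - root) / 2.0 > 1e-6 or (-b + root) / 2.0 > 1e-6

def soft_shadow(point, light_center, light_radius, occluder, samples=16):
    # Fraction of the area light visible from `point` (0 = fully shadowed).
    visible = 0
    for _ in range(samples):  # one shadow ray per light sample
        target = [light_center[i] + random.uniform(-light_radius, light_radius)
                  for i in range(3)]
        d = [target[i] - point[i] for i in range(3)]
        length = math.sqrt(sum(x * x for x in d))
        d = [x / length for x in d]
        if not ray_hits_sphere(point, d, *occluder):
            visible += 1
    return visible / samples

# A point partly shadowed by a unit sphere sitting between it and the light.
print(soft_shadow(point=[0.0, 0.0, 0.0],
                  light_center=[0.0, 5.0, 0.0], light_radius=2.0,
                  occluder=([0.0, 2.5, 0.0], 1.0)))

With samples=1 you get the classic hard on/off shadow; every extra sample is another full intersection test per shading point, which is exactly the cost being talked about.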
Re: (Score:2, Interesting)
Competitiveness isn't a function of generation number. Still: what statistics have you seen that compare Larrabee with something people actually use right now (ATI/nVidia)? There is this presentation [intel.com] (PDF) they gave at SIGGRAPH, which shows that performance increases as you add more Larrabee cores. Here's a graph [wikipedia.org] which may mean something. The y-axis
Re: (Score:2)
Re: (Score:2)
That said, console developers are accustomed to having to work around hardware flaws, sometimes quite severe, to get their games working. One thing seems certain: Sony is going to skimp on the memory again (for the fourth time) and
Re: (Score:2)
Re: (Score:2)
Two words: Ray tracing (Score:2)
Problem is that console producers must already have at least a broad idea of what the next gen will look like. We can assume that five years from now not only Intel but also AMD and nVidia will have ray-tracing cards. But Sony needs something *right now* that they can build their HW and SW around. Intel is currently the only one with not just an idea of how to build a ray-tracing hw/sw engine but also working prototypes of Larrabee that they can provide to Sony soon. They might go with a 2-chip solution: Larrabee for graphics + a normal 4-way
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Insightful)
Larrabee has 32 cores, so that's all right then.
There's no reason Intel can't make a high-end graphics chip; their fabrication processes alone would give them a huge advantage over ATI/NVIDIA.
If they haven't made one so far, it's because they're not really interested. They already sell more graphics chips than the competition, so why bother?
The market for top-of-the-range graphics cards is pretty small. ATI/NVIDIA make way more money from their $50 cards than their $500 cards.
Inquirer bullshit (Score:5, Insightful)
I know Slashdot isn't known for its high standards of editorial oversight, but do you really have to waste our time parroting Inquirer bullshit?
Note that the only media source claiming that Intel is designing the PS4 GPU is The Inquirer, which is wrong more often than it's right. And Sony has explicitly denied [techradar.com] the rumors.
Intel might end up designing the PS4 GPU, who knows. This story doesn't give us any more information than we had before either way, and the headline is certainly wrong in concluding otherwise.
Re: (Score:2)
on the merits, also unlikely (Score:3)
For what it's worth, on the merits this is also unlikely to be a genuine leak, even if we ignore it being from the Inquirer. They claim that they got this scoop from a "nice Sony engineering lady at CES". It's unlikely that a random Sony representative at CES would even be privy to such information, or it would've leaked by now. These sorts of decisions are generally kept pretty close to the chest, and don't leak very early unless someone fucks up. (When they do leak, it can lead to SEC investigations.)
Re: (Score:2)
Yeah, I read that and thought... if it were a "nice Sony engineering guy" trying to hit on the "hot lady reporter" offering her a scoop for some... ahem ... services, it might make more sense.
Re:Inquirer bullshit (Score:4, Interesting)
Slow down and consider the implications (Score:4, Interesting)
Re:Slow down and consider the implications (Score:4, Funny)
Cell? (Score:4, Interesting)
Re: (Score:2)
Re: (Score:2)
Parallel programming isn't that hard if you design for it (in fact, I quite like working with threads - in my own code I keep a pool for concurrent file loading while the program is executing), but for the most part, games don't need it since they are usually throttled more by the GPU or memory bandwidth than by the CPU. Most of the parallelism needed is in the GPU these days (in the form of programmable shaders).
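For what it's worth, a bare-bones version of that loading-pool pattern looks something like this (my own Python sketch; the asset files are generated on the fly just so the snippet runs, and aren't real game data):

import os, tempfile
from concurrent.futures import ThreadPoolExecutor

def load_file(path):
    # Worker: read one asset off disk in the background.
    with open(path, "rb") as f:
        return path, f.read()

# Stand-in assets; in a game these would be real level/texture files.
tmpdir = tempfile.mkdtemp()
paths = []
for name in ("level1.dat", "level2.dat"):
    p = os.path.join(tmpdir, name)
    with open(p, "wb") as f:
        f.write(os.urandom(1024))
    paths.append(p)

with ThreadPoolExecutor(max_workers=4) as pool:
    pending = [pool.submit(load_file, p) for p in paths]
    # ...the main loop keeps simulating/rendering here...
    for fut in pending:
        path, data = fut.result()  # block only when the data is actually needed
        print(os.path.basename(path), len(data), "bytes loaded")

The main thread only blocks when it actually needs a file's contents, which is why this kind of loading pool barely shows up next to the GPU and memory-bandwidth costs mentioned above.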
Re: (Score:2)
Cell is being tapped right now. Have you seen Motorstorm 2? Killzone 2? The Cell is finally being properly programmed, and it looks great. It is finally doing things the Xbox 360 can't do (about time!). It will definitely be tapped out by the time the PS4 comes out.
You don't understand the 10-year platform thing. It doesn't mean 10 years between consoles; it means that there are two active consoles at a time. The PS1 overlapped with the PS2 for 4 years, and the PS2 still gets game releases weekly.
Sony isn't talking spe
Re: (Score:2)
I have more of a feeling that this is FUD spread by someone in the anti-PlayStation camp, just like all the rumors about price drops and such. If you spread news that Sony is abandoning the current-gen system for a whole new platform, why would anyone buy one?
Sony wouldn't announce that. It smells like some kind of marketing scam.
Re: (Score:2)
Re: (Score:2)
Eww new console news already. How nice. (Score:2)
Re: (Score:2)
If the console makers actually added full keyboard and mouse support to their new consoles, they could do a lot of damage to the PC gaming market. I mean A LOT. Really, I mean it.
You do realize that a lot more software is sold for consoles than for PCs, right? The console gaming market is huge, mostly because you just buy a console and any game out for that console will work out of the box. I haven't bought a PC game in a long time, but I'm always buying new games for my PS3 or Wii. Plus, I'm the only person I know who has a computer capable of playing the latest games (most people I know have a cheap Dell with integrated video that couldn't handle something as simple and old as Quake II
Why Intel? Because IBM screwed Sony... (Score:5, Informative)
From the article:
How Sony inadvertently helped a competitor and lost position in the videogame market.
Read here: http://online.wsj.com/article/SB123069467545545011.html [wsj.com]
Enjoy,
Re: (Score:2, Informative)
The point of that WSJ piece is that the Xbox 360 CPU and the PS3 CPU are the same because they both come from IBM?
It's bull. The Xbox 360 CPU is a totally different PowerPC architecture. I'm amazed you posted that.
C
CPU? (Score:2)
"...also take part in the CPU design of the console"
Wait... why did Sony spend all that time, money, and research on making their supposedly super-scaling, awesomely powerful Cell processor if they're thinking of creating a new CPU for their next console? Am I missing something there?
Lucky for nVidia (Score:3, Insightful)
Gee, over the past few years the news has been all about how doing a GPU for a console not only lost you money but also pulled resources away from the profitable PC market, and the last few times ATI and nVidia have traded first place in that market have been attributed to this.
Intel needs any kind of GPU win, badly, and they're big enough and rich enough they can afford to lose money on each chip and make it up on volume.
It's Sony I worry about, given how utterly appalling Intel GPUs have traditionally been.
So I gotta wonder, why do you think nVidia is worried about this?
backward compatibility (Score:2)
The PS4 will blow away ANY netbook in graphics! (Score:2)
Intel is going to design their GPU... that's nice. I suppose they're aiming to find a way to waste more money and bomb harder than the PS3... if Sony can outdo themselves this time, they'll never have to make another console again!
Nail in the Coffin (Score:2)
Glad to see Sony continues to make bad decisions.
Console timing strategy (Score:4, Interesting)
I'm expecting to see the PS4 come out at least 3 years after the Xbox3. For one thing, the PS3 has technical advantages over the 360 which should give it more legs. Sony designed the PS3 with a target of a ten-year lifespan.
Also, Sony is really stinging from the cost of trying to compete with a same-generation Xbox. They should be able to hit a sweet spot by spacing their machine half a generation away. When the Xbox3 is released, Sony can drop their PS3 prices to very low levels and capture the large, budget-minded segment of the market. After 3 years, once the Xbox3's newness has worn off, Sony can release a system which is technically unmatched, and which Microsoft won't be able to respond to for another 2-3 years.
Anyway, that's what I'd do if I ran Sony. It will be interesting to see how it plays out.
hmm (Score:2)
I really hope they aren't already working on the PS4; spending money on R&D of this nature in this economic climate seems almost brutally wasteful.
What IDIOCY! (Score:5, Informative)
No, not the article.
The editor.
First off, Sony denied this already -- yesterday [techradar.com]. So not only is this not news, it's news that has already been rejected.
Secondly, what kind of idiot links to the Inquirer as a source? Remember, they're the ones who posted the article claiming the PS3 was "slow and broken" because they didn't understand a memory bandwidth chart.
im full of jokes today (Score:5, Funny)
that should be Playstation 3.99967873 :P
Re:im full of jokes today (Score:5, Insightful)
Re: (Score:3, Insightful)
I think that they made a huge mistake by not keeping the same architecture for the PS3 as the PS2. If they had kept the same architecture, while only increasing the processor speed and upgrading the graphics chip (while still adding new OpenGL lighting and shading effects, etc.), they could have easily made the PS3 FORWARD compatible, like many of the Xbox games are. Not only that, but every developer on earth knows how to prog
Re: (Score:3, Interesting)
This could be a long post, but to sum it up as best I can...
When Sony started to think about the PS3, they talked to their TOP developers and asked them what it needed to be able to do. We can assume those companies were their internal development staff, Konami, Square, whoever makes GT. Those companies wanted a TON more performance, and as such Sony couldn't just deliver a beefed-up PS2. Sony mentioned in one of their articles that these developers demanded close to 100X the performance of the PS2. Sony
Re: (Score:3, Insightful)
History: I put on my robe and judge's wig.
Re: (Score:2)
I really don't want to know where you're going to put that gavel.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
No, and not true.
NVidia has tried-and-true tech, but it may not be better than:
http://en.wikipedia.org/wiki/Larrabee_(GPU) [wikipedia.org]
Intel understands chips very well, and that includes GPUs.
There are two places they could fail.
1) They don't fully utilize Larrabee's abilities in the drivers... unlikely.
2) They get eaten alive by the graphics marketing.
Another possibility is that it gets pulled due to the economic downturn. I consider that highly unlikely given where they are in development, and their momentum.
The Cell architecture i
Larrabee doesn't exist, yet (Score:3, Insightful)
It's kind of weird to take Larrabee as evidence of Intel having successfully produced a GPU, since they still haven't produced it, despite years of hype. It might turn out to be as excellent as they claim. Or it might turn out to be about as revolutionary as their last revolutionary new architecture, Itanium.
Re: (Score:2)
Cell is weird and hard to program. Larrabee is x86.
Re: (Score:2)
The singularity will be run on x86.
Great news: the more cores, the more singularity!