Nvidia Walked Away From PS4 Hardware Negotiations 255
An anonymous reader writes "Tony Tamsai, Nvidia's senior vice president of content and technology, has said that providing hardware for use in the PlayStation 4 was on the table, but they walked away. Having provided chips for use in both the PS3 and the original Xbox, that decision doesn't come without experience. Nvidia didn't want to commit to producing hardware at the cost Sony was willing to pay. They also considered that by accepting a PS4 contract, they wouldn't have the resources to do something else in another sector. In other words, the PS4 is not a lucrative enough platform to consider when high-end graphics cards and the Tegra line of chips hold so much more revenue potential."
Wonder what they told MS (Score:4, Funny)
You can bet MS has approached them on providing chips for Durango too. I wonder if they told *them* to piss off.
Re:Wonder what they told MS (Score:4, Interesting)
Re: (Score:2)
Messages are mixed; the rumour mill is convinced it's going to be AMD... but there are few credible sources.
Re:Wonder what they told MS (Score:5, Interesting)
Can you? One of the reasons the original XBox was pulled off the market as soon as the 360 came out (and no slim was ever made) was because nVidia reportedly refused to do a die shrink or combine dies, etc. So MS was left with a big, hot, expensive chip while Sony was able to shrink theirs and lower their costs dramatically.
MS might still hold a grudge on that one.
Re:Wonder what they told MS (Score:4, Insightful)
Can you? One of the reasons the original XBox was pulled off the market as soon as the 360 came out (and no slim was ever made) was because nVidia reportedly refused to do a die shrink or combine dies, etc. So MS was left with a big, hot, expensive chip while Sony was able to shrink theirs and lower their costs dramatically.
MS might still hold a grudge on that one.
No sane business holds grudges like that. If MS wants it, it'll be written into the next contract and either nVidia will agree or not get the contract.
Re: (Score:2, Insightful)
No sane business holds grudges like that.
The electronics industry isn't sane, then.
Re: (Score:2)
No sane business holds grudges like that. If MS wants it, it'll be written into the next contract and either nVidia will agree or not get the contract.
Apple anyone? Rumor was Apple was going with Nvidia. Nvidia announced that it had a deal with Apple, and then Apple (well, Jobs) killed the deal. Why? Apple has to announce things on Apple's schedule, i.e. at some hyped Apple event.
Many businesses are run by what many would consider to be not sane people. Sometimes that helps the business and others it hurts the business.
Re: (Score:3)
Apple anyone? Rumor was Apple was going with Nvidia. Nvidia announced that had a deal with Apple and then Apple (well Jobs) killed the deal.
Major breach of an NDA is a pretty good reason to go with another supplier, not only for Apple.
Re: (Score:3)
Or you could be completely wrong; it was ATI that did this: http://www.zdnet.com/ati-suffers-wrath-of-jobs-3002080337/ [zdnet.com]
Re: (Score:2)
Can you? One of the reasons the original XBox was pulled off the market as soon as the 360 came out (and no slim was ever made) was because nVidia reportedly refused to do a die shrink or combine dies, etc. So MS was left with a big, hot, expensive chip while Sony was able to shrink theirs and lower their costs dramatically.
MS might still hold a grudge on that one.
The Xbox was pulled because MS wanted everyone to buy an Xbox 360. Why? Because they stood to make more money off new releases, which would be for the Xbox 360, than they would off any original Xbox games being bought. Granted, they still made money off Xbox Live with the original Xboxes, but not enough, I imagine, to keep producing them.
I think Sony was the only company still making machines for the previous generation while games were still being made for it. Think it was a year or so after the PS3 was out that new PS2 games finally stopped coming.
Re: (Score:2)
Exactly, I think everyone forgets how Microsoft works. Another fine example: when they end support for an OS, they remove everything from their website that would be of any help to you. It's Microsoft's policy to hook you and force you to upgrade by removing even the most basic self-help. You guys should know that by now, really.
As for the article, it simply makes sense that AMD offers a fully integrated solution with a low price. And it's a good deal for AMD also.
Re: (Score:2)
Think it was a year or so after the PS3 was out that new PS2 games finally stopped coming
Much, much longer than that - new PS2 games were still coming as recently as Q4 2012!
Re: (Score:3)
New PS2 games were made for about 5-6 years after the PS3 came out.
Six years after the PS3s release, there is still an occasional game being released for the PS2, such as the Final Fantasy XI expansion Seekers of Adoulin, which will release March 27 of next year.
http://www.extremetech.com/gaming/144342-13-years-after-the-playstation-2-changed-the-industry-sony-finally-halts-production [extremetech.com]
Re: (Score:2)
Similar to the claim the parent made, my understanding is also that the Xbox was pulled so quickly and replaced by the 360 because of the infeasibility of doing a die shrink to make a higher margin (or more accurately some margin rather than significant loss) design with the original Xbox.
My understanding is that Microsoft did not procure the rights to the implementations of either the CPU or the graphics chip used in the original Xbox. This was presumably because the Xbox was rushed and/or MS was not familiar w
Console margins can't be good (Score:5, Insightful)
You have to provide lots of parts at low cost, and they will surely write in a lower price for each continued year of the console. That means you are tying up fab time on something that is on an outdated process a few years down the road.
On the other hand AMD had to do this, they need the money so any margin is likely acceptable.
Re:Console margins can't be good (Score:5, Insightful)
Money, yes, but possibly also market share. People currently often write and test games only on Nvidia hardware, and then, if it does not break totally on AMD cards, consider it done. With the differences between the cards, this gives Nvidia a performance advantage in all games written this way, although I have no idea how much. AMD just turned the tables for all games written originally for the PS4; quite a win for PC ports of console games too, I expect.
Re:Console margins can't be good (Score:5, Funny)
... the Wii run[s] AMD (well ATI) GPUs
Being rather generous, aren't we?
If anything, I'd say the Wii 'casually strolls' the GPU. 'Run' is taking it a bit far.
Re: (Score:2)
The AMD GPU is like a corgi. You might take it out for a walk, but it has to run to keep up.
Re: (Score:2, Informative)
Developers develop on NVIDIA because their drivers are better. Flat out better. More compliant, reliable, etc. This has been true for a long time... id Software's Carmack wrote about this years ago, and the situation has not improved since then.
Re: (Score:2, Insightful)
Fixed.
Re: (Score:2)
That hasn't been my experience. I've simply had better reliability from AMD's offerings (on Linux) than Nvidia's. But everyone is entitled to their opinion. And by the way, Carmack also said that Linux wasn't viable as a gaming platform, so... Gabe disagrees.
Re: (Score:2)
And still I will not buy an AMD card.
Their video drivers suck. Their Linux drivers are a total joke. So here I will play using my AMD CPU and NVIDIA card.
Re: (Score:2)
I've never had problems with the ATI drivers I download direct from the source... perhaps you're thinking of the Windows Update variant, that does inevitably break, well, everything?
My 6500HD seems to work fine on Linux, although I admit I just installed Steam on that box the other day and haven't had a chance to test it.
Re: (Score:2)
Which is why I usually stay one generation behind on video cards when I am looking for a new one. For what I play I do not need the latest card. I have the specs on what I need, and a range of cards meet that spec. Also, the drivers have usually had the bugs worked out by then.
Then again, I left the epenis-waving contest a long time ago. I have found that stability goes a long way toward good gaming. I also no longer play FPS games. Maybe today's FPS games need the latest card to run well. My newest card is an AMD 6950 2GB mo
Re: (Score:2)
You're dead accurate with the stability sermon there. Let me tack onto it.
Factor in all of the components that cause instability. Cheap RAM (fewer layers on the PCBs than the good stuff) and power supplies (meaningless 80+ labels and poor performance under spiky loads/bad power) and rebranded video cards (same PCB issue) are huge culprits for system instability, and they do in fact market these cheap parts in the gamer segment with fancy packaging and promotional deals like free games.
I have a tip for the D
Re:Console margins can't be good (Score:4, Informative)
A couple of years ago, we had an intern join us at work. Towards the end of the internship (computer animation), she asked me what home-built PC I'd recommend for about £700. I wrote a spec that was something along the lines of:
- a decent 24" monitor
- a £35 case + soundproofing
- a mid-range modular PSU (supported SLI if she needed it)
- 120mm heatsink + fan
- a pair of HDDs (for RAID0; SSDs were too small and too expensive at the time)
- 8GB DDR3 1600MHz RAM (I'd have gone for 1866, but it was too expensive)
- An Asus motherboard
- An AMD Athlon X4
- A graphics card for about £100.
She posted the spec on facebook, and suddenly a small army of 20-year-old students responded with: ZOMG! That CPU is SHIT! You're wasting your money! Get an i7! You don't need to buy a heatsink, you get one with the CPU! Why are you spending *that* much on a case and PSU, you can get both of them here for £25!! You can buy cheaper RAM than that! You can get a cheap 24" monitor for £100, what are you thinking!!! etc, etc.
She asked other people at work for their opinions (all people in their 30s), and they all pretty much said the same thing as me. Invest money in the stuff you're going to live with for years (monitor, case, PSU, etc.), and skimp on the stuff that is easy to replace (CPU/GPU). I think she kinda trusted our opinion a bit more than her classmates', so eventually she went with that system.
A couple of weeks later, I went to help her build it, and that was absolutely hilarious. The same students who'd been suggesting that she was wasting her money, all came out with things like "That computer is so quiet! My computer sounds like an airplane taking off!", or "Jesus! That thing boots so much faster than my i7!". Last time I spoke to her, she'd just upgraded it to an 8 core AMD chip for a little over £80.
Cheap components are a good thing, but PC builds that compromise on quality are not.
Re: (Score:2)
AMD discontinued my motherboard's integrated Radeon chip a year after I bought it.
Meanwhile my GT220 from 5(?) years ago still runs the latest nVidia software and is powering my HTPC.
Re: (Score:3)
Did you buy a cheap motherboard with an ancient GPU on it to start with?
I am not saying that AMD didn't screw you over; it just seems that without the full back story your statement doesn't carry a lot of weight.
Re: (Score:2)
I salute anyone going all out on multi-threading support, it's not easy. CPUs aren't really getting (much) faster so that's all you can do once you've offloaded everything you can to the GPU.
Re: (Score:3)
If nVidia was this small minded, they deserve whatever they get.
Having all games (and thus their ports) on millions and millions of Xbox and PS consoles designed and optimized for your specific hardware for the next 10 years is worth money. Any profit they actually get is just icing.
Re: (Score:2, Insightful)
Yea, but that optimization only lasts a few months before it's totally outdated, usually before the console hits the stores. The embedded solutions, on the other hand, have a much higher rate of product rotation, meaning you can get the latest and greatest out to customers without tying up fab capacity on a 10-year-old design for systems that usually only have high sales within the first couple of years.
Re: (Score:2)
Please tell me more about how optimizing games to be properly multithreaded will only last a few months and then be outdated.
Re:Console margins can't be good (Score:5, Funny)
Having all games (and thus their ports) on millions and millions of Xbox and PS consoles designed and optimized for your specific hardware for the next 10 years is worth money. Any profit they actually get is just icing.
Quick, better call Nvidia and tell them this before they make a terrible, terrible mistake! Just say you're calling from Slashdot - they'll put you straight through to the CEO.
Re:Console margins can't be good (Score:5, Insightful)
You're making the assumption that they thought about this. The people involved in the decision probably numbered in the dozens tops, with most of them marketing and finance people. With the way companies seem to be run to realize maximum profits in the short term these days, it's even possible they realized this but turned down the long term gain anyway.
Re:Console margins can't be good (Score:5, Insightful)
You're making the assumption that they thought about this. The people involved in the decision probably numbered in the dozens tops, with most of them marketing and finance people. With the way companies seem to be run to realize maximum profits in the short term these days, it's even possible they realized this but turned down the long term gain anyway.
Given the fact that we're talking about AMD and Nvidia, my guess is that it was a thoughtful decision.
Given that they have walked away before, that AMD is in previous consoles, and that everyone (tech world and Wall Street alike) is continuously crying that AMD is near its end (even though it's not), it sounds like they might have made a good decision.
AMD is going to spend a lot of time making a low margin product that is going to be outdated next year but one that they have to keep spending resources and time on for years. Nvidia is going to be spending their time on supercomputer applications, drivers, and pushing their image as a higher end card.
Sometimes you walk away from a business deal because you want your competitor to win it.
Re: (Score:2)
That's most likely in this case, but I still don't think it's guaranteed. Companies make bad, short sighted decisions all the time.
Re: (Score:3)
You're making the assumption that they thought about this.
More specifically, I'm making the assumption that Nvidia, the multi-billion-dollar company, thought about this deal harder and for longer than the kind of Slashdotter who likes to chip in on these stories a few minutes after reading about them.
Re: (Score:2)
Or they DID think about this.
Of the last gen consoles, two went ATi/AMD - Xbox360 and Wii. One went nVidia - PS3 (the RSX). nvidia was involved in the gener
Re: (Score:2)
Yet, do you disagree?
Very likely the "mistake" is out of their hands, and there's nothing a CEO can do about it other than build a time machine, go back in time, and either A) prevent AMD from buying ATI, B) buy ATI themselves, or C) somehow convince Intel to buy nVidia; then go forward in time and place a competitive bid they couldn't have made without the advent of time travel.
Re: (Score:3)
| Having all games (and thus their ports) on millions and millions of Xbox and PS consoles designed and optimized for your specific hardware for the next 10 years is worth money.
The problem is that AMD changed graphics card architecture with the HD 3xxx series, meaning that free console optimization only exists for the 1xxx and 2xxx series.
Profits are the whole point (Score:2)
Having all games (and thus their ports) on millions and millions of Xbox and PS consoles designed and optimized for your specific hardware for the next 10 years is worth money. Any profit they actually get is just icing.
Profit is never "just icing". Profit is the entire purpose of a for-profit company. No (sane) business exists just to make revenue; they have to do more than break even on the job.
Re: (Score:3)
AMD has another advantage in this sort of business. Since they no longer own their own fabrication plants, they can simply contract this out to another fabrication plant if it becomes a constraint on their first choice fabrication vendor.
Uh, they could always have done that. No one forced them to use their own fabs for all their chips.
While flexibility is an advantage, being totally reliant on third-party suppliers is not.
Re: (Score:2)
Except nVidia has never had their own fabs, so that's not an advantage for AMD.
His name is Tony Tamasi, not Tamsai... (Score:5, Informative)
Just sayin'...
Re: (Score:2)
And just like that, he goes from being Japanese to Western European.
dem Economics (Score:2)
Re:dem Economics (Score:4, Interesting)
The only thing I can take from this is that the potential growth in mobile platforms far outstrips the costs associated with developing hardware for another game console platform. Like a previous comment asked, I wonder if they told Microsoft to go away as well. If they did, what does this mean in the bigger picture? Is the future of gaming on tablets?
My thought is that tablets will allow us to extend games and make them portable. For example, I would have loved to have been able to play Skyrim on the PS3 and the Tablet: The PS3 at home and the Tablet when on the road. Saved games would be synched to the cloud, similar to what Steam does today, and downloaded to the tablet so that you could pick up where you left the game. The capabilities of tablets would have to improve quite a bit before this happens, but it is coming...
Re: (Score:2)
Without some very clever thinking (or a greater acceptance among tablet users of peripherals), that is going to be a brutal UI problem...
Even between PC and console, which are practically cousins in the "lots of buttons and a pointing device" family of interface devices, you can smell a console port a mile away because of how wrong its interface feels. Some are salvageable (thank you, thank you, SkyUI!), some are basically game-breakers (sorry, GTA IV, I wanted to enjoy you...)
Tablets are a whole different kettl
Re: (Score:2)
Why would you not just hook up a PS3 controller via bluetooth to your tablet?
Re: (Score:2)
Why would you not just hook up a PS3 controller via bluetooth to your tablet?
"(or a greater acceptance among tablet users of peripherals)" Architecturally, there wouldn't be any significant barrier, and it would be the easiest thing to do. I've just never (in a nontrivial amount of observing heavy tablet-use areas) seen any peripheral use aside from keyboard/case quasi-laptop arrangements, speaker docks (mostly for smaller devices), and video dongles for projector connections. There isn't anything specifically stopping people; they just don't seem to.
Re:dem Economics (Score:5, Funny)
My thought is that tablets will allow us to extend games and make them portable. For example, I would have loved to have been able to play Skyrim on the PS3 and the Tablet: The PS3 at home and the Tablet when on the road. Saved games would be synched to the cloud, similar to what Steam does today, and downloaded to the tablet so that you could pick up where you left the game. The capabilities of tablets would have to improve quite a bit before this happens, but it is coming...
I was thinking the same things as I was playing sim city the other day....man it would be nice if this game was synched to the cloud...
Hmm... (Score:5, Interesting)
I wonder how much of the "opportunity cost/things we could have been working on instead" factor has to do with the fact that AMD is simply in a tighter spot than Nvidia, and how much it has to do with the fact that AMD already makes CPU/GPU combination packages (and seems interested in making more), while Nvidia has nothing of that sort except their "Tegra", which might be a snappy mobile part, but is fundamentally punching in a different weight class. (If nothing else, Sony's plans for 8GB of RAM get a lot uglier on a 32-bit architecture. Yes, ARM also has something PAE-like; but PAE is mostly a hack that makes running multiple independent programs on a 32-bit system with more than 4GB of RAM palatable, not something you'd want to design a game engine around.)
This isn't to say that Nvidia couldn't have done it (heck, what would buying VIA cost these days?); but Nvidia would need, essentially, an entire new flavor of product line for this job, while AMD, whether they call it this or not, is punching out a modestly customized APU, which almost certainly shares substantially with the ones they sell for PCs.
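The 4GB ceiling mentioned above is just pointer arithmetic; a quick sketch (the 8GB figure is the rumored PS4 spec from the comment, everything else is plain arithmetic):

```python
# A 32-bit pointer can address at most 2**32 bytes = 4 GiB, so a
# flat view of a rumored 8 GiB memory pool needs 64-bit addressing
# (or PAE-style windowing, which a game engine wouldn't want).
max_32bit_bytes = 2 ** 32          # bytes reachable by a 32-bit pointer
four_gib = 4 * 1024 ** 3
eight_gib = 8 * 1024 ** 3

print(max_32bit_bytes == four_gib)   # True
print(eight_gib > max_32bit_bytes)   # True
```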
Re: (Score:2)
...how much it has to do with the fact that AMD already makes CPU/GPU combination packages(and seems interested in making more), while Nvidia has nothing of that sort except their 'Tegra', which might be a snappy mobile part...
This is my guess. AMD can offer an integrated part with good performance. If the choice of a PC-like architecture had already been made (no "cell 2"), then there were two other options: An integrated Intel solution (not very good graphics), or a combination of CPU from Intel and GPU from Nvidia. This would mean more/larger assembly, and two solutions to pay for rather than one.
Re: (Score:2)
All indications are that this AMD APU has way more graphics hardware onboard than anything you'd find in a consumer part, though. This also seems to be the first proper consumer 8-core chip that AMD has produced; they've never put anything out with that sort of core count before in the consumer market that didn't use the quasi-multicore design where every set of two cores were sharing a lot of the hardware (their answer to SMT). The point is that they're already scaling this thing way the heck up from what
Re: (Score:2)
Agreed, but it also sounds like nVidia has unquestionably reached the "big corporation" stage. A scrappy startup would have found a way to make the business happen; today's nVidia says "nah, not a big enough margin", like IBM would. Some of the more interesting corps would have thrown a skunkworks or subsidiary at it if it was that thin of an effort.
The trouble is, large slow-moving corporations aren't known for innovation, which is essential in this product space. Ordinarily I'd say nVidia ought to wat
Re: (Score:2)
One thing you're missing is: how many of these consoles are they actually going to sell? Casual gamers have a lot more options with tablets and smartphones than they did when the last generation of consoles came out with really only PCs and older consoles to compete against.
Public Relations.... (Score:5, Insightful)
Why are people running a blatantly self-serving PR story?
We lost but... we didn't really want to win it anyway!
Re:Public Relations.... (Score:4, Interesting)
Why are people running a blatantly self-serving PR story?
We lost but... we didn't really want to win it anyway!
Yeah, that was what I was thinking too, of course they say that. And if they'd won instead they'd say the exact opposite and we'd hear this drivel from AMD. It's not like Sony and Microsoft had a lot of other options, who should they have gone to? Intel? VIA? PowerVR? No, if both AMD and nVidia had told them to buzz off they'd come back with a better offer. I doubt AMD sold themselves that cheap, since they knew nVidia wouldn't do that either. Just cheap enough to win, keep their volume up and live to fight another day.
Re: (Score:2)
Allegedly (Score:5, Funny)
They, Allegedly, walked away.
Without video proof, we can't be sure they didn't stroll, strut, or even ramble away.
Re:Allegedly (Score:4, Informative)
Not to mention, with phrases like "I'm sure there was a negotiation that went on," the guy just seems to be speculating about what happened, instead of, you know, being there during the negotiations.
Re: (Score:3, Funny)
Perhaps they moseyed [penny-arcade.com].
Re:Allegedly (Score:5, Funny)
Observers from the Ministry of Silly Walks have confirmed (to their disappointment) that their walk was one of the most serious ever recorded, and that they did not amble, dawdle, gambol, hustle, limp, meander, mosey, march, ramble, sashay, saunter, scamper, scurry, sidle, skulk, slink, slog, skip, stroll, stomp, strut, swagger, tiptoe, traipse. They did not even do a forward aerial half turn every alternate step with the left leg, which itself is hardly a silly walk at all.
Re: (Score:2)
And those are just the "legs" options. We have to consider wheelchairs and crutches (or even "limping away"), or rolling down the hall in a conference room chair shouting "weeeeee!" the whole way. If there was alcohol involved, crawling is certainly an option. If it was a crack team of negotiators, there may have been rappelling...
Yes, we need video.
Bullshit (Score:5, Interesting)
Considering AMD is producing the CPU chips for both platforms, and the GPU as well, it isn't surprising that nVidia "walked" away. This is the eventual benefit of AMD buying ATI: they can produce both now. I have no doubt that AMD either got special consideration or simply could offer a better bid than nVidia could.
Regardless of the profit, this would be a big feather in AMD's cap, as in "We produce both the CPU and GPU of all modern game consoles, don't you want to buy our chips?". Also, in the bigger scheme of things, if you get game developers in such numbers making games for YOUR video card on millions and millions of consoles, and those games are then ported to, say, the PC, what do you think those games will be optimized for? AMD. Which will look better? AMD. This is something that is going to change things in a pretty large way over the next 10 years.
nVidia should have paid money to be a part of this if only to prevent their rival in AMD from doing so. Perhaps they didn't have the money. More likely they think they have something that will make a difference. I doubt it.
"I'm not fired, I quit" is the sentiment I get from nVidia's statement...
Re:Bullshit (Score:5, Insightful)
This. I'm shocked no one else saw what was obvious here.
AMD is providing a unified CPU/GPU on a single die that shares the same memory and bandwidth. For Nvidia to provide a separate GPU to compete at the same performance and price would be really difficult, if not impossible.
Re: (Score:2)
But the price of the AMD solution goes up because they have to use GDDR5 instead of DDR3 for that memory pool. Estimates I have seen are 2-3x the cost versus DDR3, so it adds an extra 30-50 bucks to the BOM.
You would spend the same amount of money buying 8GB DDR3 plus 2GB GDDR5 for your GPU, and you could choose whichever combination of CPU/GPU you want! It would also mean you could use cheaper 1Gbit GDDR5 chips.
I think that Sony is betting on the unified memory architecture giving them an advantage in GP
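The GP's "2-3x the cost" claim can be turned into rough numbers; a back-of-the-envelope sketch in which the DDR3 base price is purely a guess:

```python
# Back-of-the-envelope check of the "extra 30-50 bucks on the BOM"
# claim. Assumed: 8 GB of DDR3 costs roughly $30 (a guess); the
# comment above puts GDDR5 at 2-3x that price for the same capacity.
ddr3_cost = 30.0
extra_at_2x = ddr3_cost * 2 - ddr3_cost   # +$30 over DDR3
extra_at_3x = ddr3_cost * 3 - ddr3_cost   # +$60 over DDR3

print(extra_at_2x, extra_at_3x)  # 30.0 60.0
```

That lands in the same ballpark as the comment's $30-50 estimate, given how rough the assumed base price is.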
Re: (Score:3)
if you get game developers in such numbers making games for YOUR video card on millions and millions of consoles for all games, which are ported to say PC, what do you think those games will be optimized for?
Forget the GPU and think about the end of Intel IPC ruling the CPU market. AMD just won the CPU race in the gaming market sector. Games will be written for AMD's 8-core arch from the ground up, using every possible x86 extension AMD has to offer, and compiled with something other than Intel's "let me check if you're running this on Genuine Intel so I can decide if I'll slow down the code" compiler.
Also think end of PhysX.
Re: (Score:2)
Agree.
Though games, most games anyway, tend to be limited more by the GPU than the CPU, so I can see more optimization (specialization) there. That said, in the long run you are right; I think this will eventually give AMD a bump in market share outside of consoles.
I know the CPU (and likely the GPU as well) will be a special product, so it will be interesting to see what the details actually become and what they throw in there. If they are strategic and implement some interesting things that is not supported by
Re: (Score:2)
If I was a betting man, I would go with AMD also, simply for their integration which translates into low cost. Of course it remains to be seen how revolutionary the "Steam Box" is. Could be huge or a big flop, or even a non-starter.
Re: (Score:2)
Nobody knows what a Steam Box is, let alone what hardware it uses.
Re: (Score:2)
"We produce both the CPU and GPU of all modern game consoles, don't you want to buy our chips?"
The Wii U stuck to an IBM PowerPC processor, although it does have an AMD GPU.
IBM made the processors in all three of last generation's consoles (360/PS3/Wii), what impact did that have on IBM's processor sales?
Saving Face (Score:3)
Not only that, the tech they came up with could likely be used for new laptops and set-top boxes.
I suspect it was more likely because they didn't have the level of tech needed. ATI had their APU systems lined up already, and with tweaking they're perfect for a console. I'm not sure that NVidia had anything approaching the power of these APUs drawn up (their focus has been on desktop graphics and tablets).
Rumours suggest that the 3DS was going to use NVidia Tegra-based tech, but they couldn't keep the heat down, so Nintendo went with the as-seen-in-every-bargain-bucket-Chinese-tablet Mali+ARM combination.
Re: (Score:2)
Certainly it has revenue potential, but that doesn't mean it has sufficient ROI compared to other uses of their capital to justify committing the resources. For example, I can spend today on a project that will make me $100 or on one that will make me $120, and absent compelling reasons to choose the former, I'm going with the latter.
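The opportunity-cost comparison above fits in a one-liner sketch (the two dollar figures come from the comment; the project names are made up):

```python
# Opportunity cost: with finite capacity, take the higher-return
# project, not just any project with a positive return.
# Dollar figures are from the comment above; names are hypothetical.
projects = {"console_contract": 100, "other_uses_of_capital": 120}

best = max(projects, key=projects.get)  # pick the larger payoff
print(best)  # other_uses_of_capital
```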
Re: (Score:2)
The 3DS GPU isn't Mali but something developed by a Japanese company, I don't believe it's been used in anything else at this point.
Revenue is not the important bit (Score:2)
60million units doesn't have revenue potential?
Wrong question. The correct question is whether it has profit potential. Revenue is just how much you sell; profit is how much you keep: Profit = Revenue - Expenses. If the revenue is not sufficiently larger than the costs, there is no point in being in that line of business.
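To make that formula concrete, a toy calculation (all figures invented for illustration; nothing here reflects actual Nvidia or console economics):

```python
# Toy illustration of profit vs. revenue, with invented numbers.
# A huge-revenue, thin-margin deal can still be the worse choice.

def profit(revenue: int, expenses: int) -> int:
    """Profit is what you keep: revenue minus expenses."""
    return revenue - expenses

# Hypothetical console contract: large revenue, razor-thin margin.
console_deal = profit(revenue=1_000_000_000, expenses=980_000_000)

# Hypothetical GPU/Tegra line: a tenth of the revenue, fat margin.
gpu_line = profit(revenue=100_000_000, expenses=60_000_000)

print(console_deal)  # 20000000
print(gpu_line)      # 40000000
```

The console deal books ten times the revenue yet keeps half the profit, which is the commenter's whole point.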
Native multithreading, physics with DirectCompute. (Score:2)
Forget about AMD GPU optimization. Specific shader optimization accounts for maybe 10%.
Think about future games being natively written with 8 cores in mind. No more buying 2 Nvidia cards to see some PhysX sparks; all games will use Havok, Bullet, or some other physics library computing on AMD GPUs.
The Man Who Said No To Sony (Score:2)
There seems to be a lot of similarities to the Snapper Lawnmower story [slashdot.org].
"Jim Wier believed that Snapper's health -- indeed, its very long-term survival -- required that it not do business with Wal-Mart. "
no brain decision (Score:3)
Low profit plus opportunity cost equals a bad decision. Nvidia made the right business choice. That capacity can now be used for more profitable products.
Re: (Score:2)
I'm guessing there was a non-compete clause in there which would have stopped nvidia from selling to Valve for the ValveBox.
Then why didn't Sony put a non-compete clause on AMD, stopping AMD from selling to Microsoft for Durango?
Re:Good Move. (Score:5, Funny)
I recently read a comment in slashdot that had a bizarre structure. The author gave his opinion by telling a story about how he gave that same opinion to a friend of his.
Re:Good Move. (Score:5, Funny)
This one time, at band camp, I stuffed a Slashdot comment up my flute.
Re: (Score:2)
Samsung has a similar contract to make all the processors for every iPhone, iPad, etc.
Man, that contract must suck. It provides 9% of all Samsung revenue.
Re: (Score:2)
It might, if the alternative was to shift that market to the GS4 and make more margin on fewer parts.
Re: (Score:2)
Even though these companies compete with finished products, I believe it is a good deal for both companies for Samsung to produce chips for Apple, which is precisely why the deal persists despite their legal wrangling.
It would hurt Samsung to suddenly drop 9% of their revenue. And because they're competing fiercely, they can't suddenly make a higher margin on fewer parts in a tight market.
Apple has shopped and can't find a better supplier, which is why they still use a company they hate. They get a part they
Re: (Score:2)
The more margin comes from selling their own devices. Those they get markup on at the retail level, not wholesale.
Re: (Score:2)
I understand, but again they're in a tight market with Apple on the finished phones.
I don't believe they'd sell more phones to see a 9% increase in overall company revenue simply by not making processors for Apple.
Apple has a decent profit margin per product and tons of cash. If Apple had to pay slightly more per proc through another supplier, they'd likely have to eat the difference.
You're suggesting Samsung drop 9% of their total revenue to slightly screw over Apple. But it wouldn't really benefit Samsung
Re: (Score:3)
BTW, see today's news as a further example of why Apple is having difficulty moving away from Samsung.
http://arstechnica.com/apple/2013/03/apple-hit-with-class-action-lawsuit-over-defective-retina-displays/ [arstechnica.com]
I expect the two to continue to partner for some time.
Re: (Score:2)
Or maybe it's because of the exact reason stated in the summary - Sony didn't want to pay them enough money.