Apple Reunites With iPhone Graphics Chip Partner To License Technology (theverge.com)
Apple will once again license technology from Imagination Technologies, the chip designer that used to provide graphics processors for the iPhone and iPad, the UK-based company announced today. The Verge reports: In a short statement posted on its website, Imagination said that it had entered into a multiyear license agreement with Apple, under which the Cupertino, California-based firm will have access to "a wider range of Imagination's intellectual property in exchange for license fees." Apple announced its split from Imagination back in April 2017, saying that it would start designing its own graphics chips and would stop licensing the company's technology within two years. After the split was announced, Imagination expressed skepticism that Apple could design its own chips "without violating Imagination's patents, intellectual property, and confidential information."
"Intellectual property" (Score:2)
Why work, when you can pay somebody to work once, declare the result of their work *your* "intellectual property", and use the resulting imaginary monopoly to create artificial scarcity, force arbitrary prices on the world, or censor that idea for all eternity? Then use the income to pay off the original creator, also *once*, for that work, and keep stealing money from people with your racketeering scheme forever.
And when somebody complains, tell them you are "protecting" creative people (lol).
Re: (Score:2)
Not sure I follow. The alternative is to not allow people to own intellectual property? How would that work out? Or would the workers own the intellectual property they are paid to produce? Who would hire anyone or risk their money on R&D gambles under those terms? People only buy lottery tickets because the potential return is 1,000,000 to 1 on each dollar risked. Why would someone pay somebody to invent something if they didn't own the invention?
It's 2020 already (Score:3)
Technically, Vulkan is the cross-platform API that all graphics-intensive tasks (such as games) are standardizing on(*).
(and as low-level as Apple's Metal)
But otherwise, yes, Apple should stick with standards.
Except they won't, because they very much like the walled iGarden they can lock you into.
---
(*) even more important as Vulkan is lower level and enables 3D engine devs to squeeze a bit more performance out of the hardware, which is critical for embedded devices and even more so for GPUs with slightly unusual approaches such as PowerVR's tile-based rendering.
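To make "lower level" concrete, here is a minimal sketch in C against the standard Vulkan headers: even just listing the GPUs the API can see takes explicit instance setup. (The application name and the array bound are arbitrary choices for illustration.)

    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void) {
        /* Describe the application: required boilerplate before anything else. */
        VkApplicationInfo app = {0};
        app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        app.pApplicationName = "list-gpus";   /* arbitrary name */
        app.apiVersion = VK_API_VERSION_1_0;

        VkInstanceCreateInfo ci = {0};
        ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        ci.pApplicationInfo = &app;

        VkInstance inst;
        if (vkCreateInstance(&ci, NULL, &inst) != VK_SUCCESS) {
            fprintf(stderr, "no Vulkan driver available\n");
            return 1;
        }

        /* Vulkan's usual two-call pattern: query the count, then fill the array. */
        uint32_t n = 0;
        vkEnumeratePhysicalDevices(inst, &n, NULL);
        VkPhysicalDevice gpus[16];
        if (n > 16) n = 16;
        vkEnumeratePhysicalDevices(inst, &n, gpus);

        for (uint32_t i = 0; i < n; i++) {
            VkPhysicalDeviceProperties p;
            vkGetPhysicalDeviceProperties(gpus[i], &p);
            printf("GPU %u: %s\n", i, p.deviceName);
        }

        vkDestroyInstance(inst, NULL);
        return 0;
    }

And that is before creating a logical device, queues, swapchain, pipelines, or command buffers, which is exactly the verbosity that buys the extra control.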
Re: (Score:2)
This has nothing to do with Vulkan or Metal. Those are software APIs for the hardware. This agreement covers the patents related to the hardware.
Re: (Score:2)
APIs for the software, not hardware. Wish I could edit.
Re: (Score:2)
Yeah, I usually do, but this one slipped through. It was early in the morning and I messed up as I was trying to clear up a misconception. BUT THANKS FOR YELLING AT ME OVER A TRIVIAL MISTAKE :)
Answering to the parent poster (Score:2)
I was answering the parent poster(*), who was complaining that (no matter whether Apple- or Imagination Technologies-made) the GPUs in Apple iDevices can only be accessed through Apple-specific proprietary APIs such as Metal, instead of industry standards like OpenGL.
I was merely pointing out what the current industry standard du jour is.
---
(*) Note that the parent poster is now modded at -1. Depending on your settings, it might be invisible, and thus it might look like I'm commenting on the /. submission rather than replying to the parent.
Vulkan. (Score:2)
Vulkan is not cross-platform,
It *IS* [wikipedia.org]
and is pain(t) in the ass to work with.
Yes, Vulkan is lower level. That's the whole purpose of Vulkan and its non-cross-platform cousins like Apple's Metal, Microsoft's DirectX 12, Sony's GNM, etc.
Very few devs program the GPU directly on their own budget. Vulkan is just too unwieldy.
Nowadays, most devs use middleware, engines, etc. (Unity, Unreal, etc.).
The engine devs fight the Vulkan weirdness (and leverage it to gain even better performance).
Most game devs and app devs use such engines or toolkits and only work at a higher level, as the sketch below illustrates.
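That layering is easy to picture: the engine defines a small backend interface, and only the per-platform backend implementations ever touch Vulkan or Metal directly. A toy sketch in C, with all names made up for illustration:

    #include <stdio.h>

    /* Toy engine-style abstraction: game code sees only this interface.
       Real engines hide Vulkan/Metal/D3D12 behind something similar. */
    typedef struct {
        const char *name;
        void (*draw)(const char *mesh);
        void (*present)(void);
    } GfxBackend;

    /* Stub "backends". A real one would issue vkCmdDraw or Metal calls here. */
    static void vk_draw(const char *mesh)  { printf("[vulkan] draw %s\n", mesh); }
    static void vk_present(void)           { printf("[vulkan] present\n"); }
    static void mtl_draw(const char *mesh) { printf("[metal] draw %s\n", mesh); }
    static void mtl_present(void)          { printf("[metal] present\n"); }

    static const GfxBackend vulkan_backend = { "vulkan", vk_draw,  vk_present  };
    static const GfxBackend metal_backend  = { "metal",  mtl_draw, mtl_present };

    /* Game code is written once, against the interface only. */
    static void render_frame(const GfxBackend *gfx) {
        gfx->draw("teapot");
        gfx->present();
    }

    int main(void) {
        render_frame(&vulkan_backend);  /* Linux/Windows/Android build */
        render_frame(&metal_backend);   /* macOS/iOS build             */
        return 0;
    }

So the "Vulkan weirdness" gets paid for once, by the engine devs, and everyone above that layer never sees it.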
Reunite? (Score:2)
Re: (Score:2)
The whole split was probably a ploy to reduce license fees, or maybe to force the share price down so they could buy them cheaply.
Or maybe Apple really did intend to split but failed to develop a competitive GPU.
Re: (Score:3)
Apple tried, but Imagination said no, they wouldn't sell. Then Apple announced they were pulling away (per the licensing agreement, Apple had to give something like three years' notice before pulling out).
Imagination's stock sank 70% on the news that Apple was terminating the agreement (but this was just the notice; they still had three years left per the contractual agreement). It was then that Imagination put itself up for sale.
Glorious move to Intel x86 (Score:1)
For a long time Apple boasted about being different by using the Power architecture over the more proletarian x86.
That lasted until 2006, when, with great fanfare, they announced their move to Intel processors.
The real reason? IBM got tired of Steve Jobs' annoying negotiation tactics and refused to supply them with processors anymore.
So this glorious moment of being "reunited" looks like a smoke screen for a failed attempt to build their own competitive GPU.
Re:Glorious move to Intel x86 (Score:4, Insightful)
For a long time Apple boasted about being different by using the Power architecture over the more proletarian x86. That lasted until 2006, when, with great fanfare, they announced their move to Intel processors. The real reason? IBM got tired of Steve Jobs' annoying negotiation tactics and refused to supply them with processors anymore.
That's not my recollection. Apple wanted two things: enough (and newer) chips every year for their computers, and laptop versions of those chips. The first was their problem with Motorola, as Motorola simply could not keep up production. IBM was not in a position to design new consumer chips every year just for Apple. Even if Apple was buying millions of chips, they were a small-volume customer to IBM. Internally, IBM used the PowerPC 970 for their servers, but servers didn't require new chips every year. While PowerPC did have some features meant for consumer use, newer versions of PowerPC would have had to include more consumer technology, like hardware H.264 encoding/decoding, if Apple were to keep using them. So IBM would have had to invest lots of R&D and might have had to split their PowerPC chip design into two families.
Also, IBM never could deliver a laptop PowerPC chip that Apple could use. Up until the Intel switch, Apple had to rely on Motorola's G4 chips for their laptops. While IBM's G5 chips were beasts, they were not power-efficient and ran too hot for laptop use. So if you were Apple, would you keep using a chip supplier that would not supply the chips you needed every year and never delivered a laptop version? Or would you move on to another supplier?
So this glorious moment of being "reunited" looks like a smoke screen for a failed attempt to build their own competitive GPU.
As far as I know, Apple's GPUs seem to perform well enough. I am not aware of a massive recall on them or of reports that they don't work. They may not be able to keep up with future demands (8K, 12-bit HDR), but I would not say they failed.
Re: (Score:2)
That's not my recollection. Apple wanted two things: enough (and newer) chips every year for their computers, and laptop versions of those chips. The first was their problem with Motorola, as Motorola simply could not keep up production. IBM was not in a position to design new consumer chips every year just for Apple. Even if Apple was buying millions of chips, they were a small-volume customer to IBM. Internally, IBM used the PowerPC 970 for their servers, but servers didn't require new chips every year. While PowerPC did have some features meant for consumer use, newer versions of PowerPC would have had to include more consumer technology, like hardware H.264 encoding/decoding, if Apple were to keep using them. So IBM would have had to invest lots of R&D and might have had to split their PowerPC chip design into two families.
Also, IBM never could deliver a laptop PowerPC chip that Apple could use. Up until the Intel switch, Apple had to rely on Motorola's G4 chips for their laptops. While IBM's G5 chips were beasts, they were not power-efficient and ran too hot for laptop use. So if you were Apple, would you keep using a chip supplier that would not supply the chips you needed every year and never delivered a laptop version? Or would you move on to another supplier?
You left out one critical part. Motorola wanted to exit the laptop chip business. Apple was just about the only company buying these chips, and didn't buy them at high enough volume to be worth their trouble. Instead, they wanted to move to building system-on-chip designs with everything integrated, weaker CPUs, and lots of vector processing, because almost all of their future sales were expected to come from gaming consoles and set-top boxes. These chips, however, were useless for Apple, because the actual CPU performance was nowhere near what a desktop or laptop needed.
Re: (Score:2)
The only thing I'm not clear on is why they decided to go with Intel rather than P.A. Semi, beyond the fact that going with a small, fabless startup would have been a lot riskier than going with the dominant player in the CPU market. Of course, they then turned around a year later and bought P.A. Semi to design iPhone chips for them, so... very bizarre.
The main reason they bought PA Semi was not their Power expertise; it was their staff of RISC designers. Also, while PA Semi did work on other Power chips, their PWRficient work is probably what drew Apple. If Apple had had them design their next CPUs, they might have had the opposite problem from IBM, in that they would have had decent laptop chips but mediocre desktop chips.
Re: (Score:2)
I realize that. It's just odd that they waited the extra year, when they could have had those RISC designers *and* a viable laptop chip without putting everyone through that painful transition.
From what I understand, IBM was more than willing to supply them with desktop chips, even going so far as to incorporate all of the extra vector instructions into subsequent iterations of their POWER instruction set. So I sort of doubt that was a factor, but I could be wrong.
Re: (Score:2)
I realize that. It's just odd that they waited the extra year, when they could have had those RISC designers *and* a viable laptop chip without putting everyone through that painful transition.
It still would have been somewhat painful, as laptops would use PWRficient and desktops would use IBM's PowerPC. I don't know how much Apple would have had to do to make them perform comparably.
From what I understand, IBM was more than willing to supply them with desktop chips, even going so far as to incorporate all of the extra vector instructions into subsequent iterations of their POWER instruction set. So I sort of doubt that was a factor, but I could be wrong.
I think at the end of the day, Apple thought they were looking at another repeat of Motorola. So they decided to play it safe by going with Intel.
Re: (Score:2)
They never had similar performance historically, and still don't, so I'm not sure what difference that would have made. :-)
Except that the company was small enough that they could have just been bought outright from the start.
Re: (Score:2)
I realize that. It's just odd that they waited the extra year, when they could have had those RISC designers *and* a viable laptop chip without putting everyone through that painful transition.
I wonder now if that was deliberate so PA Semi could be bought at a lower price later.