Adobe Is Considering Whether It Wants To Design Its Own Chips (axios.com) 115
A growing number of technology companies are trying to manufacture their own chips, cutting their reliance on Intel and other chip providers. This week Adobe pondered making a similar move. From a report: At an internal innovation conference on Tuesday, Adobe CTO Abhay Parasnis posed the matter as a question for his colleagues, noting the significant increases in performance from chips designed specifically for specialized tasks, like machine learning. "Do we need to become an ARM licensee?" he said, referring to the company whose underlying chip design is used across a wide range of devices, including computers, servers and phones.
"I don't have the answer, but it is something we are going to have to pay attention to." Later on Tuesday, Parasnis told Axios that there are a range of ways that Adobe could get deeper into silicon. "ARM does afford a model for a software company to package its technology much closer to silicon," he said, adding Adobe could do that without literally making its own chips, including by partnering with an existing chipmaker.
"I don't have the answer, but it is something we are going to have to pay attention to." Later on Tuesday, Parasnis told Axios that there are a range of ways that Adobe could get deeper into silicon. "ARM does afford a model for a software company to package its technology much closer to silicon," he said, adding Adobe could do that without literally making its own chips, including by partnering with an existing chipmaker.
Adobe? (Score:5, Funny)
Re: (Score:3)
That was the old Adobe; the new Adobe has the super-secure PDF software that has never been exploited... before it is released.
Re: (Score:2)
Once Adobe does this, they can RENT not only their software to you, but also their hardware!!
Re: (Score:2)
here at IBM we embrace our legacy.. er... ahem well some of it.
Re: (Score:2)
But Intel also brought us flash...
Gordon's Alive!!
Re: (Score:3)
Back in the early 1990s, you could buy Photoshop accelerators [google.com]. Then again, you could get the same DSP (the AT&T DSP3210) by using a Quadra 660AV or 840AV.
Re: (Score:1)
Re: Adobe? (Score:2)
What's not to love??
Re: (Score:2)
Re: (Score:2)
Hmmm, from the people who brought us Flash.
So they brought you memories of flash, too, and you have those in your computer right now, right?
Re: (Score:2)
Blame the hardware vendor who depends on bloated code to sell increases in performance.
Re: (Score:2)
Re: (Score:2)
Gates' law: every year, software becomes 40% slower.
Sucks that Moore's law has broken down, but Gates' law is going strong.
Re: (Score:2)
DRM chips? (Score:3)
As someone who is a tad miffed at Adobe for forcing a subscription model on everyone, even the enterprise, I would be hesitant at best to buy any hardware offerings because I would fear that some additional monthly subscription fee would be tacked on.
If I needed hardware for a custom mass-produced gizmo, and wasn't bound to x86/amd64, I'd probably go ARM. Yes, it does have a license fee, but the technology is widely known and debugged, tools are available, and finding multiple ARM fabs to ensure second-sourcing wouldn't be hard, so it would be easy to mass-produce widgets with ARM products. If not ARM, then RISC-V or POWER.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Just like old school cartridge games
Dubyah Tee Eff? (Score:5, Insightful)
The entire "lets design the silicon ourselves" push is because YOU'RE ALREADY USING SILICON, just paying someone else for 100% of the work, and the design is generic not customized for your use-case.
If you're a company which has NO HARDWARE PRODUCTS (not even rumors on the horizons) thinking "hey maybe we should license ARM, it worked for Apple" is the WORST KIND OF CORPORATE DRUG INDUCED NIGHTMARE.
Re:Dubyah Tee Eff? (Score:5, Insightful)
For just about anybody else, I'd agree with you... For Adobe, though, it kinda makes sense.
Adobe's cash cow is the media industry, and one of their biggest performance bottlenecks is video rendering. While not a particularly large market, having a premium hardware product that improves rendering speed is worth quite a lot of money to certain companies. I expect that's what Adobe is looking to capture with this push, with a model that would look very similar to how Bitcoin miners operated: Plug in an ASIC as a coprocessor, and it will handle the application workload.
Re: (Score:2)
Re: (Score:1)
Rendering in the film/video industry is not the same rendering you get in the gaming/computer industry. Once an editor is finished editing a film, they have to render it. This process stitches all the edits and effects together into one video file. That file can then be burned to a DVD or uploaded online or whatever. Rendering a 90-minute feature film with just simple edits and cross-fades can take 90 minutes on average hardware you or I might own. Production houses can get that render time lower with big
Re: (Score:1)
But will Adobe be able to do this better than Nvidia/AMD?
They are serious competitors with lots of accumulated know-how in graphics. Beating them is a tall order.
Re: Dubyah Tee Eff? (Score:1)
I'm no expert on this, so this might be wrong, but would it not make more sense to just render one master at the highest resolution you would need and then downscale for whatever versions you need afterwards, instead of doing the fades etc. separately for each version? I would think that a downscale operation would be quicker than doing fades etc., not to mention SFX/VFX renders, multiple times?
Re: (Score:2)
No, you're missing out on actual industry jargon. Rendering is the process of assembling all the bits and pieces together, and it can be a very time- and resource-consuming process when done in high quality. For a movie, you might easily be working with 40 to 50 layers describing shadows, lighting, motion tracking, and depth data; then you're applying all the effects (such as noise, blur, colour changes, etc.). In some few cases, you still do it at twice the resolution you intend it to display at, and then
Re:Dubyah Tee Eff? (Score:5, Informative)
We're talking video rendering, which is almost entirely unrelated to the decoding process that's so fast (and already supported by custom silicon like inside that Roku player you mentioned).
As an AC said:
Once an editor is finished editing a film, they have to render it. This process [stitches] all the edits and effects together into one video file.
Now, note that those edits include computing special effects (like chroma key), compositing layers on top of other layers, as well as arranging different clips into one big video, then the whole result must be encoded. Typically, the video codecs are asymmetric, doing a lot more processing during the encoding step so the decoding can be faster and easier (and therefore supporting higher framerates with cheaper decoding hardware).
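To make the compositing step concrete, here's a minimal sketch of the standard Porter-Duff "over" operator in Python/NumPy. The straight-alpha convention, random layer data, and 4K dimensions are all illustrative assumptions on my part, not anything from Adobe's actual pipeline:

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb, bg_alpha):
    """Porter-Duff 'over': composite a foreground layer onto a background.

    fg_rgb/bg_rgb: float arrays of shape (H, W, 3), values in [0, 1]
    fg_alpha/bg_alpha: float arrays of shape (H, W, 1)
    """
    out_alpha = fg_alpha + bg_alpha * (1.0 - fg_alpha)
    # Avoid division by zero where both layers are fully transparent.
    safe = np.where(out_alpha > 0, out_alpha, 1.0)
    out_rgb = (fg_rgb * fg_alpha + bg_rgb * bg_alpha * (1.0 - fg_alpha)) / safe
    return out_rgb, out_alpha

# One uncompressed 4K frame per layer (float32 here for the math).
h, w = 2160, 3840
bg = np.random.rand(h, w, 3).astype(np.float32)    # recorded footage
fx = np.random.rand(h, w, 3).astype(np.float32)    # CGI/effects layer
fx_a = np.random.rand(h, w, 1).astype(np.float32)  # effect layer's alpha matte
frame, _ = over(fx, fx_a, bg, np.ones((h, w, 1), np.float32))
```

Chain that across 40 to 50 layers per frame, dozens of frames per second, and the raw arithmetic alone makes the appeal of dedicated silicon obvious.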
4K video, in 24-bit color and uncompressed (which is really necessary to do the full compositing operation), is about 25 megabytes per frame. At 60 FPS, that's 1.5 gigabytes per second, or 12 Gbps, to use typical bandwidth units. In comparison, that's enough to saturate a couple of PCI-e 3.0 lanes, and it's a noticeable slice of a single DDR4 channel's bandwidth. That's okay, because you won't be storing that data in memory for very long anyway... 64 GB of RAM will only hold about 43 seconds of uncompressed video. During the encoding process, you'll want to keep recent frames accessible, because encoders use them as references to compress future frames more efficiently.
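Those figures are easy to sanity-check with a few lines of arithmetic (decimal units, with 3840x2160 assumed for "4K"):

```python
BYTES_PER_PIXEL = 3                    # 24-bit color, uncompressed
W, H, FPS = 3840, 2160, 60

frame_bytes = W * H * BYTES_PER_PIXEL  # ~24.9 MB per frame
rate_bytes = frame_bytes * FPS         # ~1.49 GB/s
rate_gbps = rate_bytes * 8 / 1e9       # ~11.9 Gbps

ram_gb = 64
seconds_in_ram = ram_gb * 1e9 / rate_bytes  # ~43 s of uncompressed 4K60

print(f"{frame_bytes/1e6:.1f} MB/frame, {rate_bytes/1e9:.2f} GB/s "
      f"({rate_gbps:.1f} Gbps), {seconds_in_ram:.0f} s in {ram_gb} GB RAM")
```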
That's a lot of data, all to get a seamless composition, which is really rather important for having modern CGI effects blend invisibly into the recorded footage. Without the full rendering process, the effect layers may get different handling, so they'll appear noticeably different [tvtropes.org] in the final render. In the effort to produce uncompromising results for you, the viewer, studios just take longer for rendering, spending more money on salaries so you get a better result... or they just cut corners and render at a lower resolution.
Having custom devices (and custom silicon) would mean that Adobe (or another vendor) could take advantage of things like dedicated GDDR5X memory for high-bandwidth storage (roughly 256 Gbps per chip, with lots of chips to increase capacity), ARM processors for processing special effects (though not necessarily 3D rendering in the gaming sense), and ASICs for the compositing and encoding operations, relying on the host computer only for storing the final product. In theory, a shoebox-sized peripheral could replace a data-center render farm, enabling near-real-time rendering of edited film. That means directors and production crews can see their results more quickly, allowing them more time to reshoot or otherwise make a better product.
It's certainly a commercial gamble for Adobe... but like I said, they're one of very few companies with a market position that makes custom hardware sensible.
Re: (Score:2)
If it's a trade-off between "faster" and "better", I expect most producers will choose "faster", especially for rendering various preview edits.
Of course, the magic word is really "licensing"... Adobe could license the software encoding algorithms from others and build ASICs around them, or make it possible to load a software encoder to run on Adobe's processors.
I would not expect to see uncompressed frames sent over any current PC I/O technology. Compared to the speeds possible inside a custom device, that
Re: (Score:2)
For just about anybody else, I'd agree with you... For Adobe, though, it kinda makes sense.
Adobe's cash cow is the media industry, and one of their biggest performance bottlenecks is video rendering. While not a particularly large market, having a premium hardware product that improves rendering speed is worth quite a lot of money to certain companies. I expect that's what Adobe is looking to capture with this push, with a model that would look very similar to how Bitcoin miners operated: Plug in an ASIC as a coprocessor, and it will handle the application workload.
I think their goal is rather to provide cloud rendering services, at which point their custom hardware makes more sense.
Re: (Score:2)
At the minimum he needs to be saying words to the effect of "We have done literally EVERYTHING we can to optimize both algorithms and code; the only way to make this faster is HARDWARE or an ALTERNATE UNIVERSE, and Option B is beyond our budget."
i.e. he is acknowledging
Re: (Score:2)
Re: (Score:2)
This.
Even if we ignore Adobe's historically poor grasp of security (the only company I would trust less to be in my hardware than Adobe is the NSA), there's no sane reason for them to even consider this, because they don't build hardware.
The only plausible reason that they could have for considering this would be to build some sort of special GPU optimized for Photoshop or something, and given that they would almost certainly not let anybody else develop software for such a beast (or else it would stop bein
Re: (Score:2)
Re: (Score:1)
It's a "Why would God need a spaceship?" moment.
I cannot justify creative cloud. (Score:3)
I am not sure why Adobe wants to make its own chips. They are a software company; if these chips are for their own server-farm "cloud", what real benefit will it give them? Will Creative Cloud software be reasonably priced for amateurs? For the amount of time I need their products, I cannot justify spending more than $5.00 a month for Photoshop. Anything more and it's worth my effort banging my head against The GIMP (mostly due to how little I use the product).
Back in the olden days, I would get the upgrade for $200 every 4 or 5 years. But the current pricing is much more expensive for low-volume use of the product, especially because I don't need the upgrade all the time.
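For perspective, the math on that complaint (the $20.99/month figure is my assumption for a current single-app Photoshop plan, not something from the post; substitute your own):

```python
# Old model: a paid upgrade every few years, amortized monthly.
upgrade_cost = 200.0
years_between_upgrades = 4.5  # "every 4 or 5 years"
old_monthly = upgrade_cost / (years_between_upgrades * 12)

# Subscription model (assumed single-app price; check current pricing).
cc_monthly = 20.99

print(f"old: ${old_monthly:.2f}/mo vs subscription: ${cc_monthly:.2f}/mo "
      f"({cc_monthly / old_monthly:.1f}x)")
# -> old: $3.70/mo vs subscription: $20.99/mo (5.7x)
```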
Re: I cannot justify creative cloud. (Score:3)
Simple: for DRM. You need a USB dongle with a custom crypto chip plugged in to use Adobe Cloud. As soon as the cert expires, you throw it out and buy another one at an expensive price. More money!
What people fail to realize is that Adobe has a monopoly, as they bought all their competitors. It's ridiculous. I also notice Adobe took menu items in Photoshop and made them separate products, so you are now forced to buy something for each function. Now you can't even save a file as a PDF at work without an expensive
will CC cloud cover roaming fees for license checks (Score:3)
Will CC cloud on mobile devices cover roaming, in-flight wifi, cruise-ship wifi, etc. fees for license checks?
Adobe? No thanks (Score:2)
With software, at least, you can uninstall their insecure crap. With hardware you're fucked. ...I say as I type on an Intel machine. Still, they had a pretty decent track record until the latest debacle.
Careful what you say (Score:2)
But the way this is brought up and posed, it's obvious this CTO is likely your average clueless Slashdot user. Let's not hurt his feelings.
Why? (Score:3)
Why on earth would Adobe - exclusively a software company - need to design their own silicon?
Is this a prelude to a repeat of the bad old dongle-days?
Re: (Score:2)
The only "services" they sell are hosted versions of a couple of their software packages.
Do they really need to spend huge amounts on designing, validating, producing and integrating custom silicon, combined with huge amounts on development and QA of a new branch, just to make the hosted version of Spark run a bit more efficiently?
Re: (Score:2)
Re: (Score:1)
Well, unless you did what my school did: install a printer switch box, but backwards, so that you switched it to your computer and then fired up the software.
Re: (Score:2)
Presumably for the same reason Facebook needs to design their own silicon to run a website.
Re: (Score:2)
Why on earth would Adobe - exclusively a software company - need to design their own silicon?
Ummm... because the "tax" imposed by the current silicon overlords is too much to pay? Because the current silicon is not flexible enough?
I can think of a dozen reasons. What should bother you about all of this is that it is necessary.
Or they can learn how to write software instead (Score:3)
Re: (Score:2)
I've found most Creative Cloud apps (Photoshop, Premiere, After Effects) use threads pretty effectively. One of the issues with GPU acceleration: yes, it works quite well, but it also reduces the performance of the display significantly.
One thing I'd like with CC is maybe some cloud rendering option or network rendering options like Cinema 4D has.
Comment removed (Score:3)
Re: (Score:1)
Sounds a bit expensive.
Sounds like they're dropping the consumer market
Right (Score:2)
So they want to both charge a monthly fee for 'cloud services' and lock us down with proprietary hardware?
Re: (Score:2)
You won't be locked down with proprietary hardware. In fact, if you stop paying monthly for the hardware, that goes away too!
Re: (Score:2)
Re: (Score:2)
In the 80s, Kodak sold a number of awful computers for image manipulation. They ran on some proprietary non-DOS OS, and you could never get them to run DOS because they ran off weirdly formatted 5.25-inch disks, and they cost like $2,000. They were horrible. Who would buy a proprietary box for a single use case in this era? It never even worked in the 80s, when the hardware was sufficiently horrendous that you could ALMOST justify such specialization.
I didn’t even know that existed until recently when I saw it on YouTube.
https://youtu.be/ABOJLR7bRIA [youtu.be]
Re: (Score:2)
Who would buy a proprietary box for a single use case in this era?
You won't be buying a proprietary box, you'll be renting it, and the box will be "in the cloud", not on your premises.
Compare to game consoles (Score:2)
Who would buy a proprietary box for a single use case in this era?
I see it as no more unusual than buying an Xbox One game console just for the latest Halo game or a Nintendo Switch game console just for Super Smash Bros. Ultimate.
Re: (Score:2)
Reminds me of the Kodak Picture Maker workstations I used when working in a small photo store. They cost a fortune and were horrible, horrible machines running (incredibly slowly) on obsolete Sun workstations. They later switched to standard PC hardware and the performance improved immensely, but that didn't happen until after I left the store sometime in 2006.
Having worked with a lot of "professional" equipment, I know very well that proprietary hardware is bad news. I once upgraded a $5,000 workstation
Chip for what? (Score:1)
Re: (Score:2)
RISC V (Score:5, Informative)
Why not RISC-V? Do companies really want to pay ARM forever? It's like, do you want to keep paying for a cloud subscription to software? Hmm... I guess in Adobe's case they're cool with it.
RISC V, or the RISK in RISC. (Score:1)
Which is the more mature market, with greater experience as well as fabbing ability?
Re: RISC V (Score:3)
I am slightly involved in RISC-V development. At the moment there exist a total of zero RISC-V multicore open designs that work, and by "that work" I mean at least capable of booting an OS. It will be competitive in a couple of years, as there is a huge community pushing for it, but right now it is not ready for prime time.
Re: RISC V (Score:2)
My point is also that they should join and contribute so it happens faster. If they go with ARM, they will become dependent on ARM.
Re: (Score:2)
At the moment there exist a total of zero RISC-V multicore open designs that work.
Well, I wouldn't call it zero RISC...
What am I missing? (Score:2)
For the mass market, they have not only multi-core CPUs, but also drivers for graphics cards. On modern PCs, that's a lot of compute power. So three questions:
- Could they really improve performance by a significant amount (better be at least 3x), on custom hardware?
- Are there a lot of power users who would shell out serious bucks for that custom hardware?
- Will that be enough to justify the extra development effort, to create a customized version of their products?
It seems to me that the answer to all thr
Re: (Score:2)
Buy out-of-the-money puts on Adobe!
The fact that they are considering this is a terrible sign for where their execs' heads are at. They think they can become the Wang word processors of the 1970s and '80s, or the Bloomberg terminals of the '90s and '00s: leasing single-use machines for a fortune. It will never happen.
Re: (Score:2)
I hate pdfs (Score:2)
Will it have FLASH memory? (Score:2)
that's all I got.
I'm glad they are re-imagining themselves. Other than the PDF plugin, I'm not sure what they do these days. All that AI for faking photos in Photoshop, and that audio faker they have?
Really? Who will maintain it? (Score:2)
Starting a CPU project today, I would choose RISC-V (Score:2)
noooo, not that again! (Score:2)
Really, are we going back to the time when you had so many different systems and architectures which barely worked together? ...
An Atari for music, an Amiga for video, a Mac for DTP/design, a PC for office work, ...
This is NOT a new concept (Score:2)
Back in the 1980s, Apple ][ computers ran on 6502 chips. They were OK for small apps and games, but not for bigger apps. There was a company that had an office suite called Starburst, and they saw a large number of Apple ][ PCs that couldn't run the software. So they sold Zilog Z80 cards that could be inserted into a slot and run CP/M, thus letting those machines run the Starburst office suite. These cards were often sold together with the Starburst office suite for the Apple ][. The suite included...
* a li