Businesses Hardware Technology

Adobe is Considering Whether it Wants To Design Its Own Chips (axios.com) 115

A growing number of technology companies are trying to manufacture their own chips, cutting their reliance on Intel and other chip providers. This week Adobe pondered making a similar move. From a report: At an internal innovation conference on Tuesday, Adobe CTO Abhay Parasnis posed the matter as a question for his colleagues, noting the significant increases in performance from chips designed specifically for specialized tasks, like machine learning. "Do we need to become an ARM licensee?" he said, referring to the company whose underlying chip design is used across a wide range of devices, including computers, servers and phones.

"I don't have the answer, but it is something we are going to have to pay attention to." Later on Tuesday, Parasnis told Axios that there are a range of ways that Adobe could get deeper into silicon. "ARM does afford a model for a software company to package its technology much closer to silicon," he said, adding Adobe could do that without literally making its own chips, including by partnering with an existing chipmaker.


Comments Filter:
  • Adobe? (Score:5, Funny)

    by jmccue ( 834797 ) on Wednesday February 06, 2019 @12:29PM (#58079148) Homepage
    Hmmm, from the people who brought us Flash. I think I will stick with Intel, thank you very much.
  • by ctilsie242 ( 4841247 ) on Wednesday February 06, 2019 @12:39PM (#58079206)

    As someone who is a tad miffed at Adobe for forcing a subscription model on everyone, even the enterprise, I would be hesitant at best to buy any hardware offerings because I would fear that some additional monthly subscription fee would be tacked on.

    If I needed hardware for a custom mass-produced gizmo and wasn't bound to x86/amd64, I'd probably go ARM. Yes, it does have a license fee, but the technology is widely known and debugged, tools are available, and finding multiple ARM fabs to ensure second-sourcing is doable, so it would be easy to mass-produce widgets with ARM parts. If not ARM, then RISC-V or POWER.

  • Dubyah Tee Eff? (Score:5, Insightful)

    by Crypto Gnome ( 651401 ) on Wednesday February 06, 2019 @12:39PM (#58079208) Homepage Journal
    Seriously, WHAT FOR?

    The entire "let's design the silicon ourselves" push happens because YOU'RE ALREADY USING SILICON, just paying someone else for 100% of the work, and the design is generic, not customized for your use case.

    If you're a company which has NO HARDWARE PRODUCTS (not even rumors on the horizon), thinking "hey, maybe we should license ARM, it worked for Apple" is the WORST KIND OF CORPORATE DRUG-INDUCED NIGHTMARE.
    • Re:Dubyah Tee Eff? (Score:5, Insightful)

      by Sarten-X ( 1102295 ) on Wednesday February 06, 2019 @12:51PM (#58079270) Homepage

      For just about anybody else, I'd agree with you... For Adobe, though, it kinda makes sense.

      Adobe's cash cow is the media industry, and one of their biggest performance bottlenecks is video rendering. While not a particularly large market, having a premium hardware product that improves rendering speed is worth quite a lot of money to certain companies. I expect that's what Adobe is looking to capture with this push, with a model that would look very similar to how Bitcoin miners operated: Plug in an ASIC as a coprocessor, and it will handle the application workload.

      • by bob4u2c ( 73467 )
        Rendering speed? Are we talking video or full-on 3D rendering? If we're talking video, chips that can pump out 4K video at 30fps or better are a dime a dozen. I bought a cheap $50 Roku over Christmas that streams content from my media box in 4K to the TV in the kids' play room. If you're talking 3D rendering, then they would be so far behind the curve that they would have to pump probably a billion or so at it just to play catch-up. Assuming they do produce something good, then what, it's just another video
        • by Anonymous Coward

          Rendering in the film/video industry is not the same rendering you get in the gaming/computer industry. Once an editor is finished editing a film, they have to render it. This process stitches all the edits and effects together into one video file. That file can then be burned onto a DVD or uploaded online or whatever. Rendering a 90-minute feature film with just simple edits and crossfades can take 90 minutes on average hardware you or I might own. Production houses can get that render time lower with big

          • But will Adobe be able to do this better than Nvidia/AMD?
            Those are serious competitors with lots of accumulated know-how in graphics. Beating them is a tall order.

          • I'm no expert on this so this might be wrong, but would it not make more sense to just render one master at the highest res you would need, and then just downscale for whatever versions you need afterwards, instead of doing the fades etc. separately for each version? I would think that a downscale operation would be quicker than doing fades etc., not to mention sfx/vfx renders, multiple times?

        • by Shinobi ( 19308 )

          No, you're missing out on actual industry jargon. Rendering is the process of assembling all the bits and pieces together, and it can be a very time- and resource-consuming process when done in high quality. For a movie, you might easily be working with 40 to 50 layers describing shadows, lighting, motion tracking, depth data, then you're applying all the effects (such as noise, blur, colour changes etc.). In some few cases, you still do it at twice the resolution you intend it to display at, and then

        • Re:Dubyah Tee Eff? (Score:5, Informative)

          by Sarten-X ( 1102295 ) on Wednesday February 06, 2019 @03:33PM (#58080198) Homepage

          We're talking video rendering, which is almost entirely unrelated to the decoding process that's so fast (and already supported by custom silicon, like the chip inside that Roku player you mentioned).

          As an AC said:

          Once an editor is finished editing a film, they have to render it. This process [stitches] all the edits and effects together into one video file.

          Now, note that those edits include computing special effects (like chroma key), compositing layers on top of other layers, as well as arranging different clips into one big video, then the whole result must be encoded. Typically, the video codecs are asymmetric, doing a lot more processing during the encoding step so the decoding can be faster and easier (and therefore supporting higher framerates with cheaper decoding hardware).
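
          To make "compositing layers on top of other layers" concrete, here is a toy numpy sketch of the standard "over" operator applied to a stack of layers; the random layers and the layer count are just stand-ins for illustration, not anything from Adobe's actual pipeline:

            import numpy as np

            def over(top_rgb, top_a, bot_rgb, bot_a):
                # Standard "over" compositing operator (straight, non-premultiplied alpha).
                out_a = top_a + bot_a * (1.0 - top_a)
                out_rgb = (top_rgb * top_a[..., None]
                           + bot_rgb * (bot_a * (1.0 - top_a))[..., None])
                return out_rgb / np.maximum(out_a[..., None], 1e-8), out_a

            H, W, LAYERS = 2160, 3840, 40          # one UHD frame, ~40 layers as described above
            rgb = np.zeros((H, W, 3), np.float32)  # start from an opaque black background
            alpha = np.ones((H, W), np.float32)
            for _ in range(LAYERS):                # stack layers back to front
                layer_rgb = np.random.rand(H, W, 3).astype(np.float32)
                layer_a = np.random.rand(H, W).astype(np.float32)
                rgb, alpha = over(layer_rgb, layer_a, rgb, alpha)

          Even this crude version touches roughly 100 MB of RGB data per layer per frame, before any effects are computed, which is part of why the real thing keeps render farms busy.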

          4K video, in 24-bit color and uncompressed (which is really necessary to do the full compositing operation) is about 25 megabytes per frame. At 60 FPS, that's 1.5 gigabytes per second, or 12 Gbps, to use typical bandwidth units. In comparison, that will just about fully saturate a PCI-e x16 slot and some of the lower DDR4 specs. That's okay, because you won't be storing that data in memory for very long anyway... 64 GB of RAM will only store 42 seconds of uncompressed video. During the encoding process, you'll want to have that old video accessible, because it's useful for making more efficient compression of future frames.
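
          If you want to check that arithmetic yourself, here is a rough Python sketch; the 3840x2160 frame size, 3 bytes per pixel, 60 fps, and decimal gigabytes are my assumptions:

            WIDTH, HEIGHT = 3840, 2160     # UHD "4K" frame
            BYTES_PER_PIXEL = 3            # 24-bit colour, no alpha, uncompressed
            FPS = 60
            RAM_BYTES = 64e9               # 64 GB, decimal

            frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL    # ~24.9 MB per frame
            stream_bytes_per_s = frame_bytes * FPS            # ~1.49 GB/s
            stream_gbps = stream_bytes_per_s * 8 / 1e9        # ~11.9 Gbps
            seconds_in_ram = RAM_BYTES / stream_bytes_per_s   # ~43 s of footage

            print(f"{frame_bytes / 1e6:.1f} MB/frame, "
                  f"{stream_bytes_per_s / 1e9:.2f} GB/s ({stream_gbps:.1f} Gbps), "
                  f"{seconds_in_ram:.0f} s fits in 64 GB of RAM")

          That lands within rounding distance of the ~25 MB/frame, ~12 Gbps, and ~42-second figures above.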

          That's a lot of data, all to get a seamless composition, which is really rather important for having modern CGI effects blend invisibly into the recorded footage. Without the full rendering process, the effect layers may get different handling, so they'll appear noticeably different [tvtropes.org] in the final render. In the effort to produce uncompromising results for you, the viewer, studios just take longer for rendering, spending more money on salaries so you get a better result... or they just cut corners and render at a lower resolution.

          Having custom devices (and custom silicon) would mean that Adobe (or another vendor) would be able to take advantage of things like dedicated GDDR5X memory for high-bandwidth storage (256 Gbps per chip, and lots of chips to increase capacity), ARM processors for processing special effects (though not necessarily rendering them, in the non-video, usually-3D sense), and ASICs for the compositing and encoding operations, only relying on the host computer for storing the final product. In theory, a shoebox-sized peripheral could replace a data center render farm, enabling near-real-time rendering of edited film. That means directors and production crews can see their results more quickly, allowing them more time to reshoot or otherwise make a better product.

          It's certainly a commercial gamble for Adobe... but like I said, they're one of very few companies with a market position that makes custom hardware sensible.

      • For just about anybody else, I'd agree with you... For Adobe, though, it kinda makes sense.

        Adobe's cash cow is the media industry, and one of their biggest performance bottlenecks is video rendering. While not a particularly large market, having a premium hardware product that improves rendering speed is worth quite a lot of money to certain companies. I expect that's what Adobe is looking to capture with this push, with a model that would look very similar to how Bitcoin miners operated: Plug in an ASIC as a coprocessor, and it will handle the application workload.

        I think their goal is rather to provide cloud rendering services, at which point their custom hardware makes more sense.

      • When the CEO says "maybe we should use ARM, everyone else is" rather than "our code sucks dead donkey's balls; a well-optimized algorithm and really tight code is ALWAYS better than THROW FASTER HARDWARE AT IT", you know he has SHIT FOR BRAINS.

        At the minimum he needs to be saying words to the effect of "We have done literally EVERYTHING we can to optimize both algorithms and code; the only way to make this faster is HARDWARE or an ALTERNATE UNIVERSE, and Option B is beyond our budget".

        i.e. he is acknowledging
    • What for? Better and tighter DRM and licensing models among other things. I guess you never paid Adobe money for a font.
    • by dgatwood ( 11270 )

      This.

      Even if we ignore Adobe's historically poor grasp of security (the only company I would trust less to be in my hardware than Adobe is the NSA), there's no sane reason for them to even consider this, because they don't build hardware.

      The only plausible reason that they could have for considering this would be to build some sort of special GPU optimized for Photoshop or something, and given that they would almost certainly not let anybody else develop software for such a beast (or else it would stop bein

    • I don't disagree with you, but you're overlooking the bigger picture. It comes down to two simple words: lock-in. I'll bet some C-level exec declared that monthly rents weren't enough and demanded a solution to extract even more revenue from customers. Under the guise of "premium performance," too, I'd add.
    • by Tablizer ( 95088 )

      It's a "Why would God need a spaceship?" moment.

  • by jellomizer ( 103300 ) on Wednesday February 06, 2019 @12:41PM (#58079214)

    I am not sure why Adobe wants to make its own chips. They are a software company; if these chips are for their own server farm "cloud", what real benefit is that going to give them? Will Creative Cloud software be reasonably priced for amateurs? For the amount of time I need their products, I cannot justify spending more than $5.00 a month for Photoshop. Anything more and it's worth the effort of banging my head against The GIMP instead (mostly due to how little I use the product).

    Back in the olden days, I would get the upgrade for $200 every 4 or 5 years. But the current pricing is much more expensive for my low-volume use of the product, especially because I don't need the upgrade all the time.

    • Simple: DRM. You'd need a USB dongle with a custom crypto chip plugged in to use Adobe Cloud. As soon as the cert expires, you throw it out and buy another one at an expensive price. More money!

      What people fail to realize is that Adobe has a monopoly, as they bought all their competitors. It's ridiculous. I also notice Adobe took all their menu items in Photoshop and made them separate products, so you are forced to buy something now for each function. Now you can't even save a file as a PDF at work without an expensive

  • by Joe_Dragon ( 2206452 ) on Wednesday February 06, 2019 @12:44PM (#58079234)

    Will CC cloud mobile devices cover roaming, in-flight wifi, cruise ship wifi, etc. fees for license checks?

  • With software, at least, you can uninstall their insecure crap. With hardware you're fucked. ...I say as I type on an Intel machine. Still, they had a pretty decent track record until the latest debacle.

  • But the way this is brought up and posed, it's obvious this CTO is likely your average clueless Slashdot user. Let's not hurt his feelings.

  • by YuppieScum ( 1096 ) on Wednesday February 06, 2019 @12:49PM (#58079266) Journal

    Why on earth would Adobe - exclusively a software company - need to design their own silicon?

    Is this a prelude to a repeat of the bad old dongle-days?

    • by bob4u2c ( 73467 )
      But the dongle worked like gangbusters! No way anybody could defeat that.

      Well, unless you did what my school did: install a printer switch box backwards, so you switched it to your computer and then fired up the software.
    • Presumably for the same reason Facebook needs to design their own silicon to run a website.

    • Why on earth would Adobe - exclusively a software company - need to design their own silicon?

      Ummm... because the "tax" imposed by the current silicon overlords is too much to pay? Because the current silicon is not flexible enough?

      I can think of a dozen reasons. What should bother you about all of this is that it is necessary.

  • by JoeyRox ( 2711699 ) on Wednesday February 06, 2019 @12:53PM (#58079274)
    Their creative-cloud apps are slow as molasses and only make perfunctory use of the computing resources available to them, including GPUs and multiple CPU cores.
    • I've found most Creative Cloud apps (Photoshop, Premiere, After Effects) use threads pretty effectively. One of the issues with GPU acceleration: yes, it works quite well, but it also reduces display performance significantly.

      One thing I'd like with CC is maybe some cloud rendering option or network rendering options like Cinema 4D has.

  • So they want to both charge a monthly fee for 'cloud services' and lock us down with proprietary hardware?

    • You won't be locked down with proprietary hardware. In fact, if you stop paying monthly for the hardware, that goes away too!

    • by jythie ( 914043 )
      It sounds like they want proprietary hardware running parts of their cloud services. So it would be invisible to the user and probably sitting in some data centre.
  • As an engineer, I'm intrigued by what they could do. As a consumer, I don't need another gadget, computer, laptop, or tablet. Where is this chip going to go?
    • by jythie ( 914043 )
      It would go in the data centre. It sounds like they are looking into building some custom hardware for running their services, so not something end users would have on-site.
  • RISC V (Score:5, Informative)

    by backslashdot ( 95548 ) on Wednesday February 06, 2019 @01:40PM (#58079446)

    Why not RISC-V? Do companies really want to pay ARM forever? It's like, do you want to keep paying for a cloud subscription to software? Hmm... I guess in Adobe's case they are cool with it.

    • by Anonymous Coward

      Which is the more mature market, with greater experience as well as fabbing ability?

    • I am slightly involved in RISC-V development. At the moment there exist a total of zero open multicore RISC-V designs that work. And by "that work" I mean at least capable of booting an OS. It will be competitive in a couple of years, since there is a huge community pushing for it, but right now it is not ready for prime time.

  • For the mass market, they have not only multi-core CPUs, but also drivers for graphics cards. On modern PCs, that's a lot of compute power. So three questions:
    - Could they really improve performance by a significant amount (better be at least 3x), on custom hardware?
    - Are there a lot of power users who would shell out serious bucks for that custom hardware?
    - Will that be enough to justify the extra development effort, to create a customized version of their products?
    It seems to me that the answer to all thr

    • Buy long out-of-the-money puts on Adobe!

      The fact that they are considering this is a terrible sign for where their execs' heads are at. They think they can become the Wang word processor of the 1970s and '80s, or the Bloomberg terminal of the '90s and '00s: leasing single-use machines for a fortune. It will never happen.

    • by jythie ( 914043 )
      If I understand correctly, they are not looking at producing a consumer device, but instead custom hardware optimized for their hosted services. Probably chips designed around the most processor-intensive kinds of tasks their software does, which could then be used by people running the cloud versions. So something like having a checkbox where you can run your render locally, or send it out to these special servers for a fee and get it back quicker.
  • I hate pdfs and bloaty McBloatface Reader. There, I said it.
  • that's all I got.

    I'm glad they are re-imagining themselves. Other than the PDF plugin, I'm not sure what they do these days. All that AI for fake Photoshop, and that audio faker they have?

  • Just think of the number of software bugs and ill-written zero-days they have had to patch. How many people know even a thing about patching hardware microcode? Applying hardware updates would be a bit harder to perform, and therefore more likely to be skipped by admins. Now you have a piece of vulnerable hardware hanging off the net that no one wants to be responsible for managing.
  • Why bother with soon-to-be-obsolete ISAs? It's open and royalty-free too, so go creative! ;-)
  • Really, are we going back to the time when you had so many different systems and architectures that barely worked together?
    An Atari for music, an Amiga for video, a Mac for DTP/design, a PC for office work, ...

  • Back in the 1980s, Apple ][ computers ran on 6502 chips. They were OK for small apps and games, but not for bigger apps. There was a company that had an office suite called Starburst. They saw a large number of Apple ][ PCs that couldn't run the software, so they bought a bunch of Zilog Z80 cards that could be inserted into a slot and run CP/M, thus being able to run the Starburst office suite. These cards were often sold together with the Starburst office suite for the Apple ][. The suite included...

    * a li
