AI Cloud Hardware

Amazon Begins Shifting Alexa's Cloud AI To Its Own Silicon (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: On Thursday, an Amazon AWS blog post announced that the company has moved most of the cloud processing for its Alexa personal assistant off of Nvidia GPUs and onto its own Inferentia application-specific integrated circuit (ASIC). Amazon dev Sebastien Stormacq describes Inferentia's hardware design as follows: "AWS Inferentia is a custom chip, built by AWS, to accelerate machine learning inference workloads and optimize their cost. Each AWS Inferentia chip contains four NeuronCores. Each NeuronCore implements a high-performance systolic array matrix multiply engine, which massively speeds up typical deep learning operations such as convolution and transformers. NeuronCores are also equipped with a large on-chip cache, which helps cut down on external memory accesses, dramatically reducing latency and increasing throughput."
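
For readers unfamiliar with the term, a systolic array streams operands through a grid of multiply-accumulate units so that every unit does useful work on each cycle once the pipeline fills, which is why the design suits the dense matrix math behind convolutions and transformers. The toy Python sketch below is purely illustrative (it is not Amazon's implementation) and mimics how an output-stationary systolic array accumulates a matrix product cycle by cycle:

```python
import numpy as np

def systolic_matmul(A, B):
    """Toy cycle-by-cycle simulation of an output-stationary systolic array.

    Each processing element (PE) at grid position (i, j) owns the accumulator
    for C[i, j]. Row i of A is streamed in from the left with a skew of i
    cycles and column j of B from the top with a skew of j cycles, so the
    operand pair (A[i, t], B[t, j]) reaches PE (i, j) at cycle i + j + t.
    """
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m))
    last_cycle = (n - 1) + (m - 1) + (k - 1)  # cycle when the final operand pair meets
    for cycle in range(last_cycle + 1):
        for i in range(n):
            for j in range(m):
                t = cycle - i - j  # operand index arriving at PE (i, j) this cycle
                if 0 <= t < k:
                    C[i, j] += A[i, t] * B[t, j]
    return C

# Sanity check against NumPy's reference result.
A = np.random.rand(4, 3)
B = np.random.rand(3, 5)
assert np.allclose(systolic_matmul(A, B), A @ B)
```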

When an Amazon customer -- usually someone who owns an Echo or Echo Dot -- makes use of the Alexa personal assistant, very little of the processing is done on the device itself. [...] According to Stormacq, shifting this inference workload from Nvidia GPU hardware to Amazon's own Inferentia chip resulted in 30 percent lower cost and a 25 percent improvement in end-to-end latency on Alexa's text-to-speech workloads. Amazon isn't the only company using the Inferentia processor -- the chip powers Amazon AWS Inf1 instances, which are available to the general public and compete with Amazon's GPU-powered G4 instances. Amazon's AWS Neuron software development kit allows machine-learning developers to use Inferentia as a target for popular frameworks, including TensorFlow, PyTorch, and MXNet.
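
As a concrete illustration of that last point, here is a rough sketch of compiling a model for Inferentia with the PyTorch flavor of the Neuron SDK (the torch-neuron package). It follows AWS's published Inf1 examples, but module and function names vary between Neuron SDK releases, so treat the details as indicative rather than authoritative:

```python
import torch
import torch_neuron  # AWS Neuron SDK plugin; registers the torch.neuron namespace
from torchvision import models

# Any traceable PyTorch model will do; ResNet-50 is the stock example.
model = models.resnet50(pretrained=True).eval()
example = torch.rand(1, 3, 224, 224)

# Ahead-of-time compilation: supported operators are placed on the Inferentia
# NeuronCores, and anything unsupported falls back to the CPU automatically.
neuron_model = torch.neuron.trace(model, example_inputs=[example])
neuron_model.save("resnet50_neuron.pt")

# On an Inf1 instance the compiled artifact loads like ordinary TorchScript:
#   loaded = torch.jit.load("resnet50_neuron.pt")
#   predictions = loaded(example)
```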


Comments Filter:
  • Oftentimes, companies announce plans to replace a supplier in order to pressure it into signing long-term contracts at a favorable price. If they get the favorable price, they slow down and drag out their in-house development for as long as the supplier remains the best on the market. So even if NVidia falters, we may see Amazon silicon take over, but most likely not for another 3+ years.

  • Maybe the upgrade will mean she actually plays the tune I ask for after I have specified the exact track name and artist.
    • Alexa-speak is a bit of a verbal CLI experience. You need to work out, by trial and error, what phrases it understands. I got my Echo Dots on Amazon Prime Day for £18 each, and like them for that price, but would never spend £50+ for the experience of hearing Alexa, again and again, saying 'Sorry, I don't understand that'. I imagine they could do a 2001 edition that says 'I'm sorry Dave, but I can't do that'.

      • by cute-boy ( 62961 )
        A CLI without a command completer to help you out... The whole Alexa experience is incredibly frustrating and unproductive, and the app to configure the devices is terrible.
  • Does Alexa work without a 'Net? Stand-alone without phoning home?

    I don't really care what it runs on unless I can have possession of the entirety of it. Otherwise, I won't use it.

    A society where everyone rents everything and owns comparatively nothing is a society of serfs. It's feudalism all over again.

    • Does Alexa work without a 'Net? Stand-alone without phoning home?

      No.

      I don't really care what it runs on unless I can have possession of the entirety of it. Otherwise, I won't use it.

      Neither will I. If you're technically capable, and less lazy than I am, you can roll your own out of an nVidia Jetson, an open source neural net library like TensorFlow, and an open source English language training corpus. Plus you can get an omnidirectional ring microphone array cheap from Chinese cloners of the ones used by Amazon and Google. It's not a perfectly smooth path, but it's fairly well-trodden at this point. In the end, you get all local processing and total control of the hardware and software (a rough sketch of the offline speech-recognition piece appears after the comments below).

      • All I want is LCD posters controlled by AI to avoid uncomfortable situations.

        For example, "Alexa, I have people coming over, please replace all the hentai posters with renaissance art."

        • by EvilSS ( 557649 )
          In theory you could do that with Alexa and IFTTT, plus some work setting up a backend so the posters can be driven by IFTTT somehow.
            • One of the places I work at has its production-tracking dashboards running on Fire TV sticks plugged into cheap TVs.

              They load the screens into a photo album to display them.

              I'm sure you could do something similar with a bit of clever setup: put your photos in folders and have Alexa display them on the Fire TV sticks.

    • by EvilSS ( 557649 )
      Depends: do you only want to use it to detect when your internet is out or your Wi-Fi isn't working? Then yes. Otherwise, no.
    • by Kaenneth ( 82978 )

      Then you aren't the target audience; there is very little useful you can ask Alexa to do without network access.

      My most frequent questions: "Is it going to rain today", "How long will it take to drive to X", "Is it going to rain in X today", "What is the price of bitcoin", "How late is Y Business open", "Play [Classical/M.C. Chris/music like [artist name]/top hits of the 1920s/etc.]".

      Could be done without network: "What time is it" (you could set the clock manually), "start a Z minute timer" (for cooking), and "Roll 3d6/
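
Since the comment thread above suggests rolling your own fully local assistant, here is a minimal sketch of the offline speech-recognition piece. It uses the open-source Vosk recognizer and the sounddevice microphone library rather than the TensorFlow stack the commenter mentions, and the model path is a placeholder for whatever English model you download; treat it as a starting point, not a finished assistant:

```python
import json
import queue

import sounddevice as sd                 # microphone capture
from vosk import Model, KaldiRecognizer  # offline speech recognition

SAMPLE_RATE = 16000
audio_q = queue.Queue()

def on_audio(indata, frames, time, status):
    """Audio callback: push raw microphone bytes onto a queue."""
    audio_q.put(bytes(indata))

# Placeholder path to a downloaded Vosk English model directory.
model = Model("model/vosk-model-small-en-us-0.15")
recognizer = KaldiRecognizer(model, SAMPLE_RATE)

# Stream 16 kHz mono audio from the default microphone and print each
# recognized utterance; all processing stays on the local machine.
with sd.RawInputStream(samplerate=SAMPLE_RATE, blocksize=8000,
                       dtype="int16", channels=1, callback=on_audio):
    print("Listening (Ctrl+C to stop)...")
    while True:
        data = audio_q.get()
        if recognizer.AcceptWaveform(data):
            result = json.loads(recognizer.Result())
            print("heard:", result.get("text", ""))
```

Wake-word detection and intent handling would sit on top of a loop like this, which is where the neural-net library and training corpus mentioned above come in.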
