AI Cloud Hardware Linux

New Chip Offers Artificial Intelligence On A USB Stick (pcmag.com)

An anonymous reader writes: "Pretty much any device with a USB port will be able to use advanced neural networks," reports PC Magazine, announcing the new Fathom Neural Compute Stick from chip-maker (and Google supplier) Movidius. "Once it's plugged into a Linux-powered device, it will enable that device to perform neural network functions like language comprehension, image recognition, and pattern detection," and without even using an external power supply.

Device manufacturers could now move AI-level processing from the cloud down to end users, PC Magazine reports, with one New York computer science professor saying the technology means that "every robot, big and small, can now have state-of-the-art vision capabilities."

The article argues that this standalone, ultra-low power neural network could start the creation of a whole new category of next-generation consumer technologies.
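Since the stick, per the summary, exposes neural-network inference to any Linux host over USB, here is a rough sketch of what that workflow could look like. No SDK details have been published, so the `fathom` module and every function name below are hypothetical placeholders, not a real API:

```python
# Sketch of how a Linux host might drive such a stick. No SDK has been
# published yet, so the `fathom` module and every name below are
# hypothetical placeholders, not a real API.
import numpy as np
import fathom  # hypothetical driver library for the Compute Stick

device = fathom.enumerate_devices()[0]   # find the stick on the USB bus
device.open()

# Networks are trained offline (e.g. in a desktop framework), compiled to
# the stick's internal format, and only *run* on the device.
with open("network.bin", "rb") as f:
    graph = device.load_graph(f.read())

image = np.zeros((224, 224, 3), dtype=np.float16)  # stand-in input frame
scores = graph.infer(image)              # one inference round-trip over USB
print("top class:", int(np.argmax(scores)))

device.close()
```

The notable design point is that all training happens elsewhere; the stick only evaluates an already-trained network, which is presumably what makes the ~1 W power budget plausible.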
This discussion has been archived. No new comments can be posted.

  • by Mostly a lurker ( 634878 ) on Sunday May 01, 2016 @12:36PM (#52023349)

    This is all very interesting. However, there is no indication of when the sticks will become generally available. Their website indicates that they intend to produce 1,000 sticks shortly for use by selected customers. It is difficult to know how real this actually is.

    • by ShanghaiBill ( 739463 ) on Sunday May 01, 2016 @01:42PM (#52023641)

      This is all very interesting. However, there is no indication of when the sticks will become generally available.

      There also seems to be very little actual information about it. How much memory does it have? How many FLOPS? The product sheet says it uses 16-bit floats, which are generally good enough for NNs. But can it do FP32 or FP64 at all? The power consumption is ~1 W, so I doubt it can do much with that. The USB interface would be a major bottleneck as you feed information in and pull results out. A GPU on a PCIe bus would be far faster at that ... and nearly all computers already have a GPU. I think I will continue to run my NNs on a Tesla K80 [nvidia.com].
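      To put rough numbers on the bottleneck point, here is a back-of-envelope sketch; the bandwidth figures are assumptions about typical real-world throughput, not measurements of this device:

      ```python
      # Rough input-bandwidth ceilings for feeding one small network.
      # Bandwidth figures are assumed real-world numbers, not measurements.
      frame_bytes = 224 * 224 * 3 * 2   # one 224x224 RGB tensor in FP16

      for name, bytes_per_sec in [("USB 2.0", 35e6),
                                  ("USB 3.0", 400e6),
                                  ("PCIe 3.0 x16", 16e9)]:
          print(f"{name}: ~{bytes_per_sec / frame_bytes:,.0f} frames/s")
      ```

      Even USB 2.0 comfortably covers camera-rate input for a single small network (roughly 100+ frames/s on these assumptions), so the bottleneck argument matters mainly for large batches or for shuttling big intermediate tensors back and forth.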

      • by caferace ( 442 )
        I suspect (yes, I'm guessing) that this may be somewhat less expensive than a Tesla K80.
        • by ShanghaiBill ( 739463 ) on Sunday May 01, 2016 @03:02PM (#52024099)

          I suspect (yes, I'm guessing) that this may be somewhat less expensive than a Tesla K80.

          Sure, but it is more expensive than the GPU already included in your computer, which has a marginal cost of $0 since you already have it. So why should you buy something that is far slower and less capable than something that is effectively free?

          Also, you don't need to buy a Tesla K80. You can rent them by the minute from AWS.

      • by ceoyoyo ( 59147 )

        It's just a mobile gpu chip with a USB interface.

      • It has FP32, but no FP64. And most of the hardware filters, e.g. for convolutions, are FP16-only. And yes, GPUs are faster (I've written code for GPUs and the Myriad 2), but the Myriad 2 does pretty well in terms of the ratio of processing power to power consumption.
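        As a rough illustration of that ratio, using vendor-marketing figures as assumptions (Fathom/Myriad 2: ~150 GFLOPS at ~1.2 W; Tesla K80: ~8.7 TFLOPS at ~300 W):

        ```python
        # Performance-per-watt, back of the envelope.
        # All numbers are vendor marketing figures, treated here as assumptions.
        for name, gflops, watts in [("Myriad 2 stick", 150, 1.2),
                                    ("Tesla K80", 8700, 300)]:
            print(f"{name}: ~{gflops / watts:.0f} GFLOPS/W")
        ```

        On these assumed numbers the stick delivers roughly four times the GFLOPS per watt of the K80, while the K80 delivers vastly more absolute throughput.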
    • by samkass ( 174571 )

      This is all very interesting. However, there is no indication of when the sticks will become generally available. Their website indicates that they intend to create 1000 sticks shortly for use by selected customers. It is difficult to know how real this is, actually.

      Wouldn't be surprised if this is a "please buy us out!" advertisement-product. I could see Apple buying them and integrating their chip into the next A-series processor to do client-side Siri among other things.

  • by John Smith ( 4340437 ) on Sunday May 01, 2016 @12:37PM (#52023351)
    Well, first time I've seen that in a long time. Sounds like this xkcd (https://xkcd.com/644/) might have something to do with it.
  • Bullshit (Score:5, Insightful)

    by geek ( 5680 ) on Sunday May 01, 2016 @12:40PM (#52023359)

    We should first create AI before we start selling it on fucking USB sticks.

    • "We should first create AI before we start selling it on fucking USB sticks."

      It's a neural network on a stick. It's up to you to try to make it usable as an AI.

      • I agree. I'm tired of AI this and AI that. At best we're getting to expert systems that are tied to speech and sight recognition. When one of these "AI" thingies can come up with an original idea and implement new behavior as a result, we might be getting there.
        • by HiThere ( 15173 )

          You should read the Go masters' commentary on the recent match. These "AI" thingies can come up with original ideas and implement them. And their original ideas can be better than those of any human expert. (They aren't always, of course.)

          What all current AIs I've heard of are weak on is layered hierarchies of goals.

          • You can train a machine to classify cat photos. I'm not sure if I'd call that intelligence.

            Scientists use the words machine learning. This device enables ML. At this time, ML is by far the most useful aspect related to AI, but it's misleading to say this device enables AI. Additionally, mimicking human intelligence doesn't always involve learned behavior.

            You might disagree with my terminology, but machine learning enables better AI. Machine learning isn't AI. There are plenty of machines that have been trained.
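            To make concrete what "training a machine to classify" means in this thread, here is a toy sketch: fitting a logistic-regression model to labeled examples by gradient descent. The features and labels are synthetic stand-ins, not real image data:

            ```python
            # Minimal sketch of "training a machine to classify":
            # logistic regression on toy features via gradient descent.
            import numpy as np

            rng = np.random.default_rng(0)
            X = rng.normal(size=(200, 2))              # toy "image features"
            y = (X[:, 0] + X[:, 1] > 0).astype(float)  # toy "cat / not cat" labels

            w, b = np.zeros(2), 0.0
            for _ in range(500):                       # gradient descent on log loss
                p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
                w -= 0.1 * X.T @ (p - y) / len(y)
                b -= 0.1 * np.mean(p - y)

            print("training accuracy:", np.mean((p > 0.5) == y))
            ```

            The point the thread is circling: what the machine ends up with is a decision boundary fitted to data, not an understanding of cats.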

            • by HiThere ( 15173 )

              Well, if you're talking about this particular device...I've no idea what it would be useful for, or why they picked a USB stick form factor. If you're comparing Google's Go machine to something that classifies cat pictures, I don't think you understand the problem, or what problem was actually being solved. Or if you're being dismissive of Go (i.e., "I don't like Go, therefore playing it isn't intelligent"), then perhaps you don't understand what was being done or why.

              I'm trying to read your comment

          • Sorry, I would disagree that the Go-playing computer is coming up with any original ideas. It is a very complex expert system. It might even remember tactics used in games against it and "learn". It is all still the result of pre-programmed algorithms. One could examine the inputs and, knowing the programming, predict the computer's behavior. When the Go-playing computer decides to take the day off or engage in some other behavior that was never contemplated by its programming, we can talk.
    • This is precisely the type of HARDWARE that AI SOFTWARE needs to reach maximum performance. They are making this technology more accessible, which means more developers have the tools they need to start working in the AI field. How is this a bad thing?

    • Don't worry someone will come from the future and destroy them, to prevent a war that hasn't yet taken place.

    • We should first create AI before we start selling it on fucking USB sticks.

      We have created AI. What we haven't done (yet) is create human level general purpose AI. But you don't need GP-AI to do things like object recognition, basic natural language processing, machine learning, etc. All of that is still artificial intelligence.

      • Object recognition is not AI. Neither is language processing. Christ.
        • Perhaps you should go back to school; as I told you already several times, both are AI.
          It is not you who decides what AI is. It is the people actually working in AI.

          It took decades of research to figure how a computer can recognize objects, that was once a hard AI problem.

          Language and image processing/recognition is done by a subset of AI called "cognitive systems".

          Sorry binary number, you know absolutely nothing about AI.

  • by Anonymous Coward

    I can see a niche market for such a thing (aerial robotics, marketing research, statistics data collection for some other purpose, such as at a kiosk), but the question is: is it less effort to integrate than rolling your own with other frameworks? Unless there is a severe power budget or weight limitation in the application (i.e., aerial drones), the notion of "plug and play" depends on how much the device does, and how easy it makes adding the capabilities it doesn't.

  • by __aaclcg7560 ( 824291 ) on Sunday May 01, 2016 @01:10PM (#52023469)
    Is this AI on a USB stick smart enough to know that it will go into the drawer with all the non-AI USB sticks that I don't use anymore?
  • and without even using an external power supply.

    What are the 5 volts provided by the USB bus if not "external power"?
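    (For scale: a standard USB 2.0 port supplies 5 V at up to 500 mA, i.e. 2.5 W, and USB 3.0 up to 900 mA, i.e. 4.5 W. Both comfortably exceed the stick's reported ~1 W draw, so "no external power supply" just means it runs on bus power.)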

  • by ArchieBunker ( 132337 ) on Sunday May 01, 2016 @01:38PM (#52023605)

    This USB stick will do a better job of editing Slashdot than the humans do.

  • "New Chip Offers Artificial Intelligence..."

    They keep using that phrase, but I do not think that phrase means what they think that phrase means.

  • by Viol8 ( 599362 ) on Sunday May 01, 2016 @02:27PM (#52023875) Homepage

    Fuck off. If you want me to have even a passing interest in this, I want to see how easy it will be to use and to port applications BEFORE I give you my details.

  • Because that's how you get Skynet. - Archer

  • I want an "AI button" on the front of the case, just like the old Turbo button
    which gave incredible improvements in performance.

  • So who has any spare ports available anyways?

    Most of these "sticks" I've seen are so wide that they block adjacent ports, so it will take up a pair (at least; all the USB ports I've seen come in a pair here, a pair there).

    You have devices that need to be powered by the computer, and cannot go into a hub.
    You have your high-speed devices that take up a full port.
    You have printers that refuse to work properly through a hub.

    By the time you're done? I'm glad that USB is hot swappable, because I'm co
