AI Hardware Build

Will The Next Raspberry Pi CPU Have Built-in Machine Learning? (tomshardware.com) 49

"At the recent tinyML Summit 2021, Raspberry Pi co-founder Eben Upton teased the future of 'Pi Silicon'," writes Tom's Hardware, adding "It looks like machine learning could see a massive improvement thanks to Raspberry Pi's news in-house chip development team..." Raspberry Pi's in-house application specific integrated circuit team are working on the next iteration, and seems to be focused on lightweight accelerators for ultra low power machine learning applications. During Upton's talk at 40 minutes the slide changes and we see "Future Directions," a slide that shows three current generation 'Pi Silicon' boards, two of which are from board partners, SparkFun's MicroMod RP2040 and Arduino's Nano RP2040 Connect. The third is from ArduCam and they are working on the ArduCam Pico4ML which incorporates machine learning, camera, microphone and screen into a the Pico package.

The last bullet point hints at what the future silicon could be. It may come in the form of lightweight accelerators, possibly 4-8 multiply-accumulates (MACs) per clock cycle.
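For readers unfamiliar with the term, a multiply-accumulate is one multiply followed by one add: the inner-loop operation of dot products and convolutions. As a rough illustration (ours, not from the talk), this is the kind of loop such an accelerator would speed up, retiring several iterations per clock instead of one:

    # Illustrative only: each loop iteration is one MAC (multiply, then accumulate).
    def dot_product(weights, activations):
        acc = 0.0
        for w, a in zip(weights, activations):
            acc += w * a  # a single multiply-accumulate
        return acc

    print(dot_product([0.5, -1.0, 2.0], [1.0, 3.0, 0.25]))  # prints -2.0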

  • Closed Platform (Score:4, Informative)

    by darkain ( 749283 ) on Sunday March 07, 2021 @02:39PM (#61133550) Homepage

    The Raspberry Pi is already a closed platform relying on proprietary driver implementations to get anything to really work. As much as they tout openness, it has been quite annoying getting anything non-Linux working on the damn things. I really wish they'd at least give spec sheets for things like their Wi-Fi interface, so other OSes are not locked into either Ethernet (if they're lucky) or USB dongles (if they're unlucky). Now they want more proprietary shit bolted on!?

    • The Raspberry Pi is already a closed platform relying on proprietary driver implementations to get anything to really work. As much as they tout openness, it has been quite annoying getting anything non-Linux working on the damn things. I really wish they'd at least give spec sheets for things like their Wi-Fi interface, so other OSes are not locked into either Ethernet (if they're lucky) or USB dongles (if they're unlucky). Now they want more proprietary shit bolted on!?

      Yes, because the goal is to provide a $35-ish single-board computer that is as capable as any other Linux box (functionality-wise, not performance-wise) for those wanting to learn about computer hardware and software.

      It's OK if people are not political and just want a *nix that works.

      • "Works", for a computer, is defined as "Hardware, programmable by me.

        You seem to confuse that with an appliance or gadget, that may very well be implemented on a computer, but where you yourself never get access to the computer.
        Maybe because you never actually used a computer, and thought clicking on colorful (well, not anymore) widgets *was* using a computer. (Well, it's not.)

        What this shit here is, is defective by design.

        Don't contribute to organized crime. Never play along with their protection racket.

      • by darkain ( 749283 )

        If the ultimate goal is to learn about computer hardware... then why have half the hardware closed off? It requires significant effort to reverse engineer things like the GPU, not just the Wi-Fi interfaces, to get other things functional on this board. What is learned from that, honestly?

        • by drnb ( 2434720 )

          If the ultimate goal is to learn about computer hardware... then why have half the hardware closed off? It requires significant effort to reverse engineer things like the GPU, not just the Wi-Fi interfaces, to get other things functional on this board. What is learned from that, honestly?

          What makes you think the GPU and WiFi are not functional? They function very well with their proprietary drivers.

          As for learning about computer hardware, you seem a bit unfamiliar with RPi. Here are the hardware schematics, data sheets, GPIO pinout docs and various other docs.
          https://www.raspberrypi.org/do... [raspberrypi.org]

          • by darkain ( 749283 )

            Please read the previous posts in the thread about non-Linux support. Then please tell me where in the datasheet there is information on the bus Wi-Fi connects to, as well as the VideoCore system.

            • by drnb ( 2434720 )

              Please read the previous posts in the thread about non-Linux support. Then please tell me where in the datasheet there is information on the bus Wi-Fi connects to, as well as the VideoCore system.

              I do not need to re-read; I read and understand the claim. My point is that the issue is irrelevant to the mission of Raspberry Pi. Again, not everything needs to conform to a political ideology about software to be useful and beneficial and do an outstanding job at completing its mission. Your ideological mission is not theirs, and that is fine. Different people can have different priorities.

    • Now they want more proprietary shit bolted on!?

      The new stuff described in TFA is hardware. With a few rare exceptions, such as RISC-V, all hardware is proprietary.

      It is unfortunate that the Raspberry Pi relies on a few BLOBs, but the new tech in TFA doesn't add to that problem.

    • fantastic.
      just make an interface card to microsoft and one for apple and you can bill accordingly.
      and the p s 5

    • There are open machine learning accelerators, mainly for doing inference. Even big evil Nvidia has a free and open one [nvdla.org], depending on your definition of open.

      • I was wondering if this really meant inference. To me that in itself isn't machine "learning", but it would seem odd to train a model on a Pi.
    • by stikves ( 127823 )

      They are making some progress, but finding completely open hardware, with good performance, at affordable prices is unfortunately not yet possible.

      For example, they are working on an open source Vulkan driver:
      https://www.raspberrypi.org/bl... [raspberrypi.org]

      Which will one day replace the current binary blob.

    • Yeah, I've found quite a few ways in which the Raspberry Pi isn't remotely open. At this point I'll use them if I have to for a project, but otherwise I opt for other SBC platforms, even if they have closed aspects, because I haven't found a company quite as desperate to own every little thing about their IP as RasPi. You want a silly example? Try asking the foundation for permission to make something physical with their logo. You will hit far less red tape asking permission from Google or Microsoft.

    • The current main Raspberry Pi series is closed silicon. It exists in part because the foundation received support from Broadcom, which initially sold them the SoC for the series at a price well below what they charged other customers. (Shortly after the first RPi came out, another company introduced a clone using the same SoC. They got to make exactly one limited production run, at which point Broadcom told them that they would not receive any chips in the future. Broadcom currently does not sell any SoCs t

  • Will The Next Raspberry Pi CPU Have Built-in Machine Learning?

    It's pretty damn likely if you let NVIDIA buy ARM. Then there would be funding for such efforts comparable to what Apple is doing with their proprietary ARM implementations.

  • by Nkwe ( 604125 ) on Sunday March 07, 2021 @03:04PM (#61133626)
    Setting aside the question of what "machine learning" actually means... Machine learning is just software, and if it is a computer running software, then machine learning is already "built in" to some extent. I assume the claim is that there is some specific hardware support to accelerate an algorithm currently in vogue for machine learning. Can someone comment on technically what kind of hardware acceleration is considered state of the art for machine learning, and whether this is what's on the table for the Raspberry Pi? What kind of practical speed improvement would this hardware in the Pi create? Would it double, triple, or raise by an order of magnitude the speed at which a model can be trained or executed? In other words, is this actually significant or not? The article doesn't provide any useful context.
    • by walshy007 ( 906710 ) on Sunday March 07, 2021 @03:18PM (#61133648)

      The quick ELI5-type explanation would be: massively parallel floating-point math. Which is why GPUs are so heavily used currently.

      The main difference is that in a lot of cases high precision isn't required, so 16-bit floating-point math is used instead of 32-bit to get far higher performance with similar memory bandwidth.
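      A minimal NumPy sketch of that trade-off (assuming NumPy is available; the vector length is arbitrary): FP16 halves the bytes moved per value, at the cost of a few decimal digits of precision.

        import numpy as np

        x32 = np.random.rand(1024).astype(np.float32)
        w32 = np.random.rand(1024).astype(np.float32)
        x16, w16 = x32.astype(np.float16), w32.astype(np.float16)

        print(x32.nbytes, x16.nbytes)              # 4096 bytes vs 2048 bytes
        print(np.dot(x32, w32), np.dot(x16, w16))  # close, but not identical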

      • Can you ELI8 and also explain the differences between, say, a classical GPU and what the "Tensor cores" in NVIDIA terminology do? I'm a complete luddite when it comes to this, but it seems at the moment the only GPUs people are recommending for machine learning are the RTX ones from NVIDIA, and if it were only floating-point math, would simply a higher core count be sufficient rather than special-purpose silicon?

        • the differences between say a classical GPU and what the "Tensor cores" in NVIDIA terminology do?

          Truly classical GPUs had fixed-function hardware designed to do specific graphics calls. You feed it a 3D model and a texture map for color and tell it where your camera is, and it returns an image. You couldn't really program anything to run on the GPU; you could only fill in the blanks. You Mad Libs the 3D scene, it returns an image.

          Modern GPUs have moved to programmable rendering. They are now fundamentally no different from a CPU but have made different performance choices in how they're laid out. A CPU

    • Yes, at the core of things, "built in" machine learning is acceleration of floating-point vector math...

      But to say you are including built-in machine learning also implies there would basically be libraries around this where you could pass in common neural network models already trained, and be able to pass input through them.

      There's a reason why the camera is also part of one initial package, as support will probably be heavily focused around neural networks that deal with images (for things like face recog
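      As a rough sketch of what "pass a trained model in and push input through it" could look like (this assumes the TensorFlow Lite runtime, and the model file name is made up for illustration; nothing in the article says which library the silicon would target):

        import numpy as np
        import tflite_runtime.interpreter as tflite  # pip install tflite-runtime

        # Hypothetical pre-trained image model.
        interpreter = tflite.Interpreter(model_path="person_detect.tflite")
        interpreter.allocate_tensors()

        inp = interpreter.get_input_details()[0]
        out = interpreter.get_output_details()[0]

        # Dummy frame shaped however the model expects; a camera would fill this in.
        frame = np.zeros(inp["shape"], dtype=inp["dtype"])
        interpreter.set_tensor(inp["index"], frame)
        interpreter.invoke()
        print(interpreter.get_tensor(out["index"]))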

      • by dfghjk ( 711126 )

        "Yes at the core of things, "built in" machine learning is acceleration of floating point vector math..."

        That's not wrong; it's not insightful either. Literally anyone could trivially deduce this, but no one on /. appears able to go beyond it either. What a surprise. Where are all those programming experts?

        ML doesn't merely require vector operations, it requires execution of specific kinds of operations on a massive scale. Specifically, it requires huge numbers of convolution operations, but it's OK, Su
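        For anyone who hasn't seen one, a convolution is just a small grid of MACs slid across an image; real networks do this millions of times per frame, which is what dedicated hardware is for. A toy example, assuming NumPy and SciPy (the 5x5 "image" and kernel are made up):

          import numpy as np
          from scipy.signal import convolve2d

          image = np.arange(25, dtype=np.float32).reshape(5, 5)
          kernel = np.array([[ 0, -1,  0],
                             [-1,  4, -1],
                             [ 0, -1,  0]], dtype=np.float32)  # simple edge detector

          # Each output pixel costs 9 multiply-accumulates against the 3x3 kernel.
          print(convolve2d(image, kernel, mode="valid"))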

      • Voice recognition is another important application so it's unlikely to be ignored.
    • Can someone comment on technically what kind of hardware acceleration is considered state of the art for machine learning

      The new features described in TFA are designed to speed up neural networks using platforms such as TensorFlow. You want fast operations, especially multiplication, on large matrices of low-precision floating-point values, typically FP16.

      Training an NN is very computationally expensive, so even with the new features, it is not something you will want to do on an RPi. It makes much more sense to upload your data to the cloud, do the training there using TPUs or high-end GPUs, and then download the fully traine
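      A sketch of that split, assuming TensorFlow/Keras (the model here is a stand-in with arbitrary layer sizes): train wherever the compute is, then shrink the result to an FP16 .tflite file that the small device only has to run, never train.

        import tensorflow as tf

        # Build and train (or load) the model on a machine with real compute...
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
        # model.fit(training_data, labels)  # done in the cloud, not on the Pi

        # ...then convert it to a compact FP16 model for on-device inference.
        converter = tf.lite.TFLiteConverter.from_keras_model(model)
        converter.optimizations = [tf.lite.Optimize.DEFAULT]
        converter.target_spec.supported_types = [tf.float16]
        open("model_fp16.tflite", "wb").write(converter.convert())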

    • Maybe if you weren't so obviously prejudiced and convinced it was nonsense, and bothered learning something about machine learning, you wouldn't be so cynical or even *gasp* might actually find a use for it.

      You read like the old man disparaging the internet. "My telephone works just fine. I've never needed to connect to no InterNet or whatever fad these kids are wasting their lunch money on these days! I don't see no reason to add dedicated networking hardware... Probably just dials phone numbers faster or so

    • repeat if(rnd(3) !=0)?good++:bad++; until(good===100||bad===100)
      ?
      did it learn something ?
      • i dont have an average of 15 phd's (not even a hi-skool rag) so im still stuck on "how is a.i. different from google collecting your clicks" ? a glorified database with a search algorithm forming clusters on most hits - despite that it seems to be able to compose music by itself . . . (but so do pop-tarts)
        owh - aHA , the /. pitfall ... not falling for it today
      • by Nkwe ( 604125 )

        repeat if(rnd(3) !=0)?good++:bad++; until(good===100||bad===100) ? did it learn something ?

        Two out of three ain't bad?

  • ...have a built-in Betteridge controller?
  • Heck, I would just love it if they would support I2C bus clock stretching right....

  • The Orange Pi line [orangepi.org] already has had machine learning for years, though it's not their own silicon.

  • by gweihir ( 88907 )

    There is no "machine learning". There is just parametrization of statistical classifiers from data. That does not sound nearly as sexy, but happens to actually be accurate and not create false expectations. And the RPi? That is a low-performance platform; putting any "ML" accelerators in there would be completely stupid. But the designers of the RPi have time and again demonstrated they are clueless and cannot even get electronics right, so yes, this stuff will probably be in there instead of an actually go

    • Supposedly they are after a low-power platform to run the NN on. But I second that a mass storage device interface would be a great addition, perhaps an M.2 connector (SATA is so last century :-). It could even be used for other PCIe devices.

      • by gweihir ( 88907 )

        Supposedly they are after a low-power platform to run the NN on. But I second that a mass storage device interface would be a great addition, perhaps an M.2 connector (SATA is so last century :-). It could even be used for other PCIe devices.

        A PCI-E interface would be nice, yes. But as they still have not even managed to do proper Ethernet (runs over a crappy USB interface on the RPi), I do not expect that to happen anytime soon.

    • And what exactly is stupid about that? You have a simple main CPU and a specialized CPU for the ML.
      What is wrong? I don't get it.

  • Now which one is it going to be: the classic Pi or the Pico that gets the extension?

  • Two gigabit NICs ;) and another bump in memory.
  • Huh? (Score:4, Insightful)

    by backslashdot ( 95548 ) on Sunday March 07, 2021 @09:36PM (#61134548)

    We don't need that. Give it 2D GPU features instead so the GUI doesn't have to be excruciating. Make it handle 4K @60fps easily.

    • Give it 2D GPU features instead so the GUI doesn't have to be excruciating.

      Huh? We don't need that. A GUI on a Raspberry Pi? Sounds like a strange edge use case. Man, I have so many of the things in my house, but not one attached to a mouse, keyboard, or display.

    • Line buffers for filtering the 4K output of a 2D scaler would be larger than a simple ML unit. Plus, the pin count goes up when you want to hook video output to a chip.
