What exactly is "built in" machine learning? (Score:3)

Setting aside the question of what "machine learning" actually means... machine learning is just software, and if the device is a computer running software, then machine learning is already "built in" to some extent. I assume the claim is that there is some specific hardware support to accelerate an algorithm currently in vogue for machine learning. Can someone comment, technically, on what kind of hardware acceleration is considered state of the art for machine learning, and whether that is what's on the table for the Raspberry Pi?
What it likely means (Score:1)

Yes, at the core of things, "built in" machine learning is acceleration of floating-point vector math...

But to say you are including built-in machine learning also implies there would be libraries around this, where you could pass in common, already-trained neural network models and run input through them.

There's a reason the camera is also part of the initial package: support will probably be heavily focused on neural networks that deal with images (for things like face recognition).
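To make "pass in a trained model and run input through it" concrete, here is a minimal sketch of an inference pass in plain NumPy. The two-layer network, its shapes, and its random weights are all made up for illustration; a real library would load trained weights from a file instead.

```python
import numpy as np

def relu(x):
    # Elementwise rectifier, the most common activation function
    return np.maximum(x, 0.0)

def forward(x, w1, b1, w2, b2):
    """One inference pass: dense layer -> ReLU -> dense layer."""
    h = relu(x @ w1 + b1)
    return h @ w2 + b2

# Stand-in "trained" weights (random here, purely illustrative)
rng = np.random.default_rng(0)
w1 = rng.standard_normal((4, 8)); b1 = np.zeros(8)
w2 = rng.standard_normal((8, 2)); b2 = np.zeros(2)

x = rng.standard_normal((1, 4))    # one input sample
y = forward(x, w1, b1, w2, b2)     # model output, shape (1, 2)
print(y.shape)
```

Every operation in `forward` is a matrix multiply or an elementwise op over vectors, which is exactly the floating-point vector math a hardware accelerator would speed up.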
"Yes at the core of things, "built in" machine learning is acceleration of floating point vector math..."
That's not wrong; it's not insightful either. Literally anyone could trivially deduce this, but no one on/. appears able to go beyond it either. What a surprise. Where are all those programming experts?
ML doesn't merely require vector operations, it requires execution of specific kinds of operations on a massive scale. Specifically, it requires huge numbers of convolution operations, but it's OK, SuperKendall, I wouldn't expect an iOS dev and professional iPhone photographer to know what that is.
ML accelerators can substantially outperform traditional software running on either a CPU or GPU because they can be made to accelerate convolutions. That is to say, they aren't merely vector coprocessors by vector coprocessors with ML-optimized instruction sets. So yes, "at the core of things" they are processors. Thanks for that, SuperKendall.
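For anyone who doesn't know what a convolution is: here is the operation in its naive form, written out in NumPy. (As in most ML frameworks, this is technically cross-correlation; the kernel isn't flipped.) The 4x4 image and the edge-detection kernel are toy values chosen for illustration — the point is the inner multiply-accumulate loop, which is what an accelerator hardwires and repeats millions of times per image.

```python
import numpy as np

def conv2d(image, kernel):
    """Direct 2D convolution (no padding, stride 1)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Multiply-accumulate over one kernel-sized window:
            # this is the hot loop that ML hardware accelerates.
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

edge = np.array([[1.0, 0.0, -1.0]] * 3)        # crude vertical-edge kernel
img = np.tile([0.0, 0.0, 1.0, 1.0], (4, 1))    # 4x4 image with a step edge
print(conv2d(img, edge))                       # strong response at the step
```

A single convolutional layer applies many such kernels across the whole image, so the multiply-accumulate count dwarfs what a general-purpose vector unit is sized for; that scale is the accelerator's reason to exist.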
What exactly is "built in" machine learning? (Score:3)
What it likely means (Score:1)
Yes at the core of things, "built in" machine learning is acceleration of floating point vector math...
But to say you are including built in machine learning, also implies there would basically be libraries around this were you could pass in common neural network models already trained, and be able to pass input through them.
There's a reason why the camera is also part of one initial package, as support will probably be heavily focused around neural networks that deal with images (for things like face recog
Re:What it likely means (Score:2)
"Yes at the core of things, "built in" machine learning is acceleration of floating point vector math..."
That's not wrong; it's not insightful either. Literally anyone could trivially deduce this, but no one on /. appears able to go beyond it either. What a surprise. Where are all those programming experts?
ML doesn't merely require vector operations, it requires execution of specific kinds of operations on a massive scale. Specifically, it requires huge numbers of convolution operations, but it's OK, SuperKendall, I wouldn't expect an iOS dev and professional iPhone photographer to know what that is.
ML accelerators can substantially outperform traditional software running on either a CPU or GPU because they can be made to accelerate convolutions. That is to say, they aren't merely vector coprocessors by vector coprocessors with ML-optimized instruction sets. So yes, "at the core of things" they are processors. Thanks for that, SuperKendall.