AI-Powered Artificial Fingertip Gives Robots a Nearly Humanlike Touch (science.org)
Slashdot reader sciencehabit shares this article from Science magazine:
Robots can be programmed to lift a car and even help perform some surgeries, but when it comes to picking up an object they have not touched before, such as an egg, they often fail miserably. Now, engineers have come up with an artificial fingertip that overcomes that limitation. The advance enables machines to sense the texture of surfaces a lot like a human fingertip does....
[W]hen researchers at the University of Bristol began designing an artificial fingertip in 2009, they used human skin as a guide. Their first fingertip — assembled by hand — was about the size of a soda can. By 2018, they had switched to 3D printing. That made it possible to make the tip and all its components about the size of an adult's big toe and more easily create a series of layers approximating the multilayered structure of human skin. More recently, the scientists have incorporated neural networks into the fingertip, which they call TacTip. The neural networks help a robot quickly process what it's sensing and react accordingly — seemingly just like a real finger.
In our fingertips, a layer of nerve endings deforms when skin contacts an object and tells the brain what's happening. These nerves send "fast" signals to help us avoid dropping something and "slow" signals to convey an object's shape. TacTip's equivalent signals come from an array of pinlike projections underneath a rubbery surface layer that move when the surface is touched. The array's pins are like a hairbrush's bristles: stiff but bendable. Beneath that array is, among other things, a camera that detects when and how the pins move. The amount of bending of the pins provides the slow signal and the speed of bending provides the fast signal. The neural network translates those signals into the fingertip's actions, making it grip more tightly, for example, or adjust the angle of the fingertip....
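The slow/fast split described above can be sketched in a few lines: the "slow" signal is how far the pins have bent from rest, and the "fast" signal is how quickly that bending changes between camera frames. This is a minimal illustrative sketch with made-up names and toy data, not the actual TacTip code:

```python
import numpy as np

# Hypothetical sketch (assumed names, not the TacTip codebase): deriving the
# "slow" and "fast" signals from pin positions tracked by the internal camera.
def tactile_signals(rest_pos, frames, dt):
    # Bend amount per pin: distance of each pin from its resting position
    disp = np.linalg.norm(frames - rest_pos, axis=-1)  # shape (n_frames, n_pins)
    slow = disp.mean(axis=1)  # "slow": how far pins are bent (shape/pressure)
    # "fast": how quickly bending changes between frames (slip/contact events)
    fast = np.abs(np.diff(disp, axis=0)).mean(axis=1) / dt
    return slow, fast

# Toy data: four pins bending gradually, then jumping as if the object slipped
rest = np.zeros((4, 2))
frames = np.array([rest + d for d in (0.0, 0.1, 0.2, 1.0)])
slow, fast = tactile_signals(rest, frames, dt=0.01)
# fast spikes at the last step -- the cue for the controller to tighten its grip
```

In a real system the controller (the neural network, in TacTip's case) would consume both streams: a rising slow signal reports contact shape and pressure, while a spike in the fast signal flags incipient slip.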
In a second project, Lepora's team added more pins and a microphone to TacTip. The microphone mimics another set of nerve endings deep within our skin that sense vibrations felt as we run our fingers across a surface. These nerve endings enhance our ability to feel how rough a surface is. The microphone did likewise when the researchers tested the enhanced fingertip's ability to differentiate among 13 fabrics.
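The fabric-discrimination idea above, identifying a texture from the vibrations it induces, can be sketched as a spectral nearest-neighbour match. The feature choice and classifier here are assumptions for illustration, not the method from the paper:

```python
import numpy as np

def spectral_signature(signal, n_bands=8):
    # Summarize the vibration's magnitude spectrum into coarse frequency bands
    mag = np.abs(np.fft.rfft(signal))
    return np.array([band.mean() for band in np.array_split(mag, n_bands)])

def classify(signal, references):
    # Nearest-neighbour match against previously recorded texture signatures
    sig = spectral_signature(signal)
    return min(references, key=lambda name: np.linalg.norm(sig - references[name]))

# Toy "fabrics": a coarse weave vibrates the microphone at a lower
# frequency than a fine one as the fingertip slides at constant speed
t = np.arange(1000) / 1000.0
references = {
    "coarse weave": spectral_signature(np.sin(2 * np.pi * 20 * t)),
    "fine weave": spectral_signature(np.sin(2 * np.pi * 200 * t)),
}
label = classify(np.sin(2 * np.pi * 200 * t), references)  # matches "fine weave"
```

With 13 real fabrics the signatures would be noisier and a trained classifier would replace the nearest-neighbour lookup, but the principle is the same: rougher surfaces leave a different vibration spectrum than smoother ones.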
The article points out that in testing, the artificial fingertip's output "closely matched the neuronal signaling patterns of human fingertips undergoing the same tests."
We all know... (Score:2)
Add a hand, the fleshlight will be history (Score:2)
Dexterity (Score:2)
The problem is nobody has made a hand that is as dexterous as, or more dexterous than, a human hand. If they had, Foxconn wouldn't need humans to assemble smartphones. Amazon wouldn't need human pickers.
The test (Score:4, Interesting)
If a robot hand can reach into a bag full of various trinkets, feel its way around, and pull out the rubber band — and only the rubber band — that would be impressive. It is a task any human with hands who does not have a neurological condition can do instantly.
Re: (Score:2)
Raspberry Pi. To get this problem solved, make STEM kits and let inquisitive minds tinker with it.
Would mechatronics ever match bio? (Score:2)
It seems that we keep trying to build hands and other robotic devices in an engineered, mechanical-electronic way. Perhaps "growing" them as wetware would eventually match the evolution-created quality?
How is this compared to MEMS tactile sensors? (Score:2)
There is research into using MEMS devices as tactile sensors. How does this method compare?