Using deep learning to give robotic fingertips a sense of touch
by Ingrid Fadelli, Tech Xplore

Researchers at the University of Bristol have recently trained a deep-neural-network-based model to gather tactile information about 3-D objects. In their paper, published in IEEE Robotics & Automation Magazine, they applied the deep learning technique to a robotic fingertip with sensing capabilities and found that it allowed the fingertip to infer more information about its surrounding environment.
"Our overall idea was to artificially recreate the sense of touch when controlling robots as they physically interact with their surroundings," said Prof. Nathan Lepora, one of the researchers who carried out the study, told TechXplore. "Humans do this without thinking—for example, when brushing their fingers over an object to feel its shape. However, the computations underlying this are surprisingly complex. We implemented this type of physical interaction on a robot, by applying deep learning to an artificial fingertip that senses analogously to human skin."
Prof. Lepora has been trying to recreate a sense of touch in robots for almost a decade now. In his previous work, he used more conventional machine learning techniques, such as probabilistic classifiers. However, he found that these techniques only allowed robots to perform very basic tasks, such as feeling simple 2-D shapes with a slow tapping motion.
"The breakthrough in this new paper was that the methods we used work in three dimensions on natural complex objects, sliding the fingertip much as humans would do," Prof. Lepora. "We could do this because of the advances in deep learning over the last few years."
Providing robots with a sense of touch can aid the control of their hands and fingertips, allowing them to estimate the shape and texture of objects, or parts of objects, that they come into contact with. For instance, when sliding along an edge, a robot can estimate the edge's angle and move its fingertip accordingly.
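To illustrate the kind of behavior described here, below is a minimal sketch (not the authors' code) of a tactile servoing loop in which the robot repeatedly estimates the local edge angle from its fingertip sensor and steps along the edge. The functions read_tactile_image, estimate_edge_angle and move_fingertip are hypothetical placeholders for the sensor, the learned pose model and the robot controller.

```python
import math

def follow_edge(read_tactile_image, estimate_edge_angle, move_fingertip,
                step_size=1.0, n_steps=200):
    """Slide the fingertip along an edge by servoing on the estimated angle."""
    for _ in range(n_steps):
        image = read_tactile_image()              # capture a tactile image
        angle_deg = estimate_edge_angle(image)    # learned estimate of the edge angle
        # Step along the estimated edge direction, re-aligning the fingertip
        # so it stays centred on the edge as the contour curves.
        dx = step_size * math.cos(math.radians(angle_deg))
        dy = step_size * math.sin(math.radians(angle_deg))
        move_fingertip(dx, dy, reorient_to=angle_deg)
```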
"Deep learning allowed us to construct reliable maps from the sensory data to surface features such as edge angle," Prof. Lepora said. "This is difficult, because sliding a soft human-like fingertip over surfaces distorts the data it gathers. Previously, we were not able to separate this distortion from the shape of the surface, but in this work, we succeeded by training a deep convolutional neural network with examples of distorted tactile data, which allowed us to produce accurate surface angle estimates to within a fraction of a degree."
By collecting accurate estimates of surface angles, the deep learning technique devised by Prof. Lepora and his colleagues enables better control of robotic fingertips. In the future, this method could provide robots with a physical dexterity resembling that of humans, allowing them to efficiently adapt their grasping and manipulation strategies based on the objects they are interacting with.
So far, the researchers have demonstrated their technique's effectiveness by integrating it with a single robotic fingertip. In the future, however, it could be applied to all of a soft robot's fingertips and limbs, allowing it to handle tools and complete manipulation tasks in a similar way to humans. This could ultimately pave the way for the development of more efficient robots to be deployed in a variety of settings, including robots designed to complete household chores, pick produce on farms or attend to patients' needs in healthcare settings.
"My lab also fabricates 3-D-printed fingertips and full robot hands with tactile sensing that replicate the human sense of touch," Prof. Lepora said. " In our next studies, we intend to use artificial intelligence methods such as the one proposed in our paper to investigate dexterous interactions with entire tactile robotic hands, which would allow robots to handle tools or other objects more efficiently."
More information: Nathan Lepora et al. Optimal Deep Learning for Robot Touch: Training Accurate Pose Models of 3D Surfaces and Edges, IEEE Robotics & Automation Magazine (2020). DOI: 10.1109/MRA.2020.2979658