A breakthrough in tactile sensor technology could pave the way for advanced prosthetics with a better sense of touch than human fingers. Researchers at the University of Southern California's Viterbi School of Engineering have developed a robotic fingertip that can distinguish different materials by touch, a development that could lead not only to more sophisticated prostheses, but also to advancements in personal assistive robots, product testing, and industrial robotics.
Gerald Loeb and his team at USC wanted the fingertip to mimic a human's, including a soft, flexible skin over a liquid filling. It even has fingerprints to improve the device's sensitivity to vibrations (which means the robot should probably put on a pair of gloves, should it wish to commit a crime). As the robotic finger slides across a textured surface, the skin vibrates in different ways depending on the qualities of the object. These vibrations are detected by a hydrophone inside the finger, which in turn sends the incoming information to a processing device. Smartly, the engineers weren't trying to re-invent the wheel here — the human finger senses textures in much the same way.
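To make the idea concrete, here's a minimal sketch of how a vibration signal can be turned into a texture feature. This is an illustration, not the USC pipeline: the sampling rate, the simulated 60 Hz "texture frequency," and the noise level are all made-up values, and a real system would use richer features than a single dominant frequency.

```python
import numpy as np

# Assumed sensor sampling rate (hypothetical value for illustration).
SAMPLE_RATE = 2000  # Hz
t = np.arange(0, 1.0, 1.0 / SAMPLE_RATE)

# Fake a hydrophone recording: suppose this texture makes the skin
# vibrate at roughly 60 Hz, buried in sensor noise.
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 60 * t) + 0.3 * rng.normal(size=t.size)

# Recover the dominant vibration frequency with an FFT.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=1.0 / SAMPLE_RATE)
dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

print(dominant)  # close to 60.0 Hz
```

Different surfaces would produce different spectral signatures, which is the raw material the identification step works with.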
Sensors, actuators – and a 200-year-old algorithm
The next part of the challenge was for the robot to identify what it was touching. And for this, the researchers used a 200-year-old theorem developed by Thomas Bayes. Called Bayes' Theorem, it says that prior experience should be used to inform decision making. When humans identify an object by sense of touch alone, we're falling back on previous experience. It sounds like common sense, but there's a well-established algorithm that developers can use to get computers to do the same. By applying "Bayesian Exploration" to the problem, the researchers had their robot study 117 common materials gathered from fabric, stationery, and hardware stores.
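The Bayesian idea can be sketched in a few lines of code. The materials, feature values, and Gaussian models below are hypothetical stand-ins, not data from the paper; the point is only to show how a posterior belief over materials is updated with each rub, which is the core of Bayesian identification.

```python
import math

# Hypothetical models: each material is summarized by the (mean, std)
# of a single vibration feature. These numbers are invented.
MATERIALS = {
    "denim": (5.0, 1.0),
    "paper": (8.0, 1.0),
    "sandpaper": (12.0, 1.5),
}

def gaussian_likelihood(x, mean, std):
    """Probability density of observing feature value x under a normal model."""
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

def update_posterior(prior, observation):
    """One Bayes' rule update: posterior is proportional to likelihood times prior."""
    unnormalized = {
        m: prior[m] * gaussian_likelihood(observation, *params)
        for m, params in MATERIALS.items()
    }
    total = sum(unnormalized.values())
    return {m: p / total for m, p in unnormalized.items()}

# Start with a uniform prior, then "rub" the surface a few times.
posterior = {m: 1.0 / len(MATERIALS) for m in MATERIALS}
for reading in [11.2, 12.5, 11.8]:  # noisy readings near sandpaper's model
    posterior = update_posterior(posterior, reading)

best = max(posterior, key=posterior.get)
print(best)  # "sandpaper"
```

Each additional touch sharpens the belief, which matches the finding that the robot needed an average of about five rubs to commit to an answer.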
After this was done, the robot was given the challenge of identifying these textures at random — and it did so remarkably well, according to a paper that Loeb and his colleagues published in the journal Frontiers in Neurorobotics. They claim the robot successfully identified the materials 95% of the time, after touching and rubbing the various objects an average of five times. And amazingly, it did better than humans, even distinguishing some textures that humans couldn't tell apart.
Touching the future
What's quite remarkable about this breakthrough is how wide-ranging its applications could be. This is not lost on Loeb, who's based a start-up around it called SynTouch LLC, a company that will develop and manufacture tactile sensors for mechatronic systems that mimic the human hand. Eventually he'd like to see his tactile sensors used in both prosthetic hands and industrial robots.
Indeed, putting these sensors on artificial limbs would certainly take prostheses to the next level. But as exciting as that potential may be, it would still be fairly limited. Theoretically speaking, while a person would be able to use the device to distinguish materials, they would have no qualitative or subjective sense of what they're touching. In addition, they wouldn't be able to have a preference for what they were feeling. But it's not a stretch to imagine that further research in this area could result in the interlinking of human "qualia" with the sensors, offering the user a more personal experience when touching objects – including that pleasurable feeling that comes from a soft caress.
The breakthrough could also lead to industrial robots that would be able to touch and sense things in a way that rivals human workers. In theory, these robotic fingers could better distinguish between objects, getting an enhanced sense of how different objects relate to each other, and what their various tolerances might be. In other words, our sense of touch could be outsourced to robots.
And there's also the potential for personal, or domestic, robots that have a sense of touch. Not only would they be able to distinguish between objects, they could also be programmed to respond to being touched — something that could have profound implications for robotic companions of all sorts. And yes, you're thinking what I'm thinking.
Be sure to read the entire paper on Bayesian exploration for intelligent identification of textures.
The above video and all inset images were provided via USC/SynTouch.