RobotRepublic — Researchers at Cornell University say they’ve made major progress in their effort to build a smart, super-sensitive robotic hand.
The resulting prosthetic soft hand technology comes closer to the real thing than any robotic hand ever has, they write in the new journal Science Robotics.
It even begins to mimic the finer aspects of a human’s sense of touch, they say.
Here’s how the Gentle Bot technology behind it all works, what optical waveguides have to do with it and what commercial, industrial and entertainment applications will likely line up for such an innovation.
But before I get into all that, watch Cornell researchers’ demonstration video of the Gentle Bot-enabled soft hand. It is wild.
Gina Smith penned an earlier version of this story for aNEWDOMAIN.
Now, scientists have long envisioned prosthetic hands and AI-controlled robotic hands with the subtle, fine sense of touch of a human, but the cost and difficulty involved in developing them have been enormous.
WestWorld, in other words, was all but unreachable.
But then Cornell researcher Robert Shepherd and his team at Cornell’s Organic Robotics Lab decided to approach the challenge in another way. Ditching what they already knew about building prosthetic robotic hands, he and his team started from scratch.
This time around, they ignored the bulky, mechanical rigid parts and motors that typically go into high-end prosthetics.
Instead, Shepherd and his team set out to build a smart, soft robotic hand based around optical technology, which they believed could look, work and operate a lot more like a human hand.
Optical waveguide technology — fiber optics are one example of a waveguide — is the game-changing component in the Gentle Bot innovation they describe.
Because waveguides use light instead of electrical sensors to communicate, soft robotic hands based on them can be less bulky.
Cornell’s design embeds sinewy, flexible waveguides throughout the robot hand, all the way to the end of its fingertips. Because those components can move and flex along with the soft hand’s fingers, it mimics human anatomy better than any artificial limb, prosthesis or robot has to date.
“Most robots today have sensors on the outside of the body that detect things from the surface,” says Huichan Zhao, the Cornell doctoral candidate who authored the research paper, which I embedded below. By contrast, the soft robot hand the team designed is essentially “innervated,” she added.
“They can actually detect forces being transmitted through the thickness of the robot, a lot like we and all organisms do when we feel pain,” said Zhao, adding:
“Our human hand is not functioning using motors to drive each of the joints; our human hand is soft with a lot of sensors … on the surface and inside the hand … Soft robotic (technology) provides a chance to make a soft hand that is [closer] to a human hand.”
The nerve-like waveguides, long and sinewy as they are, are powerful, too, say the researchers. They can measure elongation, curvature and pressure, which are key capabilities if you are trying to mimic human touch.
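To make that concrete, here’s a minimal sketch of how a waveguide reading might be turned into an elongation estimate. The core move is comparing the light the photodiode sees now to what it saw at rest; the calibration slope `db_per_percent` is a hypothetical placeholder for illustration, not a figure from the Cornell paper:

```python
import math

def power_loss_db(baseline_power, measured_power):
    """Attenuation in decibels relative to the unbent, at-rest reading.

    Waveguide sensing compares the light reaching the photodiode now
    with the light that reached it when the finger was at rest.
    """
    return 10.0 * math.log10(baseline_power / measured_power)

def estimate_elongation(baseline_power, measured_power, db_per_percent=2.0):
    """Map attenuation to percent elongation using an assumed linear
    calibration slope (db_per_percent is hypothetical, not from the paper)."""
    return power_loss_db(baseline_power, measured_power) / db_per_percent

# e.g. the photodiode now reads 60% of its baseline light:
loss = power_loss_db(1.0, 0.6)   # ≈ 2.22 dB of attenuation
```

A real sensor would need a measured calibration curve per waveguide, but the comparison-to-baseline idea is the same.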
And here’s the secret sauce: Manufacturing waveguides, which once required expensive fabrication processes and facilities to make them, now can be custom printed. Thanks to the advent of affordable, soft lithography, even a university researcher laboring under a tight academic budget can do it.
Touch of the future
The Gentle Bot soft hand isn’t just some tomato-touching one trick pony.
It can hold a mug of coffee without spilling it — and shake your hand without feeling too, well, robotic. It also can tell, using just a quick swipe of a single, soft robotic finger, whether it is touching a sponge, acrylic or silicone rubber.
But the technology isn’t perfect, says Zhao. Sure, it can tell which of three tomatoes is ripest, but it has trouble telling the difference between a hard, not yet ripe tomato and, say, acrylic. That’s why the main focus now is to improve such subtleties.
Making the soft hand smarter and more sensitive to such little things will likely require the inclusion of more sophisticated waveguide sensors, machine learning algorithms and improvements to its soft actuators, she said.
Once those actuators can detect a wider range of pressure, telling acrylic from an unripe vegetable gets a good bit easier.
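As a rough illustration of the kind of machine-learning step Zhao describes, here is a toy nearest-centroid classifier over waveguide attenuation features. Every number below is invented for the example; real reference profiles would come from calibrated swipes of the sensorized finger:

```python
# Hypothetical reference profiles: (mean attenuation in dB, reading variation)
# gathered from past swipes of each material. Values are made up.
REFERENCE_PROFILES = {
    "sponge":          (0.8, 0.30),
    "acrylic":         (2.5, 0.05),
    "silicone rubber": (1.6, 0.12),
}

def classify_material(mean_db, variation):
    """Return the reference material whose profile is closest
    (squared Euclidean distance) to the new swipe's features."""
    def dist(profile):
        m, v = profile
        return (m - mean_db) ** 2 + (v - variation) ** 2
    return min(REFERENCE_PROFILES, key=lambda name: dist(REFERENCE_PROFILES[name]))

# A new swipe with low attenuation and high variation looks spongy:
print(classify_material(0.9, 0.28))   # → sponge
```

A production system would use many more features and a trained model, but a nearest-centroid lookup captures the basic idea of matching a swipe against known materials.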
Improving the design so it is even cheaper and easier to manufacture is a focus, too. One goal, Zhao said, is to ensure components can be fabbed via 3D printing onsite, which would allow doctors to create custom fit prosthetic hands.
How it works
But the essentials of this innovation already seem to be in place, say the researchers. At its core is the use of waveguides as the hand’s primary sensing and signal-transport system, the same role the nerves in your fingertips play in your sense of touch.
Like your nerves, these flexible components literally bend and flex whenever the soft hand’s fingers come into contact with something, even slightly. Because bending changes how much light can pass through the waveguide, a simple photodiode measures the change in light intensity and reports what it finds to the host software. It really is as simple as that.
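That signal chain can be sketched in a few lines. Everything here, from the baseline reading to the `threshold_db` noise floor, is an illustrative assumption rather than the team’s actual implementation:

```python
import math

def attenuation_db(baseline, sample):
    """Light lost relative to the resting (unbent) reading, in decibels."""
    return 10.0 * math.log10(baseline / sample)

def contact_events(samples, baseline=1.0, threshold_db=0.5):
    """Scan a stream of photodiode readings and return the indices where
    the light drop suggests the finger touched something.

    threshold_db is a hypothetical noise floor, not a value from the paper.
    """
    return [i for i, s in enumerate(samples)
            if attenuation_db(baseline, s) > threshold_db]

# A finger at rest, then pressing on an object, then releasing:
readings = [1.00, 0.99, 0.70, 0.65, 0.98]
print(contact_events(readings))   # → [2, 3]
```

The host software’s job reduces to exactly this: watch for meaningful drops in transmitted light and interpret them as touch.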
You will find far more detail about how the Gentle Bot hand works in the Cornell paper the researchers published this week, which I’ve embedded in full at the end of this post.
But if you want to hear more about a few of the edgy applications Gentle Bot technology will find in artificial intelligence, virtual reality, augmented reality and other uses — and the trippy sci-fi vision the Cornell researchers have for Gentle Bot in the long term — read on.
What’s next for Gentle Bot?
You don’t have to read much of the Cornell research paper (it’s embedded below) to come up with all kinds of imaginative and even edgy potential applications for ultra-sensitive robotic hands that look, feel and work like a real human hand, or anything close to one.
As tech writers like me have been fond of saying for years now, long before an innovation makes commercial sense, it’ll debut for military uses — usually in the US or Israel. For what it’s worth, the Cornell research paper, embedded below, notes the project is funded by the Air Force Office of Scientific Research (AFOSR).
And after the military, so the old saw goes, comes sex. This is a sector that won’t have to think too hard to find all manner of profitable uses for super-sensitive robotic hands. It’s just as easy, though, to imagine Gentle Bot tech applications showing up in all manner of exciting AI, VR and AR-related entertainment, apps and games, too.
Medical prosthetics are certainly one of the first commercial applications the creators of Gentle Bot are eyeing. In their research report, the team writes that “our experiments also demonstrate the high resolution of our sensors … as well as their compatibility with soft robotic systems.”
“The application we explored in this work could potentially provide sophisticated grasping and sensory feedback for advanced, custom prosthetics at low cost.”
An early debut in manufacturing and on assembly lines seems to be in the cards, too, judging by the theme of the team’s demonstration video and write-up.
After all, many big food packing plants already use primitive soft robotic grips to sort and pack fragile food items.
All these commercial and medical uses aside, though, the researchers behind Gentle Bot say they have far bigger, loftier goals in mind here.
And about these Cornell researchers, let me tell you: They sure dream big. Ultimately, they envision their tech inspiring entire innervated, bio-inspired robots, robots that will someday explore the farthest reaches of Earth and even the universe.
I’m Gina Smith and this is RobotRepublic.
Here’s the Cornell research paper on the Gentle Bot soft hand tech.