Robotic hands are getting more touchy-feely
A group of scientists at Cornell University has created a way for soft robotic hands to feel their surroundings internally, much like we do.
The group was led by Robert Shepherd, assistant professor of mechanical and aerospace engineering and principal investigator of the Organic Robotics Lab. Their paper detailed how stretchable optical waveguides can act as sensors in a soft robotic hand.
The paper, titled “Optoelectronically Innervated Soft Prosthetic Hand via Stretchable Optical Waveguides,” was published in Science Robotics.
Huichan Zhao, lead author of the paper, commented on the findings in a blog post:
“Most robots today have sensors on the outside of the body that detect things from the surface. Our sensors are integrated within the body, so they can actually detect forces being transmitted through the thickness of the robot, a lot like we and all organisms do when we feel pain, for example.”
The Organic Robotics Lab’s hand uses light to feel its surroundings
The research group used a four-step lithography process to produce the core, through which light propagates, and the cladding, which serves as the outer surface of the waveguide and also houses the LED and the photodiode.
Because the hand is built this way, the more it deforms, the more light the core loses. The photodiode detects this loss of light, which is how the hand senses its surroundings.
Shepherd explained how the hand registers these changes: “If no light was lost when we bend the prosthesis, we wouldn’t get any information about the state of the sensor. The amount of loss is dependent on how it’s bent.”
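As a rough illustration of that principle, the sketch below converts a photodiode reading into an optical power loss in decibels and interpolates a bend estimate from a calibration table. Everything here is hypothetical: the function names, calibration values, and readings are invented for illustration, not taken from the paper, which only reports that loss grows with bending.

```python
import math

# Hypothetical calibration table of (optical loss in dB, bend angle in degrees).
# The paper measured loss against controlled bending; these numbers are invented.
CALIBRATION = [(0.0, 0.0), (0.5, 15.0), (1.2, 30.0), (2.5, 45.0), (4.0, 60.0)]

def power_loss_db(baseline: float, reading: float) -> float:
    """Loss relative to the straight-waveguide baseline, in decibels.

    A lower photodiode reading means more light escaped the core,
    i.e. more deformation.
    """
    return 10.0 * math.log10(baseline / reading)

def estimate_bend(loss_db: float) -> float:
    """Linearly interpolate a bend angle from the calibration table."""
    if loss_db <= CALIBRATION[0][0]:
        return CALIBRATION[0][1]
    for (l0, a0), (l1, a1) in zip(CALIBRATION, CALIBRATION[1:]):
        if loss_db <= l1:
            t = (loss_db - l0) / (l1 - l0)
            return a0 + t * (a1 - a0)
    return CALIBRATION[-1][1]  # saturate beyond the calibrated range

baseline = 1.00   # photodiode reading with the finger straight
reading = 0.60    # reading after bending: 40% of the light is lost
loss = power_loss_db(baseline, reading)
print(f"loss = {loss:.2f} dB -> estimated bend of about {estimate_bend(loss):.0f} degrees")
```

This is the core of Shepherd’s point: a perfectly lossless waveguide would tell you nothing, while a calibrated loss curve turns the missing light into a readable measure of how far the finger has bent.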
The group used the prosthesis to perform tasks such as probing and grasping, recording the various shapes and textures it encountered. The prosthesis was even able to probe a set of tomatoes and determine which one was the ripest.
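As a toy illustration of that ripeness test, the snippet below compares probe readings from three tomatoes pressed with the same actuation pressure. The readings and the mapping from loss to softness are our assumptions for illustration, not figures from the study.

```python
# Hypothetical ripeness check: press each tomato with the same actuation
# pressure and compare how much the probing finger's own waveguide deforms.
# Assumption (ours, not the paper's): a riper, softer tomato yields under the
# fingertip, so the finger bends less and its waveguide loses less light.
probe_loss_db = {"tomato_1": 2.1, "tomato_2": 0.8, "tomato_3": 1.4}

ripest = min(probe_loss_db, key=probe_loss_db.get)
print(f"Softest (likely ripest) tomato: {ripest}")
```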
Featured image and video via Cornell University