3 Cutting-edge robotic technologies in the works right now

As robots take on a bigger role in extending human capabilities – in the same way eyeglasses or wheelchairs do – robotic systems need to evolve further. The goal of many researchers is to achieve highly developed systems that can help people be better, stronger and faster – at a reasonable price.

“The number of people with partial impairments is very large and continues to grow,” said Conor Walsh, a roboticist at Harvard University who is developing soft robotics technologies. “For example, these include people who are aging or have suffered a stroke. Overall, about 10 percent of individuals living in the U.S. have difficulty walking. That’s a tremendous problem when you think about it.”

The National Science Foundation (NSF) has funded numerous projects across the country to help make these technologies come to life, and to ensure they are reliable, durable, comfortable and personalized to users.

Check out these three cutting-edge technologies funded by the NSF.

Robots for the blind

One robotic system being developed by roboticists at Carnegie Mellon University assists people with visual impairments in their day-to-day travels. For many of them, one of the big challenges in navigating complex buildings and transit stations is that there is not enough funding to provide human assistance to those who need it at all times of day and across a whole building or space.

“Assistive robots can extend the reach of employees and service providers so visitors can receive help 24/7 anywhere in the building,” said Aaron Steinfeld, NSF-funded roboticist at Carnegie Mellon University.

Co-robots will empower people with disabilities to safely travel and navigate unfamiliar environments. (Image Credit: Carnegie Mellon University)

Steinfeld and his team are designing cooperative robots, or co-robots, to allow people with disabilities to safely travel and navigate unfamiliar environments. The team focuses on information exchange, assistive localization, and urban navigation, looking for new ways for robots and humans to interact.

“For a person who is blind, navigation needs are slightly different than those who are sighted,” said Steinfeld.

For example, a common way to provide directions to someone who is blind is to trace a map on the person’s hand. In this case, the researchers found that people feel more comfortable doing this with a robot than a stranger because there is no social awkwardness.

“In our experience, people who are blind are very willing to interact with a robot, to touch its arms and hands,” Steinfeld added.

In a transit station scenario, robots could provide intelligent, personalized assistance to travelers with disabilities, freeing up Metro personnel for more complicated tasks better-suited to humans.

Controlling robots with eye motion

A gaze-controlled robotic system that works in three dimensions to enable people with motor impairments to fetch objects using eye movement. (Image Credit: Xiaoli Zhang/ Colorado School of Mines)

An important element in robot-human interaction is that of anticipation. Assistive technologies are learning to “read” humans and respond to their needs in more sophisticated ways.

Xiaoli Zhang, an engineer at Colorado School of Mines, is working on a gaze-controlled robotic system that works in three dimensions to allow people with motor impairments to gather objects just by looking at them.

So, for example, if a person needs his or her smartphone, the robot can fetch it.

If a person wants to pick up a cup or smartphone, the natural thing to do is to look at it first. Zhang studies how people use their eyes to express intentions, then uses that data to fine-tune a system that controls robotic movement through eye motion.

“We think gaze is unique because it is a naturally intuitive way for how people interact with the world,” said Zhang. “If you’re thirsty, you look for a bottle of water. You need to look at it first before you manipulate it.”

Other existing gaze-based systems rely mainly on dwell time – the amount of time someone looks at an item.

Zhang, though, is researching a pattern-based system that factors in more than gaze time. Blink rate and pupil dilation, for example, are closely related to people’s intent to manipulate an object.
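To make the idea concrete, a pattern-based intent detector combining those features might look something like this. The features (dwell time, blink rate, pupil dilation) come from the article; every weight and threshold below is invented for illustration and does not describe Zhang’s actual system.

```python
def intent_score(dwell_time_s, blink_rate_hz, pupil_dilation_pct):
    """Combine gaze features into a rough 0-1 'intent to manipulate' score.

    dwell_time_s: seconds the gaze has rested on the object
    blink_rate_hz: blinks per second during the fixation
    pupil_dilation_pct: pupil size change vs. baseline (0.1 = +10%)

    All weights and saturation points are hypothetical.
    """
    dwell = min(dwell_time_s / 2.0, 1.0)                   # saturates after 2 s
    blink = max(0.0, 1.0 - blink_rate_hz / 0.5)            # fewer blinks -> more intent
    pupil = min(max(pupil_dilation_pct, 0.0) / 0.2, 1.0)   # saturates at +20%
    return 0.5 * dwell + 0.2 * blink + 0.3 * pupil


def wants_object(dwell_time_s, blink_rate_hz, pupil_dilation_pct, threshold=0.6):
    """True if the combined score crosses a (hypothetical) decision threshold."""
    return intent_score(dwell_time_s, blink_rate_hz, pupil_dilation_pct) >= threshold
```

A long, steady fixation with dilated pupils would score high and trigger the robot, while a brief glance would not – which is the advantage of a pattern-based approach over dwell time alone.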

More natural means of communication between humans and robots will be necessary before such systems can be integrated into daily life.

Robotic convenience

Roboticists are also working on ways to make assistive technologies more convenient.

 Soft robotic exosuit and soft robotic glove to restore or enhance human movement. (Image Credit: Wyss Institute at Harvard University)

Walsh’s project also includes the development of a soft robotic exosuit and soft robotic glove, both dedicated to restoring and enhancing human movement.

“It comes down to: ‘How do we apply as much force as possible in the most comfortable way?'” said Walsh.

Walsh compares the assistance from his soft robotic suits to the little push someone receives while swinging on a swing.

“As someone is walking, we give them a little boost to walk farther, walk longer. If you want to go to the local store to buy something, put on a robotic suit to walk around. If you want to cook dinner, put on a glove that helps you be more dexterous,” said Walsh.
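The “push on a swing” idea is essentially about timing: assistance is applied only during part of each stride. A minimal sketch of that logic might look like the following – the window boundaries, force profile and peak value are all invented for illustration and are not taken from Walsh’s exosuit design.

```python
import math


def assist_force(gait_phase, peak_newtons=100.0, start=0.3, end=0.6):
    """Return an assistive force for a gait phase in [0, 1).

    The force ramps up and down as a half-sine inside the [start, end)
    window (roughly mid-stance through push-off in this sketch) and is
    zero elsewhere -- like a well-timed push on a swing.
    """
    if not (start <= gait_phase < end):
        return 0.0  # outside the assist window: the suit stays passive
    window_pos = (gait_phase - start) / (end - start)  # 0..1 inside window
    return peak_newtons * math.sin(math.pi * window_pos)
```

Because the force is zero for most of the stride, the wearer moves naturally and only feels a brief, well-timed boost – the same reason a small push at the right moment keeps a swing going.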

He focuses on minimalist, user-friendly systems that incorporate relatively new components in robotics: textiles, silicone and hybrid materials. (His lab is home to about seven sewing machines.)

These three projects aim to support a new generation of robots – ones that don’t look like conventional robots – tailored to the people who need assistance the most.
