By Maya Flores and Polly Ouellette
Robotics Ph.D. students Lakshmi Nair and Joanne Truong use creative problem-solving research to inform robotics that could help with household tasks
In 1970, an oxygen tank aboard the Apollo 13 mission to the moon exploded, crippling the spacecraft and threatening the astronauts’ ability to return to Earth. The astronauts used their ingenuity and the materials on board to rig a makeshift solution that gave them enough clean air and power to make it home.
The famous Apollo 13 rescue is an example of creatively solving a problem using only the resources at hand. This approach is often called “MacGyvering” – after the 1980s TV series MacGyver – and is one of the primary inspirations for the work of engineer Lakshmi Nair, who is pursuing her Ph.D. in Robotics in the School of Electrical and Computer Engineering here at Tech.
Nair’s research focuses on creative problem-solving. In particular, she is developing algorithms that allow robots to use the objects around them to complete a given task.
That’s why she and her colleagues in computing created the “RoboGyver,” a robot that combines object-recognition algorithms with insights from the psychology of creative thinking to build the tools it might need to complete a task, such as a hammer, scoop, or pincers.
RoboGyver, a shiny black robotic arm mounted on a table and equipped with a pincer at its end, uses a 3D camera to assess the objects on the table, determine their characteristics, and analyze which might fit together to form the tool it has been instructed to build. Once RoboGyver has decided which objects to use, it attempts to attach them to construct the tool.
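The matching step described above, pairing perceived parts against the requirements of a requested tool, can be sketched in a few lines. This is a toy illustration only: the part names, shape labels, and requirement table are all invented for the example, while the actual system works from 3D camera data.

```python
# Hypothetical sketch of RoboGyver-style tool construction: given parts
# observed by perception, pick a (head, handle) pair whose coarse shapes
# match the requested tool. Everything here is illustrative, not the
# actual system.

from dataclasses import dataclass

@dataclass
class Part:
    name: str
    shape: str        # coarse shape class from perception, e.g. "concave", "rod"
    attachable: bool  # whether the gripper can fasten it to another part

# Assumed requirements: each tool needs a head shape and a handle shape.
TOOL_REQUIREMENTS = {
    "hammer": {"head": "heavy", "handle": "rod"},
    "scoop":  {"head": "concave", "handle": "rod"},
}

def build_tool(tool: str, parts: list) -> "tuple | None":
    """Return (head, handle) part names for the requested tool, or None."""
    req = TOOL_REQUIREMENTS[tool]
    heads = [p for p in parts if p.shape == req["head"] and p.attachable]
    handles = [p for p in parts if p.shape == req["handle"] and p.attachable]
    if heads and handles:
        return heads[0].name, handles[0].name
    return None  # nothing on the table fits: a "comical incident" averted
```

When no compatible pair exists, the sketch returns `None`; the real robot, as Nair notes, sometimes tries and fails anyway.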
Nair says she has seen her fair share of comical incidents in which RoboGyver tried and failed to combine objects into a tool. Now, though, when she instructs RoboGyver to build a certain tool, the robot can do it with some regularity. In the future, Nair hopes to take the project one step further: simply assign the robot a task and let it figure out the rest.
Projects like RoboGyver are helping engineers figure out how to make robots more adaptable. Many existing robots perform very limited and repetitive tasks, such as manufacturing a product on an assembly line. A robot that is able to creatively solve problems would be able to perform more diverse tasks and could even be implemented in the home.
“Even as humans, we do so many creative things around the house,” said Nair, who imagines a world with household robots to help out with the chores. “Many times, we might use objects in very unconventional ways. We probably don't consciously realize it. I think it makes sense for something like a household robot to have those capabilities.”
There’s still a long way to go, and Nair is doubtful that we will see a household robot in our lifetimes. For robotics to advance, she says, hardware and software must co-evolve: even if RoboGyver’s algorithms improve rapidly, its sensors must also become more sophisticated.
Many households around the world already have some robots – smart devices and autonomous vacuums. It might be decades before we have a handy household robot to help with the cooking and cleaning, but the creative algorithm research Nair is doing will get us one step closer.
Twenty years ago, voice-activated personal assistants seemed like far-off fantasy, only seen in comic books and science fiction. Now, we have Siri on our phones, Cortana on our computers, and Alexa in our homes. One Georgia Tech engineer is trying to take personal assistants even further.
A Ph.D. student in the School of Electrical and Computer Engineering, Joanne Truong is forging her own path in the field of robotics.
“I’m really trying to bring all the advancements that Artificial Intelligence (AI) has seen into the robotics world,” Truong said. “Sure, Alexa can answer all sorts of questions, but we don’t have anything that can really act in the environment and do things for us now. I think it would be really cool if we could have Alexas with limbs so we could ask her things like: ‘Hey, is my computer in my room? If it is, please go and get it for me.’”
In her current research project, Sim2Real, Truong is working with a team to teach robots to learn real-life skills via simulation. The project hopes to address one of the biggest obstacles in robotics today: creating robots that can learn.
“We don’t want to hardcode everything onto the robot; we want to teach the robot to learn on its own,” said Truong. “Robots learn through experience. For example, if it moves a certain way and gets to its goal, it remembers that. But teaching this way requires years of experience, which is infeasible with a real robot. We leverage simulation to make things faster and more affordable.”
By 3D scanning a real environment and creating a virtualized replica in simulation, the Sim2Real project lets researchers teach a robot a skill much faster. Instead of physically setting up a robot, having it run the course, and then resetting the experiment, researchers can simply press a button and run the course virtually. In addition, multiple simulations can run in parallel, speeding up learning even further. The knowledge from the simulations is then loaded onto a physical robot.
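The speed-up from parallel simulation can be illustrated with a toy model. Nothing below comes from the actual Sim2Real code: the one-dimensional “room,” the random policy, and the frame counter are all stand-ins for the idea that N simulators stepped together yield N frames of experience per wall-clock step.

```python
# Toy illustration of parallel simulation: several copies of a tiny 1D
# room are stepped in lockstep, so each wall-clock step yields one frame
# of experience per environment. Purely illustrative.

import random

class Room1D:
    """Agent starts at 0 and must reach `goal` by moving +1 or -1."""
    def __init__(self, goal: int = 3):
        self.goal, self.pos = goal, 0

    def step(self, action: int) -> bool:
        self.pos += action            # action is +1 or -1
        return self.pos == self.goal  # True once the goal is reached

def run_parallel(num_envs: int, steps: int) -> int:
    """Step every environment `steps` times; return total frames collected."""
    envs = [Room1D() for _ in range(num_envs)]
    frames = 0
    for _ in range(steps):
        for env in envs:
            env.step(random.choice([-1, 1]))
            frames += 1
    return frames

# Eight simulators stepped 1,000 times gather 8,000 frames of experience,
# eight times what a single physical robot could collect in the same time.
```

Real simulators also run each step far faster than real time, which is how the billions of frames Truong mentions become reachable in days.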
“We can now teach a robot to navigate from one point to another in about three days,” Truong said. “The robot collects billions of frames of experience, which would take years to do on a physical robot.”
The team also works with a PyRobot, which looks like a Roomba vacuum with a 360-degree arm mounted on top. One of the skills Truong and her team have taught the bot is point-goal navigation, in which it moves from a starting point in a room to a goal point completely on its own.
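Learning from experience as Truong describes it (“if it moves a certain way and gets to its goal, it remembers that”) is the core of reinforcement learning. The sketch below is a deliberately tiny, hypothetical version of point-goal navigation: tabular Q-learning on a one-dimensional corridor, with every parameter chosen for illustration rather than taken from the actual experiments.

```python
# Minimal hypothetical sketch of point-goal navigation learned from
# experience: tabular Q-learning on a 1D corridor. Reaching the goal
# raises the value of the moves that led there, i.e. the robot
# "remembers what worked."

import random

def train(length: int = 6, goal: int = 5, episodes: int = 500) -> list:
    """Learn a policy (best action per cell) for reaching `goal` from cell 0."""
    q = [[0.0, 0.0] for _ in range(length)]  # Q[state][action]: 0=left, 1=right
    for _ in range(episodes):
        state = 0
        while state != goal:
            # Mostly act greedily, sometimes explore a random move.
            if random.random() < 0.2:
                action = random.randrange(2)
            else:
                action = max((0, 1), key=lambda a: q[state][a])
            nxt = max(0, min(length - 1, state + (1 if action == 1 else -1)))
            reward = 1.0 if nxt == goal else -0.01  # small cost per step
            # Q-learning update: remember how good this move turned out.
            q[state][action] += 0.5 * (reward + 0.9 * max(q[nxt]) - q[state][action])
            state = nxt
    return [max((0, 1), key=lambda a: q[s][a]) for s in range(length)]
```

After training, the learned policy chooses “right” in every cell before the goal; the real task adds vision, mapping, and a physical body on top of this bare idea.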
While that may seem like a simple feat, Truong explains just how difficult teaching robots simple tasks can be.
“Figuring out all of the things that are required for a robot to understand the sentence ‘Go to the kitchen and get me some orange juice’ is extremely complex,” Truong said. “That simple command has so much more to it. The robot must understand the words you are saying, the reasoning behind what you want it to do, what object you want it to grab, and where in the environment it needs to go. It also needs to understand that orange juice is usually found in the kitchen, therefore it needs to map and learn how to get to the kitchen, needs to learn how to open the fridge and get the orange juice and all that.”
Looking to the future of her field, Truong is hopeful. “I want to make household robotics a reality for everyone. My future is a world where things are more automated and efficient.”
Last revised May 14, 2020