You can lead a virtual robot to a refrigerator, but you can't make it take out a drink.
ManipulaTHOR adds a highly articulated robotic arm to the Allen Institute for AI's AI2-THOR artificial-intelligence platform, making it far more useful for testing robot software even before the robots themselves are built.
Earlier versions of AI2-THOR could find their way through virtual indoor environments, such as kitchens and bathrooms, and use computer vision to locate everyday objects, but the model didn't delve into the mechanics of moving those objects. Instead, it simply made them levitate, like magic in a video game.
Now AI2-THOR is getting real.
“Imagine a robot capable of navigating a kitchen, opening a refrigerator, and pulling out a can of soda,” AI2 CEO Oren Etzioni said in a press release. “This is one of the biggest and yet often overlooked challenges in robotics, and AI2-THOR is the first to design a benchmark for the task of moving objects to various locations in virtual rooms, which allows reproducibility and measures progress.”
Etzioni said it has taken five years to get AI2-THOR to this point.
“Now we can begin to train robots to perceive and navigate the world in a way more similar to how we do, making real-world usage models more achievable than ever,” he said.
Kiana Ehsani, an AI2 research scientist who worked on ManipulaTHOR, said the improved model could help train robots to assemble manufactured goods in factories, sort packaged goods in warehouses, or even prepare for space missions.
“This can be generalized to any of that,” Ehsani told GeekWire. “Not at the moment… but we think of this environment as a framework that can allow researchers to develop models for any type of object manipulation. It doesn’t have to be just in kitchens, or just indoor scenes, or just in homes.”
ManipulaTHOR’s virtual robotic arm is designed to simulate the capabilities of the Kinova Gen3 Modular Robotic Arm, a commercially available product that has six degrees of freedom.
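Each of those degrees of freedom is a joint that the control software has to coordinate. As a rough illustration (a simplified planar arm, not ManipulaTHOR's actual kinematics), here is how chaining joint rotations determines where the gripper ends up:

```python
import math

def forward_kinematics(joint_angles, link_lengths):
    """Compute the end-effector (x, y) of a planar arm by chaining
    each joint's rotation. Each joint angle is one degree of freedom."""
    x = y = 0.0
    total_angle = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        total_angle += angle  # each joint rotates relative to the previous link
        x += length * math.cos(total_angle)
        y += length * math.sin(total_angle)
    return x, y

# With all joint angles at zero, the arm lies fully extended along the x-axis.
print(forward_kinematics([0.0, 0.0, 0.0], [1.0, 1.0, 1.0]))  # (3.0, 0.0)
```

More joints mean more ways to reach the same point, which is what lets an articulated arm thread past obstacles instead of moving in a straight line.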
Researchers can program the virtual arm in AI2-THOR 3.0 to remove obstructions from its path, grab the objects that need to be manipulated, and move them as they would move in the real world.
If the articulated arm swings the wrong way and the robot crashes into a virtual faucet at the kitchen sink, artificial-intelligence researchers can adjust their software to prevent that from happening in the real world. More importantly, computer models running on AI2-THOR 3.0 should be better prepared to deal with novel situations.
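The planning side of that loop can be sketched in miniature. The grid, the obstacle set, and the `plan_path` helper below are illustrative inventions, not AI2-THOR's API; the point is the idea of finding a collision-free route around an obstacle, like that faucet, before moving:

```python
from collections import deque

def plan_path(start, goal, obstacles, size=5):
    """Breadth-first search over a grid: find a collision-free path
    from start to goal, treating obstacle cells as off-limits."""
    frontier = deque([[start]])
    seen = {start}
    while frontier:
        path = frontier.popleft()
        x, y = path[-1]
        if (x, y) == goal:
            return path
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < size and 0 <= ny < size
                    and (nx, ny) not in obstacles and (nx, ny) not in seen):
                seen.add((nx, ny))
                frontier.append(path + [(nx, ny)])
    return None  # no collision-free route exists

# Route the gripper around a "faucet" obstacle at cell (1, 1).
path = plan_path(start=(0, 0), goal=(2, 2), obstacles={(1, 1)})
print(path)
```

A real simulator plans in continuous 3D space with full arm kinematics, but the principle is the same: treat collisions as forbidden states and search for a path that avoids them.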
“We showed that if you train on a subset of the scenes we have, and then take this robot and put it in a totally new environment that it has never seen, it can still avoid obstacles and bring objects to the target location,” Ehsani said.
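That evaluation protocol, training on some scenes and testing in held-out ones, amounts to a simple split over the scene list. The `FloorPlan` naming and the `split_scenes` helper here are illustrative assumptions, not AI2-THOR's actual tooling:

```python
import random

def split_scenes(scenes, train_fraction=0.8, seed=0):
    """Shuffle the scene list and hold out a test set of rooms
    the model never sees during training."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = scenes[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

# Hypothetical AI2-THOR-style scene names for a set of 20 kitchens.
kitchens = [f"FloorPlan{i}" for i in range(1, 21)]
train, test = split_scenes(kitchens)
print(len(train), len(test))  # 16 4
```

Success on the held-out scenes is what distinguishes genuine generalization from memorizing the layouts seen in training.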
Now that ManipulaTHOR has been launched, Ehsani and her AI2-THOR teammates are inviting researchers to participate in the RoboTHOR Challenge 2021, which is being held in conjunction with the Embodied AI Workshop at June's Conference on Computer Vision and Pattern Recognition.
“The challenge will be in the simulation, and the task will be to move to an object, lift it, and then move it to the target location … without interfering with the rest of the room,” Ehsani said.
Of course, the litmus test will come when real-world robots are programmed using the computer models developed for AI2-THOR’s virtual robots. Even before the launch of ManipulaTHOR, AI2 planned to do that as part of last year’s RoboTHOR Challenge, but due to the coronavirus pandemic, real-world testing had to be postponed.
“That’s on the list for sure, hopefully in the very near future,” Ehsani said.