What if we could throw our sight, hearing, touch, and even sense of smell to distant locales and experience these places in a more visceral way?
So we wondered what would happen if we were to tap into the worldwide community of gamers and use their skills in new ways. With a robot working inside the deep freezer room, or in a standard manufacturing or warehouse facility, remote operators could remain on call, waiting for it to ask for assistance if it made an error, got stuck, or otherwise found itself incapable of completing a task. A remote operator would enter a virtual control room that re-created the robot’s surroundings and predicament. This person would see the world through the robot’s eyes, effectively slipping into its body in that distant cold storage facility without being personally exposed to the frigid temperatures. Then the operator would intuitively guide the robot and help it complete the assigned task.
To validate our concept, we developed a system that allows people to remotely see the world through the eyes of a robot and perform a relatively simple task; then we tested it on people who weren’t exactly skilled gamers. In the lab, we set up a robot with manipulators, a stapler, wire, and a frame. The goal was to get the robot to staple wire to the frame. We used a humanoid, ambidextrous robot called Baxter, plus the Oculus VR system. Then we created an intermediate virtual room to put the human and the robot in the same system of coordinates—a shared simulated space. This let the human see the world from the point of view of the robot and control it naturally, using body motions. We demoed this system during a meeting in Washington, DC, where many participants—including some who’d never played a video game—were able to don the headset, see the virtual space, and control our Boston-based robot intuitively from 500 miles away to complete the task.
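The "intermediate virtual room" works because the operator's tracked motions and the robot's body can be expressed in one shared coordinate frame. A minimal sketch of that idea is a rigid-body transform that maps a point from the operator's headset frame into the robot's base frame; the function names and the calibration numbers below are hypothetical, not the actual system's API.

```python
import math

def make_transform(yaw_deg, tx, ty, tz):
    """4x4 homogeneous transform: a rotation about the vertical (z) axis
    followed by a translation. Enough to line up two frames on a floor."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    return [
        [c,  -s,  0.0, tx],
        [s,   c,  0.0, ty],
        [0.0, 0.0, 1.0, tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

def apply(T, point):
    """Map a 3-D point through a homogeneous transform."""
    x, y, z = point
    v = (x, y, z, 1.0)
    return tuple(sum(T[r][c] * v[c] for c in range(4)) for r in range(3))

# Hypothetical calibration: the robot's base sits 2 m in front of the
# operator's tracking origin, facing back toward the operator (180° yaw).
T_robot_from_headset = make_transform(180.0, 2.0, 0.0, 0.0)

# A hand-controller position reported in the operator's frame...
controller = (0.3, 0.1, 1.2)
# ...becomes a reaching target expressed in the robot's own frame.
target = apply(T_robot_from_headset, controller)
```

Once every tracked pose passes through a transform like this, "move your hand forward" in the operator's living room and "move the gripper forward" in the cold-storage facility become the same instruction.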
The best-known and perhaps most compelling examples of teleoperation and extended reach are the robots NASA has sent to Mars in the last few decades. My PhD student Marsette “Marty” Vona helped develop much of the software that made it easy for people on Earth to interact with these robots tens of millions of miles away. These intelligent machines are a perfect example of how robots and humans can work together to achieve the extraordinary. Machines are better at operating in inhospitable environments like Mars. Humans are better at higher-level decision-making. So we send increasingly advanced robots to Mars, and people like Marty build increasingly advanced software to help other scientists see and even feel the faraway planet through the eyes, tools, and sensors of the robots. Then human scientists ingest and analyze the gathered data and make critical creative decisions about what the rovers should explore next. The robots all but situate the scientists on Martian soil. They are not taking the place of actual human explorers; they’re doing reconnaissance work to clear a path for a human mission to Mars. Once our astronauts venture to the Red Planet, they will have a level of familiarity and expertise that would not be possible without the rover missions.
Robots can allow us to extend our perceptual reach into alien environments here on Earth, too. In 2007, European researchers led by J.L. Deneubourg described a novel experiment in which they developed autonomous robots that infiltrated and influenced a community of cockroaches. The relatively simple robots were able to sense the difference between light and dark environments and move to one or the other as the researchers wanted. The miniature machines didn’t look like cockroaches, but they did smell like them, because the scientists covered them with pheromones that were attractive to other cockroaches from the same clan.
The goal of the experiment was to better understand the insects’ social behavior. Generally, cockroaches prefer to cluster in dark environments with others of their kind. The preference for darkness makes sense—they’re less vulnerable to predators or disgusted humans when they’re hiding in the shadows. When the researchers instructed their pheromone-soaked machines to group together in the light, however, the other cockroaches followed. They chose the comfort of a group despite the danger of the light.
These robotic roaches bring me back to my first conversation with Roger Payne all those years ago, and his dreams of swimming alongside his majestic friends. What if we could build a robot that accomplished something similar to his imagined capsule? What if we could create a robotic fish that moved alongside marine creatures and mammals like a regular member of the aquatic neighborhood? That would give us a phenomenal window into undersea life.
Sneaking into and following aquatic communities to observe behaviors, swimming patterns, and creatures’ interactions with their habitats is difficult. Stationary observatories cannot follow fish. Humans can only stay underwater for so long. Remotely operated and autonomous underwater vehicles typically rely on propellers or jet-based propulsion systems, and it’s hard to go unnoticed when your robot is kicking up so much turbulence. We wanted to create something different—a robot that actually swam like a fish. This project took us many years, as we had to develop new artificial muscles, soft skin, novel ways of controlling the robot, and an entirely new method of propulsion. I’ve been diving for decades, and I have yet to see a fish with a propeller. Our robot, SoFi (pronounced like Sophie), moves by swinging its tail back and forth like a shark. A dorsal fin and twin fins on either side of its body allow it to dive, ascend, and move through the water smoothly, and we’ve already shown that SoFi can navigate around other aquatic life forms without disrupting their behavior.
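The fish-like motion described above comes down to oscillating the tail rather than spinning a propeller. A toy sketch of such a controller is a sinusoidal tail deflection whose amplitude and frequency set the swimming speed, with a constant side bias to turn; the parameter values here are illustrative, not SoFi's actual gait numbers.

```python
import math

def tail_angle(t, amplitude_deg=30.0, freq_hz=1.4, turn_bias_deg=0.0):
    """Tail deflection (degrees) at time t (seconds): a sinusoid, plus an
    optional constant offset that biases the stroke to one side to turn."""
    return turn_bias_deg + amplitude_deg * math.sin(2.0 * math.pi * freq_hz * t)

# Straight swimming: the stroke sweeps symmetrically about the centerline.
straight = [tail_angle(t / 100.0) for t in range(100)]

# Turning: the same stroke, shifted toward one side by a constant bias.
turning = [tail_angle(t / 100.0, turn_bias_deg=15.0) for t in range(100)]
```

In a real soft robot the commanded angle would drive the actuators that flex the tail, and diving or ascending would be layered on through the fins, but the core rhythm is just this back-and-forth beat.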
SoFi is about the size of an average snapper and has taken some lovely tours in and around coral reef communities in the Pacific Ocean at depths of up to 18 meters. Human divers can venture deeper, of course, but the presence of a scuba-diving human changes the behavior of the marine creatures. A few scientists remotely monitoring and occasionally steering SoFi cause no such disruption. By deploying one or several realistic robotic fish, scientists will be able to follow, record, monitor, and potentially interact with fish and marine mammals as if they were just members of the community.