Your robot butler is coming. You know, eventually.
Today’s robots are nothing like the ones in the movies. Here’s what it would take to get there.
Dan Robitzski • May 10, 2017
Even Star Wars' R2-D2, which is basically a garbage-can-shaped supercomputer, can understand and express emotions better than any existing robot. [Image Credit: flickr user Gordon Tarpley | CC BY 2.0]
If you’re anything like me, you can’t wait until the sci-fi dream of having an android do all of your busywork becomes reality. In the movies, robots walk around just like people, but the robots we have in real life pale in comparison. I set out to learn how far away we are from getting advanced, sentient machines. I spoke to Jizhong Xiao, who heads the robotics program at the City College of New York, about the robots he’s developed, what makes them different from the androids of Star Wars, and what it would take to reach that next level.
Our phone connection wasn’t the best, so feel free to follow along with the transcript of our conversation below.
Dan: A new Star Wars movie comes out every year nowadays. And like many other science fiction films, they feature robots way beyond what we can build. They think, they feel — the highly-coordinated machines are basically just people that happen to be made of metal. And in some ways, this technology is a distant pipe dream. But in other ways, robots can already one-up humans. Is it possible that we’re actually not that far off from getting our own personal android butlers? Reporter Dan Robitzski set out to learn what it would take to close the sci-fi gap and launch our robotics into the future.
C-3PO: Master Luke, sir. Pardon me for asking, but what should R2 and I do if we’re discovered here?
Luke Skywalker: Lock the door.
Han Solo: And hope they don’t have blasters.
C-3PO: That isn’t very reassuring.
Dan: That’s C-3PO, an android from Star Wars. In the movies, he bumbles about with the other characters, getting into trouble. And he provides some comic relief with his blunt and unfiltered hysterics. He’s fully autonomous and conveys emotions. You can really only tell that he’s a robot because he can talk to computers, he’s great with numbers, and his joints are a little stiff. But outside of the movies, the robots we have today look completely different. They have specific, narrow jobs, and for the most part require someone with a remote controller to guide them.
Xiao: In my introduction to robotics class, I always have the first slide ask, “What is a robot?”
Dan: I’m joined on the phone by Jizhong Xiao. He’s the head of the robotics program within the engineering school at the City College of New York.
Dan: And what do your students say?
Xiao: Actually, my slide says “Hollywood imagination.” Star Wars, R2-D2, even California’s governor, the Terminator. I always show these pictures.
Dan: So people’s assumptions about robots are based on Hollywood’s imagination. But Xiao is well-versed in the current state of robotics. After all, he and the students in his lab have developed quite a few of their own. In particular, he’s developed several robots that can navigate through their surroundings.
Their main focus is on what they call the City-Climber. It’s the first climbing rover that can traverse rough and bumpy surfaces. It looks sort of like the Roomba that you might have vacuuming your floor, if your Roomba could climb brick walls and slide along your ceiling.
Xiao mentions that these climbers could be used for all sorts of purposes, like search-and-rescue missions or construction. The City-Climber can do jobs that would normally put people at risk by requiring them to climb up precarious expanses of scaffolding. Now, a remote-controlled robot can do it for them.
Xiao is working towards improving his designs so that the robots can be fully autonomous. He’s already developed this technology in another project of his, the CityFlyer, which looks like the conventional quadrotor drones that seem to be everywhere nowadays. His autonomous quadrotors film their surroundings to create a dynamic, three-dimensional map of their environment in real time, which they use to avoid obstacles.
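For readers following along who want to see the shape of that idea in code, here’s a bare-bones sketch. It’s a toy illustration, not Xiao’s software, and the voxel size and function names are my own assumptions: obstacle points from the drone’s cameras get dropped into a 3D occupancy grid, and a planner checks that grid before committing to a straight-line path.

```python
# A toy sketch of the mapping idea behind the CityFlyer, not Xiao's code.
# The voxel size and function names here are my own assumptions.
import numpy as np

RESOLUTION = 0.25  # meters per voxel (assumed)
occupied = set()   # sparse 3D map: voxel indices observed to be blocked

def integrate_depth_points(points_xyz):
    """Mark the voxel containing each sensed obstacle point as occupied."""
    for p in points_xyz:
        occupied.add(tuple(np.floor(np.asarray(p) / RESOLUTION).astype(int)))

def is_path_clear(start, end, steps=50):
    """Sample a straight line through the map; False if any voxel is blocked."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    for t in np.linspace(0.0, 1.0, steps):
        voxel = tuple(np.floor((start + t * (end - start)) / RESOLUTION).astype(int))
        if voxel in occupied:
            return False
    return True

# The drone senses a wall of obstacle points two meters ahead...
integrate_depth_points([(2.0, y, z) for y in np.arange(-1, 1, 0.2)
                                    for z in np.arange(0, 1, 0.2)])
# ...and the planner now rejects a straight flight through it.
print(is_path_clear((0, 0, 0.5), (4, 0, 0.5)))  # prints False
```

A real system like the CityFlyer pairs this kind of map with localization, so the drone also knows where it sits inside the map it’s building.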
That’s great for a little machine beeping around and bumping into my ankles, but I want to know when I can get that robot butler.
Xiao: Can we ever reach that sophisticated level? The answer is yes, partially yes. We have robots that look a lot like humans. For example, they have eyes, elbows … And some of the functions are better than humans.
Dan: Xiao gives the example of Boston Dynamics, a company well known for producing videos of its sophisticated robots that can walk, jump, balance, you name it.
Xiao: You kick it on the icy ground, and it can still balance. … Human beings, we cannot. (chuckling)
Dan: While these robots can replace or improve upon human abilities, they’re not quite the same as the hyperintelligent, sentient androids we’ve come to expect. The physical capabilities are there, but something is still lacking. Even current research into robots shielded by synthetic or 3D-printed skins, not unlike the Terminator, hasn’t produced machines as human as we would expect. So what are we missing?
Xiao: We have a term, artificial intelligence. … The intelligence, I don’t think it’s as good as a human being. Right now we can reach the intelligence level of a baby.
Dan: Xiao says modern robots can be as intelligent as a baby, but they’re far worse at something else: detecting emotions. Xiao describes how easy it is for us to observe that someone is smiling. Programming a robot to recognize a smile, and then to recognize its emotional salience, is a near-Herculean task. It’s that emotion and intelligence we see in the robots of science fiction that make them seem so different from the technology we have today.
The problem comes from the way robots have traditionally been programmed, with every behavior spelled out by hand in code. But a technique called deep learning may help propel robotic technologies forward. Instead of having to write extensive programs, engineers using deep learning can let robots teach themselves from examples.
Dan: So it sounds like if we wanted to get robots as advanced as we see in the movies, we would need to better understand how the brain works and better integrate that into the technology we have.
Xiao: Yes, that’s it.
Dan: And while Xiao insists that the software behind it isn’t anything horribly complex, he says it takes a great deal of effort to jumpstart the process. If we wanted a robot to detect a smile, it would need to be presented with a vast amount of what he calls smile data before it could recognize what smiles look like.
Xiao: You need to collect lots of images from websites, and then you need to label them. You need to get the data so the robot can read it. So you need huge data, a lot of data.
Dan: It’s like traveling to a new place. You need to look around to get your bearings. But once you’ve been there a few times, it becomes familiar. This is the challenge with deep learning. Robots need to be presented with smile after smile before they can detect one on their own. This is what Xiao is currently doing with his City-Climbers, except instead of smiles, he’s teaching them to find signs of structural damage in the buildings they inspect.
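To make that concrete, here’s a minimal sketch of that kind of training loop, written in Python with the PyTorch library. It’s illustrative rather than Xiao’s actual setup: the images are random stand-ins for labeled smile photos and the network is tiny, but the principle is the one he describes: show the model labeled example after labeled example and let it adjust itself.

```python
# A minimal sketch of deep learning's "huge data" requirement, using PyTorch.
# Illustrative only: random tensors stand in for labeled smile photos.
import torch
import torch.nn as nn

# Stand-in dataset: 256 fake 64x64 grayscale "face" images,
# each labeled 1 (smiling) or 0 (not smiling).
images = torch.randn(256, 1, 64, 64)
labels = torch.randint(0, 2, (256,)).float()

# A small convolutional network, the same family of model used for
# real image classification, just scaled way down.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                       # 64x64 -> 32x32
    nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                       # 32x32 -> 16x16
    nn.Flatten(),
    nn.Linear(16 * 16 * 16, 1),            # one logit: smile vs. no smile
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# "Smile after smile": each pass over the labeled data nudges the
# weights toward whatever pattern separates the two labels.
for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images).squeeze(1), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

With real smile photos in place of the random tensors, and far more of them, this same loop is what slowly turns “huge data, a lot of data” into a working detector.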
Dan: So basically, the hardware can already do some really impressive things, but it’s the emotion and the intelligence that separate our robots from the ones we see in movies.
Xiao: Yes.
Dan: And do you think that level of robotic intelligence is possible?
Xiao: Uh…
Dan: Down the road?
Xiao: (laughing) Down the road?
Dan: Unfortunately, Xiao couldn’t really say when I could finally have a robotic butler all my own. But after I pressed him a bit, he suggested that we were maybe fifty years away from having truly emotionally intelligent robots like C-3PO.
Reporting in New York, I’m Dan Robitzski, hotshot ace reporter for scienceline.org.