Artificial (emotional) intelligence
Design is the key to health care robots finding success in our sympathetic nature
The nurses met the robot during their coffee break. It was lime-green, squat with soft edges. Its designers called it RobCab, and showed them the blood sample trays inside its body cavity before sending it on its first retrieval. The hospital staff signed study consent forms while they waited, and joked about RobCab’s resemblance to a French cartoon character. The robot returned a few minutes later and delivered its first specimen: a box of chocolates.
Sara Ljungblad was there as part of a research team studying human-robot interactions in the small Swedish hospital in 2011. She remembers the nurses’ reactions well – the chocolate didn’t buy RobCab any fast friends. “In the beginning, they didn’t really appreciate having it there,” says Ljungblad. The robot was an alien, an object, a piece of equipment, she says, and the nurses saw it as a potential obstacle.
But the study continued, and RobCab quietly went about ferrying blood samples. Relieved of that simple chore, the nurses had more time for individual care. RobCab was awkward, but not clumsy. It was also professional and competent, and maybe even a little cute. One nurse nicknamed the robot “Kermit.” Mostly, the robot gave them space. RobCab won them over, and it wasn’t with chocolate.
Experiments like these are just one way robots are being brought into health care, and doctor and nurse shortages in the U.S. have made the field ripe for innovation. The first robotic health care application emerged in the mid-1980s to make prostate surgery less invasive. Using only a single, small incision to insert remote-controlled tools, surgeons could better navigate the removal of the sequestered prostate gland.
Other innovations followed, especially in primary care, which offers a plethora of readily automated tasks. In fact, many non-surgical robots are creating new opportunities for empathic engagement. They are becoming assistants, caregivers, and confidants. Some advocates even believe that robots have the potential to restore a degree of empathy that disappeared decades ago with the house call.
Robot designers have found many niches to capitalize on this intangible asset. Dementia patients now find relief with cooing, cuddling robotic companions. In Japan, a smiling bear face and soft touch take the edge off a brutish automaton that lifts patients in and out of bed. And elite specialists now provide care and knowledge from afar with an expanding array of video conferencing robots.
In the hands of a physician who knows how to use it properly, a well-designed robot creates the possibility of a renewed connection between patient and doctor, according to Dr. Venktesh Ramnath, a California physician and robotic health care entrepreneur. Dr. Ramnath connects with patients from a distance, using telemedicine, one of the earliest uses of robotic health care. Personal care, he believes, is one of the most basic parts of medicine.
“After all the drugs and pharmaceuticals, all you are left with is the human touch and the expression of empathy and care from a provider to a patient,” he says, and he sees robots as a way to foster this care. Dr. Ramnath uses the RP-7, manufactured by InTouch Health, to give bedside care to dozens of rural clinics from his office in the San Francisco Bay area. He uses features like camera zooming and tilting to make diagnostic decisions without relying on a nurse to move the camera or reposition the patient. This potential isn’t without its drawbacks: The RP-7 can cost upwards of $150,000, a sum still prohibitive for many hospitals.
Other companies are making affordability a primary goal. Aethon has introduced a transport robot similar to RobCab that costs around $35,000. For some hospitals, this is preferable to hiring a human. VGo Communications, a competitor of InTouch, sells its lightweight droid for only $6,000.
Improved surgical techniques, often using robots, are less invasive and create less trauma and blood loss. As a result, patients can often be discharged the same day. Boston Children’s Hospital started using the VGos 18 months ago, and has already sent them home with more than 80 kids. Dr. Hiep Nguyen, a pediatric urologist at Boston Children’s Hospital, is in charge of the robots. In addition to considerable savings, he says, patients heal faster at home.
Each home stay starts with a five-minute post-operative tutorial, and lasts around three days. In addition to remote check-ups, the children also create a customized avatar that reminds them when to go to the bathroom, when to eat, and what to eat. “If you forget to take your medicine, the avatar gets sick; if you eat, the avatar gets better,” says Dr. Nguyen. And when Dr. Nguyen takes the robot back, he says they leave the avatar behind by transferring it to a computer or mobile device. “It’s the return of the house call,” he says.
The VGos’ bodies are spare, ribbon-shaped, and child-sized. Dr. Nguyen powered up one that had a SpongeBob SquarePants sticker on it, and introduced it as Gary. As he drove Gary away from its docking station, Dr. Nguyen demonstrated that the frail-looking robot is remarkably stable and maneuverable. It can spin on a dime, doesn’t get easily tangled in debris on the floor, and its heavy base makes it hard to tip. The controls are simple: clicking the top, bottom, or sides of the screen moves the robot forward, in reverse, or into a turn. On his office computer, Dr. Nguyen has streaming, high-definition video directly from Gary’s camera.
Design features like RobCab’s penguin-inspired shape, RP-7’s gaze fixing, and VGo’s diminutive form do a lot to help create empathetic attachments. Dr. Nguyen didn’t anticipate how effective this connection would be. His patients ended up telling him – through the robots – things they wouldn’t normally tell a doctor. They dress the VGos up and give them names – like Gary. One girl even asked Nguyen if she could have surgery again. “When I asked her why, she replied, ‘So I can keep Gary’.”
These beneficial effects are at least partly rooted in the natural tendency of humans to project personalities and emotions onto robots, says Jodi Forlizzi, who studies robot design at Carnegie Mellon University in Pittsburgh, Pa. “It doesn’t take a lot of social cues for people to start filling in a personality,” she says. The place people encounter the robots, she says, also has a lot to do with what personality gets projected. In a study involving a transport robot similar to RobCab, Forlizzi observed that caregivers in a cancer ward would kick and curse at it, projecting the stress of their surroundings. However, the same robot received decorations and serenades when it went through the maternity ward.
Ljungblad had similar experiences in Sweden. There, RobCab was tested in an orthopedic ward. “When you get out of surgery, you don’t want to get run over by a robot,” she says. It didn’t help that the robot’s reaction time lagged, even though it had sensors to tell when someone was in its path. “You got a bit unsure if the robot would move away or not,” she says. This led to some traffic jams, and eventually people started giving it a wide berth. In addition to a robot’s primary function, Ljungblad says designers need to consider interactions with patients, staff, cleaning personnel and visitors.
Despite these growing pains, robotic applications in health care have few limits. Researchers are already integrating diagnostic tools, improving navigation, and developing ways for robots to access digitized records. And bigger innovations, like scanners that can analyze your breath for disease, wait in the wings. While other professionals may fear for their jobs when the bell tolls for automation, physicians like Dr. Nguyen see these droids as a resource. “I think it only makes doctors more accurate. I was working with one brain. Now I’m working with four!” he says, with a sweeping gesture at his robots.