(Image: Sven Hoppe/dpa/PA)
Not Like Us is our monthly column exploring the minds of intelligent machines – and how we live with them
Even if you don’t care about robots, the story of hitchBOT’s demise might seem a sad one.
Early on the morning of 1 August, hitchBOT was two weeks into an ambitious journey from Massachusetts to San Francisco. Over the past year, it had successfully hitchhiked its way around Canada, Germany and the Netherlands. Drivers would spot a child-sized gizmo standing on the side of the road: plastic body, brightly coloured galoshes, rubber gloves with a thumb in the air. The bot would ask for a ride. Many said yes.
That Saturday morning, hitchBOT was left on a bench in Philadelphia to wait for its next ride. A few hours later, it was found lying on the ground, headless, with its arms ripped out. It was too far gone to continue the journey. The remains were shipped back to its creators at Ryerson University in Canada.
Going by the media coverage – on public radio, evening news, newspapers, blogs, magazines as diverse as Sports Illustrated and People – hitchBOT’s troubled end was no mere technology story. The bot had been “murdered”, “abandoned”, “decapitated”, “brutally slain”. Footage of its destruction – what looked like surveillance video of a man stomping hitchBOT on a deserted city street – picked up thousands of views before being exposed as fake.
Wave of sentiment
A Philadelphia maker group offered to help put the bot back together. “I am ashamed of my city,” said one local writer. “We’re better than this,” said another. These were sentiments that would have been more familiar in a human obituary, or at least that of a beloved animal.
Whatever empathy a person might have felt for the bot, it certainly wasn’t capable of feeling anything back. So why the outpouring of affection for a machine?
It’s a question worth pondering. A life lived in close quarters with robots is no longer a subject for science fiction. They are stationed in homes, hospitals and shipyards. But we don’t yet have any established etiquette or moral code for dealing with them.
“The technology is being developed and integrated before we fully understand the long-term psychological ramifications,” says researcher Julie Carpenter. “How we treat robots and how they affect us emotionally – we’re not really sure what’s going to happen.”
Carpenter specialises in the relationships between people and robots, particularly US military personnel who disarm bombs with the help of bots.
The technicians seem to develop a bond with their robots, lending them human characteristics and naming them, much like a proud owner might a new car. Some name them after movie stars or girlfriends, or refer to them as “he” or “she”. One technician called his robot Fido “cuz it was like a dog”.
But the bond extended beyond names. Robots were said to be “like a team member” or to have individual quirks that set them apart from other bots. When a robot blew up, one technician said, it wasn’t anywhere close to being on the same level as a buddy of yours getting wounded or seeing a team member getting taken out. But another described a complicated mix of emotions: anger, melancholy, and the relief that it was a robot rather than a human that got hurt – and “a sense of loss from something happening to one of your robots. Poor little fella.”
What inspires this bond between humans and their robots? Carpenter says maintenance – the act of taking care of the robot – is a factor. Then there’s time: the more of it spent together, the more it creates a sense of shared experience and nostalgia.
Perhaps to create that bond, many robots are explicitly designed to resemble pets. This can bring unexpected complications. Last year, Sony closed its last repair centre for Aibo, its robotic dog – leaving owners in Japan with malfunctioning pets in the lurch. Some meet regularly to let their ailing dogs play together and discuss repair tips. Others have held funerals for broken Aibos. “They are important members of my family,” said one man.
The Aibo owners’ mourning suggests that they found real happiness in their bond with their robots. That happiness can be a boon for people in need of a companion. Research has suggested, for example, that Paro the therapeutic seal, another interactive robot, reduces anxiety and loneliness in its elderly owners.
That bond can also be put to practical use, says Carpenter. If you like the robot responsible for making you take medicine, you’ll be less likely to turn it off. In the military, a bond with your bomb-disposing robot could make you work more productively. “It’s going to make the interactions more natural, and probably more effective,” she says.
And then there’s the window it opens into our minds. The way someone acts around injured robots can tell us a little bit about who they are themselves, like a reverse of the Voight-Kampff test in the film Blade Runner that helped distinguish humans from lifelike machines, says Kate Darling at the Massachusetts Institute of Technology Media Lab.
That was the implication of a study presented at the Symposium on Robot and Human Interactive Communication in Kobe, Japan. Darling and her colleagues asked 101 people to strike a toy robot bug with a mallet. Those who’d received higher empathy scores on a personality test tended to hesitate several seconds longer before doing it.
Abuse is abuse
Darling thinks that robot abuse could one day provide early warnings for something more worrying. “We know that these behaviours tend to translate in other cases. A case of animal abuse in a household where there are children, that automatically triggers a child services investigation,” she says. “I think that logically extends to robots that appear lifelike as well. It might actually be a good test.”
Feeling concern for injured robots doesn’t necessarily mean that our feelings rival or even surpass those for humans. In a study published last year, researchers at the University of Duisburg-Essen in Germany showed people videos of a human and a robot being treated either affectionately or violently. During the affectionate videos, brain reaction appeared the same – but during the violent ones, their brains lit up differently, indicating that human suffering upset them more than robot suffering.
That hasn’t stopped a few brave souls from standing up for robots like hitchBOT. An organisation in Seattle called the American Society for the Prevention of Cruelty to Robots insists that genuine artificial intelligence deserves the right to “existence, independence, and the pursuit of greater cognition”.
Last month, a drunk man in Japan was arrested for angrily kicking a robot in a SoftBank store. Rather than simply charging such a person with damaging property, maybe in the future, authorities will launch a wider-ranging investigation of his behaviour.