Dr. Caroline West, a senior lecturer in philosophy at the University of Sydney, says we should already be thinking about what will happen when humanoid robots develop the ability to reason and integrate into society. If humanoids become as intelligent and as capable of feeling as humans, should they be given the same rights? The question cuts to the heart of what a “person” is.
“It could happen tomorrow, it could happen in 50 years, it could happen in 100 years,” says Professor Mary-Anne Williams, head of the innovation and research lab at the University of Technology Sydney. “People and animals are just chemical bags, chemical systems, so there’s no technical reason why we couldn’t have robots that truly have AI.”
Professor Williams believes a unique form of robotic emotion could even evolve one day. “You could argue some robots can mimic [emotions] already,” she says. “But because a robot will experience the world differently to us it will be quite an effort for the robot to imagine how we feel about something.”
“One of the things we’ll want robots to do is communicate. But in order to have a conversation you need the capability to build a mental model of the person you’re communicating with. And if you can model other people or other systems’ cognitive abilities then you can deceive.”
Humans generally anticipate how another person might feel about something by imagining how it would affect themselves. People who lack this capacity for empathy are sometimes characterized as psychopathic.
“I think there is a danger of producing robots that are psychopathic,” Prof Williams says.
Of course, Isaac Asimov famously formulated his Three Laws of Robotics to prevent robots from harming humans, but Professor Williams says this is easier said than done, especially when robots trained to kill have already been deployed on the battlefield in Iraq.
“You need a lot of cognitive capability to determine harm if you’re in a different kind of body. What will we do when we have to deal with entities … who have perceptions beyond our own and can reason as well as we can, or potentially better?”
Dr. Caroline West says, “If something is a person then it has serious rights, and what it takes to be a person is to be self-conscious and able to reason. If silicon-based creatures get to have those abilities then they would have the same moral standing as persons. Just as we think it’s not okay to enslave persons, so it would be wrong to enslave these robots if they really were self-conscious.”
Via TechNewsWorld.