Conference explores ethical concerns as technology advances
The recent human-computer romance film Her and the 1940s-era I, Robot short stories may have seemed far-fetched to audiences, but, according to philosophers who have studied the issue, similar situations may not be far off.
“This is real work that’s occurring right now,” said John P. Sullins, introducing a discussion titled “Sex, Virtue and Robots” at a conference this week at the University of Delaware. “It’s not science fiction.”
Sullins, an ethics professor at Sonoma State University, was referring to a team that is working to add artificial intelligence to the RealDoll product, a customizable, life-size sex doll that The New York Times says has sold more than 5,000 units since 1996. Sullins showed the audience at the UD conference a brief video in which RealDoll creator Matt McMullen demonstrates the work of his team of robotics engineers as they seek to make the doll increasingly lifelike, with a stated goal of creating a “genuine bond between man and machine” through emotional and not just physical connections.
Sullins’ talk and a related panel discussion were part of a four-day conference, “Computer Ethics: Philosophical Enquiry,” that took place at Clayton Hall Conference Center on UD’s Newark campus. It was the first international conference jointly sponsored by the International Society for Ethics and Information Technology and the International Association for Computing and Philosophy.
In the “Sex, Virtue and Robots” session, panelists and members of the audience raised concerns about a possible future in which highly realistic “sexbots” are widely available. For example, asked Charles Ess of the University of Oslo, if humans have sexual relations with robots, do they become more like robots themselves?
“Robots can imitate love,” Ess said. “We can build robots that can trick us into thinking they care for us. But it is a trick.”
Panelist Shannon Vallor of Santa Clara University questioned whether the use of sexbots would make it more difficult for people to have emotionally mature human relationships and possibly undermine the ability to develop such virtues as empathy, patience and caring for others.
In response to a comment from the audience, Ess said the use of such robots would not necessarily be bad in all situations and might, for example, reduce the exploitation that sex workers now face.
“It’s not science fiction anymore,” he said. “We need to be aware of the possible problems as well as the possible benefits.”
Other topics at the conference included ethical issues involved in big data, personal health technologies, battlefield robots, tracking devices and self-driving cars.
In one of several keynote addresses, “Getting a Handle on Big Data Ethics,” Deborah Johnson, professor of applied ethics at the University of Virginia, discussed recent research in which Facebook collaborated with scientists at Cornell University to manipulate and analyze the emotions of the site’s users.
That controversial research leads to other concerns about “the futuristic, ultimate consequences of unbridled big-data analytics,” Johnson said. Some marketing experts, she said, use the term “neuromarketing” for the practice of analyzing and predicting consumers’ behavior based solely on what they’ve done in the past, without asking them their opinions or preferences.
She called for greater public accountability for social media and other data-collecting companies.
Powers, who also holds appointments in the School of Public Policy and Administration and at the Delaware Biotechnology Institute, will be working during fall semester and Winter Session with Jean-Gabriel Ganascia at a Sorbonne university in Paris.
Ganascia specializes in informatics, and Powers will collaborate with researchers in computer science, philosophy and machine learning on a project titled “Autonomous Agents and Ethics,” sponsored by the French National Research Agency.
– Article by Ann Manser
*Source: University of Delaware