Creators of artificial intelligence measure how well machines can imitate human qualities like empathy, listening, affirmation, and love. Don't reciprocate, says Sherry Turkle. Turkle's latest book is "Reclaiming Conversation: The Power of Talk in a Digital Age" (http://goo.gl/CFFWGq).
Read more at BigThink.com: http://bigthink.com/videos/sherry-turkle-on-the-dangers-of-emotional-interactions-with-robots
Follow Big Think here:
YouTube: http://goo.gl/CPTsV5
Facebook: https://www.facebook.com/BigThinkdotcom
Twitter: https://twitter.com/bigthink
Transcript - I have very strong feelings about a future in which robots become the kind of conversational agents that pretend to have emotional lives. Shortly after I finished writing Reclaiming Conversation I was interviewed for an article in The New York Times about Hello Barbie. So Hello Barbie comes out of the box and says, now I'm just paraphrasing the gist: Hi, I'm Hello Barbie. I have a sister. You have a sister. I kind of hate my sister. I'm jealous of your sister. Do you hate your sister? Let's talk about how we feel about our sisters. In other words, it just kind of knows stuff about you and is ready to talk about the arc of a human life and sibling rivalry as though it had a life, a mother, the feelings of jealousy about a sister, and was ready to relate to you on that basis. And it doesn't. It's teaching pretend empathy. It's asking you to relate to an object that has pretend empathy.
And this is really not, in my view, a good direction for AI to go. There are so many wonderful things for robots to do. Having pretend empathy, having pretend conversations about caring and love and things that a robot can feel about its body and about its life and about its mother and its sister, gets children, and gets elders, who are the other target group for these robots, into a kind of fantasy miasma that is not good for anybody. Children don't need to learn pretend empathy; they need to learn real empathy, which they get from having real conversations with real people who do have sisters, who do have mothers. And I think this is a very dangerous and indeed very toxic direction.
We worry so much about whether we can get people to talk to robots. You know, can you get a child to talk to this Hello Barbie? Can you get an elderly person to talk to a sociable robot? But what about who's listening? There's nobody listening. These robots don't know how to listen and understand what you're saying. They know how to respond. They're programmed to make something of what you say and respond, but they don't know what it means if you say, my sister makes me feel depressed because she's more beautiful than I am and I feel that my mother loves her more. That robot really does not do anything useful for you with that information. That's not empathy. And children need to know that they're being heard by a human being that can do this empathy game with them, this empathy dance with them. So in the middle of a time when we're having this crisis in empathy, to imagine that now we're going to throw in some robots that will do some pretend empathy, I have to say that in all the optimism of my book this is the pessimistic part, and I really end the book with a kind of call to arms. I call it: what do we forget when we talk to machines? And I mean it to be literally a call to arms that this is not a good direction. We don't need to take this direction; we just need to not buy these products. This doesn't take a social revolution; this just takes consumers saying that they're not going to buy these products, that they're not bringing them into their homes.