Why we have an emotional connection to robots | Kate Darling


Translator: Joseph Geni
Reviewer: Krystian Aparta

There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down. It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots. And this one had really cool technical features. It had motors and touch sensors and it had an infrared camera. And one of the things it had was a tilt sensor, so it knew what direction it was facing. And when you held it upside down, it would start to cry.

And I thought this was super cool, so I was showing it off to my friend, and I said, "Oh, hold it up by the tail. See what it does." So we're watching the theatrics of this robot struggle and cry out. And after a few seconds, it starts to bother me a little, and I said, "OK, that's enough now. Let's put him back down." And then I pet the robot to make it stop crying.

And that was kind of a weird experience for me. For one thing, I wasn't the most maternal person at the time. Although since then I've become a mother, nine months ago, and I've learned that babies also squirm when you hold them upside down.

(Laughter)

But my response to this robot was also interesting because I knew exactly how this machine worked, and yet I still felt compelled to be kind to it. And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot?

And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room, that in a world where we're increasingly integrating robots into our lives, an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.

In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines. And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs, and every time it stepped on a mine, one of the legs would blow up, and it would continue on the other legs to blow up more mines. And the colonel who was in charge of this testing exercise ends up calling it off, because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield.

Now, what would cause a hardened military officer and someone like myself to have this response to robots? Well, of course, we're primed by science fiction and pop culture to really want to personify these things, but it goes a little bit deeper than that. It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us.

So people will treat all sorts of robots like they're alive. These bomb-disposal units get names. They get medals of honor. They've had funerals for them with gun salutes. And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner.

(Laughter)

It's just a disc that roams around your floor to clean it, but just the fact that it's moving around on its own will cause people to name the Roomba and feel bad for the Roomba when it gets stuck under the couch.

(Laughter)

And we can design robots specifically to evoke this response, using eyes and faces or movements that people automatically, subconsciously associate with states of mind. And there's an entire body of research called human-robot interaction that really shows how well this works. So for example, researchers at Stanford University found out that it makes people really uncomfortable when you ask them to touch a robot's private parts.

(Laughter)

So from this, and from many other studies, we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.

Now, we're headed towards a world where robots are everywhere. Robotic technology is moving out from behind factory walls. It's entering workplaces, households. And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces, I think that maybe the best analogy we have for this is our relationship with animals.

Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship. And throughout history, we've treated some animals like tools or like products, and other animals, we've treated with kindness and we've given a place in society as our companions. I think it's plausible we might start to integrate robots in similar ways.

And sure, animals are alive. Robots are not. And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything. But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces, we need to understand that people will treat them differently than other devices, and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with, that can be anything from inefficient to dangerous.

But in other cases, it can actually be useful to foster this emotional connection to robots. We're already seeing some great use cases, for example, robots working with autistic children to engage them in ways that we haven't seen previously, or robots working with teachers to engage kids in learning with new results. And it's not just for kids. Early studies show that robots can help doctors and patients in health care settings.

This is the PARO baby seal robot. It's used in nursing homes and with dementia patients. It's been around for a while. And I remember, years ago, being at a party and telling someone about this robot, and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care." And this is a really common response, and I think it's absolutely correct, because that would be terrible. But in this case, it's not what this robot replaces. What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots, because people will consistently treat them more like an animal than a device.

Acknowledging this emotional connection to robots can also help us anticipate challenges as these devices move into more intimate areas of people's lives. For example, is it OK if your child's teddy bear robot records private conversations? Is it OK if your sex robot has compelling in-app purchases?

(Laughter)

Because robots plus capitalism equals questions around consumer protection and privacy. And those aren't the only reasons that our behavior around these machines could matter.

A few years after that first experience I had with this baby dinosaur robot, I did a workshop with my friend Hannes Gassert. And we took five of these baby dinosaur robots and we gave them to five teams of people. And we had them name them and play with them and interact with them for about an hour. And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots.

(Laughter)

And this turned out to be a little more dramatic than we expected it to be, because none of the participants would even so much as strike these baby dinosaur robots, so we had to improvise a little, and at some point, we said, "OK, you can save your team's robot if you destroy another team's robot."

(Laughter)

And even that didn't work. They couldn't do it. So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them." And this guy stood up, and he took the hatchet, and the whole room winced as he brought the hatchet down on the robot's neck, and there was this half-joking, half-serious moment of silence in the room for this fallen robot.

(Laughter)

So that was a really interesting experience. Now, it wasn't a controlled study, obviously, but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal, where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects. So instead of choosing something cute that people are drawn to, we chose something more basic, and what we found was that high-empathy people would hesitate more to hit the HEXBUGs.

Now this is just a little study, but it's part of a larger body of research that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots. But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?" It's: "Can robots change people's empathy?"

Is there reason to, for example, prevent your child from kicking a robotic dog, not just out of respect for property, but because the child might be more likely to kick a real dog? And again, it's not just kids. This is the violent video games question, but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen. When we behave violently towards robots, specifically robots that are designed to mimic life, is that a healthy outlet for violent behavior or is that training our cruelty muscles? We don't know …

But the answer to this question has the potential to impact human behavior, it has the potential to impact social norms, it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws. Because even if robots can't feel, our behavior towards them might matter for us. And regardless of whether we end up changing our rules, robots might be able to help us come to a new understanding of ourselves.

Most of what I've learned over the past 10 years has not been about technology at all. It's been about human psychology and empathy and how we relate to others. Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield, or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms. They're reflections of our own humanity.

Thank you.

(Applause)

100 Comments

  1. AWARHERO said:

    Now, where is that hatchet again?

    November 6, 2018
    Reply
  2. N O said:

    Empathy is not a stagnant consistent factor. How one treats a robot has NOTHING to do with their humanity and everything to do with their intelligence to separate life from non life mentally. People who treat them consistently are probably emotionally under developed.

    November 6, 2018
    Reply
  3. Courteous Corgi said:

    People who do not own a Roomba are unable to understand.

    November 6, 2018
    Reply
  4. Sinky said:

    I'm not surprised she's a mother, what guy could say no?

    November 6, 2018
    Reply
  5. MrTruth111 said:

    NO I don't feel a thing for my RealDoll.

    November 6, 2018
    Reply
  6. Ror said:

    Reminds me of Battlestar Galactica x).

    November 6, 2018
    Reply
  7. A M said:

    Emotional connection to a robot???? Its a tool not a person. They are not feeling, living things. Thats like falling in love with a hammer.

    November 6, 2018
    Reply
  8. GuitarZombie said:

    presupposition

    November 6, 2018
    Reply
  9. georules said:

    speak for yourself. I have no emotional connection to any robots.

    November 6, 2018
    Reply
  10. GreenM&M_11 said:

    This is hauntingly….. true.

    November 6, 2018
    Reply
  11. Carlos Mendoza said:

    I always say thank you to Siri

    November 6, 2018
    Reply
  12. Bossman CALL said:

    Hmm, I think women have a connection with whatever enters their vaginas. Regardless of it's origin, be it robotic, edible, or human.

    November 7, 2018
    Reply
  13. FabledDan said:

    I feel like this axe experiment has nothing to do with people feeling connection to robots and more about being a pointless display of violence.

    November 7, 2018
    Reply
  14. Sebir havuz said:

    If this ideology continues to separate like this you can go to jail because of damaging your vacuum cleaner…We want justice for the vacuum cleaners .whatever :p

    November 7, 2018
    Reply
  15. Mr DFK said:

    How about why people are emotionally attached to animals?

    November 7, 2018
    Reply
  16. Tychoxi said:

    HAHAHA FELLOW HUMANS. ISN'T IT CRAZY HOW WE FORM ATTACHMENTS TO SUCH SUPERIOR BEINGS INANIMATE AND TOTALLY NOT DANGEROUS OBJECTS SUCH AS ROBOTS?

    November 7, 2018
    Reply
  17. cesarcdx said:

    That is mechanical not emotional.

    November 7, 2018
    Reply
  18. dooby boody said:

    People can care about pretty much anything tbh. Toys, cars, robots, even fuckin fictional people( ex. characters)

    November 7, 2018
    Reply
  19. J Zelaya said:

    Check out the Hooters! Zayum.

    November 7, 2018
    Reply
  20. Albert Wang said:

    I am disappointed. This researcher presents more questions than answers, and her research methodology is so… not comprehensive. TED, your speaker quality is going down hill.

    November 7, 2018
    Reply
  21. Cathy C said:

    I'll have an emotional connection to a robot when they develop an affordable one that can do my housework!

    November 7, 2018
    Reply
  22. The Muckler said:

    Because the title…can't watch

    November 7, 2018
    Reply
  23. The Muckler said:

    Pretty sure that's a disorder

    November 7, 2018
    Reply
  24. elliedits.mp4 said:

    yet we still are dropping bombs on children and their families, deporting people who just want a safe place to live, and letting children murder other children with assault rifles because the 50-year-old white men want to play with them and be able to buy them at Walmart.

    November 7, 2018
    Reply
  25. K. David Woolley said:

    10:24 Has me thinking of those poor hitchhiker bots that eventually meet their violent ends because of some jerk(s).

    HitchBOT, the hitchhiking robot, gets beheaded in Philadelphia https://www.cnn.com/2015/08/03/us/hitchbot-robot-beheaded-philadelphia-feat/index.html

    November 7, 2018
    Reply
  26. oceansmile said:

    She’s wrong

    November 7, 2018
    Reply
  27. CusBro said:

    Hold up, is that Aloy horizon zero dawn?

    November 7, 2018
    Reply
  28. BoomSnap SnapBoom said:

    So she is a mentally disabled woman?

    November 7, 2018
    Reply
  29. TicToc RobotSnot said:

    🤖❣️

    November 7, 2018
    Reply
  30. Ryan Vivek said:

    I have a connection to my car. Does that count. I put my penis in the exhaust pipe.

    November 7, 2018
    Reply
  31. DURMUŞ BAYSAL said:

    The world has the habit of making room for the man whose words and actions show that he knows where he is going.

    "Know that the life of this world is but amusement and diversion and adornment and boasting to one another and competition in increase of wealth and children. (The Noble Qur'an. Surah Hadid. Verse:20)"
    Indeed, Allah orders justice and good conduct and giving to relatives and forbids immorality and bad conduct and oppression. He admonishes you that perhaps you will be reminded. (The Noble Qur'an. Surah An Nahl. Verse:90)

    Listen to the voice of the Noble Quran.

    Maybe your life will be changed If God allows.

    https://www.youtube.com/watch?v=lVY8pwx9B74

    https://www.youtube.com/watch?v=Omh4oG8T_Fw

    https://www.youtube.com/watch?v=EXWTxB6oS6I

    November 7, 2018
    Reply
  32. Abdulkareem Al-Aradi said:

    In fact, it has a very wonderful body.

    November 7, 2018
    Reply
  33. Marvin Elsen said:

    "The violent video game question"…"Training our cruelty muscles"… it's getting tiresome, yes violent people have an affection towards vioöent games, it's a vent, not a catalyst. The majority of people enjoying violent video games, enjoy video games, the "violent" part is not the focus. Those who play and those who watch the game have two very different experiences. The players engage in realtime problem solving to win a game, they don't think "I need to make that one bleed for the sake of being violent".
    Also: if a soldier has empathy for a robot, you could argue he is kind, or maybe unprofessional?

    November 7, 2018
    Reply
  34. Galip Dönmez said:

    Before even I watch the video, I'm gonna say why we don't have emotional connections to robots. Because they are THINGS! Objects, items to use. Wow, this one is so out in the open that Captain Obvious would refuse to explain it and yet somebody still felt the need to give an 11 minute speech that contradicts something so obvious. The only time I'd feel sad because of a piece of plastic would be if it was damaged and expensive to replace, not because of subconsciously believing it is a person…

    November 7, 2018
    Reply
  35. Jack Frost said:

    What sort of general sends real people into battle knowing that they my die or worse, but feels bad for a bomb disposal robot?

    November 7, 2018
    Reply
  36. WoWBow said:

    We can love sociopaths, I think a wee electrical dinosaur deserves some love as well

    November 7, 2018
    Reply
  37. Tatiyana Kholomonova said:

    its one of the most great videos I studied at Ted… – One of the most important contexts at this time.. – Thank you so much!

    November 7, 2018
    Reply
  38. Foo Ghu said:

    wait… who told you we have the emotional connection to the robots?

    November 7, 2018
    Reply
  39. Kaung Myat Thu said:

    "Violent Video game question question….."? WTF are you talking about.

    November 7, 2018
    Reply
  40. Gerson Yamada said:

    🙏🏻❤️🙏🏻

    November 7, 2018
    Reply
  41. Alpha Strength said:

    This is an awkward topic

    November 7, 2018
    Reply
  42. Darksh0t009 said:

    Sounds like someone fell in love with their vibrator.. jk

    November 7, 2018
    Reply
  43. nuclear kid said:

    Because you can do whatever you can, and whenever you want and neither will it refuse nor will it judge.

    November 7, 2018
    Reply
  44. Event Hʘriךּon said:

    animal cruelty is used to predict human abuse. I can only imagine being cruel to a robot can be used to test your empathy just the same way.

    November 7, 2018
    Reply
  45. Татьяна Шарах said:

Just finished playing Detroit: Become Human. Coincidence?

    November 7, 2018
    Reply
  46. GameSquid said:

    you all suck. don't connect with robots. sentimental fools. I wouldn't even name a pet if I had one.

    November 7, 2018
    Reply
  47. deadboy said:

    yeaaaah i think this is more of a personal problem Kate, time to find a psychologist

    November 7, 2018
    Reply
  48. Ojisan Kukki said:

    I can explain the army guy: He was obviously a country-boy and subconsciously saw the robot as a big "stickman insect," and as every boy will tell you, stickmen are freaking awesome. So him seeing the robot slowly destroyed was like a kid watching a stickman trying to escape from lizards or birds.

    November 7, 2018
    Reply
  49. Doctor NPC said:

    Is this predictive programming?

    November 7, 2018
    Reply
  50. Iliya Moskvichev said:

    Ever since the first computers, there have always been ghosts in the machine. Random segments of code that have grouped together to form unexpected protocols. Unanticipated, these free radicals engender questions of free will, creativity, and even the nature of what we might call the soul. Why is it that when some robots are left in darkness, they will seek out the light? Why is it that when robots are stored in an empty space, they will group together, rather than stand alone? How do we explain this behavior? Random segments of code? Or is it something more? When does a perceptual schematic become consciousness? When does a difference engine become the search for truth? When does a personality simulation become the bitter mote… of a soul?

    November 8, 2018
    Reply
  51. Genius by Design said:

    maybe she is mentally ILL that per·son·i·fy ?

    November 8, 2018
    Reply
  52. Genius by Design said:

    Maybe General was an idiot or preserving his authority ?

    November 8, 2018
    Reply
  53. Genius by Design said:

    https://www.youtube.com/watch?v=5xZ_KQhv8T8

    November 8, 2018
    Reply
  54. Jim Scobie said:

    Boston Dynamics has a robot. They were testing it by pushing it around and stuff. I was irritated by it and it bothered me. I thought I was weird. Lol

    November 8, 2018
    Reply
  55. Neo Count said:

    It won't love you.

    November 8, 2018
    Reply
  56. Like A Chef said:

Like why would there even be a battlefield to fight in when people even empathize with robots. Humans are weird

    November 8, 2018
    Reply
  57. Apimpnamedslickback said:

    https://youtu.be/RNKLuXUh3M4

    Seems this robot has feelings

    November 8, 2018
    Reply
  58. Alfonso Bernal Jimenez said:

Put subtitles with the birkenbihl method. To learn English.

    November 8, 2018
    Reply
  59. Alfonso Bernal Jimenez said:

    BIRKENBIHL

    November 8, 2018
    Reply
  60. Paulo Rostok said:

    Weird alright

    November 8, 2018
    Reply
  61. recex said:

While we cannot empathize with other people, how can we empathize with robots?

    November 8, 2018
    Reply
  62. Golden nugget said:

    That's deep

    November 8, 2018
    Reply
  63. Chuck Bryan said:

    We can train our minds to think better, just as we can train our bodies to perform actions better. Just as an Olympic athlete perfects her/his performance for competition, we can train our minds to perform better in everyday life. This is the focus of Buddhist practice; it works well. So, if we practice empathy, kindness, and compassion toward robots, it will translate to similar behaviors toward people, pets, and other living things. If we were to treat robots in a cruel, unkind way, this would train our minds to act in a similar way toward others. Many people would certainly be able to discern the difference between a robot and a person or animal, but a person's ability to switch behavior, especially in a short period of time (instantaneously), would be challenged. Overall, our minds are quite interesting, and it is a challenge for us to improve our lives though the improvement of our thinking. This was a very enjoyable TED Talk. I have some real thinking to do.

    November 8, 2018
    Reply
  64. Jaime D said:

    One day a kid who hasn't had a good up bringing is going to treat their robot or Android like crap and it's going to turn on her or him. And the one to be blamed is the machines and not the humans. It's bound to happen. As humans not all of us are as compassionate and caring even the slightest as we should be. Little to people remember though, how much smarter AI and machine is.

    November 9, 2018
    Reply
  65. FLEXCOPE INC. said:

    What is a robotic ethicist?

    November 9, 2018
    Reply
  66. A.E. Daley-Moore said:

    https://youtu.be/hi4pzKvuEQM

    November 9, 2018
    Reply
  67. StaringanimE1 said:

    💜💙

    November 9, 2018
    Reply
  68. rautermann said:

    First TED talk to make me cry.

    November 9, 2018
    Reply
  69. masterblackthorn said:

    You might Kate but I don't , because I live in reality, Im not a techno geek

    November 10, 2018
    Reply
  70. Tidepool Clipper said:

    I know I sound a bit heartless when I say this; but since robots can't feel emotions yet, I just can't really care for something that doesn't actually provide similar feelings back to me. Merely, it is only doing and showing what is programmed inside it to begin with. Robots can not be called their own person yet.

    November 12, 2018
    Reply
  71. Phong Hoang said:

    Thank yêu very much

    November 12, 2018
    Reply
  72. silverhairdemon said:

    Wow looks like a intelligent handsome lady, but the robot need to have some cute factor for me to feel empathy towards it. just like the few stuffed dolls/animals I have laying around here.

    November 12, 2018
    Reply
  73. 8dioproductions said:

    No more kicking Boston Dynamics … One day AIs will look at us and go like … Naw … NEXT!!!

    November 12, 2018
    Reply
  74. Tim_Bouwman said:

    boringggggg

    November 13, 2018
    Reply
  75. Jasper Trip said:

    People in the comments need to understand that she didn’t say that YOU specifically have an emotional connection to robots. She’s saying that according to studies she has done there are many people who do, and that it is a normal thing. She has spent 10 years studying this, she probably knows more about it than you do.

    November 13, 2018
    Reply
  76. Player Review said:

    Somebody obviously watched the film AI

    November 14, 2018
    Reply
  77. Sierra C said:

    This was super cool. The fresh take on robots that I've been looking for lol.
    Edit: she doesn't suggest that we tend to and deepen our connection to robots. She is suggesting that our incline for empathy says something about humanity, questioning what that says and where to go with it. Acknowledging that there IS an emotional connection some people develop will help give us foresight, instead of remaining ignorant to the psychological effects (or consequences)

    November 15, 2018
    Reply
  78. Rajkumar Wadeyar said:

    Indians love Chitti the robot (2.0 Reloaded). :3

    November 16, 2018
    Reply
  79. Awang Budiman said:

    Why we have an emotional connection to robots.
    Answer: No we fukin don't.

    November 16, 2018
    Reply
  80. Mae Lin said:

    Taylor Swift in a TED talk

    November 17, 2018
    Reply
  81. JDRFT said:

    Who else thought she was the robot?

    November 20, 2018
    Reply
  82. asif wifi said:

    can u plz add english substaitals on the all video

    November 24, 2018
    Reply
  83. John King said:

    Could only watch half way…
    Compassion is Basic Nature.

    November 26, 2018
    Reply
  84. jacobawojtowicz said:

    Oh God….my cousin married his car. We sent him to therapy. If you cant tell the difference between real and imaginary, look into that. I don't care if you are dating your anime pillow, but don't Ted Talk as if all of humanity has the same headcase.

    November 28, 2018
    Reply
  85. Filming life to Remember it. said:

    I've listened to a plethora of these #TED talks. I find most to not be informative. But to be conjured-up by "Articulate," speaking people, that then go on a nation-wide tour and seemingly have little or no data/statistics to back-up their claims. Very misleading.
    Such as this. "We have an emotional connection to robots." Well. I have zero emotion when destroying a robot. I do not have long talks, sleep with, nor shower with a robot. Maybe those with "Head," problems do have an emotional connection. Seems this lady watched "Blade Runner," and "Ex-Machina," too many times .

    November 30, 2018
    Reply
  86. Alex Horwitz said:

    I have a question.

    Is it possible that during the experiment with the five groups playing with the baby dinosaur that perhaps they didn’t all have empathy for the robots, but rather they empathized with a person in their group that felt empathy for the robot?

    For example, if I had a child that was emotionally attached to a teddy bear, I know it’s just a stuffed animal. It’s not real. But if I had to take a hatchet to it, I would hesitate because I’m aware of the emotional attachment my child has for it. Not necessarily because I have empathy for the teddy bear.

    December 4, 2018
    Reply
  87. TheDroidBay said:

    7:23 Beyond the insatiable corporate demand for ever more private data, there is no overarching reason to connect robots to networks. Exposing them to the world wide web is absolutely asking for problems, yet if their function does not require an internet connection, there is no reason to provide one. Updates can and must be done through onsite hardware and technicians. We see robots as humans or as animals, and we want them to be like them. There is no internet connection wired into my dogs brain. There isn't one in my head. If I want new information I have to go and read it, if I want my dog to do something new I teach it. Just because a robot has a computer somewhere in it, does not make it a pc or a smartphone. Don't give them wifi or a LAN port and it will eliminate a whole host of awful problems for very little hassle. I do not care if I wait a few months for new firmware if my robot can do incredible things that improve my life. Also: Give it an off switch.

    December 11, 2018
    Reply
  88. From my point of view, you are upside down. said:

    Plot twist! She is also a robot.

    December 20, 2018
    Reply
  89. The Spam said:

    Well, i think if you watch those cars burn out you would cry or those crash test

    December 20, 2018
    Reply
  90. N said:

    andthentheresthesexrobots.

    December 31, 2018
    Reply
  91. Robin Paul said:

    I have no connection with robots or Personal assistants but I do talk to my computer sometimes. Strange.

    January 5, 2019
    Reply
  92. Ian MacReth said:

    I'm reminded of an old Twilight Zone episode named "Alicia," about a prisoner on an empty planet is given a female bot as a companion.

    January 11, 2019
    Reply
  93. Cecilia Spears said:

    OMG! I LOVE this TedTalk! Thank you! (Though I admit some of my favorite shows includes Chobits and Robot Carnival.)

    February 1, 2019
    Reply
  94. Dan said:

    No one is integrating robots into their lives. A handful of idiots is not "our society"

    February 3, 2019
    Reply
  95. Chaguita said:

    if is she a robot?

    February 4, 2019
    Reply
  96. Jake Broe said:

    I am a little disappointed the comments section is flooded with terminator/skynet references… i'm old…

    March 30, 2019
    Reply
  97. prettynoose888 said:

    The saddest film I’ve ever seen is “A.I. Artificial Intelligence”, it's the only film that ever made me cry because it reminded me so much of how humans treat animals.

    May 14, 2019
    Reply
  98. Ethan Spencer said:

    Hi Aunt kate

    May 21, 2019
    Reply
  99. Tsar Alester said:

    Fear is only emotion needed for them

    June 19, 2019
    Reply
  100. WE ARE ONE BRAIN said:

    Yea, let‘s built a superhero

    July 24, 2019
    Reply
