Alone Together: Why We Expect More from Technology and Less from Each Other - Sherry Turkle (2011)

Part I. The Robotic Moment

Chapter 4. Enchantment

A little over a year after AIBO’s release, My Real Baby became available in stores. In November 2000, I attended a party at MIT to celebrate its launch. The air was festive: My Real Babies were being handed around liberally to journalists, designers, toy-industry executives, and members of the MIT faculty and their guests.

An editor from Wired magazine made a speech at the party, admiring how much advanced technology was now available off the shelf. The robot was impressive, certainly. But it was also surprisingly clunky; its motors whirred as its limited range of facial expressions changed. Engineering students around me expressed disappointment, having hoped for more. As I chatted with one of them, my eyes wandered to a smiling faculty wife who had picked up a My Real Baby and was holding it to her as she would a real child. She had the robot resting over her shoulder, and I noticed her moment of shocked pleasure when the robot burped and then settled down. The woman instinctively kissed the top of My Real Baby’s head and gently massaged its back as she talked with a friend—all of these the timeless gestures of maternal multitasking. Later, as she was leaving, I asked her about the experience. “I loved it,” she said. “I can’t wait to get one.” I asked why. “No reason. It just gives me a good feeling.”

My Real Baby tells you when it is happy and when it wants to play. But it adds a lot more to the mix: it blinks and sucks its thumb; with facial musculature under its skin, it can smile, laugh, frown, and cry. As with all sociable robots, getting along with the robot baby requires learning to read its states of mind. It gets tired and wants to sleep; it gets overexcited and wants to be left alone. It wants to be touched, fed, and have its diaper changed. Over time, My Real Baby develops from infant into two-year-old; baby cries and moans give way to coherent sentences. As it matures, the robot becomes more independent, more likely to assert its needs and preferences. In the essentials, My Real Baby follows the Tamagotchi primer: it demands care, and its personality is shaped by the care it receives.

Both AIBO and My Real Baby encourage people to imagine robots in everyday life. That is not surprising. After all, these are not extraterrestrials: one is a dog and one is a baby. What is surprising is that time spent with these robots provokes not just fantasies about mutual affection, as we’ve already seen, but the notion that robots will be there to care for us in the sense of taking care of us. To put it too simply, conversations about My Real Baby easily lead to musing about a future in which My Real Baby becomes My Real Babysitter. In this, My Real Baby and AIBO are evocative objects—they give people a way to talk about their disappointments with the people around them—parents and babysitters and nursing home attendants—and imagine being served more effectively by robots. When one fifth-grade boy objects that the AIBO before him wouldn’t be useful to an elderly person, he is corrected. His classmates make it clear that they are not talking about AIBO specifically. “AIBO is one, but there will be more.”

The first time I heard this fantasy—children suggesting that the descendants of such primitive robots might someday care for them—I was stunned. But in fact, the idea of robot caretaking is now widespread in the culture. Traditional science fiction, from Frankenstein to the Chucky movies, portrays the inanimate coming to life as terrifying. Recently, however, it has also been portrayed as gratifying, nearly redemptive. In Star Wars, R2-D2 is every child’s dream of a helpmate. In Steven Spielberg’s A.I.: Artificial Intelligence, a robot’s love brings hope to a grieving mother. In Disney’s WALL-E, a robot saves the planet, but more than this, it saves the people: it reminds them how to love. In 9, the humans are gone, but the robots that endure are committed to salvaging human values. An emerging mythology depicts benevolent robots.

I study My Real Baby among children five through fourteen. Some play with the robot in my office. Some meet it in classrooms and after-school settings. Others take it home for two or three weeks. Because this is a robot that represents a baby, it gets children talking about family things, care and attention, how much they have and how much more they want. Children talk about working mothers, absent fathers, and isolated grandparents. There is much talk of divorce. Some children wonder whether one of this robot’s future cousins might be a reasonable babysitter; something mechanical might be more reliable than the caretaking they have.1

Many of the children I study return to empty homes after school and wait for a parent or older family member to come home from work. Often their only babysitter is the television or a computer game, so in comparison a robot looks like pretty good company. Nicole is eleven. Both of her parents are nurses. Sometimes their shifts overlap, and when this happens, neither is home until late. Nicole thinks a robot might be comforting: “If you cut yourself and you want some sympathy. Or you had a bad day at school—even your best friend was mad at you. It would be better to not be alone when you came home.” Twelve-year-old Kevin is not so sure: “If robots don’t feel pain, how could they comfort you?” But the philosophical conversations of the late 1970s and 1980s are cut short: these children are trying to figure out if a robot might be good for them in the most practical terms.

The twenty children in Miss Grant’s fifth-grade class, in a public school on Boston’s North Shore, are nine and ten. They have all spent time with the AIBOs and My Real Babies that I brought to their school. Now we are about to begin a home study where one group of children after another will take a My Real Baby home for two weeks. Most take the position Wilson staked out with his Furby and Lester settled into with his AIBO. They are content to be with a machine that they treat as a living creature. Noah remarks that My Real Baby is very noisy when it changes position, but he is quick to point out that this is insignificant: “The whirring doesn’t bother me,” he says. “I forget it right away.”

In the robotic moment, what you are made of—silicon, metal, flesh—pales in comparison with how you behave. In any given circumstance, some people and some robots are competent and some not. Like people, any particular robot needs to be judged on its own merits. Tia says, “Some robots would be good companions because they are more efficient and reliable,” and then she pauses. I ask her to say more, and she tells me a story. She was at home alone with her pregnant mother, who quite suddenly went into labor. On short notice, they needed to find a babysitter for Tia. Luckily, her grandmother was close by and able to take over, but nevertheless, Tia found the incident frightening. “Having a robot babysitter would mean never having to panic about finding someone at the last minute. It is always ready to take care of you.” In only a few years, children have moved from taking care of Tamagotchis and Furbies to fantasies of being watched over by benign and competent digital proctors. The Tamagotchis and Furbies were always on. Here, a robot is thought of as “always ready.”

These fifth graders know that AIBO and My Real Baby are not up to the job of babysitter, but these robots inspire optimism that scientists are within striking distance. The fifth graders think that a robot could be a babysitter if it could manage babysitter behavior. In their comments about how a robot might pass that test, one hears about the limitations of the humans who currently have the job: “They [robots] would be more efficient than a human if they had to call for an emergency and had a phone right inside them.... They are more practical because if someone gets hurt they are not going to stress or freak out.” “They would be very good if you were sick and your mother worked.” “Robots would always be sure that you would have fun. People have their own problems.” Rather than a mere understudy, a robot could be better qualified to serve. Hesitations are equally pragmatic. One fifth grader points out how much air conditioners and garbage disposals break. “The robot might shut down” too.

In the 1980s, most children drew a line—marking a kind of sacred space—between the competencies of computers and what was special about being a person. In Miss Grant’s class, the sacred space of the romantic reaction is less important than getting the job done. Most of the children are willing to place robots and humans on an almost-level playing field and debate which can perform better in a given situation. To paraphrase, these pragmatic children say that if people are better at fun, let’s put them in charge of fun. If a robot will pay more attention to them than a distracted babysitter, let the robot babysit. If the future holds robots that behave lovingly, these children will be pleased to feel loved. And they are not dissuaded if they see significant differences between their way of thinking and how they imagine robots think. They are most likely to say that if these differences don’t interfere with how a robot performs its job, the differences are not worth dwelling on.

Children are not afraid to admit that when robots become caretakers, some things will be lost, things they will miss. But they also make it clear that when they say they will “miss” something (like having a mother at home to watch them when they are sick), it is not necessarily something they have or ever hope to. Children talk about parents who work all day and take night shifts. Conversations about families are as much about their elusiveness as about their resources.

On this almost-level playing field, attitudes about robotic companionship are something of a litmus test for how happy children are with those who care for them. So, children who have incompetent or boring babysitters are interested in robots. Those who have good babysitters would rather stick with what they have.


Jude is happy with his babysitter. “She is creative. She finds ways for us to have fun together.” He worries that a robot in her place might be too literal minded: “If parents say [to a person], ‘Take care of the kid,’ they [the person] won’t just go, ‘Okay, I’m just going to make sure you don’t get hurt.’ They’ll play with you; they’ll make sure you have fun too.” Jean-Baptiste agrees. Robot babysitters are “only in some ways alive.... It responds to you, but all it really thinks about is the job. If their job is making sure you don’t get hurt, they’re not going to be thinking about ice cream.” Or it might know that children like ice cream, but wouldn’t understand what ice cream was all about. How bad would this be? Despite his concerns, Jean-Baptiste says he “could love a robot if it was very, very nice to me.” It wouldn’t understand it was being nice, but for Jean-Baptiste, kindness is as kindness does.

Some children are open to a robot companion because people are so often disappointing. Colleen says, “I once had a babysitter just leave and go over to a friend’s house. A robot babysitter wouldn’t do that.” Even when they stayed around, her babysitters were preoccupied. “I would prefer to have a robot babysitter.... A robot would give me all its attention.” Octavio says that human babysitters are better than robots “if you are bored”—humans are able to make up better games. But they often get meals wrong: “What’s with the cereal for dinner? That’s boring. I should have pasta or chicken for dinner, not cereal.” Because of their “programming,” robots would know that cereal at night is not appropriate. Or, at least, says Octavio, robots would be programmed to take interest in his objections. In this way, the machines would know that cereal does not make a good dinner. Programming means that robots can be trusted. Octavio’s classmate Owen agrees. It is easier to trust a robot than a person: “You can only trust a person if you know who they are. You would have to know a person more [than a robot].... You wouldn’t have to know the robot, or you would get to know it much faster.”

Owen is not devaluing the “human kind” of trust, the trust built as people come through for each other. But he is saying that human trust can take a long time to develop, while robot trust is as simple as choosing and testing a program. The meaning of intelligence changed when the field of artificial intelligence declared it was something computers could have. The meaning of memory changed when it was something computers used. Here the word “trust” is under siege, now that it is something of which robots are worthy. But some of the children are concerned that a trustworthy, because consistent, robot might still fall short as babysitter for lack of heart. So Bridget says she could love a robot babysitter if it did a good job, but she is skeptical about the possibility. She describes what might occur if a robot babysitter were taking care of her and she scraped her knee: “It’s just going to be like, [in a robot voice] ‘Okay, what do I do, get a Band-Aid and put it on, that’s it. That’s my job, just get a Band-Aid and put it on.’ … [stops using robot’s voice] But to love somebody, you need a body and a heart. These computers don’t really have a heart. It’s just a brain.... A robot can get hurt, but it doesn’t really hurt. The robot just shuts down. When hurt, the robot says, ‘Right. Okay, I’m hurt, now I’ll shut down.’”

As Bridget speaks, I feel a chill. This “shutdown” is, of course, the behavior of My Real Baby, which shuts down when treated roughly. Bridget seizes upon that detail as a reason why a robot cannot have empathy. How easy it would be, how small a technical thing, to give robots “pretend empathy.” With some trepidation, I ask Bridget, “So, if the robot showed that it felt pain, would that make a difference?” Without hesitation she answers, “Oh yes, but these robots shut down if they are hurt.” From my perspective, the lack of robotic “empathy” depends on their not being part of the human life cycle, of not experiencing what humans experience. But these are not Bridget’s concerns. She imagines a robot that could be comforting if it performed pain. This is the behaviorism of the robotic moment.

There is little sentimentality in this classroom. Indeed, one of Miss Grant’s students sees people as potential obstacles to relationships with robots: “If you are already attached to your babysitter, you won’t be able to bond with a robot.” And this might be a shame. For the babysitter is not necessarily better, she just got there first. The children’s lack of sentimentality does not mean that the robots always come out ahead. After a long conversation about robot babysitters, Octavio, still dreaming of pasta instead of cereal, imagines how a robot might be programmed both to play with him and feed him “chicken and pasta because that is what you are supposed to have at night.” But Bridget dismisses Octavio’s plan as “just a waste. You could have just had a person.” Jude concurs: “What’s the point of buying a robot for thousands and thousands of dollars when you could have just kept the babysitter for twenty dollars an hour?”


Children speak fondly of their grandparents, whose care is often a source of family tension. Children feel a responsibility, and they want their parents to take responsibility. And yet, children see that their parents struggle with this. Might robots be there to fill in the gaps?

Some children are taken with the idea that machines could help with purely practical matters. They talk about a robot “getting my grandmother water in the middle of the night,” “watching over my grandmother when she sleeps,” and being outfitted with “emergency supplies.” The robots might be more reliable than people—they would not need sleep, for example—and they might make it easier for grandparents to continue living in their own homes.

But other children’s thinking goes beyond emergencies to offering grandparents the pleasures of robotic companionship. Oliver, the nine-year-old owner of Peanut the hamster, says that his grandparents are frail and don’t get out much. He considers in detail how their days might be made more interesting by an AIBO. But the robots might come with their own problems. Oliver points out that his grandparents are often confused, and it would be easy for them to confuse the robots. “Like, the old people might tell them [the AIBOs] the wrong people to obey or to do the opposite or not listen to the right person.” His sister Emma, eleven, sees only the bright side of a robotic companion. “My grandmother had a dog and the dog died before she did. My grandmother said she would die when her dog died.... I’m not sure that it is good for old people to have dogs. I think the AIBO would have been better for her.” Back in Miss Grant’s class, Bonnie thinks a robot might be the ultimate consolation. “If you had two grandparents and one died,” she says, “a robot would help the one that was alone.”

Jude, also in Miss Grant’s class, knows that his grandmother enjoys talking about the past, when she was a young mother, during what she calls “her happiest time.” He thinks that My Real Baby can bring her back to that experience. “She can play at that.” But it is Jude who first raises a question that will come to preoccupy these children. He thinks that his grandparents might prefer a robot to visits from a real baby.

Jude thinks aloud: “Real babies require work and then, well, they stop being babies and are harder for an older person to care for.” Jude says that while he and other kids can easily tell the difference between robots and a real baby, his grandparents might be fooled. “It will cry if it’s bored; when it gets its bottle, it will be happy.”

This association to the idea that robots might “double” for family members brings to mind a story I heard when I first visited Japan in the early 1990s. The problems of the elderly loomed large. Unlike in previous generations, children were mobile, and women were in the workforce. Aging and infirm parents were unlikely to live at home. Visiting them was harder; they were often in different cities from their children. In response, some Japanese children were hiring actors to substitute for them and visit aging parents.2 The actors would visit and play their parts. Some of the elderly parents had dementia and might not have known the difference. Most fascinating were reports about the parents who knew that they were being visited by actors. They took the actors’ visits as a sign of respect, enjoyed the company, and played the game. When I expressed surprise at how satisfying this seemed for all concerned, I was told that in Japan being elderly is a role, just as being a child is a role. Parental visits are, in large part, the acting out of scripts. The Japanese valued the predictable visits and the well-trained and courteous actors. But when I heard of it, I thought, “If you are willing to send in an actor, why not send in a robot?”

Eighteen years later, a room of American fifth graders is actively considering that proposition. The children know that their grandparents value predictability. When the children visit, they try their best to accommodate their elders’ desire for order. This is not always easy: “My grandmother,” says Dennis, “she really likes it if my glass, like with water, is only placed in a certain place. She doesn’t like it if I don’t wheel her only in a certain way through the hospital. It’s hard.” In this arena, children think that robots might have an edge over them. They begin to envision robots as so much a part of the family circle that they provoke a new kind of sibling rivalry.

One girl describes a feeling close to dread: “If my grandmother started loving the robot, she might start thinking it is her family and that her real family might not be as important to her anymore.” Children worry that the robots could spark warm—too warm—feelings. They imagine their grandparents as grateful to, dependent on, and fond of their new caretakers. The robot that begins as a “solution” ends up a usurper. Owen worries that “grandparents might love the robot more than you.... They would be around the robot so much more.” I ask if the robot would love the grandparents back. “Yes,” says Owen, “a little bit. I might feel a little jealous at the robot.”

Hunter’s grandmother lives alone. She has a button to press if she needs help—for example, if she falls or feels ill. Although Hunter knows that My Real Baby and AIBO couldn’t help his grandmother, he thinks future robots might. Hunter has mixed feelings: “I worry that if a robot came in that could help her with falls, then she might really want it.... She might like it more than me. It would be more helpful than I am.” Hunter wants to be the one to help his grandmother, but he doesn’t live with her. He realizes the practicality of the robot but is “really upset that the robot might be the hero for her.”

This is the sentiment of fourteen-year-old Chelsea, an eighth grader in Hartford. Her grandmother, eighty-four, lives in a nursing home. Chelsea and her mother visit once a week. Her grandmother’s forgetfulness frightens her. “I don’t want her forgetting about me.” When I introduce her to My Real Baby, Chelsea talks about her grandmother: “She would like this. She really would. I kind of hate that. But this does a lot of what she wants.... Actually, I think she would like that it would remember her and it wouldn’t ask her too many questions. I worry that when I go with my mom, we ask her so many questions. I wonder if she is relieved when we leave sometimes. My Real Baby would just love her, and there wouldn’t be any stress.”

I ask Chelsea if she would like to bring a My Real Baby to her grandmother. Her response is emphatic: “No! I know this sounds freaky, but I’m a little jealous. I don’t like it that I could be replaced by a robot, but I see how I could be.” I ask Chelsea about the things that only she can offer her grandmother, such as memories of their time together. Chelsea nods but says little. For the time being she can only think of the calm presence of the robot stand-in. The next time I see Chelsea, she is with her mother. They have discussed the idea of the robot companion. From Chelsea’s point of view, the conversation did not go well; she is upset that her mother seems taken by the idea.3 Chelsea is sharp with her mother: “It is better that grandma be lonely than forget us because she is playing with her robot. This whole thing makes me jealous of a robot.”

In Miss Grant’s class, the conversation about robots and grandparents ends up on a skeptical note. Some children become jealous, while others come to see the substitution as wrong. One says, “I wouldn’t let that thing [a robot] touch my grandmother.” For another, “That would be too weird.” A third worries that a robot might “blow up … stop working... put the house on fire.” A conversation that began as matter-of-fact becomes more animated. An anxious consensus emerges: “Don’t we have people for these jobs?”


My Real Baby was primitive, the first of its kind, and not a commercial success. Nevertheless, it was able to reach the “real baby” in us, the part that needs care and worries it will not come. It made it possible for children to project their hopes of getting what they are missing onto the idea of a robot.

Callie, ten, is serious and soft-spoken. When I first bring My Real Baby to her school, she says that “they were probably confused about who their mommies and daddies were because they were being handled by so many different people.” She thinks this must have been stressful and is convinced that things will be easier on the robots when they are placed in homes. Like any adoptive mother, she is concerned about bonding with her baby and wants to be the first in her class to take My Real Baby home. She imagines that future study participants will have a harder time with the robot, which is sure to “cry a lot” because “she doesn’t know, doesn’t think that this person is its mama.” As soon as Callie brought My Real Baby home, she stepped into the role of its mother. Now, after three weeks of the home study, our conversation takes place in her suburban home outside of Providence, Rhode Island.

Callie begins with a diversionary tactic: she notes small differences between My Real Baby and a biological child (the size of their pupils, for example) in a seeming effort to minimize the much larger differences between them. She works hard to sustain her feeling that My Real Baby is alive and has emotions. She wants this to be the case. Taking care of My Real Baby makes her feel more cared for. She explains that her parents are very busy and don’t have a lot of time to spend with her. She and her four-year-old brother compete for their attention.

For the most part, Callie is taken care of by nannies and babysitters. She sees her mother only “if she [is] not going out.” Callie describes her as “very busy … with very important work.” But what Callie says she misses most is spending time with her father, of whom she speaks throughout her interviews and play sessions. Sometimes he comes to our sessions, but he is visibly distracted. He usually has his BlackBerry with him and checks his e-mail every few minutes. He seems to have little time to concentrate exclusively on his daughter. Nevertheless Callie is intensely loyal to him. She explains that he works all day and often has to go out to important meetings at night. He needs time to travel. Tellingly, Callie thinks that grown-ups would like My Real Baby as much as children do because, in its presence, adults would be “reminded of being parents.”

Callie loves to babysit. Caring for others makes her feel wanted in a way that life at home sometimes does not. Her relationship with My Real Baby during the three-week home study comes to play something of the same role: loving the robot makes her feel more loved. She knows the robot is mechanical but has little concern for its (lack of) biology. It is alive enough to be loved because it has feelings, among them an appreciation of her motherly love. She sees the robot as capable of complex and mixed emotions. “It’s got similar-to-human feelings, because she can really tell the differences between things, and she’s happy a lot. She gets happy, and she gets sad, and mad, and excited. I think right now she’s excited and happy at the same time.” When My Real Baby says, “I love you,” Callie sees the robot’s expressed feelings as genuine. “I think she really does,” says Callie, almost tearfully. “I feel really good when it says that. Her expressions change. Sort of like Robbie [her four-year-old brother].” Playing with My Real Baby, she says, “makes me incredibly happy.” She worries about leaving the robot at home when she goes to school. She knows what it’s like to feel abandoned and worries that My Real Baby is sad during the day because no one is paying attention to it. Callie hopes that during these times, My Real Baby will play with one of Callie’s pets, a strategy that Callie uses when she feels lonely.

My Real Baby sleeps near Callie’s bed on a silk pillow. She names the robot after her three-year-old cousin Bella. “I named her like my cousin … because she [My Real Baby] was sort of demanding and said most of the things that Bella does.” But Callie often compares My Real Baby to her brother Robbie. Robbie is four, and Callie thinks My Real Baby is “growing up” to be his age. After feeding the robot, Callie tries several times to burp it, saying, “This is what babies need to do.” She holds the robot closer with increasing tenderness. She believes that it is getting to know her better as they spend more time together. With time, she says, “Our relationship, it grows bigger.... Maybe when I first started playing with her, she didn’t really know me … but now that she’s … played with me a lot more she really knows me and is a lot more outgoing.”

When Callie plays with other dolls, she says she is “pretending.” Time with My Real Baby is different: “I feel like I’m her real mom. I bet if I really tried, she could learn another word. Maybe ‘Da-da.’ Hopefully if I said it a lot, she would pick it up. It’s sort of like a real baby, where you wouldn’t want to set a bad example.” In Callie’s favorite game with My Real Baby, she imagines that she and the robot live in their own condo. She takes herself out of her own family and creates a new one in which she takes care of the robot and the robot is her constant companion. It is a fantasy in which this child, hungry for attention, finally gets as much attention as she wants.

In my study, Callie takes home both an AIBO and a My Real Baby. But very soon, the AIBO begins to malfunction: it develops a loud mechanical wheeze and its walking becomes wobbly. When this happens, Callie treats the AIBO as ill rather than broken—as a sick animal in need of “veterinary care.” Callie thinks it has “a virus, maybe the flu. Poor AIBO. I felt sad for it. It was a good AIBO.” Most important to Callie is maintaining her sense of herself as a successful mother. Once AIBO is her baby, she cannot fail “him.” She ministers to AIBO—keeps it warm, shows it love—but when it does not recover, her attitude changes. She cannot tolerate that the AIBO is sick and she cannot help. So she reinterprets AIBO’s problem. It is not ill; it is playing. When AIBO can walk no more, Callie says, “Oh, that’s what my dog does when he wants attention. I think it might be sleeping. Or just stretching in a different way than a normal dog would.” When she hears the troubling mechanical sounds, Callie considers that AIBO might be “just going to sleep.” Once she interprets the inert AIBO as sleeping, she is able to relax. She takes AIBO in her arms, holds it close, and pets it gently. She says, “Aww, man! How playful. AIBO! … He is sort of tired and wants to rest.” Callie focuses on what is most important to her: that AIBO should feel loved. She says, “He knows that I’m holding him.”

As Callie plays out scenarios in the imaginary condo, her parents and some of the researchers are charmed by the ease of her relationship with the robots, the way she accepts them as good company. But Callie’s earnestness of connection is compelled; she needs to connect with these robots.

Callie is very sad when her three weeks with My Real Baby and AIBO come to an end. She has used the time to demonstrate her ability to be a loving mother, a good caretaker to her pets, her brother, and her robots. Before leaving My Real Baby, Callie opens its box and gives the robot a final, emotional good-bye. She reassures My Real Baby that it will be missed and that “the researchers will take good care of you.” Callie has tried to work through a desire to feel loved by becoming indispensable to her robots. She fears that her parents forget her during their time away; now, Callie’s concern is that My Real Baby and AIBO will forget her.

With the best of intentions, roboticists hope we can use their inventions to practice our relationship skills. But for someone like Callie, practice may be too perfect. Disappointed by people, she feels safest in the sanctuary of an as-if world. Of course, Callie’s story is not over. Her parents love her and may become more present. She may find a caring teacher. But at ten, ministering to her robots, Callie reminds us of our vulnerability to them. More than harmless amusements, they are powerful because they invite our attachment. And such attachments change our way of being in the world.

Seven-year-old Tucker, severely ill, is afraid of his body, afraid of dying, and afraid to talk about it. A relationship with AIBO gives voice to these feelings. Home-administered treatments help Tucker to breathe, but even so, he spends several months a year in hospitals. Enthusiastic play with AIBO sometimes leaves him too tired to speak. His parents are reassuring: when this happens, he just needs to rest. And, indeed, after some time sitting quietly, Tucker is always able to continue.

Tucker’s mother explains that safety is always his first concern, something that, she admits, can become trying when he second-guesses her driving. When Tucker plays his favorite computer game, Roller Coaster Tycoon, rather than build the wildest roller coaster possible, he builds the safest one. The game gives you choices about how to spend your money in developing your amusement park. Tucker likes to put his cash into maintenance and staffing. He says that very often the game declares him the winner of the award for the “safest park.” So, when he first meets AIBO in my office, Tucker’s priority is that it be kept safe. His anxiety about this is so great that he denies any reality in which it is, in fact, endangered. So, when AIBO smashes into a fence of red siding that defines its space, Tucker interprets this as AIBO “scratching a door, wanting to go in … because it hasn’t been there yet.” Defense mechanisms are the responses we use to deal with realities too threatening to face. Like Callie ignoring the reality of her broken AIBO, Tucker sees only what he can handle.

Like Callie, Tucker sees AIBO’s feelings as real; he says that the robot recognizes and loves him. Tucker explains that when he goes to school, his dog Reb misses him and sometimes wants to jump into the car with him. He thinks that when he takes AIBO home, it will have the same loving desires. Indeed, Tucker finds few differences between AIBO and Reb, most of them unflattering to the biological pet. When Tucker learns to interpret AIBO’s blinking lights, he concludes that the robot and Reb have “the same feelings,” although he decides that AIBO seems the angrier of the two.

Tucker wishes he himself were stronger and projects this wish onto AIBO: he likes to talk about the robot as a superhero dog that shows up the limitations of his biological dog. Tucker says, “AIBO is probably as smart as Reb and at least he isn’t as scared as my dog.” While freely celebrating AIBO’s virtues, Tucker avoids answering any questions about what Reb can do that AIBO cannot. I am reminded of Chelsea, who, once having decided that a calm robot might be more comforting to her grandmother than her own anxious and talkative self, could not be engaged on what only she had to offer.

So, it is not uncommon for AIBO to do a trick and for Tucker to comment, “My dog couldn’t do that.” AIBO is the better dog, and we hear why. AIBO is alive even if his heart is made of batteries and wires. AIBO will never get sick or die. In fact, AIBO is everything that Tucker wishes to be. Tucker identifies with AIBO as a being that can resist death through technology. AIBO gives Tucker the idea that people, like this robot, may someday be recharged and rewired. Just as no blood is needed for AIBO’s heart to feel emotion, batteries and wires might someday keep a person alive. Tucker uses care for AIBO to dream himself into a cyborg future.

At one point Tucker says that he “would miss AIBO as much as Reb if either of them died.” Tucker seems startled when he realizes that in fantasy he has allowed that AIBO could die. He immediately explains that AIBO could die but does not have to die. And AIBO will not die if Tucker protects him. In this moment of poignant identification, Tucker sees AIBO as both potentially immortal and a creature like him, someone who needs to be kept out of harm’s way. In Tucker’s case, precautions have often been futile. Despite the best of care, he has often landed in the hospital. In AIBO’s case, Tucker believes that precautions will work. They will require vigilance. Tucker tells us his elaborate plans to care for the robot when he takes it home. As he speaks, Tucker’s anxiety about AIBO’s possible death comes through: “He’ll probably be in my room most of the time. And I’m probably going to keep him downstairs so he doesn’t fall down the stairs. Because he probably, in a sense he would die if he fell down the stairs. Because he could break.”

After the robot goes home with him, Tucker reports on their progress. On AIBO’s first day, Tucker says, “AIBO was charging and probably didn’t miss me.” By the second day, Tucker is sure that AIBO cares. But of course, AIBO is not always at his best, something that helps Tucker identify with the robot, for Tucker, too, has good and bad days. Tucker says that after he returns his AIBO, he will miss the robot and that the robot “will probably miss me.”

With AIBO at home, Tucker dreams up duels between the robot and his Bio Bugs. Bio Bugs are robot creatures that can walk and engage in combat with each other, gaining “survival skills” along the way. They can end up very aggressive. With great excitement, Tucker describes their confrontations with AIBO. The battles between AIBO and the Bio Bugs seem to reassure him that, no matter what, AIBO will survive. They reinforce the image of the robot as a life form able to defy death, something Tucker would like to become. The “bugs” are the perfect representation of a bacterium or virus, such as those that Tucker continually fights off. AIBO easily defeats them.

When it is time to return the robot, Tucker seems concerned that his healthy older brother, Connor, twelve, barely played with AIBO during the weeks they had the robot at home. Tucker brings this up with a shaky voice. He explains that his brother didn’t play with the robot because “he didn’t want to get addicted to him so he would be sad when we had to give him back.” Tucker wishes that he had more of his brother’s attention; the two are not close. Tucker fears that his brother does not spend time with him because he is so frail. In general, he worries that his illness keeps people away because they don’t want to invest in him. AIBO, too, is only passing through their home. Tucker is upset by Connor’s hesitancy to bond with something “only passing in his life.” Tucker tells us that he is making the most of his time with AIBO.

Callie and Tucker nurture robots that offer a lot more room for relationship than Furbies and Tamagotchis. Yet, both My Real Baby and AIBO are commercially available pastimes. I’ve studied other children who come to MIT laboratories to visit more advanced robots. These robots are not toys; they have their own toys. Grown-ups don’t just play with them; these robots have their own grown-up attendants. Is this a game for grown-ups or a more grown-up game? Is it a game at all? To treat these robots as toys is to miss the point—and even the children know it.