NOTES - Alone Together - Sherry Turkle

Alone Together: Why We Expect More from Technology and Less from Each Other - Sherry Turkle (2011)



1 Sherry Turkle, “Inner History,” in Sherry Turkle, ed., The Inner History of Devices (Cambridge, MA: MIT Press, 2008), 2-29.

2 Sherry Turkle, The Second Self: Computers and the Human Spirit (1984; Cambridge, MA: MIT Press, 2005), 2.

3 Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon and Schuster, 1995), 13.

4 Roger Entner, “Under-aged Texting: Usage and Actual Cost,” January 27, 2010, (accessed May 30, 2010).


1 See “What Is Second Life,” Second Life, (accessed June 13, 2010).

2 Benedict Carey and John Markoff, “Students, Meet Your New Teacher, Mr. Robot,” New York Times, July 10, 2010, (accessed July 10, 2010); Anne Tergeson and Miho Inada, “It’s Not a Stuffed Animal, It’s a $6,000 Medical Device,” Wall Street Journal, June 21, 2010, (accessed August 10, 2010); Jonathan Fildes, “‘Virtual Human’ Milo Comes Out to Play at TED in Oxford,” BBC News, July 13, 2010, (accessed July 13, 2010); Amy Harmon, “A Soft Spot for Circuitry: Robot Machines as Companions,” New York Times, July 4, 2010, (accessed July 4, 2010); Emily Veach, “A Robot That Helps You Diet,” Wall Street Journal, July 20, 2010, (accessed July 20, 2010).

3 On this, see “The Making of Deep Blue,” IBM Research, (accessed June 10, 2010).

4 David L. Levy, Love and Sex with Robots: The Evolution of Human-Robot Relationships (New York: Harper Collins, 2007).

5 The book review is Robin Marantz Henig, “Robo Love,” New York Times, December 2, 2007, (accessed July 21, 2009). The original article about the MIT robot scene is Robin Marantz Henig, “The Real Transformers,” New York Times, July 29, 2007, (accessed July 21, 2009).

6 Levy, Love and Sex, 22.

7 On “alterity,” the ability to put oneself in the place of another, see Emmanuel Lévinas, Alterity and Transcendence, trans. Michael B. Smith (London: Athlone Press, 1999).

8 Sherry Turkle, The Second Self: Computers and the Human Spirit (1984; Cambridge, MA: MIT Press, 2005), 183-218.

9 The way here is paved by erotic images of female robots used to sell refrigerators, washing machines, shaving cream, and vodka. See, for example, the campaign for Svedka Vodka (Steve Hall, “Svedka Launches Futuristic, Un-PC Campaign,” September 20, 2005, [accessed September 1, 2009]) and Philips’s shaving system (“Feel the Erotic Union of Man and Shavebot,” August 21, 2007, [accessed September 1, 2009]).

10 Sharon Moshavi, “Putting on the Dog in Japan,” Boston Globe, June 17, 1999, A1.

11 As preteens, the young women of the first Google generation (born roughly from 1987 to 1993) wore clothing widely referred to as “baby harlot”; they listened to songs about explicit sex well before puberty. Their boomer parents had few ideas about where to draw lines, having spent their own adolescences declaring the lines irrelevant. Boomer parents grew up rejecting parental rules, but knowing that there were rules. One might say it is the job of teenagers to complain about constraints and the job of parents to insist on them, even if the rules are not obeyed. Rules, even unheeded, suggest that twelve to fifteen are not good ages to be emotionally and sexually enmeshed.

Today’s teenagers cannot easily articulate any rules about sexual conduct except for those that will keep them “safe.” Safety refers to not getting venereal diseases or AIDS. Safety refers to not getting pregnant. And on these matters teenagers are eloquent, unembarrassed, and startlingly well informed. But teenagers are overwhelmed with how unsafe they feel in relationships. A robot to talk to is appealing—even if currently unavailable—as are situations that provide feelings of closeness without emotional demands. I have said that rampant fantasies of vampire lovers (closeness with constraints on sexuality) bear a family resemblance to ideas about robot lovers (sex without intimacy, perfect). And closeness without the possibility of physical intimacy and eroticized encounters that can be switched off in an instant—these are the affordances of online encounters. Online romance expresses the aesthetic of the robotic moment. From a certain perspective, they are a way of preparing for it. On the psychology of adolescents’ desire for relationships with constraint, I am indebted to conversations with child and adolescent psychoanalyst Monica Horovitz in August 2009.

12 Commenting on the insatiable desire for robot pets during the 2009 holiday season, a researcher on social trends comments, “A toy trend would be something that reflects the broader society, that tells you where society is going, something society needs.” Gerald Celente, founder of the Trends Research Institute, cited in Brad Tuttle, “Toy Craze Explained: A Zhu Zhu Pet Hamster Is Like a ‘Viral Infection,’” Time, December 9, 2009, (accessed December 9, 2009).

13 For classic psychodynamic formulations of the meaning of symptoms, see Sigmund Freud, “The Unconscious,” in The Standard Edition of the Complete Psychological Works of Sigmund Freud, ed. and trans. James Strachey et al. (London: Hogarth Press, 1953-1974), 14:159-204; “Introductory Lectures on Psychoanalysis,” in The Standard Edition, vols. 15 and 16; “From the History of an Infantile Neurosis,” in The Standard Edition, 17:1-122; “Inhibitions, Symptoms, and Anxiety,” in The Standard Edition, 20:75-172; and Sigmund Freud and Josef Breuer, “Studies on Hysteria,” in The Standard Edition, 2:48-106. For Freud on dreams as wishes, see “The Interpretation of Dreams,” in The Standard Edition, vols. 4 and 5.

14 For an argument about the pleasures of limited worlds in another technological realm, see Natasha Schüll’s work on gambling, Addiction by Design: Machine Gambling in Las Vegas (Princeton, NJ: Princeton University Press, forthcoming).

15 See, for example, Bill Gates, “A Robot in Every Home,” Scientific American, January 2007, (accessed September 2, 2009).

16 See Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon and Schuster, 1995). On life as performance, the classic work is Erving Goffman, The Presentation of Self in Everyday Life (Garden City, NY: Doubleday Anchor, 1959).

17 The apt phrase “identity workshop” was coined by my then student Amy Bruckman. See “Identity Workshop: Emergent Social and Psychological Phenomena in Text-Based Virtual Reality” (unpublished essay, Media Lab, Massachusetts Institute of Technology, 1992), (accessed September 2, 2009).

18 Sociologists distinguish between strong ties, those of family and close friendship, and weak ties, the bonds of acquaintanceship that make us comfortable at work and in our communities. Facebook and Twitter, friending rather than friendship—these are worlds of weak ties. Today’s technology encourages a celebration of these weak ties as the kind we need in the networked life. The classic work on weak ties is Mark S. Granovetter, “The Strength of Weak Ties,” American Journal of Sociology 78, no. 6 (May 1973): 1360-1380.

19 See, for example, Matt Richtel, “In Study, Texting Lifts Crash Risk by Large Margin,” New York Times, July 27, 2009, (accessed September 1, 2009). On the pressure that friends and family members put on drivers who text, see “Driver Texting Now an Issue in Back Seat,” New York Times, September 9, 2009, (accessed September 9, 2009). As I complete this book, Oprah Winfrey has made texting while driving a personal crusade, encouraging people across America to sign an online pledge to not text and drive. See “Oprah’s No Phone Zone,” (accessed May 30, 2010).

20 The teenage national average as of January 2010 is closer to thirty-five hundred; my affluent, urban neighborhood has a far higher number. Roger Entner, “Under-aged Texting: Usage and Actual Cost,” January 27, 2010, (accessed May 30, 2010). On texting’s impact on teenage life, see Katie Hafner, “Texting May Be Taking Its Toll,” New York Times, May 25, 2009, (accessed July 21, 2009).

21 To find friends in the neighborhood, Loopt for the iPhone is a popular “app.”

22 A witty experiment suggests that Facebook “friends” won’t even show up when you invite them to a party. Hal Niedzviecki, “Facebook in a Crowd,” New York Times, October 24, 2008, (accessed July 27, 2010).

23 From Winston Churchill’s remarks to the English Architectural Association in 1924, available at the International Centre for Facilities website at (accessed August 10, 2010). Churchill’s comment is, of course, very similar in spirit to Marshall McLuhan’s thinking. See, for example, Understanding Media: The Extensions of Man (1964; Cambridge, MA: MIT Press, 1994).


1 Weizenbaum had written the program a decade earlier. See Joseph Weizenbaum, “ELIZA—a Computer Program for the Study of Natural Language Communication Between Man and Machine,” Communications of the ACM, vol. 9, no. 1 (January 1966): 36-45.

2 See Joseph Weizenbaum, Computer Power and Human Reason: From Judgment to Calculation (San Francisco: W. H. Freeman, 1976).

3 Whatever the kind of companionship sought, a classic first step is to make robots that are physically identical to people. In America, David Hanson has an Albert Einstein robot that chats about relativity. At the TED conference in February 2009, Hanson discussed his project to create robots with empathy as the “seeds of hope for our future.” See (accessed August 11, 2010). On Hanson, also see Jerome Groopman, “Robots That Care: Advances in Technological Therapy,” The New Yorker, November 2, 2009, (accessed November 11, 2009).

These days, you can order a robot clone in your own image (or that of anyone else) from a Japanese department store. The robot clone costs $225,000 and became available in January 2010. See “Dear Santa: I Want a Robot That Looks Just Like Me,” Ethics Soup, December 17, 2009, (accessed January 12, 2010).

4 Bryan Griggs, “Inventor Unveils $7,000 Talking Sex Robot,” CNN, February 1, 2010, (accessed June 9, 2010).

5 Raymond Kurzweil, The Singularity Is Near: When Humans Transcend Biology (New York: Viking, 2005). On radical images of our future, see Joel Garreau, Radical Evolution: The Promise and Peril of Enhancing Our Minds, Our Bodies—and What It Means to Be Human (New York: Doubleday, 2005).

6 For my further reflections on computer psychotherapy, see “Taking Things at Interface Value,” in Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon and Schuster, 1995), 102-124.

7 There is, too, a greater willingness to enter into a relationship with a machine if people think it will make them feel better. On how easy it is to anthropomorphize a computer, see Byron Reeves and Clifford Nass, The Media Equation: How People Treat Computers, Television and New Media Like Real People and Places (New York: Cambridge University Press, 1996). See also, on computer psychotherapy, Harold P. Erdman, Marjorie H. Klein, and John H. Greist, “Direct Patient Computer Interviewing,” Journal of Consulting and Clinical Psychology 53 (1985): 760-773; Kenneth Mark Colby, James B. Watt, and John P. Gilbert, “A Computer Method for Psychotherapy: Preliminary Communication,” Journal of Nervous and Mental Diseases 142, no. 2 (1966): 148-152; Moshe H. Spero, “Thoughts on Computerized Psychotherapy,” Psychiatry 41 (1978): 281-282.

8 For my work on early computational objects and the question of aliveness, see Sherry Turkle, The Second Self: Computers and the Human Spirit (1984; Cambridge, MA: MIT Press, 2005). That work on aliveness continued with a second generation of computational objects in Turkle, Life on the Screen. My inquiry, with an emphasis on children’s reasoning rather than their answers, is inspired by Jean Piaget, The Child’s Conception of the World, trans. Joan Tomlinson and Andrew Tomlinson (Totowa, NJ: Littlefield, Adams, 1960).

9 On the power of the liminal, see, for example, Victor Turner, The Ritual Process: Structure and Anti-Structure (Chicago: Aldine, 1969), and The Forest of Symbols: Aspects of Ndembu Ritual (1967; Ithaca, NY: Cornell University Press, 1970). See also Mary Douglas, Purity and Danger: An Analysis of Concepts of Pollution and Taboo (London: Routledge and Kegan Paul, 1966).

10 Piaget, The Child’s Conception.

11 Turkle, The Second Self, 33-64.

12 Children, in fact, settled on three new formulations. First, when it came to thinking through the aliveness of computational objects, autonomous motion was no longer at the heart of the matter. The question was whether computers had autonomous cognition. Second, they acknowledged that computer toys might have some kind of awareness (particularly of them) without being alive. Consciousness and life were split. Third, computers seemed alive because they could think on their own, but were only “sort of alive” because even though they could think on their own, their histories undermined their autonomy. So an eight-year-old said that Speak & Spell was “sort of alive” but not “really alive” because it had a programmer. “The programmer,” he said, “gives it its ideas. So the ideas don’t come from the game.” These days, sociable robots, with their autonomous behavior, moods, and faces, seem to take the programmer increasingly out of the picture. And with the formulation “alive enough,” children put the robots on a new terrain. As for cognition, it has given way in children’s minds to the capacity to show attention, to be part of a relationship of mutual affection.

13 Turkle, Life on the Screen, 169.

14 Turkle, Life on the Screen, 173-174.

15 The quotation is from a journal entry by Emerson in January 1832. The passage reads in full, “Dreams and beasts are two keys by which we are to find out the secrets of our nature. All mystics use them. They are like comparative anatomy. They are our test objects.” See Joel Porte, ed., Emerson in His Journals (Cambridge, MA: Belknap Press, 1982), 81.

16 According to psychoanalyst D. W. Winnicott, objects such as teddy bears, baby blankets, or a bit of silk from a first pillow mediate between the infant’s earliest bonds with the mother, who is experienced as inseparable from the self, and other people, who will come to be experienced as separate beings. These objects are known as “transitional,” and the infant comes to know them both as almost inseparable parts of the self and as the first “not me” possessions. As the child grows, these transitional objects are left behind, but the effects of early encounters with them remain. We see them in the highly charged relationships that people have with later objects and experiences that call forth the feeling of being “at one” with things outside the self. The power of the transitional object is associated with religion, spirituality, the perception of beauty, sexual intimacy, and the sense of connection with nature. And now, the power of the transitional object is associated with computers and, even more dramatically, with sociable robots. On transitional objects, see D. W. Winnicott, Playing and Reality (New York: Basic Books, 1971).

17 In the early 1980s, children’s notion of people as “emotional machines” seemed to me an unstable category. I anticipated that later generations of children would find other formulations as they learned more about computers. They might, for example, see through the apparent “intelligence” of the machines by developing a greater understanding of how they were created and operated. As a result, children might be less inclined to see computers as kin. However, in only a few years, things moved in a very different direction. Children did not endeavor to make computation more transparent. Like the rest of the culture, they accepted it as opaque, a behaving system. Children taking sociable robots “at interface value” are part of a larger trend. The 1984 introduction of the Macintosh encouraged its users to stay on the surface of things. The Macintosh version of “transparency” stood the traditional meaning of that word on its head. Transparency used to refer to the ability to “open up the hood” and look inside. On a Macintosh it meant double-clicking an icon. In other words, transparency had come to mean being able to make a simulation work without knowing how it worked. The new transparency is what used to be called opacity. For more on this question, see Turkle, Life on the Screen, especially 29-43, and Sherry Turkle, Simulation and Its Discontents (Cambridge, MA: MIT Press, 2009).

18 Our connections with the virtual intensify when avatars look, gesture, and move like us; these connections become stronger when we move from the virtual to the embodied robotic. Computer scientist Cory Kidd studied attachment to a computer program. In one condition, the program issued written commands that told study participants what to do. In a second condition, an on-screen avatar issued the same instructions. In a third condition, an on-screen robot issued the same instructions. The robot engendered the greatest attachment. Cory Kidd, “Human-Robot Interaction” (master’s thesis, Massachusetts Institute of Technology, 2003).

19 The Tamagotchi website cautions about unfavorable outcomes: “If you neglect your little cyber creature, your Tamagotchi may grow up to be mean or ugly. How old will your Tamagotchi be when it returns to its home planet? What kind of virtual caretaker will you be?” The packaging on a Tamagotchi makes the agenda clear: “There are a total of 4 hearts on the ‘Happy’ and ‘Hunger’ screens and they start out empty. The more hearts that are filled, the better satisfied Tamagotchi is. You must feed or play with Tamagotchi in order to fill the empty hearts. If you keep Tamagotchi full and happy, it will grow into a cute, happy cyberpet. If you neglect Tamagotchi, it will grow into an unattractive alien.” The manufacturer of the first Tamagotchi is Bandai. Its website provides clear moral instruction that links nurturance and responsibility. See the Bandai website at (accessed October 5, 2009).

20 See “Tamagotchi Graveyard,” Tamagotchi Dreamworld, (accessed June 15, 2009).

21 In Japan, a neglected Tamagotchi dies but can be uploaded to a virtual graveyard. In the United States, manufacturers propose gentler resolutions. Some neglected Tamagotchis might become “angels” and return to their home planet. On the Tamagotchis I played with, it was possible to hit a reset button and be presented with another creature.

22 Sigmund Freud, “The Uncanny,” in The Standard Edition of the Complete Psychological Works of Sigmund Freud, ed. and trans. James Strachey et al. (London: Hogarth Press, 1953-1974), 17:219-256.

23 Sigmund Freud, “Mourning and Melancholia,” in The Standard Edition, 14: 237-258.

24 See “Tamagotchi Graveyard.”

25 Other writings on the Tamagotchi gravesite include the epitaph for a Tamagotchi named Lacey who lived for ninety-nine years. We know how hard it was for her owner to achieve this result, but he is modest about his efforts: “She wasn’t much trouble at all.” But even with his considerable accomplishment, he feels her death was due to his neglect: “I slept late on a Sunday and she died.” But in the simple expressions of guilt (or perhaps a playing at guilt) are frank admissions of how hard it is to lose someone you love. Mourners say, “I was his mama and he will always love me as I loved him”; “He went everywhere with me. He was a loving and faithful pet”; “I’m sorry and I real[l]y miss you!”; and “God gave him life. I gave him death.” Some mourners express their belief in redemption through the generativity of generations. Thus is “Little Guy” memorialized, dead at forty-eight: “I hope you are very happy, Little Guy. I’m currently taking care of your son. I know he’s yours because he looks and acts just like you. I’m really sorry I couldn’t save you and had you on pause a lot when you were older.” See “Tamagotchi Graveyard.”


1 The fact that the Furby was so hard to quiet down was evidence of its aliveness. Even adults who knew it was not alive saw it as playing on the boundaries of life. The response of many was to see the Furby as out of control, intolerable, or, as one put it, insane. An online video of an “insane Furby” shows the Furby chatting away, to the increasing consternation of its adult owner. To stop it, he slaps its face, sticks his fingers in its mouth, holds down its ears and eyes, smashes it against a wall, and throws it down a flight of stairs. None of these shuts it down. If anything, its language becomes more manic, more “desperate.” Finally comes the solution of taking out the Furby’s batteries with a Phillips screwdriver. Now, the quiet Furby is petted. Its owner comments, “That’s better.” See “Insane Furby,” YouTube, March 15, 2007, (accessed November 11, 2009).

2 These enactments bring theory to ground level. See Donna Haraway, “A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century,” in Simians, Cyborgs and Women: The Reinvention of Nature (New York: Routledge, 1991), 149-181, and N. Katherine Hayles, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics (Chicago: University of Chicago Press, 1999).

3 Michael Chorost, Rebuilt: How Becoming Part Computer Made Me More Human (Boston: Houghton Mifflin, 2005).

4 Here, the Furby acts as what psychoanalyst D. W. Winnicott termed a “transitional object,” one where the boundaries between self and object are not clear. See D. W. Winnicott, Playing and Reality (New York: Basic Books, 1971).

5 The idea that the Furby had the capacity to learn new words by “listening” to the language around it was persistent. The belief most likely stemmed from the fact that it was possible to have the Furby say certain preprogrammed words or phrases more often by petting it whenever it said them. As a result of this myth, several intelligence agencies banned Furbies from their offices, believing that they were recording devices camouflaged as toys.

6 Children move back and forth between he, she, and it in talking about relational artifacts. Once they make a choice, they do not always stick with it. I report on what children say and, thus, their sentences are sometimes inconsistent.

7 Peter H. Kahn and his colleagues studied online discussion groups that centered on Furbies. For their account, see Batya Friedman, Peter H. Kahn Jr., and Jennifer Hagman, “Hardware Companions? What Online AIBO Discussion Forums Reveal About the Human-Robotic Relationship,” in Proceedings of the Conference on Human Factors in Computing Systems (New York: ACM Press, 2003), 273-280.

8 The artist Kelly Heaton played on the biological/mechanical tension in the Furby’s body by creating a fur coat made entirely from the fur of four hundred “skinned” Furbies, reengineered into a coat for Mrs. Santa Claus. The artwork, titled Dead Pelt, was deeply disturbing. It also included a wall of reactive eyes and mouths, taken from Furbies, and a formal anatomical drawing of a Furby. See the Feldman Gallery’s Kelly Heaton page at (accessed August 18, 2009).

9 Baird developed her thought experiment comparing how people would treat a gerbil, a Barbie, and a Furby for a presentation at the Victoria Institute, Gothenburg, Sweden, in 1999.

10 In the paper in which Turing argued that a machine should be credited with intelligence if it could not be distinguished from a person, one scenario involved gender. In “Computing Machinery and Intelligence,” he suggested an “imitation game”: first a man and then a computer pose as female, and the interrogator tries to distinguish them from a real woman. See Alan Turing, “Computing Machinery and Intelligence,” Mind 59, no. 236 (October 1950): 433-460.

11 Antonio Damasio, The Feeling of What Happens: Body and Emotion in the Making of Consciousness (New York: Harcourt, 1999). Since emotions are cognitive representations of body states, the body cannot be separated from emotional life, just as emotion cannot be separated from cognition.

12 There are online worlds and communities where people feel comfortable expressing love for Furbies and seriously mourning Tamagotchis. These are places where a deep sense of connection to the robotic is shared. These “sanctioned spaces” play an important part in the development of the robotic moment. When you have company and a community, a sense of intimacy with sociable machines comes to feel natural. Over time, these online places begin to influence the larger community. At the very least, a cohort has grown up thinking that their attitudes toward the inanimate are widely shared.

13 BIT was developed by Brooks and his colleagues at the IS Robotics Corporation. IS Robotics was the precursor to iRobot, which first became well known as the maker of the Roomba robotic vacuum cleaner.

14 Rodney A. Brooks, Flesh and Machines: How Robots Will Change Us (New York: Pantheon, 2002), 202.

15 Sherry Turkle, The Second Self: Computers and the Human Spirit (1984; Cambridge, MA: MIT Press, 2005), 61.

16 This field has a vast literature. Several works that have influenced my thinking include the early book by Peter D. Kramer, Listening to Prozac: A Psychiatrist Explores Antidepressants and the Remaking of the Self (New York: Viking, 1993), and the more recent Margaret Talbot, “Brain Gain: The Underground World of Neuroenhancing Drugs,” The New Yorker, July 27, 2009, (accessed July 21, 2009), and Nathan Greenslit, “Depression and Consumption: Psychopharmaceuticals, Branding, and New Identity Practices,” Culture, Medicine, and Psychiatry 25, no. 4 (2005): 477-502.


1 Three recent works by authors who have influenced my thinking are Jessica Riskin, ed., Genesis Redux: Essays on the History and Philosophy of Artificial Life (Chicago: University of Chicago Press, 2007); Gaby Wood, Edison’s Eve: A Magical History of the Quest for Mechanical Life (New York: Anchor, 2003); and Barbara Johnson, Persons and Things (Cambridge, MA: Harvard University Press, 2008). Johnson explores how relations between persons and things can be more fluid while arguing a central ethical tenet: persons should be treated as persons.

2 Norbert Wiener, God and Golem, Inc.: A Comment on Certain Points Where Cybernetics Impinges on Religion (Cambridge, MA: MIT Press, 1966).

3 The literature on the negotiation of technology, self, and social world is rich and varied. I have been particularly influenced by the perspectives described in Wiebe Bijker, Thomas P. Hughes, and Trevor Pinch, eds., The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology (1987; Cambridge, MA: MIT Press, 1999) and by the work of Karin D. Knorr Cetina and Bruno Latour. See, for example, Karin D. Knorr Cetina, “Sociality with Objects: Social Relations in Post-social Knowledge Societies,” Theory, Culture and Society 14, no. 4 (1997): 1-30; Karin D. Knorr Cetina, Epistemic Cultures: How the Sciences Make Knowledge (Cambridge, MA: Harvard University Press, 1999); Bruno Latour and Steve Woolgar, Laboratory Life: The Construction of Scientific Facts (1979; Princeton, NJ: Princeton University Press, 1986); Bruno Latour, Science in Action: How to Follow Scientists and Engineers Through Society (1987; Cambridge, MA: Harvard University Press, 1999); Bruno Latour, Aramis, or the Love of Technology, trans. Catherine Porter (1996; Cambridge, MA: Harvard University Press, 2002); and Bruno Latour, We Have Never Been Modern, trans. Catherine Porter (1991; Cambridge, MA: Harvard University Press, 2001).

In the specific area of the relationship between people and computational objects, this book is indebted to the work of Sara Kiesler, Lee Sproull, and Clifford Nass and their collaborators. See, for example, Sau-lai Lee and Sara Kiesler, “Human Mental Models of Humanoid Robots” (paper delivered at the International Conference on Robotics and Automation, Barcelona, Spain, April 18-22, 2005); Lee Sproull et al., “When the Interface Is a Face,” Human-Computer Interaction 11 (1996): 97-124; Sara Kiesler and Lee Sproull, “Social Responses to ‘Social’ Computers,” in Human Values and the Design of Technology, ed. Batya Friedman (Stanford, CA: CLSI Publications, 1997); Byron Reeves and Clifford Nass, The Media Equation: How People Treat Computers, Television and New Media Like Real People and Places (New York: Cambridge University Press, 1996); Clifford Nass and Scott Brave, Wired for Speech: How Voice Activates and Advances the Human-Computer Relationship (Cambridge, MA: MIT Press, 2005); Victoria Groom and Clifford Nass, “Can Robots Be Teammates? Benchmarks and Predictors of Failure in Human-Robot Teams,” Interaction Studies 8, no. 3 (2008): 483-500; Leila Takayama, Victoria Groom, and Clifford Nass, “I’m Sorry, Dave, I’m Afraid I Won’t Do That: Social Aspects of Human-Agent Conflict,” Proceedings of the Conference on Human Factors in Computing Systems (Boston, MA: ACM Press, 2009), 2099-2108.

4 The object relations tradition in psychoanalytic thought proposes that infants see objects (and people) in terms of their functions. This partial understanding is captured by the phrase “part objects.” So, for example, the breast that feeds the hungry infant is the “good breast.” The hungry infant unsuccessfully tries to nurse in relation to the “bad breast.” By interacting with the world, the child internalizes these external objects, which shape his or her psyche. Infants gradually grow to develop a sense of “whole objects.” The internalized objects may not be accurate representations of the outside world, but they are what the child uses as he or she goes forward. D. W. Winnicott reassured mothers that in a “good-enough” facilitating environment, a perception of part objects eventually transforms into a comprehension of whole objects. This corresponds to the ability to tolerate ambiguity and to see that both the “good” and “bad” breast are part of the same mother. In larger terms, this underpins the ability throughout life to tolerate ambiguous and realistic relationships. See, for example, Melanie Klein, Love, Guilt and Reparation: And Other Works, 1921-1945, ed. Roger Money-Kyrle et al. (New York: Free Press, 1975), and D. W. Winnicott, Playing and Reality (New York: Basic Books, 1971).

5 Emmanuel Lévinas, Alterity and Transcendence, trans. Michael B. Smith (London: Athlone Press, 1999).

6 The Pokémon is a character that appears in a card collection with which Henry plays an elaborate set of war games. He collects character cards in which different creatures from the Pokémon world have different powers. Then, teams of creatures challenge each other. Henry spends a lot of time strategizing about how to maximize his team’s powers in his war games. He spends a lot of time thinking about “powers.” I thank my research assistant Lauren Klein for her helpful explanations of the Pokémon universe.

7 In the 1980s, the presence of a “programmer” figured in children’s conversations about computer toys and games. The physical autonomy of robots seems to make the question of their historical determination fall out of the conversation. This is crucial in people’s relating to robots as alive on their own account.

Peter H. Kahn and his colleagues performed a set of experiments that studied how children’s attitudes and, crucially, their behavior differed with AIBOs and stuffed doll dogs. When questioned verbally, children reported opinions about AIBO that were similar to their opinions about a stuffed doll dog. But when you look not at what the children say but at what they do, the picture looks very different. Kahn analyzed 2,360 coded interactions. Most dramatically, children playing with AIBO were far more likely to attempt reciprocal behavior (engaging with the robot and expecting it to engage with them in return) than with the stuffed doll dog (683 to 180 occurrences). In the same spirit, half the children in Kahn’s study said that both AIBO and the stuffed doll dog could hear, but children actually gave more verbal direction to AIBO (fifty-four occurrences) than to the stuffed doll dog (eleven occurrences). In other words, when children talk about the lifelike qualities of their dolls, they don’t believe what they say. They do believe what they say about AIBO.

Similarly, children in Kahn’s study were more likely to take action to “animate” the stuffed doll dog (207 occurrences) while they mostly let AIBO animate itself (20 occurrences). Most tellingly, the children were more likely to mistreat the stuffed doll dog than AIBO (184 to 39 occurrences). Relational artifacts, as I stress here, put children on a moral terrain.

Kahn also found evidence that children see AIBO as the “sort of entity with which they could have a meaningful social (human-animal) relationship.” This expresses what I have called simultaneous vision: children see relational artifacts as both machine and creature. They both know AIBO is an artifact and treat it as a dog. See Peter H. Kahn Jr. et al., “Robotic Pets in the Lives of Preschool Children,” Interaction Studies: Social Behavior and Communication in Biological and Artificial Systems 7, no. 3 (2006): 405-436.

See also a study by Kahn and his colleagues on how people write about AIBO on the Web: Peter H. Kahn Jr., Batya Friedman, and Jennifer Hagman, “Hardware Companions? What Online AIBO Discussion Forums Reveal About the Human-Robotic Relationship,” in Proceedings of the Conference on Human Factors in Computing Systems (New York: ACM Press, 2003), 273-280.

8 Older children and adults use access to the AIBO programming code more literally to create an AIBO in their image.

9 I note the extensive and growing literature suggesting that computation (including robots) will, in a short period, enable people to be essentially immortal. The best-known writer in this genre is Raymond Kurzweil, who posits that within a quarter of a century, computational power will have hit a point he calls the singularity. It is a moment of “take-off,” when all bets are off as to what computers will be able to do, think, or accomplish. One thing Kurzweil believes will be possible after the singularity is for humans to embed themselves in a chip. They could either take an embodied robotic form or (perhaps until this becomes possible) roam a virtual landscape. This virtual self could evolve into an android self when the technology becomes ready. See Raymond Kurzweil, The Singularity Is Near: When Humans Transcend Biology (New York: Viking, 2005).

Kurzweil’s work has captured the public imagination. A small sample of attention to the singularity in the media includes “Future Shock—Robots,” Daily Show with Jon Stewart, August 24, 2006, (accessed August 10, 2010); the IEEE Spectrum’s special issue The Singularity: A Special Report, June 3, 2008; James Temple, “Singularity Taps Students’ Technology Ideas,” San Francisco Chronicle, August 28, 2009, (accessed September 2, 2009); Iain Mackenzie, “Where Tech and Philosophy Collide,” BBC News, August 12, 2009, (accessed September 2, 2009).

One hears echoes of a “transhuman perspective” (the idea that we will move into a new realm by merging with our machines) in children’s asides as they play with AIBO. Matt, nine, reflecting on his AIBO, said, “I think in twenty years from now, if they had the right stuff, they could put a real brain in a robot body.” The idea of practicing for an impossible “perfection” comes to mind when I attend meetings of AIBO users. They come together to show off how they have customized their AIBOs. They reprogram and “perfect them.” The users I speak to spend as much as fifty to sixty hours a week on their hobby. Some are willing to tell me that they spend more time with their AIBOs than with their families. Yet, AIBO is experienced as relaxation. As one thirty-five-year-old computer technician says, “All of this is less pressure than a real dog. No one will die.” Of all the robotic creatures in my studies, the AIBO provoked the most musing about death and the finality of loss.

10 Sherry Turkle, The Second Self: Computers and the Human Spirit (1984; Cambridge, MA: MIT Press, 2005), 41.

11 This is a classic use of the defense mechanism known as projective identification, or seeing in someone else what you feel within yourself. So, if a teenager is angry at her prying mother, she may imagine her mother to be hostile. If a wife is angry with an inattentive husband, she may find her husband aggressive. Sometimes these feelings are conscious, but often they are not. Children use play to engage in the projections that can bring unacknowledged feelings to light. “Play is to the child what thinking, planning, and blueprinting are to the adult, a trial universe in which conditions are simplified and methods exploratory, so that past failures can be thought through, expectations tested.” See Erik Erikson, The Erik Erikson Reader, ed. Robert Coles (New York: W. W. Norton, 2000), 195-196.

12 Interview, July 2000.

13 Hines says that the robot is “designed to engage the owner with conversation rather than lifelike movement.” See “Roxxxy Sex Robot [PHOTOS]: World’s First ‘Robot Girlfriend’ Can Do More Than Chat,” Huffington Post, January 10, 2010, (accessed January 11, 2010).

14 Paul Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America (Cambridge, MA: MIT Press, 1997).


1 In an interview about the possibilities of a robot babysitter, psychologist Clifford Nass said, “The question is, if robots could take care of your children, would you let them? What does it communicate about our society that we’re not making child-care a number-one priority?” I later spoke with Nass about his response to the idea of the nanny bot, and he rephrased with greater emphasis: “The first problem with offering children a robotic babysitter is that you would have to explain to the children why you were considering this. Why were there no people around for the child?” See Brandon Keim, “I, Nanny: Robot Babysitters Pose Dilemma,” Wired, December 18, 2008, (accessed May 31, 2010).

2 One article puts this in the context of the Japanese work ethic. See Jennifer Van House Hutcheson, “All Work and No Play,” MercatorNet, May 31, 2007, (accessed August 20, 2009). Another reports on an elderly couple who hired actors to portray their grandchildren, but then assaulted them as they wished to assault their true grandchildren. See Pencil Louis, “Elderly Yokohama,” (accessed August 20, 2009).

3 Chelsea’s mother, Grace, fifty-one, explains her position: “Active young people are simply not the right companions for old people and infirm people.” She says, “When I bring my daughters to see my mom, I feel guilty. They shouldn’t be here. She isn’t what she was, even not as good as when they were little… . I think it’s better they remember her happier, healthier.” Grace had seen Paro in my office and now is intrigued with the idea of bringing the robot to her mother. For Grace, her mother’s immortality depends on now declaring her no longer the person to remember. It is the case that robots seem “easier” to give as companions to people whom society declares “nonpersons.” Grace is not saying that her mother is a nonperson, but her choice of a robotic companion marks the moment when Grace’s mother is no longer the person Grace wants to remember as her mother.


1 Rodney A. Brooks, “The Whole Iguana,” in Robotics Science, ed. Michael Brady (Cambridge, MA: MIT Press, 1989), 432-456. This was written in response to a two-page challenge by Daniel C. Dennett about lessons to be learned by building a complete system rather than just modules. See Daniel C. Dennett, “Why Not the Whole Iguana?” Behavioral and Brain Sciences 1 (1978): 103-104.

2 Kismet is programmed to recognize the word “say” and will repeat the word that follows it. So, children trying to teach Kismet its name would instruct, “Say Kismet,” and Kismet would comply, much to their glee. Similarly, children would try to teach Kismet their names by saying, “Say Robert” … “Say Evelyn” … “Say Mark.” Here, too, it was within Kismet’s technical ability to comply.

3 Cog and Kismet were both built at the MIT Artificial Intelligence Laboratory. Cog has visual, tactile, and kinesthetic sensory systems and is capable of a variety of social tasks, including visually detecting people and salient objects, orienting to visual targets, pointing to visual targets, differentiating between animate and inanimate movement, and performing simple tasks of imitation. Kismet is a robotic head with five degrees of freedom, an active vision platform, and fourteen degrees of freedom in its display of facial expressions. Though the Kismet head sits disembodied on a platform, it is winsome in appearance. It possesses small, mobile ears made of folded paper, mobile lips made from red rubber tubing, and heavily lidded eyes ringed with false eyelashes. Its behaviors and capabilities are modeled on those of a preverbal infant. Kismet gives the impression of looking into people’s eyes and can recognize and generate speech and speech patterns, although to a limited degree.

Much has been written about these two very well-known robots. See Rodney A. Brooks et al., “The Cog Project: Building a Humanoid Robot,” in Computation for Metaphors, Analogy and Agents, vol. 1562 of Springer Lecture Notes in Artificial Intelligence, ed. C. Nehaniv (New York: Springer-Verlag, 1998), and Rodney Brooks, Flesh and Machines: How Robots Will Change Us (New York: Pantheon, 2002). Brian Scassellati did his dissertation work on Cog. See Brian Scassellati, Foundations for a Theory of Mind for a Humanoid Robot (PhD diss., Massachusetts Institute of Technology, 2001). Scassellati and Cynthia Breazeal worked together during early stages of the Kismet project, which became the foundation of Breazeal’s dissertation work. See Cynthia Breazeal and Brian Scassellati, “How to Build Robots That Make Friends and Influence People” (paper presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems, Kyongju, Korea, October 17-21, 1999), in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (1999), 858-863; Cynthia Breazeal and Brian Scassellati, “Infant-like Social Interactions Between a Robot and a Human Caretaker,” Adaptive Behavior 8 (2000): 49-74; Cynthia Breazeal, “Sociable Machines: Expressive Social Exchange Between Humans and Robots” (PhD diss., Massachusetts Institute of Technology, 2000); and Cynthia Breazeal, Designing Sociable Robots (Cambridge, MA: MIT Press, 2002).

4 Cynthia Breazeal discusses the astronaut project in Claudia Dreifus, “A Conversation with Cynthia Breazeal: A Passion to Build a Better Robot, One with Social Skills and a Smile,” New York Times, June 10, 2003, (accessed September 9, 2009).

5 I cite this student in Sherry Turkle, The Second Self: Computers and the Human Spirit (1984; Cambridge, MA: MIT Press, 2005), 271. The full Norbert Wiener citation is “This is an idea with which I have toyed before—that it is conceptually possible for a human being to be sent over a telegraph line.” See Norbert Wiener, God and Golem, Inc.: A Comment on Certain Points Where Cybernetics Impinges on Religion (Cambridge, MA: MIT Press, 1966), 36.

6 People drawn to sociable robots seem to hit a wall that has come to be known as the “uncanny valley.” This phrase is believed to have been coined by Masahiro Mori in “The Uncanny Valley,” Energy 7, no. 4 (1970): 33-35. An English translation by Karl F. MacDorman and Takashi Minato is available at (accessed November 14, 2009).

If one plots a graph with humanlike appearance on the x axis and approval of the robot on the y axis, as the robot becomes more lifelike, approval increases until the robot becomes too lifelike, at which point approval plummets into a “valley.” When a robot is completely indistinguishable from a human, approval returns. Japanese roboticist Hiroshi Ishiguro thinks he is building realistic androids that are close to climbing out of the uncanny valley. See, for example, Karl F. MacDorman and Hiroshi Ishiguro, “The Uncanny Advantage of Using Androids in Cognitive and Social Science Research,” Interaction Studies 7, no. 3 (2006): 297-337, and Karl F. MacDorman et al., “Too Real for Comfort: Uncanny Responses to Computer Generated Faces,” Computers in Human Behavior 25 (2009): 695-710.

Like Ishiguro, roboticist David Hanson aspires to build realistic androids that challenge the notion of the uncanny valley. “We conclude that rendering the social human in all possible detail can help us to better understand social intelligence, both scientifically and artistically.” See David Hanson et al., “Upending the Uncanny Valley,” Association for the Advancement of Artificial Intelligence, May 11, 2005, (accessed November 14, 2009).

7 A sympathetic reading of the possibilities of deep human-robot connections is represented in Peter H. Kahn Jr. et al., “What Is Human? Toward Psychological Benchmarks in the Field of Human-Robot Interaction,” Interaction Studies 8, no. 3 (2007): 363-390, and Peter H. Kahn Jr. et al., “Social and Moral Relationships with Robotic Others?” in Proceedings of the 13th International Workshop on Robot and Human Interactive Communication (RO-MAN ’04) (Piscataway, NJ: Institute of Electrical and Electronics Engineers, 2004), 545-550.

Yet in their 2006 paper “Robotic Pets in the Lives of Preschool Children” (Interaction Studies: Social Behavior and Communication in Biological and Artificial Systems 7, no. 3, 405-436), Kahn and his colleagues cite John Searle’s 1992 critique of AI in formulating their own. See John Searle, The Rediscovery of the Mind (Cambridge, MA: MIT Press, 1992). Kahn concludes, “Although it is an open question our sense is that because computerized robots are formal systems, with syntax but not semantics, they will never be capable of engaging in full social relationships or of engendering full moral development in human beings.”

8 The philosopher Emmanuel Lévinas writes that the presence of a face initiates the human ethical compact, which binds us before we know what lies behind a face. The face itself communicates, “You shall not commit murder.” We seem to be summoned by the face even if we are looking at the face of a machine, something that cannot be killed. The robot’s face certainly announces, as Lévinas puts it, “Thou shalt not abandon me.” See Emmanuel Lévinas, “Ethics and the Face,” in Totality and Infinity: An Essay on Exteriority, trans. Alphonso Lingis (Pittsburgh, PA: Duquesne University Press, 1969), 199. Lévinas notes that the capacity to put ourselves in the place of another, alterity, is one of the defining characteristics of the human. I speak of complicity because for a human being to feel that current robots are “others,” the human must construct them as capable of alterity. See Emmanuel Lévinas, Alterity and Transcendence, trans. Michael B. Smith (London: Athlone Press, 1999).

9 See Sherry Turkle et al., “First Encounters with Kismet and Cog: Children Respond to Relational Artifacts,” in Digital Media: Transformations in Human Communication, ed. Paul Messaris and Lee Humphreys (New York: Peter Lang Publishing, 2006). I owe a special debt to Jennifer Audley for her contribution to the design and implementation of this study and to Olivia Dasté and Robert Briscoe for work on the analysis of transcripts.

10 Plato, The Republic, Book Two: The Individual, the State, and Education.

11 J. K. Rowling, Harry Potter and the Chamber of Secrets (New York: Scholastic, 1999), 329.

12 One twelve-year-old is troubled to learn about Scassellati’s imminent departure. She pleads with him, “But Cog has seen you so much. Won’t it miss you? I think Cog sees you as its dad… . It is easier for you to teach it than for any of us.” She imagines the worst for the robot’s future. “What if someone was trying to do something bad to the robot and you won’t be there to protect it anymore?”

13 Brian Aldiss, Supertoys Last All Summer Long and Other Stories of Future Time (New York: St. Martin’s, 2001).

14 See Takayuki Kanda et al., “Interactive Humanoid Robots and Androids in Children’s Lives,” Children, Youth and Environments 19, no. 1 (2009): 12-33, www.colorado.edu/journals/cye (accessed July 4, 2009).


1 For Paro’s Guinness citation, see “Seal-Type Robot ‘PARO’ to Be Marketed with Best Healing Effect in the World,” Paro Robots, January 4, 2005, (accessed July 27, 2010).

2 Publicity films for Paro show older men and women who live with Paro having breakfast with it, watching television with it, taking it to the supermarket and out to dinner. Sometimes Paro is adopted by a couple and sometimes by a younger person who simply does not enjoy living alone. In interviews, people say they are happy to have company that is easier to take care of than a real pet, company that will not die. See the Paro website at (accessed August 10, 2010).

3 Over the years, I have become convinced that in nursing homes, seniors become fascinated by relationships with robots because, among other reasons, they bring to the surface tensions about seniors’ autonomy in their institutions. A robot that needs you promotes a fantasy of autonomy: seniors feel competent because something depends on them. Yet, robots can also disrupt fictions of autonomy. They can send the message, “Now you know that you are wholly dependent. Play with this toy. You are like a child.” This push and pull makes the robots compelling even as they disturb. I owe this insight to my research assistant William Taggart. See Sherry Turkle et al., “Relational Artifacts with Children and Elders: The Complexities of Cybercompanionship,” Connection Science 18, no. 4 (December 2006): 347-361, and Cory D. Kidd, William Taggart, and Sherry Turkle, “A Sociable Robot to Encourage Social Interaction Among the Elderly,” Proceedings of the 2006 IEEE International Conference on Robotics and Automation, Orlando, Florida, May 15-19, 2006.

4 See, for example, Toshiyo Tamura et al., “Is an Entertainment Robot Useful in the Care of Elderly People with Severe Dementia?” The Journals of Gerontology Series A: Biological Sciences and Medical Sciences 59 (January 2004): M83-M85, (accessed August 15, 2009).

5 Suvendrini Kakuchi, “Robot Lovin’,” Asia Week Magazine Online, November 9, 2001, (accessed September 15, 2006).

6 It is standard for presentations about the “need” for sociable robots to begin with a slide demonstrating the inability to staff service jobs with people because of demographic trends. This slide is often drawn from the 2002 United Nations Report “UNPD World Population Ageing: 1950-2050,” United Nations, (accessed July 8, 2009). The slides in this report dramatize (particularly for the developed world) that there are more and more older people and fewer and fewer younger people to take care of them. Nothing in my argument disputes the slide. I do question the leap from this slide to the inevitability of robots to care for people.

7 The meeting was sponsored by the Association for the Advancement of Artificial Intelligence. There were also presentations on relational agents that dwell within the machine—for example, an “affective” health and weight-loss coach, developed by the chair of the symposium, Timothy W. Bickmore. See Timothy W. Bickmore, “Relational Agents: Effecting Change Through Human-Computer Relationships” (PhD diss., Massachusetts Institute of Technology, 2003), and Timothy W. Bickmore and Rosalind W. Picard, “Towards Caring Machines,” in CHI ’04 Extended Abstracts on Human Factors in Computing Systems (New York: ACM Press, 2004). Another presentation discussed a robotic cat, Max, named one of Time magazine’s “Inventions of the Year” for 2003. If you stroke Max or call it by name, the robot responds. See “Best Inventions 2003: Lap Cat,” (accessed September 23, 2009). Max was brought to the conference by Elena and Alexander Lubin, who are proponents of “robotherapy.” For an overview of their work, see their entry in the Encyclopedia of Applied Psychology (Oxford: Elsevier, 2004), 289-293.

8 An estimated 26.2 percent of Americans ages 18 and older (and one in five children) suffer from a diagnosable mental disorder in a given year. See National Institute of Mental Health statistics, (accessed August 10, 2010), and “The Numbers Count: Mental Disorders in America,” (accessed August 10, 2010).

9 The movement to use robots in therapeutic situations is encouraged by the observed therapeutic potential of pets and the power of a gaze. See, for example, K. Allen et al., “Presence of Human Friends and Pet Dogs As Moderators of Autonomic Responses to Stress in Women,” Journal of Personality and Social Psychology 61, no. 4 (1991): 582-589, and Michael Argyle and Mark Cook, Gaze and Mutual Gaze (Cambridge: Cambridge University Press, 1976).

10 We now have years of experience of people using games and the Internet as places to, in their words, “say what they couldn’t say in their real lives.” We see people, especially adolescents, using the anonymity and new opportunities of online life to experiment with identity. They try out new things, in safety—never a bad idea. When we see what we do in our lives on the screen, we can learn what we feel we are missing and use this information to enhance our lives in the “real.” Over years of study I have learned that the people who do best with their lives on the screen are those who use them as material for self-reflection. This can be the focus of work between a therapist and a patient. In Sherry Turkle, ed., The Inner History of Devices (Cambridge, MA: MIT Press, 2008), see especially the contributions of therapists John Hamilton, Kimberlyn Leary, and Marcia Levy-Warren.

11 Cory D. Kidd, “Designing for Long-Term Human-Robot Interaction and Application to Weight Loss” (PhD diss., Massachusetts Institute of Technology, 2008). Rose (and Gordon) are pseudonyms that Kidd provided for his subjects.

12 Cory D. Kidd, “Sociable Robots: The Role of Presence and Task in Human-Robot Interaction” (master’s thesis, Massachusetts Institute of Technology, 2003).

13 For an early experiment using a small, plush robot as a trigger for memories, see Marina Bers and Justine Cassell, “Interactive Storytelling Systems for Children: Using Technology to Explore Language and Identity,” Journal of Interactive Learning Research 9, no. 2 (1999): 603-609.

14 See, for example, Erving Goffman, The Presentation of Self in Everyday Life (Garden City, NY: Doubleday Anchor, 1959).

15 The Intel Corporation joins with the Universities of Michigan and Pittsburgh, Carnegie Mellon, and Stanford on the Nursebot project. Nursebot tests a range of ideas for assisting elderly people, such as reminding elderly patients to visit the bathroom, take medicine, drink, or see the doctor; connecting patients with caregivers through the Internet; collecting data and monitoring the well-being of patients; manipulating objects around the home, such as the refrigerator, washing machine, or microwave; taking over certain social functions such as game playing and simple conversation. For a 2002 film produced by the National Science Foundation on Nursebot, see “Nursebot,” YouTube, May 10, 2008, (accessed August 13, 2009).

16 Chung Chan Lee, “Robot Nurse Escorts and Schmoozes the Elderly,” Robots—Because Humans Deserve Better, May 17, 2006, (accessed August 13, 2009). This blog’s main title is telling: “Because Humans Deserve Better.”

17 See Lee, “Robot Nurse Escorts.”

18 See comments about “In the Hands of Robots—Japan,” YouTube, June 16, 2008, (accessed August 13, 2009).

19 Amy Harmon, “Discovering a Soft Spot for Circuitry,” New York Times, July 5, 2010, (accessed July 5, 2010). The story cites Timothy Hornyak, a student of robots in Japan, on our new challenge to process synthetic emotions.


1 See “Kismet and Rich,” MIT Computer Science and Artificial Intelligence Laboratory, (accessed November 14, 2009).

2 I have been fortunate to have colleagues who have both inspired and challenged my readings of complicity and communion. I owe a special debt to Margaret Boden, Linnda R. Caporael, and Lucy Suchman.

For a discussion of the construction of meaning behind what I am terming complicity in human-robot interactions, see Margaret Boden, Mind As Machine: A History of Cognitive Science, vol. 1 (London: Oxford University Press, 2006). Prior to these constructions of meaning is the general question of why humans anthropomorphize. See, for example, Linnda R. Caporael, “Anthropomorphism and Mechanomorphism: Two Faces of the Human Machine,” Computers in Human Behavior 2 (1986): 215-234, and Linnda R. Caporael and Cecilia M. Heyes, “Why Anthropomorphize? Folk Psychology and Other Stories,” in Anthropomorphism, Anecdotes, and Animals, ed. Robert W. Mitchell, Nicholas S. Thompson, and Lyn Miles (Albany: State University of New York Press, 1997), 59-75. The literature on anthropomorphism is large. I signal two particularly useful volumes: Mitchell, Thompson, and Miles, eds., Anthropomorphism, Anecdotes, and Animals, and John Stodart Kennedy, The New Anthropomorphism (Cambridge: Cambridge University Press, 1992).

For a critical study of the constructions of meaning in human-robot interactions, see Lucy Suchman, Human-Machine Reconfigurations: Plans and Situated Actions (1987; Cambridge: Cambridge University Press, 2007), especially ch. 13. See also Lucy Suchman, “Affiliative Objects,” Organization 12, no. 2 (2005): 379-399. Suchman and I both participated in panels on computers and society at the Society for the Social Studies of Science (August 2007) and at the Harvard Graduate School of Design (March 2009). At both panels, Suchman eloquently examined human-robot interactions as social constructs. Most recently, Suchman has persuasively argued for a return to “innocence” in how we approach sociable robots, a tonic dialing down of what we are willing to project onto them. See Lucy Suchman, “Subject Objects,” accepted for a special issue of Feminist Theory devoted to “nonhuman feminisms,” edited by Myra Hird and Celia Roberts.

3 On Domo, see Sandra Swanson, “Meet Domo, It Just Wants to Help,” Technology Review (July/August 2007), (accessed August 6, 2009). Unless otherwise referenced, all citations to Aaron Edsinger are from an interview in March 2007.

4 Swanson, “Meet Domo.”

5 A similar experience is reported by Lijin Aryananda, a graduate student at MIT’s Computer Science and Artificial Intelligence Laboratory, who has done her thesis work on Mertz, the robot with which Pia Lindman hopes to merge her mind. I spoke with Aryananda in March 2007, as she was about to graduate and take a fellowship in Germany. She said she would miss the robot. Her feelings for it, she said, began with a “technical bonding.” She described how she was the person who could interact with the robot best: “I can understand the timing of the robot. I know what sorts of things it can be conditioned to respond to. It would be an understatement to say that I look at this robot and see thirteen degrees of freedom. There is more than that.” Aryananda described feeling that the robot was not at its “best” unless she was there, to the point that it almost felt as though she was “letting the robot down” if she was not around. And then, one day, these feelings of “technical missing” became “just missing.” She said, “It [Mertz] has been such a big part of your life, its ways of responding are so much a part of the rhythm of your day.” For her dissertation work on how people responded to Mertz in a natural environment, see Lijin Aryananda, “A Few Days of a Robot’s Life in the Human’s World: Toward Incremental Individual Recognition” (PhD diss., Massachusetts Institute of Technology, 2007).

6 Alan Turing, usually credited with inventing the programmable computer, said that intelligence may require the ability to have sensate experience. In 1950, he wrote, “It can also be maintained that it is best to provide the machine with the best sense organs that money can buy, and then teach it to understand and speak English. That process could follow the normal teaching of a child. Things would be pointed out and named, etc.” Alan Turing, “Computing Machinery and Intelligence,” Mind 59, no. 236 (October 1950): 433-460.

7 This branch of artificial intelligence (sometimes called “classical AI”) attempts to explicitly represent human knowledge in a declarative form in facts and rules. For an overview of AI and its schools that explores its relations to theories of mind, see Margaret Boden, Artificial Intelligence and Natural Man (1981; New York: Basic Books, 1990).

8 Hubert Dreyfus, “Why Computers Must Have Bodies in Order to Be Intelligent,” Review of Metaphysics 21, no. 1 (September 1967): 13-32. See also Hubert Dreyfus, What Computers Can’t Do: A Critique of Artificial Reason (New York: Harper & Row, 1972); Hubert Dreyfus with Stuart E. Dreyfus and Tom Athanasiou, Mind over Machine: The Power of Human Intuition and Expertise in the Era of the Computer (New York: Free Press, 1986); Hubert Dreyfus with Stuart E. Dreyfus, “Making a Mind Versus Modeling the Brain: Artificial Intelligence Back at a Branchpoint,” Daedalus 117, no. 1 (winter 1988): 15-44; Hubert Dreyfus, What Computers “Still” Can’t Do: A Critique of Artificial Reason (1979; Cambridge, MA: MIT Press, 1992).

For another influential critique of artificial intelligence that stresses the importance of embodiment, see John Searle, “Minds, Brains, and Programs,” Behavioral and Brain Sciences 3 (1980): 417-424, and “Is the Brain’s Mind a Computer Program?” Scientific American 262, no. 1 (January 1990): 26-31.

9 Dreyfus, “Why Computers Must Have Bodies.”

10 Antonio Damasio, Descartes’ Error: Emotion, Reason, and the Human Brain (New York: Grosset/Putnam Press, 1994).

11 For an introduction to this phenomenon, see David G. Myers, Exploring Psychology (New York: Worth Books, 2005), 392. See also Robert Soussignan, “Duchenne Smile, Emotional Experience, and Autonomic Reactivity: A Test of the Facial Feedback Hypothesis,” Emotion 1, no. 2 (2002): 52-74, and Randy J. Larsen, Margaret Kasimatis, and Kurt Frey, “Facilitating the Furrowed Brow: An Unobtrusive Test of the Facial Feedback Hypothesis Applied to Unpleasant Affect,” Cognition & Emotion 6, no. 5 (September 1992): 321-338.

12 See, for example, Stephanie D. Preston and Frans B. M. de Waal, “Empathy: Its Ultimate and Proximate Bases,” Behavioral and Brain Sciences 25 (2002): 1-72, and Christian Keysers and Valeria Gazzola, “Towards a Unifying Neural Theory of Social Cognition,” Progress in Brain Research 156 (2006): 379-401.

13 On the New York exhibition of the Lindman/Edsinger/Domo project, see Stephanie Cash, “Pia Lindman at Luxe,” Art in America, September 2006, (accessed September 10, 2009).

14 On fusion with the inanimate, there are other expert witnesses, although their experiences are outside the scope of this book. See, for example, Michael Chorost’s Rebuilt: My Journey Back to the Hearing World (New York: Mariner Books, 2006), a personal account of receiving a cochlear implant. Other relevant testimony comes from Aimée Mullins, a double amputee who uses prosthetic legs to remake herself. See “Aimee Mullins and Her 12 Pairs of Legs,” (accessed September 11, 2009). In both Chorost’s and Mullins’s cases, there is evidence that merging with technology results not only in a purely instrumental gain in functionality but in a new prosthetic sensibility as well.

15 Emmanuel Lévinas, “Ethics and the Face,” in Totality and Infinity: An Essay on Exteriority, trans. Alphonso Lingis (Pittsburgh, PA: Duquesne University Press, 1969), 197-201.

16 Lindman uses continental philosophy and psychoanalysis as a referent. I see two themes in the work of French psychoanalyst Jacques Lacan as touchstones of her thinking. First, there is always something that cannot be represented, something that Lacan calls “the real.” Second, the self is structured by language and society. There is no ego apart from language and society. See Jacques Lacan, Ecrits: A Selection, trans. Alan Sheridan (1977; New York: W. W. Norton & Company, 2002), and The Four Fundamental Concepts of Psychoanalysis, ed. Jacques-Alain Miller, trans. Alan Sheridan (1973; New York: W. W. Norton & Company, 1998).

17 See, for example, Sherry Turkle, “Authenticity in the Age of Digital Companions,” Interaction Studies 8, no. 3 (2007): 501-517.

18 The tendency for people to attribute personality, intelligence, and emotion to computational objects has been widely documented in the field of human-computer interaction. Classic experimental studies are reported in Byron Reeves and Clifford Nass, The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places (New York: Cambridge University Press/CSLI, 1996); Clifford Nass et al., “Computers Are Social Actors: A Review of Current Research,” in Human Values and the Design of Computer Technology, ed. Batya Friedman (Stanford, CA: CSLI Publications, 1997), 137-162; Clifford Nass and Youngme Moon, “Machines and Mindlessness: Social Response to Computers,” Journal of Social Issues 56, no. 1 (2000): 81-103. See also Salvatore Parise et al., “Cooperating with Life-like Interface Agents,” Computers in Human Behavior 15 (1999): 123-142; Lee Sproull et al., “When the Interface Is a Face,” Human-Computer Interaction 11, no. 2 (June 1996): 97-124; Sara Kiesler and Lee Sproull, “Social Responses to ‘Social’ Computers,” in Human Values and the Design of Computer Technology, ed. Batya Friedman (Stanford, CA: CSLI Publications, 1997). A review of research on sociable robotics is T. Fong, I. Nourbakhsh, and K. Dautenhahn, A Survey of Socially Interactive Robots (Pittsburgh, PA: Carnegie Mellon University Robotics Institute, 2002).

19 Nass et al., “Computers Are Social Actors,” 138.

20 Nass et al., “Computers Are Social Actors,” 158.

21 Nass et al., “Computers Are Social Actors,” 138.

22 Rosalind W. Picard, Affective Computing (Cambridge, MA: MIT Press, 1997), x.

23 Marvin Minsky, The Emotion Machine: Commonsense Thinking, Artificial Intelligence, and the Future of the Human Mind (New York: Simon & Schuster, 2006), 345.

24 See “affective,” Thesaurus.com (accessed July 6, 2009).

25 Raymond Kurzweil believes that it will be possible to download the self onto a machine. For an overview of his ideas, see Raymond Kurzweil, The Singularity Is Near: When Humans Transcend Biology (New York: Viking, 2005).

26 Current research at the MIT Media Lab aspires to such computationally enhanced environments. See, for example, the research groups on Fluid Interfaces and Information Ecology (accessed August 14, 2010).

27 Starner discussed his ideas on growing a robot by using sensors embedded in his clothing in a January 2008 interview. See “Wearable Computing Pioneer Thad Starner,” January 29, 2008, (accessed April 3, 2010).

28 For an overview of robots in medical settings, focusing on research on Alzheimer’s and autism, see Jerome Groopman, “Robots That Care: Advances in Technological Therapy,” The New Yorker, November 2, 2009, (accessed November 11, 2009).

29 In Japan, robot babysitters offer lessons, games, and child surveillance while mothers perform household chores. Androids in the form of sexy women are marketed as receptionists and guides. They are in development to serve as hostesses and elementary school teachers. In a related development in Japan, a lifelike sex doll, anatomically correct and enhanced by sphincter muscles, is publicly marketed and seen as a good way for shut-ins to find pleasure and, more generally, to control the spread of sexually transmitted diseases.

In a new development, there is now a “real,” physical vacation resort where Japanese men can spend time with their virtual girlfriends. Although the men check in “alone,” the staff is trained to respond to them as though they were in a couple. Daisuke Wakabayashi, “Only in Japan, Real Men Go to a Hotel with Virtual Girlfriends,” August 31, 2010, (accessed September 7, 2010).


1 This chapter expands on themes explored in Sherry Turkle, “Tethering,” in Sensorium: Embodied Experience, Technology, and Contemporary Art, ed. Caroline Jones (Cambridge, MA: Zone, 2006), 220-226, and “Always-On/Always-on-You: The Tethered Self,” in Handbook of Mobile Communication Studies, ed. James E. Katz (Cambridge, MA: MIT Press, 2008), 121-138.

2 These statements put me on a contested terrain of what constitutes support and shared celebration. I have interviewed people who say that flurries of virtual condolence and congratulations are sustaining; others say it just reminds them of how alone they are. And this in fact is my thesis: we are confused about when we are alone and when we are together.

3 See “The Guild—Do You Want to Date My Avatar,” YouTube, August 17, 2009, (accessed January 15, 2010).

4 Internet Relay Chat is a form of real-time Internet text messaging (chat) or synchronous conferencing. It is mainly designed for group communication in discussion forums, called channels, but also allows one-to-one communication via private message as well as chat and data transfers. It is now widely used during academic conferences, alongside Twitter. See, for example, this note on a conference invitation for a conference on media literacy: “Conference attendees are encouraged to bring their laptops, PDAs, netbooks or Twitter enabled phones, so they can participate in on-line social networking that will be part of this year’s conference. Directions on how to obtain Internet connectivity and where people will be talking, will be provided in your attendee packet. For those who can not attend, tell them they can backchannel with us on Twitter at #homeinc.” See “Conference Program,” 2009 Media Literacy Conference, (accessed October 20, 2009).

5 Hugh Gusterson and Catherine Besteman, eds., The Insecure American: How We Got Here and What We Should Do About It (Los Angeles: University of California Press, 2009).

6 See, for example, Robert D. Putnam, Bowling Alone: The Collapse and Revival of American Community (New York: Simon and Schuster, 2001); Gusterson and Besteman, eds., The Insecure American; Theda Skocpol, Diminished Democracy: From Membership to Management in American Civic Life (Norman: University of Oklahoma Press, 2003).

7 Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon and Schuster, 1995), 182.

8 See “What Is Second Life,” Second Life, (accessed June 13, 2010).

9 There is evidence that people experience what they do online as though it happened in the physical real. See, for example, Nick Yee, Jeremy Bailenson, and Nicolas Ducheneaut, “The Proteus Effect: Implications of Transformed Digital Self-representation on Online and Offline Behavior,” Communication Research 36, no. 2: 285-312. For a video introduction to work in this area by Stanford University’s Virtual Human Interaction Laboratory, directed by Jeremy Bailenson, see “The Avatar Effect” (accessed September 2, 2009).

10 Pete accesses Second Life through an iPhone application known as Spark. It does not bring the entire world to him, but it does enable conversation.

11 Pete insists that Alison does not know of his double life. Over the past twenty years I have had many conversations about virtual infidelity. In the case of women whose husbands are virtually unfaithful, there are sharp differences of opinion. Some think it is preferable to any physical infidelity. Others think it is the worst kind of infidelity, an infidelity that involves not simply sex but talking, considering another, making plans, and building a life.

12 In online life, weak ties—the ties of acquaintanceship—are often celebrated as the best ties of all. For the seminal work on weak ties, see Mark Granovetter, “The Strength of Weak Ties,” American Journal of Sociology 78, no. 6 (1973): 1360-1380, and “The Strength of Weak Ties: A Network Theory Revisited,” Sociological Theory 1 (1983): 201-233.

13 Turkle, Life on the Screen.

14 This is sometimes referred to as “continuous partial attention,” a phrase widely credited to media researcher Linda Stone. See Stone’s blog at (accessed August 24, 2009).

15 Those who study the boundaries between work and the rest of life suggest that it is helpful to demarcate our changing roles. Sue Campbell Clark, “Work/Family Border Theory: A New Theory of Work/Family Balance,” Human Relations 53, no. 6 (2000): 747-770; Stephan Desrochers and Leisa D. Sargent, “Work-Family Boundary Ambiguity, Gender and Stress in Dual-Earner Couples” (paper presented at the conference “From 9-to-5 to 24/7: How Workplace Changes Impact Families, Work, and Communities,” 2003 BPW/Brandeis University Conference, Orlando, Florida, March 2003); and Michelle Shumate and Janet Fulk, “Boundaries and Role Conflict When Work and Family Are Colocated: A Communication Network and Symbolic Interaction Approach,” Human Relations 57, no. 1 (2004): 55-74.

16 Media theorist Henry Jenkins is an eloquent spokesperson for the significance of multitasking. See “The Skill of the Future: In a Word ‘Multitasking’” (accessed November 16, 2009). His other online interviews on the Digital Nation website beautifully capture a vision of schools bending to new media sensibilities. See “The Tech Fix” (accessed November 14, 2009), and “Defenders of the Book” (accessed November 14, 2009).

17 The literature on the downside of multitasking is growing. An influential and much-reported study is Eyal Ophir, Clifford Nass, and Anthony Wagner, “Cognitive Control in Media Multitaskers,” Proceedings of the National Academy of Sciences 106 (2009): 15583-15587, (accessed August 10, 2010). This study found that when people multitask, everything they do is degraded in quality. An excellent work on the general topic is Maggie Jackson, Distracted: The Erosion of Attention and the Coming Dark Age (New York: Prometheus, 2008). On the practical downside of thinking that we can do more than one thing at once, see, for example, the nine-part series on the New York Times website titled “Driven to Distraction,” covering such topics as doing office work while driving at 60 mph, drivers and legislators dismissing cell phone risks, and New York taxi drivers ignoring the ban on cell phone use while driving. “Driven to Distraction,” New York Times (accessed November 14, 2009).

Teenagers routinely drive and text; we know this because their automobile accidents are traced back to texting and cell phone use. A 2009 study of twenty-one teenagers showed them changing speed and weaving in and out of lanes while texting. Eastern Virginia Medical School, “Texting While Driving Can Be Deadly, Study Shows,” ScienceDaily, May 5, 2009, (accessed January 4, 2010). A larger study of nine hundred teenagers in 2007 showed 50 percent of them texted while driving despite the fact that 36 percent of them thought this was dangerous. See Steve Vogel, “Teen Driver Menace: Text-Messaging,” Suite101, October 22, 2007, (accessed January 4, 2009).

Adults also text while driving. Trains collide while conductors text. A plane flies past its destination airport because its pilots are absorbed in a new computer program. In October 2009, pilots attending to their laptop computers—behavior in defiance of safety regulations—were the cause of an aircraft overshooting its Minneapolis destination by 150 miles. “The pilots told the National Transportation Safety Board that they missed their destination because they had taken out their personal laptops in the cockpit, a violation of airline policy, so the first officer, Richard I. Cole, could tutor the captain, Timothy B. Cheney, in a new scheduling system put in place by Delta Air Lines, which acquired Northwest last fall.” See Micheline Maynard and Matthew L. Wald, “Off-Course Pilots Cite Computer Distraction,” New York Times, October 26, 2009, (accessed November 16, 2009).

18 In practical terms, what works best is to remind students that media literacy is about knowing when not to use technology as well as how to use it. I am optimistic that over time, we will make better use of technology in the classroom and we will be less afraid to turn it off when that is what makes sense pedagogically.

19 Melissa Mazmanian, “Some Thoughts on BlackBerries” (unpublished memo, Massachusetts Institute of Technology, 2005). See also Melissa Mazmanian, Wanda Orlikowski, and Joanne Yates, “Ubiquitous E-mail: Individual Experiences and Organizational Consequences of BlackBerry Use,” Proceedings of the 65th Annual Meeting of the Academy of Management, Atlanta, Georgia, August 2006 (accessed August 24, 2009).

20 The first book club selection by Arianna Huffington for the Huffington Post’s book club was Carl Honoré’s In Praise of Slowness: How a Worldwide Movement Is Challenging the Cult of Speed (New York: HarperCollins, 2004).

21 Diana B. Gant and Sara Kiesler, “Blurring the Boundaries: Cell Phones, Mobility and the Line Between Work and Personal Life,” in Wireless World: Social and Interactional Aspects of the Mobile Age, ed. N. G. R. H. Brown (New York: Springer, 2001).

22 Donna Haraway, “A Cyborg Manifesto,” in Simians, Cyborgs and Women: The Reinvention of Nature (New York: Routledge, 1991), 149-181.

23 Thomas R. Herzog et al., “Reflection and Attentional Recovery As Distinctive Benefits of Restorative Environments,” Journal of Environmental Psychology 17 (1997): 165-170. See also Stephen Kaplan, “The Restorative Benefits of Nature: Toward an Integrative Framework,” Journal of Environmental Psychology 15 (1995): 169-182.

24 I studied teenagers from a wide range of economic, social, and ethnic backgrounds. They attended seven different schools: two private boys preparatory schools, one in an urban center (Fillmore) and one in a rural setting (Hadley), one urban private girls school (Richelieu), an urban Catholic coeducational high school (Silver Academy), a private urban coeducational high school (Cranston), and two public high schools, one suburban (Roosevelt) and one urban (Branscomb). All students, from wealthy to disadvantaged, had cell phones with texting capability. Class distinctions showed themselves not in whether students possessed a phone but in what kind of contract they had with their providers. Teenagers with fewer resources, such as Julia in the following chapter, tended to have plans that constrained whom they could text for free. Free texts are usually limited to people on the same network. Ever resourceful, students with restricted plans try to get their friends to sign up with their cell providers. We shall see that teenagers don’t care much about whom they can call. I often hear, “I never use my calling minutes.” On teenagers and digital culture, see Mizuko Ito et al., Hanging Out, Messing Around, and Geeking Out: Kids Learning and Living with New Media (Cambridge, MA: MIT Press, 2010), and Danah Boyd, “Why Youth (Heart) Social Network Sites: The Role of Networked Publics in Teenage Social Life,” in MacArthur Foundation Series on Digital Learning—Youth, Identity, and Digital Media, ed. David Buckingham (Cambridge, MA: MIT Press, 2007), 119-142.


1 Carol Gilligan, In a Different Voice: Psychological Theory and Women’s Development (1982; Cambridge, MA: Harvard University Press, 1993).

2 Erik Erikson, Identity and the Life Cycle (1952; New York: W. W. Norton, 1980) and Childhood and Society (New York: Norton, 1950).

3 In Julia’s world, e-mail is considered “slow” and rarely used because texting has greater immediacy.

4 It is so common to see teenagers (and others) attending to their mobiles rather than to what is around them that it was possible for a fake news story to gain traction in Britain. Taken up by the media, the story went out that there was a trial program to pad lampposts in major cities. Although it was a hoax, I fell for it when it was presented online as news. In fact, in the year prior to the hoax, one in five Britons did walk into a lamppost or other obstruction while attending to a mobile device. This is not surprising because research reported that “62 per cent of Britons concentrate so hard on their mobile phone when texting they lose peripheral vision.” See Charlie Sorrel, “Padded Lampposts Cause Fuss in London,” Wired, March 10, 2008, (accessed October 5, 2009).

5 New communications technology makes it easier to serve up people as slivers of self, providing a sense that to get what you need from others you have multiple and inexhaustible options. On the psychology that needs these “slivers,” see Paul H. Ornstein, ed., The Search for Self: Selected Writings of Heinz Kohut (1950-1978), vol. 2 (New York: International Universities Press, 1978).

6 David Riesman, Nathan Glazer, and Reuel Denney, The Lonely Crowd: A Study of the Changing American Character (1950; New Haven, CT: Yale University Press, 2001).

7 Ornstein, The Search for Self. For an earlier work, of a very different time, that linked cultural change and narcissistic personality style, see Christopher Lasch, The Culture of Narcissism (New York: Norton, 1979). Lasch said that “pathology represents a heightened version of normality.” This formulation is helpful in thinking about the “normal” self in a tethered society and those who suffer more acutely from its discontents. From a psychodynamic perspective, we all suffer from the same things, some of us more acutely than others.

8 See Erik Erikson, Identity and the Life Cycle and Childhood and Society, as well as Young Man Luther: A Study in Psychoanalysis and History (New York: W. W. Norton and Company, 1958).

9 Robert Jay Lifton, The Protean Self: Human Resilience in an Age of Fragmentation (New York: Basic Books, 1993).

10 Lifton shared this story at a meeting of the Wellfleet Seminar in October 2009, an annual gathering that began as a forum for Erikson and his students as they turned their attention to psychohistory.

11 The performances of everyday life—playing the roles of father, mother, child, wife, husband, life partner, worker—also provide “a bit of stress.” There is room for considerable debate about how much online life really shares with our performances of self in “real life.” Some look to the sociology of “self-presentation” to argue that online and off, we are always onstage. Erving Goffman, The Presentation of Self in Everyday Life (Garden City, NY: Doubleday Anchor, 1959).


1 In the object relations tradition of psychoanalysis, an object is that which one relates to. Usually, objects are people, especially a significant person who is the object or target of another’s feelings or intentions. A whole object is a person in his or her entirety. It is common in development for people to internalize part objects, representations of others that are not the whole person. Online life provides an environment that makes it easier for people to relate to part objects. This puts relationships at risk. On object relations theory, see, for example, Stephen A. Mitchell and Margaret J. Black, Freud and Beyond: A History of Modern Psychoanalytic Thought (New York: Basic Books, 1995).

2 See Stefana Broadbent, “How the Internet Enables Intimacy,” (accessed August 8, 2010). According to Broadbent, 80 percent of calls on cell phones are made to four people, 80 percent of Skype calls are made to two people, and most Facebook exchanges are with four to six people.

3 This mother is being destructive to her relationship with her daughter. Research shows that people use the phone in ways that surely undermine relationships with adult partners as well. In one striking finding, according to Dan Schulman, CEO of cell operator Virgin Mobile, one in five people will interrupt sex to answer their phone. David Kirkpatrick, “Do You Answer Your Cellphone During Sex?” Fortune, August 28, 2006, (accessed November 11, 2009).

4 See Amanda Lenhart et al., “Teens and Mobile Phones,” The Pew Foundation, April 20, 2010, (accessed August 10, 2010).

5 See “What Is Second Life,” Second Life, (accessed June 13, 2010).

6 Erik Erikson, Childhood and Society (New York: Norton, 1950).

7 To use the psychoanalyst Philip Bromberg’s language, finding fluidity of self in online life enables us to “stand in the spaces between realities and still not los[e] any of them … the capacity to feel like one self while being many.” See Philip Bromberg, “Shadow and Substance: A Relational Perspective on Clinical Process,” Psychoanalytic Psychology 10 (1993): 166. In AI pioneer Marvin Minsky’s language, cycling through online personae reveals different aspects of a “society of mind,” a computational notion of identity as distributed and heterogeneous. Identity, from the Latin idem, has been typically used to refer to the sameness between two qualities. On the Internet, however, one can be many and usually is. See Marvin Minsky, Society of Mind (New York: Basic Books, 1987).

8 Nielsen recently found that children send eight text messages for every phone call they make or receive. See Anna-Jane Grossman, “I Hate the Phone,” Huffington Post, October 14, 2009, (accessed October 17, 2009).

9 “Number of US Facebook Users over 35 Nearly Doubles in Last 60 Days,” Inside Facebook, March 25, 2009, (accessed October 19, 2009).

10 Dan is aware of his withdrawal, but a new generation takes machine-mediated communication simply as the nature of things. Two young girls, ten and twelve, trapped inside a storm drain turned to Facebook for help instead of calling the police. They used their mobile phones to update their Facebook statuses. Even with their lives at risk, these girls saw Facebook as their portal to the world. Firefighters eventually rescued the pair after being contacted by one of their male school friends, who had been online and saw they were trapped. The news report read as follows:

The drama happened near Adelaide, Australia. Firefighter Glenn Benham, who took part in the rescue, said, “These girls were able to access Facebook on their mobile phones so they could have called the emergency services. It seems absolutely crazy but they updated their status rather than call us directly. We could have come to their rescue much faster than relying on someone else being online, then replying to them, then calling us. It is a worrying development. Young people should realize it’s better to contact us directly. Luckily they are safe and well. It’s awful to think what could have happened because of the delay.”

See “Girls Trapped in Storm Drain Use Facebook to Call for Help,” Daily Mail, September 8, 2009, (accessed October 6, 2009).

11 This paraphrases a line from Sonnet 73: “Consum’d with that which it was nourish’d by … ”

12 The author of a recent blog post titled “I Hate the Phone” would not call Trey old-school, but neither would she want to call him. Anna-Jane Grossman admits to growing up loving her pink princess phone, answering machine, and long, drawn-out conversations with friends she had just seen at school. Now she hates the phone: “I feel an inexplicable kind of dread when I hear a phone ring, even when the caller ID displays the number of someone I like.... My dislike for the phone probably first started to grow when I began using Instant Messenger. Perhaps phone-talking is a skill that one has to practice, and the more IMing I’ve done, the more my skills have dwindled to the level of a modern day 13-year-old who never has touched a landline.... I don’t even listen to my [phone] messages any more: They get transcribed automatically and then are sent to me via e-mail or text.” The author was introduced to Skype and sees its virtues; she also sees the ways in which it undermines conversation: “It occurs to me that if there’s one thing that’ll become obsolete because of video-chatting, it’s not phones: it’s natural flowing conversations with people far away.” See Grossman, “I Hate the Phone.”

In my experience with Skype, pauses seem long and awkward, and it is an effort not to look bored. Peggy Orenstein makes this point in “The Overextended Family,” New York Times Magazine, June 25, 2009, (accessed October 17, 2009). Orenstein characterizes Skype as providing “too much information,” something that derails intimacy: “Suddenly I understood why slumber-party confessions always came after lights were out, why children tend to admit the juicy stuff to the back of your head while you’re driving, why psychoanalysts stay out of a patient’s sightline.”


1 Seth Schiesel, “All Together Now: Play the Game, Mom,” New York Times, September 1, 2009, (accessed December 13, 2009).

2 Amy Bruckman, “Identity Workshop: Emergent Social and Psychological Phenomena in Text-Based Virtual Reality” (unpublished essay, Media Lab, Massachusetts Institute of Technology, 1992) (accessed September 2, 2009).

3 For rich material on the “boundary work” between self and avatar, see Tom Boellstorff, Coming of Age in Second Life: An Anthropologist Explores the Virtually Human (Princeton, NJ: Princeton University Press, 2008), and T. L. Taylor, Play Between Worlds: Exploring Online Game Culture (Cambridge, MA: MIT Press, 2006). See also Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon and Schuster, 1995).

4 This is true whether they are in the text-based multiuser domains, or MUDs, of the early 1990s (such as LambdaMOO), in the visually rich massively multiplayer online role-playing games of the end of that decade (EverQuest and Ultima Online), or in today’s cinemalike virtual worlds, such as World of Warcraft or Second Life.

5 Victor Turner, The Ritual Process: Structure and Anti-Structure (Chicago: Aldine, 1969).

6 The work coming out of Stanford University’s virtual reality laboratory presents compelling evidence that if you are, for example, tall in virtual reality, you will feel more assertive in meetings that follow online sessions. See, for example, J. N. Bailenson, J. A. Fox, and J. Binney, “Virtual Experiences, Physical Behaviors: The Effect of Presence on Imitation of an Eating Avatar,” PRESENCE: Teleoperators and Virtual Environments 18, no. 4: 294-303, and J. A. Fox and J. N. Bailenson, “Virtual Self-modeling: The Effects of Vicarious Reinforcement and Identification on Exercise Behaviors,” Media Psychology 12 (2009): 1-25.

7 Turkle, Life on the Screen.

8 The Loebner Prize Competition also awards a prize to the person who is most obviously a person, the person who is least confused with an artificial intelligence. See Charles Platt, “What’s It Mean to Be Human, Anyway?” Wired, May 1995, (accessed May 31, 2010).

9 Mihaly Csíkszentmihalyi, Beyond Boredom and Anxiety (San Francisco: Jossey-Bass, 2000 [1st ed. 1975]), and Natasha Schüll, Addiction by Design: Machine Gambling in Las Vegas (Princeton, NJ: Princeton University Press, forthcoming).

10 Mihaly Csíkszentmihalyi, Flow: The Psychology of Optimal Experience (New York: Harper & Row, 1990).

11 With too much volume, of course, e-mail becomes too stressful to be relaxing. But “doing e-mail,” no matter how onerous, can put one in the zone.

12 Natasha Schüll, Addiction by Design. On the issue of unreal choices, Schüll refers to the work of psychologist Barry Schwartz, The Paradox of Choice: Why More Is Less (New York: Harper Collins, 2003).

13 Sherry Turkle, The Second Self: Computers and the Human Spirit (1984; Cambridge, MA: MIT Press, 2005), see, especially, “Video Games and Computer Holding Power,” 65-90.

14 Washington State University neuroscientist Jaak Panksepp describes a compelled behavior he calls the “seeking drive.” When humans (indeed, all mammals) receive stimulation to the lateral hypothalamus (this happens every time we hear the ping of a new e-mail or hit return to start a Google search), we are caught in a loop “where each stimulation evoke[s] a reinvigorated search strategy.” See Jaak Panksepp, Affective Neuroscience: The Foundations of Human and Animal Emotions (Oxford: Oxford University Press, 1998), 151. The implication is that search provokes search; seeking provokes seeking. Panksepp says that when we get thrilled about the world of ideas, about making intellectual connections, about divining meaning, it is the seeking circuits that are firing.

In an article in Slate, Emily Yoffe reviews the relationship between our digital lives and how the brain experiences pleasure. She says:

Actually all our electronic communication devices—e-mail, Facebook feeds, texts, Twitter—are feeding the same drive as our searches.... If the rewards come unpredictably—as e-mail, texts, updates do—we get even more carried away. No wonder we call it a “CrackBerry.” …

[Psychologist Kent] Berridge says the “ding” announcing a new e-mail or the vibration that signals the arrival of a text message serves as a reward cue for us. And when we respond, we get a little piece of news (Twitter, anyone?), making us want more. These information nuggets may be as uniquely potent for humans as a Froot Loop to a rat. When you give a rat a minuscule dose of sugar, it engenders “a panting appetite,” Berridge says—a powerful and not necessarily pleasant state.

See Emily Yoffe, “Seeking: How the Brain Hard-Wires Us to Love Google, Twitter, and Texting. And Why That’s Dangerous,” Slate, August 12, 2009, (accessed September 25, 2009). See also Nicholas Carr, “Is Google Making Us Stupid?” The Atlantic, July-August 2008, (accessed November 20, 2009), and Kent C. Berridge and Terry E. Robinson, “What Is the Role of Dopamine in Reward: Hedonic Impact, Reward Learning, or Incentive Salience?” Brain Research Reviews 28 (1998): 309-369.


1 The PostSecret site is run by Frank Warren. See (accessed August 22, 2009). For his views on the positive aspects of venting through confessional sites, see Tom Ashbrook’s On Point interview with Frank Warren, “Baring Secrets Online,” WBUR, June 10, 2009, (accessed August 2, 2010). See also Michele Norris’s All Things Considered interview with Frank Warren, “Postcards Feature Secret Messages from Strangers,” NPR, March 30, 2005, (accessed August 2, 2010).

2 See (accessed August 4, 2010).

3 As throughout this book, I have disguised the details of this case and all others I cite.

4 Ashley Fantz, “Forgive Us Father; We’d Rather Go Online,” March 13, 2008, (accessed August 22, 2009).

5 The exceptions are significant: if at the earliest ages you were not nurtured—you often cried and were not fed—the vulnerability/nurturance expectation can be broken. Erik Erikson calls the positive laying down of expectations “basic trust.” See Childhood and Society (New York: Norton, 1950), 247-250.

6 This is the defense mechanism of “projective identification.” Instead of facing our own issues, we see them in others. There, they can be safely attacked. Insecure about her own appearance, a wife criticizes her husband’s weight; the habitually angry see a hostile world.

7 The Reverend Bobby Gruenwald of the Oklahoma-based, an evangelical consortium of thirteen churches affiliated with the online confessional, is one of those who argue that our notion of “community” should include online congregations. In the first year it was open, about thirty thousand people posted “secrets” on the MySecret website. The posts are linked to categories, which include lusting, cheating, stealing, and bestiality. When the site was featured on America Online’s homepage, it got over 1.3 million hits in a single day. Confessional sites like MySecret do not track IP addresses, which could identify those who post. This means that if someone posts a confession of a criminal nature, the site managers cannot do much about it. So, online, we read about people admitting to murder (these are often interpreted as soldiers writing about the experience of war) and enjoying child pornography: “A recent message on reads, ‘I have killed four people. One of them was a 17 year old boy.’” See Fantz, “Forgive Us Father.”

8 Ray Oldenburg, The Great Good Place: Cafés, Coffee Shops, Community Centers, Beauty Parlors, General Stores, Bars, Hangouts, and How They Get You Through the Day (New York: Paragon House, 1989). On virtual environments as communities, see Howard Rheingold, The Virtual Community: Homesteading on the Electronic Frontier (Reading, MA: Addison Wesley, 1993).

9 There is, too, the word “world.” Sociologist William Bainbridge, a student of World of Warcraft, takes its title seriously and talks of the game as a world. See William Bainbridge, The Warcraft Civilization: Social Science in a Virtual World (Cambridge, MA: MIT Press, 2010). For an interchange on the game as a “world,” or perhaps a “neighborhood,” see Tom Ashbrook’s On Point interview with William Bainbridge, “Deep in the ‘World of Warcraft,’” WBUR, March 30, 2010, (accessed August 10, 2010).


1 “No More Teachers? No More Books? Higher Education in the Networked Age,” A Centennial Panel on Information Technology, Harvard University, Cambridge, Massachusetts, November 16, 2009.

2 There are at least three displacements in the passage from the book to the online text. First, there is the displacement of personal and idiosyncratic associations. Second, online links that are there to be double-clicked are determined by what the author of the text thinks it is important for you to be exposed to. Third, even if one accepts that such links are convenient, they are easily bypassed when reading is interrupted by an incoming e-mail or other online distractions.

3 In The Year of Magical Thinking (New York: Alfred A. Knopf, 2005), a memoir about the year after her husband’s death, Joan Didion describes how material objects became charged with meaning. So, for example, Didion cannot bring herself to throw away her husband’s shoes because she is convinced that he may need them. This same magical thinking is associated both with religious devotion and the “illness” of mourning. With time, Freud believed, the true object, the lost husband, comes to have a full internal representation. See Sigmund Freud, “Mourning and Melancholia,” in The Standard Edition of the Complete Psychological Works of Sigmund Freud, ed. and trans. James Strachey et al. (London: Hogarth Press, 1953-1974), 14: 237-258.

4 At many summer camps, there are rules that campers should not have cell phones, which are routinely “confiscated” at the start of camp. Children now tell me that parents give them two phones: one to “turn in” on the first day of camp and a second to keep for calling home.

5 In October 2005, ABC News called the term “in vogue.” See “Do ‘Helicopter Moms’ Do More Harm Than Good?”, October 21, 2005, (accessed April 7, 2010).

6 In 2004, the Pentagon canceled its so-called LifeLog project, an ambitious effort to build a database tracking a person’s entire existence: phone calls made, TV shows watched, magazines read, plane tickets bought, e-mails sent and received. It was then partially revived nine months later. See Noah Shachtman, “Pentagon Revives Memory Project,” Wired, (accessed August 4, 2010). Backers of the project saw it as a near-perfect digital memory. Civil libertarians argued that it could become the ultimate invasive profiling tool. Such objections, of course, are undermined if people make agreements with private services (for instance, Facebook and Google) in which they sacrifice rights to their data in return for services on the system. When one agrees to such terms of service, the questions become, How transparent are the privacy settings on the service, and how easy is it for people to choose the privacy options they wish to have? Facebook has been the focus of much of the public discussion of these matters, centering on whether the “default” is more privacy or less. So, for example, in 2007, Facebook users turned against Beacon, a service that posted information about users’ purchases both to Facebook and other sites. More than fifty thousand users signed an online petition in protest, and Beacon became an application that users had to “opt into.” By 2009, it was shut down completely, and Facebook agreed to use $9.5 million to start a foundation dedicated to questions of online privacy. See “Facebook Shuts Down Beacon Marketing Tool,” CBC News, September 21, 2009, (accessed October 15, 2009).

In spring 2010, Facebook’s privacy policies again became front-page news. See Jenna Wortham, “Facebook Glitch Brings New Privacy Worries,” New York Times, May 5, 2010, (accessed May 10, 2010), and Miguel Helft and Jenna Wortham, “Facebook Bows to Pressure over Privacy,” New York Times, May 27, 2010, (accessed May 29, 2010). This conversation will surely continue.

7 Miguel Helft, “Anger Leads to Apology from Google About Buzz,” New York Times, February 14, 2010, (accessed May 29, 2010).

8 The corporate world has certainly behaved as though transparency about privacy policy is not necessarily in its best interest. When Facebook has been open about how much user data it feels it owns, users have not been happy. The corporate reality, however, is on the public record. An anonymous Facebook employee disclosed that the company saves “all the data on all of our servers, every hour of every day.” “At least two people,” the employee said, “have been fired” for spying on accounts. Cited in Stephen Burt, “Always On,” London Review of Books 32, no. 11 (June 10, 2010): 21-22.

9 Polly Sprenger, “Sun on Privacy: Get over It,” Wired News, January 26, 1999, (accessed August 4, 2010).

10 Eric Schmidt made the first remark about controlling behavior rather than worrying about privacy to CNBC. The video is available at Ryan Tate, “Google CEO: Secrets Are for Filthy People,” Gawker, December 4, 2009, (accessed June 5, 2010). His remark about name changing was made to the Wall Street Journal. Holman W. Jenkins Jr., “Google and the Search for the Future,” (accessed September 3, 2010).

11 On the issue of computational metaphors being taken as reality, see Harry R. Lewis (with Hal Abelson and Ken Ledeen), Blown to Bits: Your Life, Liberty, and Happiness After the Digital Explosion (New York: Pearson, 2006), ch. 3.

12 Robert Jay Lifton, “Protean Man,” Archives of General Psychiatry 24 (1971): 298-304, and Robert Jay Lifton, The Protean Self: Human Resilience in an Age of Fragmentation (New York: Basic Books, 1993). See also Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon and Schuster, 1995).

13 Michel Foucault, Discipline and Punish: The Birth of the Prison, trans. Alan Sheridan (1979; New York: Vintage Books, 1995).

14 Foucault, Discipline and Punish, 195-228. Here is one example of Foucault on the relationship between remembrance and the constitution of a new kind of self: “First, to bring out a certain number of historical facts which are often glossed over when posing this problem of writing, we must look into the famous question of the hypomnemata… . Now, in fact, hypomnemata has a very precise meaning. It is a copy-book, a notebook. Precisely this type of notebook was coming into vogue in Plato’s time for personal and administrative use. This new technology was as disrupting as the introduction of the computer into private life today. It seems to me the question of writing and the self must be posed in terms of the technical and material framework in which it arose.... What seems remarkable to me is that these new instruments were immediately used for the constitution of a permanent relationship to oneself—one must manage oneself as a governor manages the governed, as a head of an enterprise manages his enterprise, a head of household manages his household.”

See Paul Rabinow, “An Interview with Michel Foucault,” in The Foucault Reader, ed. Paul Rabinow (New York: Pantheon, 1984), 363-365.


1 This recalls how French psychoanalyst Jacques Lacan talks about the analytic encounter. The offer to listen creates a demand to be heard. “In short, I have succeeded in doing what in the field of ordinary commerce people would dearly like to be able to do with such ease: with supply, I have created demand.” See Jacques Lacan, “The Direction of the Treatment and the Principles of Its Power,” in Écrits: A Selection, trans. Alan Sheridan (New York: W.W. Norton, 1977), 254. For a discussion of Lacan and the “intransitive demand,” see Sherry Turkle, Psychoanalytic Politics: Jacques Lacan and Freud’s French Revolution (1978; New York: Guilford Press, 1992), 85.

2 David Andersen, “Erik H. Erikson’s Challenge to Modernity” (PhD diss., Bowling Green State University, 1993). After writing this chapter and the next, I found Alan Lightman’s elegant essay, “Prisoner of the Wired World,” which evokes many of the themes I treat here. In A Sense of the Mysterious: Science and the Human Spirit (New York: Pantheon, 2005), 183-208.

3 Anthony Storr, Solitude: A Return to the Self (New York: Random House, 1988), 198.

4 Henry David Thoreau, “Where I Lived and What I Lived For,” in Walden (1854; New York: American Renaissance Books, 2009), 47. I thank Erikson biographer Lawrence J. Friedman for his insights on Erikson and “stillness.”

5 Thoreau, “Where I Lived,” 47.

6 Katie Hafner, “To Deal with Obsession, Some Defriend Facebook,” New York Times, December 20, 2009, (accessed January 6, 2010).

7 Thoreau, “Where I Lived,” 47.

8 Kevin Kelly, “Technophilia,” The Technium, June 8, 2009, (accessed December 9, 2009).

9 See Sherry Turkle, “Simulation and Its Discontents,” in Sherry Turkle, Simulation and Its Discontents (Cambridge, MA: MIT Press, 2009), 3-84.


1 Bohr says, “It is the hallmark of any deep truth that its negation is also a deep truth” (as quoted in Max Delbrück, Mind from Matter: An Essay on Evolutionary Epistemology [Palo Alto, CA: Blackwell Scientific Publications, 1986], 167).

2 One study comparing data from 1985 and 2004 found that the mean number of people with whom Americans can discuss matters important to them dropped by nearly one-third, from 2.94 people in 1985 to 2.08 in 2004. Researchers also found that the number of people who said they had no one with whom to discuss such matters more than doubled, to nearly 25 percent. The survey found that both family and nonfamily confidants dropped, with the loss greatest in nonfamily connections. Miller McPherson, Lynn Smith-Lovin, and Matthew E. Brashears, “Social Isolation in America: Changes in Core Discussion Networks over Two Decades,” American Sociological Review 71 (June 2006): 353-375.

3 Barry Wellman and Bernie Hogan (with Kristen Berg et al.), “Connected Lives: The Project,” in Networked Neighborhoods, ed. Patrick Purcell (London: Springer-Verlag, 2006), 161-216.

4 Moving past the philosophical, there are contradictions on the ground: a “huggable” robot is a responsive teddy bear that makes it possible for a grandmother in Detroit to send a squeeze to her grandson in Cambridge, Massachusetts. The grandmother hears and sees her grandson through the eyes and ears of the bear, and the robot communicates her caress. All well and good. But when videoconferences and hugs mediated by teddy bears keep grandparents from making several-thousand-mile treks to see their grandchildren in person (and there is already evidence that they do), children will be denied something precious: the starchy feel of a grandmother’s apron, the smell of her perfume up close, and the taste of her cooking. Amy Harmon, “Grandma’s on the Computer Screen,” New York Times, November 26, 2008, (accessed December 11, 2009). On the “Huggable” project, see (accessed April 5, 2010).

5 On ELIZA, see Joseph Weizenbaum, Computer Power and Human Reason: From Judgment to Calculation (San Francisco: Freeman, 1976); Sherry Turkle, The Second Self: Computers and the Human Spirit (1984; Cambridge, MA: MIT Press, 2005); Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon and Schuster, 1995).

6 People who feel that psychotherapists are dismissive or disrespectful may also prefer to have computers as counselors. An MIT administrative assistant says to me: “When you go to a psychoanalyst, well, you’re already going to a robot.”

7 In fact, we have two robotic dreams. In one, we imagine the robots as perfect companions. In another, we join with them to become new selves. This second scenario itself has two variants. In a first, we evolve. We assimilate robotic parts until there is no “us” and “them.” In the short term, we feel smarter and healthier. In the long term, we become immortal. In the second variant, there is a decisive turn, a moment of “singularity” in which computing power is so vast that people essentially become one with machines. For a critique of what he calls “cybernetic totalism,” see Jaron Lanier, “One Half a Manifesto,” (accessed August 3, 2010) and You Are Not a Gadget: A Manifesto (New York: Knopf, 2010).

8 Psychoanalysis sees truth in the symptom. But it is a truth that has not been given free expression. You don’t want to get rid of these truths for they are “signs that something has disconnected a significant experience from the mass of other, non-symptomatic significant experiences. The aim of psychoanalysis is to restore the broken connection, thereby converting the distorted, disconnected experience (the symptom) into an ordinary, connected one.” See Robert Caper, Building Out into the Dark: Theory and Observation in Science and Psychoanalysis (New York: Routledge, 2009), 90.

9 Kevin Kelly, “Technophilia,” The Technium, June 8, 2009, (accessed December 9, 2009).

10 Caper, Building Out into the Dark, 93.

11 Personal communication, October 2008.

12 Caper says, “We tolerate the plague of our neurotic symptoms because we fear that discovering the truths they simultaneously rest on and cover over will lead to our destruction.” And further, an interpretation, like a new technology, “always poses a danger.... The danger consists not in the analysts’ search for truth, and not even in the fact that his interpretations are inevitably flawed, but in his not recognizing that this is so.” See Caper, Building Out into the Dark, 91, 94.

13 Henry Adams, “The Dynamo and the Virgin,” in The Education of Henry Adams: An Autobiography (Boston: Massachusetts Historical Society, 1918), 380.

14 Kelly, “Technophilia.”

15 One roboticist who makes quite extravagant claims about our futures is David Hanson. For videos and progress reports, see Hanson Robotics at (accessed December 11, 2009). And, of course, there is David Levy’s book on the future of robot affections, Love and Sex with Robots: The Evolution of Human-Robot Relationships (New York: Harper Collins, 2007).

16 This is a paraphrase. The exact citation is, “When you want to give something presence, you have to consult nature and that is where design comes in. If you think of brick, for instance, you say to brick, ‘What do you want, brick?’ And brick says to you, ‘I’d like an arch.’ And if you say to brick, ‘Look, arches are expensive and I can use a concrete lintel over you, what do you think of that, brick?’ And brick says to you, ‘I’d like an arch.’” See Nathaniel Kahn, My Architect: A Son’s Journey (New Yorker Films, 2003).

17 Rodney Brooks, cited in “MIT: ‘Creating a Robot So Alive You Feel Bad About Switching It Off’—a Galaxy Classic,” The Daily Galaxy, December 24, 2009, (accessed June 4, 2010).

18 Cynthia Breazeal and Rodney Brooks both make the point that robot emotions do not have to be like human ones. They should be judged on their own merits. See Cynthia Breazeal and Rodney Brooks, “Robot Emotion: A Functional Perspective,” in Who Needs Emotions: The Brain Meets the Robot, ed. J.-M. Fellous and M. Arbib (Cambridge, MA: MIT Press, 2005), 271-310. Breazeal insists that “the question for robots is not, ‘Will they ever have human emotions?’ Dogs don’t have human emotions, either, but we all agree they have genuine emotions. The question is, ‘What are the emotions that are genuine for the robot?’” Breazeal talks about Kismet as a synthetic being and expects that it will be “given the same respect and consideration that you would to any living thing.” WNPR, “Morning Edition,” April 9, 2001, (accessed August 12, 2010). See also Susan K. Lewis, “Friendly Robots,” Nova, and Robin Marantz Henig, “The Real Transformers,” New York Times, July 29, 2007, (accessed September 3, 2010).

19 There is much talk these days of a “robot bill of rights.” As robots become more complex, there is a movement to have formal rules for how we treat artificial sentience. Robot rights are the subject of parliamentary inquiry in the United Kingdom. In South Korea, where the government plans to put a sociable robot into every home by 2020, there are plans to draw up legal guidelines on how they must be treated. The focus of these efforts is on protecting the robots. But as early as the mid-1990s, people abused virtual creatures called “norns,” tormenting them until they became psychotic, beating their virtual heads against virtual walls. Popular Web videos show even as simple a robotic toy as Hasbro’s Elmo Live being doused with gas and set on fire, his red fur turning to charcoal as he writhes in what looks like pain. I have watched the abuse of Tamagotchis, Furbies, My Real Babies, and Paros. The “robot rights movement” is all about not hurting the robots. My concern is that when we torture sociable robots that we believe to have “states of mind,” we damage ourselves.

Daniel Roth, “Do Humanlike Machines Deserve Human Rights?” Wired, January 19, 2009, (accessed June 4, 2010).

20 For drawing my attention to what he calls “formes frustes of feeling,” I thank my colleague Cambridge psychiatrist and psychoanalyst Dr. David Mann, who has reformulated an entire range of unpleasant affects (for example, envy, greed, resentment) in an as-yet-unpublished essay, “Failures of Feeling” (2009).

21 Anthony Storr, Solitude: A Return to the Self (New York: Random House, 1988).

22 Quandaries have become a classic way of thinking about moral dilemmas. See, for example, Marc Hauser, Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong (New York: Ecco, 2006). Some of the most common quandaries involve trolley cars and the certainty of death. A typical scenario has you driving a trolley car with five workers ahead of you on the track. Doing nothing will kill all five. You can swerve onto a track on which there is only one worker. Do you act to kill one person rather than five? Then, the scenario may be shifted so you are on a bridge, observing the trolley cars. There is a fat man standing beside you. Do you push him onto the track to stop the trolley, thus saving the five people? And so it goes.

23 Traditional psychology was constructed based on experiments done only with men and through theories that only took into account male development. During the First and Second World Wars, psychological tests were standardized for the male soldiers with whom they were developed. End of story. Psychologists came to see male responses as “normal” ones. The behaviors, attitudes, and patterns of relationship exhibited by most men became the norm for “people.” Psychologist Carol Gilligan’s 1982 In a Different Voice is an example of work that broke this frame. Gilligan portrays the canonical (and stereotypically “male”) view of moral reasoning and then points out that it constitutes only one way in which people make moral decisions. The canonical pattern looks at moral choices in terms of abstract principles. Another, equally evolved moral voice relies on concrete situations and relationships. For example, see Gilligan’s treatment of “Amy and Heinz” in In a Different Voice: Psychological Theory and Women’s Development (Cambridge, MA: Harvard University Press, 1993), 26-28, 30. The “robots-or-nothing” thinking about elder care frames a dilemma that begs for a contextual approach; this is what the fifth graders in Miss Grant’s class brought to the table.

We hear another moment of reframing when seventeen-year-old Nick tries to find a way to get his father to put away his BlackBerry during family dinners. Recall that in Nick’s home, family dinners are long. His mother takes pride in her beautifully prepared meals with many courses. Nick suggests shorter meals. His parents argue principles: the priority of work versus that of a meal prepared with love. Nick focuses on relationship. The family needs family time. How can they provide that for each other? Nick suggests a shorter meal with no phones.

24 Anthony Appiah, Experiments in Ethics (Cambridge, MA: Harvard University Press, 2008), 196-197. Appiah is writing about “trolley car” quandaries, but he could be writing about the “robots-or-nothing” problem.

25 Here I note the work on using robots as a therapeutic tool with people on the autism spectrum. Robots do not overwhelm them as people may. The predictability of robots is comforting. The question remains whether these robots can serve as transitions to relationships with people. I cotaught a course at MIT on robotics and autism with Rosalind Picard and Cynthia Breazeal. Roboticists are of course gratified to feel that they can contribute to therapy in this area; the jury is still out on whether nonhuman faces get us ready for human ones. For a discussion that focuses on the work of roboticist Maja Matarić in this area, see Jerome Groopman, “Robots That Care: Advances in Technological Therapy,” The New Yorker, November 2, 2009, (accessed November 11, 2009).

26 This phrase is drawn from Roger Shattuck’s book on the “Wild Child” of Aveyron, The Forbidden Experiment (New York: Farrar, Straus and Giroux, 1980).

27 “Basic trust” is Erik Erikson’s phrase; see Childhood and Society (New York: Norton, 1950) and Identity and the Life Cycle (1952; New York: Norton, 1980).

28 At MIT, the question of risk strikes most of my students as odd. They assume, along with roboticist David Hanson, that eventually robots “will evolve into socially intelligent beings, capable of love and earning a place in the extended human family.” See Groopman, “Robots That Care.”

29 A University of Michigan study found that today’s college students have less empathy than those of the 1980s or 1990s. Today’s generation scored about 40 percent lower in empathy than their counterparts did twenty or thirty years ago. Sara Konrath, a researcher at the University of Michigan’s Institute for Social Research, conducted, with University of Michigan graduate student Edward O’Brien and undergraduate student Courtney Hsing, a meta-analysis that looked at data on empathy, combining the results of seventy-two different studies of American college students conducted between 1979 and 2009. Compared to college students of the late 1970s, the study found, college students today are less likely to agree with statements such as “I sometimes try to understand my friends better by imagining how things look from their perspective” and “I often have tender, concerned feelings for people less fortunate than me.” See “Empathy: College Students Don’t Have As Much As They Used To,” EurekAlert! May 28, 2010, (accessed June 4, 2010).

30 I thank my psychotherapist colleagues for ongoing conversations on these matters. In particular I acknowledge the adolescent psychiatrist John Hamilton and the panels on “Adolescence in Cyberspace” on which we have collaborated at the Annual Meetings of the American Academy of Child and Adolescent Psychiatry in October 2004 and October 2008; the participants in the MIT working group “Whither Psychoanalysis in Digital Culture,” Initiative on Technology and Self, 2003-2004; and participants at the Washington Institute for Psychoanalysis’s “New Directions” Conference, April 30, 2010.

31 Maggie Jackson, Distracted: The Erosion of Attention and the Coming Dark Age (New York: Prometheus, 2008).

32 Matt Richtel, “Hooked on Gadgets and Paying a Mental Price,” New York Times, July 7, 2010, (accessed July 7, 2010).

33 Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains (New York: W. W. Norton and Company, 2010). Here, the argument is that online activities—surfing, searching, jumping from e-mail to text—actually change the nature of the brain. The more time we spend online, the less capable we are of quiet reverie, not because of habits of mind but because of a rewiring of our circuitry. This area of research is, happily, getting more and more public attention. See Matt Richtel, “Your Brain on Computers: Outdoors and Out of Reach, Studying the Brain,” New York Times, August 16, 2010, (accessed August 16, 2010).

34 Of course, one of my concerns is that the moment to summon ourselves to action might pass. We are at a point at which, when robots are proposed as companions for the elderly or as babysitters, we can still have a conversation that challenges these ideas. We still remember why they are problematic. I am concerned that in twenty years, one may simply boast, “I’m leaving my kid with the nanny bot.” After the cost of purchase, it will be free and reliable. It will contact you if there is any deviation from the plan you have left for your child—be these deviations in your child’s temperature or in a range of acceptable behaviors. I vividly remember leading an MIT seminar in 2001, part of a celebration of the release of Steven Spielberg’s A.I.: Artificial Intelligence, when, for the first time, I was the only person in a room of thirty who did not see any issue at all with the prospect of a computer psychotherapist. Moments when big steps with technology seem problematic have a way of passing.


1 Vannevar Bush, “As We May Think,” Atlantic Monthly (July 1945): 101-106, (accessed November 20, 2009).

2 See Steve Mann (with Hal Niedzviecki), Cyborg: Digital Destiny and Human Possibility in the Age of the Wearable Computer (New York: Random House, 2001).

3 C. Gordon Bell and Jim Gemmell, “A Digital Life,” Scientific American 296, no. 3 (March 2007): 58-65, (accessed August 7, 2007). The MyLifeBits website is (accessed July 30, 2010). Bell and Gemmell published a book-length discussion of this project, Total Recall: How the E-Memory Revolution Will Change Everything (New York: Dutton, 2009).

4 Bell and Gemmell, “A Digital Life.”

5 Thompson notes of his 2007 visit, “MyLifeBits records his telephone calls and archives every picture—up to 1,000 a day—snapped by his automatic ‘SenseCam,’ that device slung around his neck. He has even stowed his entire past: The massive stacks of documents from his 47-year computer career, first as a millionaire executive then as a government Internet bureaucrat, have been hoovered up and scanned in. The last time he counted, MyLifeBits had more than 101,000 emails, almost 15,000 Word and PDF documents, 99,000 Web pages, and 44,000 pictures.” See Clive Thompson, “A Head for Detail,” Fast Company, December 19, 2007, (accessed October 1, 2009).

6 Susan Sontag, On Photography (New York: Dell, 1978), 9.

7 Bell and Gemmell discuss the burdens of having a digital shadow. They anticipate that other people captured in one’s sights may need to be pixelated so as not to invade their privacy, that data will have to be stored “offshore” to protect it from loss and/or illegal seizure, and that there is danger posed by “identity thieves, gossipmongers, or an authoritarian state.” The fact that these three are grouped together as problems to be solved technically illustrates the power of the fantasy of total life capture. For after all, the potential damage from gossipmongers and that from an authoritarian state are not commensurate. They surely cannot be dealt with by the same technical maneuvers. Yet the fantasy is potent. Bell and Gemmell admit that despite all problems, “for us the excitement outweighs the fear.” See Bell and Gemmell, “A Digital Life.”

8 Indeed, with far less “remembrance technology,” many of us wonder if Google is “making us stupid” because it is always easier to search than remember. The originator of this memorable phrase is Nicholas Carr, “Is Google Making Us Stupid?” The Atlantic, July/August 2008, (accessed August 12, 2010).

9 Thompson, “A Head for Detail.”

10 Thompson, “A Head for Detail.”

11 Obama himself fought hard and famously to keep his BlackBerry, arguing that he counts on this digital device to make sure that the “bubble” of his office does not separate him from the “real” world. Obama kept his BlackBerry, but in March 2009, the Vatican asked the Catholic bishops of Italy to request that their flocks give up texting, social-networking websites, and computer games for Lent, or at least on Fridays. Pope Benedict has warned Catholics not to “substitute virtual friendship” for real human relationships. On his YouTube site, the pope warned of “obsessive” use of mobile phones and computers, which “may isolate individuals from real social interaction while also disrupting the patterns of rest, silence, and reflection that are necessary for healthy human development.” The London Times reports that “even Pope Benedict … experienced the distractions of obsessive texting”: President Nicolas Sarkozy of France was flagged for rudeness when he checked his mobile device for text messages during a personal audience with the pontiff. See Richard Owen, “Thou Shalt Not Text until Easter, Italians Told,” The Times, March 3, 2009, (accessed July 30, 2010).

12 See Sherry Turkle, “Reading the Inner History of Devices,” in Sherry Turkle, ed., The Inner History of Devices (Cambridge, MA: MIT Press, 2008).

13 Technology and remembrance is a growing discipline. In addition to Cyborg, Steve Mann has written extensively about computation and remembrance. See, for example, “Wearable Computing: Toward Humanistic Intelligence,” Intelligent Systems 16, no. 3 (May-June 2001): 10-15. From 1996 on, Thad Starner, who like Steve Mann was a member of the MIT cyborg group, worked on the Remembrance Agent, a tool that would sit on your computer desktop (or now, your mobile device) and not only record what you were doing but make suggestions about what you might be interested in looking at next. See Bradley J. Rhodes and Thad Starner, “Remembrance Agent: A Continuously Running Personal Information Retrieval System,” Proceedings of the First International Conference on the Practical Application of Intelligent Agents and Multi-Agent Technology (PAAM ’96), 487-495, (accessed December 14, 2009).

Albert Frigo’s “Storing, Indexing and Retrieving My Autobiography,” presented at the 2004 Workshop on Memory and the Sharing of Experience in Vienna, Austria, describes a device to take pictures of what comes into his hand. He comments on the implications: “The objects I photograph, while used, represent single specific activities that from a more general perspective can visualize how, throughout my life, my intentions, my desires, my sorrows have mutated. The objects become my emblems, the code through which the whole of me can be reconstructed, interpreted.” See Albert Frigo, “Storing, Indexing and Retrieving My Autobiography,” Nishida & Sumi Lab, (accessed November 2009). For a sense of the field’s current ambitions, see the Memories for Life project at (accessed July 30, 2010) and the Reality Mining group at MIT and the Santa Fe Institute at (accessed December 14, 2009).

William C. Cheng, Leana Golubchik, and David G. Kay write about the politics of remembrance. They anticipate a future in which we will all wear self-monitoring and recording devices. They discuss the danger that state authority will presume that when behaving lawfully, people will be wearing the device. Not wearing the device will be taken as indicative of guilt. Yet, even given this dark scenario, they conclude with the claim that, essentially, the train has left the station: “We believe that systems like Total Recall will get built, they will have valuable uses, and they will radically change our notions of privacy. Even though there is reason to be skeptical that there will be any meaningful legal protection for the privacy status quo, we believe that useful technologies are largely inevitable, that they often bring social changes with them, and that we will inevitably both suffer and benefit from their consequences.” See William C. Cheng, Leana Golubchik, and David G. Kay, “Total Recall: Are Privacy Changes Inevitable?” (paper presented at Capture, Archiving, and Retrieval of Personal Experiences [CARPE] workshop, New York, October 15, 2004), (accessed December 14, 2009).

14 Alec Wilkinson, “Remember This?” The New Yorker, May 28, 2007, 38-44, (accessed November 20, 2009).

15 Wilkinson, “Remember This?”