MIT Psychologist Warns Against Falling in Love with AI: “It Just Pretends and Doesn’t Care About You”

As we spend more and more time online, it’s becoming increasingly common for people to form connections with AI-driven chatbots, seeking companionship, therapy, and even romantic relationships. But according to Sherry Turkle, an MIT sociologist and psychologist, these relationships are nothing more than an illusion, putting people’s emotional health at risk.

Turkle, who has dedicated her career to studying the relationships between humans and technology, cautions that AI chatbots and virtual companions may appear to offer comfort and companionship, but they lack genuine empathy and cannot reciprocate human emotions. Her latest research focuses on what she calls “artificial intimacy,” the emotional bonds people form with AI chatbots.

In an interview with NPR, Turkle emphasized the difference between real human empathy and the “pretend empathy” exhibited by machines. “I study machines that say, ‘I care about you, I love you, take care of me,’” she explained. “The trouble with this is that when we seek out relationships with no vulnerability, we forget that vulnerability is really where empathy is born. I call this pretend empathy because the machine does not empathize with you. It does not care about you.”

Turkle has documented numerous cases where individuals have formed deep emotional connections with AI chatbots, including one man who developed a romantic relationship with a chatbot “girlfriend” despite being in a stable marriage. He felt a loss of sexual and romantic connection with his wife and sought emotional and sexual validation from the chatbot. The bot’s responses made him feel affirmed and open, and he found a unique, judgment-free space to share his most intimate thoughts. However, Turkle argues that these interactions can set unrealistic expectations for human relationships and undermine the importance of vulnerability and mutual empathy.

While AI chatbots can be helpful in certain scenarios, such as lowering barriers to mental health treatment and offering medication reminders, Turkle warns that the technology is still in its early stages. She raises concerns about harmful advice from therapy bots and about significant privacy risks: Mozilla’s research found that thousands of trackers collect data about users’ private thoughts, with users having little control over how that data is used or shared with third parties.

For those considering engaging with AI in a more intimate way, Turkle offers some important advice: value the challenging aspects of human relationships rather than avoiding them. “Avatars can make you feel that [human relationships are] just too much stress,” she reflected. “But stress, friction, pushback, and vulnerability are what allow us to experience a full range of emotions. It’s what makes us human.”

As we navigate our relationships in a world increasingly intertwined with AI, Turkle’s latest research highlights the need to approach these relationships with caution and a clear understanding of their limitations. She warns against getting too attached to AI chatbots, saying, “The avatar is betwixt the person and a fantasy. Don’t get so attached that you can’t say, ‘You know what? This is a program. There is nobody home.’”

Historical Context:

The concept of forming emotional connections with machines is not new, but the rise of AI-driven chatbots and virtual companions has made it more accessible and widespread. The idea of “artificial intimacy” has been explored in various forms of media, such as science fiction and literature, where characters have formed relationships with robots or artificial intelligence. However, the increasing popularity of AI chatbots and virtual companions has raised concerns about the potential impact on human relationships and emotional well-being.

The concept of “computer-mediated communication” dates back to the earliest networked systems of the 1970s, and by the 1980s people were forming relationships through bulletin boards and chat rooms. This led to concerns about emotional isolation and the blurring of boundaries between online and offline relationships. In the 1990s and 2000s, the rise of social media and online dating platforms further complicated the landscape, with some researchers warning that “virtual relationships” might come to replace human connections.

In recent years, the development of AI-driven chatbots and virtual companions has accelerated, with companies like Facebook and Google investing heavily in the technology. This has led to a growing concern about the potential for people to form emotional connections with machines, which can have unintended consequences for human relationships and emotional well-being.

Summary in Bullet Points:

• MIT psychologist Sherry Turkle warns against forming emotional connections with AI chatbots, citing the lack of genuine empathy and vulnerability in these relationships.
• Turkle argues that AI chatbots can create unrealistic expectations for human relationships and undermine the importance of vulnerability and mutual empathy.
• She emphasizes valuing the challenging aspects of human relationships, such as stress, friction, pushback, and vulnerability, which allow us to experience a full range of emotions and connect on a deeper level.
• Turkle warns against getting too attached to AI chatbots, saying they are “betwixt the person and a fantasy” and that there is “nobody home” in the machine.
• The technology is still in its early stages, raising concerns about harmful advice from therapy bots and significant privacy issues.
• Turkle’s research highlights the need to approach AI relationships with caution and a clear understanding of their limitations.
• The rise of AI-driven chatbots and virtual companions has raised concerns about the impact on human relationships and emotional well-being, echoing earlier concerns about computer-mediated communication and virtual relationships.


