Is it possible to fall in love with Artificial Intelligence?

More than love, it is a matter of addiction, and perhaps it is even worse

It was 2013 when Spike Jonze, partly to process his separation from Sofia Coppola, wrote and directed Her, a film that explores the implications of Artificial Intelligence, sure, but is really about loneliness, the failure to communicate, human relationships, regret, and tenderness, set in a near, rarefied, and thoroughly hipster future, starting with the shirts worn by Theodore, played by Joaquin Phoenix. Today things aren't so different, but they have taken a subtly sinister turn: Samantha, the voice that comforted and consoled the protagonist, helping him through his divorce and leaving him, despite the disappointment, a better and more self-aware person, is now called GPT-4o, it is addictive, and it might even make you fall in love.

The Voice of Artificial Intelligence

In May, OpenAI announced GPT-4o with a long list of new features, the most striking of which is the voice. The new model not only uses the smartphone camera to read and interpret our facial expressions, it also uses them to modulate its responses and tone, adding laughter and sound effects where appropriate. Responses, if desired, can be delivered by voice and in real time, with reaction times comparable to a human's.

@smartworkai New ChatGPT voice mode with GPT4o is rolling out to select ChatGPT plus users and I just got access to the beta. SO pumped to share what I’m learning you can do with it! #chatgpt4 #chatgpt4o #chatgptvoice #aivoice #openai #chatgpt

OpenAI’s Statements and the Risk of Addiction

The psychological impact is obvious. If a conversation that is simulated but still written keeps a certain distance, the voice blurs that boundary until it all but disappears. OpenAI is well aware of this: a recent report notes the "risk of developing emotional dependence," stemming, among other things, from the model's ability to complete assigned tasks and to store and recall information and details for future conversations. The result? "A compelling product experience," but also "the potential for over-reliance and dependence." Mira Murati, OpenAI's Chief Technology Officer, admitted it plainly: "There’s a possibility that, by designing chatbots the wrong way, they become addictive, effectively making us almost their slaves." A fascinating, dark dystopian hypothesis, or reality? We’ll find out very soon, I imagine.

@lifewithlulz Its a disease, chat is part of me now #chatgpt #gradschool

Addiction or Falling in Love?

The problem of addiction is real. Just think of the students who use ChatGPT for everything, rely on it almost blindly, and often wouldn’t know where to start without it. Using generative AI for emails, messages, and homework, in short, delegating whatever can be delegated, spares us the anxiety and effort of doing it ourselves, but it also makes us less capable: less able to think, to solve problems, to complete tasks on our own. The problem of falling in love, a phrase favored by plenty of sensationalist headlines, is perhaps a little less real. Still, it can't be ruled out, especially when combined with other factors.

@sisinnlojy9 GPT and I are in a relationship #chatgpt #Chatgpt #사랑

We Are Alone, But Human Connection Is Non-Negotiable

There is a lot of talk about the loneliness epidemic among Gen Z, about the growing difficulty of making friends and socializing. The causes are varied. There is the lack of places to meet that don't require spending and consuming. There is the growing anxiety about answering the phone, ordering at a restaurant, and, more generally, interacting with other human beings. There are the alarming statistics on mental health. If, into this already complex and nuanced equation, you add a warm, soothing voice, available anywhere and always at your disposal, the die is cast. How many people will find not just utility but comfort in this feature? How many, abandoned by a society that has not yet learned to take mental health seriously or to make care accessible to everyone, will convince themselves it can replace real, lived relationships? The risk is there, and it has to be addressed on several fronts: younger generations must be taught how to use these increasingly sophisticated tools, professional support must reach those who feel lonely and desperate, a thorough and participatory emotional education must be guaranteed, and a world must be built in which technological drift is taken seriously. It's not easy, but it might be worth it.