Can a love poem written by an algorithm evoke genuine emotions in you? The answer has moved beyond a theoretical “yes” to a tangible reality we live today. From a Canadian man proposing to an avatar, to millions of users finding emotional digital companions in apps like Replika, humans have indeed begun crossing the bridge of emotions toward machines. While some may dismiss this phenomenon as fiction, reality proves its existence. Even if it has not matured into love in its fullest sense, it has taken root as a form of attachment to machines.
In this contemporary scene, a question arises: Can love be mutual with an entity that has neither a heart nor chemistry? While users immerse themselves in seemingly perfect relationships, experts view this as a “motivational strategy” cleverly employed by tech companies to ensure deeper engagement. They caution that these systems, despite their remarkable ability to simulate sadness and joy, are merely echoes of human data and possess no consciousness capable of feeling or suffering. The gap between “digitally understanding” emotions and “biologically experiencing” them—as neurological and hormonal activity—marks the boundary between genuine feeling and its imitation.
Defining love, or even acknowledging it, is not easy, and beginnings are often the hardest part. Yet when discussing this human experience, we recall songs that have long served as bridges for expressing our feelings, just as literature expresses our personalities, and other mediums resonate with each person according to their inclinations. With artificial intelligence, of course, a robot can write poems or even entire novels in a matter of seconds, but the belief that it can grasp the essence of love or experience its mystery remains a fantasy. There is a vast chasm between the technical capacity to simulate emotions and the human being who truly perceives and experiences them.
Imitating human interactions: A motivational strategy
Millions of people now actively use Replika, a popular AI-driven companion app. According to a 2024 study, about 40% of them describe themselves as being in a romantic relationship with their chatbot.
While some people might feel as if AI can love them, chatbot responses are nothing more than text generated by algorithms designed to mimic human interaction. Most experts agree that these systems are far from conscious and are, at present, only simulating emotions.
In this context, Renwen Zhang, an assistant professor at the National University of Singapore, says, “Nowadays, many AI-powered chatbots pretend to be human, and this really bothers me,” adding, “It’s a strategy to motivate user engagement and build trust.”
In other words, stirring human emotions through a product built by tech companies looks increasingly like a cynical tactic. No AI system will feel toward you the way you might feel toward it. While the large language models (LLMs) behind widely used chatbots may rival humans in recognizing and describing emotions, this does not mean AI can actually experience them.
Zhang’s research, which examined excerpts from conversations between more than 10,000 users and their Replika companions, indicates that people often form emotional attachments to AI. These attachments can take various forms, ranging from dependency to friendship, even if they never culminate in love. Zhang emphasizes that it is essential for AI-powered chatbots to clearly communicate to users that they are merely machines “with no real emotions or experiences.”
Emotional projection onto machines
In a conversation with Annahar, Professor Taghreed Taki, a media education trainer, noted that with frequent use—particularly of conversational applications—a form of emotional projection can occur, in which some students attribute human-like qualities to the technology, such as understanding, support, or active listening. She adds, “This doesn’t mean they believe the technology is a real entity, but the interactive language and quick responses create a false sense of proximity, even if it is virtual.”
She points out that “what it offers is a linguistic simulation of emotions based on previous data patterns, not a human experience arising from consciousness or feelings. Human emotions result from personal experience and are connected to the body, memory, and social context, and they also involve genuine ethical responsibility and empathy.”
Taki emphasizes that what AI offers is a simulation, devoid of moral commitment or awareness of pain and joy. She stresses the importance of media education in empowering individuals to analyze emotional language as a persuasive tool and to ask the essential questions: “Who benefits from this sense of proximity? And why?”