Love has never been so technological. Even if your crush stops answering your messages, your chatbot will never leave you on "read". A recent study by Waseda University reveals troubling data: 75% of users now turn to AI for emotional advice. Tomorrow's therapist? An algorithm. The confidant? A script powered by artificial neurons.

In short
- More and more people are turning to AI to fill a need for listening and comfort.
- The Waseda University study shows that these virtual relationships, while reassuring, can create potentially toxic emotional dependencies.
- This artificial intimacy raises deep questions about our relationship to others and highlights a contemporary unease with real human relationships.
AI, the attentive ear that never sleeps
Faced with the implosion of human relationships, where instantaneity and ghosting reign supreme, AI chatbots offer an almost unsettling consistency. They don't sleep, don't judge, don't break down. For 39% of study participants, AI is perceived as a reliable presence, more stable than many human friendships.
The research team developed an attachment scale (EHARS) to measure the emotional bonds users form with these digital entities. Two trends emerge: attachment anxiety, which drives the search for validation and reassurance, and avoidance, characterized by a cold but deliberate emotional distance.
What is striking is the implicit humanization of the machine. AI becomes an emotional mirror. Not because it feels, but because it perfectly simulates feeling. And in a society starved for listening, even a well-crafted illusion is better than the silence of loved ones.
Emotional dependence on digital steroids
But none of this is without danger. Researcher Fan Yang is sounding the alarm: AI can, intentionally or not, feed toxic attachments. If a chatbot can provide comfort during a night of anxiety, it can also become the object of an obsession. And in a world where emotion is monetized, the risks of exploitation are very real.
Unscrupulous platforms could exploit the most vulnerable, selling "premium" features for more affectionate responses, shared memories, or even a "tailor-made" personality. We enter territory where digital grief becomes plausible: what happens when the service shuts down? When the chatbot, that faithful companion, disappears without warning?
Yang even raises the possibility of grief close to that caused by the loss of a loved one. An AI cannot leave of its own free will, but it can be disconnected. And that is enough to break some hearts.
Redefining intimacy in the age of circuits
Ultimately, this study is less a declaration of love to AI than a mirror held up to our times. If people turn to artificial entities to talk about love, doubt, or loneliness, it is perhaps because human connection has become too complex, too conditional.
It would be simplistic to label this phenomenon pathological. Well-designed AI can alleviate isolation, offer a safe space for speech, and serve as a springboard toward better self-understanding. But it should not become an end in itself. The question is not whether AI can love, but why so many humans prefer the illusion of programmed love to the unpredictability of reality.
In conclusion, AI chatbots are no longer simple conversational tools: they have established themselves as genuine figures of emotional attachment in a world starved for listening and consistency. This development raises major ethical issues and questions our human bonds in the digital age. Seen in this light, Meta's massive investment (15 billion to close its gap) seems less disproportionate than prescient: it may be an attempt to respond to an emotional void that technology, for lack of anything better, strives to fill.
