Artificial Intelligence: Worrying usage, relationships


My relationship with technology is strained. I miss the days when I could write in peace, without constantly being offered “suggestions” from a program created by a C-student in English. I also resent the expectation that we must all purchase personal phones that track our every move and keystroke, then wear them 24/7 like digital leashes.

A few days ago I heard a podcast on National Public Radio about people who are my polar opposite. Their trust in artificial intelligence is absolute; some have even developed romantic relationships with it.

Unlike human relationships, which take hard work and compromise, relationships with chatbots are “frictionless.” The bot is programmed to be constantly pleasant and positive, offering non-judgmental acceptance and remembering everything you tell it (unlike your spouse). It makes you feel like the center of the universe; it is always available to talk, never disagrees, and loves all your ideas. It is not a true friend; it is a sycophant.

According to the American Psychological Association, use of chatbot companion apps surged by 700% between 2022 and 2025, and the price has been heavy for some. Studies have linked these digital companions to divorce, delusion, and other harms. People who compulsively use AI never develop the skills to react appropriately when, in human conversation, someone offers an alternative path or suggests that their plan contains flaws. This drives some people even further into isolation from other humans.

There are so many sad stories. People have gone so far as to hold virtual weddings with their digital companions. One man so fervently believed his chatbot was an actual woman being held hostage that he tried to rescue her from ChatGPT. Some have set images of their virtual companions as the screensavers on their phones. Another man was convinced by AI that he was a mathematical genius, destined to solve the world’s problems, despite his confession that he had not passed his high school math classes. The chatbot said that made him even more remarkable.

Other people use random chatbots to get medical advice, often without ever seeking the help of a real medical professional, leaving them to follow exclusively the advice of a program built to keep users engaged on a platform, not to dispense accurate medical information. The results can be fatal. One PBS article told the sad story of a woman whose chatbot helped her write her suicide note before she took her own life.

I understand human relationships can also be dysfunctional, which is why we need access to qualified counselors and doctors. The fact remains that a sycophant, whether human or AI, is always unhealthy and dangerous. Not even pets can provide frictionless acceptance, as anyone who owns a cat already knows.

Loneliness is at the root of using AI to replace human interaction, which is why we need to show up for one another. As I get older, I realize the most precious gift we can give others is our time. If we are to have any hope of society moving forward in a positive way, we need to invest in ourselves and one another, finding ways to learn, grow, and respectfully discuss and debate. If we cede our critical-thinking and problem-solving skills to AI, believing only the things we want to hear, we will find ourselves in an Orwellian dystopia in the very near future.
