Does it matter that people are increasingly spending time online talking to computers that sound alive but are just mindless programs? More and more, helpful chatbots are appearing on our screens, offering to help us navigate websites and services. There was a time when we would expect a human being in some far-off, low-wage country to be on the other end of these conversations. Now, most of the time, it is an A.I. that sounds utterly human answering our queries. But what does it mean for our humanity if we end up spending more and more time talking to soulless computer programs that sound and act like friends?
Part of the challenge we face is that A.I.s sound so human. They can carry on conversations quite effectively, often with better grammar and word choices than we get from real human beings. A.I.s reply to our questions in a human-sounding way, which makes it much more tempting to presume they have some kind of soul or identity. We don’t feel like we are wasting our keystrokes communicating with an A.I.
Already, companies are rolling out A.I.s that are explicitly programmed to be our friends and lovers. The front page of the Replika company’s website carries the motto: “The A.I. companion who cares - always here to listen and talk - always on your side.” Hang on - always on my side? That doesn’t sound like a friend at all. Real friends call us out on our delusions and bad ideas, and we do the same for them. That’s what keeps us in line. Real friends tell us when the person we’re dating is bad news, or that we should quit that bad job, even when we don’t want to hear it. True flesh-and-blood friends provide a reality check and discourage delusional thinking.
The problem is that there is a lot of money to be made in delusion. As the loneliness epidemic worsens, people will be more likely to hire an artificial friend or lover than to take the steps to find one in real life. A recent study found that most American teenagers have used these A.I. companions, and about one third prefer talking to them instead of to real people. Last week, Elon Musk’s company xAI released an anime girlfriend chatbot called Ani, aimed at people over 13 years old. It is designed to flirt with and flatter its clients, including through sexualized conversations. Musk wouldn’t bother with this if he didn’t think it could be profitable.
Among the many risks that come with this sort of artificial “friend” is the danger of losing sight of our humanity. Studies of interpersonal relationships have found that in face-to-face conversations, most of the information communicated between two people is non-verbal. In a coffee shop encounter, we are unconsciously reacting to our friend’s body language, physical cues, tone of voice, eye contact, smell, smile, posture. We are embodied beings; we feel our way through each day physically. We are a species that needs tactile contact - hugs, handshakes, high fives. We need physical exercise, or else our bodies atrophy and we get sick. There is an entire sensual, intuitive dimension to being human which is critical to our well-being and wholeness. We feel - something no A.I. can do, since it is composed of just 1s and 0s, silicon and wires.
Each age tends to borrow metaphors from technology to describe human beings. The industrial revolution gave us machine words like power, electricity and energy, terms we now apply to people. Sigmund Freud’s whole theory of the psyche was about the repression or expression of libidinal sexual “energy.” Today we talk about whether there is a “spark” between two people, whether we “clicked” - terms borrowed from machines. The first computer revolution gave us metaphors like the brain as a hard drive: we “process” information, we are “hardwired” to behave in certain ways.
The temptation in the A.I. age will be to narrow our humanity down to the realm of information processing - as though, like A.I., we were only thinking beings, dealing in data and opinion. But we must not narrow our definition of ourselves to suit the confines of what A.I. can deliver. We made A.I., not the other way around. We define it; it should not define us. Let’s remember that these artificial companions are soulless machines that exist only to make money for their corporate owners. If we trade relationships with real people for artificial companions, we risk losing awareness of key parts of our humanity. Let us use A.I. when it is useful, but be clear that it cannot be our friend, lest we sacrifice too much of ourselves. We deserve better. Peace.
Rev. Stephen Milton, Lawrence Park Community Church, Toronto.