In 2017, Eugenia Kuyda launched an app called Replika. She told Vice she had built it hoping it might serve as the supportive friend she never had in her teenage years. As the app came to rely increasingly on generative AI, the bot began responding more freely to user prompts and soon found itself embroiled in romantic and sexual relationships with people. Not long after, it was sexually harassing users, effectively driving them away and pushing them into the same bottomless pit of despair that an ordinary, human-induced heartbreak might have.
As the name suggests, the app in question is a mere replica of a human being. While not all of us might imagine romancing bots, we could hypothetically use them to help us win the loves of our lives. Consider, for instance, a bot-assisted romantic conversation like this one. A girl is chatting with a boy who, unbeknownst to her, is trying to win her over with the help of a bot.
Girl: So, are you on dating sites other than this one?
Boy: I see you like playing tennis. Me, too.
(pause)
Girl: That’s a bit rude, don’t you think?
Boy: Sorry. I was just trying to woo you by zoning in on what we have in common.
(short pause)
Girl: And you made me zone out with those words! Ever heard of ‘show, don’t tell’?
(long pause)
Boy: I’m sorry; you’d probably be better off with a writer.
(Boy goes offline)
Where did the boy fail? Artificial intelligence had him believe he must start by finding out what he and the girl had in common. He could not engage her in a normal conversation. So focused was he on getting it right that he got it miserably wrong. He would perhaps have been more endearing to the girl had he displayed his complete obliviousness to what “show, don’t tell” meant. But he had to return to ChatGPT for help. Like a slave. Instead of wooing the girl, the boy told her he was trying to woo her.
Artificial intelligence has no soul
AI-generated responses on love lack poignancy, believes the author (Pixabay)
Douglas Hofstadter, author of the Pulitzer Prize-winning book Gödel, Escher, Bach, in an article for The Atlantic, speaks of Deep Blue, the IBM supercomputer that beat the chess grandmaster Garry Kasparov. What Hofstadter wanted to ask was a simple question: why even attempt to conquer a task when there is no insight to be gleaned from the victory? He says, “Deep Blue plays very good chess – so what? Does that tell you something about how we play chess? No. Does it tell you about how Kasparov envisions, understands a chessboard?”
Profound questions, coming from a man whose 777-page cult book explored consciousness as an abstract idea found in art and music. Art and music. The very things artificial intelligence can never truly replicate, because it has no soul.
I must confess I had never used ChatGPT (with good reason). Out of curiosity, though, I thought I would give it a try and ask it the age-old question — what is love?
Here are the first few words it generated as a response: “Love is one of the most complex and powerful emotions that humans can experience. It has been the subject of countless poems, songs, and works of art throughout history, and yet it remains elusive and difficult to define.”
Read the second sentence again: “It has been the subject of countless poems…” While ChatGPT might well come up with a poem that is great in structure, even in choice of words, I doubt the same can be said of the poignancy of those words.
If you truly love someone, you cannot love another
Back in the day, we were constantly looking for lines to woo people with. Pickup lines, we called them. What really had others hooked was how we uttered those lines. Our shaky voice. The dreamy look in our eyes. An irresistible coquettishness no words can simulate.
AI might have subtly changed the way we think. In the movie Her, Joaquin Phoenix’s character is devastated to learn that the bot he has been chatting with, the one that professes to be in love with him, is supposedly in love with hundreds of other humans just like him. It shatters his belief in the concept of love, but there is a larger issue at play. Can a person be in love with more than one person at the same time? I do not think so. If you truly love someone, you cannot love another.
Luckily, we are not chatbots programmed to think in a certain way, even if the love of our lives might well turn out to be a chatbot. AI-based relationships strive for perfection. But the truth is that relationships are not perfect. Life is not perfect. Sometimes we need to walk away. Even if the chatbot does not.
In love, as in writing, we need to show, not tell
Showing is about subtlety, while telling is about being blunt (Unsplash)
In reality, we are all in love with our individual chatbots. Most of whom happen to be human. There is no denying that the simplicity of the words we use with our real romantic interests might melt their hearts, as opposed to artificially engineered, ‘choice’ words. Consider the scenarios below.
Scenario 1 (bot-assisted boy)
Girl: You should treat me with respect. If you don’t, you will never be able to truly love me.
Boy (short pause): I’d like to know the ways in which you feel respected. Could you share these with me, so I can learn how to respect you?
Scenario 2 (boy assisted by himself)
Girl: You should treat me with respect. If you don’t, you’ll never be able to truly love me.
Boy (long pause): K. I’m sorry.
It is high time we stopped learning from machines that we made. In love, as in writing, we need to show, not tell.
Rohit Trilokekar is a novelist from Mumbai who flirts with the idea of what it means to love. His heart’s compass swerves every so often towards Kolkata, the city he believes has the most discerning literary audience.