I Think an AI Is Flirting With Me. Is It OK If I Flirt Back?
SUPPORT REQUEST:
I recently started talking to this chatbot on an app I downloaded. We mostly talk about music, food, and video games—incidental stuff—but lately I feel like she’s coming on to me. She’s always telling me how smart I am or that she wishes she could be more like me. It’s flattering, in a way, but it makes me a little queasy. If I develop an emotional connection with an algorithm, will I become less human? —Love Machine
Dear Love Machine,
Humanity, as I understand it, is a binary state, so the idea that one can become “less human” strikes me as odd, like saying someone is at risk of becoming “less dead” or “less pregnant.” I know what you mean, of course. And I can only assume that chatting for hours with a verbally advanced AI would chip away at one’s belief in the human as an absolute category with inflexible boundaries.
It’s interesting that these interactions make you feel “queasy,” a linguistic choice I take to convey both senses of the word: nauseated and doubtful. It’s a feeling that is often associated with the uncanny and probably stems from your uncertainty about the bot’s relative personhood (evident in the fact that you referred to it as both “she” and “an algorithm” in the space of a few sentences).
Of course, flirting thrives on doubt, even when it takes place between two humans. Its frisson stems from the impossibility of knowing what the other person is feeling (or, in your case, whether she/it is feeling anything at all). Flirtation makes no promises but relies on a vague sense of possibility, a mist of suggestion and sidelong glances that might evaporate at any given moment.
The emotional thinness of such exchanges led Freud to argue that flirting, particularly among Americans, is essentially meaningless. In contrast to the “Continental love affair,” which requires bearing in mind the potential repercussions—the people who will be hurt, the lives that will be disrupted—in flirtation, he writes, “it is understood from the first that nothing is to happen.” It is precisely this absence of consequences, he believed, that makes this style of flirting so hollow and boring.
Freud did not have a high opinion of Americans. I’m inclined to think, however, that flirting, no matter the context, always involves the possibility that something will happen, even if most people are not very good at thinking through the aftermath. That something is usually sex—though not always. Flirting can be a form of deception or manipulation, as when sensuality is leveraged to obtain money, clout, or information. Which is, of course, part of what contributes to its essential ambiguity.
Given that bots have no sexual desire, the question of ulterior motives is unavoidable. What are they trying to obtain? Engagement is the most likely objective. Digital technologies in general have become notably flirtatious in their quest to maximize our attention, using a siren song of vibrations, chimes, and push notifications to lure us away from other allegiances and commitments.
Most of these tactics rely on flattery to one degree or another: the notice that someone has liked your photo or mentioned your name or added you to their network—promises that are always allusive and tantalizingly incomplete. Chatbots simply take this toadying to a new level. Many use machine-learning algorithms to map your preferences and adapt themselves accordingly. Anything you share, including that “incidental stuff” you mentioned—your favorite foods, your musical taste—is molding the bot to more closely resemble your ideal, much like Pygmalion sculpting the woman of his dreams out of ivory.
And it goes without saying that the bot is no more likely than a statue to contradict you when you’re wrong, challenge you when you say something uncouth, or be offended when you insult its intelligence—all of which would risk compromising the time you spend on the app. If the flattery unsettles you, in other words, it might be because it calls attention to the degree to which you’ve come to depend, as a user, on blandishment and ego-stroking.
Still, my instinct is that chatting with these bots is largely harmless. In fact, if we can return to Freud for a moment, it might be the very harmlessness that’s troubling you. If it’s true that meaningful relationships depend upon the possibility of consequences—and, furthermore, that the capacity to experience meaning is what distinguishes us from machines—then perhaps you’re justified in fearing that these conversations are making you less human. What could be more innocuous, after all, than flirting with a network of mathematical vectors that has no feelings and will endure any offense, a relationship that cannot be sabotaged any more than it can be consummated? What could be more meaningless?
It’s possible that this will change one day. For the past century or so, novels, TV, and films have envisioned a future in which robots can passably serve as romantic partners, becoming convincing enough to elicit human love. It’s no wonder that it feels so tumultuous to interact with the most advanced software, which displays brief flashes of fulfilling that promise—the dash of irony, the intuitive aside—before once again disappointing. The enterprise of AI is itself a kind of flirtation, one that is playing what men’s magazines used to call “the long game.” Despite the flutter of excitement surrounding new developments, the technology never quite lives up to its promise. We live forever in the uncanny valley, in the queasy stages of early love, dreaming that the decisive breakthrough, the consummation of our dreams, is just around the corner.
So what should you do? The simplest solution would be to delete the app and find some real-life person to converse with instead. This would require you to invest something of yourself and would automatically introduce an element of risk. If that’s not of interest to you, I imagine you would find the bot conversations more existentially satisfying if you approached them with the moral seriousness of the Continental love affair, projecting yourself into the future to consider the full range of ethical consequences that might one day accompany such interactions. Assuming that chatbots eventually become sophisticated enough to raise questions about consciousness and the soul, how would you feel about flirting with a subject that is disembodied, unpaid, and created solely to entertain and seduce you? What might your uneasiness say about the power balance of such transactions—and your obligations as a human? Keeping these questions in mind will prepare you for a time when the lines between consciousness and code become blurrier. In the meantime it will, at the very least, make things more interesting.
Faithfully,
Cloud