“Hi Mrs. Banning, I’m from the office,” Ron the assistant shouts over the crowd of cheering parents. “Which one’s your son?” Ron raises the 1990s-era camcorder to his eye and zooms in on Jack Banning in the batter’s box just in time to catch the look of devastation on his face. Jack’s dad, Peter Banning, a workaholic lawyer, couldn’t make it to the game, so he sent an assistant.
The scene from Spielberg’s 1991 Hook (Robin Williams, Dustin Hoffman) sticks with me. Hook is probably not the best movie any of these exceptional talents have made, but I was a ’90s kid, and it holds a special place for me (not least because for years after that movie came out, my little brother would replicate Bob Hoskins’ fake pirate accent, saying, “Nevaaaaahhhhh”). You see, how we interact with the people we love matters. It’s important. The little league scene in Hook is so poignant–even if a little over-the-top–because it’s so obvious that a dad can’t cheer for his son’s baseball team by proxy.
The days of camcorders and videotape may have come and gone, but technology’s promise to enable relationships by proxy has only grown in intensity. Generative AI’s sometimes uncanny ability to engage as a human interlocutor means that, instead of sending human proxies with shoulder-mounted video cameras to engage with our loved ones, we may now be tempted to send the AI in our place.
This is not science fiction. It’s not a warning about some possible future. You can visit Nomi.ai right now and talk with (and pay for) “an AI companion with memory and a soul.” There’s a lot happening in this little marketing quip… “and a soul??” Right after this unfalsifiable nonsense, the site insists that you can “build a meaningful friendship, develop a passionate relationship, or learn from an insightful mentor.”
Or, you could visit girlfriend.myanima.ai. “The most advanced romantic chatbot you’ve ever talked to. Fun and flirty dating simulator with no strings attached.” What’s not to love? These companies promise all of the benefits of human interaction without any of the downsides. Like Geppetto did with Pinocchio, users can pretend their AI love interest is a “real girl” with no strings attached.
Or, what about senior-talk.com? This service will provide AI-enabled phone calls with your elderly loved ones. This bot is, so the marketing claims, “almost human–our AI companion for elderly [people] responds with empathy and genuine emotion.” This claim falls short of saying the thing has a soul, but even so, is it possible for an AI bot to respond with genuine emotion? But of course, that’s not really what’s at stake here. What the marketers really mean is that the bot will engage with your loved ones, mimicking genuine emotion so convincingly that your loved one will experience genuine emotion in return. At least when assistant Ron showed up at the little league game, he didn’t pretend to be Peter.
Now, I know some readers are preparing to defend the chatbot for the elderly, even if not the AI girlfriend. (And in case you’re wondering why I keep saying “AI girlfriend” rather than “AI boyfriend,” it’s because, according to one report, the Google search term “AI girlfriend” is 9x more common than its boyfriend counterpart. I’ll leave the interpretation of that fact to you.)
But there is, of course, nuance to be discovered here in this new era of emotion-mimicking chatbots. We must ask why someone might turn to this technology. Is it to abdicate moral responsibilities, or is it to fill a gap that needs filling even when we do take our moral responsibilities seriously? After all, technology can help bring us closer together if we employ it well.
An analogy might help. At a time when video calls were mainstream enough that most of us used them all the time, but novel enough that my grandmother had never made one, we found ourselves in a hospital room in Illinois. There was a feeling in the air, though no one put it into words, that my grandmother was probably nearing the end of her life. My oldest child was just a baby then. I got to look on as my grandmother, lying in her hospital bed, held my daughter–her great-granddaughter–in her arms. I could feel even then that it was an important moment.
Someone, I don’t remember who, mentioned my Great Aunt Lois in Oregon. The two aging women hadn’t seen each other in years. “Why don’t we FaceTime with her now?” someone asked. I honestly can’t remember if it was my idea, but I helped to pull it off. We had to get hold of a cousin who lived close to Aunt Lois so that she could use his device. In Illinois, my grandmother used my iPad. The two sisters talked face-to-face for the first time in years. As I looked on from the side, they looked not only like sisters but like twins. That was the last time they spoke face to face.
Even at the time, I recognized that this family moment, weighed down by illness and the years of two lifetimes and yet buoyed by the levity that my own baby brought into the room, was made possible by the technology involved. My grandmother beamed at her great-grandbaby in her arms. Then she peered through pixels and WiFi and miles and time to look into her sister’s eyes.
I know it’s a trope, but it bears repeating: Whether technology is good or bad, moral or immoral, often depends upon what we decide to do with it.
Why might customers use senior-talk.com? Is it to offload their responsibilities to the machine? After all, why should I call Grandma once a week if AI can do it for me? This would be an abdication of moral responsibility. It would amount to using generative AI for the efficiency play in our human relationships. If this is our approach, then we might as well send an assistant to watch our son’s baseball game by proxy.
Or, is the service a brilliant solution for situations in which the aging person has no one else? We can at least imagine cases in which there really is no one else to talk to. Is it possible that daily chats with an AI bot that mimics human empathy and emotion would be better for mental and physical health than the alternative? I do think that’s possible.
Maybe you don’t have a loved one in long-term care. Or maybe, for you, offloading family responsibilities to AI bots and taking on AI romantic partners are both beyond the pale. Even so, this AI-as-proxy problem threatens to seep into even our most pedestrian tasks.
A friend of mine noticed that a colleague was using an AI tool to draft her email responses to him. He was taken aback. After all they had been through together building their business, by offloading the task of drafting communications to a machine she had, intentionally or not, signaled something to him about how much value she placed on their interactions. He was genuinely hurt.
Just two weeks after my friend told me that story about the emails, I sat in the audience at the AI for Defense Summit in DC. An AI expert from the Defense Department spoke about the efficiency we all stand to gain by using AI to address “low hanging fruit” problems. This expert’s go-to example was, of course, email. “We all spend 30% of each day responding to email,” she said, “so, why not use AI to draft our initial responses?”
The question here isn’t “Should we use AI to generate our emails or not?” Questions in the age of generative AI are rarely so simple. The hard question is which emails carry relational weight, and therefore moral significance. If I were to walk like a tourist taking in the sights through any ordinary week in my own life, I would see countless transactional tasks that I’d be happy to turn over to the machines in the great efficiency play of the 21st century: emails to organizational inboxes in sprawling bureaucracies; phone calls to customer service agents to get this account unlocked or that rental car booked. But scattered in and among those transactional tasks, I would also find relational tasks sprouting up like flowers through the cracks in the concrete. In just the last week, I went fishing with my kids; I took my son to the coffee shop; I went for a run with my daughter. I emailed an old friend; I texted with a former student; I texted a friend to congratulate him on a major professional milestone. Finding these relational tasks among the transactional ones–that’s the hard work.
In the age of generative AI, it is our responsibility to look hard at this landscape, to find the tasks that foster and nurture human relationships and then to defend fiercely our prerogative to perform those tasks ourselves–no matter how efficient the machines might be.
The views expressed are those of the author and do not necessarily reflect those of the US Air Force, the Department of Defense, or any part of the US Government.