AI Requires a Human Touch
AI can replace (some of) our work
I want to share something I read at work this week. There had been an email exchange about the degree to which AI will enable us to automate our work. I work at a professional military education institution, so our work includes both graduate-level academic instruction and military professional education. Here’s what one colleague wrote in response to another.
You are missing the human element - AI may be able to do a lot of things I do, but I would argue it can’t counsel a student who lost a child to cancer or who is going through a divorce. Being a good teacher is about making a personal connection. It is about inspiring and encouraging people. AI does not do that well. Being a military officer is more than writing staff products. War and leadership are fundamentally a human endeavor, it's all about how to compromise, inspire, and motivate. If we get to the point where that doesn’t matter, AI can have my job. …
In my younger years, I ordered Marines to risk their lives at my direction. They did what I asked of them because we had a human connection. They knew I cared about them. AI can't inspire because it doesn't care. What AI doesn't do well is capture envy, jealousy, ambition, love, hate… culture. Or the feeling of being part of a team working together to solve a challenging problem
Again, I would feel threatened if my sole purpose was writing/critiquing/generating written content. But my job is inspiring, encouraging, and guiding officers to develop themselves. Show me the model that can do that and I’ll happily give it my job.
This response (which I endorse wholeheartedly) highlights a problem that has been metastasizing beneath the surface of the AI literature for a long time. There are many examples I could choose from, but the one that comes most immediately to mind is from the discussion of AI benchmarking.
When OpenAI released GPT-4, they also released a 100-page document they called a “Technical Report” (though some have criticized it as merely a press release “masquerading” as real research). In any event, the authors showed how much better GPT-4 did on several standardized tests than its predecessor, GPT-3.5, had done.

If you didn’t see this in the coverage at the time, it made some waves in AI circles. The blue bars represent GPT-3.5’s performance on the standardized tests listed at the bottom of the chart. The green bars represent GPT-4’s performance.
So, just to pick out a few examples, where GPT-3.5 scored in about the 10th percentile on the Bar Exam, GPT-4 scored in about the 90th percentile. Where GPT-3.5 scored in about the 30th percentile on the AP Physics 2 exam, GPT-4 scored above the 60th percentile.
These data are surprising, and credit is due to OpenAI for GPT-4’s performance.
But, in an entirely unsurprising turn of events, several commentators took this to mean that, because GPT-4 can pass the bar or AP Physics 2, it is somehow qualified to be a lawyer or to test out of a college-level modern physics class.
And this is the problem lurking beneath the standardized-tests-as-benchmarks approach to measuring AI performance. Until very recently, it was safe to assume that everyone (everything) taking the bar exam or the AP Physics 2 exam was, you know, a human.
As practitioners use these various standardized tests to evaluate students, they probably—whether wittingly or not—make some important assumptions. In order to practice law, you have to pass the bar and be a human. In order to be a doctor, you have to graduate medical school and be a human. In order to get into your top choice college or university, you need a competitive GPA and you need to be a human.
And what, you must be asking, is captured in this requirement to “be a human?” Well, that depends on the specifics of the case. In the quoted text above, my colleague gave a couple of examples from the military and academic environments. The military commander builds trust with subordinates so that he or she can ask them to risk their lives. The academic instructor or professor builds rapport so that he or she can serve, not just as a teacher, but also as a mentor.
Examples will be as varied as professional or vocational experience.
Here’s the problem: In our age of data-driven decisions and automation, the human element—whether in the military, in the classroom, or in literally any other job—is the hardest to measure and, as a result, the hardest to justify. But it can be the most important part of the job.
Just this week, Yahoo Finance covered the story of Klarna, the Swedish company that went all in on AI—just as the tech optimists say they should. They partnered with OpenAI to replace 700 human employees—many of whom had been in customer service roles—with AI. In 2024, Sebastian Siemiatkowski, Klarna’s CEO, told Bloomberg News that “AI can already do all of the jobs that we as humans do.”
Just a year later, Klarna wants humans back. Siemiatkowski now says, “I just think it’s so critical that you are clear to your customer that there will be always a human if you want.” Another spokesperson for the company tried to balance the requirement for AI and the requirement for humans: “AI gives us speed. Talent gives us empathy. Together, we can deliver service that’s fast when it should be, and empathetic and personal when it needs to be.”
Threading this needle—balancing the efficiency of AI with the human touch of, well, humans—isn’t going to be easy. But it’s the hard work that lies ahead for leaders in almost every sector and every industry.
This is one reason that generative AI is so disruptive. It forces us to come to grips with hard questions about what the human element is and why it matters.
If our work requires us to interact with other human beings, then there is a human element to our work. Or, in the immortal words of Michael Scott, “business is always personal. It’s the most personal thing in the world.”
Credit Where It’s Due
Views expressed are those of the author and do not necessarily reflect those of the US Air Force, the Department of Defense, or any part of the US Government.