
Three years into the AI boom, senior leaders are increasingly outsourcing general communications to machines. Klarna’s CEO, Sebastian Siemiatkowski, recently deployed an AI-generated replica of himself to deliver a financial update, while Sam Liang, CEO of Otter.ai, introduced a “Sam-bot” to attend meetings in his place.
But the broader shift towards using large language models for duties once assigned to humans comes at a cost. More than two in five UK workers (43%) say they feel deceived when senior leaders rely on AI-generated communications, according to a survey of 1,000 employees conducted by Raconteur in partnership with Attest. Only 25% view these messages positively, suggesting that most still crave genuine, human interactions from those in charge.
This growing reliance on AI is not limited to executive-level communications. It is also reshaping daily managerial responsibilities across organisations. Two-thirds of HR leaders predict that more than half of their team’s routine administrative tasks will be handled by AI by the end of 2026, according to Workday’s Global Workforce Report.
New technologies have long expanded human capabilities, but often at the expense of other skills. Writing reduced our reliance on memory, and calculators diminished the need for basic arithmetic. AI may now be reshaping how we learn, make decisions, communicate and solve problems. This isn't inherently negative, but it raises important questions: which skills will AI enhance and which will it suppress?
AI is eroding leadership skills
Poorly executed AI-generated communications can come across as impersonal, misleading or tone-deaf. Such communication may undermine the building blocks of trust and alter employees’ perception of their leadership team.
A third (33%) of survey respondents say that the use of AI-generated communication erodes leadership credibility – more than double the 13% who believe it enhances it. Meanwhile, 43% say the impact depends on leaders being transparent about how and when AI is used.
“There’s a real danger that AI can mask leadership deficiencies, letting managers avoid developing crucial people skills and eroding their authenticity,” says Kate Field, global head of human and social sustainability at the British Standards Institution (BSI). “Too often, individuals are promoted into management positions based on technical ability rather than people skills. AI can hide this problem,” she says. “Managers might think: ‘Great, I can disengage from being a people manager, knowing my team or being empathetic because I can just hide behind a chatbot.’”
Field warns this behaviour could accelerate a “wave of employee dehumanisation”, where staff feel unseen or unheard, potentially leading to higher staff turnover.
At the same time, AI can be a helpful tool for managers who struggle with emotional intelligence, by offering prompts, conversation starters or guidance to support employees. The challenge lies in striking the right balance: using AI to enhance leadership without replacing the human connection that employees value most.
Leaders must avoid ‘AI speak’
AI is increasingly used in performance management. JP Morgan, Citi and IBM have all introduced chatbots to assist with appraisals. While this practice may improve efficiency, it also risks undermining trust. Only 28% of UK employees would fully trust a manager if AI contributed to feedback, while 30% say it would actively damage trust.
Nearly half (49%) believe AI-assisted recognition or praise lacks authenticity, with scepticism highest among the oldest (55%) and youngest (54%) generations.
To be genuinely useful, AI must produce output that reflects a user’s own voice, tone and writing style. That demand for authenticity is rising, Field says, in part because today’s youngest generation is highly sceptical and quick to spot when a photo or message has been generated by a machine.
AI should be used carefully in employee feedback, argues Chris Oglethorpe, chief people officer at Freeths, a law firm. “While AI can help rephrase communications, the language can easily become impersonal and very ‘beige’ when managers need to connect with their teams,” he says. “It’s important to avoid letting ‘management speak’ turn into ‘AI speak’.”
According to Gallup’s 2025 State of the Global Workplace report, just 10% of British workers feel motivated at work. Performance management remains one of the few opportunities for genuine human connection in the workplace. Outsourcing these interactions to machines risks widening the gap between leaders and teams, at a time when engagement is already abysmal.
“If AI generates feedback that doesn’t resonate with an individual’s experiences or feels inauthentic, employees will notice and rally against it,” Field warns.
She continues: “Young employees are particularly vocal. They’ve grown up with technology and are clear about what they want and don’t want. One of the biggest mistakes companies make is assuming they already know what employees want. To maintain trust, organisations must respect people’s different expectations and comfort levels, involving employees directly in AI deployment conversations.”
Employees demand transparency
The stakes are particularly high when AI is used for critical people decisions. A majority of employees (60%) are uncomfortable with AI determining promotions or layoffs, for instance.
“AI can be a valuable support tool for saving time, summarising information, highlighting trends and drafting early communications,” says Oglethorpe. “But it cannot replace leaders taking accountability, weighing context and nuance and applying empathy tailored to each individual.” Misuse of AI in sensitive areas, such as hiring, also risks reinforcing biases.
AI is most effective when used to support analysis, spot trends and collect evidence. For instance, AI systems might be used to identify patterns in progression rates or diversity metrics across teams, departments and locations. “Time-poor managers can leverage these insights to make better-informed decisions,” says Oglethorpe. “The danger arises when AI starts making decisions rather than informing them. People management is inherently a contact sport and AI should remain a background tool to aid preparation, not replace judgment.”
There is also a strong demand for boundaries: 91% of workers want either strict limits (43%) or clear guidelines (48%) on when and how leaders can use AI in employee interactions. Transparency is also non-negotiable, with 62% insisting on full disclosure of AI use at all times and 23% wanting disclosure whenever it affects them. Just 4% are completely comfortable with no disclosure at all.
“Many employees feel excluded from decisions about how AI systems are introduced, and this lack of transparency can quickly breed mistrust,” says Ben Wright, global head of partnerships at The Instant Group, a workspace company.
AI adoption is moving faster than policy regulating the tech, he says. “There’s a growing gap between the speed at which companies are adopting AI and the frameworks that guide its use. Employees need to understand not only what AI can do but also where the boundaries lie. Without that clarity, mistrust can undermine progress.”
One solution is a Q&A forum, which can build trust and ease anxieties by encouraging employees to share concerns and feedback about AI in the workplace. Such forums help identify areas where AI may be causing stress, enabling managers to address issues and implement thoughtful solutions.
Leadership traits AI cannot replace
Empathy and emotional intelligence sit atop the list of leadership traits that AI cannot perfectly mimic, cited by 55% of respondents. Again, perceptions vary across generations: millennials are the most convinced that empathy cannot be replicated (57%), while generation Z are comparatively more open to AI’s emotional capabilities.
Purpose and social awareness (35%) and vision and ambition (30%) were also flagged as critical leadership traits that machines would struggle to reproduce.
But as AI refines its humour, empathy and negotiation skills, it is only natural to ask whether machines could eventually replace humans in roles requiring social and emotional abilities.
So far, evidence suggests otherwise. A 2025 study involving more than 6,000 adults, conducted by researchers from the Hebrew University of Jerusalem, Harvard University and the University of Texas, asked participants to write about an emotional experience. They then read an AI-generated response, being told either a human or an AI wrote it. Participants consistently rated the same responses as more empathetic when they believed a human had written them.
“People value people,” Field stresses. “The organisations most likely to succeed will be those that value emotional intelligence and invest in upskilling line managers to become strong people leaders.” Worryingly, she notes, conversations around automation far outpace efforts to train line managers in essential people skills.
Still, Field predicts growing pushback against AI and increased demand for human-to-human contact in the near future. “Organisations are going to realise that AI can’t do all the things they thought it could,” she says.
AI is rapidly improving at the tasks machines have always excelled at – speed, pattern-matching and optimisation. That is exactly why leaders must focus on what humans are uniquely good at. As senior leaders rush to integrate AI into managerial processes, the challenge is to leverage its capabilities without eroding the human qualities that define effective leadership.