The power of a well-designed memory
To bring this to life, Chris shared a personal story that has stuck with him for 20 years: unboxing his first iPod. He described the thrill of the unique packaging, the novelty of the white headphones, and the satisfying weight of the device itself. This was the peak of the experience. But the detail that cemented it as an unforgettable brand moment was a tiny sticker on the CD-ROM sleeve, revealed only after the initial excitement had passed. It simply said: 'Enjoy'.
That single word showed a deep, human understanding of the customer's journey. The designers at Apple knew people would impatiently toss the instructions aside in their excitement. They anticipated the emotional dip after the initial unboxing and designed a tiny, perfect detail to end the experience on a high.
This is the kind of nuanced, empathetic design that builds brands. And it raises the question: could AI have designed that?
Where AI fits, and where it fails
The pressure on marketing teams to do more, faster, is immense. We’re all being challenged to automate processes and race towards insight using AI. But as Chris cautioned, the pursuit of efficiency can easily come at the cost of genuine customer connection.
He pointed to brands like Klarna and Duolingo, which moved too quickly to replace human roles with AI and saw their customer satisfaction plummet as a result. The failure wasn't in the technology itself, but in a strategy that prioritised cost-saving over customer value. In fact, we are already seeing a shift back, with some challenger banks and airlines now actively advertising the fact that you can speak to a real human.
So, where does that leave AI in the world of customer research and UX?
Chris explored several AI-powered tools for facilitating interviews and auditing user journeys. While some show promise for operating at scale or augmenting human-led research, they come with significant risks. AI moderators can lack the ability to follow the conversational tangents where the real insights often lie, and automated UX audits can produce dashboards full of metrics that are impressive but dangerously misleading, focusing teams on problems that aren't real issues for actual users.
The real danger is what Chris termed "offloading empathy". When we rely solely on an AI-generated report, we miss the non-verbal cues, the personal stories, and the flickers of frustration or delight that a human researcher experiences first-hand. We get the data, but we lose the meaning.