AI Can Summarise Your Research, But It Cannot Analyse It

Mar 23 / Jean Felix

Every CX professional knows the feeling. You have spent weeks running customer interviews, observing service interactions, and collecting rich qualitative data. The research itself was energising. But now comes the part that nobody romanticises: sitting with the data, reading it line by line, coding it, interpreting it, and writing it up.

Analysis is time-consuming, intellectually demanding, and unglamorous. With the rise of generative AI tools, the temptation to skip this step has never been greater. Drop your transcripts into a chatbot, ask it to "find the key insights," and you have a tidy summary in under a minute.

Some organisations have already started doing this, with stakeholders bypassing their own research teams to get faster answers from AI.

AI tools excel at identifying what was said most frequently. If several participants opened their interviews by saying "I like the new portal, it seems easy to use," a chatbot will dutifully highlight this as a key finding. But an experienced researcher knows to look deeper. Were those participants being polite? Did they go on to describe a series of frustrations that contradicted their initial impression?

AI tends to overemphasise surface-level comments and repeated phrases. It reports what appeared most often, not what mattered most. In customer experience research, the most important insight is rarely the most obvious one.

AI Cannot See Behaviour

This is perhaps the most fundamental limitation. AI chatbots work from text. They cannot observe what a customer actually did during a usability test, a service interaction, or a contextual inquiry. They cannot tell you where a customer hesitated, how many attempts it took to complete a task, or what strategy the customer adopted when the expected path failed.

Experienced researchers know that what people say and what people do are often two different things. That is precisely why good qualitative analysis involves re-watching recordings, not just reading transcripts. AI has access to only half the picture, and it does not know the other half exists.


| What AI can do | What AI cannot do |
| --- | --- |
| Identify frequently mentioned themes across transcripts | Detect hesitation, confusion, or workaround behaviours |
| Summarise what participants said about a topic | Recognise when what participants said contradicts what they did |
| Generate a list of topics covered in an interview | Identify what participants did not say or do, but should have |
| Produce a tidy, formatted summary quickly | Interpret the meaning behind ambiguous or contradictory statements |
| Highlight direct quotes from transcripts | Weigh the significance of a finding against the broader research context |

What Is Missing Matters Most

Some of the most valuable insights in qualitative research come from what is absent in the data. Perhaps no participant mentioned a feature that would have been genuinely helpful for their task, because it was not discoverable. Perhaps nobody complained about a particular step in the journey, not because it was working well, but because they had already given up before reaching it. An experienced researcher asks: "What did participants not say or not do?" AI does not ask this question. It works with what is in front of it. It cannot reason about absence, and absence is often where the most actionable CX opportunities hide.

Your Credibility Is on the Line

When you hand your analysis to AI, you skip the step of becoming deeply familiar with your own data. This matters enormously when stakeholders push back, ask follow-up questions, or challenge a recommendation. If you have done the analysis yourself, you can answer with specificity and confidence. You can point to the exact moment in an interview where a customer's tone shifted. You can explain why a seemingly minor observation carries strategic weight.

If AI did the thinking for you, you cannot do any of this. You become a messenger for a machine's summary, unable to defend or elaborate on the findings that are supposed to drive your organisation's CX decisions. In a discipline that depends on trust and credibility, this is a risk that no serious practitioner should take.

Use AI as an Assistant, Not an Analyst

None of this means AI has no role in qualitative research. It can be a useful assistant for tasks like transcription, initial sorting, or generating a first-pass summary that you then interrogate and refine. The danger lies in treating AI as a replacement for the analytical process itself.

Analysis is not summarisation. It is interpretation. It requires context, behavioural insight, critical thinking, and the ability to reason about what is not in the data. When CX teams outsource this to a chatbot, they risk producing shallow, incomplete, or misleading findings. Worse, they risk losing the ability to explain or defend the decisions they make on the basis of those findings.

The next time you are tempted to paste your transcripts into a chatbot and call it a day, remember: the value you bring as a CX professional is not in collecting data. It is in knowing what the data means.
