
The role of synthesis in creating order from research chaos

Published 25 February 2020
Lucy Denton, Lisa Nguyen

When you think of researchers in their ‘war rooms’, images of surfaces covered in clippings and Post-it notes come to mind.

Analyzing qualitative information from primary research can be a daunting prospect and most researchers want to do more than simply report facts. They seek insightful truths that compel their teams to empathize before exploring how issues might be addressed.

That’s a noble aspiration, but it begins to wither when faced with a seemingly impossible amount of carefully collected detail. How do experienced research professionals confront such a mountain? What kind of alchemy helps them distill meaning from this madness?

Feats of focus

The most overwhelming aspect of research can be the sheer amount of reading required to understand the material. The average one-hour interview transcript might contain 10,000 words, and you’re often looking at half a dozen of these before you even count the workshop output, diaries or journals, visual documentation, or observation notes.

Even with a clear focus, and after using data reduction techniques, it’s possible to have more than 70,000 words that need to be digested – the length of some novels. You're expected to read the text, understand and deconstruct it, all within a week or two.
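The arithmetic is easy to check, and if your transcripts are plain text it can even be automated. Here’s a minimal, hypothetical sketch in Python – the folder name and the reading speed are assumptions for illustration, not anything prescribed above.

```python
# A rough, hypothetical sketch of the reading-load arithmetic above.
# The folder name and the words-per-minute figure are assumptions, not from the article.
from pathlib import Path

READING_SPEED_WPM = 250  # an assumed average reading speed


def estimate_reading_load(transcript_dir: str) -> None:
    """Tally word counts across plain-text transcripts and print a rough reading time."""
    total_words = 0
    for path in sorted(Path(transcript_dir).glob("*.txt")):
        words = len(path.read_text(encoding="utf-8").split())
        print(f"{path.name}: {words:,} words")
        total_words += words
    hours = total_words / READING_SPEED_WPM / 60
    print(f"Total: {total_words:,} words – roughly {hours:.1f} hours of reading")


# Six one-hour interviews at ~10,000 words each is already ~60,000 words; workshop
# output, diaries, and observation notes easily push the total past 70,000.
estimate_reading_load("transcripts")
```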

Analysis can be described as a relatively predictable process, but before we can apply a generalized approach, we need to understand why we do analysis in the first place and the principles that guide it. In this article, we’ll also define what a good insight looks like, and explore some common challenges faced when getting your head around the collected data.

Why analyze at all?

Under pressure, it can be tempting to go directly from doing an interview to defining a solution that fits the perceived needs. In practice, that creates problems. Done properly, analysis connects specific findings to a specific approach, with clear constraints and even a roadmap for developing particular solutions.

There’s no shortcut to unpacking and understanding your audience. Relying on memory and impressions from interviews is likely to introduce bias. And even when we do keep notes, consuming raw data directly puts us in danger of unconsciously giving weight to certain points. From there we’re likely to form misleading opinions that lead to impulsive decision-making and, eventually, take the whole team down a path focused on entirely the wrong outcome.

So the most reliable way forward is to assemble the material and make balanced deductions from the evidence. From there we can use the facts to tell a compelling story. It’s this foundation that communicates the right findings accurately and determines the ultimate quality of the research.

The qualities of a good insight

Overall, the insights we surface should always inspire action in the people receiving the research results. ‘Insight amnesia’ is a phenomenon many researchers will recognize. After a ground-breaking presentation of the research results, team members and stakeholders return to their work bubbling with comments. Initially they’re excited, but then attention fades and momentum is lost.

The best way a researcher can impact the direction of a product is to provide insights that speak the truth in a compelling and actionable way. Michael Morgan, Senior User Experience Researcher at Bloomberg, has reflected on what this actually means. In his column on UX Matters, he shared a succinct description of what makes a research insight great. Not every insight from user research will meet all six of his criteria, but you should aim for insights that:

  • Are grounded in real data: Draw your conclusions only from what you actually see and hear. Let your intuition flow from your interpretation of actual evidence.

  • Use simple language and concepts: Separate any concepts and express them as independent insights—especially if each of them has significant design implications.

  • Are meaningful and memorable: An insight that tells a story is the ultimate empathy builder. When stakeholders hear a compelling story, it is as if they are experiencing the pain or joy of the protagonist.

  • Speak to the audience: The most effective insights are those that move people in a certain way. They come from answering stakeholders’ burning questions that help shape core research goals.

  • Inspire action: Insights from research should be actionable for UX designers and product teams.

  • Reinforce ownership and commitment: When stakeholders feel that they own an idea—or, in this case, a hypothesis—they are more likely to honor their commitment to follow through on any actions their idea generated.
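One way to keep these criteria in view during synthesis is to treat them as a simple checklist that each candidate insight is run through before it goes into a report. The sketch below is a hypothetical illustration in Python – the class, field names, and example insight are assumptions, not something Morgan or any particular tool prescribes.

```python
# A hypothetical checklist for the six qualities above. Names and the example
# insight are illustrative assumptions, not an established tool or API.
from dataclasses import dataclass, fields


@dataclass
class InsightChecklist:
    statement: str
    grounded_in_real_data: bool = False
    simple_language: bool = False
    meaningful_and_memorable: bool = False
    speaks_to_the_audience: bool = False
    inspires_action: bool = False
    reinforces_ownership: bool = False

    def score(self) -> int:
        """Count how many of the six criteria this insight currently meets."""
        return sum(bool(getattr(self, f.name)) for f in fields(self) if f.name != "statement")


# An invented example insight, only to show the checklist in use.
insight = InsightChecklist(
    statement="New customers stall at setup because the import step assumes data they don't have yet.",
    grounded_in_real_data=True,
    meaningful_and_memorable=True,
    inspires_action=True,
)
print(f"{insight.score()}/6 criteria met")  # prompts a second look before presenting it
```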

‘Why’, not just ‘what’

It’s common to hear UX practitioners talk about their focus on helping define the ‘why’ that happens behind the visuals.

‘Quantitative’ data, with its numbers and metrics, often exposes the symptoms of usability issues, or a mismatch between a product and the needs of the people using it. The strength of quantitative measurement is in defining ‘how much’ and ‘how often’; it’s relatively rare to get an answer about ‘why’ something is happening.

‘Qualitative’ data, on the other hand, shows behavior and attitude, and embraces the sometimes illogical nature of people to get much closer to understanding motivations and causative factors. It isn’t immune from missing the mark, however. A common misstep in qualitative research is to get excited about identifying an issue or a particular behavior, but fail to pursue the reason why it is occurring. Not knowing the root cause, and the context around it, means you’re unable to suggest what needs to be done.

Many research repositories and libraries are structured around findings. Be mindful that the luxury of remixing findings, democratizing research, or drawing on intelligence over a longer timescale doesn’t excuse anyone from delivering meaningful and timely insights – and those are only possible with comprehensive meta-analysis.
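As a loose illustration of what ‘structured around findings’ can look like in practice, here’s a minimal, hypothetical sketch of a finding record that can be tagged, remixed across projects, and rolled up into a later meta-analysis. The field names are assumptions made for the example, not a description of any particular repository tool.

```python
# A hypothetical finding record for a repository structured around findings.
# Field names are illustrative assumptions; they don't describe any specific product.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class Finding:
    summary: str   # the observed fact, kept close to the evidence
    source: str    # e.g. "interview 3, 00:41:20" (illustrative)
    project: str
    recorded: date
    tags: list[str] = field(default_factory=list)  # themes used to remix findings later


def findings_by_tag(findings: list[Finding], tag: str) -> list[Finding]:
    """Pull every finding that shares a theme tag, across projects, for later meta-analysis."""
    return [f for f in findings if tag in f.tags]
```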

Without a systematic approach, researchers can often find themselves in a mire of mistakes. The idea is to stay organized and focused by following a reliable process.

According to Maria Rosala, a specialist in UX research with the Nielsen Norman Group, the most common challenges that researchers face when analyzing qualitative material are due to the large quantities of richly detailed and sometimes contradictory data.

Large quantities of information to digest

Maria explains how skim reading can result in superficial or selective analysis.

Long transcripts and extensive field notes can be time-consuming to read; you may have a hard time seeing patterns and remembering what’s important.

Data reduction techniques can help, as can the confidence and support to dive in and proceed steadily.

Dense information

The wealth of detail itself can make it hard to separate useful facts from superfluous ones, and researchers can become frozen with indecision.

The analysis simply becomes a regurgitation of what participants may have said or done, without any analytical thinking.

It takes discipline, collaboration, prioritization, and perhaps—counter-intuitively—some additional research to clarify what you’re dealing with.

Contradictory findings

A finding that contradicts another is a common occurrence. It might even be expected when dealing with multiple participants, but things can get particularly confusing when a single participant says two different things. Inevitably this makes it harder to reach conclusive findings objectively because, as Maria puts it:

Participant feedback is conflicting, or, worse, viewpoints that don’t fit with the researcher’s belief are ignored.

To compensate, hold the conflicting findings in balance and look beyond their binary nature for other suggestions that might better describe what is happening. To do this, you may need to seek clarification through further research.
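If you code excerpts as you go, even a very simple pass can surface these within-participant conflicts so they get examined rather than quietly dropped. The sketch below is a hypothetical illustration; it assumes excerpts have already been coded with a topic and a rough stance, and the sample rows are invented.

```python
# A hypothetical pass over coded excerpts that surfaces within-participant contradictions.
# The data shape and sample rows are invented for illustration.
from collections import defaultdict

# (participant, topic, stance, excerpt) – stance is a rough "positive"/"negative" coding
coded_excerpts = [
    ("P1", "onboarding", "positive", "Setting up was quick."),
    ("P1", "onboarding", "negative", "I nearly gave up during the import step."),
    ("P2", "onboarding", "positive", "It just worked."),
]

stances = defaultdict(set)
quotes = defaultdict(list)
for participant, topic, stance, excerpt in coded_excerpts:
    stances[(participant, topic)].add(stance)
    quotes[(participant, topic)].append(excerpt)

for (participant, topic), seen in stances.items():
    if len(seen) > 1:  # the same participant was coded both ways on the same topic
        print(f"{participant} is conflicted about '{topic}':")
        for quote in quotes[(participant, topic)]:
            print(f"  - {quote}")
```

The point isn’t automation; it’s simply making the conflict visible so it can be held in balance rather than ignored.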

Unclear goals for research and analysis

Other challenges might not be inherent in the data, but could stem from a lack of goal setting for the analysis itself. Maria says:

The aims of the initial data collection are lost because researchers can easily become too absorbed in the detail. The analysis lacks focus and the research reports on the wrong thing.

This is an awkward position to be in, as you’re already committed to what you’ve done. If this happens, the first step is to see whether there’s enough crossover between what your research covered and what it should have been exploring. If there isn’t, you may need to go back and repeat the process, so the earlier you recognize the misalignment, the better.

Allowing synthesis to emerge during analysis

By its most basic definition, analysis is about sifting through research material to identify facts and frame problems. To go beyond a mere assortment of facts and break through into an awareness of what we can conclude about those facts is synthesis.

The best research involves both.

Synthesis can emerge organically during analysis—if we recognize it, and allow it to happen. It’s apparent in those moments when we first glimpse patterns across data sets, or get an intuition about a thread of truth amongst a mishmash of options.

But relying on our instincts in an environment that demands evidence is tricky.

Harnessing ‘gut feel’ responsibly

Intuition is sometimes seen as a magical sixth sense or as something that emerges from an obscure inner force, but it is actually a mental process triggered by our perception. We might have a fleeting impression of a visual inconsistency, a fact out of place, a facial expression, a sense of tone, or some other thing that has registered without conscious awareness.

It might be more helpful to think of insights as ‘rapid cognition’, or as ‘condensed reasoning’ that uses shortcuts in our brain to link two disparate concepts.

For example, our brain loves to find meaning. It receives multiple simultaneous inputs from our senses, and when attempting to create connections it matches to the closest kindred pattern from amongst the vast repository of our memories and experiences. The definition of ‘closest’ is unique to each of us at any given point in time.

Intuition is therefore not the unconscious processing of cues, such as those used in first impressions. Neither is it the deliberate reasoning used by our forebrain. The non-conscious thinking of intuition has strong links to our subjective preferences and spontaneous feelings. Experience is encoded in our brains as an intricate web of both fact and feeling, and so our understanding and recall of memories are interwoven with emotion.

Our brains’ search for kindred connections also means that the deeper and longer we go in familiarizing ourselves with the research data, the better. The more exposure we have to contexts, and the more ways we slice and dice the information in our subsequent attempts to analyze it, the more reliable our intuitions will be. They will surface from a richer array of collected patterns and experiences.

A researcher's brain can be like a cauldron, and the experience of researching as some kind of pottage. Many ingredients can be put in, over and over again, to replenish the goodness over different seasons. Things change with time, and the longer things cook, the more balanced and nuanced the flavors.

As researchers, we’ve already spent hours with people having conversations, asking questions and listening to them, observing behavior and context, and building empathy. To a limited extent, we know them. If we, therefore, resonate strongly with a particular comment and can connect it to a pattern in other comments, then we’re probably onto something worth further exploration.

The art and science of synthesis

It’s important to emphasize the need to connect our hunches to similar comments or data points. It is wise to evaluate our intuition on balance with the evidence, and in the full light of factual data.

Without this kind of backing, our inkling may just be distracting us from reality. Without cross-checking, we’re likely to latch onto the wrong details and pull up the wrong connected web of associations in our brain.

This is also where the importance of objective data collection comes in. We can only rely on our data points for balancing evidence if they are untainted by our opinion and judgments to begin with. The art of intuition is only valuable if we stand on a foundation of good data science.

It is the nature of good analysis and synthesis to hold opposites in tension. We live this daily when the messiness of human behavior conflicts with our clear data recording and decision-making, or when the logic of structured thinking is juxtaposed with the unpredictability of obscured nuggets. Being able to work with objective facts while cultivating intuitive insights is just another balance we must strike.

Professional researchers and qualitative analysts have learned how to bridge these divides. At their core, they’re driven to understand the needs of real users. Their hunger for both accuracy and value takes them beyond providing mere facts and towards communicating meaningful insights. And they successfully harness both empathy and evidence to compel design teams into action, enabling the creation of truly useful products and services.

Ultimately, the best way to motivate action from your research is to help your team make this connection.

If we use these same principles in our approaches, maybe one day we too will find ourselves on the other side of the desk, astounding and inspiring the next generation of researchers with all the contradictions of complex data sets and human capability.
