
What is motivated reasoning?

Last updated

14 November 2023


Dovetail Editorial Team

It’s human nature to seek the truth through reason. But how can we know when we have achieved this or whether our preexisting beliefs, subconscious or otherwise, have clouded our judgment?

Facing the fact that we all have hidden biases can be uncomfortable, but we need to be aware of them and work out how to move past them if we are to arrive at an unbiased, correct conclusion.

What is motivated reasoning?

In cognitive psychology, "motivated reasoning" refers to our use of reason to justify a preconceived motive or belief.

The reasoning becomes secondary to the motive or belief, and this may not even be apparent to us. When we apply reason to retroactively defend a compulsion or predetermined conclusion, we've engaged in motivated reasoning.

A classic example of motivated reasoning is the "nimby" phenomenon. Short for "not in my backyard," it refers to people opposing a proposed development, such as new services or land use, purely because it is near where they live. Examples of nimby scenarios include:

  • Infrastructure development, such as motorways or bridges

  • Extraction of mineral resources

  • Housing for disadvantaged groups such as halfway houses for drug addicts or accommodation for homeless people

This is just one of countless possible motivated reasoning examples. More examples are given in the FAQs at the end of the article.

How motivated reasoning works

It's important to note that the "motivation" behind motivated reasoning can also be motivation for accuracy. For example, a trusted source may admit they don't know something, even if they stand to gain by pretending otherwise and have the influence to do so.

Nonetheless, the usual implication behind the phrase "motivated reasoning" is that someone has an ulterior motive, subconscious or otherwise, to confirm a preexisting belief.

Strictly within the context of belief-motivated reasoning, confirmation bias (the tendency to filter data to confirm beliefs) plays a major role in hiding a person’s true motivations, including from themselves. A researcher may then start to think they’re better than they are at separating fact from fiction.

If they do this, they’re likely to become hyper-alert to other people’s "cognitive distortions" or "maladaptive beliefs." This often leads to a divide between their personal view of events and a more objective or public one.

This type of reasoning makes it easy to label others’ beliefs as "distorted" or even "maladaptive." Yet that labeling is itself a suspect form of reasoning, likely with its own unstated motives, such as denying one’s own cognitive distortions.

Power Threat Meaning Framework (PTMF)

The Power Threat Meaning Framework (PTMF), published by the British Psychological Society in 2018, states that a strictly cognitive approach to interpreting someone's reasoning is often steeped in bias. They suggest that what's lacking is the social context in which the person’s reasoning arose.

Instead of pathologizing a private belief or enshrining public consensus, the focus must shift to discovering the non-pathological (i.e., good, compelling) reason behind a given motivation. This is difficult because motivated reasoning is usually a subconscious process.

Worse, cognitive dissonance (the discomfort of holding conflicting beliefs) can lead a researcher’s subconscious motivations to confirm their biases. Rather than admit those biases, researchers may even present as though they hold the opposite stance. This is a subtle, often unconscious, form of self-deception that conceals their biases and reduces the discomfort of cognitive dissonance.

What causes motivated reasoning?

Motivated reasoning is often caused by strong, unmet motivations. These create a latent psychological pressure to assert one's motivations while hiding them, so that they appear more intellectually compelling.

Most researchers would be honored to be seen as motivated by reason. All too often, though, cognitive scientists find evidence that people begin with predetermined motivations and biases.

Cognitive scientists have attempted many explanations for motivated reasoning, including:

  • Unmet emotional, psychological, or other compelling needs

  • Past conditioning, that is, behavior shaped through repeated experiences

  • Arousal and need for structure (Kunda theory)

  • Bias to support positions one has already achieved (Lodge-Taber theory)

  • Neuronal pathways activated by consistent emotional stimulation

Let’s look at how confirmation bias and cognitive dissonance are two major drivers of motivated reasoning, and how each can be handled for greater psychological harmony.

Confirmation bias

Confirmation bias is our tendency to interpret or look for information that supports our preexisting assumptions. It's like an evidence-gathering process that consistently filters data. This "filtering" can take the form of ridiculing, downplaying, or otherwise judging the worth of data before honestly investigating it.

One theory about the mechanisms behind motivated reasoning is that we engage in a kind of “motivated skepticism”. We can be less critical when looking at information consistent with our preferred conclusions than when faced with data that contradicts our beliefs.

There are dual motivations happening here:

  • To search for information confirming a bias, whether externally or by memory retrieval—making the bias like a hypothesis that requires constant positive reinforcement

  • To diminish contradictory information, and even seek it out, like something to hunt down and eradicate (even if just rhetorically)

They’re two sides of the same coin. After all, isn’t it easier to create supporting arguments for your preferred conclusions than to entertain opposing ideas?
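As a toy illustration (not from the article, with all names hypothetical), this asymmetric filtering can be sketched in a few lines of Python: evidence that supports the current belief is accepted at full weight, while contradicting evidence is discounted, so even perfectly balanced evidence drifts toward the preferred conclusion.

```python
# Toy model of "motivated skepticism": supporting evidence is fully
# accepted, while contradicting evidence is down-weighted.
def update_belief(belief, evidence, discount=0.3):
    """Shift belief toward each piece of evidence (+1 supports a positive
    belief, -1 contradicts it), discounting evidence that points away
    from the belief's current direction."""
    for e in evidence:
        # The asymmetric filter: full weight only when signs agree.
        weight = 1.0 if (e > 0) == (belief > 0) else discount
        belief += 0.1 * weight * e
    return belief

# Perfectly balanced evidence: equal amounts for and against.
balanced = [+1, -1] * 50

biased = update_belief(0.5, balanced)               # asymmetric filtering
fair = update_belief(0.5, balanced, discount=1.0)   # no filtering

print(round(biased, 2))  # drifts well past the starting point
print(round(fair, 2))    # stays near the starting point of 0.5
```

The only difference between the two runs is the discount applied to unwelcome evidence, which is the point: the bias lives in the filter, not in the data.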

How to minimize confirmation bias

The good news is that when a form of reasoning isn't challenged enough, those sticking to it become less capable of defending it. An open marketplace of ideas can identify logical fallacies, wishful thinking, and outright untruths.

Those relying on confirmation bias and motivated reasoning will experience the mental distress involved in defending poorly backed claims. It doesn't necessarily matter if they're doing so intentionally or not. What matters is that a fair and open debate, whether formal or informal, allows competing or contradictory ideas to capture mass attention.

There are many tools serious researchers use to try to minimize confirmation bias, such as:

  • Opening research up to peer review

  • Repeating experiments to confirm or deny previous findings

  • Removing incentives for specific conclusions

  • Revealing previously hidden research and data

  • Exposing conflicts of interest

Needless to say, confronting bias can make for a downright hostile intellectual arena. One reason is that letting go of bias often involves confronting painful psychological defense mechanisms, as we’ll explore next.

Cognitive dissonance

Underlying the psychology of motivated reasoning is cognitive dissonance, a type of discomfort-driven "gatekeeper" for entrenched belief systems. While motivated reasoning shapes new information around those beliefs, it's cognitive dissonance that actively protects them.

What may begin as a low level of discomfort on hearing contradictory data could lead to a response that is more emotional than intellectual. The question is, will this lead to an honest interest in the new data, or rejection, whether by direct opposition or sly, rhetorical maneuvers?

It’s important to understand that cognitive dissonance activates the brain’s posterior medial frontal cortex (pMFC), which plays a central role in helping people identify and avoid adverse outcomes. The term “cognitive dissonance” tends to carry a negative tone (as does “motivated reasoning”), but objectively, who is to decide when cognitive dissonance is “incorrect” or not?

Yet, if knowledge and experience of a perceived threat are lacking, it's easy to see how fiercely guarded bias can make the motivated reasoner highly defensive, or even push them onto the offensive. Conversely, this guarding can keep us comfortable over the long term, even when we hear very difficult information.

Good or bad, right or wrong, at the heart of cognitive dissonance is selective information processing and survival-level motivation to maintain mental coherence and cut down on psychological tension.

How to minimize cognitive dissonance

If perceived threats to psychological comfort hold such power over our ability to reason, this alone should motivate the objective thinker to confront those threats.

There are three ways to reduce cognitive dissonance:

  • Change beliefs: Alter one of the conflicting beliefs to align them with each other.

  • Seek information: Gather more information or evidence that supports one of the conflicting beliefs to justify a particular choice or action.

  • Minimize importance: Reduce the significance of one of the conflicting beliefs to lessen the discomfort it causes.

Research shows that cognitive dissonance affects health behaviors such as smoking, sun protection, and sexual risk-taking. Different dissonance-based health-behavior interventions show promise of improving behavior, attitude, and intention.

How to avoid motivated reasoning and think critically

Making a concerted effort to reduce the causes of confirmation bias can lower the influence that motivations have over our reasoning. Of course, this is somewhat tricky, for isn't some motivation necessary to arrive at the truth?

It may be more reasonable to simply accept that major unexplored motivations are driving us, even if we're unaware of them. For some, taking this view is the best they can do to orient themselves toward a less biased outlook.

More tangibly, consider treating yourself as you would a researcher, journalist, scientist, or other figure you hold to a high standard. This might look like:

  • Abstaining from research topics that provoke a deep emotional response in you, and outsourcing that research to a trusted, unbiased researcher in your field

  • Removing any personal stake in the outcome, whether financial or otherwise

  • Aiming not to make immediate adjustments based on research conclusions; "sit" on them for a predetermined period

  • Reviewing your conclusions with a trusted confidant who consistently gives honest and well-reasoned feedback, even if it's uncomfortable

  • Comparing your conclusions with those of others, aiming primarily for people with no apparent bias or motivation, and/or those with varying motivations


FAQs

What are motivated beliefs?

Strictly speaking, "motivated beliefs" is a more accurate term for what most people mean by motivated reasoning. "Motivated reasoning" could just as well describe a motivation to use reason with the absolute minimum of psychological bias. By convention, though, people speak of motivated reasoning with the presumption that it is belief-driven.

What are examples of motivated reasoning?

Besides the "nimby" example given above, other motivated reasoning examples include:

  • Downplaying research that shows a behavior the researcher enjoys is unhealthy

  • Over-emphasizing research that shows your position in a positive light when higher-quality research paints a more negative picture

  • Learning about a more effective method, then ignoring or dismissing it because this is easier than changing your habits

  • Deriding a practice when others do it, but justifying it when it benefits you

  • Altering the definition of legal terms to make an illegal act legal

  • Someone entrusted with setting ethical standards lowering those standards

The common thread of any motivated reasoning example is a general sense of hypocrisy. Motivated reasoning itself, though, is more like the "tacked-on" excuse that denies there was any hypocrisy to begin with.

What's the difference between motivated reasoning and confirmation bias?

Motivated reasoning relates to arriving at biased conclusions. Confirmation bias relates to acquiring data that confirms a bias. If motivated reasoning is the tendency to promote conclusions that uphold one's beliefs, then confirmation bias is the mechanism for doing so.
