Life deals most of us a steady stream of ego blows, be they failures at work, social slights, or unrequited love. Social psychology has provided decades of insight into just how adept we are at defending ourselves against these psychic threats. We discount negative feedback, compare ourselves favorably to those worse off than us, attribute our failures to others, place undue value on our own strengths, and devalue opportunities denied to us, all in service of protecting and restoring our sense of self-worth. Collectively, this array of motivated mental processes supporting mood repair and ego defense has been called the “psychological immune system.” Particularly striking to social psychologists is our ability to remain blind to our use of these motivated strategies, even when it is apparent to others just how biased we are.

However, there are times when we either cannot remain blind to our own psychological immune processes, or when we find ourselves consciously wanting to use them expressly to restore our ego or our mood. What then? Can we believe a conclusion we reach even when we know we arrived at it in a biased way? Imagine, for example, that you’ve recently gone through a breakup and want to get over your ex. You decide to make a mental list of all of their character flaws in an effort to feel better about the relationship ending.

A number of prominent social psychologists have suggested you’re out of luck: knowing that you’re focusing only on your ex’s worst qualities prevents you from believing the conclusion you’ve reached, that you’re better off without him or her. In essence, they argue that we must remain blind to our own biased mental processes in order to reap their ego-restoring benefits. In many ways this closely echoes the position that philosophers like Mele have taken on the possibility of agentic self-deception.

The argument behind the psychologists’ claim rests on two premises. The first is that people need to believe they are perceiving themselves and the world the way they really are, that human beings need to maintain an ‘illusion of objectivity.’ The second is that once people become aware that some external factor, including their own desires, has shaped their perceptions, they will attempt to correct for that influence to bring their ‘biased’ perception in line with reality. The reason, as Balcetis and Dunning put it, is that “if people… knew they believed some pleasant thought merely because they wanted to believe it, they would also know, at least in part, how illegitimate that thought was.” In essence, then, psychologists have argued that awareness of our own motivated mental processes means that we see the outputs of those processes as tainted, and as a result we are forced to conclude that they are untrue.

However intuitive and appealing this claim may be, there is scant empirical evidence to support it. If anything, research suggests the opposite: that people can and do benefit from their own biased mental processes even when made aware of them. This raises the question: how do people manage this feat? How could you truly believe you’re better off without your ex if you are simultaneously aware that this belief was formed by selectively remembering only his or her worst qualities?

Ultimately, I take no issue with the claim that people need to maintain an ‘illusion of objectivity,’ meaning the belief that they are seeing themselves and the world as both truly are. What I challenge is the assumption that awareness that one’s perceptions have been shaped by a biased mental process is at odds with the belief that those perceptions (the products of our biased processes) are objectively true. Outside the context of the self, we are quite practiced at distinguishing between the objectivity of a mental process and the objective truth of a mental product.

Imagine Michael Jordan’s mother, who believes that her son is the greatest basketball player of all time. It is not hard to recognize both that a) the way she arrived at that conclusion is likely biased by her own desire for it to be true, and b) her bias in reaching that conclusion has no bearing on its veracity. Whether her belief is objectively true is independent of how objective (free from bias) her process of generating it was. Now let’s apply that same process-versus-product distinction to ourselves. In a world in which a biased mental process means a tainted and therefore incorrect mental product, knowing I thought through only my ex’s worst qualities would invalidate my conclusion that I’m better off without him. But just as Jordan may in fact be the greatest even if his mother believes that only because he is her son, I may in fact be better off without my ex even if I believe that only because I generated a biased list of his faults.

But how do we determine whether a conclusion we were motivated to reach is objectively true, if the objectivity of the mental process that generated it is no guide? Kunda suggested that the ability “to construct a justification of [our] desired conclusion that would persuade a dispassionate observer” constitutes a boundary on motivated reasoning. I would argue that the actual threshold for concluding that our mental products are true is much lower than this.

Recent work by the philosopher Elijah Millgram highlights that while truth has historically been defined as a binary construct, much of the world is in fact composed of non-binary ‘soft truths,’ a position supported by psychologists who have highlighted the fundamentally ambiguous nature of social reality. Taken together with research on the strength of naïve realism and the differential standards of evidence we apply when judging congenial versus uncongenial propositions, these findings amount to a recipe that allows us to believe nearly anything we want to, whether consciously or not.

The next time you hear someone say ‘I know I’m biased, but…’, mentally tag it as evidence that we can simultaneously know that we are biased and believe that we are right. And be grateful that we are capable of such mental gymnastics, because it allows us to use our psychological immune processes with awareness and intentionality while still reaping their many benefits.


This post was first published on Scientific American and is shared here with the author’s and editor’s permission.

Emily Rosenzweig is an assistant professor of Marketing at the A.B. Freeman School of Business at Tulane University. She studies judgment and decision making, with a focus on understanding the processes and boundaries of self-knowledge.

References:

Gilbert, D.T., Wilson, T.D., Pinel, E.C., Blumberg, S.J., & Wheatley, T.P. (1998). Immune neglect: A source of durability bias in affective forecasting. Journal of Personality and Social Psychology, 75, 617-638.

Pronin, E., Lin, D.Y., & Ross, L. (2002). The bias blind spot: Perceptions of bias in self versus others. Personality and Social Psychology Bulletin, 28, 369-381.

Sherman, D.K., Cohen, G.L., Nelson, L.D., Nussbaum, A.D., Bunyan, D.P., & Garcia, J. (2009). Affirmed yet unaware: Exploring the role of awareness in the process of self-affirmation. Journal of Personality and Social Psychology, 97, 745-764.

Balcetis, E., & Dunning, D. (2006). See what you want to see: Motivational influences on visual perception. Journal of Personality and Social Psychology, 91, 612-625.

Baumeister, R.F., & Newman, L.S. (1994). Self-regulation of cognitive inference and decision processes. Personality and Social Psychology Bulletin, 20, 3-19.

Pronin, E., Gilovich, T., & Ross, L. (2004). Objectivity in the eye of the beholder: Divergent perceptions of bias in self versus others. Psychological Review, 111, 781-799.

Pyszczynski, T., & Greenberg, J. (1987). Toward an integration of cognitive and motivational perspectives on social inference: A biased hypothesis-testing model. Advances in Experimental Social Psychology, 20, 297-340.

Mele, A.R. (2000). Self-deception unmasked. Princeton, NJ: Princeton University Press.

Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108, 480-498. 

Millgram, E. (2009). Hard truths. Sussex, UK: Wiley-Blackwell.

Ross, L., & Ward, A. (1996). Naive realism in everyday life: Implications for social conflict and misunderstanding. In T. Brown, E. S. Reed, & E. Turiel (Eds.), Values and knowledge. The Jean Piaget Symposium Series (pp. 103–135). Hillsdale, NJ: Erlbaum.