Four decades ago, Lee Ross and his colleagues asked students at Stanford University to do a very unusual thing. They asked the students to walk around the Stanford campus for half an hour wearing a large sandwich board that read "Repent," counting the number of people who spoke with them as they did so. Presumably, the data from this unusual task would become part of a study of "communication techniques."

In reality, the researchers wanted to compare the students who said yes with those who said no to this unusual request. Ross and his colleagues observed something surprising. Students who agreed to the unusual request believed that a clear majority of other students would also agree to do so. But students who refused to carry the sign around campus believed that a clear majority of other students would also refuse.

This tendency is known as the false consensus effect.  We often believe that there is more consensus—that is, more agreement—for what we say, think, and do than is really the case.

False consensus effects apply to all kinds of judgments, but they are much more pronounced for people who are in the statistical minority than for people in the statistical majority.  The extremely rare people who believe that the earth is flat, for example, are very, very likely to overestimate how many other “flat earthers” there are.  In contrast, there is little room for the 99.9% of people who believe that the earth is round to overestimate how many other people agree with them.  As a more mundane example, in a sample in which 30% of college students believed that technology-wielding alien life exists elsewhere in the universe, such students believed that about 60% of their peers shared their minority viewpoint, clearly overestimating agreement with their belief. 

Perhaps the two most important things to know about the false consensus effect have to do with its potency. First, false consensus effects still exist for important or self-defining beliefs. Second, neither education about the false consensus effect itself nor large rewards for accuracy seem to eliminate it. This bias is hard to eliminate.

To appreciate this second point, consider a classic study by Brian Mullen. Mullen wanted to see if the false consensus effect would still occur when avoiding the bias could help people win thousands of dollars in cash and prizes. Mullen examined data from an old TV game show ("Play the Percentages"). The data provided by game show contestants were their estimates of the percentage of studio audience members who would be able to answer specific trivia questions (such as "What state did Hubert Humphrey represent in Congress?"). Back when people still remembered Hubert Humphrey (he was once vice president), 72% of audience members were able to report correctly that Humphrey represented Minnesota. Today, of course, that figure would be much lower. Mullen found that contestants consistently overestimated the percentage of others who knew the answers to questions when they themselves had known the answers.

Consistent with other research, Mullen also observed larger false consensus effects for people whose own answers placed them in the statistical minority. The rare people who knew the answer to a difficult question were especially likely to overestimate the percentage of others who shared their esoteric knowledge. This was the case even when people were trying very hard to guess correctly the percentage of audience members who did or did not know something. As Mullen noted, offering incorrect estimates of what the studio audience members knew cost these contestants dearly, and they were rewarded handsomely for more accurate guesses.

The false consensus effect has now been demonstrated for a very wide range of judgments, from what kind of bread people prefer to how often people think other people lie or cheat.  Further, we have learned more about why this judgmental bias is so pervasive. For example, Mauricio Carvallo and his colleagues found that the need to belong (the desire to be accepted and connected to others) nudges people toward false consensus effects. That is, we overestimate consensus for our own attitudes and behavior because we assume that other people who share our opinions are more likely to accept us. In a sense, then, the false consensus effect is a form of wishful social thinking.  It’s reassuring to think that people agree with us.

Cognitive and perceptual biases also contribute to the false consensus effect.  Because most people tend to associate with others who share their own attitudes and opinions, people who ask themselves how many of their acquaintances believe something will be relying on consensus information that comes from a biased sample. For both motivational and cognitive reasons, then, most people believe that what they think and do is more popular than it really is. 

Sometimes this bias is harmless—as when we convince ourselves that our favorite dessert is more popular than it really is.  At other times, however, this bias can be highly problematic.  A case in point is assuming that your spouse shares all of your personal attitudes about childrearing.  Such inaccuracies often become evident, of course, only after we have children.

Finally, recent research suggests that a very close cousin of the false consensus effect (believing that what happens to you also happens to many other people) may play a role in public opinion about climate change. If the climate in our own state or nation is often chilly, we may conclude that it must be chilly elsewhere on the globe, and thus take a skeptical stance on the reality of global warming.


For Further Reading

Mullen, B. (1983). Egocentric bias in estimates of consensus. The Journal of Social Psychology, 121, 31–38.

Pelham, B. W. (2018). Not in my back yard: Egocentrism and skepticism about climate change. Environmental Science and Policy, 89, 421–429.

Ross, L., Greene, D., & House, P. (1977). The false consensus effect: An egocentric bias in social perception and attribution processes. Journal of Experimental Social Psychology, 13, 279–301.


Brett Pelham is a social psychologist who studies the self, health, culture, evolution, stereotypes, and judgment and decision-making. He is also an associate editor at Character and Context.