People who think their opinions are superior to others’ are most prone to overestimating their relevant knowledge and ignoring chances to learn more

By guest blogger Tom Stafford

We all know someone who is convinced their opinion is better than everyone else’s on a topic – perhaps, even, that it is the only correct opinion to have. Maybe, on some topics, you are that person. No psychologist would be surprised that people who are convinced their beliefs are superior think they are better informed than others, but this fact leads to a follow-on question: are people actually better informed on the topics for which they are convinced their opinion is superior? This is what Michael Hall and Kaitlin Raimi set out to check in a series of experiments reported in the Journal of Experimental Social Psychology.

The researchers distinguish “belief superiority” from “belief confidence” (thinking your opinion is correct). Belief superiority is relative – it is when you think your opinion is more correct than other people’s; the top end of their belief-superiority scale is the option “Totally correct (mine is the only correct view)”.

The pair set out to find people who felt their beliefs on a variety of controversial political issues (things like terrorism and civil liberties, or wealth redistribution) were superior, and to check – using multiple-choice quizzes – how well informed they were about the topics where they felt this superiority.

Across five studies, Hall and Raimi found that the people with the highest belief superiority also tended to have the largest gap between their perceived and actual knowledge – the belief-superior consistently suffered from the illusion that they were better informed than they were. As you might expect, those with the lowest belief superiority tended to underestimate how much they knew.

As well as measuring background knowledge, the researchers were interested in how people with belief superiority sought out new information relevant to that belief. They gave participants a selection of news headlines and asked them to select which articles they’d like to read in full at the end of the experiment. Categorising headlines as belief-congruent or belief-incongruent, the researchers observed that participants with higher belief superiority were more likely to select belief-congruent headlines. In other words, despite being badly informed relative to their self-perception, these participants chose to neglect sources of information that would have enhanced their knowledge.

Finally, and more promisingly, the researchers found some evidence that belief superiority can be dented by feedback. If participants were told that people with beliefs like theirs tended to score poorly on topic knowledge, or if they were directly told that their own score on the topic-knowledge quiz was low, this not only reduced their belief superiority, it also caused them to seek out the kind of challenging information they had previously neglected in the headlines task (though the evidence for this behavioural effect was mixed).

The studies all involved participants recruited via Amazon’s Mechanical Turk, allowing the researchers to work with large samples of Americans for each experiment. Their findings mirror the well-known Dunning-Kruger effect – Kruger and Dunning showed that, for domains such as judgements of grammar, humour or logic, the most skilled tend to underestimate their ability, while the least skilled overestimate it. Hall and Raimi’s research extends this to the realm of political opinions (where objective assessment of correctness is not available), showing that the belief that your opinion is better than other people’s tends to be associated with overestimation of your relevant knowledge.

Overall, the research presents a mixed picture. It shows, as others have, that our opinions are often not as justified as we believe – even those opinions we are most confident are better than other people’s. On the other hand, it shows that people are responsive to feedback, and are not driven solely by confirmation bias when they seek out new information. The final picture is of a human rationality that is flawed but correctable, not doomed.

Is belief superiority justified by superior knowledge?

Post written by Tom Stafford (@tomstafford) for the BPS Research Digest. Tom is a psychologist from the University of Sheffield who is a regular contributor to the Mind Hacks blog. His latest book is For argument’s sake: evidence that reason can change minds.

46 thoughts on “People who think their opinions are superior to others’ are most prone to overestimating their relevant knowledge and ignoring chances to learn more”

  1. Every questionnaire includes the percentage of “don’t knows” – presumably they either know less and realise it, or know a lot but still can’t decide on an issue. The article would suggest that these may be the only ones who are well informed about their own capacities. All others are delusional, it would appear: those who know a lot think they don’t, and those who know only a little think they know more than they do.

    I’d guess there are some people who both know a lot and believe their opinions to be superior to those of people who know less. All this assumes that knowledge is something that can be assessed in terms of data points, however, which I suggest is a naive assumption. Does it take a lot of empirical knowledge to know that mothers generally love their children, for example?

    As a cautionary tale, the analysis in the study may give us pause for thought, but so too would a world in which people went around having no real convictions and no ability to act.

  2. The study uses political subjects to evaluate belief confidence and bias against ground truth. However, one could argue that political subjects are often up for debate, and thus partly subjective, which makes the choice of an appropriate “correct answer” for each quiz question quite difficult, or even biased. It would be good to see a follow-up study in a more formal domain of knowledge like mathematics, physics or history, so as to rule out the possible influence of such a bias.

    1. After reading the supplementary materials, I can confirm that a significant number of the questions are not trivial, such as questions based on polls (e.g., Question: how many Americans support this reform? Solution: according to xxx’s poll, 90% do). So the approach is very interesting, but clearly another study in a more formal domain is needed before drawing any conclusions.

      1. Also note that the solutions are often chosen according to a single article/source, even though there might be multiple sources with potentially contradictory information (the best example being poll-based questions: by picking another poll you would get another value).
