In social psychology, naïve realism is the human tendency to believe that we see the world around us objectively, and that people who disagree with us must be uninformed, irrational, or biased.
Naïve realism provides a theoretical basis for several other cognitive biases, which are systematic errors in thinking and decision-making.
[1][2] It is related to the philosophical concept of naïve realism, which is the idea that our senses allow us to perceive objects directly and without any intervening processes.
[4] Several prominent social psychologists have studied naïve realism experimentally, including Lee Ross, Andrew Ward, Dale Griffin, Emily Pronin, Thomas Gilovich, Robert Robinson, and Dacher Keltner.
"[5] Lee Ross and fellow psychologist Andrew Ward have outlined three interrelated assumptions, or "tenets", that make up naïve realism.
According to their model, people (1) believe that they see the world objectively and without bias; (2) expect that others will come to the same conclusions, so long as they are exposed to the same information and interpret it rationally; and (3) assume that others who do not share the same views must be uninformed, irrational, or biased.
Naïve realism follows from a subjectivist tradition in modern social psychology, which traces its roots back to one of the field's founders, German-American psychologist Kurt Lewin.
In 1948, psychologists David Krech and Richard Crutchfield argued that people perceive and interpret the world according to their "own needs, own connotations, own personality, own previously formed cognitive patterns."
"[13] Solomon Asch, a prominent social psychologist who was also brought up in the Gestalt tradition, argued that people disagree because they base their judgments on different construals, or ways of looking at various issues.
"[15] In a seminal study in social psychology, which was published in a paper in 1954, students from Dartmouth and Princeton watched a video of a heated football game between the two schools.
[1] A 1977 study conducted by Ross and colleagues provided early evidence for a cognitive bias called the false consensus effect, which is the tendency for people to overestimate the extent to which others share their own views.
In a 1985 study, pro-Israeli and pro-Arab students were asked to watch real news coverage of the 1982 Sabra and Shatila massacre, a mass killing of Palestinian refugees (Vallone, Ross, and Lepper, 1985).
In a study conducted by Pronin, Lin, and Ross (2002), Stanford students completed a questionnaire about various biases in social judgment.
On this view, an individual who disagrees either has been exposed to a different set of information, is lazy or otherwise unable to reach a rational conclusion, or is under a distorting influence such as bias or self-interest.
[1] This gives rise to a phenomenon called false polarization, which involves interpreting others' views as more extreme than they really are, and leads to a perception of greater intergroup differences.
[6] People assume that they perceive the issue objectively, carefully considering it from multiple viewpoints, while the other side processes information in a top-down fashion.