"Humans are a most illogical and fascinating species"
We're none of us perfect. As a species, we hold an incredible number of biases, many of which are entirely illogical.
In this post, I'll outline a few common biases. You might want to keep these in mind when developing your online communications.
The False Consensus Effect
We tend to assume that other people share our beliefs, even where evidence suggests otherwise.
This effect is especially powerful in group settings. An insular group of people tends to assume — wrongly — that the world at large shares its collective beliefs.
The group has reached a consensus. And since they are isolated from the general population, they rarely encounter people with differing beliefs. So they believe everyone thinks the same way.
There's a rather dark extension to the false consensus effect: When shown that some people don't in fact share their group's beliefs, members of the group tend to view the "non-believers" as somehow defective or inferior.
System Justification Theory
We not only want to feel good about ourselves (ego-justification) and our groups (group-justification), but also about our overall social order. We therefore tend to support and defend the status quo, i.e. to perceive it as positive and legitimate.
Existing social, economic, and political systems are preferred. Alternatives are disparaged.
Most interestingly, this view is not limited to citizens who benefit from the current system. It also applies to those who are disadvantaged by it (e.g. minorities and the poor).
Being the most dependent on the existing social order, the disadvantaged want and need to believe it is fair and just. So they'll support the status quo, even against their own best interests.
I saw a great example of this when Barack Obama was rallying for health care reform. A protester was interviewed. It turned out she herself didn't have health insurance and would have benefited from reform. Yet she remained rabidly opposed to reform.
The Von Restorff Effect
If something "sticks out like a sore thumb", we're not only more likely to notice it, but we're more likely to remember it.
The item can be made to stand out by graphical treatment (for example, highlighted by color). Or, it can be made to stand out by its very nature. Something that's funny or just plain bizarre is more memorable.
Lake Wobegon Effect
Lake Wobegon is a fictional town in which "all the children are above average".
Logically, not everyone can be above average. Yet there's an overwhelming human tendency to overestimate one's own capabilities in comparison to others.
You've probably noticed how everyone claims to be above average in many ways, including:
- Parenting skills
- Sense of humor
- Lovemaking skills
- Driving ability
The Pseudocertainty Effect
We tend to prefer risk-averse choices if the expected outcome is positive, but will make risk-seeking choices to avoid a negative outcome.
Much depends simply on how the choice is worded. For example, consider the hypothetical scenario* in which "an epidemic is likely to kill 600 people if left untreated".
Test subjects were told that:
- Treatment A will save 200 people, and
- Treatment B has a 1/3 chance of saving all 600 people and a 2/3 chance of saving nobody.
An overwhelming majority opted for Treatment A, which guaranteed a positive outcome for 200 people. (Even though it meant 400 people would surely die.)
A second group of test subjects was given the same data, but phrased in a negative way, i.e.:
- Under Treatment A, 400 people will die
- Under Treatment B, there is a 1/3 chance that nobody will die and a 2/3 probability that all 600 will die.
In this scenario, the overwhelming majority chose Treatment B. They were willing to risk the possibility of a larger negative outcome, to avoid the guaranteed negative outcome of 400 people dying.
Note that the two scenarios are exactly the same. They're just worded differently. Yet the respondents reached exactly opposite conclusions. So be careful how you word your options!
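That equivalence is easy to verify with a quick expected-value calculation. A minimal sketch (variable names are mine, not from the study):

```python
# Expected number of survivors (out of 600) under each treatment,
# computed for both framings of the epidemic scenario.

TOTAL = 600

# Positive framing: lives saved.
a_positive = 200                              # Treatment A saves 200 for certain
b_positive = (1/3) * 600 + (2/3) * 0          # Treatment B: 1/3 chance all are saved

# Negative framing: deaths, converted back to survivors.
a_negative = TOTAL - 400                      # Treatment A: 400 die for certain
b_negative = (1/3) * (TOTAL - 0) + (2/3) * (TOTAL - 600)

print(a_positive, b_positive)   # both framings of A: 200 expected survivors
print(a_negative, b_negative)   # both framings of B: 200 expected survivors
```

Every option, in every framing, has the same expected outcome of 200 survivors; only the wording shifts the choice.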
There are, of course, many more biases. If there's sufficient interest, I'll outline more biases in upcoming posts.
* Tversky, Amos, and Kahneman, Daniel. "The Framing of Decisions and the Psychology of Choice." Science 211 (1981), pp. 453–458. Copyright 1981 by the American Association for the Advancement of Science.