Biased Assimilation Effect (Going to Extremes: How Like Minds Unite and Divide)

It is well known that when like-minded people get together, they tend to end up thinking a more extreme version of what they thought before they started to talk. The same kind of echo-chamber effect can happen as people get news from various media. Liberals viewing MSNBC or reading left-of-center blogs may well end up embracing liberal talking points even more firmly; conservative fans of Fox News may well react in similar fashion on the right.

The result can be a situation in which beliefs do not merely harden but migrate toward the extreme ends of the political spectrum. As current events in the Middle East demonstrate, discussions among like-minded people can ultimately produce violence.

The remedy for such polarization, here and abroad, may seem straightforward: provide balanced information to people on all sides. Surely, we might speculate, such information will correct falsehoods and promote mutual understanding. This, of course, has been a hope of countless dedicated journalists and public officials.

Unfortunately, evidence suggests that balanced presentations — in which competing arguments or positions are laid out side by side — may not help. At least when people begin with firmly held convictions, such an approach is likely to increase polarization rather than reduce it.

Indeed, that’s what a number of academic studies done over the last three decades have found. Such studies typically proceed in three stages. First, the experimenters assemble a group of people who have clear views on some controversial issue (such as capital punishment or sexual orientation). Second, the study subjects are provided with plausible arguments on both sides of the issue. And finally, the researchers test how attitudes have shifted as a result of exposure to balanced presentations.

You might expect that people’s views would soften and that divisions between groups would get smaller. That is not what usually happens. On the contrary, people’s original beliefs tend to harden and the original divisions typically get bigger. Balanced presentations can fuel unbalanced views.

What explains this? The answer is called “biased assimilation,” which means that people assimilate new information in a selective fashion. When people get information that supports what they initially thought, they give it considerable weight. When they get information that undermines their initial beliefs, they tend to dismiss it.

In this light, it is understandable that when people begin with opposing initial beliefs on, say, the death penalty, balanced information can heighten their initial disagreement. Those who tend to favor capital punishment credit the information that supports their original view and dismiss the opposing information. The same happens on the other side. As a result, divisions widen.

This natural human tendency explains why it’s so hard to dislodge false rumors and factual errors. Corrections can even be self-defeating, leading people to a stronger commitment to their erroneous beliefs.

A few years ago, for example, both liberals and conservatives were provided with correct and apparently credible information showing that the George W. Bush administration was wrong to think that Iraq had an active unconventional weapons program. After receiving the correct information, conservatives became even more likely to believe that Iraq had such weapons and was seeking to develop more.

The news here is not encouraging. In the face of entrenched social divisions, there’s a risk that presentations that carefully explore both sides will be counterproductive. And when a group, responding to false information, becomes more strident, efforts to correct the record may make things worse.

Can anything be done? There is no simple term for the answer, so let’s make one up: surprising validators.

People tend to dismiss information that would falsify their convictions. But they may reconsider if the information comes from a source they cannot dismiss. People are most likely to find a source credible if they closely identify with it or begin in essential agreement with it. In such cases, their reaction is not, “how predictable and uninformative that someone like that would think something so evil and foolish,” but instead, “if someone like that disagrees with me, maybe I had better rethink.”

Our initial convictions are more apt to be shaken if it’s not easy to dismiss the source as biased, confused, self-interested or simply mistaken. This is one reason that seemingly irrelevant characteristics, like appearance, or taste in food and drink, can have a big impact on credibility. Such characteristics can suggest that the validators are in fact surprising — that they are “like” the people to whom they are speaking.

It follows that turncoats, real or apparent, can be immensely persuasive. If civil rights leaders oppose affirmative action, or if well-known climate change skeptics say that they were wrong, people are more likely to change their views.

Here, then, is a lesson for all those who provide information. What matters most may be not what is said, but who, exactly, is saying it.

Cass R. Sunstein, a law professor at Harvard and the author of “Going to Extremes: How Like Minds Unite and Divide,” was until last month the administrator of the White House Office of Information and Regulatory Affairs.