Why You Can't Replace Myths with Good Information Part 1

or Why You Can't Change People's Preconceived Ideas No Matter What the Facts Say

How do you know when you have lost an argument? You move from respectful, thoughtful debate into clichés and labels (e.g., x-phobic, fascist, commie), personal attacks (e.g., "I can't reason with you because you are a tool of management" or "part of the conspiracy"), or a totally irrelevant argument that changes the subject ("Well, I don't see what that has to do with the price of tea in China"). The funny thing is, even when you have presented the other person with the facts and they cannot win, they STILL stubbornly hold onto their belief. Why is that?

In an article I wrote some time ago on selecting a candidate, I received some angry e-mails wondering how I could write that Sarah Palin and Barack Obama were (from a talent management perspective) roughly equal in terms of experience. The gist of the e-mails was "how dare I write that" and "do I know what her positions are on x, y, and z?" Well, the article was not about positions but about experience. No matter what I said or what brilliant facts I presented, it would not have changed a single person's mind. You cannot argue facts against beliefs. In all truthfulness, the article was meant to be somewhat provocative, and it leads me to the problem that many of us face: how do we change minds? How do we, in a sense, create sustainable change, both organizationally and personally? We have all been greeted by a new initiative with a yawn and a "here comes the next flavor of the month" comment. We know that people hold myths and ideas that, the more we try to fight them, the more tightly they cling to them. In fact, we are often frustrated because we cannot replace myths with good information. So, what do we do?

This article, originally published in the Washington Post and written by staff reporter Shankar Vedantam, has profound implications for organizational development in general and learning and development specifically. The article, reprinted in its entirety below, discusses a University of Michigan study on the difficulty of changing people's minds. This study, along with the work of Dr. Jeffrey Schwartz of the UCLA medical school on how the mind operates, gives us real insight into changing beliefs.

Persistence of Myths Could Alter Public Policy Approach

The federal Centers for Disease Control and Prevention recently issued a flier to combat myths about the flu vaccine. It recited various commonly held views and labeled them either "true" or "false." Among those identified as false were statements such as "The side effects are worse than the flu" and "Only older people need flu vaccine."

When University of Michigan social psychologist Norbert Schwarz had volunteers read the CDC flier, however, he found that within 30 minutes, older people misremembered 28 percent of the false statements as true. Three days later, they remembered 40 percent of the myths as factual.

Younger people did better at first, but three days later they made as many errors as older people did after 30 minutes. Most troubling was that people of all ages now felt that the source of their false beliefs was the respected CDC.

The psychological insights yielded by the research, which has been confirmed in a number of peer-reviewed laboratory experiments, have broad implications for public policy. The conventional response to myths and urban legends is to counter bad information with accurate information. But the new psychological studies show that denials and clarifications, for all their intuitive appeal, can paradoxically contribute to the resiliency of popular myths.

This phenomenon may help explain why large numbers of Americans incorrectly think that Saddam Hussein was directly involved in planning the Sept. 11, 2001, terrorist attacks, and that most of the Sept. 11 hijackers were Iraqi. While these beliefs likely arose because Bush administration officials repeatedly tried to connect Iraq with Sept. 11, the experiments suggest that intelligence reports and other efforts to debunk this account may in fact help keep it alive. Similarly, many in the Arab world are convinced that the destruction of the World Trade Center on Sept. 11 was not the work of Arab terrorists but was a controlled demolition; that 4,000 Jews working there had been warned to stay home that day; and that the Pentagon was struck by a missile rather than a plane.

Those notions remain widespread even though the federal government now runs Web sites in seven languages to challenge them. Karen Hughes, who runs the Bush administration's campaign to win hearts and minds in the fight against terrorism, recently painted a glowing report of the "digital outreach" teams working to counter misinformation and myths by challenging those ideas on Arabic blogs.

A report last year by the Pew Global Attitudes Project, however, found that the number of Muslims worldwide who do not believe that Arabs carried out the Sept. 11 attacks is soaring -- to 59 percent of Turks and Egyptians, 65 percent of Indonesians, 53 percent of Jordanians, 41 percent of Pakistanis and even 56 percent of British Muslims.

Research on the difficulty of debunking myths has not been specifically tested on beliefs about Sept. 11 conspiracies or the Iraq war. But because the experiments illuminate basic properties of the human mind, psychologists such as Schwarz say the same phenomenon is probably implicated in the spread and persistence of a variety of political and social myths.

The research does not absolve those who are responsible for promoting myths in the first place. What the psychological studies highlight, however, is the potential paradox in trying to fight bad information with good information.

Schwarz's study was published this year in the journal Advances in Experimental Social Psychology, but the roots of the research go back decades. As early as 1945, psychologists Floyd Allport and Milton Lepkin found that the more often people heard false wartime rumors, the more likely they were to believe them.

The research is painting a broad new understanding of how the mind works. Contrary to the conventional notion that people absorb information in a deliberate manner, the studies show that the brain uses subconscious "rules of thumb" that can bias it into thinking that false information is true. Clever manipulators can take advantage of this tendency.


Click here to read Part 2 of How Myths Affect Thinking