The excerpts below are from the article "What’s the evidence on using rational argument to change people’s minds?" by Tom Stafford:
"...people didn’t change their minds in the direction of the arguments presented to them, far from it. Rather, people who had pro-death penalty views found flaws and biases in the anti-death penalty studies, and vice versa. The participants in the experiment ended up with more extreme views than they started with - the pro- people becoming more pro and the anti- becoming more anti. This 'biased assimilation effect', whereby we only believe evidence that fits with what we already believe, is no historical artefact...For public issues...it will never be clear what the right answer is...It is not really surprising that their views can’t be dislodged with a few choice anecdotes. Who’d want opinions if they were shifted by the slightest counter-argument..."
"...when people have low involvement in an argument, neither the strong or weak arguments were persuasive. People’s minds were made up, and no argument shifted them. But in the high involvement condition both the strong and weak arguments had a significant effect....in the domain of moral arguments, strong arguments were only persuasive if people were given some deliberation time before being forced to answer."
In an experiment mentioned in the article, half of the participants were asked to give reasons why they felt the way they did about an issue, and the other half were asked to explain how the policy would produce its effects.
"Both groups then re-rated their position for or against the policy and these 'after' scores were compared with the 'before' scores. The 'reasons' group didn’t shift their views at all, remaining just as entrenched in their positions, for or against, as when they started the experiment. The 'explanations' group did change.....when we are asked to provide explanations for how we think the world works, some of that illusion evaporates, undermining our previous certainty."
People don't change deeply held values or beliefs because we give them isolated facts that support our own values, beliefs, or political and moral views. People change when given the chance to engage in their own thinking: explaining, imagining, and reasoning from a systems view, beyond a list of facts or reasons. If we increase involvement and engagement, rather than remaining detached spectators collecting facts that confirm our current beliefs, and instead explain how something works or imagine how it could work, we can begin to see the loopholes in our own belief systems and adjust.