Last month, The Guardian reported that the mantra of the Brexit campaigners was “Facts don’t work… You have got to connect with people emotionally.”
MSNBC’s Maddow Show contributors also fretted that this fact-light style of political messaging was spreading to Europe from the United States. But emotion-heavy, data-light appeals have long been part of British and continental European politics: propaganda isn’t an American invention!
Is this really what we’re exporting? Assurances that ‘facts don’t work,’ ‘the Trump success’ is a model worthy of emulation, and subject-matter experts are better left ignored? —Steven Benen
I invested eight years in learning the art and science of informing people. Technical communication is about effective information; rhetoric is about effective persuasion. The two fields are so closely related that I could take coursework in both within the same program, from the same cohort of instructors, yet they are distinct.
Both fields teach us to handle information with respect and never to use information structures to mislead, obscure, or harm. There’s something deeply unsettling about graduating from such a program into a public sphere that does not seem to value “facts.”
But there are psychological reasons why people prefer pathos (emotional appeals) and ethos (credibility markers) to logos (logical argument, data, and fact), and it’s probably not going to change.
On the blog You Are Not So Smart, David McRaney describes several ways our intuitions and habits of thought betray us. The Backfire Effect is one such betrayal based on the fact that we usually treat our beliefs like prized property and guard them more carefully than we do our own homes.
Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect helps make you less skeptical of those things which allow you to continue seeing your beliefs and attitudes as true and proper.—David McRaney
This is why the Washington Post ended its column debunking internet rumors: readers absorbed only the initial misinformation. Clarifying myths or correcting partial truths only seemed to entrench people’s commitment to stories and claims later proven wrong. Facts weren’t working, and misinformation sticks.
Psychologist and writer M. Scott Peck wrote about this in his own way decades ago. In The Road Less Traveled, Peck explains how we set up homeland security not only around single beliefs but also around our larger belief systems:
Our view of reality is like a map with which to negotiate the terrain of life. If the map is true and accurate, we will generally know where we are, and if we have decided where we want to go, we will generally know how to get there. If the map is false and inaccurate, we generally will be lost. While this is obvious, it is something that most people to a greater or lesser degree choose to ignore. They ignore it because our route to reality is not easy. First of all, we are not born with maps; we have to make them, and the making requires effort. The more effort we make to appreciate and perceive reality, the larger and more accurate our maps will be. But many do not want to make this effort. Some stop making it by the end of adolescence. Their maps are small and sketchy, their views of the world narrow and misleading. By the end of middle age most people have given up the effort. They feel certain that their maps are complete and their Weltanschauung is correct (indeed, even sacrosanct), and they are no longer interested in new information. It is as if they are tired. Only a relative and fortunate few continue until the moment of death exploring the mystery of reality, ever enlarging and refining and redefining their understanding of the world and what is true.—M. Scott Peck
Peck argues that we have to be “dedicated to reality” in order to stay healthy, to make sound decisions, and to make progress that benefits us and others. How we relate to facts makes all the difference: we stay healthy when we relate to reality continuously, not by establishing a once-for-all-time relation as if the world were a closed, fixed system. And yet we demonstrate the backfire effect as we resist new facts and undermine the reform that new information often requires of us.
People working on issues like climate change are taking this research on board: rather than simply offering people “the facts,” they are sharing stories that allow members of the public to engage emotionally, connecting the issues to people’s values and priorities instead of relying only on charts and graphs. LGBTQ organizers are doing likewise.
Why did Britain’s Remain campaign not do that?
When you have opportunities to inform and persuade others in your own career, how can you engage people with more than the raw facts?