By Chris Satullo
Here’s a sad fact of human existence: We are deeply irrational beings. Me, you, all of us.
Some of us, though, are just a bit more aware of how irrational we can be; we throw up occasional defenses against the onslaught of unreason.
Then there are those who don camouflage, grab firearms and march to state capitols to demand an end to social distancing, imagining themselves the heirs to revolutionary Boston’s Sons of Liberty. Sadly, they’re more like the poor souls who bought snake oil from a Wild West huckster.
Ever hear of the Dunning-Kruger Effect?
It’s a maxim of cognitive psychology, borne out in a number of studies. Simply put: The less you know about something, the less likely you are to recognize how little you know. The more incompetent you are at something, the less likely you are to realize how incompetent you are.
Dunning-Kruger is a twist on a broader human flaw known as “the overconfidence effect.” This is the syndrome mocked by public radio’s Garrison Keillor in his famous description of his fictional town, Lake Wobegon: “where all the women are strong, all the men are good-looking, and all the children are above average.”
We’re all prone to this fallacy. I’m probably a worse driver, golfer and Scrabble player than I fancy myself. One survey of profs at a Nebraska university found that 68 percent of them ranked themselves in the top 25 percent for teaching ability. OK, sure.
But the Dunning-Kruger Effect does have a flip side. In studies of the syndrome, people who are good at a complex task tend to underrate where they stand on the competence scale. They feel they have a lot of room to improve and assume others are ahead of them.
In lay terms, the more they know, the more they recognize how much more they need to know before they can claim mastery.
They are, in other words, like Dr. Anthony Fauci — and all the other skilled health practitioners who are scrambling as fast as they can to decipher the cloaked, clever and lethal adversary known as COVID-19.
You can detect this worry (“I’m still fundamentally in the dark”) in the pained expressions and elaborately couched language that the true experts use when trying to convey COVID-19 information to a panicky public.
They know they need to give folks some counsel to go on. But they also know what they are saying in all good faith today could be altered, even reversed, by a new data set, a new clinical finding, tomorrow. That doesn’t make them liars; it makes them scientists.
They also know that whatever they say will soon be brutally attacked and perhaps fatally misconstrued by people deep in the grips of the Dunning-Kruger Effect.
Including one commander-in-chief.
It’s pertinent to note here that a study conducted after the 2016 election and published in 2018 found that, when it comes to the grasp of public issues, the Dunning-Kruger Effect intensifies when people are given cues that remind them of their partisan identities.
Again, to put it in lay terms: Donald Trump makes us all dumber.
Not just those who buy his dangerously ignorant, narcissistic bluster. Those of us who despise him also risk missing key points and undervaluing useful ideas because he or someone in his camp mentioned them. (Hydroxychloroquine, for example, might eventually prove to be a part of a useful COVID-19 treatment regimen for some patients, but a deeply angry part of me still roots, illogically, for that not to happen.)
Modern cognitive psychology (which, by the way, blasts to smithereens the classical economics upon which Republican free market orthodoxy depends) builds upon groundbreaking work done in the late 20th century by the psychologists Daniel Kahneman and Amos Tversky.
Thinking, Fast and Slow, a 2011 book by the Nobel laureate Kahneman (Tversky died in 1996), documents many of the logical fallacies that we see cavorting perilously across the world stage today. In the book, Kahneman laid out a metaphor for how our minds work, speaking of System 1 and System 2.
System 1 acts lightning fast, trying to keep us alive by helping us make swift sense of the data cascading around us (“Red light!”), fitting it into a familiar narrative. It does so by means of mental shortcuts or rules of thumb. These “heuristics,” as Kahneman called them, often serve us well. But they can also hurl us deep into illogic when a situation is more complicated than System 1 can handle.
Like an internal GPS, System 2 should kick in to help us navigate those situations too complex for System 1 to grasp. (“My usual route is blocked due to a downed tree; what’s my best alternate route to work?”)
In a lot of us, though, System 2 is a bit lazy. One of System 1’s persistent tricks is to cajole System 2: “Relax. No need for you to get involved. I’ve got just the shortcut to figure this out.” Even when the shortcut is pathetically or dangerously inappropriate to the situation.
Racial bias is one of those shortcuts.
Confirmation bias – locking in on information that supports what you want to believe, while suppressing data that questions it – is another. (“The number of deaths is down this week, so obviously it’s safe to go back to work.”)
Two cousin fallacies now very much at play are availability and recency bias – overvaluing some information you happen to possess or some story you just heard: “I know someone who had it and it was just like a mild flu; she’s totally fine now. We’re completely overreacting to this thing.”
Or, “Last night on Fox [or CNN or NPR] they were saying…so I think it’s pretty clear that…”
Again, System 1 is not always wrong. Far from it. For example, these days, System 1 alerts me, when I see a young, mask-less runner chugging towards me, to steer clear of his path by 12 feet. That’s good. But what if, once this crisis calms, System 1 developed a mental shortcut telling me to view all young men as threats to my existence, to be shunned and disdained? I’d need System 2 to kick in and help me think through the complexities of human interaction a little more wisely.
Coming into focus now for researchers is one aspect of what makes COVID-19 so lethal: It can induce in your body what’s called a “cytokine storm.” That is, your body’s immune system feels so threatened by the virus that it goes on tilt. In this runaway mode, your immune system can actually damage vital organs and get in the way of what medical personnel are doing to save you.
Kahneman’s System 1 is a little like the brain’s immune system. It triggers reflexively to help us spot and deal with imminent threats. However, if its quick-twitch shortcuts overwhelm our brains and keep System 2 from doing its job, we can make decisions that might kill us.
Like ending social distancing too soon.
These “Liberate!” rallies at state capitols are a warning sign of a cytokine storm brewing in our body politic.
Still, don’t judge the people waving signs on capitol steps harshly; they are just succumbing to System 1’s innate flaws, traps of illogic to which we all fall prey from time to time.
I reserve my rage and contempt for the invisible puppeteers who are funding and fomenting these protests. They aren’t in the grips of logical flaws. They are driven by the moral flaw of greed. They’re pressing for the economy to gear up prematurely, just so they can get back to their habit of making 100 times more money per day than the rest of us – no matter how many people it kills.
Don’t let them succeed. Keep System 2 at the ready.
—
Chris Satullo, a civic engagement consultant, is a former editorial page editor/columnist at The Philadelphia Inquirer, and a former vice president/news at WHYY public media in Philadelphia.
Is this possibly an example of the Dunning-Kruger Effect? “I reserve my rage and contempt for the invisible puppeteers who are funding and fomenting these protests. They aren’t in the grips of logical flaws. They are driven by the moral flaw of greed. They’re pressing for the economy to gear up prematurely, just so they can get back to their habit of making 100 times more money per day than the rest of us – no matter how many people it kills.”
How do we know what they are driven by or what their motivations are, or is this an example of confirmation bias?
As always, a great piece. Thanks, Chris.
What he said!
But for those daunted by “Thinking, Fast and Slow,” an ideal (and entertaining) introduction to the life and thought of Kahneman and Tversky is Michael Lewis’ “The Undoing Project.” What Chris has done here is an excellent application of their thought to our current public situation.