August 28, 2024

Here’s a wild turn of events. In a widely shared essay, the philosopher Quassim Cassam argued that people believe false things—such as harmful and extremist conspiracy theories—because of their intellectual character flaws. He listed a bunch: gullibility, closed-mindedness, negligence… These, he said, are what cause faulty beliefs. He offered as an example “the ‘hot hand’ in basketball. The idea is that when a player makes a couple of shots he is more likely to make subsequent shots. Success breeds success.”
Cassam explained how social scientists used “detailed statistical analysis to demonstrate that the hot hand doesn’t exist—performance on a given shot is independent of performance on previous shots. The question is, why do so many basketball coaches, players and fans believe in it anyway?” When the scientists sent their findings around to basketball coaches:
One responded: ‘Who is this guy? So he makes a study. I couldn’t care less.’ This seems like a perfect illustration of intellectual vices in operation. The dismissive reaction manifested a range of vices, including closed-mindedness and prejudice. It’s hard not to conclude that the coach reacted as he did because he was closed-minded or prejudiced… A less closed-minded coach might well have reacted completely differently to evidence that the hot hand doesn’t exist.
Here’s what makes this story fascinating: More recent and robust analysis of the same data, and of data from other sports, seems to show that the coaches weren’t just closed-minded; they were… right!
It was the scientists who said the hot hand was a myth who appear to have made the mistake.
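For readers who want the statistical details: the reversal is generally credited to the economists Joshua Miller and Adam Sanjurjo, who showed that the original study’s benchmark was subtly biased. In any finite sequence of independent shots, the proportion of hits that immediately follow a streak of hits is expected to fall below the shooter’s true hit rate, so comparing it against that rate makes a genuine hot hand vanish. The short Python sketch below simulates fair coin flips to illustrate the bias; the sequence length, streak length, and function name are illustrative choices of mine, not anything taken from the studies themselves.

```python
import random

# A minimal simulation of the streak-selection bias identified by Miller and
# Sanjurjo. Each "shot" here is an independent fair coin flip, so there is no
# hot hand by construction. Even so, the average within-sequence proportion of
# hits that immediately follow three straight hits lands below 50%.

def prop_hit_after_streak(shots, streak=3):
    """Proportion of shots that follow `streak` consecutive hits and are hits.

    Returns None if no shot in the sequence is preceded by such a streak.
    """
    following = [shots[i] for i in range(streak, len(shots))
                 if all(shots[i - j] for j in range(1, streak + 1))]
    return sum(following) / len(following) if following else None

random.seed(0)
proportions = []
for _ in range(50_000):
    shots = [random.random() < 0.5 for _ in range(100)]  # 100 fair "shots"
    p = prop_hit_after_streak(shots)
    if p is not None:
        proportions.append(p)

print(f"Average proportion of hits after 3 straight hits: "
      f"{sum(proportions) / len(proportions):.3f}")
# Prints roughly 0.46, not the 0.50 benchmark the original analysis assumed.
```

Because the fair benchmark itself sits below 50% for independent flips, players who merely hit half their shots after a streak were in fact outperforming chance, which is the direction of the corrected finding.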
This suggests that it was a valid strategy for the coaches to be highly confident in something they’d spent many thousands of hours watching happen. And a more open-minded coach who gave up on the idea of the hot hand because of the original study would then have made coaching mistakes (like pulling a player for their regular rest even though they seemed to be on fire).
With the benefit of the new evidence, it seems some scientists and philosophers were too quick to dismiss the coaches’ experience. After all, they criticized the coaches’ intellectual character when that character might actually have led them to make better decisions in this case.
The lesson I’d draw here is this: it’s hard to determine what counts as a reasonable way to think versus an unreasonable “intellectual vice.” Our judgments are often relative—what seems true in one moment turns out to be false later, once new information arrives. And concluding that someone’s character is flawed carries costs for your relationship with them. Do you need to be making this harsh analysis of someone? Does it help with your goals for the relationship? Is it even true?
I’ve had the experience of sharing evidence that later turned out to be dubious. In my book Are We Done Fighting? I cited the behavioural economist Dan Ariely several times. Serious questions have since arisen about whether he faked some of his studies.
One reviewer of my book wrote (in what I must say was a very positive review!) that I hedged too often. “I want an author to convey his ideas to me in powerful and concise language, even if he’s wrong.” I get that preference. But the Dan Ariely issue is exactly why I don’t write that way.
Personally, I try to match my tone to my evidence. If I don’t know something from personal experience or extremely thorough evidence, I’ll say “seems to be…” or “my understanding is…” or “right now I’m thinking…” These kinds of phrases give a nod to how much my positions may change, and to how much I don’t understand or know. Usually, it’s a lot!
But if I were a basketball coach who’d seen millions of shots taken, maybe I’d feel justified in being highly confident in my assessment of when a shooter has the hot hand. I might feel so confident that I could readily dismiss (alleged) evidence that the hot hand is a myth.
My strategy of making qualified arguments might come at a cost. Studies suggest that many of us trust someone more when they’re firm and unequivocal on moral issues (even when they don’t walk the talk), or when they take a strong stand on social issues, even when we disagree with them.
This isn’t a post about why we shouldn’t believe scientists. Scientists come up with theories (the hot hand doesn’t exist), and new evidence then reverses those theories, at least for the time being (the hot hand actually does exist). This is science doing what it’s meant to do.
But it does highlight that what counts as an “intellectual flaw” is relative and fickle. All of us likely believe some things that will later turn out to be incorrect. So some degree of curiosity and intellectual humility (the belief not that you are wrong, but that you might be) is valuable. But we all have to decide where to trust ourselves and our experiences—as the coaches and players who kept believing in the hot hand did, and as many people who hold harmful and extremist beliefs would be much better off not doing.
This post originally appeared on Psychology Today. You can learn much more about how beliefs drive conflicts, and about practical ways to respond, by taking our free online workshop.