In his paper "The Believing Game," Peter Elbow proposes that in addition to the traditional "Doubting Game" (skeptical thinking) we should also engage in the "Believing Game," which involves understanding an argument by provisionally accepting it as true. He argues that while the skeptical, scientific method of searching for flaws is valuable, it needs to be augmented by an accepting method that searches for virtues. The Doubting Game has dominated because of its usefulness, but it can lead us to nurture blind spots that are protected by our skepticism. By accepting the arguments of someone whose position we dislike, we can potentially find flaws in our own thinking.
An important point Elbow makes is that there are two levels at which we have to look at arguments: the logic behind the argument and the actual position being argued for or against. It is possible to make a flawed argument for a good idea and a sound argument for a bad idea. The Doubting Game searches for flaws in the argument itself, while the Believing Game examines the underlying position the argument is trying to establish.
[Table: a side-by-side comparison of the Doubting Game and the Believing Game]
Within the sciences there are many cases where adopting a believing mindset ultimately proved useful and led to paradigm shifts: the switch from an Earth-centered cosmos to one in which the Earth orbits the Sun, the replacement of miasma theory by germ theory, and the acceptance of plate tectonics, among other examples. But Elbow says it is also helpful to play the Believing Game even when you ultimately don't accept the underlying premise, because without the suspension of disbelief you won't fully understand what's being proposed: "They may seem wrong or crazy–they may be wrong or crazy–but nevertheless they may still be able to see something that none of us can see."
Graff & Birkenstein turn the Believing Game into a concrete exercise:
To get a feel for Peter Elbow's "believing game," write a summary of some belief that you strongly disagree with. Then write a summary of the position that you actually hold on this topic. Give both summaries to a classmate or two, and see if they can tell which position you endorse. If you've succeeded, they won't be able to tell.
Hedgehogs and Foxes
Besides paradigm shifts, this also makes me think of an article in The Atlantic about why experts tend to make horrible predictions: specialists who hold to a single view and bend conflicting information to fit it (Hedgehogs) are outperformed by cross-disciplinary generalists who take in information from other experts and update their prior beliefs when conflicting evidence comes in (Foxes).
Einstellung and Shoshin
Now I'm straying way into left field, but thinking of the Hedgehogs and Foxes puts me in mind of the Einstellung effect, wherein people become stuck trying to apply a familiar solution even when it is no longer applicable, whereas those lacking experience with the problem can sometimes see possibilities that are not as obvious to the experienced. The point of reading Elbow's paper was to get an idea of how to write effective summaries of other people's work, but instead I think I've talked myself into believing it's a way to keep a "beginner's mind" (shoshin).