Rating: 7.8/10.
The Scout Mindset: Why Some People See Things Clearly and Others Don’t by Julia Galef
A lot of psychological research has shown that we’re prone to cognitive biases, but much less has been written about how to overcome them. Many of these biases come from settling on an opinion and holding onto it: even when faced with contrary evidence, we find a way to fit that evidence to our side. Julia Galef calls this the “soldier mindset”, because you behave like a soldier defending your position against enemy attacks. In contrast, the scout mindset is a willingness to readily change your mind, and adopting it leads to a more truthful view of reality.
This is easier said than done, of course: intelligence and education don’t give you automatic protection against the soldier mindset. One study found that more educated and scientifically literate Republicans are less likely to believe in climate change. Julia herself describes many instances where she fell into the soldier mindset and only realized it later on.
The soldier mindset is our default mode of thinking, and it actually offers some nontrivial advantages: it makes us look more confident, feel better about our situation, fit in with a social group, and so on. However, Julia argues that an accurate picture of reality is worth more, because it’s what lets us make the right decisions. Moreover, you don’t have to give up those advantages by adopting the scout mindset: you can motivate yourself by knowing that you’re taking the best available action, rather than by lying to yourself. And others’ perceptions of your confidence have more to do with “social confidence”, like posture and speaking loudly, than with “epistemic confidence”, which is claiming more certainty than the situation warrants.
The best sign of a scout mindset is how often you admit you were wrong; if you rarely do, you’re probably a soldier without realizing it. This doesn’t have to be an embarrassing and painful experience: it’s better to think of it as an incremental update to your beliefs, which should be continuously updating anyway. Julia also recommends thought experiments for catching cases where you might be biased, such as checking whether you’re applying a double standard or treating the status quo differently from how you’d treat an alternative.
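The “incremental update” framing reads as roughly Bayesian, so here is a toy sketch of what that looks like (my own illustration under that assumption, not an example from the book): each piece of contrary evidence nudges a belief by a modest amount instead of forcing an all-or-nothing reversal.

```python
# Toy illustration (not from the book): a change of mind as a series of
# small Bayesian updates rather than a single painful reversal.

def update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Posterior probability of a belief after one piece of evidence (Bayes' rule)."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

belief = 0.70  # start out fairly confident
# Two observations, each twice as likely if the belief is false:
for _ in range(2):
    belief = update(belief, p_evidence_if_true=0.3, p_evidence_if_false=0.6)
    print(round(belief, 2))  # 0.54, then 0.37 -- a gradual slide, not a flip
```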
We tend to dig in and defend our beliefs when we make them part of our identity, and debating the other side then polarizes us even further. It’s better to hold your identity lightly: even if you strongly believe in one side, you should know the other side’s arguments well enough to pass as one of them in an “ideological Turing test”. This also makes you a better debater, because you can approach their position from common ground, but of course you should be ready to switch sides if they present compelling evidence.
Overall, the ideas in this book were fairly familiar to me, but it was still an entertaining read. The insidious thing about the soldier mindset is that you rarely notice it while you’re in it; in the moment, you always feel like a scout. How often you change your mind seems like a good external indicator, though. The complexity of the real world also makes it impractical to apply full scrutiny to everything, so we’re forced to take (possibly biased) mental shortcuts: a 100% scout mindset is probably unattainable, but we should strive towards it as much as we can.