
"The Scout Mindset: Why Some People See Things Clearly and Others Don't"

by Julia Galef

After reading this book in August of 2021, I wrote,

 

"This book kept pulling me in...one good idea after another...it's about the practical joys of prioritizing truth."


See all my book recommendations.  


Here are the selections I made:


As the late physicist Richard Feynman once said, “The first principle is that you must not fool yourself—and you are the easiest person to fool.”

 

Even if you’ve never heard the phrase motivated reasoning, I’m sure you’re already familiar with the phenomenon. It’s all around you under different names—denial, wishful thinking, confirmation bias, rationalization, tribalism, self-justification, overconfidence, delusion. Motivated reasoning is so fundamental to the way our minds work that it’s almost strange to have a special name for it; perhaps it should just be called reasoning.

 

Picquart’s process of coming to realize that Dreyfus was innocent is a striking example of what cognitive scientists sometimes call accuracy motivated reasoning. In contrast to directionally motivated reasoning, which evaluates ideas through the lenses of “Can I believe it?” and “Must I believe it?,” accuracy motivated reasoning evaluates ideas through the lens of “Is it true?”

 

Being in scout mindset means wanting your “map”—your perception of yourself and the world—to be as accurate as possible.

 

Of course, all maps are imperfect simplifications of reality, as a scout well knows. Striving for an accurate map means being aware of the limits of your understanding, keeping track of the regions of your map that are especially sketchy or possibly wrong. And it means always being open to changing your mind in response to new information. In scout mindset, there’s no such thing as a “threat” to your beliefs.

 

If you find out you were wrong about something, great—you’ve improved your map, an...


 

Scout mindset is what keeps you from fooling yourself on tough questions that people tend to rationalize about, such as: Do I need to get tested for that medical condition? Is it time to cut my losses or would that be giving up too early? Is this relationship ever going to get better? How likely is it that my partner will change their mind about wanting children?

 

Scout mindset is what prompts us to question our assumptions and stress-test our plans.

 

In our relationships with other people, we construct self-contained narratives that feel, from the inside, as if they’re simply objective fact. One person’s “My partner is coldly ignoring me” can be another person’s “I’m respectfully giving him space.” One person’s “authentic” can be another person’s “rude.” To be willing to consider other interpretations—to even believe that there could be other reasonable interpretations besides your own—requires scout mindset.

 

Being the kind of person who welcomes the truth, even if it’s painful, is what makes other people willing to be honest with you.

 

You can say that you want your partner to tell you about any problems in your relationship, or that you want your employees to tell you about any problems in the company, but if you get defensive or combative when you hear the truth, you’re not likely to ...


 

First, we have to start by taking the soldier seriously. Why is soldier mindset so often our default? What makes it so tenacious? Or, put differently, if scout mindset is so great, why isn’t everyone already using it all the time? That’s the subject of the next chapter: What the Soldier Is Protecting.

 

I try to abide by the rule that when you advocate changing something, you should make sure you understand why it is the way it is in the first place.

 

“What function does motivated reasoning serve?” I’ve broken it down into six overlapping categories: comfort, self-esteem, morale, persuasion, image, and belonging.

 

PERSUASION: CONVINCING OURSELVES SO WE CAN CONVINCE OTHERS

 

When law students prepare to argue for either the plaintiff or defendant in a moot court, they come to believe that their side of the case is both morally and legally in the right—even when the sides were randomly assigned.

 

IMAGE: CHOOSING BELIEFS THAT MAKE US LOOK GOOD

 

Just as there are fashions in clothing, so, too, are there fashions in ideas.

 

BELONGING: FITTING IN TO YOUR SOCIAL GROUPS

 

In some religious communities, losing your faith can mean losing your marriage, family, and entire social support system along with it.

 

That’s an extreme case, but all social groups have some beliefs and values that members are implicitly expected to share, such as “Climate change is a serious problem,” or “Republicans are better than Democrats,” or “Our group is fighting for a worthy cause,” or “Children are a blessing.” Dissent may not get you lite...


 

Fitting in isn’t only about conforming to the group consensus. It also means demonstrating your loyalty to the group by rejecting any evidence that threatens its figurative honor.

 

When you think about all of the things we use soldier mindset for, it becomes obvious why the frequently proposed fixes for it are futile. Such fixes typically involve words like “teaching” or “training,” as in: We need to teach students about cognitive biases. We need to train people in critical thinking. We need to educate people in reason and logic.

 

None of these approaches have shown much promise in changing people’s thinking in the long run or outside of the classroom. And that should not surprise us. We use motivated reasoning not because we don’t know any better, but because we’re trying to protect things that are vitally important to us—our ability to feel good about our lives and ourselves, our motivation to try hard things and stick with them, our ability to look good and persuade, and our acceptance in our communities.

 

Why Truth Is More Valuable Than We Realize

 

WE MAKE UNCONSCIOUS TRADE-OFFS

This is one of the paradoxes of being human: that our beliefs serve such different purposes all at once. Invariably, we end up making trade-offs. We trade off between judgment and belonging. If you live in a tight-knit community, it might be easier to fit in if you use soldier mindset to fight off any doubts you have about your community’s core beliefs and values. On the other hand, if you do allow yourself to entertain those doubts, you might realize you’re better off rejecting your community’s views on morality, religion, or gender roles, and deciding to live a less traditional life.

 

We trade off between judgment and morale. When you come up with a plan, focusing only on its positives (“This is such a great idea!”) can help you work up enthusiasm and motivation to carry it out. On the other hand, if you scrutinize your plan for flaws (“What are the downsides? How might this fail?”), you’re more likely to notice if there’s a better plan you should switch to instead.

 

It’s widely known that present bias shapes our choices about how to act. What’s much less appreciated is that it also shapes our choices about how to think. Just like sleeping in, breaking your diet, or procrastinating on your work, we reap the rewards of thinking in soldier mindset right away, while the costs don’t come due until later.

 

As Francis Bacon said, “Hope is a good breakfast, but a bad supper.”

 

Do you tell other people when you realize they were right?

 

How do you react to personal criticism? Maybe you’ve had a boss or a friend who insisted, “I respect honesty! I just want people to be straight with me,” only to react poorly when someone took them up on that. They got offended or defensive or lashed out at the feedback-giver in retaliation. Or perhaps they politely thanked that person for their honesty and then gave them the cold shoulder from then on.

 

Do you ever prove yourself wrong?

 

Do you take precautions to avoid fooling yourself?

 

Do you have any good critics?

 

Over the next few pages, we’ll explore five different types of thought experiments: the double standard test, the outsider test, the conformity test, the selective skeptic test, and the status quo bias test.

 

THE DOUBLE STANDARD TEST

 

THE OUTSIDER TEST

 

THE CONFORMITY TEST

 

THE SELECTIVE SKEPTIC TEST

 

THE STATUS QUO BIAS TEST

 

Thought experiments aren’t oracles. They can’t tell you what’s true or fair or what decision you should make. If you notice that you would be more forgiving of adultery in a Democrat than a Republican, that reveals you have a double standard, but it doesn’t tell you what your standard “should” be. If you notice that you’re nervous about deviating from the status quo, that doesn’t mean you can’t decide to play it safe this time anyway. What thought experiments do is simply reveal that your reasoning changes as your motivations change. That the principles you’re inclined to invoke or the objections that spring to your mind depend on your motives: the motive to defend your image or your in-group’s status; the motive to advocate for a self-serving policy; fear of change or rejection.

 

Not all overconfidence is due to motivated reasoning. Sometimes we simply don’t realize how complicated a topic is, so we overestimate how easy it is to get the right answer. But a large portion of overconfidence stems from a desire to feel certain. Certainty is simple. Certainty is comfortable. Certainty makes us feel smart and competent.

 

contingent—that what seems true or reasonable or fair or desirable

 

And given that there are so many ways to cope that don’t involve self-deception,

 

The “self-belief” model of motivation assumes that if you acknowledge the possibility of failure, then you’ll be too demoralized or afraid to take risks. In that model, people who believe that failure is unthinkable are the ones who try the hardest to succeed. Yet in practice, things often seem to work the other way around—accepting the possibility of failure in advance is liberating. It makes you bold, not timid. It’s what gives you the courage to take the risks required to achieve something big.

 

Soldier morale can be effective, at least in the short term. But it’s a brittle kind of morale, one that requires you to avoid or rationalize away new information that could threaten your ability to keep believing in success. Scouts rely on a different kind of morale. Instead of being motivated by the promise of guaranteed success, a scout is motivated by the knowledge that they’re making a smart bet, which they can feel good about having made whether or not it succeeds. Even if a particular bet has a low probability of success, they know that their overall probability of success in the long run is much higher, as long as they keep making good bets. They’re motivated by the knowledge that downturns are inevitable, but will wash out in the long run; that although failure is possible, it’s also tolerable.
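
(A quick illustration of the arithmetic behind that last point, using my own made-up numbers rather than anything from the book: if each bet independently has only a 20% chance of paying off, the odds that at least one of ten such bets succeeds are 1 − 0.8^10, or roughly 89%. Each individual bet is a long shot; the portfolio of bets is not.)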

 

But most of the time, being wrong doesn’t mean you did something wrong. It’s not something you need to apologize for, and the appropriate attitude to have about it is neither defensive nor humbly self-flagellating, but matter-of-fact.

 

IF YOU’RE NOT CHANGING YOUR MIND, YOU’RE DOING SOMETHING WRONG

 

He calls it a “de minimus error,” an attempt to minimize the inconsistency between observations and theory.

 

BE WILLING TO STAY CONFUSED

 

As Isaac Asimov reportedly said: “The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka’ but ‘That’s funny . . .’”

 

Escape Your Echo Chamber

 

Partly because of the explicit community rules, and partly just because of the type of person who is attracted to such a community, the tone of discussions on ChangeAView is very different from that on most of the internet. It wasn’t hard to find examples of comments like the following, any of which would be outliers elsewhere:

 

Before you close this book, consider making a plan for what those incremental steps toward scout mindset might look like for you. I recommend picking a small number of scout habits to start with, no more than two or three. Here’s a list of ideas to choose from:

- The next time you’re making a decision, ask yourself what kind of bias could be affecting your judgment in that situation, and then do the relevant thought experiment (e.g., outsider test, conformity test, status quo bias test).
- When you notice yourself making a claim with certainty (“There’s no way . . .”), ask yourself how sure you really are.
- The next time a worry pops into your head and you’re tempted to rationalize it away, instead make a concrete plan for how you would deal with it if it came true.
- Find an author, media outlet, or other opinion source who holds different views from you, but who has a better-than-average shot at changing your mind—someone you find reasonable or with whom you share some common ground.
- The next time you notice someone else being “irrational,” “crazy,” or “rude,” get curious about why their behavior might make sense to them.
- Look for opportunities to update at least a little bit. Can you find a caveat or exception to one of your beliefs, or a bit of empirical evidence that should make you slightly less confident in your position?
- Think back to a disagreement you had with someone in the past on which your perspective has since shifted and reach out to that person to let them know how you’ve updated.
- Pick a belief you hold strongly and attempt an ideological Turing test of the other side. (Bonus points if you can actually find someone from the other side to judge your attempt.)

 

“Jeff Bezos to Employees: ‘One Day, Amazon Will Fail’ But Our Job Is to Delay It as Long as Possible,”

