I.
You tried Carol Dweck’s Growth Mindset, but the replication crisis crushed your faith. You tried Mike Cernovich’s Gorilla Mindset, but your neighbors all took out restraining orders against you. Yet without a mindset, what separates you from the beasts? Just in time, Julia Galef brings us The Scout Mindset (subtitle: “Why Some People See Things Clearly And Others Don’t”).
Galef admits she’s a little behind the curve on this one. Books on rationality and overcoming cognitive biases were big ten years ago (Thinking Fast And Slow, Predictably Irrational, The Black Swan, etc.). Nowadays “smiling TED-talk-circuit celebrity wants to help you improve your thinking!” is more likely to elicit groans than breathless anticipation. And that isn’t the least accurate description of Julia (you can watch her TED talk here).
But Galef earned her celebrity status honestly, through long years of hard labor in the rationality mines. Back in ~2007, a bunch of people interested in biases and decision-making joined the “rationalist community” centered around the group blogs Overcoming Bias and Less Wrong. Around 2012, they mostly left to do different stuff. Some of them went into AI to try to save the world. Others went into effective altruism to try to revolutionize charity. Some, like me, got distracted and wrote a few thousand blog posts on whatever shiny things happened to catch their eyes. But a few stuck around and tried to complete the original project. They founded a group called the Center For Applied Rationality (aka “CFAR”, yes, it’s a pun) to try to figure out how to actually make people more rational in the real world.
Like - a big part of why so many people - the kind of people who would have read Predictably Irrational in 2008 or commented on Overcoming Bias in 2010 - moved on was that just learning that biases existed didn’t really seem to help much. CFAR wanted to find a way to teach people about biases that actually stuck and improved decision-making. To that end, they ran dozens of workshops over about a decade, testing various techniques and seeing which ones seemed to stick and make a difference. Galef is their co-founder and former president, and Scout Mindset is an attempt to write down what she learned.
Reading between the lines, I think she learned pretty much the same thing a lot of the rest of us learned during the grim years of the last decade. Of the fifty-odd biases discovered by Kahneman, Tversky, and their successors, forty-nine are cute quirks, and one is destroying civilization. This last one is confirmation bias - our tendency to interpret evidence as confirming our pre-existing beliefs instead of changing our minds. This is the bias that explains why your political opponents continue to be your political opponents, instead of converting to your obviously superior beliefs. And so on to religion, pseudoscience, and all the other scourges of the intellectual world.
But she also learned that just telling people “Hey, avoid confirmation bias!” doesn’t work, even if you explain things very well and give lots of examples. What does work? Research is still ongoing, but the book concentrates on emotional and identity-related thought processes. Above, I made fun of everyone and their brother having a “mindset”, but this book uses the term deliberately: thinking clearly is about installing an entirely new mindset in yourself in a bunch of different ways.
Galef’s preferred dichotomy is “soldier mindset” vs. “scout mindset”. Soldiers think of intellectual inquiry as a battle; their job is to support their “side”. Soldiers are the people who give us all the military and fortress-related language we use to describe debate:
Beliefs can be deep-rooted, well-grounded, built on fact, and backed up by arguments. They rest on solid foundations. We might hold a firm conviction or a strong opinion, be secure in our convictions, or have an unshakeable faith in something. This soldier mindset leads us to defend against people who might “poke holes” in our logic, “shoot down” our beliefs, or confront us with a “knock-down” argument, all of which may leave our beliefs “undermined”, “weakened”, or even “destroyed”, so we become “entrenched” in them lest we “surrender” to the opposing position.
A Soldier’s goal is to win the argument, much as real soldiers want to win the war. If you’re an American soldier fighting the Taliban, you want to consider questions like “What’s the most effective way to take that mountain pass?” or “How can I shoot them before they shoot me?”, but definitely not “What are the strongest arguments for defecting and joining the Taliban?” Likewise, someone with Soldier Mindset considers questions like “What’s the most rhetorically effective way to prove this point?” or “How can I embarrass my opponents?”, but not “Am I sure I’m on the right side?” or “How do we work together to converge on truth?”
Scout Mindset is the opposite. Even though a Scout is also at war, they want to figure out what’s true. Although it would be convenient for them if the enemy was weak, if the enemy is in fact strong, they want to figure that out so they can report back to their side’s general. They can go on an expedition with the fervent hope that the enemy turns out to be weak, but their responsibility is still to tell the truth as they understand it.
But isn’t there still a war you have to win? Aren’t there some beliefs you want to fight for, such that even if you need to be reasonable when figuring out the best way to proselytize them, they themselves should be beyond challenge? Julia thinks this point is probably further back than you expect. Even a true American patriot might want to consider the possibility that, instead of trying really hard to win the war in Afghanistan, the best thing for the US is to cut their losses and get out. If you were too focused on winning the war because “that’s the most pro-America thing to do”, you might (might!) miss that.
And maybe you’re not just an American patriot. Maybe you only support America because you think it best embodies certain values you really care about. If America had stopped embodying those values, wouldn’t you want to know about it? When Andrew Jackson toasted “Our Federal Union - it must be preserved”, didn’t John Calhoun respond with “The Union - next to our liberty, the most dear”?
(not that John Calhoun was very good at promoting freedom - maybe he should have used more scout mindset!)
II.
Are Scouts really better than Soldiers? Isn’t this just evidence-less cheerleading for your team (ie Team Scout), exactly the sort of thing Scouts are supposed to avoid?
Julia Galef is extremely prepared for your trollish comments to this effect. She avoids the “Scouts are better than Soldiers” dichotomy, arguing instead that both mindsets have their uses but that right now we lean too hard in the direction of Soldier. She gives lots of evidence for this (including an evolutionary argument that Soldier was more useful in small bands facing generally simple problems). I’ll review a little of it; for the full story, read Chapter 3 of the book.
One justification for Soldier mindset is that you are often very sure which side you want to win. Sometimes this is because the moral and empirical considerations are obvious. Other times it’s something as simple as “you work for this company so you would prefer they beat their competitors.” But even if you know which side you’re supporting, you need an accurate picture of the underlying terrain in order to set your strategy. She gives the example of the Humane League, an animal rights group that was picketing laboratories to stop animal testing. After a while they evaluated that program and found it rarely worked, and when it did work, the animals saved were only a drop in the bucket. So they tried other strategies, and one of them (pressuring agribusinesses to improve animal welfare) worked really well and saved far more animals. Even though the Humane League remained good Soldiers for their cause of animal welfare, their Scout mindset let them abandon an unpromising strategy and switch to a promising one.
Galef spends a lot of time in Silicon Valley, where the tech crowd has a different objection: don’t you need to be insanely overconfident to launch a startup? 90% of startups fail. But a lot of good founders seem absolutely certain they can succeed. They act as good Soldiers for Team “we’re definitely going to make a billion dollars”, and that certainty rubs off on employees, investors, etc., and inspires confidence in the company. Wouldn’t a more realistic Scout Mindset doom them?
Galef says not necessarily. Did you know that Jeff Bezos said outright he started off with a 30% chance Amazon would succeed, even going so far as to tell investors “I think there’s a 70% chance you’re going to lose all your money”? Or that Elon Musk said the odds of SpaceX working were “less than 10%”? Ethereum founder Vitalik Buterin said he’s “never had 100% confidence in cryptocurrency as a sector…I’m consistent in my uncertainty”. And since the book came out, I stumbled on this profile of billionaire Sam Bankman-Fried, which says he believed his chances of success “were only 20% to 25%”.
Galef adds a story from the early days of Intel. They were making computer memory components, and the Japanese were outcompeting them. The executives talked among themselves, admitted they probably couldn’t beat the Japanese, pivoted to a different kind of computer chip - microprocessors - and the rest is history. Even though on the larger scale they remained Soldiers for their final goal (Intel should make money), being able to play Scouts for their subgoal (what should our strategy be?) served them better than insane overconfidence.
III.
The book divides learning Scout Mindset into an intellectual half (Part II) and an emotional half (Parts III-V). The intellectual half emphasizes probabilistic thinking and thought experiments.
You’ve probably heard the probabilistic (aka Bayesian) side of things before. Instead of thinking “I’m sure global warming is fake!”, try to think in terms of probabilities (“I think there’s a 90% chance global warming is fake.”) Instead of thinking in terms of changing your mind (“Should I surrender my belief, and switch to my enemy’s belief that global warming is true”), think in terms of updating your probabilities (“Now I’m only 70% sure that global warming is fake”). This mindset makes it easier to remember that it’s not a question of winning or losing, but a question of being as accurate as possible. Someone who updates from 90% to 70% is no more or less wrong or embarrassing than someone who updates from 60% to 40%.
(this comes up again in the last part of the book, the part on how to be emotionally okay with changing your mind. “Probability update” is less emotionally devastating than “I said X, but actually ~X, so I was dead wrong.")
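For the quantitatively inclined, this kind of update is just Bayes’ rule applied to odds. Here is a minimal sketch (mine, not the book’s, with made-up numbers) of what moving from 90% to roughly 70% looks like when you treat a piece of evidence as a likelihood ratio:

```python
def bayes_update(prior: float, likelihood_ratio: float) -> float:
    """Update a probability given one piece of evidence.

    likelihood_ratio = P(evidence | hypothesis) / P(evidence | not hypothesis).
    """
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Start out 90% sure; then see evidence four times more likely if you're wrong than if you're right.
print(f"{bayes_update(0.90, 1 / 4):.0%}")  # ~69% -- an update, not a surrender
```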
Not sure how sure you are? The book contains a fun probability calibration exercise. I won’t violate its copyright, but you can find a very similar automated test here.
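If you’re curious what such an exercise measures: you attach a confidence level to each answer, then check whether, say, the answers you called “75% sure” were in fact right about 75% of the time. Here’s a minimal sketch of that scoring step, with hypothetical data rather than the book’s or the linked test’s actual questions:

```python
from collections import defaultdict

# (stated confidence, whether the answer turned out to be correct) -- hypothetical data
answers = [(0.55, True), (0.65, False), (0.75, True), (0.75, True),
           (0.85, False), (0.85, True), (0.95, True), (0.95, True)]

buckets = defaultdict(list)
for confidence, correct in answers:
    buckets[confidence].append(correct)

# Well-calibrated means each bucket's hit rate roughly matches its stated confidence.
for confidence in sorted(buckets):
    hits = buckets[confidence]
    print(f"said {confidence:.0%} sure -> right {sum(hits) / len(hits):.0%} of the time")
```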
But you probably already knew all of this. One of the genuinely new ideas in Scout Mindset is its endorsement of various counterfactual “tests”. The idea is: imagine yourself considering a similar question, under circumstances that would bias you in the opposite direction. If you stick with your opinion, it’s probably honest; if you’d change your opinion in the counterfactual, you probably held it because of bias.
So for example, if a Republican politician is stuck in some scandal, a Republican partisan might stand by him because “there’s no indisputable evidence” or “everyone in politics does stuff like that” or “just because someone did one thing wrong doesn’t mean we should fire them”. But before feeling too sure, the partisan should imagine how they would feel if a Democrat committed exactly the same scandal. If they notice they’d feel outraged, then their pro-Republican bias is influencing their decision-making. If they’d let the Democrat off too, then they might be working off consistent principles.
I try to use this test when I remember. I talk a good talk about free speech, and “don’t cancel other people for discussing policies you don’t like, they have a right to their opinion and you should debate it instead”. But a while back I read an article about Harvard hosting a conference on “the risks of home schooling”, with an obvious eye towards seeing whether they could get home schooling regulated or banned. My first twenty thoughts were something like “is there some way to get revenge on Harvard for being the sorts of people who associate with causes like this?”, plus anger that the administration was probably going to pretend it was neutral on this issue and just “encouraging debate”. Then by my twenty-first thought I remembered this is exactly the sort of thing I was supposed to be against, and grudgingly decided to be more understanding of and sympathetic toward everyone in the future.
Or: sometimes pundits will, for example, make fun of excessively woke people by saying something like “in a world with millions of people in poverty and thousands of heavily-armed nuclear missiles, you’re really choosing to focus on whether someone said something slightly silly about gender?” Then they do that again. Then they do that again. Then you realize these pundits’ entire brand is making fun of people who say silly things (in a woke direction) about gender, even though there are millions of people in poverty and thousands of nuclear missiles. So they ought to at least be able to appreciate how strong the temptation can be. As Horace puts it, “why do you laugh? Change the name, and the joke’s on you!”
Some other counterfactual tests you can try:
Status Quo Test: If you’re defending the status quo, imagine that the opposite was the status quo. Would you be tempted to switch to what you have now? For example, I sometimes feel tempted to defend American measurements - the inch, the mile, Fahrenheit, etc. But if America was already metric, and somebody proposed we should go to inches and miles, everyone would think they were crazy. So my attraction to US measurements is probably just because I’m used to them, not because they’re actually better.
(sometimes this is fine: I don’t like having a boring WASPy name like “Scott”, but I don’t bother changing it. If I had a cool ethnically-appropriate name like “Menachem”, would I change it to “Scott”? No. But “the transaction costs for changing are too high so I’m not going to do it” is a totally reasonable justification for status quo bias)
Conformity Test: Imagine that some common and universally agreed-upon practice was unusual; would you still want to do it? If not, you might be motivated by conformity bias. Suppose only 5% of people got married or had kids; would you still want to be one of the 5%? Suppose almost everyone started a business after high school, and going to college instead was considered a weird contrarian choice - would you take it anyway?
Again, sometimes this is fine. Doing the same thing as everyone else earns you friends, and is usually good evidence that you’re not making a terrible error. But it’s at least worth being aware of. Julia writes:
When I was a kid, I idolized my cousin Shoshana, who was two years older than me…during a family camping trip one summer…as we sat in her tent, listening to the latest album on her cassette player, Shoshana said, “Ooh, this next song is my favorite!” After the song was over, she turned to me and asked me what I thought. I replied enthusiastically, “Yeah, it’s so good, I think it’s my favorite too.”
“Well, guess what?” she replied. “That’s not my favorite song. It’s my least favorite song. I just wanted to see if you would copy me.”
It’s possible my cousin Shoshana crossed paths with Barack Obama at some point, because he used a similar trick on his advisors when he was president. It was essentially a “yes man” test: if someone expressed agreement with a view of his, Obama would pretend he had changed his mind and no longer held that view. Then he would ask them to explain to him why they believed it to be true. “Every leader has strengths and weaknesses, and one of my strengths is a good BS detector,” Obama said.
The Selective Skeptic Test: How credible would you consider the same evidence if it supported the other side?
A meta-analysis of ninety careful studies, published by a prestigious psychology professor, shows that there is no such thing as telepathy, p < 10^-10. Does that put the final nail in the coffin? Does it close the debate? Is anyone who tries to pick holes in it just a sore loser? Does it mean that anyone who keeps believing in telepathy after this is a “science denier”?
In the real world, a study meeting that description shows there is such a thing as telepathy. Hopefully you left yourself some room to say that you think the study is wrong.
This is another one with some subtlety. By Bayes’ Rule, you should believe evidence for plausible things more than you believe evidence for implausible things. If my friend says she saw a coyote out in the California hills, I believe her; if she says she saw a polar bear, I am doubtful. I think the best you can do here is understand that a giant meta-analysis proving telepathy is false doesn’t force a believer to change her mind any more than a giant meta-analysis proving it’s true forces you to change yours.
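To make the Bayes’ Rule point concrete, here’s a toy calculation (my numbers, not Galef’s). The same strength of testimony lands very differently depending on the prior:

```python
def posterior(prior: float, likelihood_ratio: float) -> float:
    """Posterior probability, given a prior and a likelihood ratio for the evidence."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

# Suppose my friend's testimony is 20x more likely if the sighting is real than if it isn't.
print(f"coyote (prior 30%):       {posterior(0.30, 20):.0%}")    # ~90% -- I believe her
print(f"polar bear (prior 0.01%): {posterior(0.0001, 20):.1%}")  # ~0.2% -- I stay doubtful
```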
A lot of the best rationalists I know instinctively apply these tests to everything they think. One technique for cultivating this practice (not the book’s recommendation) is to go on Twitter, where the adage is “there’s always an old tweet”. Argue that people who say racist things should be cancelled, and someone will dig up your old racist tweet and make you defend why you shouldn’t face the same consequences. Argue that it’s disgraceful when the other party uses extreme violent language about their outgroup, and someone will dig up an old tweet where you used even more extreme language about yours. Demand that the Republican senator resign for sexual misconduct, and someone will find the old tweet where you said the Democratic senator should tough it out. Eventually, if you want to maintain any dignity at all, you learn to double-check whether your beliefs are consistent with one another or with what you’d believe in vaguely similar situations.
Scout Mindset says: why not try the same thing, even when you’re not on Twitter, just to determine what’s true?
IV.
And one very likely answer is: because it would hurt.
Scout Mindset tries to differentiate itself from other rationality-and-bias books by caring a lot about this. It argues that, while other rationality books just told you what to do, most people wouldn’t do it; they’d be too emotionally attached to their existing beliefs. So after giving a few intellectual suggestions, it goes on a deep dive into the emotional side.
At times, this sounds a little facile. There are lots of pages to the effect of “instead of relying on false beliefs in order to feel good about yourself, have you considered just having true beliefs but feeling good anyway?” The book phrases this a little more politely:
There is an abundance of different coping strategies, and you don’t need to be so quick to go with the first thing you happen to pull out of the bucket. You can almost always find something comforting that doesn’t require self-deception if you just rummage around in there.
For example:
I once felt guilty about something inconsiderate I had done to a friend and spent a week trying to justify my behavior to myself. Should I apologize? “No, that’s unnecessary, she probably didn’t even notice” I told myself, at various times - and “She probably forgave me already anyway” at other times. Obviously I didn’t find these internally contradictory justifications fully satisfying, which is why I had to keep having the same argument with myself again and again.
Finally I asked myself: “Okay, suppose I had to apologize. How would I do it?” It didn’t take me long to draft in my head the rough contours of an apology that I felt I could deliver without too much angst. And when I imagined my friend’s reaction, I realized that I expected her to be appreciative, not angry. Once the prospect of apologizing seemed tolerable, I returned to my original question: “Should I apologize?” Now the answer was much clearer: yes, I should. It’s striking how much the urge to conclude “That’s not true” diminishes once you feel like you have a concrete plan for what you would do if the thing were true.
I’m mentioning this story in particular because of how it straddles the border between “rationality training” and “being-a-good-person training”. It reminds me of C.S. Lewis - especially The Great Divorce, whose conceit was that the damned could leave Hell for Heaven at any time, but mostly didn’t, because it would require them to admit that they had been wrong. I think Julia thinks of rationality and goodness as two related skills: both involve using healthy long-term coping strategies instead of narcissistic short-term ones.
I know some rationalists who aren’t very nice people (I also know others who are great). There are lots of other facets of nice-person-ness beyond just an ability to acknowledge your mistakes (for example, you have to start out thinking that being mean to other people is a mistake!) But all these skills about “what tests can you put your thoughts through to see things from the other person’s point of view?” or “how do you stay humble and open to correction?” are non-trivial parts of the decent-human-being package, and sometimes they carry over.
In one sense, this is good: buy one “rationality training”, and we’ll throw in a “personal growth” absolutely free! In another sense, it’s discouraging. Personal growth is known to be hard. If it’s a precondition to successful rationality training, sounds like rationality training will also be hard. Scout Mindset kind of endorses this conclusion. Dan Ariely or whoever promised you that if you read a few papers on cognitive bias, you’d become a better thinker. Scout Mindset also wants you to read those papers, but you might also have to become a good person.
(in case this is starting to sound too touchy-feely, Julia interrupts this section for a while to mercilessly debunk various studies claiming to show that “self-deluded people are happier”)
Here Scout Mindset reaches an impasse. It’s trying to train you in rationality. But it acknowledges that this is closely allied with making you a good person. And that can’t be trained - or, if it can, it probably takes more than one TED talk. So what do you do?
Scout Mindset goes with peer pressure.
We hear about Jerry Taylor, a professional climate change skeptic who would go on TV shows debating believers. During one debate, he started questioning his stance, did some more research afterwards, decided he was wrong after all, and became an environmental activist.
And about Joshua Harris, a pastor who led a “don’t date before marriage” movement. At age 21, he wrote a book, I Kissed Dating Goodbye, which became a hit in evangelical circles. But over the years, he got a lot of feedback from people who said they felt like the book really hurt them. Twenty years later, he retracted the book and urged people to date after all.
And about scientist Bethany Brookshire, who complained online that men always wrote to her as “Ms. Brookshire” vs. women’s “Dr. Brookshire”, proving something about how men were too sexist to treat a female scientist with the respect she deserved. The post went viral, but as it became a bigger deal, she wanted to make sure she was right. So she went over hundreds of past emails and found that actually, men were more likely to call her Dr. than women were; the alternate pattern had been entirely in her imagination. So she wrote another post, “I Went Viral. I Was Wrong,” which was generally well-received and prompted good discussion.
And:
The example of intellectual honor I find myself thinking about most often is a story related by Richard Dawkins from his years as a student in the zoology department at Oxford. At the time there was a major controversy in biology over a cellular structure called the Golgi apparatus - was it real or an illusion created by our observational methods?
One day, a young visiting scholar from the United States came to the department and gave a talk in which he presented new and compelling evidence that the Golgi apparatus was, in fact, real. Sitting in the audience of that talk was one of Oxford’s most respected zoologists, an elderly professor who was known for his position that the Golgi apparatus was illusory. So of course, throughout the talk, everyone was stealing glances at the professor, wondering: How’s he taking this? What’s he going to say?
At the end of the talk, the elderly Oxford professor rose from his seat, walked up to the front of the lecture hall, and reached out to shake hands with the visiting scholar, saying, “My dear fellow, I wish to thank you. I have been wrong these fifteen years.” The lecture hall burst into applause.
Dawkins says: “The memory of this incident still brings a lump to my throat.” It brings a lump to my throat too, every time I retell that story. That’s the kind of person I want to be - and that’s often enough to inspire me to choose scout mindset, even when the temptations of soldier mindset are strong.
Julia says that these people were able to change their minds so effectively because they had an identity as “scouts”, more so than their identity as global-warming-skeptics or dating-skeptics or Golgi-apparatus-skeptics or whatever. It was more psychologically painful for them to be obstinate and irrational than for them to admit they were wrong. So they were able to use healthy coping mechanisms and come out okay on the other side.
Once she’s finished bombarding you with examples of epistemically healthy people, she moves on to epistemically healthy communities. The rationalist and effective altruist communities get namedropped here, as does the r/changemyview subreddit. At every point, Julia mentions how much she personally respects all these people - and, implicitly, how much she is rooting for you to become like them.
All of this reminds me of a theory of psychotherapy, which is that one way people get messed up is by knowing a lot of messed-up people, so much so that the little voice in their head that tells them what to do gets trained on messed-up people. When you think “what’s the right thing to do in this situation?”, the abstracted voice of your community of epistemic peers answers “Something messed-up!”
Then you get a therapist, who is (hopefully!) a really together, with-it, admirable person. You talk about all your issues with them, so much so that when you have an issue, it’s your therapist’s voice you hear in your head giving you advice about it. When you ask “what would other people think of this?”, it’s your therapist you’re thinking of. Plus, your therapist is credentialed as an Officially Correct High Status Person. She’s not just speaking for herself, she’s serving as an ambassador for a whole world of healthy normal people; her words are backed by the whole weight of polite society. So if you’re making a decision to, like, commit a crime, instead of feeling internalized peer pressure from all your scummy friends to do it, you feel internalized peer pressure from your therapist (and the normal world she represents) not to.
This last section of Scout Mindset seems to be trying something like that. Julia is trying to normalize changing your mind, to assure you that lots of great people who you respect do it, that there are whole communities out there of people who do it, that she does it and she is a TED-talk-having celebrity who you implicitly trust.
One last story that goes almost a little too far:
One week in 2010, I was following a heated debate online over whether a particular blog post was sexist. The blogger, a man in his mid-twenties named Luke, chimed in to say that he had considered his critics’ arguments carefully but didn’t think that there was anything wrong with his post. Still, he said, he was open to changing his mind. He even published a list titled “Why It’s Plausible I’m Wrong”, in which he summarized and linked to some of the best arguments against him so far, while explaining why he wasn’t persuaded by them.
A few days later - by which point the debate spanned over 1,500 comments across multiple blogs - Luke posted again. He wanted to let everyone know that he had found an argument that had convinced him that his original post was harmful.
He had surely already alienated many readers who believed his original post to be morally wrong, Luke acknowledged. “And now, by disagreeing with those who came to my defense and said there was nothing wrong with my post, I’ll probably alienate even more readers,” he said. “Well, that’s too bad, because I do think it was morally wrong.”
“Wow,” I thought. I admired both the fact that Luke didn’t change his mind in the face of strong pressure, and the fact that he did change his mind in response to strong arguments. I decided to message him and share my appreciation: “Hey, this is Julia Galef - just wanted to tell you how much I appreciate your thoughtful writing! It feels like you actually care what’s true.”
“Hey, thanks - I feel the same way about your writing,” Luke replied.
Ten years after that exchange, we’re engaged to be married.
I know Julia and Luke, they’re both great, and you should absolutely substitute them for whoever was judging you in your mind before. If it would help to have a voice to attach to the name, you can listen to Julia on the Rationally Speaking podcast.