So what you're saying is... Tsuyoku Naritai!

I've only occasionally come across anti-rational-community sentiment online, but the way I've always interpreted the criticism is something like: the rationality community is very powerful (because it contains a bunch of Silicon Valley people), and even if it is 90% as rational as it thinks it is, that 10% mismatch between perception and reality is enough to do a lot of damage.

I liken it to criticisms of the church with respect to compassion etc: yes the implicit goal of the church is to be more compassionate than average, but if you're only 90% as compassionate as you actually think you are, and you wield enormous power, that leaves lots of room for damage.

I don't agree with these criticisms when directed at the rationality community but I understand the logic.

I think the problem is that people are disposed to SEVERELY overestimate the level of certainty in scientific conclusions, and indeed in fact claims about the world in general, and to SEVERELY underestimate how much those claims depend on assumptions, interpretations, and theories. If you look carefully at how science is done (starting with, e.g., the replication issues in everything from biology to psychology), at how academic and medical peer review and publishing work, and at how many assumptions are needed to construct narrative claims in, e.g., journalism, it becomes reasonable to conclude that a very large number of "facts" people pound the table about and scream are true are in fact highly contestable and provisional. Yes, there is good and bad reasoning, but far fewer controversies can be definitively settled by good reasoning than is often claimed.

It seems like there are some parallels to the argument by some theologians that atheism is itself a theology, or faith-based, or such.

I wouldn't say there's no such thing as being better at rationality. I would say: holy shit, rationalists have gone off to some weird, wildly unsuccessful, and in some cases deeply destructive places as part of that pursuit. That doesn't make me say the whole thing is fruitless, but making the pursuit of rationality your explicit goal is maybe not the best way to get closer to it.

My problem with the rationalist community is that, while I agree with so many of its WAYS of thinking and with so many of its leaders, so many of the members who show up in comment sections, chatrooms, and meetups have used the principles and methodologies of rationality to argue for ridiculous positions with glaring holes in the logic one way or another.

In addition, it feels like the average member of the society is often out of touch with social norms and decorum.

In a perfect world, the followers of rationality could depend on each other, with each of us specializing in applying the tools to a specific field. Instead, I find myself trusting rationalists slightly less than anyone I would meet in any other community when it comes to their overall decision-making abilities.

I think the point that there are gradations of more or less rational is perfectly legitimate, and I further think that we should frequently (but not always) work to be more rational. I think the rub is that a) there are elements of human life that will always defy rationality as conventionally defined, particularly those related to our emotional selves, and b) the question endures whether the human brain has evolved in such a way as to achieve an eventual, objectively rational state that truly transcends human subjectivity and all of the fallacies and problems we associate with it. My problem with rationalists is that most simply assume the answer to b) is yes.

But sure, more or less rational exists.

Don't disagree with your core point; I'd add that the sociology of scientists is anything but rational, and that this is more hypocritical than when irrationalists are irrational (since we don't expect otherwise). https://whatiscalledthinking.substack.com/p/when-scientists-function-like-priests

Doesn't anyone who believes they are correct about a thing think they are more rational than whoever disagrees with them about that thing?

Or, as Steven Pinker says: "Reason is non-negotiable. As soon as you show up to discuss the question of what we should believe (or any other question), as long as you insist that your answers, whatever they are, are reasonable or justified or true and that therefore other people ought to believe them too, then you have committed yourself to reason, and to holding your beliefs."

The OP is muddled, but directionally right. The fact that things can be, say, more political or less political doesn't mean that something can be totally apolitical. As for rationality, it's certainly possible to be more or less rational, but I think it's much harder than Scott is acknowledging here. IMHO, it requires multiple iterations of steelmanning.

I would make a stronger claim than the one the quoted poster is making -- that the rationality community feels more like a culture or a religion than a group of actually rational people. Much in the way that Christians would generally say that they highly value moral goodness as an important aspect of being Christian and can even be brought together as a community with a set of shared values around this -- while at the same time it isn't obvious that Christians are morally better than other similar groups -- I'd say that members of the rationality community can celebrate and value the concept and culture and performance of rationality without actually having to be particularly rational. And just as aspects of Christian culture might make someone *less* good while also making them quite confident that they are doing the moral thing -- for example, in terms of how a person feels they should treat LGBT people or sexually promiscuous people -- so too could rationality culture potentially make people even less rational on certain topics while simultaneously encouraging people to believe they are more correct than they are. Like moral goodness, rationality is very hard to achieve. (I actually think it would be super fun to have some kind of study on whether self-identified rationalists are actually more rational than non-rationalists or anti-rationalists of similar educational and professional attainment across a variety of topic areas!)

Y'know, the older I get, the more I think there is no one Platonic Rationality, at least for mundane individual affairs.

Rationality is fractal. There is often no one right answer as to what is Most Rational in a given situation. It all depends on the exact situation and the exact individual(s) or more generally initial prior(s) involved.

To use the framework of meaningness.com, Level 5 rationality is where we need to be. Seeing naive Level 4 rationality, in my intuition, is what your original Reddit commenter meant by "hating its specifics".

I dunno, maybe what I am thinking is Level 5 rationality is not really rationality. Maybe that (ie, knowing when to apply Level-4 Rationality) is just what wisdom is. Am I totally off-base? Is this old hat and goes by other names within the rationalist community?

Freddie has a surprisingly strawmannish view of the rationalist community given how much he’s interacted with them.

> I like the culture [...] However

> 1. There is no such thing as "rationality" that is free from ideology

> 2. They have too much faith in the power of their own cognition

> 3. Their indifference to human emotion and social cues is not integrity, it's a refusal to confront the material importance of emotions

> 4. Eliezer Yudkowsky is their king and he's kind of an asshole

> 5. We're all just scrambling around trying to find meaning and understanding with brains that are the imperfect products of evolution.

The closest of these to a real criticism of the rats I know and respect is #2. (I respect Yud, but I've heard a rat defined as "someone who disagrees with Eliezer about something.") I have rat friends who lean a bit hard on their personal understanding of economics or freedom or intelligence; they could stand to chill a bit. But the rest are pretty silly, especially #5, which could itself be the tagline of the movement.

Question about Alex Jones and the highly irrational. Do they know they are being highly irrational?

Like anyone I do a ton of irrational things. But I’m aware that I do a ton of irrational things. Does Alex Jones just think it all makes perfect rational sense?

Rationality is a process, not a property. There are properties that enable rationality, like intelligence and discipline, but rationality is something a person does, not something they are. It's also something a person does from a specific starting point: what updating priors looks like depends on what your priors are. And while we might imagine that all rational people will converge -eventually- on certain positions, that's not to say their journeys will look similar at all.
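The point about identical updating processes starting from different priors can be sketched with a toy Beta-binomial example (the priors, data, and names like `optimist`/`skeptic` here are invented for illustration, not anything from the comment):

```python
# Two observers start from different Beta priors on a coin's bias,
# then apply the same updating rule to the same stream of flips.
# Conjugacy: Beta(a, b) prior + binomial data -> Beta(a + heads, b + tails).

def posterior_mean(a, b, heads, tails):
    return (a + heads) / (a + b + heads + tails)

flips = [1, 1, 0, 1, 1, 1, 0, 1] * 25          # 200 flips, 150 heads
heads = sum(flips)
tails = len(flips) - heads

optimist = posterior_mean(9, 1, heads, tails)  # prior mean 0.9
skeptic = posterior_mean(1, 9, heads, tails)   # prior mean 0.1

print(optimist, skeptic)
```

With 200 shared flips the two posterior means land within a few hundredths of each other, even though the starting points differed by 0.8 -- the sense in which rational updaters can converge "eventually" while their journeys look nothing alike.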

Alex Jones isn’t irrational- he’s completely rational in pursuing the goal of his media business which is to make money.

I think on the whole, people interpret "I am a rationalist" similarly to "I am law-abiding", not "I am kind". That is, they view you as someone cosplaying a Vulcan, not a regular person trying to fight their biases.

My feeling is that a lot of people have an instinctive reaction to claims to be [highly rational and thus minimally ideological], and to overcorrect and pattern-match against that.

The correct formulation of "everything is ideological" is that (1) even a hypothetical 100% rational agent is going to have goals, and those goals will guide what they consider good and also hence worth investigating, and (2) GIGO, even a 100% rational individual is going to have false inputs from societal common sense and propaganda and so on. (Neutrally reading the best philosophical arguments in 1355 will lead to very different ideas on metaphysics than in 1955, and perhaps by pessimistic induction in 2755 as well.) Also the propositional components of ideology are things that one can be rationally led to.

I'd say "rationality and ideologicalness" are orthogonal, in part because I think the ideal (not merely achievable) agent is fully rational and fully ideological in the above sense, and because there are non-ideological sources of error. Of course, the "ideology is the opposite of rationality" instinct is coming from some truths too, specifically some sources of error that rationalists and others have long emphasized: wishful thinking, filter bubbles (a social-level effect), not keeping one's identity small, blah blah blah. Avoiding these are virtues worth practicing, and I'm glad the rationalist crowd (again, amongst others) has brought attention and thinking to them.

It's as though someone came across RationalWiki and thought it represented the rationalist community.

I know this isn't really the point of the article and it was supposed to be selected as a non-controversial point, but I can think of several instances where I'm a lot less confident in the whole "Name-brand, government science is much better and more reliable than corporate science" system than I'm supposed to be.

I think the most significant way I'm different in this is that you seem to be doing some sort of math that says "Well, the corporate scientists are being paid to say things that go a certain way - if they refuse to do that, they won't get paid, so the incentives are fucked. You will only get science that confirms things in one direction from them". And that's so, or at least it wouldn't surprise anybody if we found it to be so.

But I've been around, say, tobacco control epidemiology enough to know that the same thing is true for them; they only get "paid" for saying that tobacco is maximally harmful in every way and giving science that supports any ban at any time; anyone who seriously breaks from this either never gets their career off the ground or is immediately ignored forever after that. There's a guy, probably the most famous guy in that field, whose name is Stanton Glantz. He's basically famous and successful because he's reliable - he never says anything that isn't maximally negative toward tobacco. He's also known for not being a particularly good scientist or particularly truthful, but this has never actually hurt his career in any way - he does the thing that gets him paid, the grants keep rolling in, and it's very very clear where his incentives are.

I'm not sure how well this applies to something like climate science, at least to the extent I'd say their credibility is ruined or the science is all bullshit or something. But it's clear enough their incentives are 100% in line with saying climate issues are maximally important and dangerous; they went from "field that barely exists" to "major field of science that everyone talks about and has conferences and huge grants and unlimited funding" in a few decades by pursuing that line of thought. That doesn't mean it's all wrong or fake, but it makes me feel uncomfortable when someone goes "well, clearly all the corporate funded studies are biased and unreliable" without mentioning the in-lots-of-ways similar perverse incentives on the other side.

I agree with the general points in Scott's post, but I don't think they are really doing the work he wants them to do.

The problem, in a nutshell, is that virtually everybody is better at rationality than Alex Jones. The U.S. Congress is more rational than Alex Jones. The Elks Club is more rational than Alex Jones. The FDA is more rational than Alex Jones. It is a weak claim.

Rationalists are making a much stronger claim. I suspect that the rationalist community as a whole believes that it is generally more rational than many, most, or maybe nearly all other agglomerations of humans, even if it hasn't achieved perfect rationality. Forget Alex Jones. Is the rationalist community more rational than the Republican party? The Democratic party? The editorial board of the New York Times? The cast of Fox and Friends? My guess is that most rationalists would say yes. (My guess is that most rationalists would feel strongly that the answer is yes.)

As a generally sympathetic observer of the rationalist community, I would say: maybe. My sense is that the rationalist community is more "rational" (meaning has a higher propensity to believe things which are actually true) than other communities in some respects and less so in others. In other words, the rationalist community has some systematic biases. In other words, it has an ideology.

If nothing else, it has a belief in rationality as a concept, as an end achievable through specific practices, as an item of value, and probably as a moral good as well.

Perhaps more importantly, the presence of systematic biases indicates that the rationalist community is blind to its biases. I would expect a truly rationalist approach to life to entail a radical epistemological humility. That is not my experience of the rationalist community as it actually exists, whatever lip service it might pay to the notion.

This is in no way meant to be a scathing critique of rationalism. As I said, I'm a generally sympathetic observer. I think it is possible and worthwhile to be "less wrong." I just think the rationalist community is probably less "less wrong" than it supposes itself to be, due in part to, yes, its ideology.

This is an old motte and bailey. "Nothing is perfectly rational or objective" is the motte, and "I can be explicitly biased/partisan/prejudiced/whatever" is the bailey. You act explicitly prejudiced, or to support your party, and the moment someone accuses you of bias you point out that being unbiased is impossible anyway. Yes, but there's a difference between someone failing to be perfectly rational and someone who's not even trying.

I think this really comes down to people being unwilling or unable to stare their own hypocrisy in the face.

Freddie agrees that he sees people's claims to truly want to live for only 80 years as a cope. I agree with him here. But I think that producing arguments for "why I am not a rationalist" is also a cope. We all know rationality is really hard, and that we're constantly failing at it. We all know that we SHOULD make choices as rationally as possible, and that that would require putting a lot of boring effort into really hard things, and then continuing to be really boring. You'd have to learn Bayes!

But I think I'm just comfortable being somewhat hypocritical on this. I think we should all be comfortable admitting that we're hypocrites, that we aren't working as hard as we could towards being as rational as possible as much as possible even though we believe that's what we should be doing. And that's okay.

I came across an interesting claim recently that there are 'pragmatic contradictions'--you can't say "I am asleep right now", for example, because being asleep is contradictory with talking. Similarly, you might think that saying something like "I am very humble" or "I am very wise" is not quite a pragmatic *contradiction*, but is in that direction--a commonality of humble people is that they don't say sentences about how humble they are, and perhaps a commonality of wise or unbiased or objective people is that they don't say sentences about those things.

I think I mostly don't buy this, and feel something like your position--not about humility (I don't pretend to be especially humble) but about wisdom / rationality / etc.; actually I *have* put a lot of effort into this, and started out ahead, and so on; looking around it really is my best guess that I'm 'remarkably good' along those axes, such that me remarking on being good there does actually make sense. [While, of course, having a TAP to check: why exactly do I want to remark on that in this context? Only sometimes does it make it past that filter, and so the thing I'm pointing out here is more like "that filter doesn't kill all instances where I want to say it, and I don't think it should."]

Are you arguing that there *is* such a thing as "rationality" that is free from ideology? "Traditional" mathematics was probably considered free of ideology. But then along came Constructive Mathematics, which forced an ideological choice. It seems to me that it's hard to argue that there are *no* ideological commitments in any particular perspective. After all, how would you know that you just haven't found them yet?

I wouldn't say you're missing the point of those arguments (everything is political, enmeshed in ideology, etc.) because I trust you understand them, but I do think you are answering them from a viewpoint which takes for granted that people are only advocating a weak version of their arguments, and I think you are assuming some things that stronger proponents of those statements would reject.

I think there are probably weak and strong versions of those original statements and viewpoints. The weak version is the one people use pretty often: everyone is biased, so you can't call me out on being biased; or, [group I disagree with] is cool, but their viewpoint comes from ideology/politics just like everyone else's, as if that diminishes its worth. This is generally a claim about the political on a Red Team/Blue Team axis, and a position which would agree with your statement that it is possible to be "bad at objectivity", i.e., Alex Jones or fossil-fuel-funded studies. These are the people who say "everything is political" yet still point to studies that "prove" gender differences don't exist as a victory for feminism. They use these statements as arguments to invalidate their opponents without fully thinking through their implications. I think this is the version you are responding to most directly, and if we are thinking of political/biased/ideological as similar to their usage in common parlance, you are right: things can be more or less biased or political etc., pretty evidently.

The strong version of these original statements mostly exists within critical theory/social theory/continental philosophy academia. "Everything is political" isn't a statement about things having clear and obvious implications for what we commonly think of as the political sphere, i.e., policy and governance, and it has nothing to do with how Red or Blue or in-between a thing is. It relates more to power and marginality, race-class-gender, control-coercion-discipline, ideology and discourse, and the production of all of the above. "Everything is political" means that we can't and shouldn't think of things without considering their role in maintaining power structures, maintaining institutions, and producing subjects that are classified in certain ways.

When looked at this way, a reminder that something is political is a push back against a claim to the apolitical, and I think a push back against a claim to something being more or less political (in a broad sense) than something else. In this understanding of the political, saying something is less political than something else is impossible - everything exists within a discourse that is political, i.e., both a language and a social sphere in which terms are laden with power and perpetuate structures and systems and institutions.

In this sense, a critique of rationality that mentions "everything is political" or "...ideology" seems to me to be pushing back against an image of objectivity or neutrality, and instead taking the stance that nothing is neutral, nothing is outside of the political (broadly defined) and concepts need to own their ideological implications. That being political isn't bad or good, it just is the state of everything.

In the instance of Alex Jones, a critique of him from this viewpoint may rest on the actual positions he espouses as opposed to his justifications. A critique of a fossil-fuel company funding a study may rest on their utilization of capital to promote a favorable viewpoint and perpetuate a system in which the rich and powerful exploit natural resources to the detriment of those with less power or influence.

So overall, I get your argument, I just don't think you are taking the strongest version of the points you are countering.

These were my own main criticisms of Yudkowskian rationalism, which have so far gone without comment: https://www.goodreads.com/review/show/3850071573?book_show_action=false

The best version of the criticism of the use of “rationalist” would, I think, point out that the assertion that one is a rationalist happens in a context. It doesn’t merely describe the world in a value-neutral way, it draws a contrast with other people, and assigns higher value to the rationalist approach. In most contexts, the other people one takes seriously enough to even bother contrasting with aren’t on an Alex Jones (the character; as others stated, Alex Jones the performer has found a very successful niche) level, they’re the most rational spokespeople for some paradigm other than rationalism. So, one is effectively claiming to be more rational than the most rational people from non-rationalist paradigms, because those paradigms don’t even allow for rationality as great as the rationalist’s. It’s an indictment not typically just of individual capacities, but of entire paradigms, and suggests that the rationalist thinks their paradigm objectively preferable to every other salient paradigm.

Which, from within rationalism, it is. Lots of paradigms are the best according to their own standards. The trouble is that there isn't a sensible way to compare them without assuming one of them, and none of them is better at everything people care about than all the others. So the selection of some of those paradigms to privilege isn't meta-rational, it's just ... a choice. Rationalists tend to think that rationality isn't about them choosing what they like. The criticism points out that calling yourself a rationalist communicates exactly that sort of choice: selecting some things to care about and other things to neglect.

Are Alex Jones or the fossil fuel industry really irrational? They have different goals that stem from different values, but they seem to be effectively pursuing and achieving those goals which is a form of rationality. They don't pursue objectivity perhaps, but the pursuit of objectivity is arguably a value choice, and not even necessarily rational from a self interested point of view. That seems to be the complaint on Reddit. The rationalist definition of rationality sneakily includes many value choices, rather than really being about "rationality" itself. Call it an argument over definitions.

Also, contra the post title, if you can be bad, it doesn't follow that you can be "good". Consider a population where everyone is equally good at something, but some individuals suffer injuries that permanently make them worse. There will be a few bad individuals, many normal individuals, but no "good" individuals. Yes the normal ones will be "better" than the injured bad ones, but again we have an issue of definitions over what we mean by "good". This is a situation where no one can claim special skill or superiority. This shape of distribution is arguably how certain fields like investing work, where there is no evidence of superior skill or alpha, even though there are demonstrably bad investors.

There's no such thing as naked rationality. Rationality is always in the service of Something and we see each other as rational only as much as our Somethings align. Alex Jones is simply optimizing towards a wildly different Something.

As for his views on that school shooting, he claims that he was reporting on people claiming it was fake, and not claiming it was fake himself, and it was several years later that the media took his words out of context and made it look like these were his views.

So one could make the case that on this specific issue Alex Jones is more rational than you. He was reporting on people making incorrect claims, as a journalist, while you are making incorrect claims about Alex Jones :)

I think the closest we can get to naked rationality is to optimize towards our best understanding of what the most objective possible truth might be. But then we would have to venerate truth as an unreachable highest possible ideal and I have no idea how to do that without building a messy relationship between truth and beauty.

You're smuggling a value judgement. You believe that rationality is good, that it's better/more moral to be rational than irrational. That's the political statement.

Given that Alex Jones has a viewpoint with a track record of being factually accurate (as folks like Rogan and Pool have found when they fact check him), you can call him irrational, but your own prediction rate is just as faulty. I'll continue to read you in moderation and listen to him in moderation, and likely gain equal value from both. But he's more interesting.

"But I’d much rather spend my time and energy to learn from the people who are better."

You say this Scott, but then you also never correct the obvious mistakes I have pointed out regarding your readings of Marx. (for example here: https://astralcodexten.substack.com/p/book-review-global-economic-history#comment-1795464)

People like us debate about things like optimal tax schedules, best type of statistical analysis, most sensible vaccination schedule and distribution, future technologies, capital allocation.

Meanwhile, a sizable chunk of the population is debating whether vaccines are a global plot to sterilize humanity, whether the world is flat, whether God created the world 6,000 years ago or 10,000 years ago, and most recently whether we should ban hospitals from requiring vaccinations for employees (you think this is some fringe kooky politics? No, these bills have advanced out of legislatures).

Being able to go from smart college student to Scott-level rationalist might be worthwhile. But it's rather like deciding what clothes to wear when your house is burning down. How do we inject a bare minimum level of sanity into the world?

I so very much respect you and your POV on so many things, but on this topic I have to respectfully disagree.

For example, I don't think Alex Jones is actually irrational. I think he's a rational actor who has determined that his brand of rhetoric gets him what he wants (money, infamy). He has probably determined, quite rationally, that he could not attain the money or infamy he has attained in more legitimate ways.

And, yes, I think it is absolutely reasonable -- and hyper-rational -- to assert that even rationalists come to the party with biases. It may very well be that the rationalist bias leads to better fact-finding and improved revealing of measurable truths, but it's still a bias.

I do not think we can remove bias from any assessment. I might be going out on a limb here, but perhaps we can only remove bias from standard measurements.

I humbly suggest that we might want to look at bias as being as inevitable and as pervasive as the observer effect. In fact, they may be related.

It's a branding issue.

Imagine a group called the "Truthists" who uphold the telling of truth. They'd be scrutinized for any sign of dishonesty, and any lie from any Truthist ever would be wielded as an example of the naked hypocrisy of the movement, how you can't trust Truthists, and how they should actually be called Untruthists, and then they'd become a punchline in an SNL skit. This would probably happen even if Truthists were overall more honest than the population at large!

In general, people become suspicious when you brand yourself around a positive identifier. It's like those hospitals with creepily positive names ("Holistic Happiness and Wellness Center"). You instinctively wonder what exactly they're hiding....

There's a reason they called it lesswrong.com and not good.com, and that reason is that the domain was vastly less expensive. Hmm forgot where I was going with this.

The point of the "everything is biased" argument is, generally, to bypass scrutiny of your political ideas as they pertain to objectivity. This nonsense has been loose in social studies for decades, as anyone who's followed the topic knows.

Saying that "nobody is fully unbiased" is completely different from saying "no unbiased set of facts exists." Even if you've got nothing but biased people, you can still extract a set of facts as long as they aren't all biased in the same direction. Refer to the standard bullet points on "the group being smart as a whole," yada yada yada.
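That claim -- biased individuals whose biases don't all point the same way can still yield a reliable aggregate -- is easy to simulate. Everything below (the true value, the bias range, the observer count) is invented purely for illustration:

```python
import random

random.seed(0)
TRUTH = 100.0

# 50 observers, each with a persistent personal bias drawn
# symmetrically around zero: every individual is biased, but the
# biases don't all lean in the same direction.
biases = [random.uniform(-10, 10) for _ in range(50)]
# Each report = truth + personal bias + a little measurement noise.
reports = [TRUTH + b + random.gauss(0, 2) for b in biases]

consensus = sum(reports) / len(reports)
worst = max(abs(r - TRUTH) for r in reports)

print(f"consensus error: {abs(consensus - TRUTH):.2f}")
print(f"worst individual error: {worst:.2f}")
```

The consensus error comes out far smaller than the worst individual's error. If instead every bias were drawn from, say, uniform(5, 15) -- everyone leaning the same way -- the average would inherit the shared bias; aggregation only cancels the heterogeneous component.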

When you run into someone who thinks your view is biased in a particular instance, you can talk to them. When you run into these people who think there's no point in reasoning because "everyone is biased," watch for sudden movements and protect your vital organs.

I think your arguments are basically correct individually, and incorrect in their conclusion, entirely because you miss a single, important element:

The people who accuse somebody of being biased, they're almost always trying to win a political argument that has nothing to do with whether or not the person they are accusing is biased. The people who accuse somebody of being irrational, they're almost always trying to undermine somebody's rational response to a situation. "Irrational", as a word, is something the public generally thinks that somebody says when they're gaslighting somebody, to shame them for their entirely rational emotional response to a fucked up situation.

The connotative and denotative meanings of these words clash in a way that I think invites extra scrutiny of claims involving them. That is, the demand for rigor isn't isolated; these words belong to a class of words whose use we are generally suspicious of. "Enlightened", "Godly", "Hormonal", "Emotional", "Committed" (in an ideological sense). I think we're reaching the point where we can add "Scientific" to the list.

Also, there is a lot of specific ideology arising from founding effects in the rationalist community, which I think sometimes gets conflated with rationalism itself.

Expand full comment

A strong rationalist should be aware that they have limited rational compute capacity. So there is some amount of need to apply shortcuts to adversarial anti-rationalists like a spam filter to protect your capacity.

Expand full comment

> Even in the vanishingly unlikely chance that I’m the [most rational] person in the world, I still don't think I'm hitting up against the limit of what's possible

We'll have to ask gwern

Expand full comment

I think the confusion is that "rationalists" view rationalism as a tool. It's like a hammer. A hammer can't be biased. You might be able to do very biased things with a hammer--build a "whites only" sign--but the hammer in itself is not a biased thing.

The criticism views "rationalism" as fundamentally tied up in the problems it tries to explain. Asking "how does 200mg of L-theanine daily affect stress levels" is, to them, a biased question because it presupposes issues like who gets L-theanine and who doesn't, whether it is morally right for some people to benefit from lower stress levels, and who produces L-theanine without getting the benefit. To them, asking the question is already making moral and political judgments, as they do not view these questions apart from the tools used to analyze them.

Confusing a tool for things the tool is used for is an interesting failure mode (in the spirit of "what developmental milestones are you missing"), and I think it's worth calling out that that's what's going on here.

Expand full comment

Surprised that the word "projection" doesn't appear anywhere in your essay.

Anyway, in my experience people making these "everything is relative" arguments fall into three classes:

1) Cynical hucksters and cult leaders (e.g. lawyers, politicians, pundits) who personally benefit by sowing distrust in objective reality because it makes it easier to sell faith in a guru (i.e. themselves).

2) Morally lazy people seeking to assuage their own troubled conscience by arguing "everybody does it! I'm not so bad..." The projection crowd.

3) People with delusional understandings of their own intellectual competence who are confronted with reality by events, e.g. someone who has won participation trophies all his life and then discovers that there *actually exist* people who are much smarter and more competent, objectively, than he is, and who therefore retires from "artist" to "critic." It's comforting to his wounded self-esteem to argue that people who are Right all the freaking time are just...beneficiaries of some kind of social privilege or momentary consensus support -- instead of being *much smarter than me*. Call it a form of envy, from people who themselves are not smart enough to wrestle with ambiguity and poor data and nevertheless extract from the morass of noise the occasional clear diamond of truth.

Expand full comment

The people you're addressing here are not arguing in good faith. I mean, in this specific case it seems to be more someone getting personally irritated by Yudkowsky and/or some other individuals in the rationalist community (plus the Marxist belief that Marxism is rationally necessary); but usually this argument is just deployed to be obtuse. There's discussions that are built around current politics - not questions of political structure, or even necessarily ideology in general, but the front-page news, the current sources of tribal anger - and then there's discussions that avoid those topics to the best of their ability. We all intuitively understand that, I think. But sometimes people want to make a discussion current-political, and when people tell them to stop ranting about politics they respond by saying everything is technically political.

Which is really separate from the claim that "rationalists think they can be perfectly rational in Straw Vulcan fashion", which is only said by people who've never interacted with the rationalist community to any significant extent. But I've seen the "everything is political" argument pop up much more in other contexts - i.e. whenever someone complains about a preachy book that unnecessarily brings in current politics, they get the stock 'everything is political' response. No, I don't need the alien villain to talk about how they want to "make the homeworld great again", or to read your entirely unoriginal thoughts about why it's all men's fault from the space opera narrator. And yes, it *is* possible to deal with themes of authoritarianism or gender without being unbearably preachy about it; but it's impossible to draw a bright line, and so "everything is political" is brought out as an all-purpose defense. Or, yes, with regards to science, and whether the politicization thereof is a problem.

Anyway. The point is, it's a general class of fallacy where "X is bad" is replied to with "there's no clear line between X and not-X, ergo X doesn't exist and it's not a problem that I'm X-ing on your lawn".

Expand full comment

Would you still object if they replaced 'ideology' with 'utility function'?

Because to me, that's closer to what they're saying, at least the ones I'm familiar with. Rationality doesn't *do* anything without a utility function telling it *what* to be rational about pursuing, and I think they use 'ideology' to refer to those motivating factors.

It's not saying 'there's no such thing as rationality because ideology prevents it,' it's saying 'everyone has an ideology *in addition to* being rational or irrational (separate axes).' And if you forget to interrogate your ideology because you think you don't have one because you think you're driven by pure rationality, then you run the risk of accidentally developing a terrible and self-serving ideology in the shadows.

Of course, I think that objection to the rationalist movement is largely covered in the 'we have noticed the skulls' paradigm, or at least should be. On the other hand, if you look into some of the banished culture war discussions on reddit, it's pretty clear that there's still some very strong ideology at play - those discussions are better than other culture war discussions by virtue of being much more rational than normal, but that doesn't remove the influence of ideology from them.

Expand full comment

An undergrad philosophy professor when challenged with "can anyone really imagine infinity?" agreed with you pithily: "Humility on behalf of others is simply a confused form of contempt."

Expand full comment

It's clear to me that everyone is biased. I interpret "bias" as what a person's axiom system is. Different people have different self-evident truths. It's certainly the case that most people have a self-contradictory set of beliefs, and probably the case that all people do, but even if that's not the case, certainly there are different axiom systems that give different sets of "biased" beliefs, that are still correct within the systems.

I really don't think Alex Jones is less rational than the average person. Intelligent people rather drastically overestimate the intelligence and rationality of the average person. The average person believes all kinds of crazy things. I think Alex Jones is probably right about average in rationality. In any case, saying it's obvious that he is much less rational than average needs some support.

It's not clear to me that global warming is necessarily relevant, for many definitions of relevant. (I'd rather say that global warming is probably not an urgent problem, or rather that it wouldn't be if the world enacted a high tax on gasoline and was rational about nuclear power.) It's also not at all clear to me that someone saying that global warming is not an urgent problem must be any more biased or politicized than anyone else.

Perfect rationality is certainly possible: there's a mountain of mathematics that has been proven to be perfectly rational (within a certain axiom system).

Expand full comment

I think that for some people the motion of ideas has a feeling. For others, what they are experiencing emotionally or through senses is a much stronger sensation. What is considered a path to self-improvement will look different for these groups. Looking at a math problem on a page, one person will see the ink, the paper, feel the concept of number and see the number concepts interacting. Someone else will see the paper, the writing, feel frustration at a teacher, the smell of the room and a feeling of loneliness, say, and the sensation of the number concepts will go largely unrecognized. Both people will however use the word “math” to describe their activity in those moments. Of course math B is “political” if one thinks all human experience is politicizable. Math A will seem qualitatively different from other types of interpersonal activity, to those that experience it, and will feel quite far from other models of politicalness.

Expand full comment

I think that for a lot of people, "rationality" is really "received wisdom." This makes sense in most cases (as you've discussed in several posts before) - the accumulated knowledge of culture is probably smarter than what any person could figure out on their own. Alex Jones is labeled "irrational" because he goes way off into left field with kooky theories.

The proper source of received wisdom of course varies based on group affiliation. Conservatives look to the founding fathers and Tucker Carlson, liberals look to FDR and the New York Times, socialists look to Marx and uh, Jacobin(?), etc. But most groups have clear foundational old sources of wisdom as well as recognized modern institutions that are trusted and respected as sources of truth.

Importantly, the idea that rationality is essentially a matter of rejecting conventional wisdom and overcoming cognitive biases is a pretty big ideological leap. Like most ideologies, it is defensible, but it is definitely not how most people think about it. For most people, rationality = conformity, irrationality = nonconformity. Rationalism, as an ideological structure, flips this around and suggests that conformity is irrationality while nonconforming free-thinking is held as the highest ideal.

Given all this, I think it makes a lot of sense that people will be left scratching their heads when they hear someone say, "we're the rationalist community. We're all about avoiding bias and seeing objective truth! Also, we're obsessed with superintelligent AI, half of us have arranged to have our brain frozen when we die, and our comments sections get REALLY racist." Like, there really is a pretty distinctive ideology at work with its own very noticeable quirks. It all hangs together and makes sense once you understand where it is coming from, but that is true of any well-developed ideology.

Expand full comment

I think it's important to distinguish between descriptive and prescriptive statements. If you are doing descriptive work, then yeah, you can be objective. But even then what you say will be influenced by your perspective, as the story about three blind men describing an elephant teaches us. However, when it comes to prescriptive statements, there is no such thing as being objective. If you are saying what should be done, rather than what is, you are expressing a personal preference. All preferences are subjective.

Here is where those two intersect. Choosing what to describe is a preference. This is very common in political discussions, where each side selects which facts to present. And the thing is, you have to be selective about which facts to present - you simply can't present them all. So how do we select which facts to present, in order to be rational or objective, or unbiased? These terms haven't even been defined - they are just nebulous concepts that we all think we have defined in the same way for ourselves when we haven't. Whatever we think of those terms, I think keeping in mind the idea that you cannot separate yourself from your ideology is a very useful tool in trying to become more "rational" or "unbiased."

Expand full comment

From Asimov, a related idea: https://hermiene.net/essays-trans/relativity_of_wrong.html TLDR: It's ridiculous to throw up your hands and say all theories are wrong; some are much less wrong than others.

Expand full comment

This feels similar in some way to arguments against meritocracy. "all science is political" is usually leveled at institutional science, with the implication that those institutions are falsely claiming that they *already are* up against the lightspeed-rationality limit. A charitable read is that "all science is political" is a way of telling people who are saying "we know objective truth" that "you should remember that you can be a lot better".

Expand full comment

> I believe there's an important, specific criticism to make of this group, and that "they are doing biased, political science" is the most natural and accurate way to make this criticism.

This only applies if you believe that science is categorically different from literature. If, on the other hand, you believe that science is just another form of narrative, then any statements anyone can make for or against global warming are merely expressions of their political will. The statement "the Earth has warmed X degrees in the past Y years" is not categorically different from saying "we should/shouldn't ban immigration" or "the total number of genders is N" or "people of race A are superior to people of race B", etc.

Expand full comment

His other "cons" are pretty bad too.

2. They have too much faith in the power of their own cognition

Counterpoint: this community is where I even got a clear concept of Outside View. One might say it's _focused_ on failures in cognition. Maybe there's too much emphasis on Inside View, according to Freddie? But then, frankly, his ideology is more esoteric than the ideology of some rationalists. Seems like an isolated demand for rigour, backed by nothing. How does one even respond to this accusation?

3. Their indifference to human emotion and social cues is not integrity, it's a refusal to confront the material importance of emotions

Frankly, I'm not sure if I get what he means here, so whatever.

4. Eliezer Yudkowsky is their king and he's kind of an asshole

I don't think that's the case. But maybe I'm wrong. Again, seems like a weird ad hominem.

5. We're all just scrambling around trying to find meaning and understanding with brains that are the imperfect products of evolution

...which is, again, core rationalism(TM) stuff. Like, lots of sequences are basically this. What's the point?

Freddie really needs to add how he thinks rationalists should improve, because these points seem not to be very actionable.

Expand full comment

To be fair to Freddie deBoer, though:

> 1. There is no such thing as "rationality" that is free from ideology

I don't know what precisely he means by this. However, IMO the capital-R rationalist movement definitely has some ideological stances; i.e. positions on certain issues where, should you take the wrong stance, you're automatically branded as a non-Rationalist (and possibly just a crazy/stupid/toxic person altogether). Rationality as a methodology may be ideology-free, but Rationalists as people are not.

> 2. They have too much faith in the power of their own cognition

Absolutely true; the general idea is that intelligence is the only attribute that really matters, and that a sufficiently intelligent agent can accomplish virtually anything (plus or minus some laws of physics).

> 3. Their indifference to human emotion and social cues is not integrity, it's a refusal to confront the material importance of emotions

I don't think it's a *refusal*, per se, but there's no denying that the proportion of people on the autism spectrum is higher among Rationalists than among the general population. But I think it's unfair to conflate autism with willful refusal, so that's a point against deBoer here.

> 4. Eliezer Yudkowsky is their king and he's kind of an asshole

It doesn't matter whether he's an asshole or not; lots of people are assholes. However, IMO the Rationalist movement holds Yudkowsky in higher regard than is perhaps warranted.

> the notion that we are on the verge of radical life extension is profoundly optimistic based on current technology.

Calling it "profoundly optimistic" is IMO an *understatement*.

Expand full comment

I think this depends on whether you interpret being part of the Rationalist community as either a commitment to being more rational (which should make you self-critical enough to become continually more rational) or a claim to already be more rational than most people (which will make you less open to criticism, and therefore stuck in the same place). There's an element of both in the movement, and many Rationalists could probably benefit from being more open to criticism, but I suspect that being part of the Rationalist community makes people more open to new ideas than they'd otherwise be. My assumption is that the average Rationalist was always very confident in their own conclusions, and becoming part of the community made them more open to new ideas. There's probably a weird effect where becoming a Rationalist makes you more confident in your conclusions, conditional on your understanding of the world being correct, and as long as you remain open to new information I think that's probably a good thing.

It could of course be the other way around and being "Rational" just makes you an insufferably arrogant know-it-all, but at the very least the Effective Altruism crowd I'm part of loves being told that they might be wrong about everything!

Expand full comment

One possible response would be

"People who identify as rationalists tend to have specific blindspots. They are not just imperfectly rational (everyone is), they are imperfect in distinct ways"

Similar to the way that it always seemed to be _engineers_ who argued for Young Earth Creationism, or that 'New Atheists' were so proud of seeing through religion that they never noticed their sexism, etc etc

Expand full comment

So bad it's good 😁

Expand full comment

I was surprised by this post and how Scott chose to write about such a basic, 101, idea. It's like seeing a college math professor lecture earnestly about how addition is... addition.

Which made me wonder why. Is it that Scott is so frustrated with so many people unable to do simple addition? Is he speaking to the "beginner" readers and trying to catch them up? Or does he find that the usual discourse on relativism (philosophical/logical/sociological) inevitably gets bogged down in a fog of college dorm room weed smoke? (Reading the comments, it appears that it's inescapable even when simplified to a spare denominator.)

I think this was the darkest and most depressing post I've read yet on SSC or ACX.

Expand full comment

Love this idea by Meehl: 'Multiple Napoleons fallacy: "It's not real to us, but it's 'real' to him". "So what if he thinks he's Napoleon?" There is a distinction between reality and delusion that is important to make when assessing a patient and so the consideration of comparative realities can mislead and distract from the importance of a patient's delusion to a diagnostic decision.'

Expand full comment


We’re happy to grant that people are fighting for justice"

should be

We’re happy to grant that people are "fighting for justice"

Expand full comment

I will say this with as much kindness as I can manage, but you're really missing the point of the "anti-rationalist" critiques you are criticizing. The rationalist viewpoint, at least as it manifests in these parts, ABSOLUTELY smuggles in a bunch of ideology and value judgements. You even started to grind into this problem about a decade ago (whose utilitarianism).

For one very obvious example (and at the risk of being culture-war adjacent), my read of the rationalist community on the subject of human difference is that they genuinely believe that race, sex and gender don't/shouldn't matter, only ability does/should. In fact, debates about the degree to which race, sex and gender DO matter are framed in terms of ability (in turn, this is frequently reduced to IQ).

(I explicitly don't want to argue or start an argument about whether this perspective on race/sex/gender is justified or correct or moral. The point isn't whether the viewpoint is correct, rather that it is one commonly held in these parts)

I once read an interesting point about Japanese artist Hokusai's Views of Mt Fuji. After this series was produced, the image of Mt. Fuji came to dominate the period's art so strongly that even its absence was a choice laden with meaning. For a more contemporary example, consider Tolkien's influence on Fantasy: you can write Tolkienesque Elves/Dwarves/Wizards, you can try a new take on Tolkienesque Elves/Dwarves/Wizards, or you can choose to avoid writing Tolkienesque fantasy by removing Elves/Dwarves/Wizards from your setting completely. But these are your only three options: avoiding comparison to Tolkien, at least on some level, is not an option for a fantasy writer at this time.

Race, Sex and Gender in contemporary politics are kinda like Mt. Fuji or Tolkien. They cast such large shadows that trying to ignore them is a kind of political project in and of itself. You can't be neutral on this stuff, and not because doing so is impossible in the abstract "no one can be totally unbiased" sense of "impossible". Rather, because a belief that one actually SHOULD be neutral on these subjects is ideologically motivated.

If I wanted to go full post-modern-neo-marxist I'd go on to point out that this anti-ideology matches exactly the demographic/class interests of the rationalists as a whole. Compared to the general population, we skew heavily towards white, male, smart, and DSM-diagnosible. Assuming self interest, and given that demographic data and nothing else, the political implications of the rationalist project are utterly predictable.

Please understand I say all this as an intelligent, "neurodivergent" white man who considers himself a rationalist. Y'all are my ingroup, and the people Scott was quoting definitely aren't. But let's not blind ourselves to the political implications of this community just because we don't like the fact that there ARE political implications.

Expand full comment

Isn't Alex Jones possibly pretty rational individually, milking right-wing morons for their money by selling worthless supplements? "Crazy like a fox"? How irrational is it if it works for him?

Obviously his message and content are utterly irrational and bonkers.

Expand full comment

What if Alex Jones' rationality is beside the point? What if he is the heir of Homer rather than Newton? In Jordan Peterson's model, "The world can be validly construed as a forum for action, or as a place of things." Explanations of the world in the former (Homeric) sense focus on describing the meaning of things and ultimately try to tell people what they should do; explanations of the world in the latter (Newtonian) sense seek to create an "increasingly precise determination of the consensually-validatable properties of things, and for efficient utilization of precisely-determined things as tools".

If Alex Jones accurately predicts the ascendance of the Blue tribe and how that translates into the cultural, economic and literal* destruction of the Red tribe, does it really matter how precise the metaphor is? Or is it more important that he craft a powerful metaphor that mobilizes his tribe to resist?

*only a slight overstatement, viz., the unprecedented decrease in life expectancy due to deaths of despair

Expand full comment

The two words, "Less wrong", really do say it all. Perhaps we can just rename rationalists as "Lesswrongers" and quiet the haters.

Expand full comment

I think some of the problem is the perception of arrogance on the part of self-described rationalists by critics. They feel that the rationalists are claiming a sort of superiority - that they can think properly whereas non-rationalists can't, that they are cool, objective, non-emotive truth-seekers and everyone else is a dumb hysteric.

I don't think that's necessarily true (but, um, I have certainly seen some people online describing themselves as rational/rationalist and being a bit arrogant over their superior brainpower) but it's the old clash between Reason and Emotion, with emotion being considered inferior and feminine versus masculine reason and superior brainpower.

People who feel very strongly about things will naturally feel "well, why shouldn't I be angry or upset or enthusiastic or whatever about this? The situation *should* make you angry! Who gave you the right to play the scolding parent treating me like a toddler having a tantrum?" in response to requests to calm down and think about this without emotion. I greatly prefer the "please just calm down and be reasonable" approach myself, but I do see how it can be off-putting. Sometimes you just have to let someone vent before they can calm down and be reasonable, and sometimes the dispassionate approach can seem like you're staking a claim to be the 'better' (as in 'more correct, more right, more reasonable, more adult and mature person' sense) participant in the conversation.

There are terribly irrational people out there, truly. But there are some self-described rationalists who aren't much better, who do seem to think you can reduce people and human problems down to a mechanical box where you just pull this lever and press that button and the correct answer comes out.

I'm one of the people who mocked the "feminist glaciology" thing, but there could be a point there - before it got exaggerated and co-opted into the culture wars - that other viewpoints can make valuable contributions, and maybe you should listen to the folk tales from people who have been living around glaciers all their lives because they might have some information there (which is not to say you immediately junk all science and become a shaman). Science does not appear fully formed out of a void, it is done by scientists who are people and who have the same biases and flaws as every human. I think we see this in the replication crisis, where very definitely people have had agendas or preconceptions which they shoe-horn in to their studies, and massage the results, and then wave this around as "science proves x, y or z!" and/or interested parties like to wave this around as "science proves x, y or z!"

We see it most clearly and obviously in social science and psychology experiments, particularly those from the 70s such as the Stanford Prison Experiment where it turns out the originator deliberately set it up to give him results to support his political ideology around prison reform, and what was dubbed the "sex raft" experiment which I only learned of via Tumblr (all I can say is, look, the 70s were weird and everyone was on drugs) https://www.forbes.com/sites/joanneshurvell/2019/01/15/the-1973-raft-experiment-sex-and-sedition-at-sea-now-a-fascinating-film/?sh=6948990271e0


"One of the most interesting things about the experiment was that it didn’t work out as Santiago expected so he tries to manipulate the participants and then they turn against him."

"Yes, and I think that’s actually an interesting result. That’s why it’s great that the ‘guinea pigs’ are brought together again and are able to speak about and analyse their experiences in my film. The stories the participants told were completely different than what Santiago had written in his journals."

That's not "objective, rational" science, that is "everyone has got biases and an ideology" in action.

Expand full comment

"Statements like “it can’t be separated from ideology” risk putting everyone on so relative a footing that Alex Jones’ version of rationality is no worse than anyone else’s."

Well that doesn't quite scan.

Michael Mann is a major figure in climatology whose work was critically important in understanding the reality of human effects on climate change. He was also influenced by the predominant social order he lives in, and the things he does cannot help but be shaped by it - he has biases and blind spots like anyone else. The same is true of Alex Jones. And yet, one of these is justifiably seen as a respectable scholar and scientist, and the latter is seen as a bizarre crank who rants about gay frogs. We can even acknowledge that both of them are thinkers who fail to be perfectly rational all the time and still consider that difference significant.

To take your race analogy, this is less like pointing out that humans can't quite run as fast as the speed of light and more like pointing out that humans can't quite sprint as fast as racehorses. Nobody grows up independent of the culture that raised them (or the instincts they have as human beings), and as a result, everyone will have some biases. Some people will have fewer biases or be more aware of them, but I don't think it helps to call that "good". Nobody is "good" on rationality; we're all just different shades of more or less bad. Isn't the rationalist creed something along the lines of "CONSTANT VIGILANCE!" because of that?

This seems, at best, like a heuristic that will make certain people become overconfident (compare to "Cowpox of Doubt") in their own ability to reason. At its worst, it seems like the denial of very basic truths because they're inconvenient to the ideas behind rationalism. Which doesn't seem very vigilant.

Expand full comment

It's more a critique about selective application of the tools of rationalism, e.g. of intellectual buttressing and of critiques of irrationality. It's less about biased methodology, and more about biased cause selection and the harms of elevating certain voices (when they say one thing that is rational).

Expand full comment

I admit I'm very lukewarm about rationality as the rationality community seems to define it. In large part, I'm just not sure I understand what they mean by "rationality." In small part, it's probably because I associate the term "rationality" with a sort of "science, f--- yeah! Screw those stupid religious people!" mentality. That's probably not a fair association, but I choose it anyway.

What that means is, I'm probably sympathetic to the claims Scott is critiquing in the post. It's not that I disagree with Scott. It's just that my knee jerks in that direction.

As a partial aside, I read a portion, maybe 20%, of Eliezer Yudkowsky's A to Z book about rationality (I hope I'm not misspelling his name....I'm too lazy to look it up), and its tone seems to substantiate some of my suspicions and reservations about "rationality." I understand that he wrote that book a while ago and that in his preface he says he's taken on a different attitude about his approach to some of the things there. But it's hard for me to shake it off. At any rate, I'm probably being unfair.

Expand full comment

Bookmarking this article for the inevitable thousand times that I’ll need to share this perspective with interlocutors

Expand full comment

Feels like this misses the key point, IMO. Here's something extremely rational: schizophrenic paranoid fantasies. All the conclusions follow from their perceptions! How could we get more scientific? The only thing that prevents us from believing that the CIA is tapping our phones is some sort of trust in institutions, which communicate through media channels stuff like "we promise that we're not following you specifically around".

So, we're not necessarily being any more or less rational than the paranoid schizophrenic when we decide that the CIA isn't tracking us (actually we're probably being less rational, because we're taking more on faith), it's more a question of "who do you trust and what do you place your faith in?"

And once you start poking at those questions, the idea of rationality flies out the window, because rationality (and objectivity as well) is a social tool rather than an end in itself. What's political about rationality is what you choose to do with that tool, and the "true" grounds in terms of ethics, value, etc. (i.e. what one's rationality is deployed *toward*) are only found when one reads between the lines, at least on this blog or in this community.

Remember that objectivity itself was invented by Archimedes and only became a "thing" back in the 1600s, when Galileo realized that our senses were deceiving us about the cosmos, thus toppling the whole "contemplative natural philosophy" thing that Aristotle started up way back when. And objectivity is useful, but we can't be objective about everything all the time, so politics creeps in when we decide what we're going to be objective *about*.

Do you see what I'm getting at here?

Expand full comment

I think there might be some misunderstandings between the two camps (rationalists v. "you're not as rational as you think"ists).

One is the conflation of being rational in approach, and then the Rationalist Community - the specific people and organization of known rationalists. It's a lot easier to attack the Rationalist Community than it is the ideal of thinking rationally. For instance, Rationalists take stances on both sides of just about every cultural and empirical question, especially in regards to politics. It sure seems odd to claim rational thinking and then have massive disagreements on core issues.

Secondly, it's entirely reasonable to say that Alex Jones is irrational, and therefore someone can be MORE rational. It's another thing to then claim that the rationality of the thinking involved is free from ideology, or even to a reasonably good extent. If we look at rationality in thinking as a continuum, perhaps a 1-10 scale where Alex Jones is a 2 and perfect rationality is a 10, most rationalists might be around a 5, with some particular outliers hitting a 6. It's good to try to be more rational, but there needs to be some epistemic humility. Nobody here is Spock, let alone a perfectly rational AI.

Part of the humility has to come from a realization that not all questions have rationalist solutions or are built on rationalist understandings. Whether abortion should be allowed, for instance, is not a great test case for rationalist thinking, because both existing sides are built heavily from feelings-based understandings of what is good. You're not going to convince someone who is pro life that killing babies is okay because X% of expecting mothers will have complications and may die, no matter how high X is. Similarly, you aren't going to convince pro choice advocates that they should give up on abortion by showing that a baby has a beating heart or some other metric. There are simply questions that science cannot solve, because they are a different kind of question. Science cannot tell us what morality is. If we define morality prior to applying science, we can use scientific methods to determine a better approach to fulfilling that moral prior, but that still doesn't tell us anything about that moral prior.

Rationalism is not an all-encompassing approach to figuring out life. As long as we don't treat it like it is, and instead use it as one of several methods to improve our existence, that's great. I think a lot of the animosity people have for Rationality is when it's treated as a first and last solution (as if it could help create our priors and morality, and then also solve them).

Expand full comment

Absolutely essential to be able to make those distinctions. Otherwise you end up with a lot of false equivalencies. People who are being abused in various ways often get told this type of thing; Neither of us is perfect! I curse and scream at you sometimes, but you sometimes load the dishwasher wrong. I cheat on you, but you bite your fork! I embezzled from our partnership, but you didn't consult me on hiring that new receptionist! We're both in the wrong. But I magnanimously forgive you for your faults and errors.

Yeah, no.

Expand full comment

When people say “everyone is biased”, they don’t mean “no one can be rational”. What they mean is “when faced with a ‘rational’ decision which, correctly answered, requires self-sacrifice, most people will take the incorrect, or ‘irrational’, path that leads to the avoidance of personal harm.”

Further, they mean that this reality has substantial impact on how society functions, and can’t be ignored. To get large groups of people to act in ways consistent with Rationalism, you need a strong social structure, like a government or religion, that forces quite a bit of self sacrifice for the greater longer-term good. Then, you can argue over who sacrifices what, and you get politics...

In other words, the logical outcome of a large group of self-interested people (even smart ones), in the presence of resource constraints, leads to politics as we know it.

Expand full comment

The main point of this post seems straightforwardly incorrect. You're assuming that there's a unique position which is the opposite of bad, and that's what we should call good. But there may be many different positions which are all opposite-of-bad in some ways, with the choice between them being an inherently ideological decision.

Two examples. Firstly, art. There are some egregious examples of bad art that almost everyone will agree look ridiculous. And there have been many technical improvements in art over time, like understanding perspective and anatomy and so on, in a way which allowed artists to paint more realistic paintings. But it would be a mistake to extrapolate this to say that photorealistic art is "good" art in a way that's independent of ideology. In fact, there are many styles of art which are intentionally non-photorealistic; choosing which of those styles to call "good art" involves picking an ideological side.

Secondly, human welfare. We all agree that constant unwanted torture constitutes a bad life. Okay, so based on this, can we define a good life in an ideology-free way? Certainly not; there are many different opinions on what it means to live a good life, and one's personal choice between them may come down to very subjective (ideological) factors.

I'm not saying that good rationality is as inherently ideological as good art or good welfare. But the argument you're making here can't succeed without distinguishing between these three domains.

A second point: it's easy to argue that criticisms of rationality for not being totally ideology-free are falling into the fallacy of grey. But many rationalists actually do believe that (something like) bayesianism is perfectly rational in an ideology-free way, which then helps justify a specific idea of what being more rational looks like in humans. For more mainstream figures like Sam Harris, assertions that science is ideology-free play a similar role (see quote below). So, *as stated*, the criticisms you mention are actually attacking a fairly important pillar of rationalist and rationalist-adjacent beliefs. Yes, their conclusions do seem overblown; but before dismissing the criticism as irrelevant, you'll need to argue that the rationality community's beliefs about practical self-improvement don't actually rely on its claims about ideology-free "perfect rationality". I think this is *doable*, but I'm not sure it's actually been done.

From the Klein-Harris debate:

"Ezra Klein: You don’t realize when you keep saying that everybody else is thinking tribally, but you’re not, that that is our disagreement.

Sam Harris: Well, no, because I know I’m not thinking tribally —

Ezra Klein: Well, that is our disagreement."

Expand full comment

I think you need to distinguish two things. I think there is a perfectly coherent (even correct imo) view that rationality, strictly conceived, only means following the probability laws, while having priors that are a good fit for our universe is a different thing and needs a different name.

Expand full comment

Rather than argue about semantics, I think it's more useful to frame language in context of, well, usefulness. It's useful to be able to describe Alex Jones as irrational. Sometimes it's also useful to acknowledge that, despite best efforts, our work will still contain bias. Both "sides" can use a straw man to make the other side look silly if semantics is all we care about, but it shouldn't be.

Expand full comment

Friend - in this regard, you are not mad or evil. In this context, it is entirely correct for you to be meritocratic and not egalitarian or cancellatory.

And why is it correct to be meritocratic? Because you are estimating on a well defined domain (the correctness of one's prediction) with a numerical score.

Expand full comment

I think Jones is irrational, but an analogy is probably advertising. To a visitor arriving from a UFO, advertising would look just totally irrational. Everything is 'the best'. You should eat the same hamburger all the time. This streaming channel has everything you want. AT&T's new plan will give you near orgasmic intensity. But we don't care, because we understand selling is a type of intellectual game where you are disconnected from rationality but there are some boundaries (i.e. you can't outright lie or make objectively false claims).

Jones might be thought of as a type of advertising. He says a shooting was a hoax; he gets clicks and sells some of his BS drink powder. He says you should buy guns in case of a random shooter. Sells more powder. Isn't it a contradiction? If shootings are a hoax, why prepare for one? Doesn't matter.

Expand full comment

With all due respect, I find the issue of how rational a rationalist may be irrelevant. Why? Because it is impossible to account for every possible bias or to construct a completely perfect argument that can't be tested by rebuttal. The point of rationalism is to accept the provisional nature of our theses. What I find absolutely advantageous is, in the words of Karl Popper, the falsifiability of rationalism.

I know that any argument, or any theory, or any research is provisional. If better data is found, if better arguments are presented -based on data and its replicability- then I will change my mind. Our nemesis is dogma.

As society becomes so polarized based on what party you support irrespective of candidate or platform, as algorithms feed each of us with more of the same, rationalism can be construed as one of the last holdouts of the principles of the enlightenment and a safe place to discuss ideas rooted in facts. You have to be humble, you need to be open to a better set of data and explanation, you need to essentially know that your views WILL inevitably change over time.

Let's not get distracted by post-modern arguments, IMHO.

Be well, everyone.

Expand full comment

You've already admitted that no water is 100% pure H2O, so why won't you drink from my delicious lead-and-feces-and-plutonium-filled well?

Expand full comment

I think the whole essay is cheapened by using Alex Jones as an example.

I personally haven’t seen a lot of evidence for his irrationality.

In the stuff I’ve seen he is excitable, loud, hyperbolic, and working with a different set of facts.

This isn’t the same as irrationality and seems to undermine the purported rigor of the concept.

Expand full comment

I think you have missed the essence of the criticism being leveled. It is best expressed in your quotes as: "(Everyone’s biased), we’re just not trying to deny it like they are." To be clear, the criticism isn't really, that everyone is biased, it is that many in the rationalist community do a bad job of acknowledging and modelling their own bias.

Often, the people they accuse of being irrational are just as, if not more, rational than the person doing the accusing. Not always, of course. When you accuse Alex Jones of being irrational, fair criticism. But when other "rationalists" do things like accuse progressives of being irrationally "woke", quite often the "rationalist" is the one who is being relatively more irrational and ignorant.

And in general, this belief that one is "rational" is quite dangerous. It leads to overconfidence and an inability to internalize new information. Ideally, when a rationalist writes something about some kind of cognitive bias, this should make the readers feel less confident in their positions, as some of their positions will be affected by that bias. In practice I think it tends to make readers more confident in their positions that aren't obviously affected by that bias, while their positions that are affected just get rationalized away. This leads to a feedback loop wherein some readers, upon learning about their own and others' bias, become generally more confident in themselves instead of being generally more skeptical as a result of knowing about these biases.

Expand full comment

Scott simplifies the notion of ranking individuals on some sort of objective rationality scale. In actuality, we dwell among pools of rationality that are sometimes overlapping and sometimes independent.

E.g. Buddhist monks who self-immolate might well be almost purely rational. Clearly, rationality is a component of one's perception, since "the self" is necessarily part of the whole equation.

So when considering bias, the question is not our overall ranking but where the blind spots are located. Certainly, blind spots are certain -- and genuinely blind; if we are incapable of viewing the world through the eyes of a Buddhist monk, then we can only guess the rules of rationality that accompany a being with such awareness.

Humility -- admitting that blind spots are inevitable, and, by definition, total -- is I think the first step toward sunshine, as well as a necessary component of rationality writ large. Reading through the comments below there's much lively discussion of fundamental components of rationality, all of which nicely illustrates the different lenses, time frames, and overall orientations of individual awareness. (So I decided to add mine!)

Expand full comment

I think the issue is people have a mental model of rationality that's strongly bimodal. Either/or. Like a barrel of wine with a drop of **** in it is really a barrel of ****, arguments with a teensy bit of irrationality are still irrational. That's probably overstating things, but I would argue that there are contexts in which a tiny bit of unexamined, unconscious bias in favor of a particular point of view does indeed lead to conclusions that are way wide of the mark.

Expand full comment

Nobody can achieve perfect rationality. A potential problem is that if you define your community as being "rational" oriented, you rule out the possibility of dealing with things from a conflict rather than a mistake perspective. Conflict (accepting ideological-ness) sounds worse, but it also raises the possibility of compromise. If the other side is mistaken, there is nothing to do but for them to accept their mistakenness.

I think the rationality community is much more rational than most people. But like everyone their beliefs are in part motivated reasoning that supports their generally educated middle-upper class positions in life. The problem with treating everything as a debate is that many people would reject that this is even a debate in the first place, rather than one group trying to thrust their personal preferences upon another.

Expand full comment

For many years, I have described this in public with the term “reality facing”, because there are clear cut examples.

There isn’t “perfectly reality facing” or “perfectly reality denying”.

But there is George Carlin, and there is North Korea.

I want to be on the George Carlin end of the spectrum.

Expand full comment

The rationalist community seems to think it is better at being rational while others are definitely irrational. Yet it also puts Bayes' theorem on a pedestal as a model of how humans reason and behave. Now, is it rational to hold these two positions simultaneously?

Expand full comment

if you can atacc, you can also protecc

Expand full comment

Reminds me of Isaac Asimov's "Relativity of Wrong." "If you think Alex Jones is perfectly rational, you're wrong. If you think Eliezer Yudkowsky is perfectly rational, you're wrong too. But if you think that Eliezer Yudkowsky is exactly as irrational as Alex Jones, you're wronger than both of them put together."

Expand full comment



I agree with Scott's rebuttal here, but I also think it's a narrow interpretation of the critique: Not necessarily incorrect in capturing its occurrence in the wild, but I think there's maybe a more interesting perspective one could steelman it into. I'll do my best to vaguely gesture at what I mean below:

A mind can do many different things, create art, scheme, do mathematics, be sociable, spin paranoid conspiracy theories, etc. Each of us lives in their own, unique reality tunnel, which creatively uses these diverse faculties in order to construct an experiential frame through which we interpret the world and interface with other people. We are political animals and inhabiting, constructing and negotiating between these reality tunnels is the organic function of our minds.

The issue with Scott's counter here is that it interprets critiques of the choice of a rationalist reality tunnel as specific failures of rationality within the tunnel. The statement that rationality is not free from ideology is not meant to point at the idea that rationalists fail in their execution of rationality due to biases, but that the imposition of the rationalist frame is political regardless of how well rationality is performed within that frame: Even if there was a rationalist who was free from political bias in his reasoning, the fact that he has entered into and remains in the rationalist reality tunnel is already a political act. We are animals before we are thinkers, and the specific kinds of animals that we are happen to be political.

Most people would agree that Alex Jones is irrational, but that doesn't mean that people are broadly signed on to rationalism as a totalizing reality tunnel. Who is to say that the tune of rationalism is the one that we ought to play at this specific volume? Maybe we instead ought to maintain a variety of different ontological pastures that work together more organically, without being overly concerned about having an overarching framework that integrates them. That definitely seems to be how most people operate. Did anyone ever do an RCT on whether it is instrumentally rational to always aim for epistemic rationality?

To be clear, I'm on Scott's side here and generally feel that the rationalist reality tunnel ought to have more weight (Galef for president), but I think there is a perspective here worth contending with. Also, Scott, if you're reading this, please check out "High Weirdness" by Erik Davis. I think it would fit well into your previous forays into early psychedelicists, and I'd love to see the interference pattern of your and Erik's mind.

Expand full comment

> Talking about the impossibility of true rationality or objectivity might feel humble - you're admitting you can't do this difficult thing. But analyzed more carefully, it becomes really arrogant. You're admitting there are people worse than you - Alex Jones, the fossil fuel lobby, etc. You're just saying it's impossible to do better.

Most claims of arrogance are like this, where one end of the spectrum is "arrogant" and the other end is also "arrogant". When people complain about arrogance, they often mean something completely different.

I was criticized many times in school for my arrogance. In particular, the idea was that I radiated the expectation that if I knew something, everyone else would know it too.

I thought a fair bit about this and completely failed to understand how the position that anything I knew was also known to everybody else could be described as "arrogance". It's nothing other than the statement that out of all the people in the world, I'm the one who knows the very least.

Conversely, if I approached every conversation with the idea that the other party is unlikely to know anything that I know, I suspect that would do nothing to improve a reputation for arrogance. That would be "talking down".


As to perception of the "rationalist community", I don't like it either. My main actual thought on the subject is that, years ago, I discovered some essays on Overcoming Bias by Eliezer Yudkowsky, and I liked them, but I didn't seek any more out. My view was that the stated goals were good but the essays were more entertainment than anything else, seeking to convince through eloquence rather than correctness. I prefer an argumentative style that sounds worse when relying on false premises than it does when relying on true premises.

Expand full comment

It's the difference between "my friend Jeff is kind" and "I am part of the Kindness Community, where we practice theoretically consistent kindness." One makes Jeff sound like a good bro, and one makes the speaker sound like an out of touch narcissist.

And no, one person being bad at a virtue doesn't mean you get to be good at it. Something something beam in your own eye something something.

Expand full comment

Arguing that there is no view-from-nowhere is no different than arguing that cognitive biases (like loss-aversion, recency-bias etc) may restrict one's ability to be fully rational. I don't think acknowledging that one's rationality is bounded implies that it's futile to try getting better within those bounds. If anything, failing to acknowledge this potential bias will increase the likelihood that you make prediction errors.

Expand full comment

Re: isolated demand for rigor. I'm not so sure about this. I expect that a "Kindness Community" of much significance would probably get a lot of flak too. A lot of people have poor opinions of MENSA or their membership. When an ideal is elevated to a group's name or raison d'être, it's not unreasonable to take that as a claim of some accomplishment rather than just a platonic goal.

Expand full comment

I think there's an argument that different pursuits need to clear different bars to be working toward a good outcome: one could say that trying to be "objectively rational" needs to come closer to view-from-nowhere objectivity to be doing a service rather than a disservice than trying to be "kind" or "thoughtful" does.

Expand full comment

"You're admitting there are people worse than you" - Is this supposed to say "you *aren't* admitting..."? It might be just me who got stuck on that. Great post.

Expand full comment

I think a lot of people saying "there's no true rationality/objectivity" are pushing back against a very real tendency of some people to refer to "objectivity" when what they're really talking about is something more akin to the status quo or the default assumptions of some dominant incumbent group. I doubt hardly anyone means that trying to be relatively more rational and objective is a completely pointless endeavor.

"I don't want any *politics* inserted into my football game, I just want to watch the pre-kickoff blue angel flyover sponsored by Lockheed Martin and enjoy my game in peace."

There's a related issue where people seem to confuse objectivity/rationality of process with some ineffable impossible-to-prove objectivity/rationality of people themselves. Objectivity of process would try to disclose any potential conflicts of interest in a deeply reported expose about a Democratic candidate. Objectivity of people tracks down a third cousin of the piece's author that donated to a Republican candidate 5 years ago to prove that the expose about the Democratic candidate is inherently untrustworthy, fit to be dismissed out-of-hand regardless of anything it might contain.

Trump actually said a Mexican American judge (as in, actually born in America) couldn't be objective enough to preside over a Trump University fraud case. Many years ago there was some controversy where Christian conservatives claimed that a gay judge couldn't be objective enough to preside over cases involving gay marriage, whereas presumably straight judges could.

The kind of "objectivity" sought in examples like these really seems to be about something different than just a drive to reason or think better, so it makes sense to push back by saying everyone does indeed have their biases/irrationality.

Expand full comment

At the point where you've made the assumption that rationality is the main axis of value, and not e.g. social skills or inner peace, you've made an ideological assumption, and all of your work proceeds from that assumption. Yes, it's true that some people are more rational than others. That's also not the only way to navigate human thought and behavior, and there are hidden costs to assuming it's the only or correct way in all context.

Expand full comment

This might be verging on culture war... but I think Scott is making the wrong assumptions about the people disagreeing with him on reddit.

For many, their idea of "good" is tied to their political views in the same way the baby-eating aliens tie their idea of good to baby-eating.

They will nod along with you that Alex Jones is irrational and bad and that's bad.

But not because they have a general principle that being rational and avoiding being biased by politics is good. Rather Alex Jones is bad because he picked the side of badness in their view.

Since their political position is inherently linked to their concept of "good", they see you trying to be less biased by politics as you trying to be less good - unless the politics you're trying to be less biased by is the politics of their opponents.

Expand full comment

I think there is a better way of considering these limitations, and I think they've been grappled with by some important philosophers for about a century now. I'm thinking most strongly of Alasdair MacIntyre's works, like "After Virtue" and "Whose Justice? Which Rationality?", neither of which argues that there is no real rationality; instead he effectively argues that rationality is influenced by things as basic as language and other larger contexts. But, notably, he talks about there being "truer" or "better" whole philosophical systems, and an idea that, over long amounts of time, worse philosophical traditions collapse due to internal conflicts they can't resolve, while better ones continue developing along their own lines, able to handle such conflicts. They may go into decline, or thrive at different times, but they are ultimately better able to survive the closer they are to an objectively better truth, logic, rationality, etc., whether we in the midst of it all are able to properly identify the objective standard or not (though he does present ways of judging whether something is better or worse, and goes into specifics of why some modern-era systems were largely set up to fail, along with how that's led to later ones mistakenly thinking there was no possibility of success).

Expand full comment

This is missing an important factor. Acknowledging your bias is the only way to correct for it. Being mindful of your emotional investment or political ideology is a key skill in defusing them. Essentially, being good at being rational requires being open, not defensive or idealistic, about your weaknesses.

Expand full comment

The problem is that the "rationalist" community, and also scientists, use the word "rational" to mean something like "right reasoning", while everyone else uses "rational" to mean the same thing it's meant since before the Latin word "ratio" even existed: an epistemology which renounces empirical observation and experimentation as illusory, real numbers and infinitesimals as non-existent (in a Platonic sense), and which requires a "foundation" of unquestionable axioms from which one constructs deductive proofs to arrive at absolutely certain universal and eternal truths.

Rationalist thought /does/ always take some arbitrary foundation as its base, and everything a rationalist believes depends critically (in the technical sense) on the precise set of axioms taken as a foundation. Also, rationalists /don't know that there is any other way of thinking/. For instance, the skeptics are said to have taught that we can know nothing, whereas actually the most-famous skeptic, Sextus Empiricus (so-called because of his empiricism), taught that we can't be 100% absolutely certain of anything. The rationalist defines knowledge as deductively-proven universal eternal truths. So you can't "know" that it's raining, because it isn't raining everywhere and at all times.

This is why people keep accusing the rationalist community of being high modernists--because the high modernists WERE rationalist, and we would be like them if we were in fact rationalists.

Expand full comment

but you ignore what I believe is the most important part of the argument - the counterproductive danger in saying my party is rational and yours is not. I have seen more than one person whose belief in his own rationality made him LESS rational. and politics is the perfect place for that.

what I hear people saying is: our knowledge of rationality is not good enough to judge politics on the frontier. we have only a very vague sense of what rational politics looks like. and claiming that you do correlates with being bad at rationality.

it is better to acknowledge reality as it is. we can't judge the rationality of politics beyond very basic stuff. and there is nothing rational in denying that.

moreover, politics corrupts. it is important to avoid such corruption whenever you can. the rationalist community can far too easily fall into the political weaponizing of the word "rational". when someone confuses rationality with some political opinions, he may be doomed.

I believe it really is better NOT to try to improve on politics. politics is the hard difficulty setting - and the sort of setting that may destroy your rationality in other areas of life. avoid it when you can! here lies destruction!

Expand full comment