523 Comments
Comment deleted

which are the canonical texts?

Comment deleted

as far as canons go, this one seems pretty willing to admit its flaws

Really? Can you find one example of Yudkowsky publicly admitting to being wrong about something? (Hopefully something consequential, but I'll accept more mundane examples.)

I hope the answer is yes; I admit I haven't done a full examination, but from what I've seen, he is brashly arrogant and hates to admit mistakes.

Five seconds of searching: https://twitter.com/ESYudkowsky/status/1238935900391755776

yeah, i think 'purveyor of rationality' is one of the few areas where never admitting you are wrong may be worse for your reputation.

To me, this reads as "I asked the wrong question", not "I got a wrong answer." If anything, it strikes me as ironic that this is the best he thinks he can do in admitting something he was "personally wrong about". I don't think this counts.

https://www.readthesequences.com/

second sentence

These aren't admitting being wrong about something. They're all tactical errors, not errors of rationality; "I was wrong to focus on...", not "I was wrong about..."

So, going in order, "I should have been more practical;" "I should have organized better;" "I was too rude." Same with the tweet Robert linked! "I asked the wrong question here," not "I got the wrong answer."

I hope it doesn't seem like I'm moving the goalposts here. But this isn't what I was looking for. I'll reiterate my question - what is something he's admitted being wrong *about*?

An example I distinctly remember is him saying he was wrong about quantitative easing (as in: he was worried it would cause massive inflation, it didn't), but I'm having trouble finding a place where he states this strongly. This LW post[0] mentions this, but not with very strong language ("I had some credence in inflationary worries before that experience, but not afterward..."); I do recall him stating it in stronger terms elsewhere though.

I can't find it right now, but I also remember him distinctly saying he was wrong to say that Bush and Gore seemed about the same, although that one's maybe somewhat questionable as it's a statement about a counterfactual.

[0] https://www.lesswrong.com/posts/LNKh22Crr5ujT85YM/after-critical-event-w-happens-they-still-won-t-believe-you

Literally the preface to the sequences. Literally a list of five times he was wrong. That is the *first thing* he wants people to know when they start reading his work.

How entertaining this space would be, if Yud were really the bumbling hypocrite people like to think of him as, instead of just a smart guy guilty of being both ambitious and socially awkward at the same time.

I think that's an overly liberal interpretation of religion. Rationality at its best does not pretend to have all the answers to life's mysteries (as might a religion) but rather tries to approach reality as it is, without the inherent biased human interpretations. Although at its worst rationality sometimes fails to live up to that standard, that does not mean the pursuit itself is inherently undermined by the same pitfalls as religion.

A better metaphor may be that rationality is an art form: Up for some interpretation, exists in many variations, but has some standards of quality that indicate how "good" it is. But of course everyone can express themselves in their own art and through practice become more accomplished.

Who are the dead prophets of rationality?

"Who Are The Dead Prophets of Rationality?" seems like the perfect title for a best-seller by a Deepak Chopra kind of person.

Or it could be a great article in Rolling Stone, if someone would only have the wit to name their new band Dead Prophets of Rationality.

Are there movements/communities/whatever-you-want-to-call-rationality about which an outsider *couldn't* make those claims?

This is a serious point that often gets dismissed out of hand by rationalists. Just because you have a framework for being good doesn't mean you are better at it than people with looser frameworks, so long as they're making an effort. If you're defending a particular approach to rationality, it's not good if your defense can be transposed to defend any religion as well: "Charles Manson wouldn't have done his thing if he believed in God, so you need to be a Catholic." In fact, demonstrating the ineffectiveness of "credentials from striving" in other areas of expertise is a common refrain of people like Tetlock, beloved by rationalists.

I do think rationalists are unusually good at getting things more right than others, but this is not just a function of striving, but rather a weird and illegible historical phenomenon of an unusually successful movement like the Cambridge Apostles. I think what goes underestimated and underemphasized about rationalism (and maybe should stay that way to avoid legibility) is how much of the success and insight of the rationality community is due to the accident of the people in it, including Scott himself, being decent, intelligent and brave (though not agreeable) beyond the level of your average academic community.

It's not particularly surprising, considering that today's academic communities have some pretty bad incentives..?

What do you think the word religion means, if you believe that having a set of texts that prescribe evidence-based standards for something, having people who advocate for specific non-supernatural standards, and advocating for seriously considering ideas regardless of how stupidly their most visible proponents present them all count as exclusionary religious behaviour?

Do you also believe biology and every other empirical science is a religion?

Comment deleted

If you asked me to justify my beliefs without mentioning any rationalist individuals, rationalist websites, or rationalist-adjacent books, I would. No leaders, no followers, nothing is sacred.

Do I pass your test?

Comment deleted

I think asking for uncorrelation is getting ahead of ourselves; if a source of information is *good*, then you want your beliefs to be correlated with it. If it's bad, then you want to be uncorrelated. We'd have to first answer "Are the rationalists right about things?" before we decide how much our beliefs should correlate.

But then we're really just focusing on having correct beliefs... and that pretty much makes us rationalists, whether or not we associate with a group.

Taboo "canonical", "prophet", and "adherent" and then try explaining your argument again.

Comment deleted

It seems that you are using a nonstandard or non-literal definition of the terms, since rationalists clearly do not have, for example, any figures who are literally seen as delivering messages from God, which is the essential characteristic of a religious prophet as conventionally defined. I can't follow your argument because I don't know what these nonstandard definitions you're using are.

Comment deleted

"The spirit of using the term is to define leaders who organically gain popularity, without term limits, are revered, and are exemplars of the group identity."

Since the rationalist "leaders" don't actually hold any official position, they can't be elected or have term limits, so that point is nonsensical. Take that away, and it seems your definition of "prophet" is just "a person whom a lot of people respect and pay attention to". Such figures exist in any community and referring to them as "prophets" or "hallmarks of religion" frankly comes off as a deliberate attempt to obfuscate the actual facts of the matter.

"Do you also think "canonical" and "adherent" are similarly confusing? They seem to me applicable in the standard sense."

"Canonical texts" usually implies a body of work that is venerated because of its connection to God or the gods. I'm pretty sure rationalists don't have any of these

"Adherent" in the standard sense simply means a supporter of something. So if I take it in the standard sense, your argument is that the rationalist community is like a religion because some people support it, and because there are some people in it who are widely respected. To call this a wildly insufficient definition of religion would be an understatement.

In case you're unaware, 'taboo' is a reference to this https://www.lesswrong.com/tag/rationalist-taboo

"From an outside perspective, I think the rationality community has many of the hallmarks of a religion: canonical texts"

there's something that was written

"prophets (living and dead)"

by someone

"adherents"

that other people find useful

"in/out group identity"

and if they find someone else who found it useful, they view that as one commonality.

This strikes me as a "You know who else used forks? Hitler" sort of argument.

> throwing out systems of beliefs and their adherents and proponents must always be considered as an avenue to correct irrational beliefs and knowledge. To be right often, one must change one's mind often

You seem to be saying that while false beliefs can be integrated with each other and built into systems, somehow this is impossible for true beliefs?

Should we expect the opposite, since, even if we don't know which beliefs are true, we know that all true beliefs are necessarily consistent with each other?

So what you're saying is... Tsuyoku Naritai!

I've only occasionally come across anti-rational-community sentiment online, but the way I've always interpreted the criticism is something like: 'the rationality community is very powerful (because it contains a bunch of Silicon Valley people), and even if the rationality community is 90% as rational as they think they are, that 10% mismatch between perception and reality is enough to do a lot of damage'.

I liken it to criticisms of the church with respect to compassion etc: yes the implicit goal of the church is to be more compassionate than average, but if you're only 90% as compassionate as you actually think you are, and you wield enormous power, that leaves lots of room for damage.

I don't agree with these criticisms when directed at the rationality community but I understand the logic.

Imperfectly rational people can do damage, but the question is how much damage compared to some counterfactual. Compared to putting ideologues in charge? Compared to nobody being in charge?

Imperfect rationalists are the best thing we have.

Imperfect rationalists are also more likely than other groups to cede power if they discover something else is actually better. I think, were this not true, there might be an argument to be made against them.

Keep in mind that you just replied to an imperfect rationalist who (in the comment you replied to) made the argument that an imperfect rationalist is the best possible person to be in charge.

Hypothetically, if you were in a position where you had to make such a person cede power, how would you do it?

You seem to be saying there is some contradiction, but there is no contradiction between saying that the *group* of imperfect rationalists is the best we have, and saying that a particular imperfect rationalist may not be the best we have. The answer of how to get someone like Trevor Blackwell to cede power is quite obvious: convince them that there is an imperfect rationalist that is better. Or convince them that their previous belief that imperfect rationalists are the best thing we have is false.

I am pointing out what I think is a complication. I think the step encoded in "convince them" hides an extremely complex operation that might be entirely imaginary.

In order to convince someone of something, you need a basis. Some set of things you agree on (at minimum, you need to understand and agree on what constitutes evidence). How does this work when someone in power needs to be convinced by someone without power to cede some of it?

If by "convince", you mean the usual social/political/financial/military struggles that mark changes in leadership, sure, I can picture that. But if we're talking about what I _think_ we're talking about, some way of marshalling a logical argument so convincing that the target relents, then I'm not sure there is such a thing. Even if it were, depending on the amount of power on the line, we might be in the territory where epistemic learned helplessness becomes relevant. If the person in power is an "imperfect rationalist", the fact that they're _in_ power means they're very likely to have the prior "I am the best person to be in power right now", and because they're an imperfect rationalist, that might be incorrect. What kind of proof would they accept? Even from a fellow imperfect rationalist? How about someone who isn't?

"But if we're talking about what I _think_ we're talking about, some way of marshalling a logical argument so convincing that the target relents, then I'm not sure there is such a thing."

Washington stepped down. Even Pinochet stepped down. When it came down to explicitly calling for a coup to stay in power, Trump backed down.

"Even if it were, depending on the amount of power on the line, we might be in the territory where epistemic learned helplessness becomes relevant."

I'm not clear what you mean by "epistemic learned helplessness".

"If the person in power is an "imperfect rationalist", the fact that they're _in_ power means they're very likely to have the prior "I am the best person to be in power right now""

If they're a good rationalist, they would likely have the position "of the people in a position to take power when I did, I was the best person". Now that they have taken power, the set of people who can take power has enlarged (because they can give power to that person).

Yes, but imperfect rationalists who recognize and try to improve on their imperfections are much better than imperfect rationalists who react negatively to anyone trying to point out one of their imperfections and avoid learning from the criticism.

I don't understand what this is in reference to.

The topic of this post.

If you know that you're not a perfect rationalist and someone says to you 'hey I've noticed you have an ideology and I think it's causing some problems', then a smart thing to do is ask them what problems they see and consider whether they have a point or not.

If someone says they've noticed some problems with your ideology and you respond with 'how dare you accuse me of having an ideology, you're just trying to dismiss me and lump me in with Alex Jones and say all truth is relative and being rational is pointless, well I disagree with that so go away', then you may be missing out on some important feedback that might have helped you improve your rationality.

Of course, that depends on the person offering the criticism acting in good faith. What I'm responding to here is an anonymous, paraphrased quote taken from somewhere on reddit, so I'm interpreting it charitably in the absence of a reason not to. If there's a hidden context that the criticism is coming from someone with a history of bad-faith attacks and this is part of that history, then dismissing them makes much more sense.

Although, if you're used to getting a lot of bad-faith attacks from random people, and so when you see a new criticism from a new person you assume it is bad faith and dismiss it immediately, that also gets into problems...

The question here isn't good faith vs bad faith, it's about what the actual claim is. If the claim is "I think you might be biased here, have you considered such-and-such", that's one thing. If it's "everything is political, therefore the project of rationality is pointless", that's another thing. The latter is quite common, and that's what this post is responding to.

Hell, even if we restrict to things more like the first -- there's a substantial difference between "I think you might be biased here" and "Everything you say is just ideology!" The difference is not whether it's *in* good faith but whether it *assumes* good faith. The former is someone trying to have an actual argument; the latter is someone trying to ignore actual argument.

But really, as mentioned, this post is responding to claims that the project of rationality is pointless due to everything being political. Perhaps you've managed to avoid places where such claims are common, but in some places they are, and that's what this is responding to. It is certainly important to take account of legitimate criticism, but that's not what this post is talking about.

In spirit the post might be responding to those types of claims, but in text it is responding solely to the sentence 'There is no such thing as "rationality" that is free from ideology,' and nothing else.

And I just don't interpret that single sentence in the way you're laying out here, especially in the context of the larger post it appears in, which compliments the rationalist community in several ways.

I read that sentence as just saying 'hey, it's impossible to not have an ideology, and the fact that you don't think you have one is causing you to fuck things up sometimes.' Especially in light of the speaker.

The pertinent comparison is "putting someone with relevant experience in charge". Note that experience or success in a challenging field imply a certain amount of raw smarts. The novel claim of rationalism is that if you have enough raw smarts, you can dispense with the experience.

What? No rationalist I know makes that claim.

Nah, they're more about dispensing with the credentialism and other established status-distributing structures if you have enough smarts. And this claim is not particularly novel; the stagnant old guard being outdone by nimble upstarts is as old as history. The claim that the "rationalists" are particularly suited to do this may be novel, but then again any ambitious movement/subculture thought that of itself, I'd imagine.

Amateurs thinking they know much better than the hidebound establishment while actually being clueless is also pretty old.

Sure. The relevant question then would be whether there's a reliable way for amateurs to overcome their cluelessness outside of the establishment. For what it's worth, I don't think the rationalist project has come anywhere close to demonstrating that they are better at this than anybody else.

I think this criticism might have had a point if rationalists were overconfident: pretty rational, but thinking they were more rational than they actually are. But given that rationalists even have exercises to overcome these biases, e.g. prediction exercises, I don't think that's true.
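
To make that concrete, here is a minimal sketch (Python, with hypothetical numbers) of one common kind of calibration scoring such prediction exercises rely on: state probabilities in advance, record outcomes, and compute a Brier score, which punishes overconfidence.

    # Hypothetical stated probabilities and outcomes (1 = the event happened).
    predictions = [0.9, 0.7, 0.6, 0.2]
    outcomes = [1, 1, 0, 0]

    # Brier score: mean squared error between probability and outcome.
    # 0.0 is perfect, always guessing 50% scores 0.25, lower is better.
    brier = sum((p - o) ** 2 for p, o in zip(predictions, outcomes)) / len(predictions)
    print(brier)  # 0.125 for these made-up numbers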

Ah, but most scaremongering about the Rationalists doesn't focus on the idea that they're bad at predictions, though — except in the field of A.I., where the inescapable gravity of "oh come on, evil A.I.s are a thing from sci-fi, you can't be serious" remains inescapable.

Mostly people are scared of Rationalists as having bad *values* — at holding objectionable non-empirically-falsifiable beliefs about e.g. what it is or isn't moral to fund, or promote, or even say in public — and that their dazzling but morally neutral skills at predicting market fluctuations are suckering the average joe into thinking that they also must know what they're talking about in moral matters.

And that's much less of a self-fixing problem, assuming you treat it as a problem. Rationalists are the kind of people to whom the idea of "self-introspection to formalise your innermost moral intuitions into consistent utility functions" makes sense. You can easily convince a good Rationalist that they're less competent at achieving their goals than they think they are; but I think it is *harder* to persuade a Rationalist that their ultimate aims are *wrong* compared to an average person. And to put it crudely, people aren't worried about powerful Rationalists being bad at predicting market crashes, they're worried about them being heartless eugenicists set on euthanising all puppies to increase productivity by 0.37%.

I like this puppy analogy.

The “you have to make mistakes in order to learn” thing is another divider. From what I can tell so far, rationalists do not worship the almighty mistake as the one true gate to knowledge. The landscape of imprecision, inaccuracy and wrongness is much more subtle. I am drawn to this community because my approach to the world is that I am constantly some amount of wrong, and learning to handle that and break it down and question it and minimize it sometimes yields something that is, well, less wrong (I have not yet read that blog but the title is great). The you-have-to-accept-your-mistakes “growth” demand operates differently for different philosophies. A mistake seems like a contained event to some people, and then they have to address “it.” Then they think an unwillingness to wallow in the experience of being “wrong” is somehow an assertion that one is never wrong. Or that wrongness is where personality and heart live. Which resonates for some - but it makes me think they project their ideas. That cannot be the only legitimate description of personality. If heart can only be revealed in obvious, contained mistakes, that is an impoverished sense of heart! The fear of rationality might not lie in what it is, but in some people’s need not to lose all the tropes they have attached to irrationality (yes, puppies, daydreaming, love, et cetera).

Those puppies are great to illustrate the point. Reality is much more disappointing. Try (obviously calmly and respectfully) arguing that price gouging can be good, or one of many other seemingly low-controversy views. That's enough to be seen as truly disgusting in many, especially left-leaning, communities.

"but I think it is *harder* to persuade a Rationalist that their ultimate aims are *wrong* compared to an average person."

Can you explain your reasoning? One data point against your position is that religion is much less common among Rationalists, and I think that it's clear that religious people are harder to persuade that their ultimate aims are wrong.

A Rationalist is not someone who has accepted an ideology because they have been brainwashed into it from infancy. They are someone who has been persuaded through rational argument to adopt particular views. So why would it be difficult to persuade them to adopt different ones?

If your values are such that you don't think that you can persuade a Rationalist that your values are good, what does that say about your values?

I suppose we have slightly different models of how morality works.

The way I see it, by the time they reach adulthood, people have moral intuitions hardwired into their brains, which we might term their "utility function" or "terminal values". I'm unsure of the extent to which they are inborn, or shaped in early childhood, but they can *functionally* be treated as innate and immutable; this makes them akin to one's sexuality or sense of gender. Also like those things, they differ from person to person, although most people hold values that yield sufficiently similar results in day-to-day life that the illusion of something like a "universal human morality" can be maintained.

The rationalist, or rationalist-adjacent person, will do some self-analysis and figure out precisely what moral intuitions they *actually* feel deep down, as opposed to what it is culturally acceptable to view as moral. This results in things like Effective Altruists who puzzle out their true feeling of "I want to stop as many people as possible from suffering/dying, even when it makes me do things society views as weird" from social norms like "you only have a duty to people you know personally", and then take rational action to further those aims.

A religious person believes that X is moral because Y holy text says so. If they can be persuaded that the holy text is probably woo, they may change their mind about the sanctity of those values. A Rationalist, meanwhile, holds moral values because after all due introspection they have formalised these principles from their own, individual moral feelings, irrespective of social norms; and they don't think that there is some objective morality that trumps their individual values.

You can in theory convince a Catholic to stop thinking premarital sex is wrong if you manage to convince them that God doesn't exist. But if a Rationalist thinks, for example, that animal suffering has no moral standing, you cannot *change their minds* about this; either they feel a moral twinge when an animal is hurting, or they don't, and you can no more change their mind about this than reason with the Paperclip Machine that it "oughtn't" to value paperclips.

You believe too strongly in the machine-like Catholic, who perfectly maps their morals onto the holy book. You can interpret the Bible in a million ways, to justify most moral positions; people have done so for a long time. What makes you think the (equally intelligent and introspective) Catholic's any different, practically, from the rationalist, here?

The rationalist introspects, plumbs their moral intuitions, and builds rationalizations for them post-hoc in the language of modernist progressive-speak.

The Catholic introspects, plumbs their moral intuitions, and builds rationalizations for them using a post-hoc interpretation of the Bible.

Or, for example: why not the rationalist who realizes that pretending to care about people half the world away was just something they did to show off to their bay area friends, and, on introspection, realizes that they really do care more, in a fundamental moral sense, about their family and close friends, and thus decides to stop giving to GiveWell?

(Unless it ends up being expedient for them- people across the world doing better and building real wealth faster is a boon to you and yours ceteris paribus, so it might be self/kin-serving to donate x% to GiveWell conditional on enough other people doing the same, to make sure you're not a sucker; alternatively, it might be worth doing just to look good, thereby attracting a bay area mate (although probably not, if the stories of the bay-area rationalist dating scene are anything to go by.))

"Signaling to their bay area friends" is a big feature of the Rationalist community that I feel they don't want to admit. EY said something like he "doesn't understand social status" - which I find it *incredibly* hard to believe.

More likely - Rationalists are a part of a group that has an entirely different set of status signals from the "mainstream" population, and they are confusing "not paying attention to mainstream status indicators and instead paying attention to the status signals of my weird bay area friends (that occasionally overlap with class)" with "we aren't worried about status".

I still feel there is a difference between the Catholic who believes prima facie in an objective morality ordained by God, and the atheistic Rationalist for whom whatever moral intuitions can be teased out of the wet tissues in our skulls really is all there is. Even if (as you say) most will come to comfortable agreements between the basics of religious morality and their own feelings, a Catholic can theoretically come to believe that their own moral intuitions are "wrong"; if the Pope outright said they were, for example, then they would view the situation as "I am a flawed human and my moral intuitions are flawed; I should follow what the Pope says, not my own gut feeling".

Also:

“why not the rationalist who realizes that pretending to care about people half the world away was just something they did to show off to their bay area friends, and, on introspection, realizes that they really do care more, in a fundamental moral sense, about their family and close friends, and thus decides to stop giving to GiveWell?”

Well, why not indeed? If there are indeed people who truly feel nothing for the deaths of strangers, then the Rationalists are one of the only movements where I could expect them to possibly be open about that fact. Even then the social pressure remains enormous (as it would be for ordinary people: for a non-Rationalist it is socially odd to give to GiveWell, but it would *also* be a bad idea to go about trumpeting that you *truly* don't care about the suffering of sick African children at all), but that is a function of the Rationalist community not being perfectly rational.

(And FWIW I have enough faith in human nature that I think people who really don't care about strangers are a small minority. Probably most people care *more* about their loved ones, but have "innocent human beings I don't know" not far behind, and "animals that seem like they have qualia" a little further behind.)

I think everyone in this conversation is making a mistake by putting 'rational' and 'has an ideology' at different points on the same scale.

That's not really how this works, at least form the point of view of the people lobbying these criticisms. Ideology and rationality are separate measures, and they can both influence your behavior in their own ways. Everyone has an ideology, just like everyone has a utility function; that's not a problem in and of itself.

But if you blind yourself to your own ideology or convince yourself you don't have one, then you're going to have a poorly developed ideology that might cause a lot of problems. You can interrogate and improve your ideology in the same way we often try to interrogate and improve our rationality.

I agree with this! It seems the issue they have is rather that the concept of "pure rationality" as an achievable state of mind seems unattainable and something like "90% rational" makes no sense. I think to the critics it sounds a bit like religious zealots, who try to get ever closer to God/Jesus/Mohammed, without realizing that this is fundamentally impossible.

You can easily get closer to Jesus. Literally try to recreate the arc of Jesus' life in your own. It was a popular thing to do in the Middle Ages. Start out small, go on a spirit quest, get martyred. 'In spirit' optional.

You could get closer to Mohammed in the same sense. Eliezer Yudkowsky is doing this in spirit.

Troubled childhood, check.

A great worry about the end of the world, and a sense of duty to spread the word about it so people can save their mortal souls, thinking this the primary duty of his work? Check.

Good news for those who accept the word of Allah and turn away from evil? Yes; the promise of the children's children in HPMOR.

He's got his Bay Area polyamory harem; multiple women/'wives', some ethically dubious according to 21st century social mores, check.

We've reached the present. Here's what's to come:

Emigration from his city of residence, along with followers, to act as an arbitrator in Medina? The exodus of the rationalists from the Bay Area Metropolis to Peter Thiel's artificial island of floating venture capital, gambling, and animal tests.

The subsequent confiscation of the property of converts left behind in Mecca? Obviously a reference to rising US taxes and future confiscation of land and goods from émigrés to BitStates.

The subsequent wars of conversion by the sword and Muhadkowsky's ascendance to heaven are also obvious to predict.

This should be its own post on LW

Hehe, true. My point was more that while some religious people do exactly that, they are often regarded as fatuous zealots by their co-religionists, who find it presumptuous to imagine one can be like Jesus/Mohammed/... This was the analogy I was trying to make. There is nothing wrong with embracing the idea of rationality (like there is nothing wrong with embracing Jesus' teaching etc.) but assuming you can somehow become like them/him might be seen as fundamentally flawed.

I fully agree with the tenor of Scott's post though, I should say!

Rationalism *is* an ideology. "Ideology" simply refers to how one thinks about the world. It's acquired a negative connotation of close-mindedness, bigotry, bad faith, motivated reasoning, etc., but that's not part of the denotative definition.

Are denotative definitions important? If the word connotes all those bad things, and we seek to avoid all those bad things, then shouldn't we seek to avoid the word too?

Well, this seems to be a significant problem?

"4. Eliezer Yudkowsky is their king and he's kind of an asshole"

It would be if he actually was, although I can't help thinking that this complaint is ten years out of date. Yudkowsky blogs very rarely these days and the idea that he's wrong about a lot of stuff is, I think, very widespread among modern Rationalists.

Of course, that's not very useful because people are already moving on to the much more topical “*Scott Alexander* is their king and he's eeevil”.

Scott is the True Caliph and therefore, by definition, cannot be evil 😀

Well, he seems to be tweeting a lot... and my priors about this are:

1.) Twitter turns people into idiots and/or assholes.

2.) Twitter attracts idiots and/or assholes.

3.) Twitter makes it easy to take people out of context and paint them as idiots and/or assholes.

While there are monarchists who post in Rationalist spaces, Rationalism itself is rather anti-monarchy. So this criticism relies on a fundamental failure to model Rationalism, and simply shows that the critic is too caught up in their own ideology to evaluate Rationalism outside of it. It's like the Christians who treat Jesus as God, and then criticize atheists based on what Dawkins or Darwin or whoever did, as if atheists likewise worship those people.

Without necessarily disagreeing with the general point, I would be way more comfortable with that last line if those exercises were more common in actual practice. There is no law of thought that one gets the benefit merely by being aware of their existence, or even that the effect of mere awareness is neutral.

I certainly don't see any evidence in the real world that the rationality community is very powerful. Not even within the confines of Silicon Valley.

I'm thinking, though, that the best way to put myself in the shoes of the "anti-rational-community-community" is to imagine myself in an alternative universe where there's a bunch of people who go around calling themselves the "Rationality Community" but who also constantly spout a bunch of stuff that I find to be both irrational and awful.

I've lived and worked in Silicon Valley for the last 23 years, and excuse me while I wax cynical about what passes for a rationalist in Silicon Valley! Scratch a Valley rationalist, and you'll likely find layers of unsubstantiated critical assumptions, beliefs, and misinformation, all wrapped around an iron spike of logical empiricism that was pounded into their brains during college and grad school. Granted, many don't define themselves as rationalist per se, but I think that's only because most of them define themselves by their problem-solving ability (which is usually excellent) or their IQs (which tend to the right side of the bell curve). If you were to offer them a choice between the label of rational and irrational, they'd all latch on to the rational label and cling to it for dear life.

Of course, we are bundles of beliefs, but a STEM background frequently focuses one's analytical skills on the practical — rather than on asking abstract questions like, "how do I separate my beliefs from my knowledge?" I would guess that a large portion of our beliefs (i.e. preconceived notions) are unfalsifiable — meaning no experiment can be designed that could disprove them. Plus we have other beliefs which, if we took the time, we could run experiments to falsify, but we're too lazy to design or execute those experiments. For instance, unless your parents disclosed that you were adopted, you cannot really know if your parents are your biological parents. Most of us believe that our parents are our biological parents. The question would have largely been unfalsifiable until the advent of blood typing and then genetic testing. Nowadays, biological parenthood is easily falsifiable. So it has transitioned from the unfalsifiable category of belief into a falsifiable category of belief. But how many of us demand swabs from our mothers and fathers to compare to ourselves? Some of us have, and some of us were surprised by the results. But I digress.

So, I would humbly suggest that anyone who claims to be a rationalist must be willing to question their preconceived notions — moreover, they must be comfortable with accepting provisional answers, but not cling to those answers if evidence is raised against them. Also, I would say that rationalism requires a certain amount of imagination that allows the rationalist to think outside the rut of consensus.

(Full disclosure: I am unapologetically a Popperian. And I'll admit it is a methodological belief that I use as my preferred tool for separating science from opinion, facts from belief. I've tested this methodological belief to death, but I haven't been able to disprove it yet.)

"If you were offer them a choice between the label of rational and irrational"

Is there any subculture that would unironically choose "irrational"?

Dadaism?

Yes, I suppose irony is not entirely synonymous with protest, but they did clearly have plenty of both.

If they used the same jargon, artists totally would. If they included humility as a virtue, rationalists would.

Successful artists usually undergo years of studies and training, which I would imagine are mostly based on analysis of what makes good/impactful art. As for the virtue of humility, IMO one of Yudkowsky's better articles is on this topic:

https://www.lesswrong.com/posts/GrDqnMjhqoxiqpQPw/the-proper-use-of-humility

How much this applies to actual rationalists, including himself, is of course debatable.

Mystics aren't interested in "rational" explanations of the world — although each of the mystical traditions that I'm aware of has a praxis which, if followed, will yield consistent results for its practitioners (or so it's claimed).

I'd say that they are not so much uninterested as that they consider "true knowledge" inaccessible to the intellect, which would indeed count as irrational. It's curious, though, how eagerly rationalism has been adopting those praxes when stripped of their supernatural ornaments. Apparently enlightenment and such is not only compatible with intellect, it even increases its power!

Hence, post-rationalism..?

There's something to what you say. The early scientific inquiries of the Enlightenment were an offshoot of an esoteric revival in the Western world from the 12th through 15th Centuries. Obviously, the practice of alchemy was pre-scientific, but alchemy transformed itself into chemistry when placed into a framework of hypothesis and experimental verification. Alchemists were big on theory, and they were big on experiments, but, as far as I can tell, they never figured out how to narrow down theory into hypotheses that could be tested. And neo-Pythagorean mysticism infused 15th and 16th Century astronomy — Kepler was a mystic (and his mother was put on trial for witchcraft, but found innocent).

Full disclosure, I wouldn't consider myself to be a rationalist. I'm a mystic. Although I have a firm grounding in the sciences (advanced degree in BioBehavioral sciences, with lots of genetics, evolutionary theory, a heavy grounding in statistics, and a lot of work with the statistical analysis of large data sets), I actually come to rationalism from mysticism.

Having followed the praxes of several esoteric teachings (which all produced results for me), using a reductionist approach, I'm trying to synthesize what I've learned. The trouble with the "woo-woo" stuff is it isn't reproducible in a controlled setting where a neurologist can attach electrodes to your head or run you through a PET scan to see what your brain is doing. But if you follow the script of the traditions that I worked in — and some of the practices took years of self-discipline, following rituals that may seem to make no sense — I was able to get the results that were described by my teachers and that are described in the mystical literature. Being raised by scientifically literate atheists, it's natural for me to come back to my roots and use a rationalist paradigm for categorizing and analyzing what I experienced — not in the sense of "these are delusional experiences that will display certain glucose uptake patterns on a PET scan," but rather, "Can I develop a unified theory of what people have to go through to have a mystical experience?"

Or maybe I'm deluding myself. Who knows. But it's been interesting.

Also, The Theatre of the Absurd ("Waiting for Godot").

"the rationality community is very powerful (because it contains a bunch of silicon valley people)"

Not a rhetorical question: what is this claim actually based on, though? I've heard that some rationalist blogs are popular among Silicon Valley people, but I don't know of any Silicon Valley people who actually consider themselves part of the rationalist community.

Hard to say what the exact influence is, but it's not negligible, considering the pushback that the New York Times got due to the whole "doxxing article" issue?

Have they really got pushback, beyond private complaints and some angry emails? Did any prominent public figure explicitly denounce them?

"Alexander’s appeal elicited an instant reaction from members of the local intelligentsia in Silicon Valley and its satellite principalities. Within a few days, a petition collected more than six thousand signatories, including the cognitive psychologist Steven Pinker, the economist Tyler Cowen, the social psychologist Jonathan Haidt, the cryptocurrency oracle Vitalik Buterin, the quantum physicist David Deutsch, the philosopher Peter Singer, and the OpenAI C.E.O. Sam Altman."

https://www.newyorker.com/culture/annals-of-inquiry/slate-star-codex-and-silicon-valleys-war-against-the-media

"Much of the support Alexander received was motivated simply by a love for his writing. The blogger Scott Aaronson, a professor of computer science at the University of Texas at Austin, wrote, “In my view, for SSC to be permanently deleted would be an intellectual loss on the scale of, let’s say, John Stuart Mill or Mark Twain burning their collected works.” Other responses seemed unwarranted by the matter at hand. Alexander had not named the reporter in question, but the former venture capitalist and cryptocurrency enthusiast Balaji Srinivasan, who has a quarrelsome Twitter personality, tweeted—some three hours after the post appeared, at 2:33 A.M. in San Francisco—that this example of “journalism as the non-consensual invasion of privacy for profit” was courtesy of Cade Metz, a technology writer ordinarily given over to enthusiastic stories on the subject of artificial intelligence. Alexander’s plea for civility went unheeded, and Metz and his editor were flooded with angry messages. In another tweet, Srinivasan turned to address Silicon Valley investors, entrepreneurs, and C.E.O.s: “The New York Times tried to doxx Scott Alexander for clicks. Just unsubscribing won’t change much. They can afford it. What will is freezing them out. By RTing #ghostnyt you commit to not talking to NYT reporters or giving them quotes. Go direct if you have something to say.”

Other prominent figures in Silicon Valley, including Paul Graham, the co-founder of the foremost startup incubator, Y Combinator, followed suit. Graham did not expect, as many seemed to, that the article would prove to be a “hit piece,” he wrote. “It’s revealing that so many worry it will be, though. Few would have 10 years ago. But it’s a more dangerous time for ideas now than 10 years ago, and the NYT is also less to be trusted.” This atmosphere of danger and mistrust gave rise to a spate of conspiracy theories: Alexander was being “doxxed” or “cancelled” because of his support for a Michigan State professor accused of racism, or because he’d recently written a post about his dislike for paywalls, or because the Times was simply afraid of the independent power of the proudly heterodox Slate Star Codex cohort."

Yeah, I read that, I meant after the actual "doxxing" took place. It would seem that doing something should be worse than threatening to do it, but it looks like nobody really took notice. Or has the "war" between the media and SV become something to be taken for granted in the meantime?

Well, as Scott himself admitted, he basically Streisand-effected himself, so to properly answer your second question you would have to see what happened in a hypothetical parallel universe where he kept blogging and didn't radically change his life.

Elon Musk is definitely at least Rationalist-adjacent. Y Combinator probably fits in that as well. And if you're in the Rationalist community and don't know any Silicon Valley people who consider themselves part of the Rationalist community, then my conclusion is that you don't know many Silicon Valley people. David Friedman is at least geographically Silicon Valley, and it's quite common for people in the SF Bay Area Rationalist community to work for tech companies. There have been Less Wrong meetups on the Google campus.

I think the problem is that people are disposed to SEVERELY overestimate the level of certainty in scientific conclusions, and indeed fact claims about the world in general, and SEVERELY underestimate the amount of dependence on assumptions, interpretations, and theories involved in constructing such claims. If you look carefully at how science is done (starting with e.g. the replication issues in everything from biology to psychology), how academic and medical peer review and publishing works, and how many assumptions are needed to construct narrative claims in e.g. journalism, it becomes reasonable that a very large number of "facts" people pound the table about and scream are true are in fact highly contestable and provisional. Yes, there is good and bad reasoning but many fewer controversies can be definitively established by good reasoning than is often claimed.

Can any controversies be definitely resolved by anything other than good reasoning?

Yes, but probably not in ways any of us would find palatable

I guess I would need an example.

First, I would claim that the resolution of a controversy doesn’t necessarily require arriving at the truth. This is a mix between “if it quacks like a duck” and the fact that science has been wrong a lot historically, while still reaching consensus (I’m speaking historical science, not as a critique of modern science).

In other words, I think using “Everyone basically agrees on one point of view” is a pretty good definition for resolving a controversy.

There are two clear ways to do this. Firstly, you use logic and evidence to convince people. Secondly, you torture/threaten/kill everyone who doesn't agree with you until there's no one left to disagree.

This resolves the controversy, and historically has had varying degrees of efficacy, but not in what most people would consider "the right way".

Because I forgot to include this I’m gonna reply to myself: waiting for people to die off and replacing them is a longstanding way that science progresses (from my understanding of it). Obviously, sometimes you convince people they’re wrong, but sometimes you can’t and you just wait for the old guard to die out. This is a more common way of resolving the controversy without winning the debate

Well, in a certain sense it requires winning the debate in front of the audience of younger people. You didn't convince the people who held the old positions to change their minds, but you successfully convinced everyone else to adopt your view instead of that of the old guard.

More than a few historical controversies about ethnic groups have been definitely resolved by getting rid of the ethnic group.

I think the beliefs and claims of any known perpetrators of genocide are (in 2021) either utterly rejected or highly controversial.

And yet the questions are moot because we can hardly change the solution that was implemented. I'd call that a resolution.

Well, overwhelming new empirical evidence, which I think is synergistic with but distinct from pure reasoning.

I'm intrigued, but I don't know what you mean by "pure reason" so I don't know how to process what you're saying.

I just mean that data and reasoning about data are separate constructs.

OK, but how can data settle a controversy without reasoning about data?

There's reasoning as in cutting edge statistical techniques to tease subtle conclusions out of noisy data and there's reasoning as in "hot damn check out this chart" (the canonical version of the latter being vaccine efficacy data).
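
For what it's worth, the "hot damn check out this chart" kind of reasoning can be made explicit in a couple of lines. A toy sketch (Python, with made-up round numbers rather than any real trial's data): efficacy is one minus the ratio of attack rates between the vaccinated and placebo arms.

    # Hypothetical trial arms; illustrative numbers only, not real data.
    cases_vaccinated, n_vaccinated = 10, 20000
    cases_placebo, n_placebo = 100, 20000

    # Efficacy = 1 - (attack rate, vaccinated) / (attack rate, placebo).
    efficacy = 1 - (cases_vaccinated / n_vaccinated) / (cases_placebo / n_placebo)
    print(f"{efficacy:.0%}")  # 90% for these made-up numbers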

Nonsense. Name 10 facts about the natural world that are generally held to be true that are, nevertheless, "highly contestable and provisional." Most people who say this can't even name one without wandering off into social "science" or pop science cultism or reports about science in the popular press, which are normally so distorted as to be almost unrecognizable.

The truth is exactly the contrary of your opening sentence. Almost everything well understood in science to be true* has a level of empirical support and grounding in facts that dwarfs what the canonical hominid requires for "truth." Most people severely *underestimate* how certain almost all science is, and even in the best cases seize upon the tiny little bits here and there at the frontier where the debate is ongoing and mistake "ongoing debate" for evidence that nobody knows anything, or that accepted "truth" is being violently overturned -- which is silly when it's not dishonest.

---------

* Using the shorthand "true" here, which is understood by empirical science to be actually equivalent to "not yet proven false despite a large battery of strong tests thrown at it."

> Name 10 facts about the natural world that are generally held to be true that are, nevertheless, "highly contestable and provisional."

That's tricky, ten is a lot. I could tell you one or two, though, in the very specific sub-sub-sub-field in which I happen to be an expert. I assume most other sub-sub-sub-fields have similar "ragged edges" known only to people at the coalface, where the truth currently understood by experts is a lot more complicated than the truth that has filtered back to the undergrad-level textbooks.

It's not reasonable to expect one person to be able to reel off a list of ten such things, though, since nobody is an expert in ten different sub-fields.

Well, if you can't reel off 10 obvious examples without googling then I submit you have no business making the categorical statement in the first place.

Are you suggesting that Melvin is a sock puppet of mixtyplyk?

Not in the slightest. I was using "you" in the canonical English usage to substitute for "one." In other languages we have distinct words, e.g. in German one would say "Sie" (or "du") if you meant the person to whom you were speaking, personally, and "man" if you mean a generic person. Unfortunately in English we don't have this distinction without some awkward syntax, so we use "you" and expect the reader to figure it out from context. Presumably I should have given better context.

I consider the whole focus on "facts" or "truth" to be suspect and smelling of ideological absolutism: you want instead to be thinking using models that are more or less applicable to the situation. For instance, the "flat Earth" model is often good enough, so it is widely used. (You know, the one with the force of gravity being a fixed length and direction vector pointing "down"?)

Mandatory Asimov:

https://chem.tufts.edu/answersinscience/relativityofwrong.htm

Yes, I'm ideologically absolute when it comes to empirical fact. What is, is. 2+2=4 and the context is completely irrelevant, and it's never 5 or 3 if that's more convenient, et cetera. Sorry about that. You want moral or any other kind of relativism, try the next window over, where they wallow in superstition and technological incompetence (not to mention famine and disease) for lack of an appreciation of the rock-hard durability of...well, reality.

But I can immediately think of an arithmetical example where 2+2 does not equal 4. Have you ever bothered to wonder if there would be any situation where 2+2 doesn't equal 4? Or have you just accepted this idea without questioning it? Relativism is in the eye of the beholder, and when it comes to counting systems, it's in the modulus you use for your calculations.
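
For concreteness, a one-line illustration (Python, chosen arbitrarily) of the modulus point:

    # In arithmetic modulo 3, 2 + 2 wraps around to 1 rather than 4.
    print((2 + 2) % 3)  # prints 1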

No you can't. You can think of a consistent redefinition of any of those three symbols, outside of their canonical default meaning, which leads to different statements, but this only proves you'd make a good lawyer, and snazzy lawyering isn't very germane.

No, I have not wondered whether 2+2 might not equal 4, because mathematics is a man-made formal system, not an empirical science. There's nothing to be discovered; 2+2=4 is merely a conclusion that results inevitably and logically from certain axioms. I understand the axioms, I can follow the logic, there is nothing more on which I need waste my time. (I grant that if I were a mathematician, I might indeed spend time wondering about alternate number systems, and more power to such people -- where would relativity be without Riemann? But I'm just not one of them. Formal systems don't interest me much beyond their utility as tools.)

When it comes to nature, on the other hand, where we must *discover* facts, and cannot deduce them from our own thoughts, then, yes, in the area of natural philosophy I *always* question what I have been taught, what everyone else thinks is obvious, and what I myself hypothesize.

Expand full comment

I don't understand. You're talking about empirical facts, immediately after that give the example of 2 + 2 = 4, but then concede that mathematics is not an empirical science, invalidating your own example?

(Also there's a pretty big debate about the metaphysics of mathematics, one that might never be solved, so just stating that "it's a man-made formal system" is somewhat of an oversimplification.)

Expand full comment

"relative" doesn't mean "equally valid"

Expand full comment

The claim was "a very large number of "facts" people pound the table about and scream are true are in fact highly contestable and provisional", not "a very large number of "facts" generally held to be true are in fact highly contestable and provisional".

Expand full comment

Oh well, if the point is that people who pound the table and scream are often, or even usually, mistaken as to what is fact and what is not, I would fully agree. My experience is that the passion of the argument is inversely proportional to the amount of factual content.

But the intro said that people generally overestimate the reliability of what science has established, so I interpreted the statement in that context, i.e. not that passionate arguers were frequently poorly-informed, but that even well-informed arguers were deluded about the reliability of the facts they might use from science. It's this to which I take objection. If someone is *mis-using* science and gets stuff wrong...well, that happens so often as to not, sadly, be remarkable. Obligatory SMBC:

https://www.smbc-comics.com/comic/2009-08-30

Expand full comment

My OP said that people greatly underestimate the difficulty of establishing a scientific truth claim and the limited and provisional nature of such claims. Thus, phenomena like believing a peer reviewed article establishes truth. I also explicitly applied this to narrative (e.g journalistic) as well as scientific claims. This is a rather obvious point and it is only your misinterpretation of it as some kind of nihilistic claim that truth does not exist that would make it an issue.

As I pointed out below, in this very thread you yourself pounded your fist on the table and screamed that a very contestable claim (that the Covid vaccine was always lower risk than going unvaccinated) was true, calling people who called it into question idiots and innumerate, so perhaps you should consider the point.

Expand full comment

Ha ha no you didn't. English is my first language, and I understand exactly what you said. If you want to argue you *meant* to say something else -- what you have in the first sentence here -- then that's fine, although parenthetically you should then work on expressing yourself more clearly.

But I have no objection to your first statement. Because it only supports mine. *Because* it's so difficult to establish something scientists are inclined to think is probably true, that's *why* the level of trust you can put in it is much higher than the level of trust you can put in your Tinder date's profession of love or what your Senator said to get elected.

Only a nitwit thinks peer review establishes truth. I don't know how that idiot canard got started, but it should be laughed at every time someone mentions it (including any silly scientists). The purpose of peer review is *only* to ensure that procedures are described sufficiently completely that someone else can duplicate the experiment. It's *independent duplication by other workers* that establishes truth (to the extent truth is established at all). Peer review is mere scholasticism, paperwork: checking internal logic and that all the data tables are labeled correctly. Such a procedure can *never* test assertions, any more than one can deduce whether the Earth goes around the Sun or vice versa by a close study of the Bible. The only thing that checks one experiment (or theory) is another. Peer review is, at best, the first step in verification.

I didn't pound the table on the issue of whether a COVID vaccine was less risky than COVID, you are mistaking dismissive contempt for anger.

I admit that "someone is an idiot" is the least likely explanation for why he would call that claim contestable. It's much more likely that he's innumerate or a kook. I mean, let's take a very simple argument, which even my 6th grader could grasp: there have been to date approximately 150 million Americans who have received a vaccine. There have been 32 million who have been diagnosed with COVID. 570,000 people have died of COVID. Since the ratio of vaccinations to diagnoses is 4.7 to 1, if the vaccine were as risky as the disease, the deaths should follow the same ratio, i.e. there should have been 2.7 million deaths from the vaccine to date. Even if you attribute *every death* following a COVID vaccine to the vaccine itself -- which would be a wild overestimate -- there have only been 4,000 such cases.
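For anyone who wants the proportion spelled out, here is the same back-of-the-envelope calculation as a sketch (the inputs are the figures claimed above, circa spring 2021, not audited statistics):

# Figures as quoted in the comment above (assumptions, not verified data).
vaccinated = 150_000_000    # Americans who have received a vaccine
diagnosed = 32_000_000      # Americans diagnosed with COVID
covid_deaths = 570_000      # deaths from COVID

ratio = vaccinated / diagnosed      # about 4.7 vaccinations per diagnosis
expected = covid_deaths * ratio     # deaths expected if the vaccine were equally risky
print(f"ratio {ratio:.1f} : 1 -> {expected:,.0f} expected vaccine deaths")  # ~2.7 million
# versus ~4,000 deaths reported after vaccination -- itself an overcount
# of deaths actually caused by the vaccine.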

There are plenty of people -- lawyers, say -- who might find it hard to follow the simple proportion here. Those would be innumerate. Then there are others who will immediately suggest all the data is bogus, because of some gigantic conspiracy. These are the kooks. Both of them are much more common than someone who can't even grasp the concepts (the idiots).

Expand full comment

Dude my OP was quite clear -- reread it. The entire thing, not whatever triggered you to be irrational about it. The fact that you are now forced to try to claim that it didn't say what it said is a confession of sorts. The first sentence of this post that you now claim you agree with is literally just a paraphrase of the first sentence of the OP.

Your claims about Covid vaccine risks and benefits are themselves a demonstration of my point, as they are shot through with uncertainties and inaccuracies, yet you clearly want to claim them as some kind of hardcore scientific truth. This is exactly the kind of thing I was talking about. For example -- you claim 32 million "diagnosed" with Covid. Perhaps so, but there is huge uncertainty about the number of Americans actually infected with Covid, which is the proper denominator for whatever back-of-the-envelope calculation you are trying to do, and is certainly much greater than 32 million. Likewise, the VAERS death reports that you rely on for your 4,000 vaccine deaths figure are a highly uncertain convenience sample, and in any case would need to be adjusted for age group, which was my point. There have been fewer than 9,000 Covid-involved deaths TOTAL among people under 40! In another post somewhere you talk about vaccinations for kids 12-15, and there too you manage multiple inaccuracies and instances of missing the point in just one paragraph.

My post was about just how uncertain and difficult it is to use empirical and scientific data and evidence to get to conclusions on complex social questions that require balancing costs and benefits. You might want to engage in some thought about that.

Expand full comment

Oh, I'll try a few. Though the 'highly' part I don't like. Can we say just contestable? Then: String Theory (1a: supersymmetry), Dark Matter. I'm suspicious of some of the claims of Climate Science, but I've got nothing in particular to point to. "Covid vaccines are 100% safe." It gets easier to point to things in the past. The central dogma of molecular biology, that information flows only from DNA outwards (no epigenetics). Some nutritional stuff: butter and bacon fat are back on the menu. I think given how science is funded these days, there's an incentive to 'go with the flow', assume the dogma of the field, and look to make incremental improvements to it.

Expand full comment

Not one of those is considered by the professional community to be merely factual. No one in high-energy physics thinks string theory is anything more than an elegant and interesting guess. "Dark matter" is merely a label for an empirical phenomenon which everyone agrees we don't understand -- the fact that Newtonian gravity + the matter we see does not explain galactic dynamics, for example, is incontrovertibly true and only a nitwit would contest *that*, but *why* it doesn't explain it is well understood to be unsolved.

Climate scientists are suspicious of the claims of "climate science" by which I would guess you mean the hideously oversimplified and cartoonish version of it that seeps through to the popular press (and which, unfortunately, gets reinforced by certain showboating morons who alas sport a PhD behind their name).

I can't think of anyone with a functioning brain and greater than an 8th grade education in biology who thinks COVID (or any) vaccine is "100% safe." You could not give people an injection of saline and say it is "100% safe." It's a silly standard. Are they a hell of a lot safer than actually getting COVID? Well, yeah. *That* is known empirically, and if you doubt *that* then you're an idiot, or innumerate and uninformed, or maybe all three.

The "central dogma" dates from circa 1970. Biology has moved on considerably from then, and I'm not sure why that should be a surprise. This is like complaining that circa 1450 astronomy thought the Earth was the center of the Solary System.

You're generally mistaken in your assumptions about the influence of funding on scientific inquiry. It *always* pays better to make controversial claims, to overturn existing thought, to claim you have a breakthrough, et cetera. Every funding agency would *love* to say they funded a breakthrough project, or a project that upended the existing understanding -- as I said elsewhere, the bottom line for a funding agency is reputational. How well will Congress think we've spent the money, and how proud can Senator X be that the University of Your State was the location for the bold new science? That's what sits in the back of the mind for the career bureaucrats who run these things.

What you may be saying is that gee, the old bulls who have been around a long time have a disproportionate influence over funding decisions, and if you propose something they all think is nuts, or not important, they can shut you down. All true. But alas, there are far more applicants than $$$ and there has to be *some* way to sort that out, short of gladiatorial combat, and nobody has thought of a better way to go than asking the people who have been around a while and have a track record.

It would certainly be nice to have more people pursuing what they think might be right, even though everyone else thinks they're nuts, because this is fertile ground for real new things. But you can't get there by changing your funding priorities, because these things *rightfully* end up lower in priority. What you need to do is make it possible for more people to pursue these things *without* needing to rank high in a funding priority cage match. So you might need (1) people who can think in peace without needing to be funded at all, e.g. Einstein coming up with crazy (by 19th century standards) physics while employed at the patent office (hence not needing to persuade a university he was doing something respectable), or (2) people who can acquire patrons from similarly crazy people, e.g. something like a VC model where Silicon Valley $zillionaire Mr. Foo can fund the off-the-wall ideas of Dr. Bar because he believes in it, to hell with these lies about Conservation of Energy et cetera.

That generally calls for a change in the way universities hire and retain research faculty, as well as a less-short-sighted national tax policy on industrial research. I'm not going to hold my breath for either.

Expand full comment

I don’t think it’s at all clear that the risk of getting the Covid vaccine is less than the risk of getting Covid for a young (say teens or 20s) healthy person, especially a woman. (Although I wouldn’t call that a failure of science)

Expand full comment

Well OK then. If you have any actual data to back that gut feeling up, you should definitely write to the FDA, as they're about to approve the Pfizer vax for age 12-15.

Now *their* basis for approving it for adolescents is contained in Pfizer's Phase III trial data. Pfizer enrolled 2260 kids aged 12-15, almost exactly half and half in control and experimental arms. In the placebo arm there were 18 cases of COVID; there were none in the vax arm. There were no recorded deaths, no recorded severe side effects.

From which a rough estimate of risk is that your odds of something bad happening from the vax are lower than 1 in 1100. That upper limit is certainly much higher than your odds of a bad outcome from COVID (in that age group), so it may turn out you are right. But it would be a strange outcome, because the risks of the vaccine have been *very* well tested in older people -- there have been maybe 100-200 million doses of that vaccine given out, and *maybe* of order 10 deaths that could maybe sort of be reasonably traced to it. It would be very strange to have healthy adolescents be *more* at risk for bad side effects than, say, 75 year olds. Normally the kids are more sturdy. But it's possible.
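As a sketch of where that bound comes from (my own arithmetic; the trial numbers are as quoted above, and the "rule of three" is a standard refinement the comment doesn't itself invoke):

# ~1130 adolescents in the vaccine arm, zero severe adverse events observed.
n = 1130

# The "lower than 1 in 1100" figure above is just 1/n with zero events.
print(f"naive bound: ~1 in {n}")

# Rule of three: with 0 events in n subjects, the 95% upper confidence
# limit on the true event rate is approximately 3/n.
print(f"95% upper bound: {3 / n:.4f} (~1 in {n // 3})")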

Expand full comment

Thanks for the long reply. I agree with some of that, but it seems like you are concentrating too much on some wrong fact. That wasn't how I read the original post... which I took as: sometimes science gets stuck in 'groupthink'. String Theory is a good example of this. It's been the only path that has been looked at for ~20-40 years? And those with other ideas were brushed to the wayside. (I sometimes read the 'Not Even Wrong' blog by Peter Woit.) I'm not sure what to do about the 'old bulls' in the field directing the funding. I ordered "Scientific Freedom: The Elixir of Civilization" by D.W. Braben (from one of the ~100 not-picked book reviews). I haven't started reading it yet.

Expand full comment

LOL, if it's a fact about the natural world it's not false, right? That's a weird thing to ask. My point is that scientific institutions (which certainly include the social sciences, including psychology) are not generally very reliable in their productions. As I mentioned, the replication crisis is evidence of that. Or look at pharmaceuticals -- an approved new pharmaceutical is the product of layer on layer of scientific conclusions, from examination of biological mechanisms to randomized trials to government scientific approval, but it is quite common for approved pharmaceutical drugs not to be reliable or to have major unpredicted side effects. I'm not talking about measuring the speed of light here, I'm talking about the numerous actual products of our actual scientific institutions, which is really the issue here.

Expand full comment

That's so uninformed I can't be bothered to critique it. Sorry.

Expand full comment

In other words: I’m clearly correct and you have no good answer. Thanks for playing!

You should read some philosophy of science sometime.

Expand full comment

What for? I do science for a living, and quite successfully, for 30+ years. Normally philosophers come to me, studying what I do to learn how science actually works. I've yet to come across a situation where a philosopher can teach me my own job, and it would be strange indeed if that ever happened.

Expand full comment

I don’t think SA would dispute any of that 👆

Expand full comment

It seems like there are some parallels to the argument by some theologians that atheism is itself a theology, or faith-based, or such.

Expand full comment

I really don't know how they can make that argument in good faith. I was a Christian a long time ago and my experience of becoming an atheist was a pure loss of faith -- a reduction of certainty. I don't think I replaced one faith with another (a redistribution of certainty).

Expand full comment

The argument is usually resolved by choosing between agnosticism ("I have no proof that there is a God") and atheism ("I have proof there is NOT a God"). But I believe the historical meaning of those words ("agnosticism" and "atheism") is so muddled it's hard to know exactly what someone means by them without asking.

Expand full comment

Even if we were to claim that atheism refers to a positive claim (which it does not always do) to parallel how theism makes a positive claim, it should be "I believe that there is a God", not "I have proof there is a God". While there are some theists who claim to have proof that there is a God, it is not a central enough attribute to justify treating "atheism" as referring to a belief that the non-existence of God is proven. And then of course there's the fact that the meaning of "God" is so muddled that making statements about its existence is rather meaningless.

Expand full comment

Nonsense. "I believe" is ipso facto a statement of faith. If you're going to argue with people about the existence of God, you don't have an argument at all if the only thing either person has to say begins with "I believe."

And in fact, it never does. Arguments exist because both parties think they have *evidence* -- that their position is a rational *conclusion* not a mere statement of blind faith.

Ergo debates between theists and atheists consist of one side saying "Look at this evidence for the existence of God, this is convincing to me, why not you?" and the other side saying "Look at this evidence for the non-existence of God, this is convincing to me, why not you?" And the agnostic sits between the two and says "Well, having looked at both your evidence sets, neither of them is convincing to me, so my conclusion is -- I can't draw a conclusion."

I agree the definition of "God" is quite frequently muddled by sophomoric arguers, but this says approximately bupkis about whether the concept *can* be defined very well by people who have put deep thought into it, at least enough for solid debate. I have certainly met people with sound theological training who are deep thinkers who could define "God" very well, more than enough to have an illuminating debate on the subject. The fact that not everybody can do this doesn't really mean much, any more than the fact that not many people can define "wave-particle duality" well enough to debate the Copenhagen Interpretation means quantum mechanics cannot be argued about.

Expand full comment

Correct, very muddled. In modern usage:

"Atheist" = I don't believe there is a God.

"Agnostic" = I'm an atheist but I want to get invited to parties.

Somewhat kidding

Expand full comment

+5 Funny

Expand full comment

"Atheism is a religion like not collecting stamps is a hobby."

Expand full comment

This has always been a stupid idea as atheists positively believe something about existence itself: One has no option, existence necessitates a faith position. Our existential disposition is nothing like a hobby or the lack of one.

Expand full comment

"This has always been a stupid idea as atheists positively believe something about existence itself"

Not all atheists.

"One has no option, existence necessitates a faith position."

"Faith" is an ambiguous word that is widely used for equivocation. You are responding to a comment on the issue of whether atheism is a religion, so if we were to take you as acting in good faith, you must be using "faith" in context to mean a religion. But existence absolutely does not necessitate a religion.

"Our existential disposition is nothing like a hobby or the lack of one."

You seem to not understand how analogies work. "an arm is to a body as an anchor is to a boat" is not asserting that arms are made out of metal.

Expand full comment

Even as a metaphor this fails. Someone who doesn't *particularly* collect stamps would nevertheless probably have a few lying around, for use on letters, having come from letters, et cetera. To make the metaphor accurate you would need to imagine someone who quite deliberately avoids all contact with stamps -- who never buys them, never uses them, immediately throws away any that come to him by accident, and who from time to time expresses incredulity that his idiot neighbor seems to think stamps are useful.

Such a person couldn't exactly be said to have a "hobby" but he could certainly be said to have a hobby horse, a fetish of sorts.

Expand full comment

Having stamps lying around isn't a hobby though. And having 1% of the stamps that a collector has, but only using them to mail letters, is not the same as having 1% of a stamp-collecting hobby.

You're changing the original analogy.

Expand full comment

No I'm not. You're redefining "hobby". I merely took the analogy literally, and said if someone goes out of his way to avoid stamps, the way an atheist goes out of his way to construct a firm disbelief in God (remember we are *not* talking about agnostics, or people who just don't give a shit and have no opinion either way), that person would definitely be describable as having an obsessive interest that is pretty much the core aspect of what a "hobby" is.

Expand full comment

Storing stamps in order to use them for their intended purpose is fundamentally different from stamp collecting as a hobby (no stamp collector plans to put the stamps they're collecting on letters), and therefore opposition to stamp collecting doesn't imply not using stamps. (One could even imagine opposition to stamp collecting on the basis of believing that they should only be used for their intended purpose.)

The vast majority (if not the entirety) of the people who think stamp collecting is stupid (and even that is an analogue to a rather narrow definition of "atheist") will still put stamps on letters.

Expand full comment

There are a lot of people who have the hobby of telling people that they don't collect stamps, though. Gosh, they'll bring up stamp collecting and how they don't do it and inject it into anything. Talk about meeting a new friend and they'll ask "do they collect stamps? I don't collect stamps!" They'll talk to strangers on the bus about how they don't collect stamps. They will quietly contemplate not collecting stamps as they stand on the porch, sipping their morning rice milk. They'll think about having woodcarvings hung in their den with quotations like "In this moment, I am euphoric. Not because of any stamp collecting. But because I am enlightened by my ability to be entertained by not collecting stamps."

Atheism is a religion like telling people how you don't collect stamps is a hobby.

Expand full comment

Actually, I have a hard time qualifying "telling people a certain thing about yourself" as a hobby, as well as "considering a certain thing." (Especially if that thing is so common. If everyone collects stamps, those who don't would naturally give that decision a lot of deliberation.)

And even if it was, the metaphor would just break down here: You would have talked about hobbies, but the actual question was whether atheism is a religion.

Also, you're describing at most a subset of atheists, not anything inherent or even central to atheism. You don't actually need to do anything to be an atheist.

Expand full comment

As a former Evangelical, let me tell you this:

I have met dozens of people who told me that Christianity is not a Religion. It is a *RELATIONSHIP*.

Personally, I think that they were playing a little fast and loose with various definitions and didn't mind that I might have been using a different definition for one (or more) of the words that they were using in their assertion.

In any case, if we agree that this is one of those things that happens, I'm good. I don't need you to agree that ALL ATHEISTS ARE LIKE THIS because, as we know, they aren't.

But we've all met the Atheists who have a hobby of talking about how they don't collect stamps. We might have even heard the stamp collecting simile from them.

Expand full comment

I still wouldn't qualify acting on a conviction as a hobby.

And there's still a very important distinction to make between "can take the shape of a hobby in a certain subset of people"(which is probably true of everything) and "is inherently a hobby" like stamp collecting.

Expand full comment

I wouldn't say there's no such thing as being better at rationality. I would say, holy shit, rationalists have gone off to some weird, wildly unsuccessful, and in some cases deeply destructive places as part of that pursuit. That doesn't make me say the whole thing is fruitless, but the pursuit of it as your goal is maybe not the best way to get closer.

Expand full comment

"Wildly unsuccessful" seems likely true to me (although that wouldn't distinguish it from any other comparable group of humans loosely aligned by beliefs.)

But I'm curious what "deeply destructive" things you think rationalist community hath wrought.

Expand full comment

At the risk of being too meta, making the increasingly wider culture leery of the descriptor “rational” seems like a net loss for the cause of rationality writ large. And while that is not to say I think the leeriness is all that justified, I think it's fair to say that the capital-R Rationality movement has made choices, and accumulated quirks, that weren't, stricto sensu, necessary components of “a community of people trying to be more rational”.

Expand full comment

I'm not sure if it's necessary for a "community of people trying to be more RATIONAL", but it might be for a "COMMUNITY of people trying to be more rational". I'm sure that in retrospect some choices should have been made differently, but quirks and weirdness can be socially load-bearing and removing them might not be for the best.

Expand full comment

A lot of people who were involved in lesswrong in the early days went into the neoreactionary movement, which has now more or less entirely merged with the alt-right and espouses various conspiracy theories. Separate from any judgement of their politics, they don't exactly seem to be upholding good epistemic standards.

Expand full comment

I'm pretty sure "a lot of people who were involved in lesswrong" is still very few people. Are you sure we can directly connect the opinions of those particular people ten years ago to a major political movement today?

Expand full comment

Actually, rationality is almost defined by the fact that it gets the wrong answer 90% of the time, and the right answer the remaining 10%. (Numbers made up for illustrative purposes.) That's because rationality as a philosophy concedes axiomatically the possibility of getting the wrong answer even by a very robust chain of logic, because logic can have subtle flaws, and because the assumptions and data that underly the chain of reasoning can be flawed.

The Second Law of Thermodynamics does the rest: since there are almost an infinity of ways to be wrong, and very few to be right, it follows that any nontrivial rational line of inquiry will come up with the wrong answer most of the time, if not nearly all of the time. That's why we back it up with empirical testing. ("Your theory proves X is true? Excellent! Now we measure X and see if we trust the particular logic.")

It's only religious faith and social mythology unmoored to any objective standard of proof that can ever be "correct" all, or almost all, of the time, because these philosophical approaches do not admit of the possibility of a formally correct conclusion being, nevertheless, false.

Expand full comment

I love the analogy between rationality and the 2nd Law. Taking it further -- just as the universe starts out in a low entropy state, we could imagine that our pre-rational beliefs are much truer than randomly chosen beliefs, such that random perturbations would usually make them less true. Just as the time evolution operator exp[-iHt/ℏ] transforms the wavefunction of the universe in a way that is uncorrelated with the coarse-grainings we use to define entropy, we could model rational inquiry as a process that gradually transforms our beliefs in a way that is uncorrelated (or only partially correlated) with truth. And just as it follows from the above that the universe evolves toward high entropy states, because most states are high entropy by definition, it follows that our beliefs will usually evolve toward falsity, since most propositions are false.

[And at one less level of meta - to what degree is the fact that most propositions are false itself a literal consequence of the present state of the universe being low entropy? I'm not sure. Even in thermodynamic equilibrium the world only occupies one particular state out of many, and it seems that most possible statements would therefore still be false. But this depends on the mapping between states and propositions.]

I think this analogy highlights two important points of disagreement between defenders of rationality and its critics. First, how trustworthy are our pre-rational beliefs? If you think that cultural and social processes have done a pretty good job, you might be wary of modes of thought which move us away from those outputs. And second, how correlated with truth are the changes in our beliefs induced by rationality? Importantly, if most possible changes to our beliefs would make them less true, then even if rationality does a pretty good job screening out bad changes it could still lead us away from the truth most of the time. For example, if 99% of changes to an existing belief make it less true, then even if rationality excludes 90% of the bad changes and none of the good ones rationality still leads you away from truth 90% of the time.
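That closing arithmetic checks out; a quick sketch, using the made-up numbers from the paragraph above:

# 100 candidate changes to a belief: 99 make it less true, 1 makes it more true.
bad, good = 99, 1

# Rationality screens out 90% of the bad changes and none of the good ones.
bad_left = bad * (1 - 0.90)    # 9.9 bad changes survive
good_left = good               # 1 good change survives

share_bad = bad_left / (bad_left + good_left)
print(f"{share_bad:.0%} of adopted changes still lead away from truth")  # ~91%, roughly the 90% claimed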

Expand full comment

I would say our pre-rational beliefs are exquisitely tuned by evolution, since for most of our existence a false belief ("That lean-looking tiger with the intent stare is following me because he just wants a hug. Isn't he cute!") had drastic consequences which ruthlessly pruned them out of the genome, and even the memome (what we learned from our parents and tribe).

However, those beliefs were also tuned for a world we no longer inhabit, in which the threat from the environment dwarfed the threat of, say, overeating or joining a cult or failing to sock away enough 401(k) money, the kinds of weird complicated threats with which we (try to) deal today. So arguably while our pre-rational beliefs are extremely likely to be true, they are also, alas, likely to be inappropriate in many respects to modern existence.

Yes, I would say an ordinary abstract application of the Second Law says the odds are that the most likely *random* evolutions of our beliefs (= due to "thermal" fluctuations in the environment, which here we can take to mean random undisciplined experience and thought) will move away from correctness. That doesn't really seem controversial, if you reflect on history, or one's own youth. Most of the ideas I had as a 6-year-old to explain the universe were wrong and caused much amusement to my elders. Most of the ideas the Greeks had about atoms were wrong, and so on.

But that doesn't mean we're doomed, because entropy is not the only force (to continue the analogy). We can have enthalpic effects, too, e.g. we can put a heavy reward-weight on any evolution of our beliefs doing a better job of predicting real-world events according to some rigorous and objective test. This is what empiricism is all about. This pressure opposes the pressure of entropy that tends to make our ideas evolve in screwy ways, so *on balance* we *can* end up with rationality improving the correctness of our beliefs. But -- and this is my point -- it is *always* an uphill fight. We are *always* struggling against entropy, always walking a tightrope where even a small misstep in logic can plunge us into an abyss of endless error, and it's pretty much only by a rigorous and continuous re-assessment and re-testing against empirical fact that we avoid this and actually get somewhere.

I'm also just saying that just because someone falls off the tightrope 90% of the time doesn't mean he doesn't eventually succeed. You make error after error, but hopefully each time you get further, and your errors are less stupid and consequential, and eventually you may arrive at such a low probability of error that we can almost say you're right.

Expand full comment

"The Second Law of Thermodynamics does the rest"

The second law of thermodynamics has nothing to do with it, other than metaphorically.

"since there are almost an infinity of ways to be wrong, and very few to be right, it follows that any nontrivial rational line of inquiry will come up with the wrong answer most of the time, if not nearly all of the time."

Since for any false statement, we can easily get a true statement by taking its negation, it appears that the number of false statements is equal to the number of true statements, at least to the point that if you wish to assert otherwise, the burden of proof is yours. Further, since lines of inquiry do not consist of randomly choosing a statement to make, and in fact are quite the opposite, your conclusion doesn't follow. To top it off, your argument is self-defeating; it is (an attempt at a) rational argument, and so suggests that it has an overwhelming likelihood to be wrong.

"That's why we back it up with empirical testing."

You are implying that rational inquiry and empirical testing are mutually exclusive, but Rationalists employ both. While there are contexts in which "rationalism" might mean "eschewing empirical data", this is not one of them.

"It's only religious faith and social mythology unmoored to any objective standard of proof that can ever be "correct" all, or almost all, of the time, because these philosophical approaches do not admit of the possibility of a formally correct conclusion being, nevertheless, false."

Unless by "unmoored to any objective standard of proof", you mean "not making any meaningful claims about the world", your claim does not make any sense without substituting "admit to" for "admit of".

Expand full comment

This is kind of a confused mish-mash to me, and unfortunately I have no idea what you're trying to say, or what point you're trying to make, other than that you have a visceral distaste for what I wrote. If you want to rework it into something more targeted, I'll be happy to take another stab at engagement.

Meanwhile, let me point out that your assertion that the contrary of any false statement is true is functionally useless, since almost all of those infinity of statements are of no practical conceivable consequence. The contrary of "2+2=5" is merely "2+2 is not 5", and that statement is useless, since the set of numbers to which 2+2 is not equal has the cardinality of the continuum.

The way we navigate through a sea of infinite statements is to group them, and in this case we would rationally group together all the statements of the form "2 + 2 = N". In that case, I hope it's pretty obvious that the number of such statements that are false is rather larger than the number (just one) which is true. That's the essence of what I mean: if you group together statements of the same type about any particular relationship, the number of those statements that is true in any formal system is enormously less than the number that is false.
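A trivial sketch of that grouping (my illustration, over an arbitrary finite range of N):

# Among statements of the form "2 + 2 == N", exactly one is true.
results = [(2 + 2 == n) for n in range(-100, 101)]
print(f"{sum(results)} true, {len(results) - sum(results)} false")  # 1 true, 200 false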

You can if you like

Expand full comment

"This is kind of a confused mish-mash to me"

Simply saying that you don't understand my point doesn't help much, and you're the one making the assertion, so the burden of proof is on you to show how your argument is right. I think that my point that rationalist reasoning does not consist of picking statements at random is rather clear, and I think that how it destroys your argument is clear.

"Meanwhile, let me point out that your assertion that the contrary of any false statement is true is functionally useless"

It is useless, but only because the issue of "what 'measure' of statements are false?" is pointless, and you're the one who thinks that it does have a point. To the question of what percentage of statements are false, my assertion is pertinent.

"since almost all of those infinity of statements are of no practical conceivable consequence.

That's not the issue. You're implying that people pick statements randomly, not that they pick statements randomly from the set of statements of practical consequence.

"In that case, I hope it's pretty obvious that the number of such statements that are false is rather larger than the number (just one) which is true."

The number of *groups* of false statements may be larger than the number *groups* of true statements, but that's not what you claimed. This looks like moving the goalposts to me.

Expand full comment

I don't think it's worth arguing with this fellow. Someone who asserts (as if obvious) that a law of physics somehow directly affects our ability to say true things.. there's not enough room in this thread for the unlearning that's necessary.

Expand full comment

Pretty sure you're missing Carl's point here. The 2nd Law thing is just an analogy (between points in phase space and points in idea space), and imo a pretty interesting one.

Expand full comment

This is not making any more sense to me, sorry. If you are at all interested in continuing, why not take a very specific stab at a very concrete thing: I said "the number of ways to be wrong about something greatly exceeds the number of ways to be right, so the a priori probability of being wrong is high."

If you *don't* agree with that statement, tell me why. What is *your* position? Do you believe the number of ways to be right about something is equal to, or exceeds, the number of ways to be wrong? Why? Or, do you think that the number of ways to be right or wrong has nothing to do with the a priori probability of getting something right or wrong? If so, why?

If you can get down to those brass tacks, maybe we have something to talk about.

Expand full comment

I think that I have been quite clear. I have said that the number of true things is the same as the number of false things, and you changed the subject to the number of "groups" of true things versus the number of "groups" of false things. I stated that reasoning doesn't consist of randomly choosing among statements, so the proportion of statements that are true is not a good guide to how many conclusions are true, and you simply ignored that. You just keep saying that you don't understand what I'm saying, without any explanation as to what's confusing about it.

Expand full comment

2LoT has nothing to do with this except metaphorically.

"since there are almost an infinity of ways to be wrong, and very few to be right"

At least by some metrics, the number of correct statements and the number of false ones are equal.

"it follows that any nontrivial rational line of inquiry will come up with the wrong answer most of the time, if not nearly all of the time"

Since inquiry does not consist of choosing statements at random it does not in fact follow.

"That's why we back it up with empirical testing."

Empirical testing is not mutually exclusive with rationality, and in fact is part of it. You seem to be using a definition not appropriate here.

"It's only religious faith and social mythology unmoored to any objective standard of proof that can ever be "correct" all, or almost all, of the time, because these philosophical approaches do not admit of the possibility of a formally correct conclusion being, nevertheless, false."

Unless you are referring to them making unfalsifiable, meaningless claims, you should replace "admit of" with "admit to".

Expand full comment

It's not impossible to be better at rationality, but the people who think they are better at it are rarely the ones who are.

Expand full comment

An interesting belief system. But I think the question was not whether the people who *think* they're better at rationality actually are, but whether the people for whom being good at rationality is a high priority end up being better at it.

So let's try it by analogy: hardly anyone is going to be so good at baseball that he never drops a catch, or fails to hit when at bat. And we can agree that whether someone who thinks he's good at baseball actually is good is somewhat undecidable -- maybe he is, maybe he's a deluded fool.

But surely we can agree that the people who *care* about being good at baseball, who think it's an important life skill, who spend a lot of time, energy and thought on improving their skills, are more likely to be good than people who think baseball is a waste of time because nobody can ever bat 1.000 or catch every single throw.

Expand full comment

The people who *signal* that they're better at it are rarely the ones who are.

I have no doubt that people who are good at it quietly know they're good at it. But if you're signaling it, you're at least missing some key stuff about signaling and social status.

Expand full comment

How about people who explicitly work on being more rational? I’d expect them to do better at being rational than people who never gave it a thought.

Expand full comment
founding

My problem with the rationalist community is that, while I agree with so many WAYS of thinking, and I agree with so many leaders, so many of the members that show up in comment sections, or at chatrooms and meetups, have used the principles and methodologies of rationality to argue for ridiculous positions that have glaring holes in the logic one way or another.

In addition, it feels like the average member of the society is often out of touch with social norms and decorum.

In a perfect world, the followers of rationality could depend on each other, allowing each of us to specialize in applying the tools to a specific field; but instead, I find myself trusting rationalists slightly less than anyone I would meet in any other community on their overall decision-making abilities.

Expand full comment

"it feels like the average member of the society is often out of touch with social norms and decorum"

In my experience with other rationalists, this is generally a point in their favor. Social norms and decorum isn't necessarily that great, and while many in general society do worse, there's plenty of room to do better.

Expand full comment
founding

Baby steps perhaps. It's one thing for an otherwise conforming person to say "I think we should do away with schools," but it feels different when they are also anti-AI, still not going outside (though covid doesn't spread outside), washing groceries that are delivered (even though covid doesn't spread through fomites), and usually avoiding traditional hygiene practices.

Expand full comment

Do you want rationality to be high-status or low-status? I want it to be high-status, so that people are drawn towards it.

If the term "rationality" gets tainted by association with a bunch of low-status people behaving in low-status ways, then this is a net loss for the overall rationality of our culture.

Social norms and decorum are extremely important; they are necessary for having social status. Apart from being a nice end-in-itself to your ape brain, social status can be traded for influence. If you're going to start convincing people of your rational-but-unpopular points of view you're going to need a lot of social status before you begin.

Expand full comment

If status requires rationality to jettison much of what makes it valuable, then I want rationality to remain low-status.

Expand full comment

Suppose your society/culture encourages some crazy and expensive practices, like buying new luxury cars all the time to keep up one's appearances, and looks down on those who buy cheaper used cars. You understand that practicing frugality is the smart thing to do, so you find like-minded people who do the same. But outsiders look down on you: "Look at these rationalists with their cheap cars! They'd be more popular if they got with the program and started wasting money on luxury cars like the rest of us!" Which is true, but missing the point. First, we want to avoid the same mistakes everyone else is making. This sometimes implies opposition to some common social norms, and local behavior that's governed by different norms that we believe to be better. Second, we want to support each other in the practice of rationality, and have a refuge from the wider world where we don't push that kind of status on each other. And when we don't, it looks low-status to outsiders, but that's fine.

Expand full comment

Said it better than I would've.

In theory, I can imagine someone carefully storing up weirdness points for some future point when they intend to spend them. In practice, I don't think the points in question (practically?) ever get spent.

And the people advocating for this often seem to be pushing in a direction rather than hitting a target- I almost never hear someone say "No, don't do that weird nonconforming thing- do *this* weird nonconforming thing instead, it's a better use of social capital!" Not *quite* never, but maybe *twice* in many years of these discussions, not counting instances where this was elicited with a lot of prodding. At the risk(?*) of bulverism, I think people's True Rejection here is "That's too weird!" not "That's an inefficient use of resources!"

*It's definitely bulverism in the sense that it's speculation about their motives, but sometimes motives are inescapably relevant. I feel like there's real tension between not bulverising people and not rules-lawyering them- you can either respond to their argument *as they set it down*, or you can do mind-reading, or you can go *beyond* mind-reading into steelmanning (which usually *does* require *some* mind-reading). If someone's argument is "Too weird!" or "Too far!" that basically translates to "My intuition is that this is too [weird/far]," and the *source* of that intuition is relevant if we are to consider it at all.

Expand full comment

Do you want supporting same sex marriage to be high-status or low-status? If homosexuality gets tainted by association with a bunch of low-status people like trans people, drag queens, leather daddies, sex workers, non-monogamous people, people who lisp, etc., that is a net loss for acceptance of homosexuality.

Expand full comment

"Out of touch with social norms and decorum" just means willing to discuss things that social norms and decorum say shouldn't be discussed.

Expand full comment

No, many of us are also very weird and awkward much of the time, and it's disingenuous and misleading to pretend otherwise.

Expand full comment

This is a good thing; but some norms are more norm-y than others.

To jump straight to the final argument:

Being open to argument on cultural conflict in the middle east: Good!

Being open to argument on cultural conflict in eastern Europe circa 1940: Less Good!

I've run into 'rationalists' that move smoothly from one to the other. I'm pretty sure you wouldn't think of them as rationalists, but that's how they self-identified, and that's how others see them.

Expand full comment

It would be odd to say here "In addition, they smell funny and I don't like their taste in music." Even if true, it seems disconnected from the question of whether rationalists are any good at figuring out things that are true. Why is their sense of decorum any more relevant?

Expand full comment

For the same reason it's ever relevant in the first place: being able to follow rules of decorum is a demonstration of social and emotional intelligence, and choosing to follow them is a signal of community and shared purpose with the people you're following them in front of.

It's all signalling games, yes, but signalling is important in an interdependent social world. If someone indicates to you that they're unable to understand the emotions/experiences/expectations/needs of the people around them, or that they don't care about your community or comfort enough to follow the proper niceties, then it is rational to start to worry that they might not have your best interests at heart, or understand you well enough to achieve them.

Expand full comment

Very nicely put.

Expand full comment

The "rationalist" community contains a lot of sophomoric people who, true to stereotype, are oblivious to how sophomoric they are. It's like encountering a gaggle of people who modeled their persona on Sam Harris. This gives the identity a bad reputation. It doesn't help that the term creates lofty expectations.

If only it were just a group of people who like to be hyper-aware of psychological biases.

Expand full comment

I think the point that there are gradations of more or less rational is perfectly legitimate, and I further think that we should frequently (but not always) work to be more rational. I think the rub is a) there are elements of human life that will always defy rationality as conventionally defined, particularly those related to our emotional selves, and b) the question endures whether the human brain has evolved in such a way as to achieve a particular kind of eventual objectively rational state that truly transcends human subjectivity and all of the fallacies and problems we associate with it. My problem with rationalists is that most simply assume the answer to b) is yes.

But sure, more or less rational exists.

Expand full comment

Would you grant that unrelated individuals, perhaps from different tribes, can arrive at better agreements/disagreements if the individuals are both committed to rationality as a virtue?

Expand full comment

Don't they also need to have matching, or at least non-contradictory, definitions of "rationality?"

Expand full comment

I would say that there are better and worse agreements and/or disagreements to arrive at. For example, if you are to terminate discussion in disagreement, a disagreement where you have arrived at high level generators of disagreement (or values) is better than one in which you have misunderstood each other.

I would say, the more you two agree on what rationality is, the more likely you are to arrive at better agreements/disagreements.

Expand full comment

I understand the argument, and up to a point I generally agree with you. I should like to point out though that in the real world it is actually frequently the case that better (= more functional) agreements are obtained when people are a little vaguer about what exactly they mean by the terms -- more or less, there is a certain deliberate obfuscation of what each truly believes to be "rational."

The Constitution is a good public case in point: I think one reason it succeeds is because it is sufficiently vague at key points that it allows people to paper over deep but transient ideological divisions with temporary mutual misunderstandings that maintain the practically useful fiction of social harmony while the actual underlying divisions are worked out by some non-judicial process.

The private example is marriage: I've never known a situation where a husband and wife believe *exactly* the same thing about what their marriage is all about, and for many durable couples there are a number of useful ambiguities (if not outright myths) they preserve which allow divergence in what each partner (sometimes secretly) believes, for the sake of overall harmony.

Expand full comment

That's a very thoughtful point. I hadn't thought about it in that way but I think you've persuaded me that you're right that in practice it's probably better that they actively avoid trying to pin down exactly what they mean by their commitment to rationality. Not just for the sake of the semantic tedium of doing so, but because they might actually disagree and the abstraction might be more functional.

Expand full comment

My takeaway from reading Yudkowsky's sequences was that the human brain has not evolved to be perfectly objective, and that it will forever harbor untruths and biases. We should be aware of these facts, and consequently compensate for these by using Bayesian analysis, etc

Expand full comment

I'd bet that you are wrong about what most people around here think.

Expand full comment

I'm not convinced that human emotional life is an area which innately defies human rationality. People tend to romanticize the idea that love is fundamentally inaccessible to rationality, but... dealing rationally with one's relationship life seems like an area people can be markedly bad at, in the same "if you can be bad, you can also be good" sense. Some people make disastrous relationship decisions which others immediately recognize as foolish and self-defeating, fail to learn from their mistakes, and suffer for it over and over again. There are other people who don't do that. Some people have good enough sense for dealing with relationships, navigating difficult emotional situations, etc. that people who know them tend to come to them for advice and guidance.

Surely we can strive to be more like the people who people go to for emotional/relationship advice, and less like the people who other people view as cautionary tales?

As far as b) goes, I'd have to join in with the others who simply disagree that the opinion you describe is the one most rationalists hold.

Expand full comment

Just a passing thought: the love example is ripe for confusion, since irrationality is a signal in mating. Some evo-psychologists speculate that women like receiving ridiculous gifts like diamond jewelry because someone who is willing to invest irrationally in her is also more likely and able to invest well in her offspring. The same behaviour occurs in nature in various ways.

So for some straight men, the rational thing to do is sometimes the irrational thing.

Expand full comment

I don't think that ability to make persuasive gestures necessarily relies on blindness to their function. I can make gifts of expensive jewelry (and have, and chosen pieces which were well-received) while understanding its function as a signal.

That said, I can also make judgments about whether partners' expectations or desires in relationship signals are compatible with the kind of relationship I want to have.

Expand full comment

That's just signalling, not 'irrationality'.

Expand full comment

Diamonds are a girl's best friend 😀 Jewellery and other expensive gifts are a sign of commitment and sincerity, and help to insure a woman against the bad effects of getting involved with a man. There is always the risk that a man will make promises in order to get sex, then break those promises once he has obtained it, leaving the woman pregnant and the single mother of a baby she will have difficulty supporting. Economic investment like diamonds is (a) putting your money where your mouth is, and (b) insurance: if he does break his promise of marriage afterwards, you can sell the diamonds to provide income for you and the baby.

But diamond gifts were deliberately created and sold as "proof of romance" not by women, but by jewellers and diamond merchants. It's not about sex, it's about commerce https://www.theatlantic.com/international/archive/2015/02/how-an-ad-campaign-invented-the-diamond-engagement-ring/385376/

Has evolutionary psychology tackled the thorny question of why diamond brokers engage in irrational investment in ridiculous gifts? 😁

Expand full comment

>and to help to insure a woman against the bad effects of getting involved with a man.

If financial insurance were a major consideration, Mr. T-esque festoonment with gold would be much more effective. Diamonds actually have strikingly poor resale value; stores will mostly only buy back diamond jewelry at a fraction of its cost, if at all, and you're likely to have to resort to a pawn shop, which is never a great financial return.

I suspect that the whole idea of diamonds functioning as financial insurance against an unreliable man is ultimately a cultural evolutionary just-so story, one that originated as a rationalization of a tradition that diamond brokers created without reference to such a practice.

Expand full comment

"Diamonds actually have strikingly poor resale value; stores will mostly only buy back diamond jewelry at a fraction of its cost"

Fraction of its retail cost. My understanding is that diamonds are a useful currency, but buying them from a jewelry store means paying a massive markup. There's a jewelry store that guarantees its diamonds will appraise for double their sale price, and I guess they expect prospective customers to think they're getting a great deal, but I can't see how anyone can hear that and not conclude that jewelry appraisal is a racket.

Expand full comment

Diamonds aren't that great as currency because their scarcity is artificially constrained. There are other gems which would be more viable as currency (more intrinsically rather than artificially scarce, and harder to synthesize at gem quality) if they had the appropriate cultural baggage, but they'd still be less fungible than rare metals, whose value is determined almost entirely by quantity.

Expand full comment

Where on earth has anyone (other than random crazies, etc) claimed (b)?

Expand full comment

When you say that Rationalists are "genuinely willing to change their minds", have you ever changed any minds regarding Marx? I'm curious about this one, because I've found Rationalists to actually be very judgmental and quickly close off debate when Marx is referenced.

Expand full comment

Yeah but he's Karl Marx - red rag to all sorts of bulls... 🐮😁

Expand full comment

Am I to presume that your idea of genuinely changing one's mind means coming to accept Marx as our Lord and Saviour? What if I read Marx, disagree with some of his conclusions, and decide on the whole to reject Marxism?

If your only evidence that they are being judgemental is "they haven't become Marxists", that does not mean they are being judgemental or irrational.

Expand full comment

The problem is that Rationalists largely don't read Marx. When Scott Alexander tried reading a secondary source he infamously failed to understand some of Marx's basic points, e.g.;

https://astralcodexten.substack.com/p/book-review-global-economic-history#comment-1795464

Expand full comment

A reply to one of his posts by you that garnered 2 replies?

'Infamous' is pretty darn melodramatic, don't you think?

Expand full comment

I've been repeating this critique for something like 2 years and I think most people seem to be aware of my reputation by now.

Scott Alexander, however, has never fixed his mistake, despite thinking of himself as a mistake theorist.

Expand full comment

According to my Google search, the Communist Manifesto is 2156 pages. Not reading that seems like a rather rational decision. There are 33 million books listed on Amazon. There's simply no way anyone can read all of them. Picking one of the books I haven't read and pretending that makes me closed-minded is rather problematic.

Expand full comment

The Manifesto is like 30 something pages. Do you mean all 3 volumes of Capital?

Expand full comment

From my own observations of the rationality community, there isn't really a belief that we could reach some purely objective rational state, given our own limitations. They believe that we can do massively better than we are now, but I haven't seen anyone suggest that we could somehow evaluate everything properly. In fact, a sizable focus of various posts (primarily the ones on evolution in the sequences) was arguing that we are fallible and have the problems of being a human mind; the sequences advocate trying to be better at rationality (of course) while also holding that we can't manage it fully. I could see an argument that we assume such a state could be reached with some form of artificial intelligence, but even then there is often the acknowledgment that it is limited by physical reality (and thus cannot consider every possible situation to maximize its goals).

As for the parts about the 'emotional self', I am somewhat unsure as to what you mean here. Our emotions are part of the goal that we would use rationality to work towards, though not all of it - such as valuing human life even though I feel negligible (first-level? non-meta?) emotional impact from knowing that people die every day. If I hold emotional value for something then that should be part of my reasoning, but it should also be evaluated with my reasoning (basic example: I quite enjoy candy, but I know that I'll value the feeling of being healthy more).

Expand full comment
author

I hope there aren't rationalists claiming they've "evolved to truly transcend human subjectivity". But having said that, I'm kind of annoyed by the strength of that sentence. It would be insane to claim that I've "evolved to a state that truly transcends human anger" - but in fact I'm a pretty chill person. If someone rephrased my claim to be a chill person into a claim that I have "truly transcended human anger", I would look like a moron, but that's the phrasing's fault, not mine.

Expand full comment

Do you attribute your being a chill person to (1) your DNA, (2) your rearing, (3) an adult rational reconstruction of personality, or (4) something else?

Because if you can't argue convincingly that it's (3), I'm not sure there's any reason to credit it to rationality as a philosophy.

Expand full comment

Or it could be (2), if Scott's parents raised him to be a Chill-ite, or (4), if that something else can rationally be attributed to rationality? Plus natural selection isn't entirely irrational...

Expand full comment

Uh. He didn't credit it to rationality? I think you completely missed his point. He's explaining his problems with a rhetorical device, not claiming to be chill because rationalism.

Expand full comment

Alright mate calm down 😉

Expand full comment

I'm not talking about claims about what anyone has actually achieved, rather a conception of an ideal that has been discussed since long before the rationalists. And you can particularize the question: are we unable to reconcile gravity and quantum mechanics because it requires a type of thinking (like, say, geometrical thinking) that transcends what our neurology is capable of? I don't think people are saying they've reached objective rationality. But you'd like to know whether it was a constructive goal.

Expand full comment

Some unattainable ideals are healthy to strive for, while others aren't.

Would it be fair to say that you see the transcendent state of perfect rationality as the unhealthy kind of unattainable ideal?

Because if so, I suspect you're right. I've not been impressed by the thinking of people who try to purge every kind of bias from themselves; people willing to accept at least a little irrationality seem to be more rational.

PS: There's a lot of fairly easy math that transcends our neurology in significant ways. You can't really visualize many-dimensional objects, but adding dimensions doesn't always add much to the difficulty of a mathematical problem. Your mathematical abilities can easily reach beyond the limits of your own understanding. This isn't super relevant to rationality, but when it comes to quantum gravity I don't think it matters whether our brains are capable of "getting" what we're doing.

Expand full comment

Fair enough - and I agree that whether and how we can conceptualise the fundamentals of life, the universe and everything has no bearing on whether they exist (call me a radical realist). I'm not sure whether that means there are limits to rationality (which is, after all, an approach with a toolkit) or to our embodied sensory and neurological being as humans...

Expand full comment

If I had to rephrase how I read your last sentence without using "rationality" and getting straight to the point, I'd say: it's not a worthwhile goal to try to move towards being better at getting to the truth of things, building more accurate beliefs, having a better idea of why you hold them, or at accomplishing your goals.

Is this too far from what you meant? If so, could it be you are (or I am) confused about what most people around here understand by being more rational? E.g., I've heard Julia Galef push back time and time again against the image that the ideal rationalist doesn't use intuition or emotions to help navigate to the truth, or against the idea that it's a core belief around here that rationality can be easily trained (the most helpful "training" is probably just the community and norms around it, so that our "irrational" motives become more aligned with "good practices").

Expand full comment

Maybe it is the majority view here that it can be easily trained. Personally, I'm not sure, but I do believe that caring about it matters, whether it ends up really changing you directly or not.

Expand full comment

sorry, can you rephrase that again so it makes sense please? 😁

Expand full comment

If your b) comes from the focus on e.g. Bayesian reasoning in the LW sequences, that's a misunderstanding of why they're mentioned so frequently. Even if you grant the claim that there are laws of nature that govern optimal reasoning, that doesn't mean bounded agents (like humans, or AI, or whatever) can perfectly follow them to perform such optimal reasoning.

So does that mean mentioning these laws is useless? No; they provide a useful direction for how to incrementally improve one's reasoning, and conversely, they show what kinds of reasoning cannot work in principle. For instance, many fallacies (like becoming more certain in one's own belief after hearing bad arguments from the Enemy Tribe) straight-up violate basic logic and math. And more fundamentally, understanding how ideal reasoning works in principle can help you understand how it works in practice - in brains, neural nets, AI designs, etc.
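A minimal sketch of that fallacy in numbers (all of them invented for illustration): a reasoner who would end up *more* confident in their belief H whether the Enemy Tribe's arguments turn out strong or weak violates conservation of expected evidence, because the prior has to equal the probability-weighted average of the possible posteriors.

```python
# Conservation of expected evidence: P(H) = P(H|E)P(E) + P(H|~E)P(~E),
# where E = "the opposing arguments turn out to be strong".
# If both posteriors exceed the prior, the weighted average on the
# right can never equal P(H) -- the update rule is incoherent.
# All numbers below are invented for illustration.

prior = 0.70
posterior_if_strong = 0.75  # belief in H after hearing strong opposing arguments
posterior_if_weak = 0.80    # belief in H after hearing weak opposing arguments

for p_strong in (0.2, 0.5, 0.8):
    expected = posterior_if_strong * p_strong + posterior_if_weak * (1 - p_strong)
    print(f"P(strong)={p_strong}: E[posterior]={expected:.3f}, prior={prior}")
# The expected posterior exceeds the prior for every P(strong):
# no coherent probability assignment behaves this way.
```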

There's an essay that touches on the issue of what laws of reasoning are good for, which may help dissolve this confusion: https://www.lesswrong.com/posts/CPP2uLcaywEokFKQG/toolbox-thinking-and-law-thinking

Expand full comment

>For instance, many fallacies (like becoming more certain in one's own belief after hearing bad arguments from the Enemy Tribe) straight-up violate basic logic and math.

Many fallacies might be fundamentally mathematically wrong, but becoming more certain of one's own beliefs after hearing a bad argument against them isn't necessarily one of them.

Normally, one assumes that proponents of a belief have arguments in favor of it. If you take that belief as true or false, you'll generally have an implicit estimation of how strong you expect those arguments to be. For instance, I know from experience what sort of strength of arguments to expect from people who believe the moon landing was a hoax.

If someone is, to the best of your knowledge, an ordinary proponent of a belief, and their arguments for that belief are significantly weaker than you would have anticipated, that actually is evidence that the evidence supporting the belief is weaker than you initially thought (of strength depending on the likelihood that they're arguing with the best evidence available - though they're not going to deliberately present their worst arguments).
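A toy Bayes calculation of that update, with likelihoods invented purely for illustration: suppose a typical proponent is noticeably more likely to produce a decent argument if the belief is actually true.

```python
# Toy update for "I heard a surprisingly weak argument for X".
# Hypothetical likelihoods: a randomly chosen proponent produces a weak
# argument 30% of the time if X is true (bad luck, bad arguer), but 80%
# of the time if X is false (no better argument exists to find).

p_x = 0.5                   # prior that X is true
p_weak_given_x = 0.3
p_weak_given_not_x = 0.8

posterior = (p_weak_given_x * p_x) / (
    p_weak_given_x * p_x + p_weak_given_not_x * (1 - p_x)
)
print(f"P(X | weak argument) = {posterior:.2f}")  # ~0.27, down from 0.50
```

The stronger your expectation that this particular person would have surfaced a good argument if one existed, the lower P(weak | X true) gets, and the further the posterior falls.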

After reading the book Flash Boys (https://www.amazon.com/Flash-Boys-Wall-Street-Revolt/dp/0393351599) I suspected that High Frequency Trading was probably not providing societal value that justified the level of revenue it generates. But I wasn't *that* sure, because I hadn't heard proponents give their best justifications of why it might be. When I discussed the subject with someone who was previously employed as a high-frequency trader, who argued against the position of the book (without having read it), I became significantly more convinced that the book was correct, because his arguments defending HFT as having any sort of societal value were much weaker than I would have expected an intelligent person with a stake in the subject to be able to generate if it had any to speak of.

Expand full comment

Isn't a core assumption of your argument that any given proponent of a belief can articulate the best argument for that belief? If it's the case that only 1/100 people who believe X can articulate the best possible version of the argument for X, hearing a bad argument, most of the time, is expected.

Expand full comment

They don't have to be able to; absence of evidence is evidence of absence, proportional to the strength of expectation of evidence given the truth of a proposition. The more strongly you expect a specific person to be able to give a good argument for a position contingent on good arguments being available, the more you should decrease your expectation that good arguments are available if they fail to do so.

Going back to the specific example I gave, I don't believe that the person I spoke to gave the most persuasive argument possible in favor of HFT. But he was a significantly smarter-than-average person who'd worked as a high-frequency trader, with an emotional investment in defending the societal worth of it existing as an institution, and his argument was that there should exist means for smart people to become really rich - that is, that smart people should be able to use their intelligence to acquire dramatically greater personal financial returns than they could achieve otherwise, or than less intelligent people could achieve simply through hard work. When it came to the actual specific value provided to society by HFT, he acknowledged that he couldn't think of any, apart from the fact that it was a means for smart people to become rich.

I started the conversation suspecting that HFT essentially constituted an exploitable bug in our financial system whereby people could accumulate revenue without producing societal value. So when an intelligent person experienced with HFT, with an emotional investment in defending it, attested that the only point he could come up with in its favor was its ability to concentrate money in the hands of smart people without offering any actual social service in exchange, I could really only increase my original level of confidence. That's just way below the level of argument I'd expect from an intelligent person arguing in favor of a legitimately valuable form of work they've engaged in.

Expand full comment

I mean, the standard argument for HFT is far stronger than that, so I think you just seriously low-rolled in quality of argument-provider.

The standard argument is that HFT makes stock markets clear almost perfectly. Without quick traders, stock markets tend to be substantially more difficult to deal with. You don't REALLY have a single price you're dealing with for any significantly sized transaction; you're instead forced to interact with a set of sell and buy orders of various sizes and at various prices. This was eased somewhat prior to HFT by experts on the stock floor making money doing roughly what HFT is doing: making money off arbitrage and smoothing the market. They just did it slower and less accurately, which both means that it didn't smooth the market as well and that they made somewhat less money. HFT is just companies doing that, but better.

The trickle-down effect of stock markets clearing is the easy access to smooth day trading via mobile apps and whatnot, instead of having a variable hours-long delay in purchasing and selling stocks (as anyone who has purchased bitcoin has experienced.)
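A minimal order-book sketch of the clearing argument (all prices and sizes made up): a sizable market order against a thin book "walks" up through worse and worse prices, while a market maker's tight resting quotes keep the effective price near the best quote.

```python
# Toy order book: fill a market buy against (price, size) asks,
# sorted from best (lowest) price upward. Returns average price paid.

def fill_market_buy(asks, qty):
    cost, remaining = 0.0, qty
    for price, size in asks:
        take = min(size, remaining)
        cost += take * price
        remaining -= take
        if remaining == 0:
            break
    if remaining:
        raise ValueError("not enough liquidity to fill the order")
    return cost / qty

# A thin book: few resting orders, large gaps between price levels.
thin_book = [(100.00, 50), (100.50, 50), (101.25, 100), (103.00, 300)]

# The same book after a market maker adds tight quotes near the best price.
mm_book = [(100.00, 50), (100.05, 200), (100.10, 200)] + thin_book[1:]

print(fill_market_buy(thin_book, 400))  # ~101.88: the order walks the book
print(fill_market_buy(mm_book, 400))    # ~100.06: effectively one clean price
```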

Expand full comment

Since HFT pushed the margin of trade times from less than a second to less than a hundredth of a second, the social value of this increased speed is extremely tenuous. The arguer I spoke to acknowledged himself that this standard argument was in fact vacuous, but didn't think it was necessary to justify the existence of HFT.

Expand full comment

"becoming more certain in one's own belief after hearing bad arguments from the Enemy Tribe) straight-up violate basic logic and math"

How does that violate "basic logic and math"? You mention Bayesian reasoning, but you appear to have no understanding of what it is. P(bad argument | position false) > P(bad argument | position true).

Expand full comment

Don't disagree with your core point; I'd add that the sociology of scientists is anything but rational, and that this is more hypocritical than when irrationalists are irrational (since we don't expect otherwise). https://whatiscalledthinking.substack.com/p/when-scientists-function-like-priests

Expand full comment

Doesn't anyone who believes they are correct about a thing think they are more rational than whoever disagrees with them about that thing?

Or, as Steven Pinker says: "Reason is non-negotiable. As soon as you show up to discuss the question of what we should believe (or any other question), as long as you insist that your answers, whatever they are, are reasonable or justified or true and that therefore other people ought to believe them too, then you have committed yourself to reason, and to holding your beliefs."

Expand full comment

I think your conclusion is valid only if you assume reason in the first place, which is a bit circular. So I don't think a deeply religious person would think they are more rational than an atheist, given that they believe because of "faith" and see that as a virtue.

Expand full comment

I *do* think a deeply religious person thinks they are more rational than an atheist in the following sense

If you ask a religious person "Do you think an atheist's reasons for dismissing god make more sense than your reasons for having faith in god?" I think they probably say "No."

It's probably obvious that I'm making this up as I go, but I think people with faith always have reasons for their faith. And I think they usually think those reasons are pretty good.

Expand full comment

Yeah, you're right.

Expand full comment

"If you ask a religious person "Do you think an atheist's reasons for dismissing god make more sense than your reasons for having faith in god?" I think they probably say "No."

I would say they are more *obvious*. There's the usual set of things: how can a good creator permit the existence of evil; how can you be sure this one faith out of all the faiths is the One True Religion; if I pray and I don't get a miracle, doesn't that prove there is no god? And all the rest of it. Even the "science explains everything and we don't need a god to be a creator or whatever" part.

They are plain, testable, reasonable objections, and arguing against them often sounds like simple "I believe it because I believe it" circular reasoning or even just "I can't explain it, I just know/feel it's true" which is not at all satisfactory.

So there are atheists that you can't blame for being atheists, because by their lights - and the lights of most ordinary, reasonable people - they are being correct and have good reason for their lack of belief.

Expand full comment

I would suggest "legible" could be substituted for "obvious."

The way I understand it, Scott is pushing back against people who accuse the rationalist community of a special kind of hubris. I don't think the hubris is at all special.

I think literally everyone who thinks/feels they are right about anything, thinks their reasons for holding their beliefs are better than other people's reasons for holding opposing beliefs. They might acknowledge that their reasoning is less legible but I don't think anyone internally acknowledges their reasoning is worse.

So when someone says something like there is no "rationality" independent of ideology, it seems to me that in the sense that may be true, there also is no "ideology" that is independent of rationality.

Expand full comment

I'd say that the hubris is special in the sense that "rationalists" claim that their approach to using reason (which is universally important, as discussed above) is better than anybody else's. It's essentially a meta claim about being better equipped to reach correct conclusions in any area, on any question.

Expand full comment

But which philosophically aligned community doesn't claim this?

Expand full comment

Am I reading this correctly? Does Pinker say that if I claim my beliefs are true and that others should believe them, that makes me rational?

Expand full comment

iiuc Pinker is just saying something almost tautological: that if you believe your beliefs are reasonable and want to convince others, you can only do so through reason. Or something like that.

It can help to add the missing last words of this quote... "[...] and to holding your beliefs *accountable to objective standards*".

Expand full comment

Thanks for fixing that, was not an intentional omission.

Expand full comment

Not that you are rational, but that you're playing the game of reason. If you're trying to get people to believe things because you say they're true, rather than because you say you'll kill them for disagreeing, you're doing rational argument.

Expand full comment

I don't think so; I think you're over-privileging rationality as a universal human motivator because of how it is treated in your social bubble.

For instance, there are people who genuinely privilege faith over rationality, and believe they are right because they have faith in their beliefs. There are people who believe that their correctness stems from moral uprightness or virtue. There are people who believe they can't trust fancy 'rational' models and deductions, and trust the evidence of their own senses and experiences over what 'rationality' tells them must be true via deduction. Etc.

Expand full comment

Well put

Expand full comment

Maybe my definition of rationality here is confused or too broad?

When I say "doesn't everyone think they are more rational than people they disagree with?" I mean something like: "doesn't everyone think their reasons for believing what they believe are better than the reasons the other guy's reasons for believing what they believe?"

People who trust their own instincts over other people's chain of reasoning think their instincts are more reliable than the other guy's reasoning.

And I don't think people have faith without believing they have good reasons to have faith.

Acknowledging that your epistemology is illegible is not the same thing as conceding that your beliefs are irrational.

Expand full comment

Too broad, I think. You're ending up in, or at least headed to, a place where what you're saying is "everyone who believes he's right does so because he believes he's right," i.e. a tautology.

I think we can reasonably distinguish between people who, on the one hand, openly say their reasons for believing X are "because I followed a certain chain of logic, plus assorted axioms and facts, and I can provide those to you and you might come to the same conclusion if you follow the same chain," and, on the other, people who say "because my faith in Y or Z tells me so, and the only way you are likely to nonaccidentally agree is if you have a similar faith."

At the very least, there is more to discuss with the first person. Note, BTW, I am not saying *anything* about who is more likely to actually be right; I'm just saying that if I encounter someone who believes X while I believe not-X, it's more constructive if they are of the first persuasion, because they have information to sell to me (their conversation) that might be useful.

Expand full comment

In my experience people are quite capable of thinking they are correct about a thing for utilitarian or deontological reasons (including instinct), and will cheerfully admit they cannot justify it by any chain of reasoning. This is where "trust your gut instinct" advice comes from -- it is specifically arguing *against* the proposition that what can be reasoned out is a priori more trustworthy. And it is very common advice.

Expand full comment

Nah, men think they are right about everything. :^)

Expand full comment

The OP is muddled, but directionally right. The fact that things can be, say, more political or less political doesn't mean that something can be totally apolitical. Likewise for rationality: certainly it's possible to be more or less rational, but I think it's much harder than Scott is acknowledging here. IMHO, it requires multiple iterations of steelmanning.

Expand full comment

I would make a stronger claim than the one the quoted poster is making -- that the rationality community feels more like a culture or a religion than a group of actually rational people. Much in the way that Christians would generally say that they highly value moral goodness as an important aspect of being Christian and can even be brought together as a community with a set of shared values around this -- while at the same time it isn't obvious that Christians are morally better than other similar groups -- I'd say that members of the rationality community can celebrate and value the concept and culture and performance of rationality without actually having to be particularly rational. And just as aspects of Christian culture might make someone *less* good while also making them quite confident that they are doing the moral thing -- for example, in terms of how a person feels they should treat LGBT people or sexually promiscuous people -- so too could rationality culture potentially make people even less rational on certain topics while simultaneously encouraging people to believe they are more correct than they are. Like moral goodness, rationality is very hard to achieve. (I actually think it would be super fun to have some kind of study on whether self-identified rationalists are actually more rational than non-rationalists or anti-rationalists of similar educational and professional attainment across a variety of topic areas!)

Expand full comment

> Christians would generally say that they highly value moral goodness as an important aspect of being Christian and can even be brought together as a community with a set of shared values around this -- while at the same time it isn't obvious that Christians are morally better than other similar groups

I know this is a tangent from your point, but as a Christian, I don't believe Christians are, on average, morally better than other groups.

A fundamental teaching of Christianity is that people are inherently flawed, and it offers a path to improvement. It's ultimately not surprising that people who view themselves as inherently moral (and often *are* more moral than average) aren't really looking for moral improvement, or for a faith that tells them they aren't as moral as they think they are.

A common argument in Christian circles is that there's a lot of immoral people in the church for the same reason that there's a lot of sick people in the hospital.

... or to quote something more contemporary: sometimes a hypocrite is nothing more than a man in the process of changing.

Expand full comment

I think the same applies to rationalists.

As the Sequences say, "Nobody begins to search for the Way until their tools lie shattered in their hand."

Expand full comment

I just made a similar point a few replies back, and see you've made it sooner and better :) Re: your study idea, I think an interesting and relevant comparison point is Philip Tetlock's studies on superforecasting, where for example experts in foreign policy were not better at predicting events than well-read amateurs. It does turn out to be possible to get very good at predicting events (this is the whole point of the book) by specifically optimizing to do so. The problem with things like rationality and goodness, of course, is that you can't measure them, and so cannot tell which interventions work and which ones do not. This is one of the reasons why self-help books are so popular but so rarely effective.

At the same time, as someone who's followed the movement despite my skepticism, it's very clear that rationalism has a breathtakingly unusual concentration of insightful people and ideas. Among other things, it's telling that so much of the development in superforecasting itself happened in and around rationalism (e.g. see the Mantic Monday tradition here).

The explanation of "we're more rational because we try" seems ridiculous and bombastic to me, but I don't have a very good competing explanation for why rationalism is so successful. Maybe rationalists take seriously Yudkowsky's gesture of building up a statistically literate epistemological foundation, and this really is the bedrock on which everything else is built.

Expand full comment

Y'know, the older I get, the more I think there is no one Platonic Rationality, at least for mundane individual affairs.

Rationality is fractal. There is often no one right answer as to what is Most Rational in a given situation. It all depends on the exact situation and the exact individual(s) or more generally initial prior(s) involved.

To use the framework of meaningness.com, Level 5 rationality is where we need to be. Naive Level 4 rationality, in my intuition, is what your original Reddit commenter meant by "hating its specifics".

I dunno, maybe what I am thinking is Level 5 rationality is not really rationality. Maybe that (ie, knowing when to apply Level-4 Rationality) is just what wisdom is. Am I totally off-base? Is this old hat and goes by other names within the rationalist community?

Expand full comment

I don't think anyone in the rationality community disagrees with this -- IIRC one thing that used to be pointed out on LessWrong a lot is that "rationality is winning"; that is, the rational action for a person is that which achieves their goals. The goals themselves are not really subject to reason, at least at the most basic level.

Expand full comment

Since the early years of the rationalist community, there was a distinction between "Hollywood rationality" / "straw Vulcans", "traditional rationality", and "Bayesianism". Not the same as Kegan's stages, but has some overlap.

Hollywood rationality is how supposedly "rational" people are presented in movies: speaking in dull monotone, using unnecessarily many decimal places, focused on their maps and unable to see the territory, sometimes claiming to be logical and without emotions but actually full of badly suppressed anger. Also, they usually turn out to be wrong, sometimes repeatedly and yet they are unable to reflect on that fact.

Traditional rationality is the ideal of current academia; impressive by the usual standards, but sometimes it becomes a social ritual that is not sufficiently internalized and people abandon it when they walk outside the lab, like there is a separate magisterium of science.

Bayesians are supposed to be epistemically consistent (it takes more text to explain what exactly this is supposed to mean, but roughly, if "2+2=4" during the math class, then "2+2=4" also after the math class is over).

Expand full comment
founding

I think a lot of the 'Rationalists' ('members of the rationality community') either explicitly or implicitly agree with Chapman that "Level 5 rationality is where we need to be". It is somewhat "old hat and goes by other names within the rationalist community".

I think a big part of the confusion about this (and other issues, like the ones discussed in this post) is that 'rational' and 'rationality' carry enormous trains of historical baggage; e.g. 'rational' is mostly interpreted, especially by 'outsiders', as something like 'The Perfect Level-4 Rationality for Everything'.

Expand full comment

Freddie has a surprisingly strawmannish view of the rationalist community given how much he’s interacted with them.

> I like the culture [...] However

> 1. There is no such thing as "rationality" that is free from ideology

> 2. They have too much faith in the power of their own cognition

> 3. Their indifference to human emotion and social cues is not integrity, it's a refusal to confront the material importance of emotions

> 4. Eliezer Yudkowsky is their king and he's kind of an asshole

> 5. We're all just scrambling around trying to find meaning and understanding with brains that are the imperfect products of evolution.

The closest of these to a real criticism of rats who I know and respect is #2. (I respect Yud but I’ve heard a rat defined as “someone who disagrees with Eliezer about something.”) I have rat friends who lean a bit hard on their personal understanding of economics or freedom or intelligence, they could stand to chill a bit. But the rest are pretty silly, especially 5 which could itself be the tagline of the movement.

Expand full comment

Can you expand on your point about (4)? I would probably consider myself a rationalist if Yudkowsky didn't rub me the wrong way so hard, and if he didn't seem so central to many rationalists' worldviews. Sure, rationalists might be people who disagree with him about *something*, but do they usually disagree with him about *most* things? And is it gauche within the community to hold his intellect in no special regard? (I had previously taken the answers to be "no" and "yes", respectively.)

Also interested in your thoughts on "6. the culture is largely performatively oriented toward rationality, and tends to be dominated less by the desire to *be* rational by endorsing the truth, and more by the desire to *seem* rational by endorsing outlandish ideas that are 'the kind that rational people endorse'." (I'm not familiar enough to know if it's a fair critique of the actual community, but it's a version of a critique that every intentional community struggles with.)

Expand full comment

It’s hard to answer your first question as I don’t know “most” of his public beliefs or reactions to them. But every rat I’ve met who read The Sequences mostly agreed with them, every rat I’ve met who read HPMOR enjoyed it, and every rat I’ve met who’s followed Yud’s social media presence has been a little appalled. A common opinion seems to be: he’s very smart, an excellent writer, an original thinker, and has some blind spots the size of Jupiter. Hard to map that to binary answers.

For your latter question, I wouldn’t say that’s accurate. At most I could agree that some rats have poor life skills, just like any other group. And it can be jarring to see someone embrace anti-deathism and effective altruism while failing to properly feed themselves. But for contrast many rats show far greater than average self care, as well as financial smarts, thirst for knowledge, humility, and responsibility to the common good. And I hope with time and a positive community even the worst off among us can find our way there.

Expand full comment

"every rat I’ve met who read HPMOR enjoyed it"

This may be the saddest thing I've read this week 😀

For a lot of people, Yudkowsky is Their Guy. He's the one they first read who introduced them to the whole notion of rationalism, he's the one who was the gateway to a community, he was their Caliph. Whatever his personal faults or flaws, they know and forgive, because he's one of them and they're one of his. So of course they're going to be irritated by outsiders criticising him. This is human and natural.

For people outside the little original LessWrong community, or who have come at it all sideways, or only know Yudkowsky from quotes or excerpts, he can sound like "if he was made of chocolate, he'd eat himself". He's a guru, and people have strong opinions about gurus.

Expand full comment

I don't understand how a group of people enjoying HPMOR is sad, or how it relates to some people treating Yudkowsky like a guru.

Expand full comment

Because it indicates that, for the people in question, their approval of Yudkowsky's writing about rationality is significantly affecting their opinion of other things about him. I read HPMOR several years ago, & while I found it entertaining & consider it better than average *for fanfiction*, I would not consider it exceptionally good overall (by which I mean not that it is bad overall, but that it is notably good in some ways & conspicuously bad in others).

The character of Harry is a Mary Sue in the sense that the story makes certain things suspiciously convenient for him (e.g. apparently the key to killing a Dementor is having exactly his philosophical opinions about death) & that people find him more impressive than one would expect (e.g. McGonagall's conclusion upon first meeting him that "[I]f I leave you alone for two months with your schoolbooks, even without a wand, I will return to this house only to find a crater billowing purple smoke, a depopulated city surrounding it and a plague of flaming zebras terrorising what remains of England" is rather out of proportion to his presentation as a bright nonconformist nerdy eleven-year-old); he is also something like a partial self-insert (in his character & pre-story life, not in having meta-knowledge: Yudkowsky's autobiography at https://archive.is/jfzwr#timeline_school reveals that the bit about biting a teacher who didn't know what a logarithm was happened to him, & in an Author's Note (https://www.hpmor.com/notes/98/) he states that he has the same circadian rhythm disorder that he gives Harry).

The setting & backstory are changed in diverse ways, whereof some are fanonical (e.g. Magical Britain as hereditary aristocracy) or helpful to the story (e.g. Parseltongue enforcing honesty, much of Voldemort's backstory) but others are pointlessly confusing (e.g. turning Peter Pettigrew's canonical betrayal into an in-universe baseless conspiracy theory) or create obvious plot holes (if Yudkowsky's "Interdict of Merlin" prevents people from learning spells from books, why does Hogwarts have a library full of spellbooks & how did Harry learn the glowing bat spell for his experiment from one of those books?).

HPMOR was apparently quite effective at getting lots of attention & attracting readers to LessWrong, but Yudkowsky seemed to consider it unusually good *as a story*, to the point of asking readers to nominate it for the Hugo Awards & to contact JK Rowling regarding publishing it for profit (http://www.hpmor.com/notes/119/). (I don't think the latter was irrational *for Yudkowsky* — if effective it would have been quite helpful & if not it would have been harmless — but in context it suggests an overinflated opinion of the story.) I'm not sure how many readers took this seriously, but even the sorts of feelings toward the story Deiseach is talking about (if I understand her correctly) indicate that many readers have quite a strong halo effect around Yudkowsky.

Expand full comment

it was intended to be slightly more jocular than it came across, but yeah, that "rats" (oh dear) should think HPMOR the finest cheese they've nibbled - well there are plenty of people who like the Twilight books too, what can I say?

Expand full comment

Also confused about the HPMOR comment. But agreed there’s a strong cultural difference between the original LWers and more recent converts who prefer Scott’s humility.

Expand full comment

I don't know if liking HPMOR is a good shibboleth for capital-R Rationalism in general but it sure is a big reason I am not a rationalist. I can confidently say HPMOR is my least favourite thing I have ever read. Sure, I've read things that are worse, e.g. My Immortal, various self-published novels, but never something I have hated so much (though don't get me wrong, I think it is objectively bad too). The combination of total self-satisfaction and horrible aesthetics couldn't be better calculated to turn me off.

Defenders say getting annoyed with the hero's self-satisfaction confuses the character with the author, and that Harry is supposed to be a smug, overconfident, hateable, precocious little shit. But a) if you're going to do that, you need to give the reader some indication that you, the author, don't endorse the character's attitude, and nowhere in the first 20 chapters does Yudkowsky do this; and b) I'm not buying it - the attitude suffuses the entire work. It might be bearable if the hero inhabited an interesting world or plot, but he doesn't. The world is Rowling's world crossed with an I Fucking Love Science facebook page. The plot is (very loosely) Rowling's plot crossed with bits of Ender's Game and a hundred other books that treat emotions as abstractions, or otherwise whatever Yudkowsky needs to be able to discuss his next intellectual preoccupation.

I didn't finish it. I was told I should give it ten chapters to hit its stride because the first five or so are ropey. I gave it twenty. Sorry, it's awful. There's no way I'm reading the rest. And what's more, there's no way I'm going to read The Sequences(TM). There can't be anything in there that I can't get from various other sources that would increase my utility function enough to counterbalance the utils lost from spending tens of hours in the presence of the mind that produced HPMOR. That, more than anything, is why I am not a rationalist.

Expand full comment

Rather strongly worded, but pretty much my feelings also. Like you, I found Harry with his extra set of surnames annoying (and right from the start the Lady Clara Vere de Vere bit annoyed me; he's Harry Potter, the only reason people are reading this is because he's Harry Potter, why on earth make him Harry Potter-Evans-Verres? Never mind the much too cutesy-poo 'oh, aunt Petunia is Evans-Verres because her maiden name goes first, but her husband is Verres-Evans because *his* name goes first there' bit; why on earth couldn't they be plain Professor and Mrs Verres? Yeah, I know some Too Special For Tradition types like to do the "my name and his name for me, and his name and my name for him, because God forbid we'd settle on one name and stick to it" thing, but that is just nodding and winking at your audience and breaking the fourth wall).

As you can see, I am very strongly a traditionalist about some things 😁 Okay, getting over my irrational prejudice about "pick one goddamn name", the fact that Harry came over as eminently slappable, even though I was one of those assured that he is *supposed* to be objectionable and that he gets better later, put me off. If I spend twenty chapters wanting to kick the brat down a flight of stairs, then kick him back up so I can chuck him out a window, why on earth will I stick with another X number of chapters to wait for him to get over himself?

Also mucking around with the magic system. Yes, Rowling doesn't do a "stats and dice rolls" neat little magic system; that's because she's British and writing in a completely different tradition of children's and young adult fantasy fiction about boarding schools and magic and magical boarding schools. That Yudkowsky took it upon himself to devise a "stats and dice rolls" system is very American, but why did he have to make it a DnD-type system? He might as well have gone the whole hog while he was at it, relocated the story to America, and done "Harry PVE goes to Hogwarts High, where he is the star quarterback on the Quidditch team, hopes to turn pro in college so that he can end up on a team that plays in the Superbowl, and meanwhile goes to the malt shop in the Hogsmeade Mall with Hermione the cheerleader". (The amount of complaining American readers did when Rowling invented the American wizarding school was pretty danged ironic, given the mauling American fanfic - and professional! - writers give to British characters and locations when writing their stories in such settings; I will never forget the BBC Sherlock fanfic that gave John Watson - canonical rugby player - a favourite *baseball* team.)

So yeah - I thought Harry Triple-Barrelled was an unmerciful pain in the - neck - and wanted to boot him off a cliff, and the amount of changes made to Rowling's world made me go "well, why not write your own original fic in the first place?", so I never did make it far into the work.

Expand full comment

> But a) if you're going to do that you need to give the reader some indication you the author don't endorse the character's attitude

I'm genuinely puzzled why you think a fiction author has to disclaim that his characters' opinions are not his own.

Expand full comment

I said "give some indication" not "disclaim" (also "attitude", not "opinions"). And it's because if I am going to endure something annoying for tens if not hundreds of thousands of words, then I need to know there is a good reason for that annoyance and I can trust the author is going somewhere with it. If not then I'll just give up. There's no need for the author to disclaim anything, a good writer can make this clear from context; Yudkowsky doesn't (and, I would argue, can't, because his attitude doesn't fundamentally any different).

Expand full comment

I get that you're saying you want the character you see as an anti-hero to have a redemption arc, but not all stories follow such a template, nor is there some mandate that they should necessarily foreshadow such an arc. Maybe you just think the rules for fanfic should be a little different than those for original fiction.

Expand full comment

> The closest of these to a real criticism of rats who I know and respect is #2.

But then, conflicting with #4, Yud spent something like half of a book (Inadequate Equilibria) deliberating about the merits of the Outside View - literally worrying about how much faith one should put in one's own thinking.

Frankly, I don't think there are many other 'communities' which have spent more collective brainpower contemplating that.

Expand full comment

Question about Alex Jones and the highly irrational. Do they know they are being highly irrational?

Like anyone I do a ton of irrational things. But I’m aware that I do a ton of irrational things. Does Alex Jones just think it all makes perfect rational sense?

Expand full comment

I have no idea if Alex Jones believes his nonsense, but his followers absolutely do, and they act on their beliefs, doing things like harassing and sending death threats to survivors of mass shootings.

Expand full comment

I vaguely recall something about Alex Jones saying in court that his program is entertainment and the things he says are not meant to be taken as claims about the truth. But I could be misremembering the details.

Expand full comment

Assuming that what people say in order to avoid a credible threat of force is truthful seems like a bad idea to me.

Expand full comment

Rationality is a process, not a property. There are properties that enable rationality, like intelligence and discipline, but rationality is something a person does, not something they are. It's also something a person does from a specific starting point: what updating priors looks like depends on what your priors are. And while we might imagine that all rational people will converge -eventually- on certain positions, that's not to say their journeys will look similar at all.
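A small sketch of that last point (a toy model with invented numbers): two reasoners who start from very different priors, but apply the same updating process to shared evidence, end up in roughly the same place - while their paths there look nothing alike.

```python
import random

random.seed(0)
TRUE_BIAS = 0.7  # the coin's actual P(heads), unknown to both agents

# Each agent weighs two hypotheses -- the coin is biased (P(heads)=0.7)
# or fair (P(heads)=0.5) -- but they start from very different priors.
posteriors = {"optimist": 0.9, "skeptic": 0.2}  # prior P(biased)

def update(p_biased, heads):
    l_biased = 0.7 if heads else 0.3  # likelihood under "biased"
    l_fair = 0.5                      # likelihood under "fair"
    return (l_biased * p_biased) / (l_biased * p_biased + l_fair * (1 - p_biased))

for flip in range(1, 201):
    heads = random.random() < TRUE_BIAS
    posteriors = {name: update(p, heads) for name, p in posteriors.items()}
    if flip in (1, 20, 200):
        print(flip, {k: round(v, 3) for k, v in posteriors.items()})
# Early on the two disagree sharply; after many flips both end up near
# certainty that the coin is biased, via very different trajectories.
```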

Expand full comment

100%. Rationality is aspirational, not descriptive.

I think everyone hates the 13 year olds on the internet who just discovered the word "rationality" and conceive of themselves as perfectly rational and ascribe irrationality to every person who disagrees with them. And yet this does not lead anyone to turn to astrology.

How would the critics prefer anyone advocate for greater epistemic rigor?

Expand full comment

As someone who most people familiar with me would probably describe as irrational, if I were advocating for greater epistemic rigor I would try to create systematic incentives to promote it rather than just meme-ing it as hard as possible. But I am not an absolutist about epistemic rigor like Yudkowsky, I think a lot of speech serves purposes other than competitively approximating reality via Tarskian correspondence and that ignoring this would probably create some sort of relative despotism.

Expand full comment

"Meme-ing" seems like a necessary step towards creating systemic incentives, since you can't change the system on your own.

Expand full comment

Alex Jones isn't irrational - he's completely rational in pursuing the goal of his media business, which is to make money.

Expand full comment

Does his approach really result in more money than his rivals? He's been deplatformed from a number of places.

Expand full comment

I don’t think the measure is his rivals. He is a man of very modest talent whom one wouldn’t expect to have become a national figure with a huge media following that he seems smart enough to have monetized. I don’t know the details of his income, but I’d bet he’s done a lot better as a peddler of nonsense than he’d have done in any other field.

Expand full comment

I think there's more remunerative nonsense he could peddle.

Expand full comment

If you can convince him, maybe he’ll cut you in for 10%....

Expand full comment

It's a bit late for that now that he's been deplatformed from so many places.

Expand full comment

That’s like saying preachers are just trying to fleece their flock. The reality is that many (most?) of them actually believe it. Indeed the fleecing is often easier if you’re a true believer.

Expand full comment

Some preachers lead their flocks as best they can, others fleece them. I’m commenting on Alex Jones, not on all media figures.

Expand full comment

I think what Scott means by rationality here is specifically "Epistemic Rationality".

https://www.lesswrong.com/posts/RcZCwxFiZzE6X7nsv/what-do-we-mean-by-rationality-1

Expand full comment

I think on the whole, people interpret "I am a rationalist" similarly to "I am law-abiding", not "I am kind". That is, they view you as someone cosplaying a Vulcan, not a regular person trying to fight their biases.

Expand full comment

I think people view "I am a rationalist" as the internet twelve year old who insists all his opinions are objective and rational. Literally, that twelve year old is a specific pattern in people's minds which the rationalist movement fits well enough for people to treat them exactly the same.

Expand full comment

Judging by the accusations of arrogance, I would've expected it to be the opposite -- that to say "I am a rationalist" is in the same bragging category as saying "I'm kind", whereas saying "I'm law-abiding" is (at least in the US) not very spectacular.

I don't know what "cosplaying a vulcan" means, so my guess is that you were trying to point out some other dichotomy.

Expand full comment

Kind is a fuzzy tendency, law-abiding is more of a yes/no proposition.

Expand full comment

My feeling is that a lot of people have an instinctive reaction to claims to be [highly rational and thus minimally ideological], and overcorrect and pattern-match against that.

The correct formulation of "everything is ideological" is that (1) even a hypothetical 100% rational agent is going to have goals, and those goals will guide what they consider good and also hence worth investigating, and (2) GIGO, even a 100% rational individual is going to have false inputs from societal common sense and propaganda and so on. (Neutrally reading the best philosophical arguments in 1355 will lead to very different ideas on metaphysics than in 1955, and perhaps by pessimistic induction in 2755 as well.) Also the propositional components of ideology are things that one can be rationally led to.

I'd say "rationality and ideologicalness" are orthogonal, in part because I think the ideal (not even merely achievable) agent is fully rational and fully ideological, in the above sense, and because there are non-ideological sources of error, but of course the "ideology is the opposite of rationality" instinct is coming from some truths too, specifically some sources of error that rationalists and others have long emphasized: wishful thinking, filter bubbles (a social-level effect,) not making identity small, blah blah blah. These are virtues worth practicing, and I'm glad the rationalist crowd (again amongst others) have brought attention and thinking to them.

Expand full comment

When I read "ideological", I think of someone being motivated by a conclusion, instead of being motivated by an open question, or a concrete objective. Someone using the tools of rationality not to investigate an unknown, but, unconsciously, to justify his ideology or convince others of it. In this sense, they are somewhat orthogonal, I think.

Expand full comment

But definitely, if you think of "ideological" as describing someone who has ideas or areas of interest, then I hope rationality is not orthogonal but positively correlated with it.

I think I missed your point. Anyway.

Expand full comment

I think "ideological" here means "having a well-developed set of values". I agree: I hope that a person who is good at reconsidering their beliefs about the world is also good at reconsidering their values. But if someone claims to be free of ideology, that's exactly when I start to worry that they are *not* thinking critically about which values they promote.

There is a tendency in people to assume that whatever values they happen to have are the natural and only values to have. A good antidote to this is to read some very old text from back when people had radically different values. The ancient Athenian writer Xenophon finds it in himself to praise the polis of Sparta, for example - a society in which 85% of the population were Helots, a slave class that could be killed with impunity (the Spartans especially killed the strongest and most free-spirited among them, in a form of reverse eugenics). Sparta even went as far as to annually declare war on its own slaves! Another tradition Sparta had was to send its most promising aspiring citizens into the Helot farmlands to kill one of them in the middle of the night with no warning (they called this the Crypteia). Yet the Athenian elite absolutely loved Sparta, presumably because they identified only with the Spartan elites, not with their Helots. Including Plato, the biggest rationality-rules guy of his time and place.

I am not saying this to suggest that rationalists have values as bad as these Athenian elites did, just to illustrate how skipping over the question of "which values are important" can be really dangerous. Plato and Xenophon had a presupposed ideology that they never really acknowledged, and so do you! (Hopefully a better one, though.)

Expand full comment

It's as though someone came across RationalWiki and thought it represented the rationalist community.

Expand full comment

I know this isn't really the point of the article and it was supposed to be selected as a non-controversial point, but I can think of several instances where I'm a lot less confident in the whole "Name-brand, government science is much better and more reliable than corporate science" system than I'm supposed to be.

I think the most significant way I'm different in this is that you seem to be doing some sort of math that says "Well, the corporate scientists are being paid to say things that go a certain way - if they refuse to do that, they won't get paid, so the incentives are fucked. You will only get science that confirms things in one direction from them". And that's so, or at least it wouldn't surprise anybody if we found it to be so.

But I've been around, say, tobacco control epidemiology enough to know that the same thing is true for them; they only get "paid" for saying that tobacco is maximally harmful in every way and giving science that supports any ban at any time; anyone who seriously breaks from this either never gets their career off the ground or is immediately ignored forever after that. There's a guy, probably the most famous guy in that field, whose name is Stanton Glantz. He's basically famous and successful because he's reliable - he never says anything that isn't maximally negative toward tobacco. He's also known for not being a particularly good scientist or particularly truthful, but this has never actually hurt his career in any way - he does the thing that gets him paid, the grants keep rolling in, and it's very very clear where his incentives are.

I'm not sure how well this applies to something like climate science, at least to the extent I'd say their credibility is ruined or the science is all bullshit or something. But it's clear enough their incentives are 100% in line with saying climate issues are maximally important and dangerous; they went from "field that barely exists" to "major field of science that everyone talks about and has conferences and huge grants and unlimited funding" in a few decades by pursuing that line of thought. That doesn't mean it's all wrong or fake, but it makes me feel uncomfortable when someone goes "well, clearly all the corporate funded studies are biased and unreliable" without mentioning the in-lots-of-ways similar perverse incentives on the other side.

Expand full comment

Someone coined "white hat bias" to attempt to give a name to this phenomenon: when people are biased due to their belief that they are doing something good and worthwhile with their science. It's an interesting area--while industry bias gets a lot of play, I've seen at least some analysis that "industry bias" might arise in some cases because industry-funded studies have more money and thus can run bigger and better experiments (this is obviously not the case all of the time).

Bias is far more complicated than just righteous scientists fighting the good fight against capitalist cronies, and I really wish we could have a more nuanced discussion about that. I *especially* wish we could discuss how, like with smoking, there's industry now on *both* sides. One of my personal pet obsessions is nutrition, and everybody yammers on about how industry science says sugar is good for you because Sugar Lobby and eggs are fine because Poultry Lobby...without once realizing that there's lobbies on every side (the diet industry alone is worth something like $75 billion in the US, and funds a crap ton of research)

Expand full comment

I think institutional culture is at least as important a source of bias in science as a straightforward financial incentive for the institution to get certain findings. For example, the FDA has a very risk averse institutional culture, and that causes them to make errors in a consistent direction.

Expand full comment

You should be. I've been both an academic and an industrial scientist, and worked with both types when I was in both positions, and I can't think of any (real) scientific area where academic scientists are a priori better or more rigorous or more shielded from unfortunate personal/career influences than industrial scientists. We all recognize the two groups have different priorities and time scales -- the industrial scientist needs to make a difference to the company's fiscal bottom line, sooner or later, while the academic scientist needs to make a difference to his university's or granting agency's reputational bottom line, sooner or later. That leads to different focuses, and different strengths and weaknesses, which anyone who works in the field learns to recognize over time.

But anyone claiming one group or the other has a lock on truthiness is a n00b, an idiot, or grinding a fraudulent axe for his own purposes (which are usually disreputable because otherwise he would be open about them).

Expand full comment

Well said. I think you can see the same thing in high energy particle physics, where they are desperate to find something that doesn't fit with the 'Standard Model', so funding can continue.

Expand full comment

I agree with the general points in Scott's post, but I don't think they are really doing the work he wants them to do.

The problem, in a nutshell, is that virtually everybody is better at rationality than Alex Jones. The U.S. Congress is more rational than Alex Jones. The Elks Club is more rational than Alex Jones. The FDA is more rational than Alex Jones. It is a weak claim.

Rationalists are making a much stronger claim. I suspect that the rationalist community as a whole believes that it is generally more rational than many, most, or maybe nearly all other agglomerations of humans, even if it hasn't achieved perfect rationality. Forget Alex Jones. Is the rationalist community more rational than the Republican party? The Democratic party? The editorial board of the New York Times? The cast of Fox and Friends? My guess is that most rationalists would say yes. (My guess is that most rationalists would feel strongly that the answer is yes.)

As a generally sympathetic observer of the rationalist community, I would say: maybe. My sense is that the rationalist community is more "rational" (meaning has a higher propensity to believe things which are actually true) than other communities in some respects and less so in others. In other words, the rationalist community has some systematic biases. In other words, it has an ideology.

If nothing else, it has a belief in rationality as a concept, as an end achievable through specific practices, as an item of value, and probably as a moral good as well.

Perhaps more importantly, the presence of systematic biases indicates that the rationalist community is blind to its biases. I would expect a truly rationalist approach to life to entail a radical epistemological humility. That is not my experience of the rationalist community as it actually exists, whatever lip service it might pay to the notion.

This is in no way meant to be a scathing critique of rationalism. As I said, I'm a generally sympathetic observer. I think it is possible and worthwhile to be "less wrong." I just think the rationalist community is probably less less wrong than it supposes itself to be, due in part to, yes, its ideology.

Expand full comment

This is a great comment. I think the overconfidence you've pointed out is orthogonal to the point the original post is making, but as long as Scott is responding to important (or at least common) criticisms of rationalism, this is the one I really wish he'd have a go at refuting (or endorsing!).

Expand full comment

I don't think systematic biases necessarily mean one is blind to one's biases. There are definitely areas in which I'm aware of being biased. Shouldn't I, by virtue of my awareness of my biases, be able to compensate for them? Maybe sort of, ish. I know the direction in which my reasoning is reflexively pulled, and I can work to combat that, but I can't simply judge the magnitude of the bias and correct for it. It's more insidious than a mere shifting of my probability assessments; it affects what sort of ideas do and don't occur to me or catch my attention.

Are there areas where the rationalist community as a whole is systematically biased? I'd say so, sure. I think that's probably the case for any non-trivial community. But I don't think that means that some communities can't be, on the whole, more or less rational than others. How you rank the rationality of different communities might depend on how you prioritize different features of rationality, or of accuracy, but I don't think simply having systematic biases disqualifies a community from contention for being "more rational."

Expand full comment

> The problem, in a nutshell, is that virtually everybody is better at rationality than Alex Jones.

I don't think Scott was trying to use Jones as an example of anything other than a figure we can virtually all agree is irrational, in order to illustrate a universal agreement that there is a meaningful distinction between "rational" and "irrational", regardless of any underlying ideological biases. The problem is that some people even dispute this, so it's necessary to establish a basic fact you can build upon. Think of it like a sketch for a Moorean common sense proof for the existence of rationality.

Expand full comment

Sure, I get that, but I still think it worth complicating that point. I'm saying that, yes, irrationality (Alex Jones) and rationality (a ton of people who aren't Alex Jones) exist. But also: a) there is a pretty broad range of beliefs and behavior that qualify as rational; b) separating rational beliefs from irrational beliefs in non-extreme cases is often quite hard and therefore heavily influenced by ideological predisposition; and c) based on the evidence I've seen, self-described rationalists aren't always that great on this score.

Basically, once we get out of Alex Jones territory, Scott's claims lose a lot of their force.

Expand full comment

I agree on a) and mostly on b), with the caveat that ideological prejudice mainly manifests in your priors. On a long enough timeline/with enough evidence, those priors necessarily cease to matter if you're applying rational techniques to update your assertions (Bayesian reasoning eventually converges on the truth no matter the prior, so long as the prior doesn't assign zero probability to the truth). Some people might think this skirts "no true Scotsman" territory, but I think there is a clear line in the same sense that you literally cannot be a scientist if you don't apply the scientific method.
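
To make the convergence point concrete, here's a minimal sketch -- the coin's true bias and both Beta priors are numbers I invented purely for illustration:

```python
import random

# Toy illustration of "priors wash out with enough evidence".
# The true coin bias and both Beta priors are invented for this sketch.
random.seed(0)
TRUE_BIAS = 0.7

# Two observers with strongly opposed Beta(a, b) priors on the coin's bias.
priors = {"optimist": [20, 2], "skeptic": [2, 20]}

for n in range(1, 1001):
    heads = random.random() < TRUE_BIAS
    for counts in priors.values():
        counts[0 if heads else 1] += 1  # conjugate Beta update on one flip
    if n in (1, 10, 100, 1000):
        # posterior mean of Beta(a, b) is a / (a + b)
        means = {name: a / (a + b) for name, (a, b) in priors.items()}
        print(n, {name: round(m, 3) for name, m in means.items()})
```

By the thousandth flip both posterior means sit near the true bias; whatever disagreement remains is the residue of the priors, and it keeps shrinking with more evidence.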

As for c), I can't really comment on what evidence you've seen but it would make for an interesting study. Most human cognitive biases can be trained away, with the possible exception of bias blind spot which has shown particular resistance to interventions (although early evidence with interactive games shows promise).

It's not a stretch to assert that bias awareness and bias training would improve rationality, but I haven't seen evidence on how diligent people in the rationalist movement are on these points.

However, I don't think Scott was asserting that people in the rationalist movement are necessarily more rational, but he was asserting that a) rationality exists and can be defined in a form that's free from politics and ideology (in the colloquial sense), and b) arguments can be partially ordered according to how well they do on a set of "rationality" criteria. Both of these undercut the claims Scott to which was objecting.

Expand full comment

> Both of these undercut the claims Scott to which was objecting.

Err, you know what I mean, "Both of these undercut the claims to which Scott was objecting." :-P

Expand full comment
founding

> I suspect that the rationalist community as a whole believes that it is generally more rational than many, most, or maybe nearly all other agglomerations of humans, even if it hasn't achieved perfect rationality.

I don't think that's true – for one, there's no coherent "rationalist community" in the first place, e.g. anyone can self-identify as a "rationalist", lots of people that DO interact with (and respect) the 'broader community' explicitly do NOT identify as a 'rationalist', etc.

You wrote [numbers mine]:

> If nothing else, (1) it has a belief in rationality as a concept, (2) as an end achievable through specific practices, (3) as an item of value, and (4) probably as a moral good as well.

[1] seems obviously true; see, e.g., this post and the comments. (Though obviously 'rationality' isn't always the _same_ concept for different people.)

I think the 'rationality community' is pretty humble about [2]! In my experience, this is the subject of most of their 'rationality-qua-rationality' activity and I'm not aware of anything that I would describe as 'ideology' with respect to specifics.

Given the very broad (and ambiguous) understanding of 'rationality' as some mix of 'epistemic' and 'instrumental efficacy', [3] seems obviously true. And more, almost _everyone_ – every person in the world, not just 'rationalists' – would agree to this, or act as if it was true.

And given what I wrote about [3], [4] also seems to be _extremely_ un-controversial. Almost everyone cares (and seriously too) about believing true things and effectively achieving their goals.

It seems really weird for someone to write this:

> In other words, the rationalist community has some systematic biases. In other words, it has an ideology.

and this:

> I would expect a truly rationalist approach to life to entail a radical epistemological humility.

How do you know that "the rationalist community has some systematic biases"? Are you really claiming that, 'rationally', the "rationalist community" should just defer to you about what's true or efficacious?

Expand full comment

This is an old motte and bailey. "Nothing is perfectly rational or objective" is the motte and "I can be explicitly biased/partisan/prejudiced/whatever" is the bailey. You act explicitly prejudiced or to support your party, and the moment someone accuses you of bias you point out that being unbiased is impossible anyway. Yes, but there's a difference between someone failing to be perfectly rational and someone who's not even trying.

Expand full comment

True. There is also a difference between someone who pretends to be trying and someone who does not pretend.

Expand full comment

I think this really comes down to people being unwilling or unable to stare their own hypocrisy in the face.

Freddie agrees that he sees people who claim to truly want to live for only 80 years as a cope. I agree with him here. But I think that producing arguments for "why I am not a rationalist" is also a cope. We all know rationality is really hard, and that we're constantly failing at it. We all know that we SHOULD make choices as rationally as possible, and that that would require putting a lot of boring effort into really hard things, and then continuing to be really boring. You'd have to learn Bayes!

But I think I'm just comfortable being somewhat hypocritical on this. I think we should all be comfortable admitting that we're hypocrites, that we aren't working as hard as we could towards being as rational as possible as much as possible even though we believe that's what we should be doing. And that's okay.

Expand full comment

I mean, effort to be rational is itself expenditure of a limited resource, and thus an opportunity cost. You can value rationality and recognize it would be good to be more rational without being hypocritical for not putting in more effort than you do.

I'd like to be more rational than I am, but I don't think I'm hypocritical for not being more rational than I am, I think I'm limited in my abilities, and forced to prioritize within those limitations. I might not always make the best possible choices within those limits, but for the most part I don't think it's because I'm defying my own priorities.

Expand full comment

I think it's a little hypocritical in the same sense, as Peter Singer points out, that we're all a little hypocritical when we say that we care about the suffering of the global poor but will spend $100 on a nice meal for ourselves.

For rationality, it's hypocritical because rationality is specifically a resolve to make the best decision we know how to make, and yet we are admitting that we don't always do that. Since we ourselves don't always do that, our requests to be more rational in particular decision making processes then seem like isolated demands for rigor.

Just like Singer's Shallow Pond, we don't have to let that hypocrisy cripple us. We can admit it as the natural state of things. It doesn't change the resolve to do better.

Expand full comment
founding

I came across an interesting claim recently that there are 'pragmatic contradictions'--you can't say "I am asleep right now", for example, because being asleep is contradictory with talking. Similarly, you might think that saying something like "I am very humble" or "I am very wise" is not quite a pragmatic *contradiction*, but is in that direction--a commonality of humble people is that they don't say sentences about how humble they are, and perhaps a commonality of wise or unbiased or objective people is that they don't say sentences about those things.

I think I mostly don't buy this, and feel something like your position--not about humility (I don't pretend to be especially humble) but about wisdom / rationality / etc.; actually I *have* put a lot of effort into this, and started out ahead, and so on; looking around it really is my best guess that I'm 'remarkably good' along those axes, such that me remarking on being good there does actually make sense. [While, of course, having a TAP to check: why exactly do I want to remark on that in this context? Only sometimes does it make it past that filter, and so the thing I'm pointing out here is more like "that filter doesn't kill all instances where I want to say it, and I don't think it should."]

Expand full comment

Are you arguing that there *is* such a thing as "rationality" that is free from ideology? "Traditional" mathematics was probably considered free of ideology. But then along came Constructive Mathematics, which forced an ideological choice. It seems to me that it's hard to argue that there are *no* ideological commitments in any particular perspective. After all, how would you know that you just haven't found them yet?

Expand full comment
founding

No, he's arguing that, just as "tall" and "short" are useful words while yet describing people of finite size, so too "irrational" and "rational" while describing people of finite rationality. [Like, what argument against calling someone 'rational' doesn't also work to rule out calling them 'short', given that they have some nonzero size?]

Expand full comment

I think that is pretty orthogonal to the point the original people are making about ideology or the political nature of things though. They aren't trying to say being rational does or doesn't exist, but that the concept of rationality exists within a belief system that is just as contingent as any other.

Expand full comment
founding

I disagree that it's orthogonal, or that the belief system is "just as contingent as any other." Like, this is basic fallacy-of-grey stuff.

Expand full comment

It's just that what you and Scott are saying doesn't address their point in any meaningful way - there's no attempt to justify premises they would agree to, nor an effort to argue from their worldview. Like, I think it makes sense and is valid to say these people are incorrect, but I don't think pointing out supposed gradations on a scale of rationality does anything to refute the argument they make.

Expand full comment

> the concept of rationality exists within a belief system that is just as contingent as any other.

That is almost certainly false, and is exactly the type of false equivalence that Scott seemed to be pointing out, though he certainly wasn't thorough enough to convince such people.

Also, some people really do say that rationality doesn't exist. I think Scott's post wasn't as clear as it could have been in addressing these separate claims, since he haphazardly groups them as a class of "related" claims.

Expand full comment

Right, bias, ideology, and axioms are pretty much the same thing. No system of thinking can be free of them. On the contrary, any system of thinking must be built from them.

A note on mathematics and axiom systems: constructive math is relatively recent. Before that, you have the axiom of choice, and before that, the parallel postulate, as examples of axioms that seem to be obvious in one view but have been proven to be independent.

The bottom line for rationality is that rationality must also depend on certain axioms (biases) at some point, and rationality cannot be sufficient for deciding the truth or falsehood of those axioms.

Expand full comment
founding

Meh – the Axiom of Choice doesn't seem like a good example of 'ideology'.

Expand full comment

I wouldn't say you're missing the point of those arguments (everything is political, enmeshed in ideology etc.) because I trust you understand them, but I do think you are answering them from a viewpoint which takes for granted that people are only advocating for a weak version of their arguments, and I think you are assuming some things that stronger proponents of those statements would reject.

I think there is probably a weak version and strong version of those original statements and viewpoints. The weak version is the one people use pretty often - everyone is biased, so you can't call me out on being biased; or [group I disagree with] is cool but their viewpoint comes from ideology/politics just like everyone else's, as if that diminishes their worth. This generally treats "political" as a Red Team - Blue Team axis, and is a position which would agree with your statement that it is possible to be "bad at objectivity", i.e., Alex Jones or fossil-fuel funded studies. These are the people who say "everything is political" yet still point to studies that "prove" gender differences don't exist as a victory for feminism. They use these statements as arguments to invalidate their opponents without fully thinking through their implications. I think this is the version you are responding to the most directly, and if we are thinking of political/biased/ideological as similar to their usage in common parlance, you are right - things can be more or less biased or political etc., pretty evidently.

The strong version of these original statements mostly exists within critical/social theory/continental philosophy academics. "Everything is political" isn't a statement about things having clear and obvious implications that deal with what we commonly think of as the political sphere, i.e., policy and governance, and has nothing to do with how Red or Blue or in-between a thing is. It relates more to power and marginality, race-class-gender, control-coercion-discipline, ideology and discourse, and the production of all of the above. "Everything is political" means that we can't and shouldn't think of things without considering their role in maintaining power structures, maintaining institutions, and producing subjects that are classified in certain ways.

When looked at this way, a reminder that something is political is a push back against a claim to the apolitical, and I think a push back against a claim to something being more or less political (in a broad sense) than something else. In this understanding of the political, saying something is less political than something else is impossible - everything exists within a discourse that is political, i.e., both a language and a social sphere in which terms are laden with power and perpetuate structures and systems and institutions.

In this sense, a critique of rationality that mentions "everything is political" or "...ideology" seems to me to be pushing back against an image of objectivity or neutrality, and instead taking the stance that nothing is neutral, nothing is outside of the political (broadly defined) and concepts need to own their ideological implications. That being political isn't bad or good, it just is the state of everything.

In the instance of Alex Jones, a critique of him from this viewpoint may rest on the actual positions he espouses as opposed to his justifications. A critique of a fossil-fuel company funding a study may rest on their utilization of capital to promote a favorable viewpoint and perpetuate a system in which the rich and powerful exploit natural resources to the detriment of those with less power or influence.

So overall, I get your argument, I just don't think you are taking the strongest version of the points you are countering.

Expand full comment

While Freddie can clarify for himself if this is what he means, this comment seems to be the only one to explain where 'everything is political' actually comes from.

While I think continental/critical philosophers are not to be trusted because their aim is to bamboozle you with words while they rifle through your pockets for spare change, it wouldn't be a bad idea for everyone to spend some time getting familiar with at least the highlights these days. It's becoming a very big thing in a very big way, more or less the dominant philosophical outlook of the educated.

Expand full comment

I'd agree and I'd go slightly further. There are really three versions of this hypothesis that "everything is political." The weakest, and also the most useful and most defensible, is to say that, whilst people can and do try to be rational and coherent in their thinking, and some do so more than others, they are to some extent marked by their culture and their individual histories, as well as by the need to take positions on controversies around them. Anyone who has worked in a university will be familiar with this phenomenon. The second version of the hypothesis is that it is hard, if not impossible, to understand how people think, write and argue about issues unless you understand the political and social context in which they were working, the pressures upon them and how that context has changed over time. This doesn't exclude rationality: Aristotle and Ptolemy produced what were, to them, rational descriptions of the world rationally arrived at, on the basis of what they knew. As Michel Foucault showed at great length, the way in which western civilisation conceptualised mental illness changed radically and in a discontinuous way between the sixteenth and the nineteenth century, and in each era there were mechanisms to police and enforce the dominant paradigm. Much of this, if you disregard the slightly precious style (which translates badly into English, anyway), is applied common sense.

The problem arises with the strongest version of the argument, which holds that "everything is NOTHING BUT politics." In a few years this has gone from being a marginal cult to having great influence in academic and media circles. It excludes rationality from the start, since rationality is only "one way" of looking at things. Rather, it emphasises subjectivity and "lived experience," over evidence, logic and coherent argument. Everything (except their own arguments, curiously) is presented as simply another point of view, the product of political forces. Taken to extremes (as it already is being) it denies the independent validity of science, mathematics and, effectively, any analysis or interpretation of anything. (Needless to say, Foucault would have been horrified by such arguments). I'm not a member of the rationalist community, but if I were I'd be greatly worried about this trend, because it suggests that your guiding principles are simply invalid to start with, and that even the attempt at rationality is pointless, as well as being just a disguised form of power-seeking.

Expand full comment

I think I would push back on the supposed influence of the strongest version of this argument, both within the academy and in society more broadly. What I was getting at with my weak version of these claims is that people often use claims about the political nature of things, as well as about subjectivity and lived experience, to advocate for certain political positions without fully internalizing these claims. For instance, I think people in the media who seem to be echoing these views turn very quickly to studies and quantitative data that support their own positions, since that is still a key facet of knowledge production. People talk about subjectivity in one sentence but in the next cite a study on the impact of "race" on recidivism without any critical examination of the discourse underlying these concepts. I think this happens just as much on the right as the left FWIW, in the sense that people make justified claims about bias, funding, power etc. and then in the next sentence completely ignore these on their own side.

So I would say my observations are that as a "weapon", "everything is NOTHING BUT politics" is appearing with increasing frequency, but not the actual views that go along with a fully linguistic/constructivist viewpoint. Like, we see these people promoting left and right leaning think-tanks and discussing things grounded in scientific method empirics as opposed to considering the implications of knowledge production occurring in institutions which systemically exclude the uneducated.

It's like people took one sociology course freshman year or read a book or two and now act as if they have fully adopted a way of seeing the world that is radically different from the mainstream when they haven't. I think it diminishes the points they are trying to make, the role of science and evidence-based argument, and the role of critical/social theory.

Expand full comment

> The strong version of these original statements mostly exists within critical/social theory/continental philosophy academics. Everything is political … relates more to power and marginality, race-class-gender, control-coercion-discipline, ideology and discourse, and the production of all of the above. Everything is political means that we can't and shouldn't think of things without considering their role in maintaining power structures, maintaining institutions, and producing subjects that are classified in certain ways.

I think Scott's criticism, that overuse of this sort of argument is based on the continuum fallacy ("These things are points in the middle of a continuum, therefore there is no difference between them"), applies to this form of the idea too. Different ideas and theories can be more or less biased by their role in maintaining power structures and accordingly closer to or farther from the truth. For instance, it is true that modern definitions and understandings of mental illness are partly shaped by politics, but that doesn't mean that people like Bryan Caplan (https://econfaculty.gmu.edu/bcaplan/pdfs/szasz.pdf) are right to say that most of modern psychiatry is based on the mischaracterization as illnesses of preferences unacceptable in mainstream society, or that modern psychiatrists are no closer to the truth than the older psychiatrists who classified homosexuality as a mental illness or the Soviet psychiatrists who involuntarily committed anti-communists as "sluggish schizophrenics".

Expand full comment

These were my own main criticisms of Yudkowskian rationalism, which have so far gone without comment: https://www.goodreads.com/review/show/3850071573?book_show_action=false

Expand full comment

I'll just respond to

> Nit 5: Yudkowsky is a cryonicist but is unwilling to pay for other people's cryonics. If he thinks cryonics is a good investment, and assigns equal moral weight to others (or even near equal moral weight), he should be willing to pay for other people's cryonics. If he thinks other investments are more important than cryonics, he should want to invest in those instead of cryonics even for himself. I know someone who has cancelled their own cryonics contract under this logic, so I know it is not a superhuman ask. If Yudkowsky can't reconcile this then his preferences are irrational, with possible implications for character.

It's extremely unfair IMO. Does he actually say he assigns, from his own perspective, equal moral weight to everyone including himself? I doubt it. It falls apart because it immediately forces you to deny yourself everything, spending everything on the poorest people in the world instead and then barely subsisting. Scott described the problem well here: https://slatestarcodex.com/2014/05/10/infinite-debt/

Expand full comment

The best version of the criticism of the use of “rationalist” would, I think, point out that the assertion that one is a rationalist happens in a context. It doesn’t merely describe the world in a value-neutral way, it draws a contrast with other people, and assigns higher value to the rationalist approach. In most contexts, the other people one takes seriously enough to even bother contrasting with aren’t on an Alex Jones (the character; as others stated, Alex Jones the performer has found a very successful niche) level, they’re the most rational spokespeople for some paradigm other than rationalism. So, one is effectively claiming to be more rational than the most rational people from non-rationalist paradigms, because those paradigms don’t even allow for rationality as great as the rationalist’s. It’s an indictment not typically just of individual capacities, but of entire paradigms, and suggests that the rationalist thinks their paradigm objectively preferable to every other salient paradigm.

Which, from within rationalism, it is. Lots of paradigms are the best according to their own standards. The trouble is that there isn’t a sensible way to compare them without assuming one of them, and none of them is better at everything people care about than all the others. So, the selection of some of those needs to privilege isn’t meta-rational, it’s just … a choice. Rationalists tend to think that rationality isn’t about them choosing what they like. The criticism points out that calling yourself a rationalist communicates exactly that sort of choice, of selecting some things to care about and other things to neglect.

Expand full comment

I mean, in practice it mostly describes a community, more than a system of belief. Rationalism is Yudkowsky, Less Wrong 1.0, and the people influenced by them; in terms of values, there's plenty of other groups that value rationality but happened to wind up with a different name (I mean, to begin with, there's RationalWiki....)

Expand full comment

Are Alex Jones or the fossil fuel industry really irrational? They have different goals that stem from different values, but they seem to be effectively pursuing and achieving those goals which is a form of rationality. They don't pursue objectivity perhaps, but the pursuit of objectivity is arguably a value choice, and not even necessarily rational from a self interested point of view. That seems to be the complaint on Reddit. The rationalist definition of rationality sneakily includes many value choices, rather than really being about "rationality" itself. Call it an argument over definitions.

Also, contra the post title, if you can be bad, it doesn't follow that you can be "good". Consider a population where everyone is equally good at something, but some individuals suffer injuries that permanently make them worse. There will be a few bad individuals, many normal individuals, but no "good" individuals. Yes the normal ones will be "better" than the injured bad ones, but again we have an issue of definitions over what we mean by "good". This is a situation where no one can claim special skill or superiority. This shape of distribution is arguably how certain fields like investing work, where there is no evidence of superior skill or alpha, even though there are demonstrably bad investors.

Expand full comment

There's no such thing as naked rationality. Rationality is always in the service of Something and we see each other as rational only as much as our Somethings align. Alex Jones is simply optimizing towards a wildly different Something.

As for his views on that school shooting, he claims that he was reporting on people claiming it was fake, and not claiming it was fake himself, and it was several years later that the media took his words out of context and made it look like these were his views.

So one could make the case that on this specific issue Alex Jones is more rational than you. He was reporting on people making incorrect claims, as a journalist, while you are making incorrect claims about Alex Jones :)

I think the closest we can get to naked rationality is to optimize towards our best understanding of what the most objective possible truth might be. But then we would have to venerate truth as an unreachable highest possible ideal and I have no idea how to do that without building a messy relationship between truth and beauty.

Expand full comment

Some things Alex Jones has said on his radio show:

"Sandy Hook is a synthetic- completely fake with actors, in my view, manufactured."

"It took me about a year with Sandy Hook to come to grips with the fact that the whole thing was fake. I mean, I couldn't believe it. I knew they jumped on it, used the crisis, hyped it up. But then I did deep research and my gosh, it just pretty much didn't happen."

"And it just shows how bold they are, that they clearly used actors. "

"No one died"

"A giant hoax."

Is this "just reporting on other people?"

Expand full comment
Comment deleted
Expand full comment

A very good point, although we should also keep in mind that the lies did not stop with Iraq.

People were doxxed and hurt because of Alex Jones' crackpottery, true. But a far greater number of innocent people suffered and died as a result of the War on Iraq, to name but one.

The difference is that the warmongers have establishment sanction, and Alex Jones does not. In fact, the folks who pimped for the War on Iraq suffered no personal or professional consequences for doing so. The naysayers, of course, were cast into Outer Darkness, even as that War went more disastrously than the most pessimistic predictions would have had it.

Expand full comment

You're smuggling a value judgement. You believe that rationality is good, that it's better/more moral to be rational than irrational. That's the political statement.

Expand full comment

And the next step of the argument is to say "well, being better at being rational leads to better outcomes of <X> kind". Assuming that better outcomes of that kind are the highest goal that people should organize themselves around is also a political statement, insofar as making judgements about how people should organize themselves is what politics _is_.

Expand full comment

Plenty of political ideologies accuse their opponents of being irrational. Do any of them actually endorse irrationality?

"Eating babies is wrong" is a value judgement, but it's not political because it's not controversial.

Expand full comment

Yeah, there are lots of people who believe rationality is worse than other choices. I mean, Goop is a thing, right?

Expand full comment

There are plenty of irrational people who buy dumb things and otherwise make bad decisions. But I don't think they're irrational on purpose. I think they're making mistakes.

Expand full comment

You're evaluating someone else's behavior through your own value system and coming to the conclusion that your way of thinking is superior, which reinforces your priors. If you assume that not everyone views thinking rationally as important, then a lot of "bad decisions" suddenly are explainable without thinking the people who made them are dumb; in fact they just value other things more than operating in what you'd think of as a rational way.

Expand full comment

I do and believe things that make sense to me. My neighbor does and believes things that make sense to him. We might not understand each other, each of us might look dumb to the other, but neither of us have knowingly chosen irrationality. Even if he believes in going with his gut instead of thinking things through, the choice to trust his gut is based on his past experience and understanding of how things work.

Expand full comment

So take that a step further. Why could your neighbor not value feelings as being more important than reason? Is it impossible that someone could believe that God's will is ineffable and arational, and yet that following God's will, to the best of one's understanding, is the highest value? And indeed that trying to reason about God's will is sinful pride? Trying to say that they're just being rational about their understanding of the world does violence to the concept of rationality.

Expand full comment

>Do any of them actually endorse irrationality?

I mean, yeah; there are explicitly religious politics, for instance.

Expand full comment

And then there is a certain frustratingly common brand of internet political commentator: the nihilist/troll/truth-doesn't-matter type.

Expand full comment

The largest improvements in rationalism come from making more accurate descriptive statements. Are there any moral systems that advocate misleading yourself to improve outcomes? Is wanting accurate information about the world really a political statement?

Expand full comment

As a trivial thought experiment, imagine that the leaders of a society claim that they are fit to rule because they are unable to err.

Expand full comment

No one making a serious attempt to be a rationalist would claim that they are unable to err. I doubt *anyone* would seriously claim that, but rationalists seem to have internalized their own fallibility better than most (e.g. https://slatestarcodex.com/2014/04/15/the-cowpox-of-doubt/).

In any event, this entire line of reasoning is faulty. Combining a descriptive claim with a moral claim to produce a second moral claim does not create a moral dimension to the descriptive claim.

Regarding your sibling comments: Conflating the economic value of a person's skills or abilities with the inherent moral worth we give to all human life is a common mistake. Rationalists are capable of making this mistake, but I believe that practicing rationality makes it easier to notice and avoid. I don't entirely understand what you are trying to say in those comments, so it isn't clear to me if you are making this error yourself, or if you are ascribing it to rationalism. Either way, I think you are wrong.

Expand full comment

I think you mistook my point (because I did not explain it well enough).

There are many people who benefit from non-rational value systems and behaviors. The implication of rationalism is that many more, if not all, people should lead their lives in a rationalist way. That means that the people who currently benefit from non-rationalism in whatever way will become losers and those who benefit from rationalism will become winners (in addition to any other purported benefits of rationalism that accrue to society at large or individuals).

Thus I would argue that there are at least two implicit political arguments that are embedded in advocating rationalism:

1) That the negative impact suffered by a subset of the population is outweighed by the benefits to society as a whole.

2) That society at large (including those who don't value rationalism) should trust the prior claim made by those who will disproportionately benefit.

In my examples, I was attempting to show that making a well-backed rationalist argument can create winners and losers in society, illustrating the point that value judgements that result in conflict with the status quo are inherently political.

Expand full comment

Is rationalism really responsible for those sorts of downstream effects? Any lie that anyone debunks can affect the balance of power in a complex society. Yet no one would ever claim that in order to debunk a lie you need to justify every possible secondary effect of doing so. Advocating rationalism is just debunking lies on a larger scale. Caring about the downstream effects of the lies that rationalism debunks, but not others, seems like an isolated demand for rigor.

I do see your point about this implying that rationalism isn't truly apolitical, but only in the somewhat strained sense of the word where truly nothing is. I'm generally hesitant to admit that rationalism is political without this qualification, because it sets it up for a false equivalence with partisan hackery, which is political in a much more concrete sense than rationalism.

Expand full comment

Or to get closer to the real world, imagine that in a society whose underlying foundations are that all people are equal in moral worth (and thus of equal value) you were to argue that some people were not actually equal in value to others. Say that tall people really are objectively better at many/most/all things valued by that society. Is undermining the social contract better for society?

Expand full comment

My interpretation of Taleb's take on this (articulated in Skin in the Game, if I recall correctly) is basically "it is irrelevant if your beliefs are true; what's relevant is if they're useful – there is no such thing as rational belief, only rational behaviour".

I also think there are taboo topics that many people feel should not be researched, where funding for such research would be seen as a controversial political move.

Expand full comment

We routinely encourage the dying to be more optimistic about their chances than the objective facts justify. "Don't focus on the median survival statistics -- you could be one of the outliers!" Some believe this mild-to-moderate collaborative self-deception bucks the spirit up a bit in a time when every ounce of gumption is needed.

In warfare soldiers are routinely told lies like "if you remember your training, you'll come out fine," which is obviously false for some significant number of them. Commanders believe it encourages a better spirit -- which contributes to a higher probability of a better outcome, both for the group and each individual -- than a grim objectivity about the relatively low ability of the individual soldier to actually affect his probability of survival.

Nobody disagrees with a new mother who exclaims her newborn to be the most beautiful child the world has ever produced, and it would usually be considered of questionable morality for a bystander to hasten to disabuse her of her delusion -- not to mention it might actually harm the formation of the essential mother-child bond if she actually took it to heart and started to believe her child quite ordinary, perhaps slightly uglier than normal -- which, given the harm to an innocent, is likely to be considered a pretty negative moral outcome in any ethics.

Expand full comment

You make a strong argument that lying is occasionally necessary, but no argument at all that believing the lies yourself is important. You would need to establish the second in order to be arguing against rationalism.

Someone who genuinely believed that everyone was more likely than average to recover from serious illness would be dramatically worse off than someone who understood they were telling a comforting lie. A commander who was genuinely unaware that he was responsible for ordering people into harm's way would be a moral abomination, even if he serves a just cause and even if the lie is necessary. Someone who genuinely believes that every new child born is the most beautiful ever is just an idiot. Lying to others may be occasionally necessary, but believing the lies never is.

Expand full comment

What if you need to believe the lie to argue it effectively?

Expand full comment

Well, that's an interesting assertion, but as a fanatical empiricist I would have to see it put to the test to give it credence. And the one fact that springs immediately to mind as troubling is that people *do* the things which I mention, and why would they, unless their practical experience is that they were successful? People have routinely shown that most people are unreasonably optimistic about nearly any situation with serious unknowns. Our natural set-point is to believe the future will be better than we have any right to think it should be, based on past experience. Why would we *do* this, unless it was functional?

Or consider another weird case: people have studied those caught in unexpected life-threatening circumstances -- plane crashes, lost in the wilderness, extreme survival challenges -- and they are fairly uniform in noting that those with a "positive" attitude (which in this case can be read as "delusionally optimistic") are *more likely to survive*. They never give up, even when giving up is actually logical.

Mind you, it's certainly true that in *some* areas there's a premium on accurate beliefs, but I think it is entirely possible that in others there is actually a premium (in terms of functional outcome, e.g. survival, social acceptance, personal advancement) on certain kinds of optimistic delusion. I would say that, given the widespread existence of optimistic delusion, it's rather the assertion that cold rationality is always the more functional approach that needs experimental proof.

Expand full comment

Given that Alex Jones has a viewpoint with a track record of being factually accurate (as folks like Rogan and Pool have found when they fact check him), you can call him irrational, but your own prediction rate is just as faulty. I'll continue to read you in moderation and listen to him in moderation, and likely gain equal value from both. But he's more interesting.

Expand full comment

I only ever fact-checked one thing on his site and it turned out to be very true, very interesting, and completely ignored by the regular media.

I didn't pick it randomly, though.

Expand full comment

As a public figure, he also no doubt receives lots of "information" from various sources. Some of it's true, and most of it is likely bullshit.

It's a question of signal to noise ratio. Jones believes and repeats almost everything he hears, so yes, he will be right in some cases long before other media, but he doesn't form these correct assertions in any kind of reliable, replicable way, and *that's* why he's irrational. Even a broken clock is right at least twice a day.
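
To put rough numbers on the broken-clock point -- every figure here is invented for illustration, not a measurement of anyone's actual output:

```python
# Toy arithmetic for the signal-to-noise point; all numbers are made up.
claims_heard = 1000        # hypothetical claims that cross his desk
base_rate_true = 0.05      # hypothetical fraction that happen to be true

repeated = claims_heard    # he repeats essentially everything he hears
hits = repeated * base_rate_true

print(f"genuine early 'scoops': {hits:.0f}")            # 50 real hits
print(f"P(true | he said it): {hits / repeated:.2f}")   # still just 0.05
```

Plenty of hits to point at after the fact, but because he filters nothing, his saying something moves its probability of being true not at all -- that's the sense in which the process is unreliable.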

Expand full comment

Actually you're incorrect. Clearly you have Zero idea what you are talking about. How much time listening to Alex Jones have you spent? Near Zero I suspect.

Expand full comment

"But I’d much rather spend my time and energy to learn from the people who are better."

You say this Scott, but then you also never correct the obvious mistakes I have pointed out regarding your readings of Marx. (for example here: https://astralcodexten.substack.com/p/book-review-global-economic-history#comment-1795464)

Expand full comment

People like us debate about things like optimal tax schedules, best type of statistical analysis, most sensible vaccination schedule and distribution, future technologies, capital allocation.

Meanwhile a sizable chunk of the population is debating whether vaccines are a global plot to sterilize humanity, whether the world is flat, whether god created the world 6000 years ago or 10,000 years ago, and most recently whether we should ban hospitals from requiring vaccinations for employees (you think this is some fringe kooky politics? No, these bills have advanced out of legislatures).

Being able to go from smart college student to Scott-level rationalist might be worthwhile. But it’s rather like deciding what clothes to wear when your house is burning down. How do we inject a bare minimum level of sanity into the world?

Expand full comment

I think a lot of what you quite correctly see as irrational speech is just one group of green beards calling out to another. It's people in one tribal group trying to raise an unmistakable flag in order to see how many other members of the tribe, dispersed in the general population, there are. The fact that the speech is insane or silly actually makes it more valuable -- less chance of getting a response from someone *not* a member of the tribe who mistakes the rally cry for an attempt to actually communicate information.

Expand full comment

I so very much respect you and your POV on so many things, but on this topic I have to respectfully disagree.

For example, I don't think Alex Jones is actually irrational. I think he's a rational actor who has determined that his brand of rhetoric gets him what he wants (money, infamy). He has probably determined, quite rationally, that he could not attain the money or infamy he has attained in more legitimate ways.

And, yes, I think it is absolutely reasonable -- and hyper-rational -- to assert that even rationalists come to the party with biases. It may very well be that the rationalist bias leads to better fact-finding and improved revealing of measurable truths, but it's still a bias.

I do not think we can remove bias from any assessment. I might be going out on a limb here, but perhaps we can only remove bias from standard measurements.

I humbly suggest that we might want to look at bias as being as inevitable and as pervasive as the observer effect. In fact, they may be related.

Expand full comment

> And, yes, I think it is absolutely reasonable -- and hyper-rational -- to assert that even rationalists come to the party with biases.

I don't think Scott disagreed with this so much as disputed the common tactic of attempting to dismiss rationalist or scientific arguments as "just as inherently biased" as other rhetoric; some people take it so far as to establish a false equivalence between all arguments because bias is inescapable.

It's clearly nonsense to claim that two philosophies whose operating principles constantly examine assumptions and empirically correct for bias are just as biased as any other philosophy.

Expand full comment

It's a branding issue.

Imagine a group called the "Truthists" who uphold the telling of truth. They'd be scrutinized for any sign of dishonesty, and any lie from any Truthist ever would be wielded as an example of the naked hypocrisy of the movement, how you can't trust Truthists, and how they should actually be called Untruthists, and then they'd become a punchline in an SNL skit. This would probably happen even if Truthists were overall more honest than the population at large!

In general, people become suspicious when you brand yourself around a positive identifier. It's like those hospitals with creepily positive names ("Holistic Happiness and Wellness Center"). You instinctively wonder what exactly they're hiding....

Expand full comment

See also: objectivism.

Expand full comment

In my feline experience, objectivists tend to confuse their preferences with universal truths, and spend far more time berating unbelievers for their supposed lack of rationality (meaning, failure to agree with the objectivist) than they do scrutinizing themselves for straying from The One True Way.

Expand full comment

I thought Objectivism was just a belief system which allows you to rationalize infidelity without having to say humiliating things that begin "It's not you, it's me..."

Expand full comment

There's a reason they called it lesswrong.com and not good.com, and that reason is that the domain was vastly less expensive. Hmm forgot where I was going with this.

Expand full comment

The point of the "everything is biased" argument is, generally, to bypass scrutiny of your political ideas as they pertain to objectivity. This nonsense has been loose in social studies for decades, as anyone who's followed the topic knows.

Saying that "nobody is fully unbiased" is completely different from saying "no unbiased set of facts exists." Even if you've got nothing but biased people, you can still extract a set of facts as long as they aren't all biased in the same direction. Refer to the standard bullet points on "the group being smart as a whole," yada yada yada.

When you run into someone who thinks your view is biased in a particular instance, you can talk to them. When you run into these people who think there's no point in reasoning because "everyone is biased," watch for sudden movements and protect your vital organs.

Expand full comment

>When you run into these people who think there's no point in reasoning

As an academic, I'm pretty sure I've never run into these people.

Although I have met a lot of people who dismiss their ideological opponents without considering their arguments by *claiming* they are those types of people.

Expand full comment

"People who think there's no point in reasoning" is just my exaggerated description of the people who make the arguments quoted in the post. Such ideas are increasingly prevalent, and not among people who did welding apprenticeships.

It is not even that much of an exaggeration. You'll have trouble finding anyone who outright says they "don't believe in reasoning." But it is easy to find people who reflexively put a moral quietus on appeals to reason or "objectivity," on the premise that these are always disingenuous, or that they are an effort to "legitimize" some kind of attitude.

i.e. "the only reason to be entertaining [such and such ideas] is if you have [such and such motivation]"

Expand full comment

I think your arguments are basically correct individually, and incorrect in their conclusion, entirely because you miss a single, important element:

The people who accuse somebody of being biased, they're almost always trying to win a political argument that has nothing to do with whether or not the person they are accusing is biased. The people who accuse somebody of being irrational, they're almost always trying to undermine somebody's rational response to a situation. "Irrational", as a word, is something the public generally thinks that somebody says when they're gaslighting somebody, to shame them for their entirely rational emotional response to a fucked up situation.

The connotative and denotative meanings of these words clash in a way which I think invites a level of extra scrutiny to claims related to them. That is, the demand for rigor isn't isolated; these words are part of a class of words we are generally suspicious of the use of. "Enlightened", "Godly", "Hormonal", "Emotional", "Committed" (in an ideological sense). I think we're reaching the point where we can add "Scientific" to the list.

Also, there is a lot of specific ideology arising from founding effects in the rationalist community, which I think sometimes gets conflated with rationalism itself.

Expand full comment

A strong rationalist should be aware that they have limited rational compute capacity. So there is some amount of need to apply shortcuts to adversarial anti-rationalists like a spam filter to protect your capacity.

Expand full comment

> Even in the vanishingly unlikely chance that I’m the [most rational] person in the world, I still don't think I'm hitting up against the limit of what's possible

We'll have to ask gwern

Expand full comment

I think the confusion is that "rationalists" view rationalism as a tool. It's like a hammer. A hammer can't be biased. You might be able to do very biased things with a hammer--build a "whites only" sign--but the hammer in itself is not a biased thing.

The criticism views "rationalism" as fundamentally tied up in the problems it tries to explain. Asking "how does 200mg of L-theanine daily affect stress levels" is, to them, a biased question because it presupposes issues like who gets L-theanine and who doesn't, whether it is morally right for some people to benefit from lower stress levels, and who produces L-theanine and why they do not get the benefit. To them, asking the question is already making moral and political judgments, as they do not view these questions apart from the tools used to analyze them.

Confusing a tool for things the tool is used for is an interesting failure mode (in the spirit of "what developmental milestones are you missing"), and I think it's worth calling out that that's what's going on here.

Expand full comment

Surprised that the word "projection" doesn't appear anywhere in your essay.

Anyway, in my experience people making these "everything is relative" arguments fall into three classes:

1) Cynical hucksters and cult leaders (e.g. lawyers, politicians, pundits) who personally benefit by sowing distrust in objective reality because it makes it easier to sell faith in a guru (i.e. themselves).

2) Morally lazy people seeking to assuage their own troubled conscience by arguing "everybody does it! I'm not so bad..." The projection crowd.

3) People who have held delusional understandings of their own intellectual competence and are confronted with reality by events, e.g. someone who has won participation trophies all his life and then discovers that there *actually exist* people who are much smarter and more competent, objectively, than he is, and who therefore retires from "artist" to "critic." It's comforting to his wounded self-esteem to argue that people who are Right all the freaking time are just...beneficiaries of some kind of social privilege or momentary consensus support -- instead of being *much smarter than me*. Call it a form of envy, from people who are not smart enough to wrestle with ambiguity and poor data and nevertheless extract from the morass of noise the occasional clear diamond of truth.

Expand full comment

The people you're addressing here are not arguing in good faith. I mean, in this specific case it seems to be more someone getting personally irritated by Yudkowsky and/or some other individuals in the rationalist community (plus the Marxist belief that Marxism is rationally necessary); but usually this argument is just deployed to be obtuse. There's discussions that are built around current politics - not questions of political structure, or even necessarily ideology in general, but the front-page news, the current sources of tribal anger - and then there's discussions that avoid those topics to the best of their ability. We all intuitively understand that, I think. But sometimes people want to make a discussion current-political, and when people tell them to stop ranting about politics they respond by saying everything is technically political.

Which is really separate from the claim that "rationalists think they can be perfectly rational in Straw Vulcan fashion", which is only said by people who've never interacted with the rationalist community to any significant extent. But I've seen the "everything is political" argument pop up much more in other contexts - e.g. whenever someone complains about a preachy book that unnecessarily brings in current politics, they get the stock 'everything is political' response. No, I don't need the alien villain to talk about how they want to "make the homeworld great again", or to read your entirely unoriginal thoughts about why it's all men's fault from the space opera narrator. And yes, it *is* possible to deal with themes of authoritarianism or gender without being unbearably preachy about it; but it's impossible to draw a bright line, and so "everything is political" is brought out as an all-purpose defense. The same goes for science, and for whether its politicization is a problem.

Anyway. The point is, it's a general class of fallacy where "X is bad" is replied to with "there's no clear line between X and not-X, ergo X doesn't exist and it's not a problem that I'm X-ing on your lawn".

Expand full comment

(And of course in some cases being political is desirable, but I think in the social media era it's pretty clear that there's too much of it rather than too little, at least online.)

Expand full comment

"plus the Marxist belief that Marxism is rationally necessary"

In the Rationalist community I've pointed out problems with Rationalist readings of Marxist material, which range from extremely obvious misreadings and mistakes to outright misrepresentations of Marx. When I pointed this out, people mostly either ignored the mistaken beliefs of the Rationalist community or outright banned me for pointing out errors.

Expand full comment

Maybe Marx was actually a shitty writer, if he needs an army of apologists to correct widespread misunderstandings of what he said all the time.

Expand full comment

In my opinion his writings are not so hard to understand. He's one of the most taboo writers and people often come at his writings with false preconceptions, which, unless they are careful readers, will distort their perceptions of what he's saying. This is probably the case with Scott Alexander's ill-fated attempts at understanding Marx.

Expand full comment

Yet I've pointed out that other Marxists have come to different conclusions than you, which you simply dismissed.

Expand full comment

You're mistaken.

Expand full comment

No, I'm not mistaken, and I explained Scott's misreading of Marx in basic detail here:

https://astralcodexten.substack.com/p/book-review-global-economic-history#comment-1795464

Expand full comment

What I see in the link is that the one time you devoted most of your words to actually explaining Marx's views as you see them, you were neither banned nor ignored; instead you got a couple of responses from curious people open to having been wrong. So yeah, you are mistaken.

Expand full comment

And I'd argue that even there, the ratio of explanation to aggressive psychological analysis of motivations, backstory, and culture-war crap was quite bad. So even if you hadn't gotten such a reaction, you would still have been mistaken in your claim about what causes people to so often ignore you and want you gone.

Expand full comment

Has Scott Alexander fixed his mistake yet? It feels like he is ignoring this problem.

Expand full comment

> In the Rationalist community I've pointed out problems with Rationalist readings of Marxist material, which range from extremely obvious misreadings and mistakes to outright misrepresentations of Marx. When I pointed this out, people mostly either ignored the mistaken beliefs of the Rationalist community or outright banned me for pointing out errors.

That's your claim. You are mistaken, as can be seen by your own link. Don't know what else to say.

Expand full comment

Would you still object if they replaced 'ideology' with 'utility function'?

Because to me, that's closer to what they're saying, at least the ones I'm familiar with. Rationality doesn't *do* anything without a utility function telling it *what* to be rational about pursuing, and I think they use 'ideology' to refer to those motivating factors.

It's not saying 'there's no such thing as rationality because ideology prevents it,' it's saying 'everyone has an ideology *in addition to* being rational or irrational (separate axes).' And if you forget to interrogate your ideology because you think you don't have one because you think you're driven by pure rationality, then you run the risk of accidentally developing a terrible and self-serving ideology in the shadows.

Of course, I think that objection to the rationalist movement is largely covered by the 'we have noticed the skulls' paradigm, or at least should be. On the other hand, if you look into some of the banished culture-war discussions on reddit, it's pretty clear that there's still some very strong ideology at play - those discussions are better than other culture-war discussions by virtue of being much more rational than normal, but that doesn't remove the influence of ideology from them.

Expand full comment

I don't think this is really the same objection. It's *an* objection; the basic point of "it's possible to be both intelligent and evil" is not really disputed by rationalists - I mean, that's literally the orthogonality thesis - but in practice there's IMO a real vulnerability to people who are intelligent and arguing in good faith, but also just evil. But I don't think that's the same complaint as was cited in the OP.

Expand full comment

(and it doesn't depend on politics at all, necessarily, though IMO in practice it's related.)

Expand full comment

I mean, it's a single sentence that doesn't get elaborated on, so I guess it's hard to say exactly what the full intent of the sentence was. But what I'm saying here is how I would interpret that one sentence, especially if I was being charitable or steelmanning.

Expand full comment

An undergrad philosophy professor, when challenged with "can anyone really imagine infinity?", agreed with you pithily: "Humility on behalf of others is simply a confused form of contempt."

Expand full comment

It's clear to me that everyone is biased. I interpret "bias" as what a person's axiom system is. Different people have different self-evident truths. It's certainly the case that most people have a self-contradictory set of beliefs, and probably the case that all people do, but even if that's not the case, certainly there are different axiom systems that give different sets of "biased" beliefs, that are still correct within the systems.

I really don't think Alex Jones is less rational than the average person. Intelligent people rather drastically overestimate the intelligence and rationality of the average person. The average person believes all kinds of crazy things. I think Alex Jones is probably right around average in rationality. In any case, saying it's obvious that he is much less rational than average needs some support.

It's not clear to me that global warming is necessarily relevant, for many definitions of relevant. (I'd rather say that global warming is probably not an urgent problem, or rather that it wouldn't be if the world enacted a high tax on gasoline and was rational about nuclear power.) It's also not at all clear to me that someone saying that global warming is not an urgent problem must be any more biased or politicized than anyone else.

Perfect rationality is certainly possible: there's a mountain of mathematics that has been proven to be perfectly rational (within a certain axiom system).

Expand full comment

I think that Scott was trying to make a simple point - that rationality is on a spectrum and it's possible to be more or less wrong. I do agree with you that Alex Jones wasn't a great example, because we don't know how rational he really is.

Expand full comment

I think that for some people the motion of ideas has a feeling. For others, what they are experiencing emotionally or through senses is a much stronger sensation. What is considered a path to self-improvement will look different for these groups. Looking at a math problem on a page, one person will see the ink, the paper, feel the concept of number and see the number concepts interacting. Someone else will see the paper, the writing, feel frustration at a teacher, the smell of the room and a feeling of loneliness, say, and the sensation of the number concepts will go largely unrecognized. Both people will however use the word “math” to describe their activity in those moments. Of course math B is “political” if one thinks all human experience is politicizable. Math A will seem qualitatively different from other types of interpersonal activity, to those that experience it, and will feel quite far from other models of politicalness.

Expand full comment

I think that for a lot of people, "rationality" is really "received wisdom." This makes sense in most cases (as you've discussed in several posts before) - the accumulated knowledge of culture is probably smarter than what any person could figure out on their own. Alex Jones is labeled "irrational" because he goes way off into left field with kooky theories.

The proper source of received wisdom of course varies based on group affiliation. Conservatives look to the founding fathers and Tucker Carlson, liberals look to FDR and the New York Times, socialists look to Marx and uh, Jacobin(?), etc. But most groups have clear foundational old sources of wisdom as well as recognized modern institutions that are trusted and respected as sources of truth.

Importantly, the idea that rationality is essentially a matter of rejecting conventional wisdom and overcoming cognitive biases is a pretty big ideological leap. Like most ideologies, it is defensible, but it is definitely not how most people think about it. For most people, rationality = conformity, irrationality = nonconformity. Rationalism, as an ideological structure, flips this around and suggests that conformity is irrationality while nonconforming free-thinking is held as the highest ideal.

Given all this, I think it makes a lot of sense that people will be left scratching their heads when they hear someone say, "we're the rationalist community. We're all about avoiding bias and seeing objective truth! Also, we're obsessed with superintelligent AI, half of us have arranged to have our brain frozen when we die, and our comments sections get REALLY racist." Like, there really is a pretty distinctive ideology at work with its own very noticeable quirks. It all hangs together and makes sense once you understand where it is coming from, but that is true of any well-developed ideology.

Expand full comment

Post-Disney - the main character is always nonconformist but saves the day - I'll argue that in the US conformity is often seen as a moral failing, or even a failure at being a person. For example, "personality" is revealed in "quirks," and if someone isn't conspicuously quirky they are taken to have no "personality" or heart or moral conscience (too much acceptance of received wisdom, no grappling). Rationality then looks equivalent to lack of moral conscience.

Expand full comment

I think it's important to distinguish between descriptive and prescriptive statements. If you are doing descriptive work, then yeah, you can be objective. But even then what you say will be influenced by your perspective, as the story about three blind men describing an elephant teaches us. However, when it comes to prescriptive statements, there is no such thing as being objective. If you are saying what should be done, rather than what is, you are expressing a personal preference. All preferences are subjective.

Here is where those two intersect. Choosing what to describe is a preference. This is very common in political discussions, where each side selects which facts to present. And the thing is, you have to be selective about which facts to present - you simply can't present them all. So how do we select which facts to present in order to be rational, or objective, or unbiased? These terms haven't even been defined - they are just nebulous concepts that we all think we have defined in the same way for ourselves when we haven't. Whatever we think of those terms, I think keeping in mind the idea that you cannot separate yourself from your ideology is a very useful tool in trying to become more "rational" or "unbiased."

Expand full comment

From Asimov, a related idea: https://hermiene.net/essays-trans/relativity_of_wrong.html TLDR: It's ridiculous to throw up your hands and say all theories are wrong; some are much less wrong than others.

Expand full comment

This feels similar in some way to arguments against meritocracy. "all science is political" is usually leveled at institutional science, with the implication that those institutions are falsely claiming that they *already are* up against the lightspeed-rationality limit. A charitable read is that "all science is political" is a way of telling people who are saying "we know objective truth" that "you should remember that you can be a lot better".

Expand full comment

I disagree, normally "all science is political" is used to defend making science more political. It's a retort to use against anyone that questions whether the blatant injection of political bias into science is harmful to the process.

E.g:

"Our foundation funds science that disproves global warming."

"I'm concerned that you're politicising science."

"All science is political! Stop being hypocritical!"

Expand full comment

> I believe there's an important, specific criticism to make of this group, and that "they are doing biased, political science" is the most natural and accurate way to make this criticism.

This only applies if you believe that science is categorically different from literature. If, on the other hand, you believe that science is just another form of narrative, then any statements anyone can make for or against global warming are merely expressions of their political will. The statement "the Earth has warmed X degrees in the past Y years" is not categorically different from saying "we should/shouldn't ban immigration" or "the total number of genders is N" or "people of race A are superior to people of race B", etc.

Expand full comment

In a humane polity, people who believe science is just another form of narrative would have guardians appointed to manage their financial affairs.

Expand full comment

His other "cons" are pretty bad too.

2. They have too much faith in the power of their own cognition

Counterpoint: this community is where I even got a clear concept of the Outside View. One might say it's _focused_ on failures in cognition. Maybe there's too much emphasis on the Inside View, according to Freddie? But then, frankly, his ideology is more esoteric than the ideology of some rationalists. Seems like an isolated demand for rigour, backed by nothing. How does one even respond to this accusation?

3. Their indifference to human emotion and social cues is not integrity, it's a refusal to confront the material importance of emotions

Frankly, I'm not sure if I get what he means here, so whatever.

4. Eliezer Yudkowsky is their king and he's kind of an asshole

I don't think that's the case. But maybe I'm wrong. Again, seems like a weird ad hominem.

5. We're all just scrambling around trying to find meaning and understanding with brains that are the imperfect products of evolution

...which is, again, core rationalism(TM) stuff. Like, lots of sequences are basically this. What's the point?

Freddie really needs to add how he thinks rationalists should improve, because these points don't seem very actionable.

Expand full comment

To be fair to Freddie deBoer, though:

> 1. There is no such thing as "rationality" that is free from ideology

I don't know what precisely he means by this. However, IMO the capital-R rationalist movement definitely has some ideological stances; i.e. positions on certain issues where, should you take the wrong stance, you're automatically branded as a non-Rationalist (and possibly just a crazy/stupid/toxic person altogether). Rationality as a methodology may be ideology-free, but Rationalists as people do not.

> 2. They have too much faith in the power of their own cognition

Absolutely true; the general idea is that intelligence is the only attribute that really matters, and that a sufficiently intelligent agent can accomplish virtually anything (plus or minus some laws of physics).

> 3. Their indifference to human emotion and social cues is not integrity, it's a refusal to confront the material importance of emotions

I don't think it's a *refusal*, per se, but there's no denying that the proportion of people on the autism spectrum is higher among Rationalists than among the general population. But I think it's unfair to conflate autism with willful refusal, so that's a point against deBoer here.

> 4. Eliezer Yudkowsky is their king and he's kind of an asshole

It doesn't matter whether he's an asshole or not; lots of people are assholes. However, IMO the Rationalist movement holds Yudkowsky in higher regard than is perhaps warranted.

> the notion that we are on the verge of radical life extension is profoundly optimistic based on current technology.

Calling it "profoundly optimistic" is IMO an *understatement*.

Expand full comment

I think this depends on whether you interpret being part of the Rationalist community as a commitment to being more rational (which should make you self-critical enough to become continually more rational) or as a claim to already be more rational than most people (which will make you less open to criticism, and therefore stuck in the same place). There's an element of both in the movement, and many Rationalists could probably benefit from being more open to criticism, but I suspect that being part of the Rationalist community makes people more open to new ideas than they'd otherwise be. My assumption is that the average Rationalist was always very confident in their own conclusions, and becoming part of the community made them more open to new ideas. There's probably a weird effect where becoming a Rationalist makes you more confident in your conclusions, conditional on your understanding of the world being correct, and as long as you remain open to new information I think that's probably a good thing.

It could of course be the other way around and being "Rational" just makes you an insufferably arrogant know-it-all, but at the very least the Effective Altruism crowd I'm part of loves being told that they might be wrong about everything!

Expand full comment

One possible response would be

"People who identify as rationalists tend to have specific blindspots. They are not just imperfectly rational (everyone is), they are imperfect in distinct ways"

Similar to the way that it always seemed to be _engineers_ who argued for Young Earth Creationism, or that 'New Atheists' were so proud of seeing through religion that they never noticed their sexism, etc etc

Expand full comment

So bad it's good 😁

Expand full comment

I was surprised by this post and how Scott chose to write about such a basic, 101, idea. It's like seeing a college math professor lecture earnestly about how addition is... addition.

Which made me wonder why. Is it that Scott is so frustrated with so many people unable to do simple addition? Is he speaking to the "beginner" readers and trying to catch them up? Or does he find that the usual discourse on relativism (philosophical/logical/sociological) inevitably gets bogged down in a fog of college dorm room weed smoke? (Reading the comments, it appears that it's inescapable even when simplified to a spare denominator.)

I think this was the darkest and most depressing post I've read yet on SSC or ACX.

Expand full comment

You seem to think that basic/101 ideas should be so obvious that they do not need discussion or elaboration. I suggest you peruse the SneerClub subreddit's treatment of this very post to see how even such basic arguments can be twisted by bad-faith actors.

Expand full comment

Love this idea by Meehl: 'Multiple Napoleons fallacy: "It's not real to us, but it's 'real' to him". "So what if he thinks he's Napoleon?" There is a distinction between reality and delusion that is important to make when assessing a patient and so the consideration of comparative realities can mislead and distract from the importance of a patient's delusion to a diagnostic decision.'

Expand full comment

Typo:

We’re happy to grant that people are fighting for justice"

should be

We’re happy to grant that people are "fighting for justice"

Expand full comment

I will say this with as much kindness as I can manage, but you're really missing the point of the "anti-rationalist" critiques you are criticizing. The rationalist viewpoint, at least as it manifests in these parts, ABSOLUTELY smuggles in a bunch of ideology and value judgements. You even started to grind into this problem about a decade ago (whose utilitarianism).

For one very obvious example (and at the risk of being culture-war adjacent), my read of the rationalist community on the subject of human difference is that they genuinely believe that race, sex, and gender don't/shouldn't matter, only ability does/should. In fact, debates about the degree to which race, sex, and gender DO matter are framed in terms of ability (which in turn is frequently reduced to IQ).

(I explicitly don't want to argue or start an argument about whether this perspective on race/sex/gender is justified or correct or moral. The point isn't whether the viewpoint is correct, rather that it is one commonly held in these parts)

I once read an interesting point about the Japanese artist Hokusai's Views of Mt. Fuji. After this series was produced, the image of Mt. Fuji came to dominate the period's art so strongly that even its absence was a choice laden with meaning. For a more contemporary example, consider Tolkien's influence on fantasy: you can write Tolkienesque Elves/Dwarves/Wizards, you can try a new take on Tolkienesque Elves/Dwarves/Wizards, or you can avoid writing Tolkienesque fantasy by removing Elves/Dwarves/Wizards from your setting completely. But these are your only three options: avoiding comparison to Tolkien, at least on some level, is not an option for a fantasy writer at this time.

Race, Sex and Gender in contemporary politics are kinda like Mt. Fuji or Tolkien. They cast such large shadows that trying to ignore them is a kind of political project in and of itself. You can't be neutral on this stuff, and not because doing so is impossible in the abstract "no one can be totally unbiased" sense of "impossible". Rather, because a belief that one actually SHOULD be neutral on these subjects is ideologically motivated.

If I wanted to go full post-modern-neo-marxist I'd go on to point out that this anti-ideology matches exactly the demographic/class interests of the rationalists as a whole. Compared to the general population, we skew heavily towards white, male, smart, and DSM-diagnosable. Assuming self-interest, and given that demographic data and nothing else, the political implications of the rationalist project are utterly predictable.

Please understand I say all this as an intelligent, "neurodivergent" white man who considers himself a rationalist. Y'all are my ingroup, and the people Scott was quoting definitely aren't. But let's not blind ourselves to the political implications of this community just because we don't like the fact that there ARE political implications.

Expand full comment

These are very fair points. I think the rationalist community is decent at things like pharmacology, economics, epidemiology. But it completely falls apart when it comes to politics for exactly the reasons you state.

Expand full comment

Isn't Alex Jones possibly pretty rational individually, milking right-wing morons for their money by selling worthless supplements? "Crazy like a fox"? How irrational is it if it works for him?

Obviously his message and content are utterly irrational and bonkers.

Expand full comment

I think it makes sense to separate Alex Jones, the character; from Alex Jones, the actor portraying the character.

Expand full comment

What if Alex Jones' rationality is beside the point? What if he is the heir of Homer rather than Newton? In Jordan Peterson's model, "The world can be validly construed as forum for action, or as place of things." Explanations of the world in the former (Homeric) sense focus on describing the meaning of things and ultimately try to tell people what they should do; explanations of the world in the latter (Newtonian) sense seek an "increasingly precise determination of the consensually-validatable properties of things, and for efficient utilization of precisely-determined things as tools".

If Alex Jones accurately predicts the ascendance of the Blue tribe and how that translates into the cultural, economic and literal* destruction of the Red tribe, does it really matter how precise the metaphor is? Or is it more important that he craft a powerful metaphor that mobilizes his tribe to resist?

*only a slight overstatement, viz., the unprecedented decrease in life expectancy due to deaths of despair

Expand full comment

The two words, "Less wrong", really do say it all. Perhaps we can just rename rationalists as "Lesswrongers" and quiet the haters.

Expand full comment

I think some of the problem is the perception of arrogance on the part of self-described rationalists by critics. They feel that the rationalists are claiming a sort of superiority - that they can think properly whereas non-rationalists can't, that they are cool, objective, non-emotive truth-seekers and everyone else is a dumb hysteric.

I don't think that's necessarily true (but, um, I have certainly seen some people online describing themselves as rational/rationalist and being a bit arrogant over their superior brainpower) but it's the old clash between Reason and Emotion, with emotion being considered inferior and feminine versus masculine reason and superior brainpower.

People who feel very strongly about things will naturally feel "well, why shouldn't I be angry or upset or enthusiastic or whatever about this? The situation *should* make you angry! Who gave you the right to play the scolding parent treating me like a toddler having a tantrum?" in response to requests to calm down and think about this without emotion. I greatly prefer the "please just calm down and be reasonable" approach myself, but I do see how it can be off-putting. Sometimes you just have to let someone vent before they can calm down and be reasonable, and sometimes the dispassionate approach can seem like you're staking a claim to be the 'better' (as in 'more correct, more right, more reasonable, more adult and mature person' sense) participant in the conversation.

There are terribly irrational people out there, truly. But there are some self-described rationalists who aren't much better, who do seem to think you can reduce people and human problems down to a mechanical box where you just pull this lever and press that button and the correct answer comes out.

I'm one of the people who mocked the "feminist glaciology" thing, but there could be a point there - before it got exaggerated and co-opted into the culture wars - that other viewpoints can make valuable contributions, and maybe you should listen to the folk tales from people who have been living around glaciers all their lives because they might have some information there (which is not to say you immediately junk all science and become a shaman). Science does not appear fully formed out of a void; it is done by scientists, who are people and who have the same biases and flaws as every human. I think we see this in the replication crisis, where very definitely people have had agendas or preconceptions which they shoehorn into their studies, and massage the results, and then wave this around as "science proves x, y or z!" - and/or interested parties like to wave this around as "science proves x, y or z!"

We see it most clearly and obviously in social science and psychology experiments, particularly those from the 70s, such as the Stanford Prison Experiment, where it turns out the originator deliberately set it up to give him results supporting his political ideology around prison reform, and what was dubbed the "sex raft" experiment, which I only learned of via Tumblr (all I can say is, look, the 70s were weird and everyone was on drugs) https://www.forbes.com/sites/joanneshurvell/2019/01/15/the-1973-raft-experiment-sex-and-sedition-at-sea-now-a-fascinating-film/?sh=6948990271e0

"Shurvell:

One of the most interesting things about the experiment was that it didn’t work out as Santiago expected so he tries to manipulate the participants and then they turn against him.

Lindeen:

Yes, and I think that’s actually an interesting result. That’s why it’s great that the “guinea pigs” are brought together again and are able to speak about and analyse their experiences in my film. The stories the participants told were completely different than what Santiago had written in his journals."

That's not "objective, rational" science, that is "everyone has got biases and an ideology" in action.

Expand full comment

I find the people who oppose rationalists are pretty arrogant as well, maybe not quite Yudkowsky levels of arrogance (I like him but understand why people hate him), but there's a very condescending vibe to a lot of anti-rationalist criticism. Something along the lines of: these people think they're so rational, but really they don't understand how to think; here are some fancy philosophers who point out the limitations of rational thinking; when you really think about it, trying to be rational is irrational. If you really want to, you can go on to argue that "rationality" is just a gateway drug to the alt-right (reference Ben Shapiro/Curtis Yarvin), so better never question anything anyone on the left says or you'll end up a fascist.

Not sure on my point except that arrogance seems to be the default.

Expand full comment

"Statements like “it can’t be separated from ideology” risk putting everyone on so relative a footing that Alex Jones’ version of rationality is no worse than anyone else’s."

Well that doesn't quite scan.

Michael Mann is a major figure in climatology whose work was critically important in understanding the reality of human effects on climate change. He was also influenced by the predominant social order he lives in, and the things he does cannot help but be shaped by it - he has biases and blind spots like anyone else. The same is true of Alex Jones. And yet one of these is justifiably seen as a respectable scholar and scientist, and the other is seen as a bizarre crank who rants about gay frogs. We can acknowledge that both of them are thinkers who fail to be perfectly rational all the time and still consider that difference significant.

To take your race analogy, this is less like pointing out that humans can't quite run as fast as the speed of light and more like pointing out that humans can't quite sprint as fast as racehorses. Nobody grows up independent of the culture that raised them (or the instincts they have as human beings), and as a result, everyone will have some biases. Some people will have fewer biases, or be more aware of them, but I don't think it helps to call that "good". Nobody is "good" on rationality; we're all just different shades of more or less bad. Isn't the rationalist creed something along the lines of "CONSTANT VIGILANCE!" because of that?

This seems, at best, like a heuristic that will make certain people become overconfident (compare to "Cowpox of Doubt") in their own ability to reason. At its worst, it seems like the denial of very basic truths because they're inconvenient to the ideas behind rationalism. Which doesn't seem very vigilant.

Expand full comment

It's more a critique of the selective application of the tools of rationalism, e.g. of intellectual buttressing and of critiques of irrationality. It's less about biased methodology, and more about biased cause selection and the harms of elevating certain voices (when they say one thing that is rational).

Expand full comment

I admit I'm very lukewarm about rationality as the rationality community seems to define it. In large part, I'm just not sure I understand what they mean by "rationality." In small part, it's probably because I associate the term "rationality" with a sort of "science, f--- yeah! Screw those stupid religious people!" mentality. That's probably not a fair association, but I choose it anyway.

What that means is, I'm probably sympathetic to the claims Scott is critiquing in the post. It's not that I disagree with Scott. It's just that my knee jerks in that direction.

As a partial aside, I read a portion, maybe 20%, of Eliezer Yudkowsky's A to Z book about rationality (I hope I'm not misspelling his name....I'm too lazy to look it up), and its tone seems to substantiate some of my suspicions and reservations about "rationality." I understand that he wrote that book a while ago and that in his preface he says he's taken on a different attitude about his approach to some of the things there. But it's hard for me to shake it off. At any rate, I'm probably being unfair.

Expand full comment

Also, the Rationality Community often seems to share an error with the traditional Rationalists like Descartes, a dramatically exaggerated belief in their ability to just _think_ one's way through any problem instead of doing annoying things like checking with experience or established research.

The Skeptical community - which overlaps but less than you'd think - is far better with this.

Expand full comment

"Also, the Rationality Community often seems to share an error with the traditional Rationalists like Descartes, a dramatically exaggerated belief in their ability to just _think_ one's way through any problem instead of doing annoying things like checking with experience or established research."

I think that's simply inaccurate, which is kind of ironic.

Expand full comment

Bookmarking this article for the inevitable thousand times that I’ll need to share this perspective with interlocutors

Expand full comment

Feels like this misses the key point, IMO. Here's something extremely rational: schizophrenic paranoid fantasies. All the conclusions follow from their perceptions! How could we get more scientific? The only thing that prevents us from believing that the CIA is tapping our phones is some sort of trust in institutions, which communicate through media channels stuff like "we promise that we're not following you specifically around".

So, we're not necessarily being any more or less rational than the paranoid schizophrenic when we decide that the CIA isn't tracking us (actually we're probably being less rational, because we're taking more on faith), it's more a question of "who do you trust and what do you place your faith in?"

And once you start poking at those questions, the idea of rationality flies out the window, because rationality (and objectivity as well) is a social tool rather than an end in itself. What's political about rationality is what you choose to do with that tool, and the "true" grounds in terms of ethics, value, etc. (i.e. what one's rationality is deployed *toward*) are only found when one reads between the lines, at least on this blog or in this community.

Remember that objectivity itself was invented by Archimedes and only became a "thing" back in the 1500s, when Galileo realized that our senses were deceiving us about the cosmos, thus toppling the whole "contemplative natural philosophy" thing that Aristotle started up way back when. And objectivity is useful but we can't be objective about everything all the time, so politics creeps in when we decide what we're going to be objective *about*.

Do you see what I'm getting at here?

Expand full comment

> All the conclusions follow from their perceptions! How could we get more scientific?

Perceptions are not treated as authoritative by science, because science has demonstrated that perceptions have flaws. Merely trusting perceptions is thus not justified.

A scientific schizophrenic should also check for such internal inconsistencies. The mathematician John Nash did something like this. I think this seriously undercuts your claim that rationality is merely a social tool; it in fact has a larger scope as a truth-seeking and truth-validating process.

Expand full comment

What you are describing is not rationality in the precise sense of the word, but something else entirely.

Expand full comment

I have a hard time understanding what you could mean by "rationality" that doesn't encompass a process that corrects for bias and that doesn't require internal consistency.

Expand full comment

The formal usage refers purely to internal consistency and the pursuit of chains of logic. The idea of correcting for bias is actually in tension with rationality as such, because it disrupts what would otherwise be internally consistent chains of reason. This is why the original Rationalists in the Enlightenment tended to be in opposition to empiricism, with Kant providing a first tentative synthesis (in that we can empirically determine appearances, but the "things in themselves" remain inaccessible except through "pure reason", i.e. the application of rationality to a set of given priors).

Expand full comment

So in this sense, the idea of "correcting for bias" is precisely the political element, because the selection of a "target", away from which we're "biased" is exceedingly non-trivial philosophically speaking, and rests on what are oftentimes ethical concerns.

Expand full comment

I think there might be some misunderstandings between the two camps (rationalists v. "you're not as rational as you think"ists).

One is the conflation of being rational in approach with the Rationalist Community - the specific people and organizations of known rationalists. It's a lot easier to attack the Rationalist Community than it is the ideal of thinking rationally. For instance, Rationalists take stances on both sides of just about every cultural and empirical question, especially in regards to politics. It sure seems odd to claim rational thinking and then have massive disagreements on core issues.

Secondly, it's entirely reasonable to say that Alex Jones is irrational, and therefore that someone can be MORE rational. It's another thing to then claim that the rationality of the thinking involved is free from ideology, or even mostly free of it. If we look at rationality in thinking as a continuum, perhaps a 1-10 scale where Alex Jones is a 2 and perfect rationality is a 10, most rationalists might be around a 5, with some particular outliers hitting a 6. It's good to try to be more rational, but there needs to be some epistemic humility. Nobody here is Spock, let alone a perfectly rational AI.

Part of the humility has to come from a realization that not all questions have rationalist solutions or are built on rationalist understandings. Whether abortion should be allowed, for instance, is not a great test case for rationalist thinking, because both existing sides are built heavily from feelings-based understandings of what is good. You're not going to convince someone who is pro-life that killing babies is okay because X% of expectant mothers will have complications and may die, no matter how high X is. Similarly, you aren't going to convince pro-choice advocates that they should give up on abortion by showing that a baby has a beating heart or some other metric. There are simply questions that science cannot solve, because they are a different kind of question. Science cannot tell us what morality is. If we define morality prior to applying science, we can use scientific methods to determine a better approach to fulfilling that moral prior, but that still doesn't tell us anything about the prior itself.

Rationalism is not an all-encompassing approach to figuring out life. As long as we don't treat it like it is, and instead use it as one of several methods to improve our existence, that's great. I think a lot of the animosity people have for Rationality is when it's treated as a first and last solution (as if it could help create our priors and morality, and then also solve them).

Expand full comment

To add something briefly - I think Scott does a great job of having humility about his own predictions and abilities. That's a lot of the reason I have been reading his blog. There are definitely Rationalists who do not have an approach like his. There are also small-r rationalists who completely lack humility and can woefully stumble from one catastrophic plan to another without regard to the underlying failures. Scott seems to be quite aware of them, and he's posted a lot about technocratic planners in the last six months.

Expand full comment

Absolutely essential to be able to make those distinctions. Otherwise you end up with a lot of false equivalencies. People who are being abused in various ways often get told this type of thing: Neither of us is perfect! I curse and scream at you sometimes, but you sometimes load the dishwasher wrong. I cheat on you, but you bite your fork! I embezzled from our partnership, but you didn't consult me on hiring that new receptionist! We're both in the wrong. But I magnanimously forgive you for your faults and errors.

Yeah, no.

Expand full comment

When people say “everyone is biased”, they don’t mean “everyone can’t be rational”. What they mean is “when faced with a ‘rational’ decision, which correctly answered requires self sacrifice, most people will take the incorrect, or ‘irrational’, path that leads to the avoidance of personal harm.”

Further, they mean that this reality has substantial impact on how society functions, and can’t be ignored. To get large groups of people to act in ways consistent with Rationalism, you need a strong social structure, like a government or religion, that forces quite a bit of self sacrifice for the greater longer-term good. Then, you can argue over who sacrifices what, and you get politics...

In other words, the logical outcome of a large group of self-interested people (even smart ones), in the presence of resource constraints, leads to politics as we know it.

Expand full comment

The main point of this post seems straightforwardly incorrect. You're assuming that there's a unique position which is the opposite of bad, and that's what we should call good. But there may be many different positions which are all opposite-of-bad in some ways, with the choice between them being an inherently ideological decision.

Two examples. Firstly, art. There are some egregious examples of bad art that almost everyone will agree look ridiculous. And there have been many technical improvements in art over time, like understanding perspective and anatomy and so on, in a way which allowed artists to paint more realistic paintings. But it would be a mistake to extrapolate this to say that photorealistic art is "good" art in a way that's independent of ideology. In fact, there are many styles of art which are intentionally non-photorealistic; choosing which of those styles to call "good art" involves picking an ideological side.

Secondly, human welfare. We all agree that constant unwanted torture constitutes a bad life. Okay, so based on this, can we define a good life in an ideology-free way? Certainly not; there are many different opinions on what it means to live a good life, and one's personal choice between them may come down to very subjective (ideological) factors.

I'm not saying that good rationality is as inherently ideological as good art or good welfare. But the argument you're making here can't succeed without distinguishing between these three domains.

A second point: it's easy to argue that criticisms of rationality for not being totally ideology-free are falling into the fallacy of grey. But many rationalists actually do believe that (something like) bayesianism is perfectly rational in an ideology-free way, which then helps justify a specific idea of what being more rational looks like in humans. For more mainstream figures like Sam Harris, assertions that science is ideology-free play a similar role (see quote below). So, *as stated*, the criticisms you mention are actually attacking a fairly important pillar of rationalist and rationalist-adjacent beliefs. Yes, their conclusions do seem overblown; but before dismissing the criticism as irrelevant, you'll need to argue that the rationality community's beliefs about practical self-improvement don't actually rely on its claims about ideology-free "perfect rationality". I think this is *doable*, but I'm not sure it's actually been done.

From the Klein-Harris debate:

"Ezra Klein: You don’t realize when you keep saying that everybody else is thinking tribally, but you’re not, that that is our disagreement.

Sam Harris: Well, no, because I know I’m not thinking tribally —

Ezra Klein: Well, that is our disagreement."

Expand full comment

I think you need to distinguish two things. I think there is a perfectly coherent (even correct, imo) view that rationality, strictly conceived, only means following the laws of probability, while having priors that are a good fit for our universe is a different thing and needs a different name.
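(To make the distinction concrete, using standard notation: Bayes' rule, P(H|E) = P(E|H)P(H) / P(E), fully specifies how a prior P(H) must be revised in light of evidence E, but it says nothing about where P(H) comes from. An agent can apply the rule flawlessly while starting from absurd priors - the updating is perfectly coherent, and the conclusions are still wildly miscalibrated - which is why the two things arguably deserve different names.)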

Expand full comment

Rather than argue about semantics, I think it's more useful to frame language in context of, well, usefulness. It's useful to be able to describe Alex Jones as irrational. Sometimes it's also useful to acknowledge that, despite best efforts, our work will still contain bias. Both "sides" can use a straw man to make the other side look silly if semantics is all we care about, but it shouldn't be.

Expand full comment

Friend - in this regard, you are not mad or evil. In this context, it is entirely correct for you to be meritocratic and not egalitarian or cancellatory.

And why is it correct to be meritocratic? Because you are estimating on a well-defined domain (the correctness of one's predictions) with a numerical score.

Expand full comment

I think Jones is irrational, but a good analogy is probably advertising. If you came from a UFO visiting Earth, advertising would look just totally irrational. Everything is 'the best'. You should eat the same hamburger all the time. This streaming channel has everything you want. AT&T's new plan will give you near-orgasmic intensity. But we don't care, because we understand selling is a type of intellectual game where you are disconnected from rationality but there are some boundaries (i.e. you can't outright lie or make objectively false claims).

Jones might be thought of as a type of advertising. He says a shooting was a hoax, he gets clicks and sells some of his BS drink powder. He says you should buy guns in case of a random shooter. Sells more powder. Isn't it a contradiction? If shootings are a hoax why prepare for one? Doesn't matter.

Expand full comment

With all due respect, I find the issue of how rational a rationalist may be irrelevant. Why? Because it is impossible to account for every possible bias or to construct a completely perfect argument that can't be tested by rebuttal. The point of rationalism is to accept the provisional nature of our theses. What I find absolutely advantageous is, in the words of Karl Popper, the falsifiability of rationalism.

I know that any argument, or any theory, or any research is provisional. If better data is found, if better arguments are presented - based on data and its replicability - then I will change my mind. Our nemesis is dogma.

As society becomes so polarized based on what party you support irrespective of candidate or platform, and as algorithms feed each of us with more of the same, rationalism can be construed as one of the last holdouts of the principles of the Enlightenment and a safe place to discuss ideas rooted in facts. You have to be humble, you need to be open to a better set of data and explanation, you need to essentially know that your views WILL inevitably change over time.

Let's not get distracted by post-modern arguments, IMHO.

Be well, everyone.

Expand full comment

You've already admitted that no water is 100% pure H2O, so why won't you drink from my delicious lead-and-feces-and-plutonium-filled well?

Expand full comment

I think the whole essay is cheapened by using Alex Jones as an example.

I personally haven’t seen a lot of evidence for his irrationality.

In the stuff I’ve seen he is excitable, loud, hyperbolic, and working with a different set of facts.

This isn’t the same as irrationality and seems to undermine the purported rigor of the concept.

Expand full comment

I think you have missed the essence of the criticism being leveled. It is best expressed in your quotes as: "(Everyone's biased), we're just not trying to deny it like they are." To be clear, the criticism isn't really that everyone is biased; it is that many in the rationalist community do a bad job of acknowledging and modelling their own bias.

Often, the people they accuse of being irrational are just as, if not more, rational than the person doing the accusing. Not always, of course. When you accuse Alex Jones of being irrational, fair criticism. But when other "rationalists" do things like accuse progressives of being irrationally "woke", quite often the "rationalist" is the one who is being relatively more irrational and ignorant.

And in general, this belief that one is "rational" is quite dangerous. It leads to overconfidence and an inability to internalize new information. Ideally, when a rationalist writes something about some kind of cognitive bias, this should make the readers feel less confident in their positions, since some of their positions will be affected by that bias. In practice, I think it tends to make readers more confident in the positions that aren't obviously affected by that bias, while the positions that are affected just get rationalized away. This leads to a feedback loop wherein some readers, upon learning about their own and others' biases, become generally more confident in themselves instead of becoming generally more skeptical as a result of knowing about these biases.

Expand full comment

This thread reminds me a lot of Derrida's discussion of the differences between evil and true evil. I'm not sure if my phenomenological understanding of all this is helpful, but I'll float it out there anyway. I think it comes down to knowing we have "blind spots" that bias our experiences by definition. Rational decisions are attempts at moving beyond/minimizing the impact of our biases on our experiences while acknowledging the impossibility of stepping beyond them at the same time. Isn't that what statistical methodologies attempt to do with areas under the curve/confidence intervals?
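(A concrete illustration of that last point: a textbook 95% confidence interval for a mean, roughly x̄ ± 1.96·s/√n, with s the sample standard deviation and n the sample size, doesn't claim to eliminate error; it quantifies how far off we are likely to be while conceding that certainty is out of reach. The bias and noise aren't stepped beyond, just bounded and acknowledged.)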

I think there’s also an argument somewhere in there about the diachronic movement of skepsis that continual questions the ongoing validity of my current ratio experience of this as that as I move forward in my experiences of the world.

Expand full comment

Scott simplifies the notion of ranking individuals on some sort of objective rationality scale. In actuality, we dwell among pools of rationality that are sometimes overlapping and sometimes independent.

E.g. Buddhist monks who self-immolate might well be almost purely rational. Clearly, rationality is a component of one's perception, since "the self" is necessarily part of the whole equation.

So when considering bias, the question is not our overall ranking but where the blind spots are located. Blind spots are certain -- and genuinely blind; if we are incapable of viewing the world through the eyes of a Buddhist monk, then we can only guess the rules of rationality that accompany a being with such awareness.

Humility -- admitting that blind spots are inevitable, and, by definition, total -- is I think the first step toward sunshine, as well as a necessary component of rationality writ large. Reading through the comments below there's much lively discussion of fundamental components of rationality, all of which nicely illustrates the different lenses, time frames, and overall orientations of individual awareness. (So I decided to add mine!)

Expand full comment

I think the issue is people have a mental model of rationality that's strongly bimodal. Either/or. Like a barrel of wine with a drop of **** in it is really a barrel of ****, arguments with a teency bit of irrationality are still irrational. That's probably overstating things, but I would argue that there are contexts in which a tiny bit of unexamined, unconscious bias in favor of a particular point of view does indeed lead to conclusions that are way wide of the mark.

Expand full comment

Nobody can achieve perfect rationality. A potential problem is that if you define your community as being "rational" oriented, you rule out the possibility of dealing with things from a conflict rather than a mistake perspective. Conflict (accepting ideological-ness) sounds worse, but it also raises the possibility of compromise. If the other side is mistaken, there is nothing to do but for them to accept their mistakenness.

I think the rationality community is much more rational than most people. But like everyone their beliefs are in part motivated reasoning that supports their generally educated middle-upper class positions in life. The problem with treating everything as a debate is that many people would reject that this is even a debate in the first place, rather than one group trying to thrust their personal preferences upon another.

Expand full comment

What a fascinating collection of assumptions. Do you have evidence for any of them?

Expand full comment

I'm not sure "rationality" necessarily excludes conflict-theoretic frameworks, but regardless I think a conscious choice to *try* using a mistake framework is helpful for a lot of people. Conflict theory is the default mode of human thinking; it takes effort to humanize and try to understand the outgroup. And an honest debate could very well lead to realizing that you have irreconcilable value differences with your interlocutor, but starting from the latter assumption precludes any attempt at honest debate.

Expand full comment

For many years, I have described this in public with the term “reality facing”, because there are clear cut examples.

There isn’t “perfectly reality facing” or “perfectly reality denying”.

But there is George Carlin, and there is North Korea.

I want to be on the George Carlin end of the spectrum.

Expand full comment

The rationalist community seems to think it is better at being rational while others are definitely irrational. Yet it also puts Bayes' theorem on a pedestal as a model of how humans reason and behave. Now, is it rational to hold these two positions simultaneously?

Expand full comment

"Yet it also holds Bayes theorem in a pedestal of how humans reason and behave."

I think that you are confusing 'should' with 'do.'
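
For concreteness, here is what the normative "should" looks like - a minimal sketch in Python of a single Bayesian update, with toy numbers invented purely for illustration:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) via Bayes' theorem."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Invented toy numbers: a 10% prior, and evidence four times
# likelier if the hypothesis is true than if it is false.
posterior = bayes_update(prior=0.10, p_e_given_h=0.8, p_e_given_not_h=0.2)
print(f"posterior = {posterior:.3f}")  # -> posterior = 0.308
```

The point of putting the rule on a pedestal is that it describes how evidence *should* move belief, not a claim that humans natively compute this.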

Expand full comment

if you can atacc, you can also protecc

Expand full comment

Reminds me of Isaac Asimov's "Relativity of Wrong." "If you think Alex Jones is perfectly rational, you're wrong. If you think Eliezer Yudkowsky is perfectly rational, you're wrong too. But if you think that Eliezer Yudkowsky is exactly as irrational as Alex Jones, you're wronger than both of them put together."

Expand full comment

I agree with Scott's rebuttal here, but I also think it's a narrow interpretation of the critique: Not necessarily incorrect in capturing its occurrence in the wild, but I think there's maybe a more interesting perspective one could steelman it into. I'll do my best to vaguely gesture at what I mean below:

A mind can do many different things, create art, scheme, do mathematics, be sociable, spin paranoid conspiracy theories, etc. Each of us lives in their own, unique reality tunnel, which creatively uses these diverse faculties in order to construct an experiential frame through which we interpret the world and interface with other people. We are political animals and inhabiting, constructing and negotiating between these reality tunnels is the organic function of our minds.

The issue with Scott's counter here is that it interprets critiques of the choice of a rationalist reality tunnel as specific failures of rationality within the tunnel. The statement that rationality is not free from ideology is not meant to point at the idea that rationalists fail in their execution of rationality due to biases, but that the imposition of the rationalist frame is political regardless of how well rationality is performed within that frame: Even if there was a rationalist who was free from political bias in his reasoning, the fact that he has entered into and remains in the rationalist reality tunnel is already a political act. We are animals before we are thinkers, and the specific kinds of animals that we are happen to be political.

Most people would agree that Alex Jones is irrational, but that doesn't mean that people are broadly signed on to rationalism as a totalizing reality tunnel. Who is to say that the tune of rationalism is the one that we ought to play at this specific volume? Maybe we instead ought to maintain a variety of different ontological pastures that work together more organically, without being overly concerned about having an overarching framework that integrates them. That definitely seems to be how most people operate. Did anyone ever do an RCT on whether it is instrumentally rational to always aim for epistemic rationality?

To be clear, I'm on Scott's side here and generally feel that the rationalist reality tunnel ought to have more weight (Galef for president), but I think there is a perspective here worth contending with. Also, Scott, if you're reading this, please check out "High Weirdness" by Erik Davis. I think it would fit well into your previous forays into early psychedelicists, and I'd love to see the interference pattern of your and Erik's minds.

Expand full comment

> Talking about the impossibility of true rationality or objectivity might feel humble - you're admitting you can't do this difficult thing. But analyzed more carefully, it becomes really arrogant. You're admitting there are people worse than you - Alex Jones, the fossil fuel lobby, etc. You're just saying it's impossible to do better.

Most claims of arrogance are like this, where one end of the spectrum is "arrogant" and the other end is also "arrogant". When people complain about arrogance, they often mean something completely different.

I was criticized many times in school for my arrogance. In particular, the idea was that I radiated the expectation that if I knew something, everyone else would know it too.

I thought a fair bit about this and completely failed to understand how the position that anything I knew was also known to everybody else could be described as "arrogance". It's nothing other than the statement that out of all the people in the world, I'm the one who knows the very least.

Conversely, if I approached every conversation with the idea that the other party is unlikely to know anything that I know, I suspect that would do nothing to improve a reputation for arrogance. That would be "talking down".

-----

As to perception of the "rationalist community", I don't like it either. My main actual thought on the subject is that, years ago, I discovered some essays on Overcoming Bias by Eliezer Yudkowsky, and I liked them, but I didn't seek any more out. My view was that the stated goals were good but the essays were more entertainment than anything else, seeking to convince through eloquence rather than correctness. I prefer an argumentative style that sounds worse when relying on false premises than it does when relying on true premises.

Expand full comment

I get where you're coming from with your school example, but the short description you give is different from the statement that you know the least. Expecting that your knowledge is a subset of everyone else's implies that 1) anything you find interesting enough to learn must also be interesting to everyone else, and 2) when you encounter someone who doesn't know a fact that you do, they don't know something that everyone else does.

The root is a sort of humility, but (1) does seem arrogant to me, and while (2) is different the effect is related.

Expand full comment

+1. If you are surprised that your interlocutor doesn't know something, it will often make them feel dumb. It may make them feel that you think poorly of them for not knowing enough. As you say though, the reverse can cause problems too - while strong expectations that people know things can come across as something-similar-to-arrogance, strong expectations that people don't know things can come across as condescension. In part this just reflects the fact that social rules are hard. But also, perhaps try to have weaker expectations about others' states of knowledge?

Expand full comment

It's the difference between "my friend Jeff is kind" and "I am part of the Kindness Community, where we practice theoretically consistent kindness." One makes Jeff sound like a good bro, and one makes the speaker sound like an out of touch narcissist.

And no, one person being bad at a virtue doesn't mean you get to be good at it. Something something beam in your own eye something something.

Expand full comment

Arguing that there is no view-from-nowhere is no different than arguing that cognitive biases (like loss-aversion, recency-bias etc) may restrict one's ability to be fully rational. I don't think acknowledging that one's rationality is bounded implies that it's futile to try getting better within those bounds. If anything, failing to acknowledge this potential bias will increase the likelihood that you make prediction errors.
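
As a toy illustration of that last point, here is a minimal sketch in Python (all numbers invented) of a forecaster with a known systematic bias: denying the bias leaves large prediction errors, while acknowledging and modeling it shrinks them, even though the underlying noise - the bound on rationality - never goes away:

```python
import random

random.seed(0)
truths = [random.gauss(0, 1) for _ in range(10_000)]

BIAS = 0.5  # a known systematic optimism bias (invented for illustration)
raw = [t + BIAS + random.gauss(0, 0.1) for t in truths]
corrected = [f - BIAS for f in raw]  # acknowledge the bias and subtract it out

def mean_abs_error(forecasts):
    return sum(abs(f - t) for f, t in zip(forecasts, truths)) / len(truths)

print(f"denying the bias:       {mean_abs_error(raw):.3f}")        # ~0.50
print(f"acknowledging the bias: {mean_abs_error(corrected):.3f}")  # ~0.08
```

The residual error never reaches zero - the bounds stay - but trying to do better within them clearly isn't futile.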

Expand full comment

Re: isolated demand for rigor. I'm not so sure about this. I expect that a "Kindness Community" of much significance would probably get a lot of flak too. A lot of people have poor opinions of MENSA or its membership. When an ideal is elevated to a group's name or raison d'être, it's not unreasonable to take that as a claim of some accomplishment rather than just a platonic goal.

Expand full comment

I think there's an argument that different actions need to clear different bars to be working toward a good outcome: one could say that someone trying to be "objectively rational" needs to come closer to view-from-nowhere objective rationality to be doing a service rather than a disservice than someone merely trying to be "kind" or "thoughtful" does.

Expand full comment

"You're admitting there are people worse than you" - Is this supposed to say "you *aren't* admitting..."? It might be just me who got stuck on that. Great post.

Expand full comment

I think a lot of people saying "there's no true rationality/objectivity" are pushing back against a very real tendency of some people to refer to "objectivity" when what they're really talking about is something more akin to the status quo or the default assumptions of some dominant incumbent group. I doubt hardly anyone means that trying to be relatively more rational and objective is a completely pointless endeavor.

"I don't want any *politics* inserted into my football game, I just want to watch the pre-kickoff blue angel flyover sponsored by Lockheed Martin and enjoy my game in peace."

There's a related issue where people seem to confuse objectivity/rationality of process with some ineffable impossible-to-prove objectivity/rationality of people themselves. Objectivity of process would try to disclose any potential conflicts of interest in a deeply reported expose about a Democratic candidate. Objectivity of people tracks down a third cousin of the piece's author that donated to a Republican candidate 5 years ago to prove that the expose about the Democratic candidate is inherently untrustworthy, fit to be dismissed out-of-hand regardless of anything it might contain.

Trump actually said a Mexican American judge (as in, actually born in America) couldn't be objective enough to preside over a Trump University fraud case. Many years ago there was some controversy where Christian conservatives claimed that a gay judge couldn't be objective enough to preside over cases involving gay marriage, whereas presumably straight judges could.

The kind of "objectivity" sought in examples like these really seems to be about something different than just a drive to reason or think better, so it makes sense to push back by saying everyone does indeed have their biases/irrationality.

Expand full comment

At the point where you've made the assumption that rationality is the main axis of value, and not e.g. social skills or inner peace, you've made an ideological assumption, and all of your work proceeds from that assumption. Yes, it's true that some people are more rational than others. But that's not the only way to navigate human thought and behavior, and there are hidden costs to assuming it's the only or correct way in all contexts.

Expand full comment

A pretty good caveat, but I think a lot (half?) of Scott's writing is still pretty sound without that goal/assumption.

Expand full comment

This might be verging on culture war... but I think Scott is making the wrong assumptions about the people disagreeing with him on reddit.

For many, their idea of "good" is tied to their political views in the same way the baby-eating aliens tie their idea of good to baby-eating.

They will nod along with you that Alex Jones is irrational, and that that's bad.

But not because they have a general principle that being rational and avoiding political bias is good. Rather, Alex Jones is bad because, in their view, he picked the side of badness.

Since their political position is inherently linked to their concept of "good", they see you trying to be less biased by politics as trying to be less good - unless the politics you're trying to be less biased by is the politics of their opponents.

Expand full comment

I think there is a better way of considering these limitations, one that important philosophers have been grappling with for about a century now. I'm thinking most strongly of Alasdair MacIntyre's works, like "After Virtue" and "Whose Justice? Which Rationality?", neither of which argues that there is no real rationality; rather, they argue that rationality is influenced by things as basic as language and other larger contexts. Notably, though, he talks about there being "truer" or "better" whole philosophical systems, and about how, over long stretches of time, worse philosophical traditions collapse due to internal conflicts they can't resolve, while better ones continue developing along their own lines, able to handle such conflicts. They may go into decline, or thrive at different times, but they are ultimately better able to survive the closer they are to an objectively better truth, logic, rationality, etc., whether or not we in the midst of it all can properly identify the objective standard (though he does present ways of judging whether a system is better or worse, and goes into specifics of why some modern-era systems were largely set up to fail, and how that has led later ones to mistakenly conclude there was no possibility of success).

Expand full comment

This is missing an important factor. Acknowledging your bias is the only way to correct for it. Being mindful of your emotional investment or political ideology is a key skill in defusing them. Essentially, being good at being rational requires being open, not defensive or idealistic, about your weaknesses.

Expand full comment

The problem is that the "rationalist" community, and also scientists, use the word "rational" to mean something like "right reasoning", while everyone else uses "rational" to mean the same thing it's meant since before the Latin word "ratio" even existed: an epistemology which renounces empirical observation and experimentation as illusory, real numbers and infinitesimals as non-existent (in a Platonic sense), and which requires a "foundation" of unquestionable axioms from which one constructs deductive proofs to arrive at absolutely certain universal and eternal truths.

Rationalist thought /does/ always take some arbitrary foundation as its base, and everything a rationalist believes depends critically (in the technical sense) on the precise set of axioms taken as a foundation. Also, rationalists /don't know that there is any other way of thinking/. For instance, the skeptics are said to have taught that we can know nothing, whereas actually the most-famous skeptic, Sextus Empiricus (so-called because of his empiricism), taught that we can't be 100% absolutely certain of anything. The rationalist defines knowledge as deductively-proven universal eternal truths. So you can't "know" that it's raining, because it isn't raining everywhere and at all times.

This is why people keep accusing the rationalist community of being high modernists--because the high modernists WERE rationalist, and we would be like them if we were in fact rationalists.

Expand full comment

My dictionary's first entry for rational is "based on or in accordance with reason or logic". Presuming that my dictionary is somewhat representative of a not-insignificant fraction of "everyone else", this seems inconsistent with your statement.

Expand full comment

Both empiricists and rationalists agree with that statement, because both think their way of reasoning is the only way to reason. But only the rationalists are using the original definition, which "rational" and its translations have had for thousands of years. If you know that people in the humanities think that the only way to reason reliably is to use Boolean symbolic logic to deduce irrefutable truths, you should avoid the term "rational" and describe the sort of reasoning you mean in more detail.

Expand full comment

But you ignore what I believe to be the most important part of the argument: the counterproductive danger in saying my party is rational and yours is not. I have seen more than one person whose belief in his own rationality made him LESS rational, and politics is the perfect place for that.

What I hear people saying - and what I am saying - is: our knowledge of rationality is not good enough to judge politics at the frontier. We have only a very vague sense of what rational politics looks like, and claiming that you can judge it correlates with being bad at rationality.

It is better to acknowledge reality as it is: we can't judge the rationality of politics beyond very basic stuff, and there is nothing rational in denying that.

Moreover, politics corrupts, and it is important to avoid such corruption whenever you can. The rationalist community can all too easily fall prey to the political weaponizing of the word "rational". When someone confuses rationality with particular political opinions, he may be doomed.

I believe it really is better NOT to try to improve on politics. Politics is the hard difficulty setting - the sort of setting that may destroy your rationality in other areas of life. Avoid it when you can! Here lies destruction!

Expand full comment

How do you distinguish irrationality from plain stupidity, ignorance, or scamming/lying? Even when irrationality is placed on a spectrum, it still looks like a tautology or just plain ad hominem. If Alex Jones truly believes (or believed) that Sandy Hook was a hoax, then he's not irrational, he's just ignorant: he simply put the pieces of evidence together wrong. He may also be more psychologically inclined toward paranoia. That might be the "irrational" part, but it's also an evolutionary adaptation to avoid being had. Is evolution irrational?

Expand full comment