Although, if you're used to getting a lot of bad-faith attacks from random people, and so when you see a new criticism from a new person you assume it is bad faith and dismiss it immediately, that also gets into problems...
The question here isn't good faith vs bad faith, it's about what the actual claim is. If the claim is "I think you might be biased here, have you considered such-and-such", that's one thing. If it's "everything is political, therefore the project of rationality is pointless", that's another thing. The latter is quite common, and that's what this post is responding to.
Hell, even if we restrict to things more like the first -- there's a substantial difference between "I think you might be biased here" and "Everything you say is just ideology!" The difference is not whether it's *in* good faith but whether it *assumes* good faith. The former is someone trying to have an actual argument; the latter is someone trying to ignore actual argument.
But really, as mentioned, this post is responding to claims that the project of rationality is pointless due to everything being political. Perhaps you've managed to avoid places where such claims are common, but in some places they are, and that's what this is responding to. It is certainly important to take account of legitimate criticism, but that's not what this post is talking about.
In spirit the post might be responding to those types of claims, but in text it is responding solely to the sentence 'There is no such thing as "rationality" that is free from ideology,' and nothing else.
And I just don't interpret that single sentence in the way you're laying out here, especially in the context of the larger post it appears in, which compliments the rationalist community in several ways.
I read that sentence as just saying 'hey, it's impossible to not have an ideology, and the fact that you don't think you have one is causing you to fuck things up sometimes.' Especially in light of the speaker.
The pertinent comparison is "putting someone with relevant experience in charge". Note that experience or success in a challenging field imply a certain amount of raw smarts. The novel claim of rationalism is that if you have enough raw smarts, you can dispense with the experience.
Nah, they're more about dispensing with the credentialism and other established status-distributing structures if you have enough smarts. And this claim is not particularily novel, stagnant old guard being outdone by nimble upstarts is as old as history. The claim that the "rationalists" are particularily suited to do this may be novel, but then again any ambitious movement/subculture thought that of itself I'd imagine.
Sure. The relevant question then would be whether there's a reliable way for amateurs to overcome their cluelessness outside of the establishment. For what it's worth, I don't think the rationalist project has come anywhere close to demonstrating that they are better at this than anybody else.
I think this might have been a point if rationalists were overconfident. Pretty rational, but thinking they were more rational than they actually are. But given that rationalists even have exercises to overcome these biases, i.e. prediction exercises, I don't think that's true.
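(For concreteness: a prediction exercise of this sort usually just means writing down probabilistic forecasts and scoring them after the fact. A minimal sketch, with hypothetical forecasts and the standard Brier score, might look like this:)

```python
# Minimal sketch of a calibration/prediction exercise.
# The forecasts below are purely hypothetical; the Brier score is the
# standard proper scoring rule for probabilistic yes/no predictions.

predictions = [
    # (probability assigned to "yes", did it actually happen?)
    (0.9, True),
    (0.7, False),
    (0.6, True),
    (0.8, True),
    (0.3, False),
]

brier = sum((p - float(happened)) ** 2 for p, happened in predictions) / len(predictions)
print(f"Brier score: {brier:.3f}")  # 0.0 = perfect; 0.25 = always answering 50/50
```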
Ah, but most scaremongering about the Rationalists doesn't focus on the idea that they're bad at predictions, though — except in the field of A.I., where the pull of "oh come on, evil A.I.s are a thing from sci-fi, you can't be serious" remains inescapable.
Mostly people are scared of Rationalists as having bad *values* — at holding objectionable non-empirically-falsifiable beliefs about e.g. what it is or isn't moral to fund, or promote, or even say in public — and that their dazzling but morally neutral skill at predicting market fluctuations is suckering the average joe into thinking that they also must know what they're talking about in moral matters.
And that's much less of a self-fixing problem, assuming you treat it as a problem. Rationalists are the kind of people to whom the idea of "self-introspection to formalise your innermost moral intuitions into consistent utility functions" makes sense. You can easily convince a good Rationalist that they're less competent at achieving their goals than they think they are; but I think it is *harder* to persuade a Rationalist that their ultimate aims are *wrong* compared to an average person. And to put it crudely, people aren't worried about powerful Rationalists being bad at predicting market crashes, they're worried about them being heartless eugenicists set on euthanising all puppies to increase productivity by 0.37%.
The “you have to make mistakes in order to learn” thing is another divider. From what I can tell so far rationalists do not worship the almighty mistake as the one true gate to knowledge. The landscape of imprecision, inaccuracy and wrongness is much more subtle. I am drawn to this community because my approach to the world is that I am constantly some amount of wrong and learning to handle that and break it down and question it and minimize it sometimes yields something that is, well, less wrong (I have not yet read that blog but the title is great.)

The you-have-to-accept-your-mistakes “growth” demand operates differently for different philosophies. A mistake seems like a contained event to some people and then they have to address “it.” Then they think an unwillingness to wallow in the experience of being “wrong” is somehow an assertion that one is never wrong. Or that wrongness is where personality and heart live. Which resonates for some - but it makes me think they project their ideas. That cannot be the only legitimate description of personality. If heart can only be revealed in obvious, contained mistakes, that is an impoverished sense of heart!

The fear of rationality might not lie in what it is, but in some people’s need not to lose all the tropes they have attached to irrationality (yes, puppies, daydreaming, love, et cetera)
Those puppies are great to illustrate the point. Reality is much more disappointing. Try (obviously calmly and respectfully) arguing that price gouging can be good, or one of many other seemingly low-controversy views. That's enough to be seen as truly disgusting in many, especially left-leaning, communities.
"but I think it is *harder* to persuade a Rationalist that their ultimate aims are *wrong* compared to an average person."
Can you explain your reasoning? One data point against your position is that religion is much less common among Rationalists, and I think that it's clear that religious people are harder to persuade that their ultimate aims are wrong.
A Rationalist is not someone who has accepted an ideology because they have been brainwashed into it from infancy. They are someone who has been persuaded through rational argument to adopt particular views. So why would it be difficult to persuade them to adopt different ones?
If your values are such that you don't think that you can persuade a Rationalist that your values are good, what does that say about your values?
I suppose we have slightly different models of how morality works.
The way I see it, by the time they reach adulthood, people have moral intuitions hardwired into their brains, which we might term their "utility function" or "terminal values". I'm unsure of the extent to which they are inborn, or shaped in early childhood, but they can *functionally* be treated as innate and immutable; this makes them akin to one's sexuality or sense of gender. Also like those things, they differ from person to person, although most people hold values that yield sufficiently similar results in day-to-day life that the illusion of something like a "universal human morality" can be maintained.
The rationalist, or rationalist-adjacent person, will do some self-analysis and figure out precisely what moral intuitions they *actually* feel deep down, as opposed to what it is culturally acceptable to view as moral. This results in things like Effective Altruists who puzzle out their true feeling of "I want to stop as many people as possible from suffering/dying, even when it makes me do things society views as weird" from social norms like "you only have a duty to people you know personally", and then take rational action to further those aims.
A religious person believes that X is moral because Y holy text says so. If they can be persuaded that the holy text is probably woo, they may change their mind about the sanctity of those values. A Rationalist, meanwhile, holds moral values because after all due introspection they have formalised these principles from their own, individual moral feelings, irrespective of social norms; and they don't think that there is some objective morality that trumps their individual values.
You can in theory convince a Catholic to stop thinking premarital sex is wrong if you manage to convince them that God doesn't exist. But if a Rationalist thinks, for example, that animal suffering has no moral standing, you cannot *change their minds* about this; either they feel a moral twinge when an animal is hurting, or they don't, and you can no more change their mind about this than reason with the Paperclip Machine that it "oughtn't" to value paperclips.
You presume too strongly the existence of the rationalistic, or machine-like, Catholic who perfectly maps their morals onto the holy book. You can interpret the Bible in a million ways, to justify most moral positions; people have done so for a long time. What makes you think the (equally intelligent and introspective) Catholic is any different, practically, from the rationalist, here?
The rationalist introspects, plumbs their moral intuitions, and builds rationalizations for them post-hoc in the language of modernist progressive-speak.
The Catholic introspects, plumbs their moral intuitions, and builds rationalizations for them using a post-hoc interpretation of the Bible.
Or for example- why not the rationalist who realizes that pretending to care about people half the world away was just something they did to show off to their bay area friends, and, on introspection, realizes that they really do care more, in a fundamental moral sense, about their family and close friends, and thus decides to stop giving to GiveWell?
(Unless it ends up being expedient for them- people across the world doing better and building real wealth faster is a boon to you and yours ceteris paribus, so it might be self/kin-serving to donate x% to GiveWell conditional on enough other people doing the same, to make sure you're not a sucker; alternatively, it might be worth doing just to look good, thereby attracting a bay area mate (although probably not, if the stories of the bay-area rationalist dating scene are anything to go by.))
"Signaling to their bay area friends" is a big feature of the Rationalist community that I feel they don't want to admit. EY said something like he "doesn't understand social status" - which I find it *incredibly* hard to believe.
More likely - Rationalists are a part of a group that has an entirely different set of status signals from the "mainstream" population, and they are confusing "not paying attention to mainstream status indicators and instead paying attention to the status signals of my weird bay area friends (that occasionally overlap with class)" with "we aren't worried about status".
I still feel there is a difference between the Catholic who believes prima facie in an objective morality ordained by God, and the atheistic Rationalist for whom whatever moral intuitions can be teased out of the wet tissues in our skulls really is all there is. Even if (as you say) most will come to comfortable agreements between the basics of religious morality and their own feelings, a Catholic can theoretically come to believe that their own moral intuitions are "wrong"; if the Pope outright said they were, for example, then they would view the situation as "I am a flawed human and my moral intuitions are flawed; I should follow what the Pope says, not my own gut feeling".
Also:
“why not the rationalist who realizes that pretending to care about people half the world away was just something they did to show off to their bay area friends, and, on introspection, realizes that they really do care more, in a fundamental moral sense, about their family and close friends, and thus decides to stop giving to GiveWell?”
Well, why not indeed? If there are indeed people who truly feel nothing for the deaths of strangers, then the Rationalists are one of the only movements where I could expect them to possibly be open about that fact. Even then the social pressure remains enormous (as it would be for ordinary people: for a non-Rationalist it is socially odd to give to GiveWell, but it would *also* be a bad idea to go about trumpeting that you *truly* don't care about the suffering of sick African children at all), but that is a function of the Rationalist community not being perfectly rational.
(And FWIW I have enough faith in human nature that I think people who really don't care about strangers are a small minority. Probably most people care *more* about their loved ones, but have "innocent human beings I don't know" not far behind, and "animals that seem like they have qualia" a little further behind.)
I think everyone in this conversation is making a mistake by putting 'rational' and 'has an ideology' at different points on the same scale.
That's not really how this works, at least from the point of view of the people lobbing these criticisms. Ideology and rationality are separate measures, and they can both influence your behavior in their own ways. Everyone has an ideology, just like everyone has a utility function; that's not a problem in and of itself.
But if you blind yourself to your own ideology or convince yourself you don't have one, then you're going to have a poorly developed ideology that might cause a lot of problems. You can interrogate and improve your ideology in the same way we often try to interrogate and improve our rationality.
I agree with this! It seems the issue they have is rather that the concept of "pure rationality" as an achievable state of mind seems unattainable and something like "90% rational" makes no sense. I think to the critics it sounds a bit like religious zealots, who try to get ever closer to God/Jesus/Mohammed, without realizing that this is fundamentally impossible.
You can easily get closer to Jesus. Literally try to recreate the arc of Jesus' life in your own. It was a popular thing to do in the Middle ages. Start out small, go on a spirit quest, get martyred. 'In spirit' optional.
You could get closer to Mohammed in the same sense. Eliezer Yudkowsky is doing this in spirit.
Troubled childhood, check.
A great worry about the end of the world, and a sense of duty to spread the word about it so people can save their mortal souls, thinking this the primary duty of his work? Check.
Good news for those who accept the word of Allah and turn away from evil? Yes; the promise of the children's children in HPMOR.
He's got his Bay Area polyamory harem; multiple women/'wives', some ethically dubious according to 21st century social mores, check.
We've reached the present. Here's what's to come:
Emigration from his city of residence, along with followers, to act as an arbitrator in Medina? The exodus of the rationalists from the Bay Area Metropolis to Peter Thiel's artificial island of floating venture capital, gambling, and animal tests.
The subsequent confiscation of the property of converts left behind in Mecca? Obviously a reference to rising US taxes and future confiscation of land and goods from emigrees to BitStates.
The subsequent wars of conversion by the sword and Muhadkowsky's ascendance to heaven is also obvious to predict.
Hehe, true. My point was more that while some religious people do exactly that, they are often regarded as fatuous zealots by their co-religionists who find it presumptuous to imagine one can be like Jesus/Mohammed/... . This was the analogy I was trying to make. There is nothing wrong with embracing the idea of rationality (like there is nothing wrong with embracing Jesus' teachings etc.) but assuming you can somehow become like them/him might be seen as fundamentally flawed.
I fully agree with the tenor of Scott's post though, I should say!
Rationalism *is* an ideology. "Ideology" simply refers to how one thinks about the world. It's acquired a negative connotation of close-mindedness, bigotry, bad faith, motivated reasoning, etc., but that's not part of the denotative definition.
Are denotative definitions important? If the word connotes all those bad things, and we seek to avoid all those bad things, then shouldn't we seek to avoid the word too?
It would be if he actually was, although I can't help thinking that this complaint is ten years out of date. Yudkowsky blogs very rarely these days and the idea that he's wrong about a lot of stuff is, I think, very widespread among modern Rationalists.
Of course, that's not very useful because people are already moving on to the much more topical “*Scott Alexander* is their king and he's eeevil”.
While there are monarchists that post in Rationalist spaces, Rationalism itself is rather anti-monarchy. So this criticism relies on a fundamental failure to model Rationalists, and simply shows that the critic is too caught up in their own ideology to evaluate Rationalism outside of it. It's like the Christians who treat Jesus as God, and then criticize atheists based on what Dawkins or Darwin or whoever did as if atheists likewise worship those people.
Without necessarily disagreeing with the general point, I would be way more comfortable with that last line if those exercises were more common in actual practice. There is no law of thought that one gets the benefit merely by being aware of their existence, or even that the effect of mere awareness is neutral.
I certainly don't see any evidence in the real world that the rationality community is very powerful. Not even within the confines of Silicon Valley.
I'm thinking, though, that the best way to put myself in the shoes of the "anti-rational-community-community" is to imagine myself in an alternative universe where there's a bunch of people who go around calling themselves the "Rationality Community" but who also constantly spout a bunch of stuff that I find to be both irrational and awful.
I've lived and worked in Silicon Valley for the last 23 years, and excuse me while I wax cynical about what passes for a rationalist in Silicon Valley! Scratch a Valley rationalist, and you'll likely find layers of unsubstantiated critical assumptions, beliefs, and misinformation, all wrapped around an iron spike of logical empiricism that was pounded into their brains during college and grad school. Granted, many don't define themselves as rationalists, per se, but I think that's only because most of them define themselves by their problem-solving ability (which is usually excellent) or their IQs (which tend to the right side of the bell curve). If you were to offer them a choice between the labels of rational and irrational, they'd all latch on to the rational label and cling to it for dear life.
Of course, we are bundles of beliefs, but a STEM background frequently focuses one's analytical skills on the practical — rather than on asking abstract questions like, "how do I separate my beliefs from my knowledge?" I would guess that a large portion of our beliefs (i.e. preconceived notions) are unfalsifiable — meaning no experiment can be designed that could disprove them. Plus we have other beliefs which, if we took the time, we could run experiments to falsify, but we're too lazy to design or execute those experiments.

For instance, unless your parents disclosed that you were adopted, you cannot really know if your parents are your biological parents. Most of us believe that our parents are our biological parents. The question would have largely been unfalsifiable until the advent of blood typing and then genetic testing. Nowadays, biological parenthood is easily falsifiable. So it has transitioned from the unfalsifiable category of belief into a falsifiable category of belief. But how many of us demand swabs from our mothers and fathers to compare to ourselves? Some of us have, and some of us were surprised by the results. But I digress.

So, I would humbly suggest that anyone who claims to be a rationalist must be willing to question their preconceived notions — moreover, they must be comfortable with accepting provisional answers, but not cling to those answers if evidence is raised against them. Also, I would say that rationalism requires a certain amount of imagination that allows the rationalist to think outside the rut of consensus.
(Full disclosure: I am unapologetically a Popperian. And I'll admit that falsifiability is itself a methodological belief, one I use as my preferred tool for separating science from opinion, facts from belief. I've tested this methodological belief to death, but I haven't been able to disprove it yet.)
Successful artists usually undergo years of studies and training, which I would imagine are mostly based on analysis of what makes good/impactful art. As for the virtue of humility, IMO one of Yudkowsky's better articles is on this topic:
Mystics aren't interested in a "rational" explanations of the world — although each of the mystical traditions that I'm aware of have a praxis which if followed will yield consistent results for its practitioners (or so it's claimed).
I'd say that they are not so much uninterested as instead consider "true knowledge" inaccessible to the intellect, which would indeed count as irrational. It's curious though how eagerly rationalism has been adopting those praxes when stripped of their supernatural ornaments. Apparently enlightenment and such is not only compatible with intellect, it even increases its power!
There's something to what you say. The early scientific inquiries of the Enlightenment were an offshoot of an esoteric revival in the Western world from the 12th through the 15th centuries. Obviously, the practice of alchemy was pre-scientific, but alchemy transformed itself into chemistry when placed into a framework of hypothesis and experimental verification. Alchemists were big on theory, and they were big on experiments, but, as far as I can tell, they never figured out how to narrow down theory into hypotheses that could be tested. And neo-Pythagorean mysticism infused 15th and 16th Century astronomy — Kepler was a mystic (and his mother was put on trial for witchcraft, but found innocent).
Full disclosure, I wouldn't consider myself to be a rationalist. I'm a mystic. Although, I have a firm grounding in the sciences (advanced degree in BioBehavioral sciences, with lots of genetics, evolutionary theory, and a heavy grounding in statistics, and a lot of work with the statistical analysis of large data sets) — but I actually come to rationalism from mysticism.
Having followed the praxes of several esoteric teachings (which all produced results for me), using a reductionist approach, I'm trying to synthesize what I've learned. The trouble with the "woo-woo" stuff is it isn't reproducible in a controlled setting where a neurologist can attach electrodes to your head or run you through a PET scan to see what your brain is doing. But if you follow the script of the traditions that I worked in — and some of the practices took years of self-discipline, following rituals that may seem to make no sense — I was able to get the results that were described by my teachers and that are described in the mystical literature. Being raised by scientifically literate atheists, it's natural for me to come back to my roots and use a rationalist paradigm for categorizing and analyzing what I experienced — not in the sense of "these are delusional experiences that will display certain glucose uptake patterns on a PET scan" — but rather, "Can I develop a unified theory of what people have to go through to have a mystical experience?"
Or maybe I'm deluding myself. Who knows. But it's been interesting.
"the rationality community is very powerful (because it contains a bunch of silicon valley people)"
Not a rhetorical question: what is this claim actually based on, though? I've heard that some rationalist blogs are popular among Silicon Valley people, but I don't know of any Silicon Valley people who actually consider themselves part of the rationalist community.
Hard to say what the exact influence is, but it's not negligible, considering the pushback that the New York Times got due to the whole "doxxing article" issue?
"Alexander’s appeal elicited an instant reaction from members of the local intelligentsia in Silicon Valley and its satellite principalities. Within a few days, a petition collected more than six thousand signatories, including the cognitive psychologist Steven Pinker, the economist Tyler Cowen, the social psychologist Jonathan Haidt, the cryptocurrency oracle Vitalik Buterin, the quantum physicist David Deutsch, the philosopher Peter Singer, and the OpenAI C.E.O. Sam Altman."
"Much of the support Alexander received was motivated simply by a love for his writing. The blogger Scott Aaronson, a professor of computer science at the University of Texas at Austin, wrote, “In my view, for SSC to be permanently deleted would be an intellectual loss on the scale of, let’s say, John Stuart Mill or Mark Twain burning their collected works.” Other responses seemed unwarranted by the matter at hand. Alexander had not named the reporter in question, but the former venture capitalist and cryptocurrency enthusiast Balaji Srinivasan, who has a quarrelsome Twitter personality, tweeted—some three hours after the post appeared, at 2:33 A.M. in San Francisco—that this example of “journalism as the non-consensual invasion of privacy for profit” was courtesy of Cade Metz, a technology writer ordinarily given over to enthusiastic stories on the subject of artificial intelligence. Alexander’s plea for civility went unheeded, and Metz and his editor were flooded with angry messages. In another tweet, Srinivasan turned to address Silicon Valley investors, entrepreneurs, and C.E.O.s: “The New York Times tried to doxx Scott Alexander for clicks. Just unsubscribing won’t change much. They can afford it. What will is freezing them out. By RTing #ghostnyt you commit to not talking to NYT reporters or giving them quotes. Go direct if you have something to say.”
Other prominent figures in Silicon Valley, including Paul Graham, the co-founder of the foremost startup incubator, Y Combinator, followed suit. Graham did not expect, as many seemed to, that the article would prove to be a “hit piece,” he wrote. “It’s revealing that so many worry it will be, though. Few would have 10 years ago. But it’s a more dangerous time for ideas now than 10 years ago, and the NYT is also less to be trusted.” This atmosphere of danger and mistrust gave rise to a spate of conspiracy theories: Alexander was being “doxxed” or “cancelled” because of his support for a Michigan State professor accused of racism, or because he’d recently written a post about his dislike for paywalls, or because the Times was simply afraid of the independent power of the proudly heterodox Slate Star Codex cohort."
Yeah, I read that, I meant after the actual "doxxing" took place. It would seem that doing something should be worse than threatening to do it, but it looks like nobody really took notice. Or has the "war" between the media and SV become something to be taken for granted in the meantime?
Well, as Scott himself admitted, he basically Streisand-effected himself, so to properly answer your second question you would have to see what happened in a hypothetical parallel universe where he kept blogging and didn't radically change his life.
Elon Musk is definitely at least Rationalist-adjacent. Y-Combinator probably fits in that as well. And if you're in the Rationalist community and don't know any Silicon Valley people who consider themselves part of the Rationalist community, then my conclusion is that you don't know many Silicon Valley people. David Friedman is at least geographically Silicon Valley, and it's quite common for people in the SF Bay Area Rationalist community to work for tech companies. There have been Less Wrong meetups on the Google campus.
I think the problem is that people are disposed to SEVERELY overestimate the level of certainty in scientific conclusions, and indeed fact claims about the world in general, and SEVERELY underestimate the amount of dependence on assumptions, interpretations, and theories involved in constructing such claims. If you look carefully at how science is done (starting with e.g. the replication issues in everything from biology to psychology), how academic and medical peer review and publishing works, and how many assumptions are needed to construct narrative claims in e.g. journalism, it becomes reasonable that a very large number of "facts" people pound the table about and scream are true are in fact highly contestable and provisional. Yes, there is good and bad reasoning but many fewer controversies can be definitively established by good reasoning than is often claimed.
First, I would claim that the resolution of a controversy doesn’t necessarily require arriving at the truth. This is a mix between “if it quacks like a duck” and the fact that science has been wrong a lot historically, while still reaching consensus (I’m speaking historical science, not as a critique of modern science).
In other words, I think using “Everyone basically agrees on one point of view” is a pretty good definition for resolving a controversy.
There are two clear ways to do this. Firstly, you use logic and evidence to convince people. Secondly, you torture/threaten/kill everyone who doesn’t agree to you until there’s no one left to disagree.
This resolves the controversy, and historically has varying degrees of efficacy, but not in the way most people would consider “the right way”
Because I forgot to include this I’m gonna reply to myself: waiting for people to die off and replacing them is a longstanding way that science progresses (from my understanding of it). Obviously, sometimes you convince people they’re wrong, but sometimes you can’t and you just wait for the old guard to die out. This is a more common way of resolving the controversy without winning the debate
Well, in a certain sense it requires winning the debate to the audience of younger people. You didn't convince the people who had the old positions to change their minds, but you successfully convinced everyone else to adopt your view instead of that of the old guard.
There's reasoning as in cutting edge statistical techniques to tease subtle conclusions out of noisy data and there's reasoning as in "hot damn check out this chart" (the canonical version of the latter being vaccine efficacy data).
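(For what it's worth, the "check out this chart" kind of evidence really does need almost no statistical machinery. A toy calculation with illustrative, made-up trial numbers:)

```python
# Toy vaccine-efficacy calculation with illustrative, made-up numbers,
# just to show how little machinery the "check out this chart" style of
# evidence requires: efficacy = 1 - (attack rate vaccinated / attack rate placebo).

cases_vaccinated, n_vaccinated = 8, 20_000    # hypothetical trial arm
cases_placebo, n_placebo = 160, 20_000        # hypothetical trial arm

attack_rate_vax = cases_vaccinated / n_vaccinated
attack_rate_placebo = cases_placebo / n_placebo

efficacy = 1 - attack_rate_vax / attack_rate_placebo
print(f"Estimated efficacy: {efficacy:.0%}")  # 95% with these made-up numbers
```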
Nonsense. Name 10 facts about the natural world that are generally held to be true that are, nevertheless, "highly contestable and provisional." Most people who say this can't even name one without wandering off into social "science" or pop science cultism or reports about science in the popular press, which are normally so distorted as to be almost unrecognizable.
The truth is exactly the contrary of your opening sentence. Almost everything well understood in science to be true* has a level of empirical support and grounding in facts that dwarfs what the canonical hominid requires for "truth." Most people severely *underestimate* how certain almost all science is, and even in the best cases seize upon the tiny little bits here and there at the frontier where the debate is ongoing and mistake "ongoing debate" for evidence that nobody knows anything, or that accepted "truth" is being violently overturned -- which is silly when it's not dishonest.
---------
* Using the shorthand "true" here which is understood by empirical science to be actually equivalent to "not yet proven false despite a large battery of strong tests thrown at it."
which are the canonical texts?
as far as canons go, this one seems pretty willing to admit its flaws
Really? Can you find one example of Yudkowsky publicly admitting to being wrong about something? (Hopefully something consequential, but I'll accept more mundane examples.)
I hope the answer is yes; I admit I haven't done a full examination, but from what I've seen, he is brashly arrogant and hates to admit mistakes.
Five seconds of searching: https://twitter.com/ESYudkowsky/status/1238935900391755776
yeah, i think 'purveyor of rationality' is one of the few areas where never admitting you are wrong may be worse for your reputation.
To me, this reads as "I asked the wrong question", not "I got a wrong answer." If anything, it strikes me as ironic that this is the best he thinks he can do in admitting something he was "personally wrong about". I don't think this counts.
https://www.readthesequences.com/
second sentence
These aren't admitting being wrong about something. They're all tactical errors, not errors of rationality; "I was wrong to focus on...", not "I was wrong about..."
So, going in order, "I should have been more practical;" "I should have organized better;" "I was too rude." Same with the tweet Robert linked! "I asked the wrong question here," not "I got the wrong answer."
I hope it doesn't seem like I'm moving the goalposts here. But this isn't what I was looking for. I'll reiterate my question - what is something he's admitted being wrong *about*?
An example I distinctly remember is him saying he was wrong about quantitative easing (as in: he was worried it would cause massive inflation, it didn't), but I'm having trouble finding a place where he states this strongly. This LW post[0] mentions this, but not with very strong language ("I had some credence in inflationary worries before that experience, but not afterward..."); I do recall him stating it in stronger terms elsewhere though.
I can't find it right now, but I also remember him distinctly saying he was wrong to say that Bush and Gore seemed about the same, although that one's maybe somewhat questionable as it's a statement about a counterfactual.
[0] https://www.lesswrong.com/posts/LNKh22Crr5ujT85YM/after-critical-event-w-happens-they-still-won-t-believe-you
Literally the preface to the sequences. Literally a list of five times he was wrong. That is the *first thing* he wants people to know when they start reading his work.
How entertaining this space would be, if Yud was really the bumbling hypocrite people like to think of him as, instead of just a smart guy guilty of being both ambitious and socially awkward at the same time.
I think that's an overly liberal interpretation of religion. Rationality at its best does not pretend to have all the answers to life's mysteries (as might a religion) but rather tries to approach reality as it is without the inherent biased human interpretations. Although at its worst, rationality sometimes fails to live up to that standard, that does not mean the pursuit itself is inherently undermined by the same pitfalls of religions.
A better metaphor may be that rationality is an art form: Up for some interpretation, exists in many variations, but has some standards of quality that indicate how "good" it is. But of course everyone can express themselves in their own art and through practice become more accomplished.
https://slatestarcodex.com/2015/03/25/is-everything-a-religion/
Who are the dead prophets of rationality?
"Who Are The Dead Prophets of Rationality?" seems like the perfect title for a best-seller by a Deepak Chopra kind of person.
Or it could be a great article in Rolling Stone, if someone would only have the wit to name their new band Dead Prophets of Rationality.
😁
Are there movements/communities/whatever-you-want-to-call-rationality about which an outsider *couldn't* make those claims?
This is a serious point that often gets dismissed out of hand by rationalists. Just because you have a framework for being good doesn't mean you are better at it than people with looser frameworks, so long as they're making an effort. If you're defending a particular approach to rationality, it's not good if your defense can be transposed to defend any religion as well: "Charles Manson wouldn't have done his thing if he believed in God, so you need to be a Catholic". In fact, demonstrating the ineffectiveness of "credentials from striving" in other areas of expertise is a common refrain of people like Tetlock, beloved by rationalists.
I do think rationalists are unusually good at getting things more right than others, but this is not just a function of striving, but rather a weird and illegible historical phenomenon of an unusually successful movement like the Cambridge Apostles. I think what goes underestimated and underemphasized about rationalism (and maybe should stay that way to avoid legibility) is how much of the success and insight of the rationality community is due to the accident of the people in it, including Scott himself, being decent, intelligent and brave (though not agreeable) beyond the level of your average academic community.
It's not particularly surprising, considering that today's academic communities have some pretty bad incentives..?
What do you think the word religion means if you believe that having a set of texts that prescribe evidence-based standards for something, having people who advocate for specific non-supernatural standards, and advocating for seriously considering ideas regardless of how stupidly their most visible proponents present them all count as exclusionary religious behaviour?
Do you also believe biology and every other empirical science is a religion?
If you asked me to justify my beliefs without mentioning any rationalist individuals, rationalist websites, or rationalist-adjacent books, I would. No leaders, no followers, nothing is sacred.
Do I pass your test?
I think asking for uncorrelation is getting ahead of ourselves; if a source of information is *good*, then you want your beliefs to be correlated with it. If it's bad, then you want to be uncorrelated. We'd have to first answer "Are the rationalists right about things?" before we decide how much our beliefs should correlate.
But then we're really just focusing on having correct beliefs... and that pretty much makes us rationalists, whether or not we associate with a group.
Taboo "canonical", "prophet", and "adherent" and then try explaining your argument again.
It seems that you are using a nonstandard or non-literal definition of the terms, since rationalists clearly do not have, for example, any figures who are literally seen as delivering messages from God, which is the essential characteristic of a religious prophet as conventionally defined. I can't follow your argument because I don't know what these nonstandard definitions you're using are.
"The spirit of using the term is to define leaders who organically gain popularity, without term limits, are revered, and are exemplars of the group identity."
Since the rationalist "leaders" don't actually hold any official position, they can't be elected or have term limits, so that point is nonsensical. Take that away, and it seems your definition of "prophet" is just "a person whom a lot of people respect and pay attention to". Such figures exist in any community and referring to them as "prophets" or "hallmarks of religion" frankly comes off as a deliberate attempt to obfuscate the actual facts of the matter.
"Do you also think "canonical" and "adherent" are similarly confusing? They seem to me applicable in the standard sense."
"Canonical texts" usually implies a body of work that is venerated because of its connection to God or the gods. I'm pretty sure rationalists don't have any of these
"Adherent" in the standard sense simply means a supporter of something. So if I take it in the standard sense, your argument is that the rationalist community is like a religion because some people support it, and because there are some people in it who are widely respected. To call this a wildly insufficient definition of religion would be an understatement.
In case you're unaware, 'taboo' is a reference to this https://www.lesswrong.com/tag/rationalist-taboo
"From an outside perspective, I think the rationality community has many of the hallmarks of a religion: canonical texts"
there's something that was written
"prophets (living and dead)"
by someone
"adherents"
That other people find useful
"in/out group identity"
and if they find someone else who found it useful, they view that as one commonality.
This strikes me as a "You know who else used forks? Hitler" sort of argument.
"canonical texts"
things were written
"prophets"
by people
"adherents"
that people found useful
"in/out group identity"
and when they found other people who found them useful, they considered that one commonality
This strikes as a "You know who else used forks? Hitler!" sort of argument.
> throwing out systems of beliefs and their adherents and proponents must always be considered as an avenue to correct irrational beliefs and knowledge. To be right often, one must change one's mind often
You seem to be saying that while false beliefs can be integrated with each other and built into systems, somehow this is impossible for true beliefs?
Should we expect the opposite, since, even if we don't know which beliefs are true, we know that all true beliefs are necessarily consistent with each other?
So what you're saying is... Tsuyoku Naritai!
I've only occasionally come across anti-rational-community sentiment online, but the way I've always interpreted the criticism is something like 'the rationality community is very powerful (because it contains a bunch of silicon valley people) and that even if the rationality community is 90% as rational as they think they are, that 10% mismatch between perception and reality is enough to do a lot of damage'.
I liken it to criticisms of the church with respect to compassion etc: yes the implicit goal of the church is to be more compassionate than average, but if you're only 90% as compassionate as you actually think you are, and you wield enormous power, that leaves lots of room for damage.
I don't agree with these criticisms when directed at the rationality community but I understand the logic.
Imperfectly rational people can do damage, but the question is how much damage compared to some counterfactual. Compared to putting ideologues in charge? Compared to nobody being in charge?
Imperfect rationalists are the best thing we have.
Imperfect rationalists are also more likely than other groups to cede power if they discover something else is actually better also. I think, were this not true, there might be an argument to be made against them.
Keep in mind that you just replied to an imperfect rationalist who (in the comment you replied to) made the argument that an imperfect rationalist is the best possible person to be in charge.
Hypothetically, if you were in a position where you had to make such a person cede power, how would you do it?
You seem to be saying there is some contradiction, but there is no contradiction between saying that the *group* of imperfect rationalists is the best we have, and saying that a particular imperfect rationalist may not be the best we have. The answer of how to get someone like Trevor Blackwell to cede power is quite obvious: convince them that there is an imperfect rationalist that is better. Or convince them that their previous belief that imperfect rationalists are the best thing we have is false.
I am pointing out what I think is a complication. I think the step encoded in "convince them" hides an extremely complex operation that might be entirely imaginary.
In order to convince someone of something, you need a basis. Some set of things you agree on (at minimum, you need to understand and agree on what constitutes evidence). How does this work when someone in power needs to be convinced by someone without power to cede some of it?
If by "convince", you mean the usual social/political/financial/military struggles that mark changes in leadership, sure, I can picture that. But if we're talking about what I _think_ we're talking about, some way of marshalling a logical argument so convincing that the target relents, then I'm not sure there is such a thing. Even if it were, depending on the amount of power on the line, we might be in the territory where epistemic learned helplessness becomes relevant. If the person in power is an "imperfect rationalist", the fact that they're _in_ power means they're very likely to have the prior "I am the best person to be in power right now", and because they're an imperfect rationalist, that might be incorrect. What kind of proof would they accept? Even from a fellow imperfect rationalist? How about someone who isn't?
"But if we're talking about what I _think_ we're talking about, some way of marshalling a logical argument so convincing that the target relents, then I'm not sure there is such a thing."
Washington stepped down. Even Pinochet stepped down. When it came down to explicitly calling for a coup to stay in power, Trump backed down.
"Even if it were, depending on the amount of power on the line, we might be in the territory where epistemic learned helplessness becomes relevant."
I'm not clear what you mean by "epistemic learned helplessness".
"If the person in power is an "imperfect rationalist", the fact that they're _in_ power means they're very likely to have the prior "I am the best person to be in power right now""
If they're a good rationalist, they would likely have the position "of the people in a position to take power when I did, I was the best person". Now that they have taken power, the set of people who can take power has enlarged (because they can give power to that person).
Yes, but imperfect rationalists who recognize and try to improve on their imperfections are much better than imperfect rationalists who react negatively to anyone trying to point out one of their imperfections and avoid learning from the criticism.
I don't understand what this is in reference to.
The topic of this post.
If you know that you're not a perfect rationalist and someone says to you 'hey I've noticed you have an ideology and I think it's causing some problems', then a smart thing to do is ask them what problems they see and consider whether they have a point or not.
If someone says they've noticed some problems with your ideology and you respond with 'how dare you accuse me of having an ideology, you're just trying to dismiss me and lump me in with Alex Jones and say all truth is relative and being rational is pointless, well I disagree with that so go away', then you may be missing out on some important feedback that might have helped you improve your rationality.
Of course, that depends on the person offering the criticism acting in good faith. What I'm responding to here is an anonymous, paraphrased quote taken from somewhere on reddit, so I'm interpreting it charitably in the absence of a reason not to. If there's a hidden context that the criticism is coming from someone with a history of bad-faith attacks and this is part of that history, then dismissing them makes much more sense.
Although, if you're used to getting a lot of bad-faith attacks from random people, and so when you see a new criticism from a new person you assume it is bad faith and dismiss it immediately, that also gets into problems...
The question here isn't good faith vs bad faith, it's about what the actual claim is. If the claim is "I think you might be biased here, have you considered such-and-such", that's one thing. If it's "everything is political, therefore the project of rationality is pointless", that's another thing. The latter is quite common and that's what this post is responding to.
Hell, even if we restrict to things more like the first -- there's a substantial difference between "I think you might be biased here" and "Everything you say is just ideology!" The difference is not whether it's *in* good faith but whether it *assumes* good faith. The former is someone trying to have an actual argument; the latter is someone trying to ignore actual argument.
But really, as mentioned, this post is responding to claims that the project of rationality is pointless due to everything being political. Perhaps you've managed to avoid places where such claims are common, but in some places they are, and that's what this post is responding to. It is certainly important to take account of legitimate criticism, but that's not what this post is talking about.
In spirit the post might be responding to those types of claims, but in text it is responding solely to the sentence 'There is no such thing as "rationality" that is free from ideology,' and nothing else.
And I just don't interpret that single sentence in the way you're laying out here, especially in the context of the larger post it appears in which compliments the rationalist community in several ways.
I read that sentence as just saying 'hey, it's impossible to not have an ideology, and the fact that you don't think you have one is causing you to fuck things up sometimes.' Especially in light of the speaker.
The pertinent comparison is "putting someone with relevant experience in charge". Note that experience or success in a challenging field imply a certain amount of raw smarts. The novel claim of rationalism is that if you have enough raw smarts, you can dispense with the experience.
What? No rationalist I know makes that claim.
Nah, they're more about dispensing with the credentialism and other established status-distributing structures if you have enough smarts. And this claim is not particularly novel; the stagnant old guard being outdone by nimble upstarts is as old as history. The claim that the "rationalists" are particularly suited to do this may be novel, but then again any ambitious movement/subculture thought that of itself, I'd imagine.
Amateurs thinking they know much better than the hidebound establishment while actually being clueless is also pretty old.
Sure. The relevant question then would be whether there's a reliable way for amateurs to overcome their cluelessness outside of the establishment. For what it's worth, I don't think the rationalist project has come anywhere close to demonstrating that they are better at this than anybody else.
I think this might have been a point if rationalists were overconfident. Pretty rational, but thinking they were more rational than they actually are. But given that rationalists even have exercises to overcome these biases, i.e. prediction exercises, I don't think that's true.
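To make "prediction exercises" a bit more concrete, here is a minimal sketch (my own illustration, not anything from the thread or from any particular rationalist tool) of the kind of calibration check they involve: you log probabilistic forecasts, score them against outcomes once they're known, and see whether your stated confidence matches your hit rate. The Brier score below is one standard measure; lower is better.

def brier_score(forecasts):
    # forecasts: list of (predicted_probability, outcome) pairs, outcome 0 or 1
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Hypothetical example: three predictions made at 90%, 60%, and 20% confidence,
# of which only the first came true.
example = [(0.9, 1), (0.6, 0), (0.2, 0)]
print(round(brier_score(example), 3))  # 0.137; always guessing 50% would score 0.25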
Ah, but most scaremongering about the Rationalists doesn't focus on the idea that they're bad at predictions, though — except in the field of A.I., where the gravitational pull of "oh come on, evil A.I.s are a thing from sci-fi, you can't be serious" remains inescapable.
Mostly people are scared of Rationalists as having bad *values* — at holding objectionable non-empirically-falsifiable beliefs about e.g. what it is or isn't moral to fund, or promote, or even say in public — and that their dazzling but morally neutral skills at predicting market fluctuations are suckering the average joe into thinking that they also must know what they're talking about in moral matters.
And that's much less of a self-fixing problem, assuming you treat it as a problem. Rationalists are the kind of people to whom the idea of "self-introspection to formalise your innermost moral intuitions into consistent utility functions" makes sense. You can easily convince a good Rationalist that they're less competent at achieving their goals than they think they are; but I think it is *harder* to persuade a Rationalist that their ultimate aims are *wrong* compared to an average person. And to put it crudely, people aren't worried about powerful Rationalists being bad at predicting market crashes, they're worried about them being heartless eugenicists set on euthanising all puppies to increase productivity by 0.37%.
I like this puppy analogy.
The “you have to make mistakes in order to learn” thing is another divider. From what I can tell so far rationalists do not worship the almighty mistake as the one true gate to knowledge. The landscape of imprecision, inaccuracy and wrongness is much more subtle. I am drawn to this community because my approach to the world is that I am constantly some amount of wrong and learning to handle that and break it down and question it and minimize it sometimes yields something that is, well, less wrong (I have not yet read that blog but the title is great.) The you-have-to-accept-your-mistakes “growth” demand operates differently for different philosophies. A mistake seems like a contained event to some people and then they have to address “it.” Then they think an unwillingness to wallow in the experience of being “wrong” is somehow an assertion that one is never wrong. Or that wrongness is where personality and heart live. Which resonates for some - but it makes me think they project their ideas. That cannot be the only legitimate description of personality. If heart can only be revealed in obvious, contained mistakes, that is an impoverished sense of heart! The fear of rationality might not lie in what it is, but in some people’s need not to lose all the tropes they have attached to irrationality (yes, puppies, daydreaming, love, et cetera)
Those puppies are great to illustrate the point. Reality is much more disappointing. Try (obviously calmly and respectfully) arguing that price gouging can be good, or one of many other seemingly low-controversy views. That's enough to be seen as truly disgusting in many, especially left-leaning, communities.
"but I think it is *harder* to persuade a Rationalist that their ultimate aims are *wrong* compared to an average person."
Can you explain your reasoning? One data point against your position is that religion is much less common among Rationalists, and I think that it's clear that religious people are harder to persuade that their ultimate aims are wrong.
A Rationalist is not someone who has accepted an ideology because they have been brainwashed into it from infancy. They are someone who has been persuaded through rational argument to adopt particular views. So why would it be difficult to persuade them to adopt different ones?
If your values are such that you don't think that you can persuade a Rationalist that your values are good, what does that say about your values?
I suppose we have slightly different models of how morality works.
The way I see it, by the time they reach adulthood, people have moral intuitions hardwired into their brains, which we might term their "utility function" or "terminal values". I'm unsure of the extent to which they are inborn, or shaped in early childhood, but they can *functionally* be treated as innate and immutable; this makes them akin to one's sexuality or sense of gender. Also like those things, they differ from person to person, although most people hold values that yield sufficiently similar results in day-to-day life that the illusion of something like a "universal human morality" can be maintained.
The rationalist, or rationalist-adjacent person, will do some self-analysis and figure out precisely what moral intuitions they *actually* feel deep down, as opposed to what it is culturally acceptable to view as moral. This results in things like Effective Altruists who puzzle out their true feeling of "I want to stop as many people as possible from suffering/dying, even when it makes me do things society views as weird" from social norms like "you only have a duty to people you know personally", and then take rational action to further those aims.
A religious person believes that X is moral because Y holy text says so. If they can be persuaded that the holy text is probably woo, they may change their mind about the sanctity of those values. A Rationalist, meanwhile, holds moral values because after all due introspection they have formalised these principles from their own, individual moral feelings, irrespective of social norms; and they don't think that there is some objective morality that trumps their individual values.
You can in theory convince a Catholic to stop thinking premarital sex is wrong if you manage to convince them that God doesn't exist. But if a Rationalist thinks, for example, that animal suffering has no moral standing, you cannot *change their minds* about this; either they feel a moral twinge when an animal is hurting, or they don't, and you can no more change their mind about this than reason with the Paperclip Machine that it "oughtn't" to value paperclips.
You believe too strongly in the machine-like Catholic who perfectly maps their morals onto the holy book. You can interpret the Bible in a million ways, to justify most moral positions; people have done so for a long time. What makes you think the (equally intelligent and introspective) Catholic is any different, practically, from the rationalist here?
The rationalist introspects, plumbs their moral intuitions, and builds rationalizations for them post-hoc in the language of modernist progressive-speak.
The Catholic introspects, plumbs their moral intuitions, and builds rationalizations for them using a post-hoc interpretation of the Bible.
Or for example- why not the rationalist who realizes that pretending to care about people half the world away was just something they did to show off to their bay area friends, and, on introspection, realizes that they really do care more, in a fundamental moral sense, about their family and close friends, and thus decides to stop giving to GiveWell?
(Unless it ends up being expedient for them- people across the world doing better and building real wealth faster is a boon to you and yours ceteris paribus, so it might be self/kin-serving to donate x% to GiveWell conditional on enough other people doing the same, to make sure you're not a sucker; alternatively, it might be worth doing just to look good, thereby attracting a bay area mate (although probably not, if the stories of the bay-area rationalist dating scene are anything to go by.))
"Signaling to their bay area friends" is a big feature of the Rationalist community that I feel they don't want to admit. EY said something like he "doesn't understand social status" - which I find it *incredibly* hard to believe.
More likely - Rationalists are a part of a group that has an entirely different set of status signals from the "mainstream" population, and they are confusing "not paying attention to mainstream status indicators and instead paying attention to the status signals of my weird bay area friends (that occasionally overlap with class)" with "we aren't worried about status".
I still feel there is a difference between the Catholic who believes prima facie in an objective morality ordained by God, and the atheistic Rationalist for whom whatever moral intuitions can be teased out of the wet tissues in our skulls really is all there is. Even if (as you say) most will come to comfortable agreements between the basics of religious morality and their own feelings, a Catholic can theoretically come to believe that their own moral intuitions are "wrong"; if the Pope outright said they were, for example, then they would view the situation as "I am a flawed human and my moral intuitions are flawed; I should follow what the Pope says, not my own gut feeling".
Also:
“why not the rationalist who realizes that pretending to care about people half the world away was just something they did to show off to their bay area friends, and, on introspection, realizes that they really do care more, in a fundamental moral sense, about their family and close friends, and thus decides to stop giving to GiveWell?”
Well, why not indeed? If there are indeed people who truly feel nothing for the deaths of strangers, then the Rationalists are one of the only movements where I could expect them to possibly be open about that fact. Even then the social pressure remains enormous (as it would be for ordinary people: for a non-Rationalist it is socially odd to give to GiveWell, but it would *also* be a bad idea to go about trumpeting that you *truly* don't care about the suffering of sick African children at all), but that is a function of the Rationalist community not being perfectly rational.
(And FWIW I have enough faith in human nature that I think people who really don't care about strangers are a small minority. Probably most people care *more* about their loved ones, but have "innocent human beings I don't know" not far behind, and "animals that seem like they have qualia" a little further behind.)
I think everyone in this conversation is making a mistake by putting 'rational' and 'has an ideology' at different points on the same scale.
That's not really how this works, at least from the point of view of the people lobbing these criticisms. Ideology and rationality are separate measures, and they can both influence your behavior in their own ways. Everyone has an ideology, just like everyone has a utility function; that's not a problem in and of itself.
But if you blind yourself to your own ideology or convince yourself you don't have one, then you're going to have a poorly developed ideology that might cause a lot of problems. You can interrogate and improve your ideology in the same way we often try to interrogate and improve our rationality.
I agree with this! It seems the issue they have is rather with the concept of "pure rationality" as an achievable state of mind: it seems unattainable, and something like "90% rational" makes no sense. I think to the critics it sounds a bit like religious zealots, who try to get ever closer to God/Jesus/Mohammed, without realizing that this is fundamentally impossible.
You can easily get closer to Jesus. Literally try to recreate the arc of Jesus' life in your own. It was a popular thing to do in the Middle ages. Start out small, go on a spirit quest, get martyred. 'In spirit' optional.
You could get closer to Mohammed in the same sense. Eliezer Yudkowsky is doing this in spirit.
Troubled childhood, check.
A great worry about the end of the world, and a sense of duty to spread the word about it so people can save their mortal souls, thinking this the primary duty of his work? Check.
Good news for those who accept the word of Allah and turn away from evil? Yes; the promise of the children's children in HPMOR.
He's got his Bay Area polyamory harem; multiple women/'wives', some ethically dubious according to 21st century social mores, check.
We've reached the present. Here's what's to come:
Emigration from his city of residence, along with followers, to act as an arbitrator in Medina? The exodus of the rationalists from the Bay Area Metropolis to Peter Thiel's artificial island of floating venture capital, gambling, and animal tests.
The subsequent confiscation of the property of converts left behind in Mecca? Obviously a reference to rising US taxes and future confiscation of land and goods from émigrés to BitStates.
The subsequent wars of conversion by the sword and Muhadkowsky's ascendance to heaven are also obvious to predict.
This should be its own post on LW
Hehe, true. My point was more that while some religious people do exactly that, they are often regarded as fatuous zealots by their co-religionists, who find it presumptuous to imagine one can be like Jesus/Mohammed/... This was the analogy I was trying to make. There is nothing wrong with embracing the idea of rationality (like there is nothing wrong with embracing Jesus' teachings, etc.), but assuming you can somehow become like them/him might be seen as fundamentally flawed.
I fully agree with the tenor of Scott's post though, I should say!
Rationalism *is* an ideology. "Ideology" simply refers to how one thinks about the world. It's acquired a negative connotation of close-mindedness, bigotry, bad faith, motivated reasoning, etc., but that's not part of the denotative definition.
Are denotative definitions important? If the word connotes all those bad things, and we seek to avoid all those bad things, then shouldn't we seek to avoid the word too?
Well, this seems to be a significant problem?
"4. Eliezer Yudkowsky is their king and he's kind of an asshole"
It would be if he actually was, although I can't help thinking that this complaint is ten years out of date. Yudkowsky blogs very rarely these days and the idea that he's wrong about a lot of stuff is, I think, very widespread among modern Rationalists.
Of course, that's not very useful because people are already moving on to the much more topical “*Scott Alexander* is their king and he's eeevil”.
Scott is the True Caliph and therefore, by definition, cannot be evil 😀
Well, he seems to be tweeting a lot... and my priors about this are:
1.) Twitter turns people into idiots and/or assholes.
2.) Twitter attracts idiots and/or assholes.
3.) Twitter makes it easy to take people out of context and paint them as idiots and/or assholes.
While there are monarchists who post in Rationalist spaces, Rationalism itself is rather anti-monarchy. So this criticism relies on a fundamental failure to model Rationalists, and simply shows that the critic is too caught up in their own ideology to evaluate Rationalism outside of it. It's like the Christians who treat Jesus as God, and then criticize atheists based on what Dawkins or Darwin or whoever did, as if atheists likewise worship those people.
Without necessarily disagreeing with the general point, I would be way more comfortable with that last line if those exercises were more common in actual practice. There is no law of thought that one gets the benefit merely by being aware of their existence, or even that the effect of mere awareness is neutral.
I certainly don't see any evidence in the real world that the rationality community is very powerful. Not even within the confines of Silicon Valley.
I'm thinking, though, that the best way to put myself in the shoes of the "anti-rational-community-community" is to imagine myself in an alternative universe where there's a bunch of people who go around calling themselves the "Rationality Community" but who also constantly spout a bunch of stuff that I find to be both irrational and awful.
There's a reddit for that:
https://old.reddit.com/r/SneerClub/
I've lived and worked in Silicon Valley for the last 23 years, and excuse me while I wax cynical about what passes for a rationalist in Silicon Valley! Scratch a Valley rationalist, and you'll likely find layers of unsubstantiated critical assumptions, beliefs, and misinformation, all wrapped around an iron spike of logical empiricism that was pounded into their brains during college and grad school. Granted, many don't define themselves as rationalist per se, but I think that's only because most of them define themselves by their problem-solving ability (which is usually excellent) or their IQs (which tend to the right side of the bell curve). If you were to offer them a choice between the label of rational and irrational, they'd all latch on to the rational label and cling to it for dear life.
Of course, we are bundles of beliefs, but a STEM background frequently focuses one's analytical skills on the practical — rather than on asking abstract questions like, "how do I separate my beliefs from my knowledge?" I would guess that a large portion of our beliefs (i.e. preconceived notions) are unfalsifiable — meaning no experiment can be designed that could disprove them. Plus we have other beliefs which, if we took the time, we could run experiments to falsify, but we're too lazy to design or execute those experiments. For instance, unless your parents disclosed that you were adopted, you cannot really know if your parents are your biological parents. Most of us believe that our parents are our biological parents. The question would have largely been unfalsifiable until the advent of blood typing and then genetic testing. Nowadays, biological parenthood is easily falsifiable. So it has transitioned from the unfalsifiable category of belief into a falsifiable category of belief. But how many of us demand swabs from our mothers and fathers to compare to ourselves? Some of us have, and some of us were surprised by the results. But I digress. So, I would humbly suggest that anyone who claims to be a rationalist must be willing to question their preconceived notions — moreover, they must be comfortable with accepting provisional answers — but not cling to those answers if evidence is raised against them. Also, I would say that rationalism requires a certain amount of imagination that allows the rationalist to think outside the rut of consensus.
(Full disclosure: I am unapologetically a Popperian. And I'll admit it is a methodological belief that I use as my preferred tool for separating science from opinion, facts from belief. And I've tested this methodological belief to death, but I haven't been able to disprove it yet.)
"If you were offer them a choice between the label of rational and irrational"
Is there any subculture that would unironically choose "irrational"?
Dadaism ?
Yes, I suppose irony is not entirely synonymous with protest, but they did clearly have plenty of both.
If they used the same jargon, artists totally would. If they included humility as a virtue, rationalists would.
Successful artists usually undergo years of studies and training, which I would imagine are mostly based on analysis of what makes good/impactful art. As for the virtue of humility, IMO one of Yudkowsky's better articles is on this topic:
https://www.lesswrong.com/posts/GrDqnMjhqoxiqpQPw/the-proper-use-of-humility
How much this applies to actual rationalists, including himself, is debatable, of course.
Mystics aren't interested in "rational" explanations of the world — although each of the mystical traditions that I'm aware of has a praxis which, if followed, will yield consistent results for its practitioners (or so it's claimed).
I'd say that they are not so much uninterested as instead consider "true knowledge" inaccessible to the intellect, which would indeed count as irrational. It's curious though how eagerly rationalism has been adopting those praxes when stripped of their supernatural ornaments. Apparently enlightenment and such is not only compatible with intellect, it even increases its power!
Hence, post-rationalism..?
There's something to what you say. The early scientific inquiries of the Enlightenment were an offshoot of an esoteric revival in the Western world from the 12th through the 15th centuries. Obviously, the practice of alchemy was pre-scientific, but alchemy transformed itself into chemistry when placed into a framework of hypothesis and experimental verification. Alchemists were big on theory, and they were big on experiments, but, as far as I can tell, they never figured out how to narrow down theory into hypotheses that could be tested. And neo-Pythagorean mysticism infused 15th and 16th Century astronomy — Kepler was a mystic (and his mother was put on trial for witchcraft, but found innocent).
Full disclosure, I wouldn't consider myself to be a rationalist. I'm a mystic. Although, I have a firm grounding in the sciences (advanced degree in BioBehavioral sciences, with lots of genetics, evolutionary theory, and a heavy grounding in statistics, and a lot of work with the statistical analysis of large data sets) — but I actually come to rationalism from mysticism.
Having followed the praxes of several esoteric teachings (which all produced results for me), using a reductionist approach, I'm trying to synthesize what I've learned. The trouble with the "woo-woo" stuff is it isn't reproducible in a controlled setting where a neurologist can attach electrodes to your head or run you through a PET scan to see what your brain is doing. But if you follow the script of the traditions that I worked in — and some of the practices took years of self-discipline, following rituals that may seem to make no sense — I was able to get the results that were described by my teachers and that are described in the mystical literature. Being raised by scientifically literate atheists, it's natural for me to come back to my roots and use a rationalist paradigm for categorizing and analyzing what I experienced — not in the sense of, "these are delusional experiences that will display certain glucose uptake patterns on a PET scan" — but rather, "Can I develop a unified theory of what people have to go through to have a mystical experience?"
Or maybe I'm deluding myself. Who knows. But it's been interesting.
Also, The Theatre of the Absurd ("Waiting for Godot").
"the rationality community is very powerful (because it contains a bunch of silicon valley people)"
Not a rhetorical question: what is this claim actually based on, though? I've heard that some rationalist blogs are popular among Silicon Valley people, but I don't know of any Silicon Valley people who actually consider themselves part of the rationalist community.
Hard to say what the exact influence is, but it's not negligible, considering the pushback that the New York Times got due to the whole "doxxing article" issue?
Have they really got pushback, beyond private complaints and some angry emails? Did any prominent public figure explicitly denounce them?
"Alexander’s appeal elicited an instant reaction from members of the local intelligentsia in Silicon Valley and its satellite principalities. Within a few days, a petition collected more than six thousand signatories, including the cognitive psychologist Steven Pinker, the economist Tyler Cowen, the social psychologist Jonathan Haidt, the cryptocurrency oracle Vitalik Buterin, the quantum physicist David Deutsch, the philosopher Peter Singer, and the OpenAI C.E.O. Sam Altman."
https://www.newyorker.com/culture/annals-of-inquiry/slate-star-codex-and-silicon-valleys-war-against-the-media
"Much of the support Alexander received was motivated simply by a love for his writing. The blogger Scott Aaronson, a professor of computer science at the University of Texas at Austin, wrote, “In my view, for SSC to be permanently deleted would be an intellectual loss on the scale of, let’s say, John Stuart Mill or Mark Twain burning their collected works.” Other responses seemed unwarranted by the matter at hand. Alexander had not named the reporter in question, but the former venture capitalist and cryptocurrency enthusiast Balaji Srinivasan, who has a quarrelsome Twitter personality, tweeted—some three hours after the post appeared, at 2:33 A.M. in San Francisco—that this example of “journalism as the non-consensual invasion of privacy for profit” was courtesy of Cade Metz, a technology writer ordinarily given over to enthusiastic stories on the subject of artificial intelligence. Alexander’s plea for civility went unheeded, and Metz and his editor were flooded with angry messages. In another tweet, Srinivasan turned to address Silicon Valley investors, entrepreneurs, and C.E.O.s: “The New York Times tried to doxx Scott Alexander for clicks. Just unsubscribing won’t change much. They can afford it. What will is freezing them out. By RTing #ghostnyt you commit to not talking to NYT reporters or giving them quotes. Go direct if you have something to say.”
Other prominent figures in Silicon Valley, including Paul Graham, the co-founder of the foremost startup incubator, Y Combinator, followed suit. Graham did not expect, as many seemed to, that the article would prove to be a “hit piece,” he wrote. “It’s revealing that so many worry it will be, though. Few would have 10 years ago. But it’s a more dangerous time for ideas now than 10 years ago, and the NYT is also less to be trusted.” This atmosphere of danger and mistrust gave rise to a spate of conspiracy theories: Alexander was being “doxxed” or “cancelled” because of his support for a Michigan State professor accused of racism, or because he’d recently written a post about his dislike for paywalls, or because the Times was simply afraid of the independent power of the proudly heterodox Slate Star Codex cohort."
Yeah, I read that, I meant after the actual "doxxing" took place. It would seem that doing something should be worse than threatening to do it, but it looks like nobody really took notice. Or has the "war" between the media and SV become something to be taken for granted in the meantime?
Well, as Scott himself admitted, he basically Streisand-effected himself, so to properly answer your second question you would have to see what happened in a hypothetical parallel universe where he kept blogging and didn't radically change his life.
Elon Musk is definitely at least Rationalist-adjacent. Y Combinator probably fits in that as well. And if you're in the Rationalist community and don't know any Silicon Valley people who consider themselves part of the Rationalist community, then my conclusion is that you don't know many Silicon Valley people. David Friedman is at least geographically Silicon Valley, and it's quite common for people in the SF Bay Area Rationalist community to work for tech companies. There have been Less Wrong meetups on the Google campus.
I think the problem is that people are disposed to SEVERELY overestimate the level of certainty in scientific conclusions, and indeed fact claims about the world in general, and SEVERELY underestimate the amount of dependence on assumptions, interpretations, and theories involved in constructing such claims. If you look carefully at how science is done (starting with e.g. the replication issues in everything from biology to psychology), how academic and medical peer review and publishing works, and how many assumptions are needed to construct narrative claims in e.g. journalism, it becomes reasonable that a very large number of "facts" people pound the table about and scream are true are in fact highly contestable and provisional. Yes, there is good and bad reasoning but many fewer controversies can be definitively established by good reasoning than is often claimed.
Can any controversies be definitely resolved by anything other than good reasoning?
Yes, but probably not in ways any of us would find palatable
I guess I would need an example.
First, I would claim that the resolution of a controversy doesn’t necessarily require arriving at the truth. This is a mix between “if it quacks like a duck” and the fact that science has been wrong a lot historically while still reaching consensus (I’m speaking of historical science, not critiquing modern science).
In other words, I think using “Everyone basically agrees on one point of view” is a pretty good definition for resolving a controversy.
There are two clear ways to do this. Firstly, you use logic and evidence to convince people. Secondly, you torture/threaten/kill everyone who doesn’t agree with you until there’s no one left to disagree.
This resolves the controversy, and historically has had varying degrees of efficacy, but not in a way most people would consider “the right way”.
Because I forgot to include this I’m gonna reply to myself: waiting for people to die off and replacing them is a longstanding way that science progresses (from my understanding of it). Obviously, sometimes you convince people they’re wrong, but sometimes you can’t and you just wait for the old guard to die out. This is a more common way of resolving the controversy without winning the debate
Well, in a certain sense it requires winning the debate before the audience of younger people. You didn't convince the people who held the old positions to change their minds, but you successfully convinced everyone else to adopt your view instead of that of the old guard.
More than a few historical controversies about ethnic groups have been definitely resolved by getting rid of the ethnic group.
I think the beliefs and claims of any known perpetrators of genocide are (in 2021) either utterly rejected or highly controversial.
And yet the questions are moot because we can hardly change the solution that was implemented. I'd call that a resolution.
Well, overwhelming new empirical evidence, which I think is synergistic with but distinct from pure reasoning.
I'm intrigued, but I don't know what you mean by "pure reason" so I don't know how to process what you're saying.
I just mean that data and reasoning about data are separate constructs.
OK, but how can data settle a controversy without reasoning about data?
There's reasoning as in cutting edge statistical techniques to tease subtle conclusions out of noisy data and there's reasoning as in "hot damn check out this chart" (the canonical version of the latter being vaccine efficacy data).
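For the "hot damn check out this chart" kind of case, the underlying arithmetic really is simple. Here's a rough sketch with made-up trial numbers (purely illustrative, not the actual data from any trial): the point-estimate efficacy is just one minus the ratio of attack rates in the two arms.

# Made-up numbers purely for illustration, not actual trial data.
cases_vaccinated, n_vaccinated = 8, 20_000
cases_placebo, n_placebo = 160, 20_000

attack_rate_vax = cases_vaccinated / n_vaccinated
attack_rate_placebo = cases_placebo / n_placebo
efficacy = 1 - attack_rate_vax / attack_rate_placebo
print(f"Point-estimate efficacy: {efficacy:.0%}")  # 95%

The subtle-statistics kind of reasoning comes in when the effect is small relative to the noise; when it's this large, the chart speaks for itself.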
See the Rationalism vs Empiricism dispute?
https://plato.stanford.edu/entries/rationalism-empiricism/
Nonsense. Name 10 facts about the natural world that are generally held to be true that are, nevertheless, "highly contestable and provisional." Most people who say this can't even name one without wandering off into social "science" or pop science cultism or reports about science in the popular press, which are normally so distorted as to be almost unrecognizable.
The truth is exactly the contrary of your opening sentence. Almost everything well understood in science to be true* has a level of empirical support and grounding in facts that dwarfs what the canonical hominid requires for "truth." Most people severely *underestimate* how certain almost all science is, and even in the best cases seize upon the tiny little bits here and there at the frontier where the debate is ongoing and mistake "ongoing debate" for evidence that nobody knows anything, or that accepted "truth" is being violently overturned -- which is silly when it's not dishonest.
---------
* Using the shorthand "true" here, which is understood by empirical science to be actually equivalent to "not yet proven false despite a large battery of strong tests thrown at it."